Open Government Data: possibilities and problems

View a .PDF version of this report here

“Government 2.0 involves direct citizen engagement in conversations about government services and public policy through open access to public sector information and new Internet based technologies. It also encapsulates a way of working that is underpinned by collaboration, openness and engagement”[1]

Background and context

The Political Issues Analysis System (PIAS) project (view original report .pdf), of which this work forms a subset, sought to investigate how citizens in Melbourne, Australia used the Internet to seek information about key political issues. It also sought to understand how citizens contacted and interacted with their elected representatives in relation to these issues. Through workshops, case studies, and the development and testing of prototype software, the research uncovered some notable trends in user engagement with important aspects of the formal political process online.

The PIAS project principally focussed upon citizen information use by investigating interaction with party websites and the policy documents they made available. However, the participants in our study largely found: 1) the sites difficult to use; 2) the information hard to navigate and to compare with other policies; and 3) the written policies unreliable and unclear. One of our key recommendations from the study was that policies published by political parties should be made available in a ‘machine readable’ form so that they can be automatically aggregated into other systems, enabling citizens to compare the policy positions of the parties. Strict metadata publishing standards and frameworks should also be used so that the aggregated information is of a high standard, allowing it to be re-utilised effectively.
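
To make the recommendation concrete, a machine-readable policy feed might be aggregated and compared along these lines. This is a minimal sketch: the JSON schema, party names, topics, and field names are all illustrative assumptions, not an existing publishing standard.

```python
# Illustrative sketch: aggregating a hypothetical machine-readable feed of
# party policies so that positions can be compared topic by topic.
# The schema (party/topic/summary) is an assumption for demonstration only.
import json
from collections import defaultdict

feed = json.loads("""
[
  {"party": "Party A", "topic": "transport", "summary": "Expand rail network"},
  {"party": "Party B", "topic": "transport", "summary": "Upgrade arterial roads"},
  {"party": "Party A", "topic": "health", "summary": "Fund new clinics"}
]
""")

# Group policies by topic so each party's position can be laid side by side.
by_topic = defaultdict(dict)
for policy in feed:
    by_topic[policy["topic"]][policy["party"]] = policy["summary"]

for topic, positions in sorted(by_topic.items()):
    print(topic)
    for party, summary in sorted(positions.items()):
        print(f"  {party}: {summary}")
```

The point of the sketch is that once the data is structured, the comparison step is trivial; the hard work, as the study found, is getting parties to publish to a consistent standard in the first place.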

This work complements the PIAS project by listing some of the key projects and services available that utilise government data. It also explores in more detail the limited availability of what could be termed ‘democratic data’. For the purposes here, “democratic data” covers: 1) Hansard: making the workings of government available in new ways; 2) Transparency: newer forms of transparency through ‘data’; and 3) Policy: enhancing and extending the policy-making process through online open consultation.

Why Open Access to government data?

Much of the impetus behind the drive for Open Access to government data stems from a push for greater transparency in the functions of government. However, in the case of Victoria, for instance, much of the data being released under the Gov 2.0 agenda tends to be of an administrative nature and of little democratic potential. Whilst the Parliament of Victoria does make an enormous amount of useful material available to the public through its website, it is not made available in a technically sophisticated, machine readable way that takes full advantage of the potential of the Internet. Bills are only available in PDF or Word format, and the most important record of the workings of government, Hansard, is also only available as PDF (although it is possible to do a full-text search of Hansard from 1991 onwards). If these important documents were available in a machine readable form, they could be utilised by application developers in innovative ways.
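
For instance, if Hansard were published as structured XML rather than PDF, even a few lines of code could index debates by date and speaker. The element names and attributes below (`<hansard>`, `<speech>`, `speaker`, `date`) are hypothetical; no such feed currently exists for the Victorian Parliament.

```python
# Sketch of what a developer could do with a hypothetical XML Hansard feed.
# The markup below is invented for illustration; Victorian Hansard is in
# fact only published as PDF.
import xml.etree.ElementTree as ET

hansard = ET.fromstring("""
<hansard date="2010-05-12">
  <speech speaker="Member for Example">
    I rise to speak on the Open Data Bill...
  </speech>
  <speech speaker="Member for Sample">
    The Bill before the house...
  </speech>
</hansard>
""")

# Once machine readable, debates can be indexed, searched, and mashed up
# with other data sources.
speakers = [s.get("speaker") for s in hansard.iter("speech")]
print(hansard.get("date"), speakers)
```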

The Open Access movement is a push to make data both machine readable and interoperable so that it may be linked together and leveraged for all sorts of purposes. This may be for new business opportunities, medical research, or new areas of social research. However, doing this is no easy task as multiple data sources require linking and matching across diverse and complex systems (and ‘cleansing’). The first step in this process is to expose data in a standardised way so that it may be located and machine-read. The Victorian public sector has a policy framework specifically designed to achieve these tasks titled the Victorian Public Sector Action Plan. Two key points are:

  1. Participation: Engaging communities and citizens by using Government 2.0 initiatives to put citizens at the centre and provide opportunities for co-design, co-production and co-delivery.
  2. Transparency: Opening up government by releasing public sector data and information, making government more open and transparent[2]
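
The first step described above, exposing data in a standardised way, usually means publishing a catalogue record alongside each dataset. The sketch below is loosely modelled on dataset-description vocabularies such as the W3C's DCAT; the specific field names, values, and URL are illustrative placeholders, not an official Victorian schema.

```python
# Sketch of a standardised catalogue record for a released dataset, so that
# it can be located and machine-read. Field names and values are
# illustrative assumptions, loosely in the spirit of vocabularies like DCAT.
record = {
    "title": "Victorian public transport patronage",
    "publisher": "Victorian Government",
    "format": "CSV",          # machine readable, unlike PDF
    "licence": "CC-BY 4.0",   # explicit reuse rights
    "url": "https://example.gov.au/data/patronage.csv",  # placeholder URL
}

# A harvester or aggregator can then check that the minimum fields needed
# for reuse are present before linking the dataset into other systems.
required = {"title", "publisher", "format", "licence", "url"}
missing = required - record.keys()
print("valid record" if not missing else f"missing: {missing}")
```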

Making data available in this way can only help to “deepen democratic processes” and promote a strong and healthy democracy (although this is often an aspiration rather than an actuality).[3] Accordingly, there is a promising international trend to promote a two-way dialogue between political representatives and the public by combining ‘democratic data’ with citizen-produced data on popular social media platforms.[4] Rather than building a completely new platform (as has been the case with a number of somewhat underutilised government initiatives), some projects take advantage of existing and heavily used social network platforms and provide tools and services to augment their capacity (usually to inform and communicate government policy processes). The large EU-funded WeGov project[5] and other projects in the US and Europe are welcome movements in this direction.[6]


Web 2.0 in higher education

There is a belief in some circles that Content Management Systems (CMS) such as Joomla and Drupal are labour-saving devices, and that their very presence online will spontaneously invoke a community of highly skilled individuals who will submit content and build the system in a coherent and meaningful way. This idea is a myth: virtual communities require a great deal of maintenance, promotion, and strategy to work in a meaningful way for all. It is almost impossible to make a virtual community work if the main concern is the technology alone. It is an inherently socio-technical exercise, with the social side being extraordinarily difficult in an institutional environment.

JISC will launch a report on Web 2.0 in Higher Education next Tuesday 12 May (which I will attend). I also draw attention to a case-study report published on the JISC website last year, which claims: ‘The features most associated with a Web 2.0 approach (rate, comment, upload, blog and send to friend) were commonly described with reference to social networking or e-commerce sites and were largely considered non-academic and therefore inappropriate for the Pre-Raphaelite online resource’ (link). In other words, building a virtual community is a very labour-intensive and difficult task in HE, and almost impossible without at least some attempt at a community-building strategy. A virtual community needs a strong sense of community built on a coherent and interesting concept, a belief that the labour the user contributes to the site is meaningful and consequential, and some sort of reward system. There is no rigid method for making a community site work, but it does take a strategy to grow and foster the community; even so, the community that develops may not always be the one that was imagined in the first instance.

Therac-25: the killer of all case studies

Those involved in writing case studies or teaching ethics to ICT students may find the Therac-25 case of great interest. Basically, it concerns a medical machine that delivered lethal doses of radiation. But rather than being the fault of an individual, it was a failure of an entire system. In other words, if you have ever doubted the importance of a socio-technical perspective, you practical beast you, think again! Well worth a read (link).


Reclaiming the local…

(thanks to the NY Times)

If your local newspaper shuts down, what will take the place of its coverage? Perhaps a package of information about your neighborhood, or even your block, assembled by a computer.  


A number of Web start-up companies are creating so-called hyperlocal news sites that let people zoom in on what is happening closest to them, often without involving traditional journalists.

The sites, like EveryBlock, Placeblogger and Patch, collect links to articles and blogs and often supplement them with data from local governments and other sources. They might let a visitor know about an arrest a block away, the sale of a home down the street and reviews of nearby restaurants (link NY Times)
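
At heart, the hyperlocal idea described above is a filtering problem: take a mixed stream of local items and narrow it to one neighbourhood or block. A minimal sketch, with invented item structures and Melbourne suburb names standing in for real feeds:

```python
# Sketch of hyperlocal aggregation: filtering mixed local items (arrests,
# home sales, reviews) down to a single neighbourhood. The data and the
# neighbourhood names are invented for illustration.
items = [
    {"kind": "arrest", "neighbourhood": "Carlton", "text": "Arrest on Elgin St"},
    {"kind": "sale", "neighbourhood": "Fitzroy", "text": "Home sold on Rose St"},
    {"kind": "review", "neighbourhood": "Carlton", "text": "New cafe reviewed"},
]

def local_feed(items, neighbourhood):
    """Return only the items relevant to one neighbourhood, in feed order."""
    return [i["text"] for i in items if i["neighbourhood"] == neighbourhood]

print(local_feed(items, "Carlton"))
```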


Report: Tools for Data-Driven Scholarship (or tools for value driven scholarship?)


(Google’s data centre)

Another excellent report from some excellent US scholars. But I wish that I had more time to properly interrogate the ideas and claims I often read in these Digital Humanities documents (but if I may be a bold and superficial blogger, there are some recurring themes across many of them). ‘Data-driven’ scholarship is closely linked to science, meaning that it is the imposition of the scientific method upon the humanities. This means that the intellectual paradigm of ‘data-driven’ scholarship is empirical, positivist, and rational. From a progressive humanistic perspective, these are very old-fashioned ideas perpetuated by elite schools in elite universities (repeat after me 27 times, young man!). In some ways the tools don’t matter; it is the intellectual underpinnings of the so-called claim of ‘scholarly transformation’ that do. The humanities have seen numerous so-called ‘intellectual transformations’, but few if any of them have had anything to do with empirical and positivist thought. Sorry, the humanities are not ‘big science’. The human condition is not altogether rational. There is a massive tension here: we must never be driven by scientific or engineering dreams; we must be driven by the values we place in our own intellectual traditions. These tools matter, but only in the context of the latter.

As documented in Our Cultural Commonwealth: The Report of the American Council of Learned Societies Commission on Cyberinfrastructure for the Humanities and Social Sciences, we are witnessing an extraordinary moment in cultural history when the humanities and social sciences are undergoing a foundational transformation. Tools are one essential element of this transformation, along with access to cultural data in digital form. The need to “develop and maintain open standards and robust tools” was one of eight key recommendations in the ACLS report, inspired by the NSF’s 2003 Atkins report on cyberinfrastructure.[Unsworth et al., Our Cultural Commonwealth: The Report of the American Council of Learned Societies Commission on Cyberinfrastructure for the Humanities and Social Sciences, American Council of Learned Societies, 2006] (link).

Towards an institutional typology of digital humanities centres

Thanks to John Unsworth for the link

This Wiki presents a structured list of departments, centres, institutes and other institutional forms that variously instantiate humanities computing. For each entry a link is provided to the relevant site on the WWW and a brief description given. This list represents an ongoing attempt to derive a basic typology from a complex variety of activities and so to provide institutional models for the field. Despite the fact that national academic conventions vary quite widely and cultural differences make comparisons difficult if not hazardous, no attempt has been made here to account for them. The intention is not to define what is happening in the field world-wide, rather it is to provoke discussion leading either to consensus or at least to an improved understanding of the conditions under which computing humanists work. Constructive criticisms and clarifications are not merely welcome, they are to the point.

Summit on Digital Tools in the Humanities

This site from IATH (the Institute for Advanced Technology in the Humanities) at the University of Virginia contains the findings of a summit held in 2006 about digital tools in the humanities. The report is excellent reading, and points to the need for innovations in the humanities such as ICT Guides (link).

Digital tools are enabling and enriching scholarship in the humanities to a great extent. Within the past few years, humanities scholars have begun to design, develop, and apply digital tools for their own scholarship. Both the tool-building and tool-using communities are growing, and there is a need for a summit that can assess the state of development of digital tools for humanities research, as well as the effectiveness of the supporting and integrating cyberinfrastructure.

What defines a digital tool? How are they used by the humanities community? What are the best tools? What tools are missing? How can we develop a common vocabulary so that we can develop and share tools across various communities? What does the community need to do so that these tools are more interoperable? What are the grand challenges for building digital tools for humanities research?