Oxford Internet Surveys

(Another important ‘big picture’ Internet impact study from the Oxford Internet Institute).

Oxford Internet Survey (OxIS) research is designed to offer detailed insights into the influence of the Internet on everyday life in Britain. Launched in 2003 by the Oxford Internet Institute, OxIS is an authoritative source of information about Internet access, use and attitudes. Some of the areas covered include: digital and social inclusion and exclusion; regulation and governance of the Internet; privacy, trust and risk concerns; social networking and entertainment; and online education (link).

Oxford Internet Survey 2009 Report: The Internet in Britain

(An interesting new report from the Oxford Internet Institute).

The Oxford Internet Institute, University of Oxford, has today released the OxIS Report 2009, the latest report in the series of Oxford Internet Surveys (OxIS) that cover the changing landscape of Internet access, use and attitudes in Britain.

Dutton, W.H., Helsper, E.J. and Gerber, M.M. (2009) Oxford Internet Survey 2009 Report: The Internet in Britain. Oxford Internet Institute, University of Oxford.

Download OxIS 2009 [PDF, 1.9MB]: http://www.oii.ox.ac.uk/research/oxis/oxis2009_report.pdf
OxIS website: http://www.oii.ox.ac.uk/microsites/oxis/

The Report will be formally launched at the House of Commons later this afternoon, at an event hosted by Derek Wyatt, MP. Presentations on the significance of OxIS will be given by representatives from the sponsoring organisations: Adrian Arthur (British Library), James Thickett (Ofcom) and Mark Cowtan (Scottish and Southern Energy).

Geo-referencing Digitised Collections

There are a couple of projects underway here at the Centre for e-Research (CeRch) and the Centre for Computing in the Humanities (CCH) on ‘geo-referencing’. Geo-referencing is a way of ‘tagging’ digital collections so that they can be searched by geographical place names or mapped. Dr Claire Grover of the Language Technology Group, School of Informatics, University of Edinburgh, is working on text-mining methods for extracting geographical information from unstructured (i.e. unencoded) text. She is talking here next week; if you would like to come, just send me an email. The abstract of her talk follows.

There are vast quantities of textual information which people typically access through standard search queries. Many collections have added value in metadata associated with texts, but this is costly and time-consuming to generate by hand. Researchers in the field of natural language processing (NLP) have been working for the past couple of decades on technologies for information extraction (also known as text mining) that will allow for the automatic extraction of structured information that currently resides in unstructured text. In this talk I will describe the NLP system that we have been developing to extract ‘who, where and when’ metadata from textual content. The primary focus of the system is geo-referencing, so that the place names in a text can be recognised and grounded to a gazetteer entry to provide lat/long information. In addition, the system recognises person names as well as dates and other temporal expressions.

System development was previously funded as part of EDINA’s GeoCrossWalk project and we are currently refining it further for use in the GeoDigRef project, where we are geo-referencing three digitised collections: Histpop, parliamentary records from BOPCRIS, and metadata from the British Library’s Archival Sound Recordings. In a parallel project we are geo-referencing the Stormont Papers. I will discuss the issues that arise from these different collections and will use them to illustrate the difficulties in trying to develop a general-purpose tool that can be useful across different text types.
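
To make the gazetteer-grounding idea above concrete, here is a minimal sketch in Python. This is emphatically not the Edinburgh system described in the abstract: a toy dictionary of three places stands in for a real gazetteer service, plain string matching stands in for the NLP pipeline, and the year matcher is the crudest possible temporal recogniser. It only shows the shape of the ‘where and when’ metadata such a system produces.

```python
import re

# Toy gazetteer: place name -> (latitude, longitude).
# A real system would ground names against a full gazetteer service;
# these three entries are illustrative only.
GAZETTEER = {
    "Edinburgh": (55.95, -3.19),
    "London": (51.51, -0.13),
    "Belfast": (54.60, -5.93),
}

# Crude stand-in for temporal-expression recognition: bare years only.
DATE_PATTERN = re.compile(r"\b(1[89]\d{2}|20\d{2})\b")


def georeference(text):
    """Return 'where and when' metadata for a fragment of unstructured text.

    Place names are recognised by simple gazetteer lookup (a stand-in for
    proper named-entity recognition) and grounded to lat/long coordinates;
    years are picked out as a minimal example of temporal expressions.
    """
    places = []
    for name, (lat, lon) in GAZETTEER.items():
        for match in re.finditer(r"\b" + re.escape(name) + r"\b", text):
            places.append({"name": name, "offset": match.start(),
                           "lat": lat, "long": lon})
    dates = DATE_PATTERN.findall(text)
    return {"places": places, "dates": dates}


if __name__ == "__main__":
    sample = ("The 1901 census returns for Edinburgh were compared "
              "with records held in London.")
    print(georeference(sample))
```

In a real pipeline the recogniser would also have to disambiguate between competing gazetteer candidates (which of the world’s many Londons?) using context, which is exactly where the difficulties across different text types described in the talk arise.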

What is technological determinism?

Technological determinism is circulated, maintained, and advanced within the pre-existing hierarchies of the world in which we live. Determinism has its own political agendas, its own rules, its own contexts, hierarchies, and antagonisms towards an imagined ‘other’. Determinism utilises a proprietary language and culture, and although it cloaks itself in ideas of interdisciplinarity, deterministic discourse discourages intellectual critique and dissent, and justifies itself with the high ground of capitalist practicality. Deterministic rhetoric is interested in other knowledge only so that it can demonise it, remediate it, appropriate it, make it ‘better’, wrestle it out of the hands of the ‘elite’ and make it more ‘democratic’, more in touch with ‘the people’.

I wrote this some time ago (link). A rather disturbing report I recently read on Web 2.0 and education prompted me to revisit this writing.

Report: Tools for Data-Driven Scholarship (or tools for value-driven scholarship?)

(Google’s data centre)

Another excellent report from some excellent US scholars. I wish I had more time to properly interrogate the ideas and claims I often read in these Digital Humanities documents, but if I may be a bold and superficial blogger, there are recurring themes in many of them. ‘Data-driven’ scholarship is closely linked to science; in effect it is the imposition of the scientific method upon the humanities. The intellectual paradigm of ‘data-driven’ scholarship is thus empirical, positivist, and rational. From a progressive humanistic perspective, these are very old-fashioned ideas perpetuated by elite schools in elite universities (repeat after me 27 times, young man!). In some ways the tools don’t matter; it is the intellectual underpinnings of the so-called ‘scholarly transformations’ that do. The humanities have seen numerous so-called ‘intellectual transformations’, but few if any of them have had anything to do with empirical and positivist thought. Sorry, the humanities are not ‘big science’. The human condition is not altogether rational. There is a massive tension here: we must never be driven by scientific or engineering dreams; we must be driven by the values we place in our own intellectual traditions. The tools matter, but only in the context of the latter.

As documented in Our Cultural Commonwealth: The Report of the American Council of Learned Societies Commission on Cyberinfrastructure for the Humanities and Social Sciences, we are witnessing an extraordinary moment in cultural history when the humanities and social sciences are undergoing a foundational transformation. Tools are one essential element of this transformation, along with access to cultural data in digital form. The need to “develop and maintain open standards and robust tools” was one of eight key recommendations in the ACLS report, inspired by the NSF’s 2003 Atkins report on cyberinfrastructure.[Unsworth et al., Our Cultural Commonwealth: The Report of the American Council of Learned Societies Commission on Cyberinfrastructure for the Humanities and Social Sciences, American Council of Learned Societies, 2006] (link).