JISC (the Joint Information Systems Committee) funds a number of digitisation projects with content spanning nearly five centuries of British history. Notable examples include British Newspapers 1620-1900 and 19th Century Pamphlets Online. The importance of digitisation is twofold: first, the records are made easily accessible to scholars and the general public; and second, once the records are ‘data’ they can be used in new ways to gain fresh insights (especially in a large-scale quantitative sense, such as parsing two centuries of legal or parliamentary records). The UK is fortunate to have invested so heavily in digitising some of its immense human history, so that this ‘data’ can now be used imaginatively in new ways. As new computational tools and methods are developed, more uses for this data will be found (as long as the data is structured and preserved in a useful way).
(A VRE is a Virtual Research Environment…like a blackboard, well not really)
The following press release is from the Centre that I work within at King’s College London. Many of these projects won’t be of much interest to researchers (as they are infrastructure grants, not research); however, the TEXTvre project may be of extraordinary interest. Although JISC (the Joint Information Systems Committee) does not fund research as a matter of course, on occasion the projects undertaken under its auspices are in fact research. The TEXTvre project is one of these, and is a collaboration between CeRch and our sister organisation, the Centre for Computing in the Humanities (CCH). The aim of this project is to build a Virtual Research Environment for encoding texts using the TEI (Text Encoding Initiative) standard. I will blog about this project more as it develops.
A new book will be released soon, titled World Wide Web of Research: Reshaping the Sciences and Humanities (Cambridge, MA: MIT Press). It is edited by Bill Dutton and Paul Jeffreys, both of Oxford: Dutton is Director of the Oxford Internet Institute (OII), whilst Jeffreys is Director of IT at Oxford. I believe the book will be focussed upon the issues of eResearch in the sciences and humanities; very important issues for the Digital Humanities. The eResearch agenda primarily encapsulates data reuse and research collaboration through such systems as VREs (Virtual Research Environments). We have a programme in this field here at King’s called AHESSC (the Arts and Humanities eScience Support Centre). I look forward to the book; I tried to pre-order it on Amazon, but with no luck. You can find Bill Dutton’s blog here.
(as researchers, perhaps we are spiders stuck in a web)
(Roy Rosenzweig was the founder of the Center for History and New Media at George Mason University in the US. The Center is progressive in both its approach to history and its technological innovation. This fellowship may be of interest to you budding digital humanists out there.)
In 2009, George Mason University and the American Historical Association will offer the first Roy Rosenzweig Fellowship for Innovation in Digital History. This award was developed by friends and colleagues of Roy Rosenzweig (1950–2007), Mark and Barbara Fried Professor of History and New Media at George Mason University, to honor his life and work as a pioneer in the field of digital history.
(Google’s data centre)
Another excellent report from some excellent US scholars. But I wish that I had more time to properly interrogate the ideas and claims I often read in these Digital Humanities documents (though if I may be a bold and superficial blogger, there are some recurring themes across many of them). ‘Data-driven’ scholarship is closely linked to science; it amounts to the imposition of the scientific method upon the humanities. This means that the intellectual paradigm of ‘data-driven’ scholarship is empirical, positivist, and rational. From a progressive humanistic perspective, these are very old-fashioned ideas perpetuated by elite schools in elite universities (repeat after me 27 times, young man!). In some ways the tools don’t matter; it is the intellectual underpinnings of the so-called ‘scholarly transformations’ that do. The humanities have undergone numerous so-called ‘intellectual transformations’, but few if any of them have had anything to do with empirical and positivist thought. Sorry, the humanities are not ‘big science’. The human condition is not altogether rational. There is a massive tension here: we must never be driven by scientific or engineering dreams; we must be driven by the values we place in our own intellectual traditions. The tools matter, but only in the context of the latter.
As documented in Our Cultural Commonwealth: The Report of the American Council of Learned Societies Commission on Cyberinfrastructure for the Humanities and Social Sciences, we are witnessing an extraordinary moment in cultural history when the humanities and social sciences are undergoing a foundational transformation. Tools are one essential element of this transformation, along with access to cultural data in digital form. The need to “develop and maintain open standards and robust tools” was one of eight key recommendations in the ACLS report, inspired by the NSF’s 2003 Atkins report on cyberinfrastructure.[Unsworth et al., Our Cultural Commonwealth: The Report of the American Council of Learned Societies Commission on Cyberinfrastructure for the Humanities and Social Sciences, American Council of Learned Societies, 2006] (link).
(originally written for Arts-humanities.net)
Institutional repositories have become increasingly important systems for storing the rising amount of data produced by researchers. An institutional repository may be university-wide or subject-specific. They may serve the needs of a particular institution, a group of institutions, a nation, or an entire region. Examples include the UK’s Archaeological Data Service (ADS) http://ads.ahds.ac.uk/ and the History Data Service (HDS) http://hds.essex.ac.uk/, the Australian National Data Service (ANDS) http://ands.org.au/ and the Australian Social Science Data Archive (ASSDA) http://assda.anu.edu.au/, and the Europe-wide Digital Research Infrastructure for the Arts and Humanities (DARIAH) http://www.dariah.eu/ and the Common Language Resources and Technology Infrastructure (CLARIN) http://www.clarin.eu/.
Institutional repositories collect digital data and usually make it available to a global audience. They may contain an assortment of digital objects, including pre- and post-print articles, theses and dissertations, and results from research such as databases, images, surveys, teaching materials, and computing tools.
Once material is in a repository, another researcher may download it and reuse it in their own research. Most institutional repositories work in this way, although there is a trend towards building systems that re-use this data in sophisticated, distributed ways through ‘cyberinfrastructures’ and Virtual Research Environments (VREs). http://www.arts-humanities.net/briefingpaper/vre
Some of the most interesting academic questions for humanists concern how to incorporate data produced in the context of another research project into your own research. What new insights arise, what new problems arise, and how does this data impact upon the underlying evidence layers of your research? If anyone has experience of this, I would be extraordinarily interested to hear from you, as I am developing a series of case studies around this problem.
(Western Union’s Automated Electronic Telegraph)
The mechanisms for the evaluation and peer review of the traditional print outputs of scholarly research in the arts and humanities are well established, but no equivalent exists for assessing the value of digital resources and of the scholarly work which leads to their creation. This project proposes to establish a framework for evaluating the quality, sustainability and impact over time of digital resources for the arts and humanities, using History, in its broadest sense, as a case study. (link)