I recently attended a seminar at UWS on Friday 26 April 2013, led by Lynne and Ray Siemens of the University of Victoria in Canada. The theme of the event was collaboration in the humanities and, in particular, how digital humanities projects exemplify effective collaboration in the broader humanities. This is because digital humanities projects often cross disciplines and geographies, and must navigate the often more demanding collaborative terrain between computer science, computational methods, and the humanities.
Lynne Siemens specialises in project management and team building. She stated that people aren’t always well trained to work together and outlined some of the positives and negatives of working in teams. She claimed that some people are better able to collaborate than others, often because they have developed the skills of listening, flexibility, negotiation, and compromise. Lynne described these as the ‘soft skills’ of effective collaborative teams. A team approach often produces more diverse and possibly higher quality ideas (and is a good way to learn new skills and perspectives), but some projects are better done individually (though of course, some projects are beyond the scope and skill-sets of any individual).
Lynne outlined some of the successful team interactions she had observed, partly through case-study research she had undertaken. Good communication skills are vital, as is project management, and the ability to think across technology and the humanities and, indeed, culture and language. The objectives of the team, the outcomes, and the individual tasks also need to be clearly described, without too many grey areas that may become potential sources of conflict. And teams operate within institutional contexts, so there are certain contingencies to negotiate either within or between institutions. Still, one of the best ways to build teams is through casual conversations, lots of face-to-face meetings, and large bottles of rum (I put in the last one).
Ray Siemens is a Professor of Humanities Computing at the University of Victoria in Victoria, Canada, and is well known for his work in the Digital Humanities, in particular through the founding of the annual Digital Humanities Summer Institute (which I attended two years ago and which now attracts around 500 participants). He discussed the important work of the digital humanities, particularly around content modelling and the computational analysis of content (a core form of scholarship within the field). He also discussed the typology of curriculum development in the digital humanities, either through stand-alone degrees or through digital humanities-inflected programs and, in particular, the highly successful Summer Institute model.
DHSI (Digital Humanities Summer Institute) http://www.dhsi.org/
ETCL (Electronic Textual Culture Lab) http://etcl.uvic.ca/
The term interdisciplinary is used a lot, often unthinkingly and uncritically. I asserted in the last post that the ‘socio-technical’ is a false dichotomy: technical production is also ‘social’, and technology advances within its own understanding of ‘the social’ (grounded by the laws of physics). The separation of the two modes of thinking is unproductive.
It is this idea of ‘unproductive’ thinking that needs to be explored, especially in fields that exist in the gaps, i.e. interdisciplinary fields such as STS or the Digital Humanities. Understandings of ‘the social’, and the tools and methods we use to develop them, advance rapidly. Tools and methods for understanding the world through computer science change rapidly as well (programming methods, for instance). It is easy to get stuck in one camp and claim to be interdisciplinary whilst falling behind in one of the disciplines that matter for your particular ‘interdisciplinary’ practice. I see this all the time in fields such as cultural studies, which are advanced in the finer skills of academic practice, but whose technical objects of study are often many years behind contemporary technical research. And some technical areas, such as eResearch, tend to be bogged down in some very old-fashioned ideas of utility and are often unable to contribute to humanities research in a meaningful way because they are far too distant from it and lack a sophisticated understanding of it.
Interdisciplinarity requires deliberation and also empathy towards what one does not know. It is often very difficult, if not impossible, to stay on top of a number of fields, but one can recognise this in oneself and develop the skills and strategies to make good contributions to the interdisciplinary space: contributions that are balanced, informed, and aware of the key work and technical advances across fields.
I have been undertaking a lot of research of late that involves ‘socio-technical’ approaches to computing. Whilst the subject matter of the studies is interesting and worthy, I do worry about falling into the academic trap (of which there are way too many) of being ‘socially determinist’. What I mean by this is exclusively using, in the research process, books and related theories that are very distant from the creation and understanding of software. ‘Technical capital’, exercised through technical skills (i.e. by the people who build software), does not occur in a social void; the decisions made here are important and are often made by individuals unaware of the otherwise important social theories that someone else may hold about them in some other research context. In other words, the ‘socio-technical’ is a false dichotomy because it all too often fails to engage with the technical production of software and the people who do this (so that the two may inform each other). These people are also ‘human’ and ‘social’ and have their own understanding of this, and it is naive to believe that studies that are exclusively ‘books about books’ are more in tune with the human condition or more ‘social’ (i.e. they are lacking in wisdom and balance). It would be much better to educate students about socially responsible coding. This is the two hands of the ‘socio-technical’ (hopefully connected to the same human being). We suffer from the same false dichotomy in the digital humanities, and I think the problem is more acute here in Australia because we import nearly all the software we use, so we are a long way from understanding the context in which it is made. And the people who make it in Germany, the UK, and the USA will continue to make it in their own context.
I think the hardest lesson I learned as a historian is that technology doesn’t need history. And technology certainly doesn’t need an Australian historian; in fact I may just write a book about this! (or learn how to program a machine that writes books without human intervention; the ultimate revenge of the digital humanist).
I wrote about this some time ago: the connection between eLearning (blended learning etc.) and the Digital Humanities. The problem is that the connection is a weak one and should be further developed. I know of very few Digital Humanities modules or plugins that are being used in existing learning environments such as Moodle or Blackboard. And it is not as though the DH doesn’t have the learning materials and methods. There has been much work done on teaching digital humanities, but the work done (both research and development) seems to miss the enormous body of knowledge around LMSs and educational design. This field is particularly strong in Australia and NZ, and it would be good to see some movement in this area, in the same way that the DH has developed a good working relationship (if not an intellectual one) with eResearch.
(This is a rough draft of a paper planned for publication sometime soon. Any comments on factual accuracy or the arguments would be very much appreciated.)
The application of diverse forms of eResearch infrastructure to support research has a long history. During the 1970s, the genesis of eResearch in the shape of the Internet was driven by the needs of the research community. In this latest stage of eResearch infrastructure development, also largely driven by the needs of researchers, we are witnessing large-scale investments in grids, clouds, federated repositories, and high-end eScience and eResearch projects to support research across institutional, regional, and disciplinary boundaries. But as eResearch expands, there is an increasing need to address the tricky questions of governance. eResearch does not exist in a free-flowing world of ideas; rather, like all infrastructures, it exists in a complex, contested, and often contradictory world of varied manifestations of governance. As we will argue, the governance of any system has rarely been brought about in a planned and orderly manner; rather, it is usually precipitated by a crisis in a system and a contested set of attributes that have forced the extension of governance. As existing capacities meet their limits, new approaches to governance are invented and deployed in the attempt to overcome the barriers. eResearch exists within a complex array of governing bodies, and without a realistic grounding of its technical vision within the limits of these structures, new infrastructural developments to support eScience, eResearch, or even the Digital Humanities will be hindered by institutional divergence.
(This new seeding project has just been accepted for funding by the Institute for a Broadband-Enabled Society (IBES) at the University of Melbourne. Led by VeRSI and myself, it is a short project, with results available towards the end of the year or early next year.)
Summary of Proposal
The Internet is recognised as a vital component of our political information systems. Although extensively used by governments and civil society groups, its effects upon political processes, particularly deliberative political processes, currently remain relatively unknown. Emerging research suggests that the Internet’s capacity to easily produce information has also led to data overload, undermining its deliberative potential. With the advent of the National Broadband Network, the ‘data deluge’ promises to intensify, increasing the need for political information—in its various guises—to be delivered in much more meaningful ways. This is especially important for younger audiences, who are increasingly abandoning broadcast media in favour of online political information.
This project is an iterative study and design of an online ‘Political Issues Analysis System’ (PIAS) to assist users to research and analyse political issues. It will deliver information about important political topics (e.g. environmental issues, socio-economic issues, immigration, government policy) from important data sources within a coherent ‘deliberative’ framework. It will evaluate the needs of users to comprehend political issues through the application of a number of semantic indexing and data matching tools, and design a prototype system. It will do this in part through five public workshops using the University of Melbourne’s Usability Lab, each workshop focussing on a particular issue and utilising particular tools and methods. In tandem, it will develop recommendations to assist in the design of a unique software tool that fosters user-driven processes to effectively filter and visualise online political information obtained from government data-sets (partly within the ‘Government 2.0’ policy framework), the media, NGOs, historical data, and other user-generated online sources (blogs, video, etc.).
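To make the ‘data matching’ idea concrete, here is a minimal sketch in Python of the kind of keyword-based issue matching the workshops might exercise. The issue names, keyword sets, and sample text are all invented for illustration; they are not part of the actual PIAS design, which would use far richer semantic indexing.

```python
# A minimal keyword-matching sketch. The issues, keyword sets, and
# sample sentences below are hypothetical, not the actual PIAS design.
ISSUE_KEYWORDS = {
    "environment": {"climate", "carbon", "emissions", "water"},
    "immigration": {"visa", "refugee", "border", "migration"},
}

def match_issue(text, keyword_map):
    """Return the issue whose keyword set best overlaps the text,
    or None if no keywords match at all."""
    tokens = set(text.lower().split())
    scores = {issue: len(tokens & kws) for issue, kws in keyword_map.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else None

print(match_issue("New carbon emissions targets announced", ISSUE_KEYWORDS))
# -> environment
```

A real system would replace the naive token overlap with proper semantic indexing, stemming, and weighting, but the filtering principle is the same: score each incoming item against each issue and route it accordingly.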
The outputs of the research will be a working prototype as well as a report documenting the research outcomes, with a series of recommendations for further research. This project may lead to the first major study of online deliberative processes within Australia, one competitive within the ARC’s Linkage or Discovery schemes. The work will be of benefit to governments, community groups, and other major producers of political sites, as well as the users of such sites. The project falls within IBES’s Social Infrastructures and Community theme and, in particular, adheres to IBES’s and VeRSI’s shared aspiration ‘to make existing and available data more accessible’. In summary, the broad aims of the project are:
- To explore the evolving applications of online political information tools in an Australian and International context (especially in the analysis of broadband-enabled video and audio)
- To examine deliberative processes with a number of stakeholder groups using semantic indexing methods and various communication tools at the University’s IDEA Lab.
- To build, test and provide further recommendations for a ‘Political Issues Analysis System’ (PIAS)
Through these processes we address the following research questions:
- How can we better understand online deliberation in the international and Australian context and what tools need to be developed to assist this?
- How can we better design deliberative ‘ideas’ using data and online analysis tools that will involve people in a meaningful and inclusive way in consequential goal-orientated political processes?
Approach and Outcomes:
The combination of theoretical groundwork, empirical study, and the design and implementation of the PIAS, will make an important contribution to the emerging body of research on the nature of political information on the Internet and in particular, the use of government data within it. Of chief significance is that the research will make explicit and open up to critical analysis the dichotomy between the availability of government and other data sources and effective online deliberative design. By consciously foregrounding information abundance as a condition of the present ‘information revolution’—through a unique fusion of political theory with semantic analysis and clustering tools—new perspectives will emerge and fresh research areas in design will open up.
The approach, then, is both innovative and unique because it combines the theoretical sophistication of Politics and Media Studies with the technical proficiency of Humanities Computing, eDemocracy, and Information Systems to expose important issues of online political information to critique in ways that were previously unavailable. The work will open up theoretical and technological pathways towards a more genuinely identifiable (and sustainable) online political engagement and democratic structuring.
Technology and potential collaborators:
Potential collaborators for this work include the UK’s mysociety.org. They have developed some of the UK’s most well-known sites, including TheyWorkForYou.com and its local derivative, OpenAustralia.org. The open-source solutions, APIs, raw data, and results will be collaboratively developed and shared with mysociety and OpenAustralia to complete the PIAS. Likewise, solutions developed through the ‘Inquiry into Improving Access to Victorian Public Sector Information and Data’ as well as the Federal ‘e-Government Strategy’ will be investigated and may provide potential collaborators. In essence, the PIAS is a ‘parsing’ project: it parses structured government and other data sets to extract and deliver meaningful political information to a general audience. It will also explore ways to crawl, cluster, and analyse unstructured data contained in blogs and other ‘unofficial’ sources, including video and audio (perhaps using XProc processing).
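As an illustration of the clustering side, the sketch below (Python, standard library only; the sample posts are invented) computes TF-IDF vectors and cosine similarities for a few short texts, one common basis for grouping ‘unofficial’ sources such as blog posts by topic. It is a toy stand-in for whatever indexing and clustering toolchain the project would actually adopt.

```python
import math
import re
from collections import Counter

def tokenize(text):
    """Lower-case word tokens, ignoring punctuation."""
    return re.findall(r"[a-z']+", text.lower())

def tfidf_vectors(docs):
    """Build a sparse TF-IDF vector (dict: term -> weight) per document."""
    tokenized = [tokenize(d) for d in docs]
    df = Counter()                      # document frequency of each term
    for toks in tokenized:
        df.update(set(toks))
    n = len(docs)
    vectors = []
    for toks in tokenized:
        tf = Counter(toks)
        vectors.append({t: (c / len(toks)) * math.log(n / df[t])
                        for t, c in tf.items()})
    return vectors

def cosine(a, b):
    """Cosine similarity between two sparse vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Invented sample blog posts for illustration.
posts = [
    "The carbon tax debate dominated parliament today",
    "Parliament votes on the new carbon tax legislation",
    "Immigration policy changes announced by the minister",
]
vecs = tfidf_vectors(posts)
print(cosine(vecs[0], vecs[1]))  # the two carbon-tax posts score higher
print(cosine(vecs[0], vecs[2]))  # than the unrelated immigration post
```

Pairwise similarities like these are the input to any standard clustering step (e.g. grouping posts whose similarity exceeds a threshold); a production system would use a dedicated indexing library rather than this hand-rolled version.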
The broad samples obtained through the PIAS iterative design workshops and the subsequent prototype will provide a unique model for analysing web-based dialogue, agenda setting, and responses to official government positions on important political topics. This work may be scaled up at a later date to include other collaborators, particularly pollsters, who may be eager to invest in such a system.
One of the first major agencies to coin the term the ‘Data Deluge’ was the UK’s JISC (Joint Information Systems Committee): Briefing Paper, Data Deluge: Preparing for the Explosion in Data, 1 November, 2004 <http://www.jisc.ac.uk/publications/briefingpapers/2004/pub_datadeluge.aspx> (Accessed 14 May, 2010).
See: Clare Kurmond, ‘Readership Decline Continues for Papers’, Sydney Morning Herald, Sydney, 14 May, 2010
<http://www.smh.com.au/business/media-and-marketing/readership-decline-continues-for-papers-20100513-v1tk.html> (Accessed 14 May, 2010).
Interaction Design Evaluation Analysis (IDEA), Department of Information Systems, University of Melbourne,
<http://disweb.dis.unimelb.edu.au/research/interactiondesign//usability_lab.html> (Accessed 14 May 2010).
Carson, L., ‘Avoiding ghettos of like-minded people: Random selection and organisational collaboration’, in S. Schuman (ed.), Creating a Culture of Collaboration, Jossey-Bass/Wiley, pp. 418–423.