I recently led a session at the eResearch Australasia conference on the ethics of AI in higher education. It is a big topic to handle, and I'm fairly new to this area, but the conversation went well, and awareness of both AI and ethics is high in this community. The ethical challenges posed by AI are significant, but the benefits are also great, and it is vital for educators and citizens to be aware of both. Here are some of the key points made by the audience (and I am pursuing the topic, so will post some more later on).
Off-the-shelf AI solutions can influence the decision-making of research
There needs to be transparency in machine decision-making (or certain decisions should be avoided altogether), and we need to avoid a dependency on machine decisions
Perhaps a certification of AI products from a regulatory body
AI may have a negative impact on the job market
eResearch Australasia Conference, 2018
After many false dawns, AI may be gaining traction. Chatbots, Natural Language Processing, robots, autonomous vehicles, and the combination of big data and AI are all finding applications in a myriad of commercial, educational and other contexts. AI was once about explicit commands; what you put in is what you got out, but now it is largely about machine learning and big data, about machines that not only learn, but also make decisions. This is behind a number of new and emergent applications in medicine, transport and education that hold great promise but also ethical challenges.
In particular, it is this ability to make decisions that poses numerous ethical dilemmas: can an autonomous Volvo car choose to collide with either a pedestrian or a dog ethically; can a Google chatbot impersonate a human for nefarious purposes; and can an autonomous military drone decipher images of illicit activity and then take autonomous action? These are not dystopian projections of a sci-fi future; rather, these are ethical issues that exist now, well within the province of AI and its applications.
Whilst ethicists have provided critique, debate, and numerous ethical frameworks for an AI future (indeed, the Australian Government has just proposed a technology roadmap, a standards framework, a national AI Ethics Framework, and regulation in the space), higher education has been relatively quiet in terms of debating the impacts of AI on teaching, research, and the broader HE system. Indeed, while AI applications are not yet fully realised in research, this could be an opportune time to think about them, before they are (and this change could occur quite rapidly, as did the use of data in research across both the humanities and the sciences).
Some of the ethical issues posed include the stalwart of IT ethics, privacy, but new issues also arise, particularly around transparency and the interpretation of data using machine learning, and how these interpretations may influence later research findings, be credited as research work, and indeed impact upon broader society. This is a particularly difficult issue as AI does afford many benefits in terms of the researcher's ability to deal with the scale and complexity of big data, but there are things that machines are good at and things that people do better, and this intersection of machine and human intelligence, including ethical decision-making, needs to be considered from the very emergence of AI in research.
This Birds of a Feather session proposes to discuss the ethics of AI, big data and research, with the purpose of providing a basic ethical framework for emergent AI in broader research practice. This framework could be used as a stand-alone guide for researchers or as an addendum to existing research ethics, privacy and data processing guidelines.
Seldon, Anthony, The Fourth Education Revolution, University of Buckingham Press, 2018
Luckin, Rose, Enhancing Learning and Teaching with Technology: What the Research Says, Institute of Education Press (IOE Press), 2018
Bostrom, Nick, Superintelligence: Paths, Dangers, Strategies, Oxford University Press, 2014
I would like to open with an image; it is an image from Fritz Lang's famous 1927 German Expressionist science fiction movie, Metropolis. Made in Germany during the Weimar period, Metropolis depicts a futuristic dystopian society where wealthy intellectuals rule from the city above ground, oppressing the workers who live in the depths below them.
The plot of the film is as follows:
The film follows Freder (Gustav Fröhlich), the son of the master of the city, Joh Fredersen (Alfred Abel). While idling away his leisure time in a pleasure garden, Freder encounters a young woman named Maria (Brigitte Helm) who has brought a group of workers' children to see the privileged lifestyle led by the rich. Maria and the children are quickly ushered away, but Freder is fascinated by Maria and descends to the workers' city in an attempt to find her. Freder finds the workers' city and watches in horror as a huge machine explodes, injuring many.
I chose this movie because I think it introduces my topic pretty well. Lang's movie was a harsh critique of industrialisation and the gulf it was creating between workers and rulers. When it was first released the film was met with a mixed response, with many critics "praising its technical achievements while deriding its simplistic and naïve storyline".
Of course this was a dystopian vision of the future of industrialisation, and I am using it a little flippantly as things didn't turn out quite so bad (at least not in Hamburg). But if you allow me to make the leap, then we are perhaps at a similar juncture in history, driven not so much by the dehumanising machines of industrialisation as by the vast computer networks being built around the world in many different economic sectors and many different funding contexts. They form an infrastructural layer to a very different economy than the one imagined in Lang's Metropolis.
And in Australia, as in many countries like Canada, the US, and the UK, the investment in computing infrastructure over the past decade has been enormous in both education and the domestic sphere. In fact, our most expensive infrastructure investment to date is a high-speed computer network (the National Broadband Network) that promises to deliver bad American movies to every corner of the continent with even greater speed and efficiency (well, perhaps it is a little more than that!)
But for this community, the digital humanities, the most important infrastructural development over recent years has been the Cyberinfrastructure movement, or 'eScience', or 'eResearch infrastructure' (the term used depends on what country you are in). And the vision of eResearch infrastructure (at least at the national policy level) is not to deliver bad American movies to the outer reaches of the Australian outback, but to wire up entire research sectors through 'New Infrastructures for Knowledge Production', to use the title of the wonderful book by Christine Hine.
But what does this actually mean in practice? And what do eResearch and Cyberinfrastructure mean for the humanities, and especially the digital humanities? Cyberinfrastructure and its visions have been around for long enough now for us to reflect upon their institutional formation and intellectual underpinnings.
And it is probably worth stating my own position at this stage, as I have worked at this precarious juncture between eResearch infrastructure and the digital humanities for 5 or 6 years now, on various projects and in various universities. And I have often felt that this is the position of an interloper; of looking for cracks in the eResearch agenda; of looking for ways to leverage the enormous investments in eResearch infrastructure in ways that support the digital humanities and our particular contextual ways of engaging with computing.
And an important part of this context is that the digital humanities largely positions itself within the existing research 'infrastructures' of the humanities (journals, academic departments, conferences, libraries, and sober ethics committees), and is partly responsible for building the 'human capital' to work in the humanities, but eResearch or Cyberinfrastructure has largely emerged outside of the perspectives and training of the digital humanities, primarily driven by a 'big science' and 'big engineering' agenda (ie. an emphasis on mass data storage, high-capacity networks, and other infrastructures that arguably largely support scientific needs and ways of collaborating). This has created numerous complexities for the digital humanities, particularly in Australia where eResearch may, for better or worse, be emerging as a competing set of discourses and practices to the digital humanities. In other words, eResearch may not be telling us how to think (well, perhaps not yet), but it is certainly telling us what to think about. It often has a Modernist agenda: the idea that bigger is better, that the humanities suffer from a data-deluge similar to that of the sciences, or indeed that we are unable either to collaborate or to articulate what we want within the rubric of science-based infrastructure (and I don't see this as a major problem!).
But the problem is one of context; eResearch infrastructures are components of the vast and expensive scientific support apparatus, one in which the humanities will always be a minor player and one which many humanities researchers may find confronting (or even enticing) considering the economies of scale involved within it. In Australia, just one of the eResearch funding streams, the Super Science initiative, is valued at $1.1 billion, and sums such as this aren't that unusual in eResearch infrastructure funding streams in Australia and other countries around the world.
Likewise in Australia, the waters are muddied even more by the term eResearch being applied generically to computing in both the sciences and humanities, even though the ability of the perspectives and practices inherent within eResearch to extend beyond scientific problems is questionable (and perhaps 95% of eResearch funding in Australia goes to science). It is science looking to solve the problems of the humanities, and although many of us may welcome scientific infrastructures that enable us to solve humanities research problems, I doubt whether this is always possible or desirable, regardless of the price tag.
Admittedly, eResearch infrastructures have created many opportunities for research in the humanities; however, the way in which this agenda has been institutionalised in some countries means that it doesn't always serve the needs of the humanities. It is often measured and driven by different accountability metrics, and also importantly, as Christine Borgman states in her Digital Humanities Quarterly article of September 2009, 'visions for scholarly infrastructures that originate in the humanities are rare' (so the humanities are partly to blame for a lack of vision, although there are exceptions to this, and they principally involve XML and TEI virtual environments). Yes, we do need digital infrastructures in the humanities, but we also need to be cautious that they are not designed outside of humanities research practices.
As Geoffrey Rockwell states:
…[there are] dangers in general and especially the issue of the turn from research to research infrastructure…we need to be careful about defining the difference and avoid moving into the realm of infrastructure…those things we are still studying.
So, whilst some eResearch infrastructures may inevitably claim a 'research enabling' pedigree for their work, the exact nature of the research being enabled, and how it helps us understand human society and culture, is on occasion yet to be determined (and this is far from an easy task and is largely an experimental practice, rarely a utilitarian one). Plus, the institutional positioning of eResearch infrastructure in university service divisions, remote national services, and monolithic government and science-led programmes means that the tradition of critique and synthesis of eResearch infrastructure outputs within contemporary humanities scholarship is barely possible (and a point to make here is that despite the sums invested in the national eResearch agenda in Australia, it hasn't produced one humanities PhD scholarship, not one fellowship, nor one centre that focuses fully on humanities research). So, in terms of eResearch infrastructures, there have been almost no investments in developing the human side of computing in the humanities in Australia (and I noticed a tweet from a colleague of mine before I left, Dr Tim Sherratt, that said 'I am research infrastructure').
As a historian and long-time digital humanities advocate who has benefited from investments in eResearch (and indeed, I am employed by a particularly enlightened strategic eResearch programme), I caution against retreating too eagerly from the 'infrastructure turn', as there are still healthy opportunities in many countries between the cracks of otherwise clumsy agendas. However, these opportunities need to be approached with caution. The outputs from eResearch infrastructure need to be well supported within a humanities research setting and responsible to a humanities research context and pre-existing intellectual perspectives (or in other words, it is ok to develop a healthy working scepticism, but I am not sure how this is possible if we are not equally investing in people to develop critical perspectives).
Perhaps a better approach for the humanities (and especially the more acute example of the Australian humanities) than trying to fit into an at times clumsy science-led eResearch infrastructure funding model would be to lobby harder for a better funding model (and Borgman also states that it is only humanities scholars themselves who are in a position to move computing in the humanities forward). The digital humanities already has a sophisticated international network of centres, undergraduate and graduate degrees, associations, conferences, journals, and research accountability structures that are largely internal to the humanities, and is often in a better position to understand computing in the humanities than science-led eResearch (and there are some positive institutional developments in this direction, such as combining eResearch with the digital humanities at King's College London). And if led by the digital humanities, new research infrastructures such as data and text centres, virtual environments, and digital libraries would be more relevant to humanities research, thus ensuring their long-term sustainability. But this would require eResearch infrastructures to be institutionalised in a much more responsive way; in a way that isn't unequally coupled with the needs of science.
And it is also worth stating that eResearch infrastructure investments are usually short-term, and those who are tasked with their construction and maintenance are usually on short-term employment contracts and unstable funding streams, which seems at odds with the goals of building sustainable and robust infrastructures to transform research.
Again Geoffrey Rockwell states:
Perhaps things like the Text Encoding Initiative Guidelines are the real infrastructure of humanities computing, and the consortia like the TEI are the future of light and shared infrastructure maintenance.
I would like to think that this is because the TEI and derivatives such as EpiDoc exist within a deeply scholarly and vibrant international research culture that is both embedded within and accountable to humanities research; this is not always the case with eResearch infrastructure. However, for the digital humanities to take a greater lead in guiding the implementation of eResearch infrastructure, in its various institutional settings, would require the digital humanities to be strengthened institutionally to rise to the challenge, especially in countries where 'eResearch' is much stronger than the digital humanities. All infrastructure, despite its veneer of utilitarian simplicity, is 'among the most complex and expensive things that society creates'. eResearch infrastructures for the humanities may provide opportunities, but aspects of the present model in various countries lack a complex humanities research environment and are wedded to an empirical, engineering, and industrial instrumentalism that is often at odds with the humanities. It is not that eResearch does not do some things very well; it is the promise of research that it doesn't do particularly well. The goals of eResearch infrastructures are often so monumental that they should perhaps be a set of research questions or national research agendas in themselves rather than practical goals.
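Part of why the TEI works as 'light' infrastructure is that a standards document plus ordinary XML tooling is enough to get started; no bespoke national platform is required. As a rough sketch (the fragment below is a hypothetical, heavily simplified TEI-style encoding, not a conformant document), any stock XML parser can extract the scholarly markup:

```python
# Illustrative only: a tiny TEI-style fragment parsed with Python's
# standard-library XML tools. The document content is invented; only the
# TEI namespace URI is real.
import xml.etree.ElementTree as ET

TEI_NS = "http://www.tei-c.org/ns/1.0"

fragment = f"""<TEI xmlns="{TEI_NS}">
  <text>
    <body>
      <p>A letter from <persName>Brigitte Helm</persName>,
         written in <placeName>Berlin</placeName>.</p>
    </body>
  </text>
</TEI>"""

root = ET.fromstring(fragment)

# Pull out the encoded names and places using namespace-qualified tags.
names = [el.text for el in root.iter(f"{{{TEI_NS}}}persName")]
places = [el.text for el in root.iter(f"{{{TEI_NS}}}placeName")]
print(names, places)  # ['Brigitte Helm'] ['Berlin']
```

The point of the sketch is the maintenance model: the community maintains the Guidelines, and the tooling burden falls on widely shared, general-purpose software.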
And, as evidence suggests, infrastructures produced outside of a humanities research context, or indeed a science research context, have difficulty with uptake (and a recent survey by a colleague of mine in Melbourne, a Director of eResearch, Lyle Winton, suggests that computing tools and applications primarily advance in research through a peer process, from researcher to researcher, and not through external pressure). However, as previously stated, the part of the infrastructure-building process that lacks investment is the investment in people, or 'people as infrastructure', to guide its use in the humanities. There have been numerous cases of eResearch infrastructures that have not worked simply because researchers have not used them; possibly because they don't know how, they don't know they exist, or they have been poorly designed for their research practices (but also, eResearch infrastructure is a fairly risky endeavour, so a certain amount of failure is inevitable).
Accordingly, many of the recent debates in the digital humanities, such as in Matthew Gold's work of that title, have been about the field's relationship with the broader humanities, about the character of the digital humanities, and about its various patterns of institutionalisation (and I was very lucky to hear a keynote by Professor Andrew Prescott, Head of the Department of Digital Humanities at King's College London, at the Oxford Digital Humanities Summer School, which discussed the digital humanities in the UK, emphasising the need to revitalise the field through developing stronger research agendas beyond the worn-out arguments of interdisciplinarity).
But there is also a need to understand another front that is opening up, and that is our at times uncomfortable relationship with eResearch infrastructures; the enormous and expensive support mechanisms that enable modern science. Although there are opportunities within eResearch infrastructures, the relationship is not well understood, it is under-theorised, and there is a danger that it will end in tears!
So perhaps we are at a historical juncture, and we need to be cautious that some of the utopian visions of eResearch infrastructures do not turn into the dystopian vision of Lang's Metropolis. As Andrew Prescott stated in his Oxford Summer School lecture, industrialisation did alter what it meant to be human; contemporary science and technology likewise alter what it is to be human, so let's make sure the humanities have a large role in designing and interpreting our relationships with them.
So, to try to make concrete what is a very broad-ranging argument: do you think it is possible or desirable for the humanities to have their own 'conceptual cyberinfrastructure', to use the term from Patrik Svensson's article on the subject in DHQ last year?
And if so, how may the digital humanities step up to the mark?
Barjak, F, Lane, J, Poschen, M, Proctor, R, Robinson, S, & Weigand, G (August 2010), 'e-Infrastructure adoption in the social sciences and humanities: cross-national evidence from the AVROSS survey', Information, Communication and Society, Vol. 13, No. 5, pp. 635-651
Capshew, JH, and Rader, KA (1992), 'Big science: price to the present', History of Science Society, University of Chicago Press, Osiris, 2nd Series, Vol. 7, Science after '40, pp. 2-25, <http://www.jstor.org/stable/301765>
Unsworth, John (Chair) (2006), 'Our Cultural Commonwealth: The report of the American Council of Learned Societies Commission on Cyberinfrastructure for the Humanities and Social Sciences', American Council of Learned Societies
Recently I have been reading quite a lot about eLearning. I know it is one of those words with an 'e' in front of it, but rather than simply existing on the superficial level of language, the sub-field of eLearning is a vibrant one with numerous scholarly contributions, journals, associations, and software. One of the most active associations is ASCILITE, or the Australasian Society for Computers in Learning in Tertiary Education, which runs an annual conference, professional development activities, and a journal. http://www.ascilite.org.au
Admittedly this association was established in 1985, so it has had a long time to build a scholarly community of practice (and if it has been a key force in the development of the eLearning community in this region, it has certainly done a pretty good job). The literature on all aspects of the learning cycle is well-researched, as are the technical frameworks for large-scale implementation of eLearning environments (and learning outcomes are well researched and mapped). Plus, most importantly, eLearning largely sits within established educational research on constructivism, constructive alignment, inquiry-based learning, blended learning and other theories that help teachers and administrators understand where eLearning may help in the classroom and in other learning contexts. Without a strong evidence base to support it, eLearning would arguably not work well, as educators would not know how to use it. It would be akin to a dunce sitting in the back corner, unable to engage constructively with other students; except maybe to distribute assignments to other students every now and again.
Unlike eLearning, eResearch does not really have a discoverable theoretical base, perhaps because it is a much newer concern, or perhaps because it is a large-scale government policy agenda rather than a focused intellectual concern (ie. there are no journals, no associations, no research-focused conferences, and very few developed theories to understand it). Although they are extraordinarily valuable skills, one would need to draw a very long bow to claim that data management is an intellectual concern or that cloud services are a vital method of research inquiry. The problem that I see is that although eLearning is undoubtedly about learning and the research about learning (and there is a great amount of literature to support this claim), eResearch is not really research (nor is it usually research about good research).
Although there is much debate about the nature of research, and indeed this is a highly contested space of competing ways to interpret and measure the world, the lack of literature about eResearch suggests that it doesn't really enable new research but simply exists to support data management, remote instrument access, and other important services that are required to do modern scientific research. The term 'science support services' would be a much more honest one, and perhaps science does not require the same theoretical base and research context to get on with the job of doing good science (or perhaps scientists have the same concerns as I do about the all-too-often remoteness of the term 'eResearch' from where research happens). Journals, conferences, classrooms, debates, lectures, libraries, curricula, and even blog posts are all part of the 'infrastructure' of research built up over the past one thousand years in many countries (or 10 years in the case of this blog). If 'eResearch' does not comfortably sit within these established 'infrastructures', it is something else altogether. eLearning has managed to do this and does it well, but eResearch has a long way to go. Perhaps more humanities and social science educated people working within the eResearch agenda will help build up the theoretical base and arguments for eResearch. At the moment eResearch is theoretically thin and thus cannot be easily communicated within research, and especially humanities research.
HuNI, or the Humanities Networked Infrastructure, was recently awarded $1.3 million by NeCTAR (National eResearch Collaboration Tools and Resources). The project will allow:
…arts and humanities researchers to access and, through appropriate tools and services, work with the combined resources of the nation's major cultural datasets and information assets. This will yield new scholarly outcomes and create an enduring exemplar of national cultural infrastructure to suit the needs of future generations of researchers.
Arguably, the HuNI project is the first serious, large-scale inroad into eResearch infrastructure for the humanities in Australia and promises to act as an exemplar for other projects in the region. Of particular note is that the project will also build what is termed a 'Virtual Research Environment' (VRE): an online environment of tools and services to allow specialist researchers to come together to perform certain computational research tasks, with the possibility of uncovering new insights about Australia's cultural landscape. It is this possibility that makes the project of interest to the Digital Humanities community, which has a long track record of serious scholarship that both utilises and advances computing within the humanities to help us understand the human condition.
The project has a number of partners with various 'cultural datasets' and differing means to collect and analyse data (and indeed different conceptual frameworks as to the notion of data). Bringing them together will be an exciting and challenging endeavour. The partners and datasets include:
As you can see, there is an extraordinarily diverse collection of data from lots of different collection agencies and fields. The HuNI architecture will consist of a Linked Data Service and a Semantic Mediation and Mapping Service and will allow researchers to do something like this:
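To give a flavour of what 'semantic mediation and mapping' involves, here is a deliberately simplified sketch: records from two partner datasets describe people with different field names, so a small vocabulary map rewrites them into one shared schema before linking. All dataset names, field names, and records below are invented for illustration; HuNI's actual services will of course be far richer than this.

```python
# Hypothetical vocabulary map: each source dataset's field names are
# rewritten into a shared schema ("name", "born").
FIELD_MAP = {
    "austlit": {"agent_name": "name", "birth": "born"},
    "cinema_db": {"person": "name", "dob": "born"},
}

def normalise(record, source):
    """Rewrite a source-specific record into the shared vocabulary."""
    mapping = FIELD_MAP[source]
    return {mapping.get(key, key): value for key, value in record.items()}

def link(records):
    """Group normalised records that appear to describe the same entity."""
    linked = {}
    for rec in records:
        linked.setdefault(rec["name"], []).append(rec)
    return linked

# Two records about the same (invented) person, from different partners.
a = normalise({"agent_name": "P. White", "birth": 1912}, "austlit")
b = normalise({"person": "P. White", "dob": 1912}, "cinema_db")
print(link([a, b])["P. White"])
```

Real-world mediation also has to reconcile different conceptual frameworks, not just field names (a 'work' in one dataset may be an 'event' in another), which is precisely why the mapping service is a research problem and not merely plumbing.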
The strong argument for the need for such infrastructure is outlined as follows:
Cultural data is extremely laborious to collect. Once collected, however, its scholarly value does not diminish over time as it is highly re-usable and retains relevance in a number of research domains. The cultural datasets represented in this proposal exhibit the fruits of many decades of painstaking documentation of the human cultural record in Australia. The consortium proposing the HuNI VL are custodians of over 2 million rich, interrelated records relating to Australian cultural heritage creators, objects and events. Much of this authoritative data is problematically held within disciplinary silos, often unexplored by researchers in related disciplines. Once these datasets are linked within the HuNI VL the breadth and depth of Australian cultural content will expand exponentially and a new level of comprehensive and multi-disciplinary research on Australian culture will become possible.
Due to a range of funding and institutional factors, these datasets have been constrained in their ability to establish robust interoperability protocols that would enable new avenues of enquiry and reduce duplication of effort. The recent rise of more data-centric research in the humanities, and the plethora of new tools to facilitate this, has meant that many data-rich resources in the humanities need to adapt to increasing demands from the arts and humanities community for support for the rapidly emerging discipline of 'digital humanities'. Achieving the requisite level of financial support for this, however, has eluded many in the humanities, leading to piecemeal and only partially successful collaborations. Some early exemplars have demonstrated the ability of digitally enabled research practices in the humanities to reveal deeper understandings of cultural expressions over time. It is the aim of the CDC to develop a fully integrated, multi-disciplinary research space to exploit the enormous potential for new levels of scholarly engagement suggested by the combination of content and tools for cross-dataset analysis and interpretation. Whilst cultural data integration is the core function of the HuNI VL, we have identified a number of research tools that have relevance and ready potential to be modified and 'plugged in' to the HuNI VL. These tools will underpin the VL as a workspace for processing cultural data and support its core function.
The project will begin shortly and has a two-year timeline. The project will have a website where many of the technical approaches and outcomes will be published. Stay tuned!
The application of diverse forms of eResearch infrastructure to support research has a long history. During the 1970s the genesis of eResearch, in the shape of the Internet, was driven by the needs of the research community. In this latest stage of eResearch infrastructure development, also largely driven by the needs of research, we are witnessing large-scale investments in grids, clouds, federated repositories, and high-end eScience and eResearch projects to support research across institutional, regional, and disciplinary boundaries. But as eResearch expands, there is an increasing need to address the tricky questions of governance. eResearch does not exist in a free-flowing world of ideas; rather, like all infrastructures, it exists in a complex, contested, and often contradictory world of varied manifestations of governance. As we will argue, the governance of any system has rarely been brought about in a planned and orderly manner; rather, it is usually brought about by a crisis in a system and a contested set of attributes that have forced the extension of governance. As existing capacities meet limits, new approaches to governance are invented and deployed in the attempt to overcome the barriers. eResearch exists in a complex array of governing bodies, and without a realistic grounding of its technical vision within the limits of these structures, new infrastructural developments to support eScience or eResearch or even the Digital Humanities will be hindered by institutional divergence.
Nick Thieberger from here at the University of Melbourne has kindly blogged details about a discussion paper inviting a response from humanities researchers. The discussion paper is about the Federal Government's '2011 Strategic Roadmap for Australian Research Infrastructure'.
All Australian humanities scholars with an interest in digital scholarship should take this brief opportunity to read and comment on the federal government's '2011 Strategic Roadmap for Australian Research Infrastructure' discussion paper. Why? Because the two previous 'Roadmaps' funded hundreds of millions of dollars' worth of 'research infrastructure', almost exclusively NOT in the humanities, but including hugely expensive science tools like the $100 million Synchrotron. In the previous Roadmap in 2008 there was a section on the Humanities and Social Sciences that included reference to PARADISEC as an exemplary project building infrastructure for humanities scholars. But not one cent went to support PARADISEC from that process (link to blog)