Building a ‘moral operating system’ for IT students: pedagogies and problems

(This is a paper I will be co-presenting at an applied ethics conference here in Melbourne in December. Ethics in IT has become a very big deal!)

Dr Craig Bellamy, Lecturer, CSU Study Centre Melbourne, Nectarios Costadopoulos, Lecturer, CSU Study Centre, Sydney

9th Annual Australasian Business Ethics Network (ABEN) Conference, Melbourne, 8-10 December 2019

In this paper, we will discuss the obstacles, lessons learnt, and pedagogical innovations in delivering the subject Topics in Information Technology Ethics, the applied professional ethics subject for the Master of IT at CSU. More broadly, it is a mandatory subject for all Computer Science degrees in Australia accredited under the auspices of the Australian Computer Society (ACS). As part of this discussion, we examine the present ethical landscape of the IT industry and outline how we prepare students to enter the industry with independent ethical agency. It is the contention of the presentation that argument, reasoning, and logic skills are the core proficiencies students require to navigate the dynamic ethical landscape of the digital economy, although this approach is not without limitations.

Indeed, ethics is now the ‘wicked problem’ of the IT field: there is a developing international ‘techlash’ against the industry, driven by specific high-profile incidents (e.g. Cambridge Analytica and the Christchurch massacre) and by public concern over privacy, transparency, and dysfunctional digital markets. The Australian, New Zealand, EU, and US governments have responded with strict new regulation, including fines for violations of privacy, distribution of harmful materials, and copyright infringement.

Graduate computer scientists entering this complex new domain of enforceable ethical practice may face legal or other action if they breach new and proposed laws regulating the industry. It is in the interests of the Australian Computer Society, the broader industry, and educators in the field to prepare students for the ethical challenges they face, as is already the case in more established fields such as Accounting and, indeed, Higher Education. Damon Horowitz (2014) put it succinctly: what the IT industry needs is a better ‘moral operating system’ to guide ethical decision making in the face of today’s looming challenges.

One evident way to certify that students are prepared for ‘ethical practice’ in the industry is to ensure that their ethical judgement is sound and reasoned (Tavani, 2015). We teach ethical reasoning and judgement skills through a number of means: case studies, scenarios, and interactive YouTube videos of ethical dilemmas with multiple outcomes. At CSU we have pioneered a way of streamlining ethical decision making through the Doing Ethics Technique, an early innovation developed by academics in the subject to build reasoning skills in a systematic and logical way (Simpson, Nevile, Burmeister, 2003). Recently we have been using argument mapping software to allow students to map ethical arguments in emerging ethical dilemmas posed by the rise of Artificial Intelligence and autonomous vehicles (MindMup, 2019). This has had mixed results in terms of digital pedagogy and assessment outcomes, because the leap from classical ethical theories to contemporary ethical problems is difficult for many students (although the link is more apparent between Foot’s seminal ethical dilemma, the ‘trolley problem’ (1967), and rogue autonomous vehicles).
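At its core, the Doing Ethics Technique is a fixed sequence of guiding questions applied to a case study. As a rough sketch only (the question wording below is paraphrased rather than quoted from Simpson, Nevile and Burmeister, and the function name is our own illustration), the technique could be modelled as:

```python
# A rough sketch of the Doing Ethics Technique (DET) as a fixed
# sequence of guiding questions. Question wording is paraphrased,
# not quoted from Simpson, Nevile and Burmeister (2003).
DET_QUESTIONS = [
    "What is going on?",
    "What are the facts?",
    "What are the issues?",
    "Who is affected?",
    "What are the ethical issues and their implications?",
    "What can be done about it?",
    "What are the options?",
    "Which option is best, and why?",
]

def apply_det(case_study: str, answers: list[str]) -> dict[str, str]:
    """Pair each DET question with a student's answer for a case study."""
    if len(answers) != len(DET_QUESTIONS):
        raise ValueError("One answer is required per DET question")
    return {"case": case_study, **dict(zip(DET_QUESTIONS, answers))}
```

The pedagogical value is in the fixed ordering: students must establish facts and stakeholders before weighing options, which is precisely the systematic habit the technique is designed to build.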

Together, we will discuss the contemporary problem of teaching ethical reasoning and logic in an IT ethics class and our advances in the area.

References:

  1. Horowitz, Damon, “We need a moral operating system”, 2014, Ted Talk, https://www.ted.com/talks/damon_horowitz
  2. Philippa Foot, “The Problem of Abortion and the Doctrine of the Double Effect” in Virtues and Vices (Oxford: Basil Blackwell, 1978) (originally appeared in the Oxford Review, Number 5, 1967.)
  3. MindMup, Sauf Pompiers Limited, Leigh-On-Sea, UK, https://www.mindmup.com/
  4. Simpson, Christopher; Nevile, Liddy; Burmeister, Oliver, “Doing Ethics: A Universal Technique in an Accessibility Context”, Australasian Journal of Information Systems, vol. 10, no. 2, May 2003. ISSN 1449-8618. Available at: https://journal.acs.org.au/index.php/ajis/article/view/159 (accessed 12 Sep. 2019). doi:10.3127/ajis.v10i2.159
  5. Tavani, Herman T, Ethics and Technology: Controversies, Questions, and Strategies for Ethical Computing, 5th Edition, Wiley Press, 2015

“Would you like to chat?” The Ethics of AI in Higher Education

I recently led a session at the eResearch Australasia conference on the ethics of AI in higher education. It is a big topic to handle, and I’m fairly new to this stuff, but the conversation went well, and awareness of both AI and ethics is high in this community. The ethical challenges posed by AI are significant, but the benefits are also great, and it is vital for educators and citizens to be aware of both. Here are some of the key points made by the audience (I am pursuing the topic, so will post more later on):

  • Off-the-shelf AI solutions can influence research decision making
  • There needs to be transparency in machine decision making (or certain decisions should be avoided altogether), and we need to avoid dependency on machine decisions
  • AI products could perhaps be certified by a regulatory body
  • AI may have a negative impact on the job market

eResearch Australasia Conference, 2018

After many false dawns, AI may be gaining traction. Chatbots, Natural Language Processing, robots, autonomous vehicles, and the combination of big data and AI are all finding applications in a myriad of commercial, educational, and other contexts. AI was once about explicit commands (what you put in is what you got out), but now it is largely about machine learning and big data: about machines that not only learn, but also make decisions. This is behind a number of new and emergent applications in medicine, transport, and education that hold great promise but also ethical challenges.

In particular, it is this ability to ‘make decisions’ that poses numerous ethical dilemmas: can an autonomous Volvo car choose ‘ethically’ to collide with either a pedestrian or a dog? Can a Google chatbot impersonate a human for nefarious purposes? Can an autonomous military drone decipher images of illicit activity and then take autonomous action? These are not dystopian projections of a sci-fi future; rather, they are ethical issues that exist now, well within the province of AI and its applications.

Whilst ethicists have provided critique, debate, and numerous ethical frameworks for an AI future (indeed, the Australian Government has just proposed a “technology roadmap, a standards framework and a national AI Ethics Framework”, as well as regulation in the space), higher education has been relatively quiet in debating the impacts of AI on teaching, research, and the broader higher education system. Indeed, while AI applications are not yet fully realised in research, this could be an opportune time to think about them, before they are (and this change could occur quite rapidly, as did the uptake of data in research across both the humanities and the sciences).

Some of the ethical issues posed include the stalwart of IT ethics, privacy, but new issues also arise, particularly around transparency: how interpretations of data produced through machine learning may influence later research findings, be credited as research work, and indeed impact upon broader society. This is a particularly difficult issue because AI affords many benefits in dealing with the scale and complexity of big data, yet there are things that machines are good at and things that people do better. This intersection of machine and human intelligence, including ethical decision making, needs to be considered from the very emergence of AI in research.

This Birds of a Feather session proposes to discuss the ethics of AI, big data, and research, with the purpose of providing a basic ethical framework for emergent AI in broader research practice. This framework could be used as a stand-alone guide for researchers or as an addendum to existing research ethics, privacy, and data processing guidelines.

Reference:

  1. Anthony Seldon, “The Fourth Education Revolution”, University of Buckingham Press, 2018
  2. Rose Luckin, “Enhancing Learning and Teaching with Technology: What the research says” Institute of Education Press (IOE Press), 2018
  3. Bostrom, Nick. “Superintelligence: Paths, Dangers, Strategies”, Oxford University Press, 2014
  4. Pollit, Edward.  “Budget 2018: National AI ethics framework on the way, Increased regulation signalled as part of $30m investment” Australian Computer Society, https://ia.acs.org.au/article/2018/budget-2018–ai-boost-with-an-ethical-focus.html (Accessed 13 June 2018).

https://conference.eresearch.edu.au/

What are Open Educational Resources?

As the name suggests, Open Educational Resources (OER) are freely available resources for learning and teaching, such as documents, videos, syllabi, software, and images. The advantage for educators is that these resources may be deposited, shared, and re-used, thus saving time in creating new courses or updating existing ones (sharing teaching materials also promotes the particular institution or field and provides peer support for others in the same subject area). OERs may be available as individual objects or bundled together as a package. They are most likely ‘open licensed’ through licences such as Creative Commons or GNU and are made available either on the open web or within institutions. The term ‘Open CourseWare’ is also often used.


What types of materials?
The types of materials that are distributed as Open Educational Resources are usually those that have been previously used in a classroom setting, or designed for purely online or blended learning contexts. They may be materials for activities or labs, full courses, games, lecture notes, lesson plans, teaching and learning strategies, video-recorded lectures, or images and illustrations. The audience for these materials is primarily lecturers, but may also include students, parents, or administrators.
What type of licences?
Open Educational Resources are usually licenced so that they may be easily re-used within a non-commercial educational context (i.e. not re-sold). Many licences allow for ‘re-mixing’, which means the materials may be adapted and enhanced to suit differing institutional contexts and student cohorts. Some licences only allow for sharing and re-use with no major revision (i.e. ‘read the fine print’), and many operate within the particular educational copyright regime of each country (i.e. ‘educational use of copyrighted material’ provisions). Attribution is always an important consideration, meaning that materials taken from OER repositories must be acknowledged so that the original creators of the work are credited.
Where are OER found?
Many OER repositories are available on the open web, such as the OER Commons project or Connexions. The repositories may be run by volunteers or by paid employees on project funding provided by a university or funding agency. Although projects such as OER Commons and Connexions were designed specifically for OER, broader definitions of the term may include projects such as the Internet Archive or even Wikipedia. OER repositories may also exist at a university level, maintained either by the university library or by the team responsible for the university Learning Management System (LMS). Learning Management Systems such as Desire2Learn have inbuilt repositories so that course content may be deposited and shared at a school, faculty, or institutional level (or opened to the broader community).
What are the archival (technical) standards?
When OER materials are placed into a repository, metadata and archival standards need to be associated with them so that they may be easily located, archived, and shared in a meaningful way. SCORM (Sharable Content Object Reference Model) is a common way in which objects may be described, zipped up into a package, and re-used by different Learning Management Systems (LMS). Succinctly, SCORM is a ‘package of lessons’ bundled together so as to be understood by the LMS. What this means for educators is that, when placing OER materials into a repository, the correct ‘metadata’ (data about data) is required about the material, usually entered through a form to demarcate the type of materials and the subjects addressed.
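To make the ‘package of lessons’ idea concrete, the sketch below assembles a zip archive whose imsmanifest.xml tells an LMS what the package contains. This is a heavily simplified illustration only: a real SCORM manifest requires schema declarations and further metadata, and the file names and identifiers here are invented for the example.

```python
import zipfile

# A heavily simplified, illustrative SCORM-style manifest. A real
# imsmanifest.xml needs XML namespace/schema declarations and richer
# metadata; this sketch only shows the overall shape: an organization
# (the lesson structure) pointing at resources (the actual files).
MANIFEST = """<?xml version="1.0" encoding="UTF-8"?>
<manifest identifier="example-course">
  <organizations>
    <organization identifier="org1">
      <title>Example Lesson</title>
      <item identifier="item1" identifierref="res1">
        <title>Lesson 1</title>
      </item>
    </organization>
  </organizations>
  <resources>
    <resource identifier="res1" type="webcontent" href="lesson1.html">
      <file href="lesson1.html"/>
    </resource>
  </resources>
</manifest>
"""

def build_package(path: str, lesson_html: str) -> None:
    """Bundle the manifest and one lesson file into a SCORM-style zip."""
    with zipfile.ZipFile(path, "w") as pkg:
        pkg.writestr("imsmanifest.xml", MANIFEST)
        pkg.writestr("lesson1.html", lesson_html)
```

The key point for educators is the manifest: the LMS never guesses what is in the zip, it reads the manifest, which is why depositing OER materials always involves supplying this structured metadata.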
What are the archival (teaching) standards?
Many OER resources are likewise aligned with the teaching standards that may exist in different institutions or jurisdictions. The available resources are often assessed by peers for their utility, quality of explanation, or quality of technical interactivity. The value of this for educators is the certainty that OER resources are of high quality and currency and purposefully meet teaching challenges.

UCL Centre for Digital Humanities

This is a very progressive course in Digital Humanities and I would highly recommend it to students who want to study outside of Australia (well, there is no real option in Australia yet anyhow). And Simon and Melissa are really nice. Check them out!

Review: Seminar, Training, and Large Collaborative Projects, Lynne and Ray Siemens

I recently attended a seminar at UWS on Friday 26 April 2013 led by Lynne and Ray Siemens of the University of Victoria in Canada. The theme of the event was collaboration in the humanities and, in particular, how digital humanities projects exemplify effective collaboration in the broader humanities. This is because digital humanities projects often cross disciplines and geography, and traverse the often more demanding collaborative terrain between computer science, computational methods, and the humanities.

Lynne Siemens specialises in project management and team building. She stated that people aren’t always well trained to work together and outlined some of the positives and negatives of working in teams. She claimed that some people are better able to collaborate than others, often because they have developed the skills of listening, flexibility, negotiation, and compromise. Lynne described these as the ‘soft skills’ of effective collaborative teams. A team approach often produces more diverse and possibly higher quality ideas (and is a good way to learn new skills and perspectives), but some projects are better done individually (though, of course, some projects are beyond the scope and skill sets of any individual).

Lynne outlined some of the successful team interactions she had observed, partly through research she had undertaken through case studies. Good communication skills are vital, as is project management, and the ability to think across technology and the humanities and, indeed, culture and language. Also, the objectives of the team, the outcomes, and the individual tasks need to be clearly described, without too many grey areas that may become potential sources of conflict. And teams operate within institutional contexts, so there are certain contingencies to negotiate either within or between institutions. Still, one of the best ways to build teams is through casual conversations, lots of face-to-face meetings, and large bottles of rum (I put in the last one).

Ray Siemens is a Professor of Humanities Computing at the University of Victoria in Victoria, Canada, and is well known for his work in the Digital Humanities, in particular through founding the annual Digital Humanities Summer Institute (which I attended two years ago and which now attracts around 500 participants). He discussed the important work of the digital humanities, particularly around content modelling and computational analysis of content (a core form of scholarship within the field). He also discussed the typology of curriculum development in the digital humanities, whether through stand-alone degrees or through digital humanities inflected programs, and in particular the highly successful Summer Institute model.

DHSI (Digital Humanities Summer Institute) http://www.dhsi.org/

ETCL (Electronic Textual Culture Lab) http://etcl.uvic.ca/

Fitzroy, Melbourne, November, 2001: oral history archive

Milkbar: The Everyday City and Globalisation was a project that sought to uncover some of the stories and concerns of local residents of Fitzroy, an inner-city Australian community. The videos assembled here are part of a larger project on the subject completed in October 2002 (more details below).

Forty-four people within the suburb were interviewed on video with the purpose of creating a record of a local, inner-city community in a significant period of change, and to try to understand much of this change. It is an attempt to critically objectify historical change at a local level through an online oral history.
(This video is all the interviews stitched together. The individual videos with some contextual information are also on YouTube).
