
Friday, September 11, 2020

The ethics of computational thinking

I used to teach philosophy to Year 7-8 students and really enjoyed posing ethical issues for discussion. Around 2017, driverless vehicles were being trialled at Christchurch Airport, and I used the dilemma of how the vehicles should be programmed. Unlike the Moral Machine activity today, I avoided stereotyping the potential victims of an accident because I felt the demands on the students' emotional capacity would be too high. There were so many home and caregiver variables, such as children living with grandparents, children in foster care, and children in impoverished situations, that I wanted to avoid comments or discussions which might cause harm to these vulnerable young people. My intention was rather to create opportunities for divergent thinking, so that the students came up with all the variables that would need to be programmed into the car. Victims were generalised as 'people' and in different sized groups. The most mature response I heard across a number of classes was from a girl who had experienced a series of traumatic events and loss in her short life. She said she would always choose to be injured if it would save someone else, because she had been happy with her life.

The black and white nature of most of the scenarios in the Moral Machine driverless car activity offended me. No issue can be so succinctly black and white. There would be few I would want to raise in a debate with young people, as the nature of any discussion would be based on placing a higher value on some people's lives than others. In fact, I shudder to think how some adults might make these judgments. I would certainly hope that the scientists programming the cars would not base their decisions on the popular vote, but instead use the evidence from what we know about accidents: speed, visibility, road conditions, construction of vehicles. In real time, I think it would be just as impossible for a driverless car to bring about a different outcome than a person-driven vehicle in situations where judgments would be made on the appearance of what was in front of the vehicle.

Ethics is firstly about causing no harm, and secondly about doing some good. As technology rapidly expands, there are so many ethical issues to consider. We need to be mindful that decisions are not based on greed, intolerance or bias. In the current pandemic situation we are fortunate that in New Zealand our medical system is coping with the demands of seriously ill people and limited technological resources such as ventilators. Within medicine, ethical issues have often been about access to expensive treatments and drugs; however, the development of technology has created another layer of challenge. In the process of preserving life, I wonder what the personal cost is for the professionals making impossible decisions.

Empowering learners, as examples showed today, must also be linked to ethical practice and moral purpose. The Technology Curriculum, which includes the new digital technology components, cannot just be about the creation of products, systems and tools. The impact on people must be a part of the learning process. The curriculum states that students also learn to consider "ethics, legal requirements, protocols, codes of practice, and the needs of and potential impacts on stakeholders and the environment." Compassion, tolerance, empathy, humanity - the Social Sciences - must be integral to teaching and learning programmes. Juan Enriquez raises the question of "who teaches us right from wrong?" In the rapidly changing technological world, this is a challenge in which we all have a part to play.

