Tuesday 20 March 2007

iBot’s Ethical Dilemma

Designing the Robot Teaching Assistant raised a number of ethical issues. In yesterday's lecture, Russell discussed the ethical issues that we, as Computer Scientists, face when designing systems.

In relation to our Robot Teaching Assistant, we felt it was important to highlight the ethical issues associated with robots.

According to a recent report on BBC News (http://news.bbc.co.uk/1/hi/technology/6432307.stm), South Korean scientists have begun drawing up an ethical code to protect robots from being abused by humans and vice versa.

This ethical code is based on the three laws set out by Isaac Asimov:

- A robot may not injure a human being, or, through inaction, allow a human being to come to harm

- A robot must obey the orders given it by human beings except where such orders would conflict with the First Law

- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law

A similar set of principles was also popularized in the film I, Robot, starring Will Smith.

One of the key assumptions in our overall design is that the Robot Teaching Assistant follows these principles. In making this assumption, we should also appreciate the associated problems, such as whether iBot can reliably distinguish humans from similar-looking things such as "...chimpanzees, statues and humanoid robots" (BBC News, 2007; see URL above).
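To make the priority ordering of the three laws concrete, here is a minimal sketch in Python of how iBot might check an action against them, in strict order, before carrying it out. All of the names here (Action, is_human, permitted) are hypothetical, invented purely for illustration, and the confidence threshold is a crude stand-in for the human-recognition problem noted above; it is not a real robot control API.

```python
from dataclasses import dataclass

@dataclass
class Action:
    harms_human: bool       # would carrying this out injure a human?
    ordered_by_human: bool  # was this ordered by a recognised human?
    endangers_robot: bool   # would this damage the robot itself?

def is_human(confidence: float, threshold: float = 0.9) -> bool:
    """Crude stand-in for the perception problem the BBC article raises:
    telling humans apart from chimpanzees, statues and humanoid robots.
    `confidence` would come from some hypothetical vision system."""
    return confidence >= threshold

def permitted(action: Action) -> bool:
    # First Law outranks everything: never harm a human.
    if action.harms_human:
        return False
    # Second Law: obey human orders (already known not to violate Law 1).
    if action.ordered_by_human:
        return True
    # Third Law: self-preservation applies only once Laws 1 and 2 are met.
    return not action.endangers_robot

# Example: an order from someone the vision system is only 60% sure is
# human arguably should not count as a human order at all.
order = Action(harms_human=False,
               ordered_by_human=is_human(confidence=0.6),
               endangers_robot=False)
print(permitted(order))  # True, but not *because* of the order
```

Even this toy version shows why the recognition problem matters: if is_human misfires, the Second Law either ignores legitimate orders or obeys things that are not human at all.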

Another issue worth highlighting is the Robot Teaching Assistant's access to the School Database. This raises data-protection questions: should iBot be allowed to access the children's personal information and medical records? One counter-argument is that iBot should be regarded as a member of staff at the school, rather than as an outside entity.
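If iBot were treated as a member of staff, one way to make that concrete is role-based access control, where holding a staff role grants access only to the record categories the policy explicitly allows. The sketch below is purely illustrative; the roles, record types and policy table are our own assumptions, not a real school-database schema.

```python
# Hypothetical staff roles; "robot_assistant" is the role we imagine
# the school granting iBot.
STAFF_ROLES = {"teacher", "nurse", "robot_assistant"}

# Which roles may read which categories of pupil data. Note that even
# staff status is not blanket access: only the nurse sees medical records.
SCHOOL_POLICY = {
    "contact_details": {"teacher", "nurse", "robot_assistant"},
    "medical_records": {"nurse"},
}

def may_access(role: str, record_type: str) -> bool:
    """Only recognised staff roles get any access at all, and then only
    to the record categories the policy explicitly grants them."""
    if role not in STAFF_ROLES:
        return False
    return role in SCHOOL_POLICY.get(record_type, set())

print(may_access("robot_assistant", "contact_details"))  # True
print(may_access("robot_assistant", "medical_records"))  # False
```

The point of the sketch is that "iBot is staff" need not settle the data-protection question: the staff role itself can be scoped so that iBot never sees medical records, whatever we decide about the rest.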
