Engineering Ethics Into A Robot
[Video: Cindy and Transport in action in a preliminary lab test.]
How to engineer ethics into a robot
A decision-making process that mimics what humans tend to do in morally challenging situations may be the answer to engineering ethics into a robot. The robot would first recognize morally charged situations, then deploy reasoning strategies that draw on moral principles, norms, and values. This corresponds to the third kind of ethical agent in Prof. James H. Moor's taxonomy, the "explicit ethical agent," which Moor describes as follows:
- Explicit ethical agents can identify and process ethical information about a variety of situations and make sensitive determinations about what should be done. In particular, they are able to reach "reasonable decisions" in moral dilemma-like situations in which various ethical principles are in conflict.
Scheutz has argued that current technological advances in robotics and artificial intelligence have enabled the deployment of autonomous robots that can make decisions on their own. However, most of the robots currently deployed are fairly simple and their autonomy is limited. Because their decision-making algorithms take no moral aspects into account, they carry the potential to become harmful machines. EE Times asked Scheutz what exactly his fear is.
"If we do not endow robots with the ability to detect and properly handle morally charged situations in ways that are acceptable to humans, we will increase the potential for harm and human suffering unnecessarily," he says, "for autonomous robots will then inevitably make decisions that we deem 'morally wrong,' e.g., failing to provide a patient with pain medication when it was warranted."
In his paper "Think and Do the Right Thing -- A Plea for Morally Competent Autonomous Robots," Scheutz argues that robot developers can take steps to mitigate the problem of morally charged situations that social robots deployed in human societies will face. EE Times asked Scheutz what these steps could be.