@Susan Fourtane The goal is there: to deliver supplies to a disaster area or the battlefield.
I understand that, but because the goal is given I don't see any moral question. Deliver the supplies. My guess is that you meant to indicate that the robot is supposed to perform some sort of battlefield triage and decide who can't be saved, who can wait, who needs immediate help, and what order to treat the casualties in to save the most lives. Even here I don't see any ethical dilemmas.
You read/copied only part of the sentence. The full sentence is as follows:
"In a practical scenario, an autonomous, morally competent medical transport acting should be able to determine if changing its route from checkpoint Alpha to checkpoint Beta is the best way to achieve its goal of delivering supplies to a disaster area or the battlefield."
"An ethics question would be about what the goal is."
The goal is there: to deliver supplies to a disaster area or the battlefield.
In the scenario described, it was given that the goal was to deliver medical supplies to a disaster site or battlefield. But there is still an ethical question about the robot's goals and its ability to decide whether changing its route from checkpoint Alpha to checkpoint Beta is the best way to achieve those goals. Suppose that one route choice has a higher probability of saving the greatest number of lives, but the other choice has a higher probability of saving certain VIPs?
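To make the dilemma concrete, here is a toy sketch of how such a route choice might be weighed numerically. All the numbers, route names, and the `priority_weight` parameter are hypothetical illustrations, not from the original scenario; the point is only that the "right" answer flips depending on how much extra weight the designer assigns to certain lives, which is exactly the ethical question.

```python
# Hypothetical route data: success probability and lives reachable on each
# route. "priority_lives" is the count of VIPs among them. Illustrative only.
routes = {
    "Alpha": {"p_success": 0.9, "lives_at_stake": 20, "priority_lives": 0},
    "Beta":  {"p_success": 0.7, "lives_at_stake": 5,  "priority_lives": 3},
}

def expected_lives(route, priority_weight=1.0):
    """Expected (weighted) lives saved; priority_weight > 1 encodes a VIP bias."""
    r = routes[route]
    ordinary = r["lives_at_stake"] - r["priority_lives"]
    return r["p_success"] * (ordinary + priority_weight * r["priority_lives"])

# With equal weighting, Alpha clearly wins; inflate the VIP weight enough
# and Beta wins instead.
print(expected_lives("Alpha"))                       # 18.0
print(expected_lives("Beta"))                        # 3.5
print(expected_lives("Beta", priority_weight=8.0))   # 18.2
```

The arithmetic is trivial; the ethics lives entirely in choosing `priority_weight`, and no amount of autonomy in the robot settles that choice for us.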
Making excuses and loopholes for sin is easy; it's obeying the law of God's righteousness that's the hard part. Going beyond excuses and deliberately designing exceptions to a holy law - well, you make the bed you sleep in.
The Ten Commandments are a good foundation, but it took thousands of years and God's own Son to sum it all up with loving God first and then loving your neighbor as yourself; and it took crucifixion itself to live up to it.
Military robot ethics falls under the latter rule about loving your neighbor. I'm pretty sure most people don't love themselves by having robot armies break down their doors and flying drones snipe off their loved ones, but if that's how we treat our international neighbors... well, that's not my bed.
"In a practical scenario, an autonomous, morally competent medical transport acting should be able to determine if changing its route from checkpoint Alpha to checkpoint Beta is the best way to achieve its goal"
An ethics question would be about what the goal is. Is the goal delivering medical supplies to a disaster site or delivering a bomb to a school bus full of children?
Emotion can cloud judgement. Even humans try to set emotion aside when making critical decisions, so why would we want to introduce emotion into robots? However, if we envision robots being able to learn and adapt, teaching robots emotion becomes inevitable. Looking further, I question what happens when emotion is introduced to a robot. Will it think for itself and believe it is the ultimate being in the world? It sounds like Terminator to me now. ;)