Comments
Sanjib.A
User Rank
CEO
Interesting but immensely difficult
Sanjib.A   7/14/2014 11:36:40 PM
NO RATINGS
"Designing autonomous, morally competent robots may be inspiring and fascinating, but it certainly will not be easy."...

I agree completely. This is a fascinating research topic, but it seems like an impossible task; I'm not sure how it could be achieved in the near future. What seems impossible to me is building "emotion" into the AI. If you have watched the movie I, Robot, it is easier for me to explain. It is the very reason Del Spooner (Will Smith) hated robots: when he had a car accident along with his 12-year-old daughter Sarah, an NS-4 model robot saved him instead of her, because the NS-4 calculated a 45% chance of his survival versus an 11% chance of his daughter's survival.
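For what it's worth, the NS-4's cold calculus in that scene reduces to a one-line decision rule. A minimal sketch (the names and probabilities are just the movie's; this is not any real control code):

```python
def choose_rescue(candidates):
    """Pick the candidate with the highest estimated survival probability.

    candidates: dict mapping name -> estimated probability of survival.
    A purely logical rule: no weighting for age, relationship, or emotion.
    """
    return max(candidates, key=candidates.get)

# The accident scene as described above:
saved = choose_rescue({"Del Spooner": 0.45, "Sarah": 0.11})  # -> "Del Spooner"
```

The entire "emotion" debate in this thread is about whether that `key` function should ever be anything other than raw survival probability.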

 

mithrandir
User Rank
Rookie
Re: Interesting but immensely difficult
mithrandir   7/15/2014 12:15:59 AM
NO RATINGS
A very interesting series I watched on AI morality is Ghost in the Shell: SAC, an idea way ahead of its time, and a different outlook from most 'Western' (no offence meant) concepts of robot morality.

It involved multiple autonomous semi-tanks called 'Tachikoma' that shared a single consciousness, synced among all of them every night. Definitely worth a watch, but it can be a bit of an investment in time. :)

I always feel fiction is a great place to pick up hints on topics like these, especially a lot of Asimov's works.

 

Susan Fourtané
User Rank
Blogger
Re: Interesting but immensely difficult
Susan Fourtané   7/16/2014 9:36:01 AM
NO RATINGS
mithrandir, 

I will need to check out that series, Ghost in the Shell: SAC. Thanks for mentioning it. :) 

Yes, I agree. Fiction is a great place for discussing present technology that once was only fiction, and also for seeing what's coming next.

-Susan

 

Susan Fourtané
User Rank
Blogger
Re: Interesting but immensely difficult
Susan Fourtané   7/16/2014 9:03:48 AM
NO RATINGS
Sanjib, 

Ahhh, you are anticipating one of my next articles: building emotion into AI. Both building emotion and building ethics, which we are discussing now, are challenging, I believe, as human emotions and ethics are so often conflicting and inconsistent, far from perfect.

Yes, I see your point referring to I, Robot. Also, have you seen Spielberg's AI: Artificial Intelligence? That's another good reference when discussing this type of research. 

"an "NS-4 model" robot saved him instead of his 12-year-old daughter, as the NS-4 analyzed a 45% chance of his survival vs. an 11% chance of his daughter's survival."

That's a great example you bring up here. :)

The NS-4's decision was based on logic, according to its analysis, rather than on emotion. Will Smith's character was driven by a negative emotion, hate, as a consequence of the experience. For the NS-4, saving one human instead of letting two die was the best moral decision. 

What do you think? Did NS-4 make the right decision?

-Susan

 

Sanjib.A
User Rank
CEO
Re: Interesting but immensely difficult
Sanjib.A   7/16/2014 1:42:47 PM
NO RATINGS
@Susan: "What do you think? Did NS-4 make the right decision?"

Thinking logically...yes. Thinking emotionally...no!! :)

Yes, I have watched Spielberg's "AI" multiple times. It is a beautiful movie! In that movie, human emotion was implemented in the robot, which ultimately was not accepted by humans.

 

Susan Fourtané
User Rank
Blogger
Re: Interesting but immensely difficult
Susan Fourtané   7/17/2014 6:51:58 AM
NO RATINGS
Sanjib, 

"Thinking logically...yes. Thinking emotionally...no!! :) "

Ahh, the eternal problem between logic and emotion. :) There is no simple answer. But choosing emotionally in the case of the NS-4 would have let both father and daughter die. 

There are cases when choosing logically is the right answer. Why would it have been better if the NS-4 had saved the daughter rather than the father?

When I watched AI (several times as well), I thought of the many things that have to be considered when engineering emotion into an AI. We'll see what the researchers say about this. :) 

-Susan

Chrisw270
User Rank
Rookie
Why "Robot" Ethics?
Chrisw270   7/15/2014 7:56:49 AM
NO RATINGS
Surely the robots will make these ethical decisions based on policies and procedures that humans have devised? I don't see how having these actions carried out by robots rather than human operators really changes anything. What's important is transparency: the policies implemented by the robots need to be publicly accessible and subject to legal challenge, not just hidden away in the computer code.

zeeglen
User Rank
Blogger
Re: Why "Robot" Ethics?
zeeglen   7/15/2014 9:38:48 AM
Congratulations on your purchase of the most ethical robot software of the highest degree!

Please to select operational mode:

a) Politician

b) Salesman

c) Lawyer

d) Other

LarryM99
User Rank
CEO
Re: Why "Robot" Ethics?
LarryM99   7/15/2014 2:01:39 PM
NO RATINGS
Designing an ethical code is easy (example: the Ten Commandments). The difficulty lies in designing the exceptions to that code. Zeeglen, your list has merit (although it also offers a pretty tilted set of options). Asimov's Three Laws recognize conflicting situations rather than simple absolutes, but I am sure that in Lawyer mode there could be a lot of room for interpretation.

Larry M.

joondan
User Rank
Rookie
Re: Why "Robot" Ethics?
joondan   7/20/2014 10:44:58 PM
NO RATINGS
Making excuses and loopholes for sin is easy; it's obeying the law of God's righteousness that's the hard part.  Going beyond excuses and deliberately designing exceptions to a holy law - well, you make the bed you sleep in.  

The 10 Commandments is a good foundation, but it took thousands of years and God's own Son to sum it all up with loving God first and then loving your neighbor as yourself; and it took crucifixion itself to live up to it.

Military robot ethics falls under the latter rule about loving your neighbor. I'm pretty sure most people don't love themselves by having robot armies breaking down their doors and flying drones sniping off their loved ones, but if that's how we treat our international neighbors... well, that's not my bed.

David Ashton
User Rank
Blogger
Re: Why "Robot" Ethics?
David Ashton   7/20/2014 8:38:05 PM
NO RATINGS
@Zeeglen... Many a true word is spoken in jest. If we could engineer some ethics into the human race, that would be a good start. :-)

zeeglen
User Rank
Blogger
Re: Why "Robot" Ethics?
zeeglen   7/20/2014 8:50:59 PM
NO RATINGS
Thanks, David. Yes, I did post that tongue-in-cheek, along the lines of Max's recent topic of user manual translations. Humans teaching ethics?

Susan Fourtané
User Rank
Blogger
Re: Why "Robot" Ethics?
Susan Fourtané   7/16/2014 10:15:16 AM
NO RATINGS
Chris, 

"I don't see how having these actions carried out by robots rather than human operators really changes anything."

In a war scenario, some actions carried out by robots instead of humans may be beneficial. If you send an autonomous vehicle with the capacity to make decisions to a war zone to deliver supplies, for instance, instead of a regular vehicle driven by a human, then the human driver can be assigned a different task where a human presence is more needed. 

What other actions do you have in mind?

-Susan

Susan Fourtané
User Rank
Blogger
Re: Why "Robot" Ethics?
Susan Fourtané   7/16/2014 10:23:27 AM
NO RATINGS
Chris, 

"Roboethics" because it refers to the human ethics that robot designers, manufacturers, and users need to have when designing, manufacturing, or interacting with a robot. It's not about the ethics of the robot, but the ethics of the humans, which need to be in place. 

We are going to come back to this soon. 

-Susan 

chanj0
User Rank
Manager
Emotion?
chanj0   7/18/2014 1:34:22 PM
NO RATINGS
Emotion can cloud judgment. Even humans try to leave emotion aside when making critical decisions, so why would we want to introduce emotion into robots? However, if we envision robots being able to learn and adapt, teaching robots emotion becomes inevitable. Looking further, I question what happens when emotion is introduced into a robot. Will it think for itself and believe it is the ultimate being of the world? It sounds like The Terminator to me now. ;)

Wnderer
User Rank
CEO
I don't get it.
Wnderer   7/20/2014 10:27:12 PM
NO RATINGS
How is this an ethics question?

"In a practical scenario, an autonomous, morally competent medical transport acting should be able to determine if changing its route from checkpoint Alpha to checkpoint Beta is the best way to achieve its goal"

An ethics question would be about what the goal is. Is the goal delivering medical supplies to a disaster site or delivering a bomb to a school bus full of children?

 

AZskibum
User Rank
CEO
Re: I don't get it.
AZskibum   7/20/2014 11:04:17 PM
NO RATINGS
In the scenario described, it was given that the goal was to deliver medical supplies to a disaster site or battlefield. But there is still an ethical question about the robot's goals and its ability to decide whether changing its route from checkpoint Alpha to checkpoint Beta is the best way to achieve those goals. Suppose one route choice has a higher probability of saving the greatest number of lives, but the other has a higher probability of saving certain VIPs?
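That dilemma can be made concrete: once each route is scored by expected lives saved, the ethics hides entirely in how (or whether) the VIPs are weighted. A hypothetical sketch, with route names and numbers invented purely for illustration:

```python
def route_score(p_success, lives_at_stake, vip_weight=1.0, vips=0):
    """Expected lives saved along a route.

    vip_weight is the explicit (and contestable) ethical knob: at 1.0,
    every life counts equally; above 1.0, each VIP counts extra.
    """
    return p_success * (lives_at_stake + (vip_weight - 1.0) * vips)

# Route Alpha: likelier to save more people overall.
alpha = route_score(p_success=0.8, lives_at_stake=10)
# Route Beta: fewer lives at stake, but two VIPs weighted triple.
beta = route_score(p_success=0.9, lives_at_stake=4, vip_weight=3.0, vips=2)

best = "Alpha" if alpha >= beta else "Beta"
```

With `vip_weight=1.0` the comparison is pure head-counting; raising it quietly changes which route "logic" selects, which is exactly why such parameters should be publicly accessible rather than hidden in the code.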

Susan Fourtané
User Rank
Blogger
Re: I don't get it.
Susan Fourtané   7/25/2014 5:00:33 AM
NO RATINGS
Wnderer,

You read/copied only part of the sentence. The full sentence is as follows: 

"In a practical scenario, an autonomous, morally competent medical transport acting should be able to determine if changing its route from checkpoint Alpha to checkpoint Beta is the best way to achieve its goal of delivering supplies to a disaster area or the battlefield."

"An ethics question would be about what the goal is."

The goal is there: to deliver supplies to a disaster area or the battlefield.

-Susan

Wnderer
User Rank
CEO
Re: I don't get it.
Wnderer   7/25/2014 8:23:22 AM
NO RATINGS
@Susan Fourtané: "The goal is there: to deliver supplies to a disaster area or the battlefield."


I understand that, but because the goal is there, I don't see any moral question. Deliver the supplies. My guess is that you meant to indicate that the robot is supposed to perform some sort of battlefield triage: decide who can't be saved, who can wait, who needs immediate help, and in what order to treat the casualties to save the most lives. Even here I don't see any ethical dilemmas.
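Whether or not one sees a dilemma there, the triage ordering itself encodes an ethical choice the moment a sort key is picked. A hypothetical sketch (the casualties, fields, and numbers are invented for illustration):

```python
# Each casualty gets an urgency level and an estimated chance of
# survival if treated; the ethics sits in the chosen sort key.
casualties = [
    {"name": "A", "urgency": 2, "p_survival_if_treated": 0.9},
    {"name": "B", "urgency": 3, "p_survival_if_treated": 0.1},  # urgent, likely unsavable
    {"name": "C", "urgency": 3, "p_survival_if_treated": 0.7},
]

# Key: most urgent first, ties broken by best survival odds if treated.
order = sorted(
    casualties,
    key=lambda c: (-c["urgency"], -c["p_survival_if_treated"]),
)
names = [c["name"] for c in order]  # -> ["C", "B", "A"]
```

Note that under this key the likely-unsavable B is still treated before the stable A; a "maximize lives saved" key would demote B. Two defensible keys, two different orderings: that gap is where the moral competence the article discusses would have to live.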


