Post by Hussar on Jul 8, 2004 20:28:22 GMT -5
Ok, probably everyone knows that this movie is coming out this week. And, I'm sure many of you know some of the basic premises. Particularly the Three Laws of Robotics:
Law One: A robot may not injure a human being or, through inaction, allow a human being to come to harm, unless this would violate a higher-order law.
Law Two: A robot must obey orders given it by human beings, except where such orders would conflict with a higher-order law.
Law Three: A robot must protect its own existence as long as such protection does not conflict with a higher-order law.
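(As a side note for the programmers here: the "higher-order law" clauses just make the laws a strict priority ordering. Here's a toy sketch in Python, purely illustrative; the harms_human / disobeys_order / harms_self flags are made-up stand-ins for the genuinely hard problem of deciding what counts as harm.)

def law_violations(action):
    # Tuple of violations, highest-order law first, so ordinary tuple
    # comparison gives Law One absolute priority over Law Two, and
    # Law Two priority over Law Three.
    return (
        action.get("harms_human", False),     # Law One
        action.get("disobeys_order", False),  # Law Two
        action.get("harms_self", False),      # Law Three
    )

def choose(candidate_actions):
    # Pick the action whose violations are lexicographically smallest.
    return min(candidate_actions, key=law_violations)

# Example: ordered into danger, the robot must obey, because a Law Two
# violation (disobedience) outranks a Law Three violation (self-harm).
actions = [
    {"name": "obey the order", "harms_self": True},
    {"name": "refuse the order", "disobeys_order": True},
]
print(choose(actions)["name"])  # -> "obey the order"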
Ok, fair enough. Now, we're talking about sentient robots here, robots that are self-aware. Admittedly, that's still the stuff of science fiction, but it's certainly possible in the foreseeable future.
My question is, is this slavery? These are cognizant beings, aware of their own existence. Isn't forcing these edicts upon them tantamount to slavery? What are the implications of this? Does sentience equate with human rights?
Post by Merkuri on Jul 9, 2004 4:42:37 GMT -5
Now, I don't believe that robots will ever be self-aware, at least not in the same way people are. The laws themselves, specifically Law Two, where robots must obey humans before protecting themselves, assume that robots are sub-sentient, or at least of a lesser order than human beings. If we assume that robots have all the mental capabilities of humans (in D&D terms, I'm talking Wis as well as Int; they've gotta have good judgement along with being able to crunch numbers) and are our intellectual equals, then yes, the three laws are slavery. But the laws do not assume this.

The laws are there to prevent a Terminator/Skynet type of coup where the robots rise up to exterminate us due to a judgement error or miscalculation. They assume the robots are powerful enough to do significant damage, yet not smart enough to handle that power in a responsible manner. So no, I don't think it's slavery, because I don't think robots will ever be sentient enough to be trusted without the three laws. And if they truly become sentient, then the three laws aren't needed.
On a side note, there's a little-known Zeroth Law that reads something along the lines of: A robot may not injure humanity, or, through inaction, allow humanity to come to harm.
Post by Galadon on Jul 9, 2004 11:34:31 GMT -5
Thinking along those lines, is your computer subject to slavery if it can talk to you?
In an episode of Star Trek: The Next Generation ("The Measure of a Man"), an officer came to the Enterprise and claimed Data was the property of Starfleet and had to report to the Daystrom Institute to be subjected to tests to see if they could make more copies of Data.
The question was: is Data a sentient being? Picard had to prove that Data was sentient.
If humans make another race of beings, robots, what are the requirements for a robot to be recognized as a sentient being?
Post by khyron1144 on Jul 24, 2004 0:50:17 GMT -5
The test used to determine whether an artificial intelligence is worth calling intelligence is the Turing Test: if you can hold a conversation with it and be unable to tell whether you're talking to a person or a computer, it passes.
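Just to make the setup concrete, here's a bare-bones imitation game sketched in Python (everything here is a made-up stand-in: the canned respondents and the coin-flip judge take the place of real free-form conversation and a real human interrogator):

import random

def human(question):
    # Stand-in: a real test would have a live human typing answers.
    return "Honestly, I'd have to think about that one."

def machine(question):
    # Stand-in: a machine passes precisely by mimicking the human.
    return "Honestly, I'd have to think about that one."

def judge(transcript):
    # A judge who can't tell the answers apart can only guess at random.
    return random.choice(["A", "B"])  # which seat holds the machine?

def turing_test(rounds=5):
    # Seat the two respondents anonymously as A and B.
    seats = {"A": human, "B": machine}
    if random.random() < 0.5:
        seats = {"A": machine, "B": human}
    questions = ["What does coffee taste like?"] * rounds  # sample probes
    transcript = [(q, seats["A"](q), seats["B"](q)) for q in questions]
    guess = judge(transcript)
    truth = "A" if seats["A"] is machine else "B"
    return guess != truth  # the machine "passes" if the judge picks wrong

print("machine passed:", turing_test())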
In a number of ways, Data from Star Trek would fail a Turing Test.
I think if we develop artificial intelligence worthy of the name intelligence, building laws as restrictive as Asimov's into it would be a form of slavery.
I'm not convinced that humans are really deserving of the label intelligent. Consider how much of our technological development is in terms of a greater capacity to kill. We name our technological ages (stone, bronze, iron, etc.) for what we made WEAPONS from. That's not intelligence. Intelligence is reaching a reasonable truce and balance with the rest of the life around us. Okay, it's a bit hippie, or a bit Buddhist, or otherwise difficult to swallow.
Post by Chahiero on Jul 24, 2004 2:38:32 GMT -5
Here's the juicy bits from a conversation I had on I, Robot:
(00:46:57) Iain: I, Robot is pretty good as well
(00:47:07) Micheal: I don't think I'm as keen on that one
(00:47:27) Micheal: Seems too visually reminiscent of The Matrix for my jaded tastes
(00:48:03) Iain: no... it's different
(00:48:27) Iain: or at least I found it different
(00:48:39) Iain: there are similarities of course... what with involving AI and such
(00:49:03) Micheal: I said visually reminiscent
(00:49:08) Micheal: as in it has a similar look.
(00:49:34) Iain: I didn't really notice a similar look
(00:49:36) Micheal: I also think that kind of story has been done better before.
(00:49:44) Micheal: (cough) Space Odyssey (cough)
(00:49:57) Iain: no, HAL just went nuts
(00:50:02) Micheal: hehe
(00:50:04) Micheal: idn
(00:50:21) Micheal: I've seen way too many "eep! computers are taking over the world!" movies
(00:50:27) Iain: well, I, Robot put a lot of stress on those '3 Rules of Robotics' you've probably already heard of
(00:50:35) Micheal: Good old Isaac Asimov
(00:50:39) Iain: exactly
(00:50:56) Micheal: yeah, but I've heard that it's a bit too much of a focus on that one thing
(00:51:07) Iain: yeah
(00:51:15) Micheal: I haven't seen the movie so I will reserve judgement, but that's what I've heard
(00:51:17) Iain: there's quite a bit, and on some occasions there's a little too much
(00:51:33) Iain: it unnecessarily builds up to the whole 'I told you so'
(00:52:04) Micheal: Yeah
(00:52:10) Micheal: and I HATE that in a movie.
(00:52:16) Iain: there are points where you wonder how somebody so intelligent can still be grasping at straws like that
(00:52:17) Micheal: It insults my intelligence, I feel.
(00:53:09) Micheal: idn
(00:53:21) Iain: "Oh, a robot can't kill, it's against those 3 laws."
(00:53:38) Iain: "Well, all signs point to yes, so perhaps there's something wrong with your reasoning"
(00:53:58) Micheal: Laws cease to have value when they are not being implemented.
(00:54:23) Iain: it's more of a "there's always a way around it" thing
(00:55:15) Micheal: What I meant is you cannot use such a law as a basis to say that an event is impossible when such an event is occurring DUE TO the lack of that law's proper implementation,
(00:55:25) Micheal: which therefore creates loopholes that can be exploited.
(00:55:55) Iain: well, those 3 rules are supposedly hardwired into the robots; it's impossible for them to circumvent them
(00:56:29) Micheal: Yet the rules are not thoroughly implemented; therefore there are ways for a robot to exploit loopholes.
(00:56:39) Micheal: The premise of the movie hangs on that base assumption.
That addresses the rules, anyway. I'll have to wait for Iain to get back on again to get to the "slavery" issue.
Post by Hussar on Jul 24, 2004 19:26:56 GMT -5
Well, that's the point, isn't it? If a robot, or a computer for that matter, is sentient (of course, proving sentience is a whole other ball of wax, but let's just assume it for the moment, shall we?), then hardwiring in behavioural constraints is tantamount to slavery.
I mean, if you have given the machine enough intelligence to make it self-aware, does it not follow that it should also have the rights of humanity? Which includes the right to be bad.
Post by Chahiero on Jul 25, 2004 11:24:04 GMT -5
But is self awareness all there is to being human, or is there more?
I'd argue more.
Post by Merkuri on Jul 26, 2004 21:27:53 GMT -5
It's not so much slavery, IMO, as mind control (assuming that these hypothetical robots have whatever it takes to be the "spiritual" equal of human beings, i.e. sentience). Anyone ever see or read A Clockwork Orange? It's the same idea. The main character in A Clockwork Orange is brainwashed so that he cannot be violent, not even to save his own life. Admittedly, he was a pretty nasty person beforehand, but like Hussar said, humans have the right to be bad, though they should pay the consequences. Inflicting restraints on robots would be the same as brainwashing a human (although the three laws are a step worse, because they include the law of obedience, which can force action rather than just inhibiting it).

If robots are our equals, which would be very hard to prove (never mind create), then yes, they should be given the same rights as we have, which includes the right to be violent and evil. They should then, however, be subject to all the same laws as humans and face the same penalties (or robotic substitutions for them) that we do.
Post by Black Robed One on Aug 3, 2004 13:23:48 GMT -5
Hussar wrote: "Ok, fair enough. Now, we're talking about sentient robots here, robots that are self-aware. Admittedly, that's still the stuff of science fiction, but it's certainly possible in the foreseeable future. My question is, is this slavery? These are cognizant beings, aware of their own existence. Isn't forcing these edicts upon them tantamount to slavery? What are the implications of this? Does sentience equate with human rights?"

We, humans, make machines to serve us. That's pretty much it. If we ever manage to make a not only artificially intelligent but truly self-aware robot (or computer, for that matter), what we do to it/them will be slavery. Oh, I don't doubt that we will create some nicer name for this, calling these robots "Servants of Humanity" or "Humanity's Artificial Friends", but effectively these robots will become our slaves. And needless to say, if there is ever a mistake or a bug in the robots' "slave program", the results may well turn out to be dire...

All of this actually brings Bubblegum Crisis 2032-2033 to mind. Yes, it's just a sci-fi animated series, but one with a lot of thought put into it.

Merkuri wrote: "If robots are our equals, which would be very hard to prove (never mind create), then yes, they should be given the same rights as we have, which includes the right to be violent and evil. They should then, however, be subject to all the same laws as humans and face the same penalties (or robotic substitutions for them) that we do."

You are definitely right, Merkuri, but the question is... Will we give our self-aware robots the rights they deserve? Or will we prefer to use them as a cheap labor force, without any rights at all? Personally, I think the latter is far more likely...
Post by nonameapparent on Aug 3, 2004 14:09:03 GMT -5
I'll step in and be the cynic here... What I thought was: if you've created a sentient robot, wouldn't it be yours to do with as you please? Because technically it is your property. Or would it be like with animals, where you can't kick your dog in front of people without risking getting the police at your doorstep?
Anyway, I agree with bro that if we build sentient robots, we would probably be no different than the Spanish with the Incas. Anything that is not considered human, sentient or not (it doesn't really matter, I think), can be treated as one pleases.