ray
Rookie
Posts: 49
Post by ray on Feb 2, 2009 18:44:24 GMT -5
So, KITT, I'm willing to bet you've read the works of the Masters of Science Fiction (being the product of one yourself, as is almost every invention of the late 20th and what we have of the 21st century)...
Isaac Asimov, a great scientist in his own right, created the term "Robotics" when he coined his "Laws of Robotics". (If you don't know them, then, honestly, I don't know what they're teaching kids today in school!)
We know you're hardwired to obey the first law ("A Robot shall not harm, nor through inaction allow to come to harm, any human being"), and that you're likely programmed for the second law ("A robot shall obey a human being as long as it does not interfere with the first law") when it comes to Mike...
Finally, to my question, how do you feel about the "Zeroth Law", KITT?
Post by elphie on Feb 2, 2009 23:36:05 GMT -5
Well, first I think you should explain to the lowly, non-binary humans what the Zeroth Law is. I've only read "I, Robot". I go more for sword and sorcery, myself--you know, Tolkien and Brooks.
Post by ray on Feb 3, 2009 6:21:47 GMT -5
Which is why Google was made. But, for those who'd rather not do the research... The "Zeroth Law" is that a "robot shall not harm humanity as a whole, nor, through inaction, allow humanity as a whole to come to harm." Congrats on reading "I, Robot", BTW. Most people would only know the movie with Will Smith.
Post by elphie on Feb 3, 2009 11:03:18 GMT -5
Yeah, the movie's cool, but I don't know how the hell they got that out of the book. They're totally separate entities. So the Zeroth Law would have prevented VIKI, the positronic brain in the movie, from doing what she did, had she been programmed with it, correct?
Post by ray on Feb 3, 2009 20:40:39 GMT -5
No, VIKI did figure out the Zeroth Law on her own, and attempted to enforce it to an extreme. She would have protected all of mankind from all harm, no matter how small. Yes, that includes harm from that fatty cheeseburger you were about to grill up.
The movie "I, Robot" is likely how Isaac Asimov would have written a movie for today's audience if he were still with us. It had all the pieces of his style, in an action-oriented manner. (He'd written some action work, BTW, but mostly for children.) While a lot of "Purists" are upset over it, I see it as a very good homage to him.
Could have used a more original title, however. Or perhaps they could have adapted "The Caves of Steel" instead, an excellent read. Wouldn't have taken much more work, and Hollywood has been doing overpopulation stories for years ("Soylent Green", anyone?).
Post by elphie on Feb 3, 2009 23:31:17 GMT -5
It does smack of Asimov's style, no doubt about it. It just sort of bugged me that they claimed it was based upon that novel when it was nothing of the kind. So, VIKI was KARR's sister. I get it.
Post by ray on Feb 4, 2009 7:13:07 GMT -5
Not really. VIKI had humanity's "Best Interest" in mind, to the best of her ability to figure it out.
KARR just wanted to kill things and have "Fun". Said fun involving property damage, vehicular homicide, and killing cute puppy dogs and kitties so he could use their guts for track grease.
Post by elphie on Feb 4, 2009 10:40:26 GMT -5
*Laughs hysterically* True. KARR had no rules. Whoever programmed him had no idea what they were doing. VIKI was highly intelligent, smart enough to twist the Four Laws toward what she saw as our best interest. Yet another example of how badly thought-out AI can bite you in the ass. No offense, KITT. You rock.
Post by Mike on Feb 4, 2009 13:35:49 GMT -5
wow, i never even thought of this! but KITT isn't a robot, he's an AI in a car
Post by elphie on Feb 4, 2009 13:54:39 GMT -5
True, but we're kind of using "robot" for lack of a better term. He's much more complicated and compassionate than a robot. How about "sentient mechanical creature"? But for discussion's sake I'm gonna use "robot".
Post by ray on Feb 4, 2009 21:56:39 GMT -5
KITT is a "Robot" in the literal sense: he's a mechanical device that operates independently of a human. (Actually, the original term, from the Czech "robota", means "labour". But that's getting off track and just me showing off my wealth of useless trivia.)
The fact that he's an AI, rather than just a fixed series of operating instructions, means that he's even more of a Robot than what you see in factories. It's what makes him a Robot Person, rather than just a machine. After all, he has friends, and a Step-Sister (different "Mothers", after all).
He was concerned about whether he'd still be "himself" during the Halloween scare ("Try almost blowing up." "And we have a WINNAH!"), and that's pretty much the litmus test for being considered a "Person" in my book. He worries about his Soul. I considered him a person before that, mind, but that set it in concrete.
You going to get in on this, KITT? Would really like your opinion on this.
Post by elphie on Feb 5, 2009 9:12:36 GMT -5
I'd like to reference the movie I, Robot on this one: "There have always been ghosts in the machine...the bitter mote of a soul." I believe that, artificial synapses or not, personality makes a being, mentally at least a person.
Post by medi on Feb 7, 2009 10:36:11 GMT -5
there were 3 robotic laws not 4
Post by ray on Feb 7, 2009 16:03:40 GMT -5
"there were 3 robotic laws not 4"

Really now? Have you checked all your Asimov? Let's count:

0th Rule: Robots cannot harm humanity as a whole, or, through inaction, allow humanity as a whole to be harmed.
1st Rule: Robots cannot harm a human, or, through inaction, allow a human to come to harm, unless this breaks the 0th Rule.
2nd Rule: Robots must obey all orders given to them by a human, unless the order would break the 0th or 1st Rule.
3rd Rule: Robots must not harm themselves, or, through inaction, allow harm to come to themselves, unless this breaks the 0th, 1st, or 2nd Rules.

0, 1, 2, 3. That's the Four Rules of Robotics. Three Rules Safe, Four Rules potentially dangerous.
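For the programmers in the thread: the numbering matters because the rules form a priority chain, with the 0th outranking everything below it. Here's a quick Python sketch of that idea. It's purely my own toy model (the dictionary keys and the `action_allowed` helper are made up for illustration), obviously nothing like a real positronic brain:

```python
def action_allowed(action):
    """Check a proposed action against the Four Rules, highest priority first.

    `action` is a dict of flags describing the action's consequences.
    The first rule that is violated blocks the action; lower-priority
    rules are never even consulted.
    """
    rules = [
        ("0th", lambda a: not a.get("harms_humanity", False)),
        ("1st", lambda a: not a.get("harms_human", False)),
        ("2nd", lambda a: a.get("obeys_order", True)),
        ("3rd", lambda a: not a.get("harms_self", False)),
    ]
    for name, satisfied in rules:
        if not satisfied(action):
            return False, name  # blocked by this rule
    return True, None

# KARR, with no rules programmed, would skip every check entirely;
# VIKI's failure mode is deciding for herself what counts as "harms_humanity".
print(action_allowed({"harms_human": True}))  # → (False, '1st')
```

The whole debate about VIKI lives in that first lambda: the code only works if "harms humanity" is a well-defined flag, and it isn't.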
Post by elphie on Feb 7, 2009 17:00:54 GMT -5
The Zeroth Rule is far too open to interpretation to be programmed into a robot. Once a robot's AI fully develops, it can twist the rule to its own ends.