A.I.: Artificial Intelligence

By sventhatcher in Culture
Tue Jul 03, 2001 at 05:21:12 PM EST
Tags: Movies (all tags)

The new Kubrick-inspired Spielberg film may fall far short of scientific accuracy, but it raises a number of interesting questions about our humanity and how society would interact with A.I.

The main question is posed in the movie directly:

If we could design a robot who could love, would it be possible for the owner to reciprocate that love?

(Warning: Spoilers)


A.I. is the latest in a long line of films to tackle the interaction between humans and machines possessing intelligence and/or emotion, which makes this a good time to bring up some of the moral and ethical issues that revolve around the premise.

In A.I., David is the first child programmed to be capable of experiencing love. His creator(s) believe that emotion will be the key to unlocking a deeper level of intelligence. It appears they were right, because once David's love-function is activated he slowly becomes more human. He develops survival instincts, he has desires, and in the end he even has dreams.

Could humanity ever learn to feel emotion towards a machine?

A.I. seems to answer no to this question. His mother abandons him because, though she feels strongly for the childlike aspects of David, no matter how many qualities of humanity he exhibits she cannot fully forget the fact that he is not, like her, organic. His creator loves David in the way that I might love a particularly brilliant comment I've made: his care for David is simply ego-stroking. Joe and Teddy actually seem to care for David more genuinely than anyone else in the film, and they aren't even programmed for emotion.

If we as a society were ever able to advance A.I. research to the point of a creation like David, I find it unlikely that our reactions would be any different. Humans have consistently shown fear and prejudice toward things that are new and different, as portrayed by the Flesh Fair in A.I. Regardless of how realistic and accurate the simulation of humanity may be, as long as we're aware that the entity is not organic, that recurring fear of the unknown will haunt us.

A topic that has been less explored, in my opinion, and which A.I. also tackles, is how such an intelligence would react to a world that refuses to accept it.

The actions of David, which probably make up the strongest point of criticism of A.I., are decidedly mechanical. He acts with single-minded devotion and idealism to obtain the return of the love that dominates his existence, persisting even in the face of what we would consider impossible odds. Only for a brief moment does David give in, once he sees the many copies of himself. At this point he immediately tries to destroy himself, not because the lack of uniqueness matters in itself, but because it shatters his illusion that he is special and that his specialness could somehow eventually lead to gaining the love he so greatly desires.

Sci-fi in general usually takes a slightly different approach to this issue. The most common story seems to be of the robots resenting humanity and eventually turning on them. I find this an unlikely situation in reality if machines truly gain emotion and independent thought. The reason is not due to some sort of Asimovian restrictions (undoubtedly the system would be far too complex to apply such blanket concepts, especially since such entities blur the lines of humanity), but due to fear. If a being is capable of self-preservation and emotion, then it will naturally fear its own death, or the electronic mind's equivalent of pain. Fear of punishment is what I'm talking about, of course. Law enforcement for robotic beings would undoubtedly be highly effective as long as they were considered robots and not humans, which brings me to the most relevant issue.

What does it mean to be alive?

If a being (mechanical or otherwise) thinks for itself, feels, and fears its own demise, is it not more or less alive? Does a sentient, feeling machine have the right to be considered human, or at least a citizen as opposed to property?

A.I. portrays my opinion best at the Flesh Fair, when David is brought forth to be destroyed and the crowd protests, because he fears his own death and pleads for his life. I believe strongly that anything that fears its own demise (if it's true emotion and not just a script) can be considered living. If such a being has an intelligence level at or beyond that of humanity, then I believe it should be given all the rights associated with being human. To not do so would be another form of slavery.

Sadly, if such technology ever develops, this is the way I see reality going. Fear of the unknown. Fear of a robot revolt. Reluctance to accept the new. All of these things add up to a society unwilling to accept what I would consider its newest members. So these beings, capable of thought along the same lines as you and me, are bought and sold as property, treated as nothing but objects. They can be destroyed, killed, on a whim. And hundreds of years later everyone will regret it terribly.

History does repeat itself.

Poll
Will we ever have sentient/feeling machines?
o Feeling, Not Sentient 2%
o Sentient, Not Feeling 9%
o Both 65%
o Neither 22%

Votes: 44



A.I.: Artificial Intelligence | 52 comments (47 topical, 5 editorial, 0 hidden)
Love Given (4.00 / 3) (#2)
by Devil Ducky on Tue Jul 03, 2001 at 01:34:26 PM EST

In the movie David was loved by a human. The reason his mother could not have him turned off/killed was that she loved him too much. Perhaps she should have thought of it as putting him out of his misery but that tends to get hard to do when love gets in the way. Damn chemical reactions!

It's also noteworthy that David was the first mecha that needed to be put out of his misery. With the positive emotions come the negatives.

Devil Ducky

Immune to the Forces of Duct Tape
Day trading at it's Funnest
I disagree (5.00 / 1) (#4)
by sventhatcher on Tue Jul 03, 2001 at 01:37:22 PM EST

I think his mother wanted very much to love him, because he seemed so real to her. She loved the child David could've been had he been real, but his artificial nature held her back from ever truly feeling emotion for him. It was the conception of him that she cared for. It was the reality she couldn't deal with.

[ Parent ]
I disagree (5.00 / 1) (#23)
by acronos on Tue Jul 03, 2001 at 07:47:07 PM EST

At the end of the movie his mother made it clear that she loved him. She said she had always loved him.

[ Parent ]
Assumptions (5.00 / 1) (#24)
by sventhatcher on Tue Jul 03, 2001 at 08:16:43 PM EST

You're assuming that the experiences he had with his mother really happened.

It's my belief that in all likelihood one of two things was going on there:

1. Everything post-Ferris wheel was David's dream.
2. The advanced robots of the future created the experience of being with his now-loving mother.

[ Parent ]
True (5.00 / 1) (#31)
by acronos on Wed Jul 04, 2001 at 07:03:10 AM EST

These are two real possibilities. There is definitely some ambiguity here.

The robots of the future said they could only bring her back for one day. That the "temporal distortions" only allowed a being to be brought back for one period of consciousness implies a soul to me. In any case, it also implies that it was the real mother. Had they just been fabricating a perfect world, why could it only last for one day?

I can see no solid problem with the dream scenario you mentioned. My impression was that the future world was real. My impression is that David was unlikely to suddenly dream up something after 2000 years of sitting there. Maybe as his circuits started to fail, new behaviors began to manifest. I don't know.

[ Parent ]
Is it real or is it Memorex? (4.20 / 5) (#6)
by Anonymous 242 on Tue Jul 03, 2001 at 01:44:41 PM EST

sventhatcher asked:
If a being (mechanical or otherwise) thinks for itself, feels, and fears its own demise, is it not more or less alive?
More importantly, how would we know if a being (mechanical or otherwise) thinks for itself, feels, and fears its own demise? Humans, it would seem, have a propensity to anthropomorphize nonhuman objects. People talk to their cars, their computers, and their pets as if these objects were conscious. While almost everyone in their right mind "knows" that these objects are not rational beings, we still feel that they are.

Given the nature of A.I., how would we know whether such a being were truly feeling, or just a better, more extravagant Eliza?
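
For reference, the mechanism behind Eliza really is that shallow: keyword matching plus canned templates that echo the speaker's own words back. Here is a minimal sketch in Python (the patterns are hypothetical, in the spirit of Weizenbaum's program rather than its actual script):

    import random
    import re

    # Each rule pairs a regex with canned reply templates; echoing the
    # speaker's own words back ("{0}") is most of Eliza's trick.
    RULES = [
        (re.compile(r"i feel (.*)", re.I),
         ["Why do you feel {0}?", "How long have you felt {0}?"]),
        (re.compile(r"i am (.*)", re.I),
         ["Why do you say you are {0}?"]),
        (re.compile(r"mother|father|family", re.I),
         ["Tell me more about your family."]),
    ]
    DEFAULT = ["Please go on.", "I see.", "Very interesting."]

    def reply(utterance: str) -> str:
        for pattern, templates in RULES:
            match = pattern.search(utterance)
            if match:
                return random.choice(templates).format(*match.groups())
        return random.choice(DEFAULT)

    print(reply("I feel afraid of being switched off"))
    # -> e.g. "Why do you feel afraid of being switched off?"

Nothing in there feels anything, yet from the outside it can be surprisingly hard to tell.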

LEE! (3.50 / 2) (#15)
by CaptainZornchugger on Tue Jul 03, 2001 at 04:27:03 PM EST

You're back! For a second there, I thought the site was going to fall apart without you.

No seriously, I did.


Look at that chord structure. There's sadness in that chord structure.
[ Parent ]
It's not really me (5.00 / 1) (#40)
by Anonymous 242 on Thu Jul 05, 2001 at 02:33:49 PM EST

You just think it is.

And that's the point, eh?

[ Parent ]

Ahoy matey, Spoilers ahead! (4.66 / 3) (#7)
by gridwerk on Tue Jul 03, 2001 at 01:55:47 PM EST

Why was this movie even made? "Blade Runner" already addressed all these issues, 20 years ago, and was a MUCH better movie. That was the first thing I thought after seeing it.

Personally, I did not like the movie. It brought up interesting questions and it looked amazing, but the storytelling itself was terrible. You can tell that Spielberg hasn't written a screenplay on his own since Close Encounters of the Third Kind. It seemed like Spielberg knew where he wanted to go; he just couldn't get there intelligently. Also, I thought the ending was pointless.

There is the Kubrick ending and the Spielberg ending. Obviously the Kubrick ending was when the carnival wheel eternally trapped the kid for 2000 years until he ran out of "batteries" (at least I think he did). This ending is great on so many levels. Kubrick would have accomplished his final statement in that you would never have known whether the kid was really a kid at the end. In fact, because of the eternal "loop" of wishing in which the kid seems to be stuck, it establishes that he was merely a program in search of Mommy, trying every possible logical step to attain the Mommy again. If Directive A cannot be reached without component C, then one must find component C to reach Directive A again. Since David finally came to the Blue Fairy, and the thing he was programmed to believe would grant him component C could not do so, he simply got caught in a loop and became stuck. Therefore, if this ending had been the finale, I would have believed all the more that David was still only an android and not a real boy.

And then there was the Spielberg ending. I won't even go too much into that. I just feel all the work that was done to produce such a convincing and lush environment was shattered the moment you hear "2000 years" blah blah. I would have to say this movie was just slightly better than Battlefield Earth. The only reason it's better is because of the Lord of the Rings trailer.



Obvious or Wishful? (5.00 / 3) (#9)
by momocrome on Tue Jul 03, 2001 at 02:55:23 PM EST

Obviously the Kubrick ending was when the carnival wheel eternally trapped the kid for 2000 years until he ran out of "batteries"

This is entirely incorrect. The far-future ending was part of the original Kubrick package. I notice that a lot of people are finding 'obvious' shortcomings in Spielberg's involvement, but I firmly believe the film was loyally executed to Kubrick's specifications.

I think the scene(s) set 2000 years forward might have been a little tough for some viewers, but this is perhaps the most important part of the film. It is where Kubrick asserts that 'artificial' souls are as substantial as organic souls. Showing David falling asleep with a smile on his face, after a (possibly synthesized) emotionally fulfilling experience with his mother, shows us that emotions are not functions of organic beingness but rather deep-rooted conflicts common to anything approaching sentience.

This film is a masterpiece. The visual effects, cinematography, performances, moral investigation and mythic proportions are all portrayed with spectacular skill. What I am able to discern from so many objections and negative opinions is that the viewing public actually requires the sugar coating of Hollywood. How fickle a bunch we all are! After noting the reaction of the public to this particular film, I no longer doubt the correctness of cynicism in Hollywood. An attitude of any other stripe is as good as a death certificate in this town.

"Give a wide berth to all that foam and spray." - - Lucian, The Way to Write History
[ Parent ]

The Kubrick ending... (3.33 / 6) (#11)
by ucblockhead on Tue Jul 03, 2001 at 03:23:07 PM EST

While it is true that the "2000 years later" thing was in the Kubrick treatment, it was not the cuddly ending that we saw in the movie. It was a decidedly darker story. In Kubrick's version, the mother character is an alcoholic who treats David as a servant, and the ending involves an artificial mother treating David as a servant...forever.

All that "just one day"..."I love you!" crap was Spielberg.
-----------------------
This is k5. We're all tools - duxup
[ Parent ]

The Script and Kubrick (5.00 / 1) (#25)
by sventhatcher on Tue Jul 03, 2001 at 08:20:08 PM EST

Keep in mind also that Kubrick was never satisfied with his ability to write A.I., and he had probably planned to hand the job over to Spielberg anyway.

I hardly think he's rolling in his grave, nor do I think the alcoholic mother and servant-David ending would've been nearly as compelling.

I'm slightly curious actually as to why the general response has been that the movie should've ended darkly. Why is a happy ending a bad thing?


[ Parent ]
It's not so much that it's happy (5.00 / 1) (#28)
by typhatix on Wed Jul 04, 2001 at 02:06:30 AM EST

It's that it feels shallow compared to the rest of the film. The film shifts focus and badly explains this part using almost horrid plot changes (only one day because of the nature of spacetime? blah). There are less hokey ways they could have shown him becoming more human-like in his emotion...



[ Parent ]
shallow? (5.00 / 1) (#37)
by khallow on Thu Jul 05, 2001 at 11:22:58 AM EST

It's that it feels shallow compared to the rest of the film. The film shifts focus and badly explains this part using almost horrid plot changes (only one day because of the nature of spacetime? blah). There are less hokey ways they could have shown him becoming more human-like in his emotion...

I don't know about this ending being "shallow" ("hokey" comes to mind :-). Definitely the Kubrick "dark" ending (as described here) was shallow. And you really couldn't stop at the ferris wheel cage thing, because a likely outcome at that point is that the police come pick up their vehicle (and David). Despite the "one day" band-aid on the plot, I was able to maintain a sufficient level of suspension of disbelief. And I think the ending wasn't bad, if a bit sappy.

[ Parent ]

one day (none / 0) (#50)
by goosedaemon on Wed Jul 11, 2001 at 06:52:23 PM EST

It wasn't so much one day. It was when they went to sleep and lost consciousness.

[ Parent ]
better? (5.00 / 1) (#42)
by ignatios on Fri Jul 06, 2001 at 03:38:17 AM EST

The only reason it's better is because of the Lord of the Rings trailer.

that's one of the funniest things i've read in a *long* time ... not because it's necessarily true, however ;-)

[ Parent ]

skunks (5.00 / 2) (#8)
by Locke on Tue Jul 03, 2001 at 02:42:54 PM EST

I saw a show a month or two ago on Animal Planet, or maybe the Discovery Channel. It was about a convention of sorts for owners of de-stinkified pet skunks. It was sort of like a dog show in that they gave blue ribbons for the best skunks in various categories and whatnot. They also had testimonials from many of the owners on what great pets skunks make.

The whole time I was watching it I was thinking: could I own a skunk and love and trust it in the same way I've loved cats and dogs? While all the interactions between the skunks and their owners were full of affection and what seemed like genuine love, I really couldn't see them as anything but wild animals. How could I ever understand something that is essentially wild? How could I ever come to trust a wild animal? A dog I can communicate with, empathize with, and understand. I can even communicate with and understand cats to a degree.

When I see a dog there is no mystery or fear. At the start of the movie you can definitely sense the fear that comes from having a mysterious it living in your house: the fear that Monica can't really be sure of how David will act or behave. How can he be trusted if he can't be understood?

The irony, to me anyway, is that in the end all he ever displayed were understandable human emotions. Jealousy, love, obsession, hate, fear, gullibility, anxiety, panic, sadness, happiness, etc. All traits that any human should be able to relate to and understand, but still they treat him as other. How fitting that the intelligent machines from the future look to us like aliens.



Bad science (4.50 / 2) (#10)
by spacejack on Tue Jul 03, 2001 at 03:07:08 PM EST

You can't make a movie about AI with such bad science and so many logical holes. This story has dated 1970s sensibilities about a modern technology that almost everyone these days is at least slightly familiar with (even if just by playing video games). Sometimes you can bend a plot or story idea until it lines up with technology sufficiently, but Pinocchio->A.I. just didn't work; he pushed it too far and the whole thing broke down. Fairy tale or techno-thriller? He couldn't make up his mind.

The biggest letdown for me was the development of "David". The movie implies they made robot after robot until they got one right. But this is completely dumb; "A.I." is a computer program and would evolve on a computer. Long before you arrived at "David", you'd be faced with the moral dilemma of killing all of these sentient(?), feeling(?) programs "living" on your PC as you worked out the bugs. Hopefully they wouldn't develop self-preservation instincts and try to escape onto the 'net in the meantime. Hopefully they wouldn't discover the corpses of their predecessors in the deleted sectors of your hard disk and get angry.
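
To make the point concrete, here is a toy sketch of what "evolving" a mind on a computer implies; the fitness function, population size, and numeric "candidates" standing in for candidate programs are all invented for illustration:

    import random

    def fitness(candidate: float) -> float:
        # Hypothetical measure of how "right" a candidate mind is.
        return -abs(candidate - 42.0)

    population = [random.uniform(0, 100) for _ in range(50)]
    for generation in range(100):
        population.sort(key=fitness, reverse=True)
        survivors = population[:10]
        # The moral dilemma lives on this line: the other 40 candidates
        # are simply deleted, every generation, as the bugs get worked out.
        population = survivors + [s + random.gauss(0, 1.0)
                                  for s in survivors for _ in range(4)]

    print(f"best candidate: {max(population, key=fitness):.2f}")

If the discarded candidates were anything like sentient, every pass through that loop is exactly the scenario described above.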

And WTF was with that opening statement: "I propose that we build a feeling robot that can love!" Okay then! Well, I propose that we make a faster-than-light spacecraft!

Bah, there were all kinds of neat things they could've done with the story and they dropped the ball at just about every opportunity.

And one final question: why was the bear smarter than the kid?!

The bear was smarter because... (5.00 / 1) (#20)
by anansi on Tue Jul 03, 2001 at 07:03:54 PM EST

He didn't have to maintain this psycho-killer emotion called 'love'. He just had to keep stitching himself together and looking cuddly. David's love made him do all kinds of non-robotic stupid things, in the same category as the stupid stuff humans do.

Don't call it Fascism. Use Mussolini's term: "Corporatism"
[ Parent ]

Screamers (5.00 / 1) (#21)
by dr k on Tue Jul 03, 2001 at 07:20:19 PM EST

For a minute I thought this was a discussion about Screamers, the 1995 film based on a Philip K. Dick story. Are there any sawblades in A.I.?
Destroy all trusted users!
[ Parent ]
RE: Bad Science (5.00 / 1) (#33)
by Anm on Wed Jul 04, 2001 at 10:03:39 AM EST

The movie implies they made robot after robot until they got one right. But this is completely dumb; "A.I." is a computer program and would evolve on a computer.
You assume the type of intelligence developed in an android can be developed independently of its body and all the feedback mechanisms that implies. While I won't speak for everyone, several in the AI community would argue against that, not least the Cog researchers.

Anm

[ Parent ]

P.O.S. (4.00 / 4) (#12)
by Nitesurfer on Tue Jul 03, 2001 at 03:37:57 PM EST

I liked Bicentennial Man (another movie based on the book or stories of a premiere sci-fi writer). Initially, robots are just programmed to do a task or set of tasks. They have very limited decision-making capabilities and cannot handle anything outside the bounds of their programming.

Obviously, at the start of the two movies, the robots' limited processing has evolved until they are almost able to act like real humans. But at what point do beings like this attain a consciousness that would basically give them a soul? They are not inanimate objects, but are they really people? At the Flesh Fair, Spielberg tried to HUMANIZE the mechas, but he did something just as bad as when we try to humanize characteristics within animals. For example, do you really think these machines would be upset about their impending destruction? Does your PC quiver when you shut it down? Does your fax machine get nervous that you might beat the crap out of it like the guys in the movie "Office Space"? The answer to all these questions should be: NO!!! Remember, by Dr. Hobby's own comments, the current robots had no feelings. Only with the NEW love programming would they feel emotions.

The evolution of robot to intelligent machine/person is much more believable in Bicentennial Man, as we actually saw him evolve.

David, rather, has no evolution before our eyes. He is programmed to love(?) and imprint on his new family. The rejection from his imprinted mother causes him to make decisions to try to be what his imprinted mother really wanted... to David, that is a REAL BOY. You could argue it, but I could write a simplistic program that says "I love my mother" and responds based on certain programmed contingencies. Simple, but it could be advanced to the degree David acts in the movie. Think about it: he sits in a hovercopter for two thousand years saying "I wish I was a real boy" over and over and over. Talk about your infinite loops. Not much learning occurred there... not much evolving either. Not too smart, sitting there for two thousand years when he could have tried to escape. Definitely shows a ROBOT not exactly relating to his environment.
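
A sketch of the simplistic program imagined above, with an obviously made-up stimulus-response table: a fixed lookup with a single fallback, which also reproduces the two-thousand-year loop, since nothing in it ever updates:

    # A scripted "love" program: a fixed stimulus -> response table.
    # Nothing is learned; the table is exactly as smart as its author.
    RESPONSES = {
        "who do you love": "I love my mother.",
        "what do you want": "I want to be a real boy.",
        "are you afraid": "Please don't let me be switched off.",
    }

    def mecha(stimulus: str) -> str:
        key = stimulus.lower().strip("?! ")
        return RESPONSES.get(key, "I wish I were a real boy.")

    # Two thousand years in front of the Blue Fairy, compressed:
    for _ in range(3):
        print(mecha("Blue Fairy, make me real"))  # the same plea every time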

So as far as artificial intelligence goes, sure, you can program a neural network for thousands of different triggered responses... but that only makes the program as smart as you programmed the network to be.

Lastly, the programming forces the robot to imprint on its new family (i.e., when that sequence is initiated), but he/she never grows old. When the family no longer requires this robot child, the child must be shut down. Talk about a throwaway society. I love you... now let me shut you off. At least the other robots, if they performed their duties well, could move on from owner to owner with no impending destruction. What was his marketing plan: to sell these to mothers who like to shake their babies to death? How can anyone actually love a "child" and then destroy it at any point?

Sorry to ramble but I thought this was a COMPLETE waste of money. While the special effects were pretty good... the story sucked.


David Byrd

CEO --- Twenty First Century Technologies, Inc.
Home of the Nite-Surfer Illuminated Keyboard

What? (5.00 / 1) (#39)
by CrayDrygu on Thu Jul 05, 2001 at 01:55:43 PM EST

"For example do you really think these machines would be upset about their impending destruction? Does your PC quiver when you shut it down? Does your FAX machine get nervous you might beat the crap out of it like the guys in the movie "Office Space"?"

Excuse me... what? Obviously not! And I have to wonder whether you were deliberately skipping the reason for that, or whether you honestly don't see it.

PCs and fax machines don't have emotions, or more accurately, aren't programmed to have emotions. They don't even think, or pretend to think. They just follow instructions. The mechas in A.I. were programmed with some level of emotion, undoubtedly to make them more appealing to their owners. Maybe not real emotion, but any good emulation of emotion would certainly have a reaction to the Flesh Fair.

"Not to smart, sitting there for two-thousand years we he could have tried to escape. Definitely shows a ROBOT not relating exactly with his environment."

Precisely! I think that's exactly what Spielberg was trying to show there. David was programmed to love, with the hope that emotions and all that other stuff would follow, which -- to an extent -- they did. However, David was not perfect. He knew how to love, but he didn't know how to stop when he needed to. Human emotions with the patience of a computer.

Personally, I can think of better things to pick on about the movie. Like the fact that eating spinach makes him short-circuit, but he can swim just fine. And that he was awake on the table while they were taking cards out of his belly -- if he malfunctioned because they got spinach on them, they must be essential to his operation, so how can he be awake with them removed?

[ Parent ]

P.O.S. (none / 0) (#47)
by Nitesurfer on Mon Jul 09, 2001 at 03:20:54 PM EST

You make valid points --- I am just glad I did not take my daughter to this. It certainly went beyond what I was expecting, in a morbid way... not the Spielberg I was expecting. This is not to say he should alter his artistic freedom... but on the other hand, give us a little warning.
Borderline Sixth Sense with Bicentennial Man mixed in... you could almost imagine David saying "I seeee deactivated robots!" Whew, scary!!


David Byrd

CEO --- Twenty First Century Technologies, Inc.
Home of the Nite-Surfer Illuminated Keyboard

[ Parent ]
Cliches (none / 0) (#49)
by goosedaemon on Wed Jul 11, 2001 at 06:24:49 PM EST

I think (for instance) that the example of one robot at the fair asking the other to deactivate his pain receptors was simply inserted because, hey, that's what everyone else would put in a place like that. However, there were examples of what I feel is more realistic behavior.

The nanny robot was loving to David, and smiled, and all that. I don't think this was to give comfort to David, at least not intentionally on her part: it was simply her programming, and, not being intended for a situation like this, she didn't have programming to react as a human might. On the other hand, she was aware that this was it (whether this awareness was aware like we are aware, or aware like a flag being set, is debatable), hence she said goodbye.

Another example might be Joe. All the time, he's doing his swagger thing. I don't think he really felt swaggery. It was just how he walked.

...That said, never again will I look at heavy metal the same way. Celebration of life my ass.



[ Parent ]
What is emotion? (5.00 / 2) (#16)
by acronos on Tue Jul 03, 2001 at 05:13:41 PM EST

We as humans like to place ourselves at the center of the universe. We like to think of our emotions as something of the gods. Emotions are simply genetic or learned motivations toward behaviors that profit our genes. There are good evolutionary reasons for our emotions. The experience of emotion is different from the behavior it elicits. Sorrow feels bad; love feels good (most of the time). This experience is only subjective programming. Once we have traced the paths in the brain that generate it, it will be harder for people to view it in such a deified way. Emotion is a physical phenomenon. It is much easier to copy than human vision, speech, or problem solving. Emotions can be created simply by applying numeric or symbolic weight adjustments to decisions the machine associates with an emotional object, and tying the emotional object to certain behaviors. This is also how we will find it is done in the human brain. There is no magic here. We deceive ourselves into thinking that our own emotions are real. They are no more real than David's. And they are no less real.
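
The weight-adjustment idea can be made concrete in a few lines. This is a sketch of the claim only, not of any real cognitive architecture; the objects, actions, and numbers are all invented:

    # Emotion as a bias on action selection: each emotional association
    # with an object adds or subtracts weight from actions involving it.
    base_value = {"approach": 0.5, "flee": 0.2, "ignore": 0.3}

    # Learned associations: object -> per-action weight adjustments.
    emotional_bias = {
        "mother": {"approach": +0.8, "flee": -0.5},
        "flesh_fair": {"approach": -0.9, "flee": +0.9},
    }

    def choose_action(obj: str) -> str:
        bias = emotional_bias.get(obj, {})
        scored = {a: v + bias.get(a, 0.0) for a, v in base_value.items()}
        return max(scored, key=scored.get)

    print(choose_action("mother"))      # -> approach
    print(choose_action("flesh_fair"))  # -> flee

Whether tuning such weights deserves the word "emotion" is, of course, exactly the argument.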

I don't expect the majority of people to accept this yet. Eventually we will begin to prove this stuff in experiments. It is an intuitive leap, but a reasonable conclusion from the data we already have available. Just as we discovered that emotion was not physically located in the heart and that courage was not physically located in the gut, we will discover that our emotion is not generated by magic. And when we do, everyone will claim that they knew it all along and forget that we ever believed anything so stupid.


Emotion (5.00 / 1) (#26)
by sventhatcher on Tue Jul 03, 2001 at 08:36:38 PM EST

Emotion is partially chemical and neurological, but it's largely social as well. What makes us happy/sad/angry/etc. is largely the product of past experience. Initially there might be some form of chemical reaction in the body to provoke an emotion, but that chemical reaction can be removed and the same emotion can occur in a similar situation in a learned rather than neurological capacity.

[ Parent ]
I don't deny (5.00 / 1) (#30)
by acronos on Wed Jul 04, 2001 at 06:42:49 AM EST

I don't deny that we have learned responses, even learned emotions. I deny that socially learned emotions are any less electro-chemical than the ones we are born with. Most likely, the learned responses are actually tied in the brain to the responses we are born with. We learn new emotional responses when old emotional responses are triggered. Admittedly, my wife is afraid of water because of her mother's constant encouragement to fear everything. She never had a bad experience in water. But, her mother tied it to Tiffany's innate fear of death and to the innate need for affection from her mother. So, even in this case it goes back to the responses we are born with.

[ Parent ]
more to the point (5.00 / 2) (#34)
by kubalaa on Wed Jul 04, 2001 at 04:18:36 PM EST

People commonly believe that emotions are special and unique to being human because they are intimately tied into our sensation of consciousness. Emotions are felt, but they are not feelings. As best as I can tell, there's no reason a robot can't act on emotions without feeling them, just as your computer can do mathematics without having a conscious concept of math.

In a practical sense, an emotion is an inclination to act in certain ways. In this sense it is no different from a logical decision-making sequence, except that the causes of the decision are more complicated, more automatic, lower-level, and hidden from our consciousness. Emotions help us balance between being inflexible and having to think too much.

I firmly believe that any successful attempt at AI must incorporate something like emotions. As an example: Douglas Hofstadter worked on a program which built English-sounding words using a cellular metaphor of enzymes and proteins. The system employed a concept called "temperature" to guide its actions; much like emotion, temperature controlled the likelihood of certain actions. At high temperature, corresponding to a high state of entropy in the system, destructive and random backtracking was more likely, encouraging the exploration of new paths. As words began to form, the entropy would drop and the temperature would lower, discouraging (but not prohibiting) destructive behaviour.
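
The temperature mechanism is close in spirit to simulated annealing. A minimal sketch of the general idea (the toy problem and cooling schedule are invented, not Hofstadter's actual system):

    import math
    import random

    def energy(x: float) -> float:
        # Hypothetical "badness" of the current state; lower is better.
        return (x - 3.0) ** 2

    x, temperature = random.uniform(-10, 10), 10.0
    while temperature > 0.01:
        candidate = x + random.gauss(0, 1.0)
        delta = energy(candidate) - energy(x)
        # Destructive moves are accepted with probability exp(-delta/T):
        # likely while the system is hot, rare (but never impossible)
        # once it has cooled.
        if delta < 0 or random.random() < math.exp(-delta / temperature):
            x = candidate
        temperature *= 0.99  # cool as structure forms

    print(f"settled near x = {x:.2f}")  # converges toward 3.0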

[ Parent ]

One nitpick, and one differing opinion (4.00 / 2) (#17)
by regeya on Tue Jul 03, 2001 at 05:18:39 PM EST

NOTE: I've posted this as a topical comment because I don't feel the need to post two comments. Bleh.

Asimov. Not Assimov. As, not ass.

Other than that, one could argue that the reason the mother abandoned David in the woods is because she did love him. She was on her way to the robotics company to turn David in for deactivation. Instead, she abandoned him in the woods, giving him instructions to stay away from humans, lest he be deactivated.

Yes, her character loved the David character.

Anyone find it funny how many Data-isms Joe displayed? :-)

[ yokelpunk | kuro5hin diary ]

assuming this scifi extravaganza is realizable... (4.00 / 2) (#18)
by eLuddite on Tue Jul 03, 2001 at 05:55:44 PM EST

(which is even less likely than Spielberg making a movie that breaks out of the ET mold. He's forever remaking the same saccharine movies with the same saccharine themes.)

His mother abandons him because, though she feels strongly for the childlike aspects of David, no matter how many qualities of humanity he exhibits she cannot fully forget the fact that he is not, like her, organic.

I think it has more to do with the fact that he is not her species. She'd probably feel the same way if he were wholly built out of slimy material.

Can we feel emotion for an AI? Possibly, in the sense that we can feel emotion for our pets, for example. As for the full range of our emotions, it's hard to imagine how we can escape the meaningful confines of our species. Even if we recognize moral equality for AIs, making it a capital offense to "kill" one, for example, how does it make sense to relate fully humanly with something that isn't human?

---
God hates human rights.

You need more than love (4.66 / 3) (#19)
by anansi on Tue Jul 03, 2001 at 06:56:15 PM EST

I'm seeing a lot of bad reviews by people who (IMHO) just didn't 'get it'. The U.S. is doing a really bad job of raising our children, and A.I. is pointing its finger at this trend. Ever notice how there's a huge, thriving industry around our pets, but we can't seem to keep good teachers in the classroom? To me, this was what A.I. was really about.

Oh, and it was also about the meanings of 'love'. The love David feels is not agape, philia, or eros, but the love of a child for its mother. Just as HAL was over-optimized for 'the success of the mission' in 2001, David is imprinted on his mom, to the exclusion of being a loving brother or playmate to the other kids. He malfunctions in the same way HAL did.

Loving a robot doesn't seem any weirder to me than loving a dog or a cat. None of them will challenge me the way a person will. None of them will refuse my love, or make me jealous by loving someone else better.

What I've finally come to believe about this movie is that made things cannot themselves love, but they can convey the love of their maker. When I get a perfumed letter from my lover (as if that ever happens!) it's not the paper and ink and scent that loves me, it's the person I can't see who created that artifact. Which tends to make me think that David was not a loving entity in and of himself, but rather a messenger from his creator. (The William Hurt character? Kubrick? Spielberg? I dunno.)

Don't call it Fascism. Use Mussolini's term: "Corporatism"

Your beliefs are hilarious! =) (2.33 / 3) (#27)
by rakslice on Tue Jul 03, 2001 at 09:29:21 PM EST

"What I've finally come to believe about this movie, is that made things cannot themselves love, but they can convey the love of their maker."

Heh. Are you trolling? If so, keep up the good work. If not, find a good book on critical thinking; it will improve your grasp of the obvious tremendously. (Practice makes perfect! =)

For those who missed it: That statement is trivially false. I could explain why, in great detail; but I think I'll stay within the bounds of good taste (read "socially imposed prudishness") and let your parents take care of that. =)

All joking aside, were you making some kind of unstated assumption about sentience (probably circular)? Would you like to share it with us?

[ Parent ]

He may mean God. (5.00 / 1) (#32)
by acronos on Wed Jul 04, 2001 at 07:18:49 AM EST

made things cannot themselves love, but they can convey the love of their maker

If maker refers to the parents, I agree it is a circular argument. If maker refers to God, then it is a valid opinion and not circular. I just don't agree with it.

[ Parent ]

I hadn't thought of that (none / 0) (#46)
by anansi on Sun Jul 08, 2001 at 05:58:50 PM EST

I didn't mean parents, and if pressed, I'd have to say that parents don't have the power to "make" children; they just get the ball rolling. I suppose you could have the parents be the only people the kid ever sees, and give those parents complete control of the child's environment, but that would be silly.

Again, this is just a personal belief, but I think kids are self-constructing as much as they are products of their genes and environment... we get to make choices, and those choices get to matter. (the alternative is too grim to dwell on)

As for "God", I think he's an artifact, a made thing, as in explanatory principle. The carpet we sweep our ignorance under. The ultimate conspiracy theory. As such, he's sometimes a useful container for human love (Churches sometimes do wonderful things!) without necessarily being the button-pusher in the sky.

Don't call it Fascism. Use Mussolini's term: "Corporatism"
[ Parent ]

By "things", I mean artifacts (none / 0) (#45)
by anansi on Sun Jul 08, 2001 at 05:45:54 PM EST

My implicit assumption about sentience (in this case) was that people create artifacts, and that while we sometimes (foolishly) fall in love with these objects, it's even more foolish to expect the artifacts to love us back. Rather, it makes more sense to think of these items as messengers, i.e. carriers of this intangible "love", like relief aid to a starving region.

FWIW, I'm not trying to lay down the law; I just see an 'easy' way out of the logical puzzle offered up by the film.

Since you declined to offer any opposing ideas in favor of flaming me, I'm left to guess at what your objections were. Is this any less hilarious? Or am I still a troll? =)

Don't call it Fascism. Use Mussolini's term: "Corporatism"
[ Parent ]

erg (none / 0) (#52)
by ellF on Fri Jul 13, 2001 at 09:55:41 AM EST

I'll never cease to be surprised by posts like this. Comments like "If not, find a good book on critical thinking; it will improve your grasp of the obvious tremendously" are inflammatory, and that's rarely a good thing. If you disagree, present an argument against the position you take issue with - don't assault the intellectual capacity of the person who came up with it.

What exactly is your point, by the way? You claim that anansi made a "trivially false" statement, but offer no evidence of this falsehood. Indeed, I see an interesting notion raised - that a material thing can be symbolic of the emotion with which it was made. This is *not* "trivially false" - rather, it is the basis for one of the fundamental questions of aesthetics. Does an object have meaning (or in this case, does it convey emotion) in and of itself, or does it derive (meaning | love) from its creator?

If we're going to be pedantic, rakslice, let's at least do it with some style. Go pick up a book on modern critique of aesthetic theory, such as From the Fountainhead to the Future and Other Essays on Art and Excellence, by York and Noble. Read it, and come back with a reasoned attack against the argument that anansi makes - don't just dismiss it without reason, or hide behind prudishness.

"Ah", you say, but I was _implying_ that children are made things, and can express love! Anansi makes a blatently flawed supposition!" Let us look at this notion, then First, it presumes that what I imagine anansi's definition of "made things" to be - synthetic objects - is incorrect. I take issue with this - 'thing' and 'person' are not the same. A person is, in this context, not a subclass of 'thing'. Moreover, depending on anansi's religious beliefs, a person *could* be merely expressing the love felt by its divine creator. Not something I'd posit - but a definite argument, albeit one that assumes theism.

All joking aside, were you making some kind of unstated assumption about sentience (probably circular)?

If you don't know what argument is being made, don't suppose that it's flawed. It's insulting, and in a discussion - especially one with philosophical overtones - it's especially poor form to resort to insult. There's no reason why anansi "probably" would make a circular argument.

*growl* Apologies for lashing out, but if there's one thing on k5 that really dismays me, it's seeing a worthwhile discussion be reduced to flames.



[ Parent ]
plot summary request (3.50 / 2) (#22)
by dr k on Tue Jul 03, 2001 at 07:26:23 PM EST

Could someone post a plot summary, or a link to one, so I don't have to go and see this film? I like to keep up-to-date on my science fiction, but I spent all my money on the X-Files DVD box set.
Destroy all trusted users!
Answer: Yes. (5.00 / 1) (#29)
by Kasreyn on Wed Jul 04, 2001 at 02:57:46 AM EST

Recommended reading: Isaac Asimov's short story, "Satisfaction Guaranteed".

People fall in love with themselves regularly. Everquest players lack social lives to the extent that they stage weddings for their characters. And AOLiza proves how easy it is becoming to fool people into thinking the machine they're talking to is a person.

I write this as a man who fell in love with a woman he met online. By an amazing chance, things worked out decently well (she turned out not only to be female, but also to be honest about her true situation). However, what if she'd just been an AI program run by some jerk snickering up his sleeve at me? Would I have caught on? If so, when?

I find this very interesting to ponder, to say the least!


-Kasreyn


"Extenuating circumstance to be mentioned on Judgement Day:
We never asked to be born in the first place."

R.I.P. Kurt. You will be missed.
I completely don't get your question. (5.00 / 1) (#35)
by SIGFPE on Wed Jul 04, 2001 at 06:45:51 PM EST

Could humanity ever learn to feel emotion towards a machine?
I feel emotion towards the chess AI that just beat me, towards the monster in Quake that beat me, towards the characters in a trashy movie like AI, even towards the characters in Star Trek. Why shouldn't I feel emotion towards a machine?
SIGFPE
A *positive* emotion. (5.00 / 1) (#43)
by decoy on Fri Jul 06, 2001 at 03:45:55 AM EST

It's really easy to get pissed off at machines, but could you love that cuddly Quake monster? Could you feel something other than utter loathing toward any of the characters on Star Trek?

[ Parent ]
Well (5.00 / 1) (#44)
by SIGFPE on Fri Jul 06, 2001 at 12:30:09 PM EST

but could you love that cuddly Quake monster?
Check out people who own AIBOs. They actually become very attached.

Many years ago I was developing a software package that I was very proud of. I stored backups of the source in many places. Yet somehow over a period of a year I lost each of the backups until eventually I found I had no version of my code. A couple of years' work had disappeared. I still feel depressed to think about it and feel like I haven't got over it yet. It's not love but the point is you can have very strong attachments to something that is merely data.

So imagine you had a piece of software that you had trained and developed and taught over a period of 10 years. Something that also emitted suitably endearing noises now and then. I'm pretty sure that you might get very attached to it. And what's more - it wouldn't just be a surface thing. If you'd invested 10 years in something that would be a very real attachment. You might value it more than your goldfish's life, maybe more than your pet dog's life. You might even be prepared to cause major harm to someone who accidentally deletes its memory. These are all 'positive' (as you call them) emotions.
SIGFPE
[ Parent ]

Not that anyone will see my post, but... (5.00 / 1) (#36)
by jethro on Thu Jul 05, 2001 at 04:19:16 AM EST

I have two points. First of all, can humans love a machine? Definitely. I've seen people who had a crush on Data from Star Trek, and he's a FICTIONAL machine. And look at some of those AIBO owners.

Now, about the movie. It was made well, and I enjoyed it. That said...

(A) The damn glass robot guys in the end deserve their own damn movie.

(B) I kept thinking "He's a robot, and man is he a WHINY robot." If someone starts crying like that when I abandon them in the woods, it'd just make me want to abandon them more (that doesn't sound right, I'm tired).

(C) You said his mom abandons him because he's a machine and not a real boy. Let's not forget the fact that he almost killed their other kid, albeit by mistake. Robokid had only been around for a month or so, as she says while preparing the birthday things. I've seen people who had a dog for years have to give it up because it's threatening The Baby. Shit happens. She abandons him in the woods because she would rather do that than have him destroyed.

(D) I kept thinking, you know... maybe if mankind had spent less time developing semi-intelligent robots, and had rather dedicated resources to, oh, I don't know, colonizing other planets or reversing greenhouse effects... maybe all this could've been avoided.


Like I said, I'm tired. Night!

--
In the land of the blind, the one-eyed man is kinky.
Love and Robots (5.00 / 1) (#38)
by CrayDrygu on Thu Jul 05, 2001 at 01:41:27 PM EST

"She abandons him in the woods because she would rather do that than have him destroyed."

And I think that, right there, is evidence that his mother did indeed love him, on some level. If she just regarded him as a machine, I don't think she would have objected to having him disassembled.

She definitely didn't love him on the same level as a real son, though. More like a dog. She didn't want to see him go, and it hurt her to drop him in the middle of nowhere, but she would rather do that than keep him around as a threat to their real child.

[ Parent ]

Yes. (5.00 / 1) (#41)
by jethro on Thu Jul 05, 2001 at 03:50:55 PM EST

And I think that, right there, is evidence that his mother did indeed love him, on some level.
I totally agree with that. I also agree she didn't love him on the same level as her real son, but she only had him there a month. Imagine what it'd be like after a year. In fact, if the REAL son had never come back, the whole movie would probably never have happened, and the Cyberwhatever people's experiment would not have taken place until perhaps after she died. But then they might've been dead too.

Also, does it strike anyone else as completely STUPID to make a robochild that can only imprint on ONE of the parents?


--
In the land of the blind, the one-eyed man is kinky.
[ Parent ]
If you think about it... (none / 0) (#48)
by istvaan on Wed Jul 11, 2001 at 02:36:43 PM EST

Consider this: how many people out there own pets? Of those pet owners, how many, when asked, will say that they love their pets? If one considers that pets are not even sentient (though many pet owners will disagree) and are still capable of receiving love from their owners, then why should something that is more like a person -- with intelligence to boot -- be unable to receive the same affection?

makeup (none / 0) (#51)
by goosedaemon on Wed Jul 11, 2001 at 06:58:43 PM EST

So... anyone else intrigued by the fact that the robot woman Hobby had was applying makeup, and then we moved to Monica... applying makeup? No idea what this insinuates, but hey.
