While I love Ebert’s Great Movies series, and I love A.I. (right down to the far-flung ending), I’m a bit taken aback by his piece on the film. He says he originally wasn’t so receptive to it, and thought it failed to rigorously deal with the problem of the impossibility of emotion in a computer. He says at the time, he thought it had him “asking questions when he should have been finding answers.”
Now he seems to have rediscovered the film as a chronicle of inert machinery, trying unsuccessfully to fake human emotions. He sums it up with, “What does love mean in this context? No more, no less, than check, or mate, or π. That is the fate of Artificial Intelligence.” Somehow, this has made it a great film for him.
One of the strangest things here is that both of these opposed opinions are guided by his own wisdom, taken for granted, that it’s impossible for a computer to love, or to feel in any way an emotion he considers exclusively human. First, he thought the film was fakely trying to ascribe these emotions to computers (a critique I can understand, though I don’t share his sentiment), and then, this time around, he felt it honestly portrayed the impossibility of its own hypothetical situation (a reading I can’t even remotely sympathize with).
It’s such a strong case of a man reading a film, and wrestling with it, through his own lens. As I said, it somewhat baffles me, but at the same time, I can’t help but admire Ebert’s strong investment in his own relationship with the film.
Anyone else have any thoughts, or similar reactions? Do you agree with Ebert’s later reading? Is the film, taken at face value (as a story of a machine that’s capable of love), as facile as he originally seemed to think?
ED: Accidentally switched “humans” for “computers” up there. An above sentence now makes much more sense.
If that’s his reading I totally disagree with it. A.I. shows man trying to create a robot that can love and, in the process, we see that it can actually love in a deeper, more obsessive way than us. Despite David’s unending love for his “mom,” he is only given love back when he acts as a replacement for the family’s real child. When the latter recovers from his illness, David gets thrown out like a bad toy (like Teddy, for instance). Amidst the other robots, he finds more respect and empathy than among the humans. The automatons are warmer than the envious, intolerant and often irrational humans – the flesh fair is the manifestation of a species that, feeling its survival is in danger, reacts through violence, attempting the annihilation of the robots and actually getting pleasure out of the process (Spielberg may have intended to establish parallels with the Holocaust there). The ending makes perfect sense when seen through this scope. After human extinction, with the world populated by extremely advanced robots invested in archaeological efforts to understand the past, their origins, and their creators (a very human endeavour, we must say), they encounter David. Paradoxically, the ostracized robot becomes the only memory of their human ancestors and, by extension, of love itself. That is the moment when those future artificial beings face the true emotional nature of Humanity, all through a machine.
Yes, all of that. It’s a great film that demands a fairly rigorous amount of self-involvement. I believe it shares with Wall-E a kind of shaming of our own cold love, which has become so diluted through modern technology, and really through modernity in general. In the end, it is that absurd, impossible love, taken to an undying level, that becomes a foil to the pragmatic (“just replace him with a replacement son”) convenient love that is so common in society.
I’m a huge fan, my second favorite Spielberg, after Jaws. One of my top 100 favorite films of all time.
A couple of points in Ebert’s article:
“He doesn’t eat, but so strong is his desire to be like Martin that he damages his wiring by shoving spinach into his mouth.”
Well, actually, as I remember it, David starts to shove spinach into his mouth because Martin has been teasing him with the way Martin is able to actually eat. It is an angrier action than Ebert gives it credit for being.
“After faithfully following his instructions in such a way that he nearly drowns Martin, he loses the trust of the Swintons and they decide to get rid of him, just as parents might get rid of a dangerous dog.”
David doesn’t nearly drown Martin over “faithfully following instructions.” David nearly drowns Martin when one of the other kids at the pool grabs a sharp object and nearly punctures David, causing David to hysterically grab Martin and beg for safety, which lands them both in the pool.
“In an eerie scene, he comes across a storeroom containing dozens of Davids who look just like him. Is he devastated? Does he thrash out at them? No, he remains possessed.”
This is strange. Ebert is forgetting the scene that comes immediately before the scene in the storeroom, where David does indeed “thrash out” at one of his duplicates, in one of the most disturbing scenes in the film. His glowering statement to the new David that “you can’t have her!” leads to David destroying the duplicate while screaming “I’m special! I’m unique!!”
“To fulfill his mission to love and be loved by Mommy, he concludes he should be like Martin, who Mommy prefers.”
Be like Martin? What does this mean?
And the allegation at the end that the whole One Last Day section is just a fantasy implanted in David’s consciousness by the aliens is just ridiculous, entirely bogus nonsense that is in no way whatsoever indicated in the film. This kind of crap really needs to stop.
For sure, Roscoe… Ebert seems to miss things, and “get” the movie less on his second viewing than on his first. Definitely odd. Maybe this is part of why this piece confused me so much.
I don’t have a problem with Ebert reconsidering the film. When he ignores major scenes in the film, it gets strange.
It does have a Kubrickian sense that human emotional maturity has not kept pace with technology. At the same time, just as in 2001 we had HAL seeming more human and emotional than the humans, here we have a programmed machine who feels more loving attachment to a person than most humans do. The human brother acts despicably; the machine is programmed to love, while the humans have twisted emotions like bitter jealousy and are aggressive and destructive. Spielberg makes more of the sentimental side than Kubrick probably would have done.
I bawled my head off at the end. I bawled and howled and howled and bawled. I sympathise with the adopted child who finds his adoptive older brother is a devious viper. And who doesn’t wish they had their missing/deceased loving mother with them? Of course the film deals with some bare basic instincts, while setting up some interesting questions about what makes us human (vs. alien or machine), what we should value in that, and how far technology can develop. Elsewhere Spielberg has done friendly and not-so-friendly aliens, of course. And in Minority Report he looked at trust in technology and how that can become a dangerous force through human folly. That film was quite interesting but not especially profound, or at least cast in shade a little by Spielberg’s crowd-pleasing rollercoaster excitement game. Hitchcock was better at suspense, without the helter-skelter pacing and brash effects that have dogged Hollywood.
Jonathan Rosenbaum surprisingly picked A.I. in his top 100; I’m not sure why, or whether it’s worth revisiting (I saw part of it again without much interest) or studying in great detail.
Let’s not forget that David is also shown to have bitter jealousy (“you can’t have her!”), and he has an aggressive and destructive side too, as shown when he pulverizes his double in Hobby’s board room. The ickier aspects of the movie get brushed aside in the concentration on David’s goopy mommy fixation; the only kind of love shown in the film is that between a mother and child. It leads to some rather icky moments, I think. It takes a very daring or a very clueless filmmaker to show a child cuddling with his mommy corpse and pass it off as a happy ending.
I also take issue with Ebert’s assumption that humans are the only beings that can think. If biological machines can think, why can’t technological machines? The only counterargument to that is the ‘soul’, but then again we believe our dogs experience emotions even though we do not ascribe them souls.
My impression from the film was that we create AI so we can interact with it on our own terms. Humans create AI to be ‘People, except controllable and disposable’, and created David to be an emotional slave. Regardless of whether machines in reality could ever feel emotion, it’s pretty clear that in the world of the film they can. Ebert is judging the film solely on a stubborn insistence that they cannot, and as a result is completely missing the point.
Anyway, it was almost a good film, until the alien part.
My understanding was that those were not extraterrestrials, but earth-produced, self-generated robots; descendants of the mechas, if you will.
Coincidence that I re-watched this film last night, having purchased the Blu-Ray version several weeks ago.
Overall, I label the script as malarkey built on a timeless human myth. Longing for perfect love, most often sought from our progenitors or, by extension, our Creator, is hard-wired into our emotional circuitry, and that spurs us on a lifelong journey that can never be satisfactorily concluded. In Christianity, the blue fairy is heaven, that final perfect day with mommy.
While able to admire the technical and human artistry that brought this story to the screen, it’s finally just an appreciation I might have for a medieval, illustrated Bible — a beautifully rendered transcription of malarkey.
Also I have a bit of a dork-nitpick about the film.
What exactly is the robots’ power source?
David and his super-toy can apparently run 2000 years straight without any recharge. At no point in the film was any robot ever re-charged, nor did any robot have any maintenance issues.
>>My understanding was that those were not extraterrestrials, but earth-produced, self-generated robots; descendants of the mechas, if you will.<<
Of course this is true, but the confusion surrounding this issue points to some flawed decision-making in the production design. They really do kind of look like the Close Encounters aliens. They look less like E.T., who also kind of looked like the Close Encounters aliens, but this raises the question of why Spielberg keeps returning to this basic design with variations.
On the substance of the thread, I actually agree with Ebert’s reading that David is not experiencing love in any human sense, but is trying to fulfill his programming. This is consistent with Kubrick’s rendering of HAL, whose killing spree was not a result of his going evil or crazy, but showed the limitations of computer logic as it tries to carry out its mission.
I did find Ebert’s mistakes on actual plot points a bit strange.
How would one tell if David is experiencing love in a “human sense”? Isn’t that part of the issue of the film? What is love in a human sense or otherwise and what distinctions can be made about such a thing?
As to the design, Spielberg uses the look to signal difference without total alienation, that is to say, humanlike without being how we see each other today. It suggests the possibility of similarity while signaling the divide we experience when confronted with the unfamiliar. In the movie, when we first see David at his new “home,” Spielberg introduces him with this shot:
which is then obviously echoed later on with the denizens of the further future:
This reverses the reference point from the earlier meeting, when David was the anomaly. In the beginning it was “human love” that served as the measure; at the end it is David’s love that is looked to as a measure. David remains outside the “norm” in each encounter, but the questions are the same, just approached from a different direction. The ideas of creation, love, meaning, and the stories we build around those ideas are the subject of the film, so if Ebert is suggesting he “knows” the answers to these questions, he is either wiser than anyone I’ve known or he is misunderstanding the questions to begin with.
The robots being descendants of the original robots isn’t any better than if they were aliens. That just makes it less random and more preachy.
It did seem to me at the very least ambiguous whether David was experiencing something approximating real emotion. And, I’m pretty sure the robots at the end possessed whatever abstract quality we consider a soul. Their curiosity about humans seemed self-designed, not programmed.
But back to Ebert’s point, here’s what I don’t see:
An excess of neurotransmitters causes neurons to fire electric impulses = Real emotion
A program designed by a human sends out electric impulses in the exact pattern of those neurons = Just a program being executed
The question, “What and where is the essence of the human soul” is unanswerable by humans, and Ebert is not only claiming to know the one true answer, he’s imposing that claim on his critical view of a film where it does not seem to be consistent with the director’s intentions.
Jirin, yes, the idea of the ambiguity of the emotions David may have been experiencing is the key, I think, and as such it is meant to question where our own emotions come from as much as anything else. If it is something purely chemical, then there is no reason to suspect that chemical reaction cannot be potentially duplicated in an artificial form, but thinking of it in those terms already begins to alter our understanding of our lived emotional lives. Do we bond to our mothers in a significantly different way than David did with Monica? I mean, our response to our mother isn’t a purely conscious one, so how different is it from being “programmed” to imprint? Does the search for a creator become different in kind if that creator is human than it would be if it were superhuman? What is the effect of the stories we tell ourselves about that creation? If it is all explicable scientifically, then what is the effect of that on our understanding of what it means to be a feeling, thinking being, and how can we differentiate ourselves from those “beneath” us or those that may come after? Your mention of the “soul” is on point to those ideas.
In regards to Ebert, personally I’ve found reading his column to be far too frustrating to be good for me. His opinions differ considerably from mine, I often believe his arguments are flawed in some significant way, and he is too popular and well loved to challenge with any reasonable chance of making an impact. It all makes me cranky, so I’ve tended to respond ungenerously when I have engaged his arguments, and that fits with neither my attitude towards discussion in general nor my respect for his career and importance to the film community as a whole. So I don’t read his column, and therefore can’t comment directly on what he says, only on what I read others saying about his writing.
“On the substance of the thread, I actually agree with Eberts reading that David is not experiencing love in any human sense, but is trying to fulfill his programming.”
But does that programming have jealousy, re: trying to eat the spinach?
I do not think AI is careful enough to truly contend with circuitry versus emotion, though it does do fine in terms of self-recognition and will. It is a Pinocchio tale, the point isn’t that he is or isn’t human, but that he wants to be. He just comes to that journey through somewhat disturbing Mommy love and a final deus ex machina. Too bad we’ll never see the Kubrick version, but for what we have, many of the points in this thread are fine indeed.
Polaris, if David’s “love” is based on his programming, it would have been programmed into him by a human, so it’s not surprising that “emotions” like jealousy would be a byproduct, as they are for so many people in love. His wanting to be human is also a byproduct of his programming, as his observational data indicates that his love will only be fully reciprocated if he were human. I don’t think assuming David’s machineness in any way dilutes the drama of the film; rather, it makes the film more interesting, because he can’t make the choices a human could.
Is jealousy the ghost in the machine of love?
(Not rhetorical question).
“Jealousy is both reasonable and belongs to reasonable men” —Aristotle
Interesting question, as the embrace of the positive aspect of love is, in a way, what creates its negative association, jealousy. There is possibly the suggestion, then, that the creation of the former, even artificially, if real enough, will create a secondary effect or emotion on its own; that the programming people experience in their unconscious attachment to their family can give rise to complexity, as the instability of the feelings produces a sort of consciousness through dissonance. A stable or unquestioned experience of love might be flat, a baseline that keeps one’s awareness on a communal or social level, but the dissonance is what gives rise to an understanding of one’s individual circumstance, as one feels an independent state, isolated from the communal, an unshared emotion. Or something like that. I don’t know if I agree with that idea or not, but the vague suggestion of it as a possible way to look at the film is interesting.
Somebody needs to resurrect Asimov so we can ask him.
Well, actually, one of his books does detail the conflict of a robot who freezes up over his love for a woman. Another robot tells him, “If you stay with her you will hurt her, because she will not be able to have a real relationship with a real man,” and the robot infers that if he leaves her, he’ll also hurt her, because he will be breaking up with her. It causes him to freeze completely, and when an investigator is sent to look into it, he falls in love with her. Good story.
There is also a series of short anime films from Japan called, I believe, Robot Love which was a diverse look into how emotion and programming mix. Haven’t seen it in some time though, do not know what some of the shorts have to say about it.
The movie doesn’t really give any answers to the question of whether robots can really experience emotions in a way comparable to humans. But, there’s a stronger theme in the movie that I don’t see being discussed: Slavery.
Robots are a race of slaves that humans created to be similar to humans, the better to serve humans. The ultimate goal of human-form robotics is to create a creature that is just like a human, except controllable and disposable. Which makes them, in the context of the film, a symbol of human decadence and excess. Maybe our perspective on the film is skewed by having been trained to ascribe human qualities to robots by Data from Star Trek, but the humans in the film made David so close to a human that they start feeling sympathy and attachment to him. They are just as confused as the audience over whether he bears the quality of sentience, but in the end, they still see him as a disposable slave.
If we create a mechanical race exactly like us to the smallest detail, do the same moral imperatives apply to that mechanical race, or can we still address them as ownable objects? That is the essence of the film’s philosophy, in my opinion.
@ Roscoe “It takes a very daring or a very clueless filmmaker to show a child cuddling with his mommy corpse and pass it off as a happy ending.”
Where do you get the idea that the ending is meant to be happy? I find the ending unrelentingly tragic, which I believe Spielberg intended. The final scene not only presents us with the extinction of the human race but it also offers us an image of reciprocated love that is illusory. Monica’s resurrection is artificial. It is not the Monica that existed centuries ago that returns to David but rather a false, constructed Monica based on a scant few memories from David, which necessarily invalidate any possibility of bringing back the Monica that David remembers. His desperate longing for a mother to love him is what leads to her creation. She is as artificial as David, perhaps even more so, when she returns. The love that he sought for so long is not returned to him. He receives a response from an artificial construction of the human being who in her revived form is based on his memories of longing. The lights go out in the house, another patchwork construction from David’s memories, and we are left wondering whether David will continue to exist now that his only reason for being has been artificially satisfied and any possibility of true reciprocated love has been erased.
I don’t see how anyone could possibly see the ending of this movie as sentimental or happy. It’s baffling to me.
Yeah, that’s a weird characterization, isn’t it? For “happy” substitute “warm, tear-jerking, tender, sentimental, and cuddly in that Spielberg manner.” I quite agree about the bigger issues of the ending, with the reconstituted Monica etc., but I don’t see that Spielberg is saying what you’re seeing. I think the film is saying that David’s love is absolutely 100% returned to him, that he gets the love he wants and all that, tempered a bit by the whole time-limit thing, which I always found clumsy and unbelievable. If I thought Spielberg was really saying what you’re saying he’s saying, I’d have a lot more respect for that final section of the film.
but I don’t see that Spielberg is saying what you’re seeing
You just don’t want to believe that Spielberg could make an intelligent movie
“You just don’t want to believe that Spielberg could make an intelligent movie”
Don’t put words in my mouth. I think he could if he wanted to, if he didn’t continually allow his films to be undone by silly family-friendly endings. I think a lot of A.I. is very intelligent indeed, and certainly among his finest efforts, except for that unfortunate tear-jerk ending sequence.
We all know that Kubrick wanted that tear-jerk ending sequence
Maybe he did — that doesn’t change the ending as it now stands, brought to the screen by Spielberg. A triumph for some, an unfortunate descent into tear-jerkiness for others. Opinions differ all over the place.
Here is a nice two-part video analysis of the film that I came upon. The author is Ben Sampson: