Saturday, April 19, 2014


I just saw the movie Transcendence and, like nearly everyone else who saw it, felt that it didn't come together very well as a movie.  I don't think it's easy to make a movie about the singularity for mass-market consumption; the plausible outcomes are too extreme.  For instance, if a super AI that had mastered nanotechnology wanted to exterminate humanity, no human would likely ever know.  Everyone would simply fail to wake up one morning, or drop dead in the middle of the day.  There would be no epic battles between humans and machines fit for the big screen.

Director Wally Pfister seems to understand that at least.  (Spoiler Alert.)  My interpretation of the movie is that Will Caster, the super AI, was purely benevolent, never thought for a second that the humans were a threat, and at the end merely pretended to die so the stupid humans would calm down a little bit (in the final scene we see evidence that he is still inhabiting his old garden).

But then why didn't Will simply use his virtually infinite intelligence to prevent the humans from getting upset in the first place?  Maybe he could have cracked some jokes and really tried to connect with his wife on an emotional level, or shown some excitement and amazement at having been successfully uploaded into the computer.  Any kind of human-like reaction to the situation would have made the humans feel much more comfortable with him.  The reason he never comforts his human friends with some real human emotion is, of course, that the plot had to be twisted into something with the potential to be a successful Hollywood movie.  And like all the other elements contrived for that purpose (gun battles, etc.), it made the movie very disappointing for someone who is actually interested in the singularity.

I was hoping for something more psychological, where the uploaded consciousness is 95% Will Caster, but, because of imperfections in the uploading process, or simply as a result of exploding intelligence, is just different enough to be creepy.  His goals diverge from those of the human Will Caster, and from humanity's, enough for scary things to start happening, but not enough for him to want to exterminate humans.  He tries to get his wife on board, with genuine human-like interaction, and it is through his struggle with her that he finally abandons his plans and opts for something different.  Arguably, that is in fact what the movie was about, but instead of exploring the psychological struggle between him and his wife, too much screen time was spent on stupid gun battles and terrorists, and as a result the movie never came together.  Oh well.

The failure of this movie is particularly disappointing to me because I was hoping that it would help mainstream thinking about the dangers and opportunities of the singularity.  If it had any effect in that regard, it was probably more to dampen enthusiasm for exploring and popularizing singularity ideas than to encourage it.