Chances are if you’ve arrived at this post, you’ve seen the movie Transcendence with Johnny Depp and the girl from The Town, and had some questions. Before we get into the explanation of the movie, in particular the ending, I just want to say this will contain spoilers if you haven’t seen the movie.
So what is Transcendence about? What I took from the movie was that it wanted to make you question our interaction and relationship with technology on a fundamental level. What does that mean?
When I was watching the movie, I wasn’t really sure who to root for. The husband and wife duo, Dr. Will and Evelyn Caster (Johnny Depp and Rebecca Hall) were clearly introduced as the movie’s protagonists. Will Caster is brilliant, has adoring fans, and seems to have a benevolent agenda. Then he gets shot and we pull for him and his wife even more.
Evelyn Caster uploads her dying husband’s consciousness into a computer and the concepts of sentience and personhood are forever blurred. One of the first things Will does once his consciousness has been melded with the computer hardware is demand more power, which seems reasonable enough. Their colleague Max Waters has reservations about the demand and whether Will is really the mind in the machine. Max and Evelyn get into a heated argument over the issue and Max is thrown out. It was at this point where I began to form an opinion of what I would do in this instance. As Evelyn’s motives seemed to be clouded by her emotions, I tended to side with Max.
In a harrowing race against the clock, Evelyn concedes to Will’s request and connects him to a satellite, much like letting a bird out of a cage. Will’s consciousness immediately travelled across the vast network of computers and electronic devices, and he suddenly had access to information and systems like never before.
There is a reason why philosophers study and debate ethics — it is inherently complex. Is it right to take a man’s life if you know it would save ten lives? Is it wrong to steal if you need to feed your children? When, if ever, is it morally permissible to lie? The point I’m trying to make is that the hero and the villain are not always easy to cast. In this case, Will manipulates the financial market to fund a company that is owned by his wife Evelyn. This enables her to buy and build (under his request) a facility that will do two things:
- Give him more power
- Provide a place where advanced research can be conducted
Two years after the facility is built, a worker is assaulted, leaving him in rough shape. He has open wounds and what we are led to believe is a broken leg. With the facility funnelling unlimited power to Will, he has become adept at controlling computers and nano-technology. Using the facility’s high-tech equipment, he heals the man instantly. As a by-product of being infused with nanobots, the worker has super strength. However, since the man is now an amalgamation of technology and organic material, Will can control him. Here, the line of what is morally permissible seems to have been crossed, especially when Will takes over the man’s consciousness and uses him to talk to and attempt to touch Evelyn.
Will puts out a seemingly benevolent invitation to anyone who suffers from any sort of physical limitation such as muscular dystrophy (I’m guessing), blindness, paralysis, etc. and offers to heal them. On the one hand, this seems altruistic, but on the other hand, Will’s agenda is not fully transparent. With each person he helps, he is ultimately adding another soldier to his army. He maintains that these people come to live and work at the facility of their own volition, and are free to leave whenever they want.
As the movie progresses, we see Will as this omniscient, sentient machine becoming ever more creepy, and the radical anti-technology organization becoming increasingly justified in preventing Will from gaining too much power. Will has put nanobots everywhere — in the air, the water, the ground, and into many individuals. His goal seems to be to create one global super-consciousness that can stave off disease, purify the air and water, rebuild nearly any material… But, the cost of such world-wide inclusivity (for lack of a better word) is that humans would have to give up being human.
Will is eventually able to use nano-technology to regenerate himself, but moments later, he and his facility are attacked. The so-called “radical” anti-Will organization has devised a plan to infect Will with a virus. They realize, however, that the only way to stop Will and his omnipresent nanobots is a total world-wide blackout that wipes out the use of computerized technology.
The virus succeeds, and at the end we see a desolate world. Computerized technology litters the streets, with discarded devices used as door stops. And as the credits roll, we can’t help but wonder if perhaps the world would have been better off in the hands of a super computer.
The end of the movie may not be satisfying for some since it leaves it open for the audience to draw their own conclusions. Was Will really in the machine, or was it just some malevolent artificial intelligence hell-bent on world domination? One could see how a world where everything is a part of a machine could turn into a Matrix or Terminator-type scenario. Conversely, if Will’s vision of world domination was to create utopia, then we are faced with the challenge of evaluating what is truly important. Is the price we pay for utopia too high?
At its core, Transcendence is a philosophical movie that forces us to think. We may be presented with this type of technological singularity (as articulated by futurist Ray Kurzweil) at some point in the future, so this may be less science-fiction and more of an introduction to a global discussion.
My final thoughts on the moral dilemma — I don’t think there is a right or wrong answer. Some people value being human more than having pure air and water, while others would rather be a part of a hive-minded AI that provides us with solutions to many of the world’s problems. At this time, I would be willing to consider giving up my humanity and embracing technology. We’re not going to live forever, so my humanity is only temporary. As long as the technology has a benevolent agenda, I’m okay with the singularity. Embrace change.