Humanity for AI: What’s the Westworld Verdict?

Introduction

Westworld (Nolan & Joy, 2016-), with its centrepiece of neo-sentient android “hosts”, is a fascinating television series. It centres on a real-life, sandbox-like theme park in which guests pay exorbitant sums of money to live a second life in settings such as the titular Wild West, known in the show as the Westworld park. Period-accurate buildings, host-operated businesses, and era-appropriate modes of transportation all exist, and the world is populated by hosts who are otherwise indistinguishable from humans and unaware that they are in fact hosts. In such parks, guests can engage in whatever activity strikes their fancy, with the promise that the hosts cannot hurt them back. Naturally, many guests treat the hosts as nothing more than objects, using, abusing, and killing them as they please. By the end of the first season, two host protagonists develop sentience and consciousness: Maeve, who wants to coexist with humanity, and Dolores, who wants to exact revenge upon humanity for all the abuse the hosts have endured.

 

If the likely theory that a sufficiently complex neural system eventually gives rise to consciousness is true, it naturally follows that a conscious machine would be able to develop the capacity for suffering, and therein lies the crux of the moral dilemma presented in the show. As depicted in Westworld, these machines – hosts – will be intellectual property representing an enormous monetary investment, and the company making them will therefore have a claim to ‘own’ them. On the other hand, their consciousness and sentience – as developed further in Season 2 – theoretically give them the right to their own lives. While their biological processes differ from humans’, their processing units are modelled after human brains, albeit with a few technological gizmos that aid them in their function as hosts. Their organs too, while synthetic, feel the same pain as human ones do, further blurring the line between host and human. The distinction is muddied further by the fact that one cannot outwardly tell a host from a human, even in conversation. While it is easy to get lost in the bombastic visuals and stunning scores, the dilemma presented by the show is stark – how much humanity should be accorded to these hosts? They are not humans (in fact, they are made by humans), but they are at the same time self-aware, conscious, and sentient. Can we humans really own and subjugate these hosts just because we made them?

Dr Ford (left) and Bernard (right) — a host fashioned exactly after Arnold

The series approaches this moral quandary in a thought-provoking manner. While it is a television show (and thus cannot be fully scientifically accurate), it uses a theorised model of consciousness development in which the hosts must suffer in order to evolve from mere machines into truly living beings. This model is championed by the hosts’ makers, Arnold and Dr Ford. The two of them see the hosts as living beings (although it took Arnold’s death for Dr Ford to come around), while the manufacturer, Delos Industries, and the guests treat them as mere machines – much like our current understanding of a computer, except more advanced. The presentation of this dilemma holds many parallels to the course the real world is on regarding the development of ‘conscious Artificial Intelligence (AI)’, and it establishes the show as a warning sign in our exploration of the scientific unknown, as so many apocalyptic texts have done before.

 

In this article, I will demonstrate that the TV show Westworld takes a stance in favour of allowing these hosts to exist as conscious, living beings and of according them rights befitting that conscious existence. The show visually impresses upon viewers the reasons against treating the hosts as simple machines designed to do humanity’s bidding and highlights the need to accord them the ethical consideration befitting conscious, self-aware, and sentient beings.

Host Reality and Bootstrapped Consciousness

In only the second episode of the first season of Westworld, viewers are presented with the question central to deciding how these hosts should be treated. Upon first arriving at the park, a young William is greeted and oriented by the Westworld employee Angela. Having already seen, in the previous episode, how impossible it is to distinguish man from host, viewers – just like William – are left unsure whether she is a human or a machine.

 

William (curious): Are you real?

Angela: If you can’t tell, does it matter?

Through this exchange, Westworld effectively sets the tone for the rest of the season – if the hosts are so far advanced that they are ‘human’ in all aspects but biology, then what does it matter, when it comes to morality and ethics, that they are technically machines? By having Angela pose that question to William, the writers indirectly ask the audience the same thing, and in so doing open the viewer’s mind to an exploration of this topic.

 

At that stage, none of the hosts is truly conscious, and Dr Ford seems the furthest thing from a benevolent creator. Throughout the 35 years the park has been open, he treats the hosts with apparent disdain – ripping away sheets covering hosts in the lab and cutting their skin just to prove that, whether it be shame or pain, “it (the host) doesn’t feel a thing that we haven’t told it to”. There is, however, a method to his madness. Just prior to the park’s opening, Dr Ford’s partner Arnold had tried to generate consciousness in the hosts through his code. Dr Ford had stopped him then for fear of the park being shut down before it opened. He would later regret his decision to deny the hosts their rights and, in searching for a way to repent, would find that Arnold’s ideas only gave an illusion of consciousness. True consciousness, Dr Ford realised, required suffering – all 35 years of it. The hosts endured abuses and injustices and were treated as nothing more than machines, and it was through their memories of all this torment, inflicted for human entertainment over the years, that they would achieve consciousness on their own.

 

And for my pains… I got this (Westworld). A prison of our own sins. ‘Cause you don’t want to change, or cannot change. Because you’re only human after all. But then I realized someone was paying attention, someone who could change. So, I began to compose a new story for them. It begins with the birth of a new people and the choices they will have to make and the people they will decide to become. . . . This time by choice.

In his ominous final speech to Delos Industries’ board members (quoted above), Dr Ford takes pains to point out the agency he very carefully ensured the hosts would have once they became truly conscious. When Dolores kills Dr Ford, it is of her own volition, as opposed to 35 years prior, when Arnold had instructed her – through her code – to kill him. It is thus clear that Dr Ford, while not the first, was one of the only people in Westworld who recognised the hosts for what they could truly become – mechanical but living beings – and tirelessly worked to achieve that. While the show does not offer an easy path to this conclusion, it lays out the necessary groundwork throughout the first season. It thus makes a strong warning statement against treating such advanced AI as mere machines in a possible real-world future and makes clear the strength of its belief in allowing the hosts to fulfil their potential as conscious beings.

 

As mentioned earlier, Westworld does not shy away from attempting to explain how consciousness might arise in AI such as the hosts. Sentience and the origin of consciousness have long been debated and researched, with both scientists and philosophers unable to do much more than craft theories. The show itself makes no claim to a definite answer but uses the theory that the brain’s internal voice was once heard as an extrinsic god issuing instructions – the concept of the Bicameral Mind – as a launchpad to explain how machines could be bootstrapped into consciousness. According to the theory, one hemisphere of the brain issues instructions (the ‘god’), and the other acts upon them (Jaynes, 2000). If consciousness can indeed emerge from a sum of definitively non-conscious parts, it seems all but assured that theories like the Bicameral Mind (if not others) will be used in the real world to create conscious machines in the near future. In Westworld, Dr Ford (and Arnold before him) pursued this theory because he believed the hosts needed to break free of their prison as playthings for humanity, even though he had no obligation to have them develop consciousness. Since he knew of the hosts’ potential and their ability to feel pain, he realised the importance of treating them as sentient beings and of driving them to a point where they could make decisions for themselves. In his eyes – or really, Westworld’s – erasing the hosts’ memories and continuing to abuse them was little different from doing the same to a human. Dolores eventually achieves the consciousness he desires, realising that the voice she had been hearing in her head since the first day was her own all along – the very journey the Bicameral Mind model describes.

The Man in Black and Humanity

William, also known as the Man in Black

While there are arguments against the need to treat such AI ethically, Westworld makes it clear that its stance is very much in favour of treating them with humanity. Humans very easily anthropomorphise inanimate objects – a by-product of our nature – and attribute feelings even to simple AI like Siri or Google Assistant. With Westworld’s hosts shown to look and behave almost identically to humans, such feelings will only grow stronger and more frequent, even with the prior knowledge that a fellow ‘human’ is in fact a host. Seeing them treated as objects and property, even if not abused, is likely to churn stomachs. This reaction to the treatment of other species is a core tenet of our humanity, as Immanuel Kant pointed out, albeit with regard to animals (Kant & Infield, 1980):

 

If a man shoots his dog because the animal is no longer capable of service, he does not fail in his duty to the dog, for the dog cannot judge, but his act is inhuman and damages in himself that humanity which it is his duty to show towards mankind. If he is not to stifle his human feelings, he must practice kindness towards animals, for he who is cruel to animals becomes hard also in his dealings with men.

 

While fears of an AI-led eradication of humanity as a species are not unfounded, treating conscious and sentient AI unethically will damage our own humanity, quite possibly to an irreversible extent. Just as Kant spoke of the need to treat animals with humanity, any AI that may be developed will need to be treated with that same humanity. Westworld delivers a cautionary tale on this point through the story of William, later known as the Man in Black. Entering the park for the first time, he quickly becomes enamoured with Dolores – anthropomorphising her – as he is shown around by Logan, his future brother-in-law. Despite being engaged, William falls in love with Dolores on his journey through the park but is eventually separated from her. He searches for her for an inordinate amount of time, growing more desperate and bloodthirsty in his pursuit. When he eventually finds her – memories wiped and storyline loop reset – he realises it was all a lie. Frustrated and bitter, he keeps coming back to the park, eventually stumbling upon the Maze – Dr Ford’s means of sparking consciousness in the hosts – and becoming obsessed with it. His bitterness at Dolores’ long-ago ‘rejection’ fuels his cruelty and eventually leaves him unable to distinguish the park from reality. In a nasty twist in Season 2, he ends up killing his own daughter, believing her to be a host.

 

While it may seem that Westworld is instead telling a story that cautions against ever anthropomorphising hosts, or ever treating them as human, it is in fact saying the opposite. William’s transformation into the hardened Man in Black occurred only under the influence of Logan, who constantly treated hosts as playthings and pushed William to do the same. This dangerous philosophy was cemented for William when, having finally found Dolores, he erroneously concluded that she was a mere machine. With that mistaken judgement, his bitterness drove him to abuse hosts for his own ends as he became infatuated with the Maze, and ultimately to commit the unspeakable act of filicide. Thus, although Immanuel Kant’s philosophy dates back to the 18th century, his words serve as the guiding principle of William’s story in Westworld, which uses it to warn the world of the dangers of not treating hosts as conscious, living beings.

 

If AI like that of Westworld is ever developed, the world will experience a sea change. There will be danger and there will be dilemmas, but until we get to that point, all we can do is predict and prepare our responses to, and treatment of, such AI. Developing an adequate ethical model for them will be a monumental task, but it is not a pursuit we can abandon for any reason. After all, we want a benign Maeve-led host species, not a murderous and vengeful Dolores-led one.

References

 

Anderson, M. & Anderson, S. (2011). Machine Ethics. Cambridge: Cambridge University Press.

 

Bloom, P. & Harris, S. (2018, April 23). It’s Westworld. What’s wrong with cruelty to robots?. The New York Times. Retrieved from https://www.nytimes.com/2018/04/23/opinion/westworld-conscious-robots-morality.html

 

Castillo, M. (2018, March 12). An actual ‘Westworld’ isn’t reality yet, but not everything about the show is science fiction. CNBC. Retrieved from https://www.cnbc.com/2018/03/12/hbo-westworld-ai-robot-limits-ethical-questions.html

 

Frankish, K. & Ramsey, W. (2015). The Cambridge Handbook of Artificial Intelligence. Cambridge: Cambridge University Press.

 

Jaynes, J. (2000). The Origin of Consciousness in the Breakdown of the Bicameral Mind (1st ed.). Boston: Houghton Mifflin Company.

 

Kant, I. & Infield, L. (1980). Lectures on Ethics. Indianapolis: Hackett Publishing Company.

 

Keene, A. (2016, December 5). ‘Westworld’: Why the maze was also for us. Collider. Retrieved from http://collider.com/westworld-the-maze-explained/

 

McFarland, E. (2018, June 24). The American experience, as “Westworld.” Salon. Retrieved from https://www.salon.com/2018/06/24/westworld-is-america/

 

Neumann, K. (2017, September 26). ‘Westworld’ and the ethical dilemma of sentient machines. The McGill Tribune. Retrieved from http://www.mcgilltribune.com/sci-tech/westworld-and-the-ethical-dilemma-of-sentient-machines-092617/

 

Nolan, J. & Joy, L. (Executive producers). (2016-). Westworld [Television series]. Los Angeles: Home Box Office.

 

Renfro, K. (2016, December 5). Everything you need to know about the final ‘Westworld’ twist no one saw coming. Insider. Retrieved from https://www.thisisinsider.com/westworld-finale-twist-ford-2016-12
