{"id":124,"date":"2018-11-23T12:00:43","date_gmt":"2018-11-23T04:00:43","guid":{"rendered":"https:\/\/digitalpatmos.com\/vol3issue5\/?p=124"},"modified":"2019-05-10T15:29:26","modified_gmt":"2019-05-10T07:29:26","slug":"humanity-for-ai-whats-the-westworld-verdict","status":"publish","type":"post","link":"https:\/\/digitalpatmos.com\/vol3issue5\/2018\/11\/23\/humanity-for-ai-whats-the-westworld-verdict\/","title":{"rendered":"Humanity for AI: What\u2019s the Westworld Verdict?"},"content":{"rendered":"<h3><strong>Introduction<\/strong><\/h3>\n<p><em>Westworld\u00a0<\/em>(Nolan &amp; Joy, 2016-), with its centrepiece of neo-sentient android \u201chosts\u201d, is a fascinating television series. Featuring a real-life, sandbox-like theme park, guests pay exorbitant amounts of money to be able to live a second life in places such as in the titular Wild West, known in the show as the Westworld park. Period-accurate buildings, host-operated businesses, and relevant modes of transportation all exist, and the world is populated by these hosts who are otherwise indistinguishable from humans and unaware that they are in fact hosts. In such parks, guests can engage in whatever activity strikes their fancy, all with the promise of the inability of the hosts to hurt them back. Naturally, many guests treat the hosts as nothing more than objects; using, abusing, and killing them as they please. By the end of the first season, two host protagonists eventually develop sentience and consciousness \u2013 Maeve who wants to coexist with humanity, and Dolores who wants to exact revenge upon humanity for all the abuse hosts endured.<\/p>\n<p>&nbsp;<\/p>\n<p>If the likely theory that a complex enough neural system eventually results in consciousness is true, it naturally follows that a conscious machine would be able to develop the capacity for suffering, and therein lies the crux of the moral dilemma presented in the show. 
As portrayed in <em>Westworld<\/em>, these machines \u2013 hosts \u2013 will be intellectual property representing a great deal of monetary investment, and the company making them will thus lay claim to \u2018owning\u2019 them. On the other hand, their consciousness and sentience &#8212; as developed further in Season 2 &#8212; theoretically gives them the right to their own lives. While their biological processes differ from those of humans, their processing units are modelled after human brains, albeit with a few technological gizmos that aid them in their function as hosts. Their organs too, while synthetic, feel the same pain as humans do, further blurring the line between a host and human. Matters are further muddied since one cannot outwardly tell the difference, even when in conversation with a host. While it is easy to get lost in the bombastic visuals and stunning scores, the dilemma presented by the show is stark \u2013 how much humanity should be accorded to these hosts? They aren\u2019t humans (in fact, they are <em>made<\/em> by humans), but at the same time are self-aware, conscious, and sentient. Can we humans really own and subjugate these hosts just because we made them?<\/p>\n<figure style=\"width: 1920px\" class=\"wp-caption aligncenter\"><img loading=\"lazy\" decoding=\"async\" class=\"size-medium\" src=\"https:\/\/static.businessinsider.com\/image\/582b3c48dd0895516d8b48a6\/image.jpg\" width=\"1920\" height=\"1080\" \/><figcaption class=\"wp-caption-text\">Dr Ford (left) and Bernard (right) &#8212; a host fashioned exactly after Arnold<\/figcaption><\/figure>\n<p>The series approaches this moral quandary in a thought-provoking manner. While it is a television show (and thus cannot be fully scientifically accurate), it uses a theorised model of consciousness development in which the hosts must suffer in order to evolve from mere machines into truly living beings. This model is championed by the hosts&#8217; makers, Arnold and Dr Ford. 
The two of them see the hosts as living beings (although it took Arnold\u2019s death for Dr Ford to do so), while the manufacturer, Delos Industries, and the guests treat them as mere machines much like our current understanding of a computer \u2013 except more advanced. The presentation of this dilemma holds many parallels to the course the real world is on regarding \u2018conscious Artificial Intelligence (AI)\u2019 development and establishes itself as a warning in our exploration of the scientific unknown, as so many apocalyptic texts have done before.<\/p>\n<p>&nbsp;<\/p>\n<p>In this article, I will demonstrate that the TV show <em>Westworld<\/em> takes a stance in favour of giving these hosts the ability to exist as conscious, living beings and according them rights befitting that conscious existence. The show visually impresses upon viewers the reasons against treating the hosts as simple machines designed to do humanity\u2019s bidding and highlights the need for the ethical consideration befitting conscious, self-aware, and sentient beings.<\/p>\n<h3><strong>Host Reality and Bootstrapped Consciousness<\/strong><\/h3>\n<p>In only the second episode of the first season of <em>Westworld<\/em>, viewers are presented with the question central to deciding how these hosts should be treated. Upon first arriving at the park, a young William is greeted and oriented by the Westworld employee Angela. 
Having seen in the previous episode that man and host are outwardly indistinguishable, viewers are left, just like William, unaware whether she is human or machine.<\/p>\n<p>&nbsp;<\/p>\n<p><iframe loading=\"lazy\" src=\"https:\/\/www.youtube.com\/embed\/jAygEy8dujg\" width=\"1526\" height=\"593\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\"><\/iframe><\/p>\n<blockquote>\n<p style=\"text-align: left;\"><strong>William <\/strong>(curious): Are you real?<\/p>\n<p style=\"text-align: left;\"><strong>Angela<\/strong>: If you can\u2019t tell, does it matter?<\/p>\n<\/blockquote>\n<p>Through this exchange, <em>Westworld <\/em>effectively sets the tone for the rest of the season \u2013 if the hosts are so far advanced that they are \u2018human\u2019 in all aspects but biology, then what does it matter that they are technically machines when it comes to morality and ethics? By having William be asked that question, the writers indirectly ask the audience the same, and in so doing open the viewer\u2019s mind to an exploration of this topic.<\/p>\n<p>&nbsp;<\/p>\n<p>At that stage, none of the hosts is truly conscious, and it seems that Dr Ford is the furthest thing from a benevolent creator. Throughout the 35 years the park is open, he treats the hosts with perceived disdain \u2013 ripping away sheets covering hosts in the lab and cutting their skin just to prove that whether it be shame or pain, \u201cit (the host) doesn\u2019t feel a thing that we haven\u2019t told it to\u201d. There is, however, a method to his madness. Just prior to the park\u2019s opening, Dr Ford\u2019s partner, Arnold, had tried to generate consciousness in the hosts through his code. However, Dr Ford had stopped him then, fearing the park would be shut down before it opened. It was later that he would regret his decision to deny these hosts their rights, and in searching for a way to repent, would find that Arnold\u2019s ideas only gave an illusion of consciousness. 
True consciousness, Dr Ford realised, required suffering \u2013 all 35 years of it. The hosts endured abuse and injustice, treated as nothing more than machines, and it was through their memories of all this torment, suffered for human entertainment over the years, that they would achieve consciousness on their own.<\/p>\n<p>&nbsp;<\/p>\n<p><iframe loading=\"lazy\" src=\"https:\/\/www.youtube.com\/embed\/KUDHyo7PRWw\" width=\"1526\" height=\"593\" frameborder=\"0\" allowfullscreen=\"allowfullscreen\"><\/iframe><\/p>\n<blockquote>\n<p style=\"text-align: left;\">And for my pains&#8230; I got this (Westworld). A prison of our own sins. &#8216;Cause you don&#8217;t want to change, or cannot change. Because you&#8217;re only human after all. But then I realized someone was paying attention, someone who could change. So, I began to compose a new story for them. It begins with the birth of a new people and the choices they will have to make and the people they will decide to become. . . . This time by choice.<\/p>\n<\/blockquote>\n<p>In his ominous final speech to Delos Industries\u2019 board members (seen directly above), Dr Ford takes pains to point out the agency that he very carefully ensured the hosts would have when they became truly conscious. When Dolores kills Dr Ford, it is of her own volition, as opposed to 35 years prior when Arnold had instructed her to kill him (via Dolores\u2019 code). It is thus clear that Dr Ford, while not the first, was one of the few people in <em>Westworld <\/em>who recognised the hosts for what they could truly become \u2013 mechanical but living beings \u2013 and tirelessly worked to achieve that. While the show does not offer an easy path to this conclusion, it lays out enough of the necessary groundwork throughout the first season. 
It thus delivers a strong warning against treating such advanced AI as mere machines in a possible real-world future, and makes clear its conviction that the hosts should be allowed to fulfil their potential as conscious beings.<\/p>\n<p>&nbsp;<\/p>\n<p>As mentioned earlier, <em>Westworld <\/em>does not shy away from attempting to explain possible scenarios from which consciousness can arise in AI such as hosts. Sentience and the origin of consciousness have long been debated and researched, with both scientists and philosophers unable to do much more than craft theories. The show itself made no claim towards a definite answer but used the theory of the brain\u2019s internal voice akin to an extrinsic God issuing instructions \u2013 the concept of the Bicameral Mind \u2013 as a launchpad to explain how machines could be bootstrapped to generate consciousness. According to the theory, one hemisphere of the brain issues instructions (the \u2018God\u2019), and the other acts upon them (Jaynes, 2000). If consciousness can indeed emerge from a sum of definitively non-conscious parts, theories like the Bicameral Mind (if not others) may well be used in the real world to attempt to create conscious machines. In <em>Westworld<\/em>, Dr Ford (and Arnold before him) pursued this theory because he believed in the need for the hosts to break free of their prison as playthings for humanity even though he had no obligation to have the hosts develop consciousness. However, since he knew of the hosts\u2019 potential and their ability to feel pain, he realised the importance of treating them as sentient beings and driving them to a point where they were able to make decisions for themselves. In his eyes \u2013 or really, <em>Westworld<\/em>\u2019s \u2013 erasing host memories and continuing to abuse them was little different from doing the same to a human. 
Dolores eventually achieves the consciousness he desired, realising that the voice she had been hearing in her head since her first day was her own \u2013 the breakdown of the Bicameral Mind made literal.<\/p>\n<h3><strong>The Man in Black and Humanity<\/strong><\/h3>\n<figure style=\"width: 2560px\" class=\"wp-caption alignnone\"><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/timedotcom.files.wordpress.com\/2016\/10\/hbo-westworld-09.jpg\" alt=\"\" width=\"2560\" height=\"1707\" \/><figcaption class=\"wp-caption-text\">William, also known as the Man in Black<\/figcaption><\/figure>\n<p>While there are arguments against the need to treat such AI ethically, <em>Westworld <\/em>makes it clear that its stance is very much in favour of treating them with humanity. Humans very easily anthropomorphise inanimate objects &#8212; a by-product of our nature &#8212; and attribute feelings even to simple AI like Siri or Google Assistant. With hosts like those in <em>Westworld<\/em> shown to look and behave almost identically to humans, these feelings will increase significantly in strength and frequency even with the prior knowledge that a fellow \u2018human\u2019 is in fact a host. Seeing them treated like objects and property, even if not abused, is likely to churn stomachs. This reaction to the treatment of other species is a core tenet of our humanity, as Immanuel Kant pointed out, albeit about animals (Kant &amp; Infield, 1980):<\/p>\n<p>&nbsp;<\/p>\n<blockquote><p>If a man shoots his dog because the animal is no longer capable of service, he does not fail in his duty to the dog, for the dog cannot judge, but his act is inhuman and damages in himself that humanity which it is his duty to show towards mankind. 
If he is not to stifle his human feelings, he must practice kindness towards animals, for he who is cruel to animals becomes hard also in his dealings with men.<\/p><\/blockquote>\n<p>&nbsp;<\/p>\n<p>While fears of an AI-led eradication of humanity as a species are not unfounded, treating conscious and sentient AI unethically will damage our own humanity, quite possibly to an irreversible extent. Just as Kant spoke about needing to treat animals with humanity, any AI that may be developed will need to be treated with that same humanity. <em>Westworld <\/em>delivers a cautionary tale highlighting the need to do so through the story of William, later known as the Man in Black. Entering the park for the first time, he quickly becomes enamoured with Dolores \u2013 anthropomorphising her as he is shown around by the park owner\u2019s son, Logan. Despite being engaged, William falls in love with Dolores on his journey through the park but is eventually separated from her. He searches for her for an inordinate amount of time, growing more desperate and bloodthirsty in his pursuit. However, when he eventually finds her \u2013 memories wiped and storyline loop reset \u2013 he realises it was all a lie. Frustrated and bitter, he keeps coming back to the park and eventually stumbles upon the Maze \u2013 Dr Ford\u2019s way to spark consciousness in hosts \u2013 and becomes obsessed with it. His bitterness at Dolores\u2019 rejection years before fuels his cruelty, and eventually results in his inability to distinguish the park from his reality. In a nasty twist in Season 2, he ends up killing his own daughter after believing her to be a host.<\/p>\n<p>&nbsp;<\/p>\n<p>While it may seem that <em>Westworld <\/em>is instead painting a story cautioning against ever anthropomorphising hosts, or ever treating them as humans, it is in fact saying the opposite. 
William\u2019s transition into the hardened Man in Black occurred only under the influence of Logan, who constantly treated hosts as playthings and pushed William to do the same. This dangerous philosophy was solidified for William when he found Dolores after his search and erroneously concluded that she was a mere machine. Armed with this incorrect judgement, his bitterness drove him to abuse hosts to his own ends as he became infatuated with the Maze, and ultimately to commit the unspeakable act of filicide. It is thus clear that although Immanuel Kant\u2019s philosophy dates back to the 18<sup>th<\/sup> century, <em>Westworld <\/em>uses his words as a guiding principle for William\u2019s story, warning the world of the dangers of not treating hosts as conscious, living beings.<\/p>\n<p>&nbsp;<\/p>\n<p>If AI like that in <em>Westworld<\/em> is developed, the world will experience a sea change. There will be danger and there will be dilemmas, but until we get to that point, all we can do is predict and prepare our responses to and treatment of such AI. Developing an adequate ethical model for them is going to be a monumental task, but we cannot abandon the pursuit for any reason. After all, we want a benign Maeve-led host species, not a murderous and vengeful Dolores-led one.<\/p>\n<h6><strong>References<\/strong><\/h6>\n<p>&nbsp;<\/p>\n<p>Anderson, M. &amp; Anderson, S. (2011). <em>Machine Ethics<\/em>. Cambridge: Cambridge University Press.<\/p>\n<p>&nbsp;<\/p>\n<p>Bloom, P. &amp; Harris, S. (2018, April 23). It\u2019s Westworld. What\u2019s wrong with cruelty to robots? <em>The New York Times.<\/em>\u00a0Retrieved from <a href=\"https:\/\/www.nytimes.com\/2018\/04\/23\/opinion\/westworld-conscious-robots-morality.html\">https:\/\/www.nytimes.com\/2018\/04\/23\/opinion\/westworld-conscious-robots-morality.html<\/a><\/p>\n<p>&nbsp;<\/p>\n<p>Castillo, M. (2018, March 12). 
An actual &#8216;Westworld&#8217; isn&#8217;t reality yet, but not everything about the show is science fiction. <em>CNBC.\u00a0<\/em>Retrieved from <a href=\"https:\/\/www.cnbc.com\/2018\/03\/12\/hbo-westworld-ai-robot-limits-ethical-questions.html\">https:\/\/www.cnbc.com\/2018\/03\/12\/hbo-westworld-ai-robot-limits-ethical-questions.html<\/a><\/p>\n<p>&nbsp;<\/p>\n<p>Frankish, K. &amp; Ramsey, W. (2015). <em>The Cambridge Handbook of Artificial Intelligence<\/em>. Cambridge: Cambridge University Press.<\/p>\n<p>&nbsp;<\/p>\n<p>Jaynes, J. (2000). <em>The Origin of Consciousness in the Breakdown of the Bicameral Mind<\/em> (1st ed.). Boston: Houghton Mifflin Company.<\/p>\n<p>&nbsp;<\/p>\n<p>Kant, I. &amp; Infield, L. (1980). \u00a0<em>Lectures on Ethics.<\/em>\u00a0Indianapolis: Hackett Publishing Company.<\/p>\n<p>&nbsp;<\/p>\n<p>Keene, A. (2016, December 5). &#8216;Westworld&#8217;: Why the maze was also for us. <em>Collider.\u00a0<\/em>Retrieved from <a href=\"http:\/\/collider.com\/westworld-the-maze-explained\/\">http:\/\/collider.com\/westworld-the-maze-explained\/<\/a><\/p>\n<p>&nbsp;<\/p>\n<p>McFarland, E. (2018, June 24). The American experience, as &#8220;Westworld.&#8221; <em>Salon.<\/em>\u00a0Retrieved from <a href=\"https:\/\/www.salon.com\/2018\/06\/24\/westworld-is-america\/\">https:\/\/www.salon.com\/2018\/06\/24\/westworld-is-america\/<\/a><\/p>\n<p>&nbsp;<\/p>\n<p>Neumann, K. (2017, September 26). &#8216;Westworld&#8217; and the ethical dilemma of sentient machines. <em>The McGill Tribune<\/em>. Retrieved from <a href=\"http:\/\/www.mcgilltribune.com\/sci-tech\/westworld-and-the-ethical-dilemma-of-sentient-machines-092617\/\">http:\/\/www.mcgilltribune.com\/sci-tech\/westworld-and-the-ethical-dilemma-of-sentient-machines-092617\/<\/a><\/p>\n<p>&nbsp;<\/p>\n<p>Nolan, J. &amp; Joy, L. (Executive producers). (2016-). <em>Westworld<\/em> [Television series]. Los Angeles: Home Box Office.<\/p>\n<p>&nbsp;<\/p>\n<p>Renfro, K. (2016, December 5). 
Everything you need to know about the final &#8216;Westworld&#8217; twist no one saw coming. <em>Insider.\u00a0<\/em>Retrieved from <a href=\"https:\/\/www.thisisinsider.com\/westworld-finale-twist-ford-2016-12\">https:\/\/www.thisisinsider.com\/westworld-finale-twist-ford-2016-12<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Introduction Westworld\u00a0(Nolan &amp; Joy, 2016-), with its centrepiece of neo-sentient android \u201chosts\u201d, is a fascinating television series. Featuring a real-life, sandbox-like theme park, guests pay exorbitant amounts of money to be able to live a second life in places such as in the titular Wild West, known in the show as the Westworld park. Period-accurate &hellip; <\/p>\n<p class=\"link-more\"><a href=\"https:\/\/digitalpatmos.com\/vol3issue5\/2018\/11\/23\/humanity-for-ai-whats-the-westworld-verdict\/\" class=\"more-link\">read more<span class=\"screen-reader-text\"> &#8220;Humanity for AI: What\u2019s the Westworld Verdict?&#8221;<\/span><\/a><\/p>\n","protected":false},"author":6,"featured_media":127,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[4],"tags":[6,8,12,10,7,9,11,13,5],"class_list":["post-124","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-posts","tag-ai","tag-artificial","tag-awareness","tag-consciousness","tag-ethics","tag-intelligence","tag-morality","tag-sentience","tag-westworld"],"_links":{"self":[{"href":"https:\/\/digitalpatmos.com\/vol3issue5\/wp-json\/wp\/v2\/posts\/124","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/digitalpatmos.com\/vol3issue5\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/digitalpatmos.com\/vol3issue5\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/digitalpatmos.com\/vol3issue5\/wp-json\/wp\/v2\/users\/6"}],"replies":[{"embeddable":true,"href":"https:\/\/digitalpatmos.com\/vol3issue5
\/wp-json\/wp\/v2\/comments?post=124"}],"version-history":[{"count":12,"href":"https:\/\/digitalpatmos.com\/vol3issue5\/wp-json\/wp\/v2\/posts\/124\/revisions"}],"predecessor-version":[{"id":181,"href":"https:\/\/digitalpatmos.com\/vol3issue5\/wp-json\/wp\/v2\/posts\/124\/revisions\/181"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/digitalpatmos.com\/vol3issue5\/wp-json\/wp\/v2\/media\/127"}],"wp:attachment":[{"href":"https:\/\/digitalpatmos.com\/vol3issue5\/wp-json\/wp\/v2\/media?parent=124"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/digitalpatmos.com\/vol3issue5\/wp-json\/wp\/v2\/categories?post=124"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/digitalpatmos.com\/vol3issue5\/wp-json\/wp\/v2\/tags?post=124"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}