Westworld (TV series)


'Westworld' (TV series). HBO 2016-2018 (third season announced and covered separately here). Jonathan Nolan and Lisa Joy, credited as creators. First two seasons: Nolan director of two episodes, with script credits on 20; Joy director of one episode, with script credits on 20. Nolan and Joy share executive producer credit with J. J. Abrams and six others. "Series Production Design": Howard Cummings (10 episodes, 2018), Zack Grobler (nine episodes, 2016), Nathan Crowley (one episode, 2016). Based on Michael Crichton's WESTWORLD (film), 1973.[1][2]


Highly human-looking cyborg/androids serve as "Hosts" to the Guests in the Westworld theme park. The androids can easily pass the Turing Test and pass for human, but the issue raised early in 'Westworld' is not AI or — in the first instance — android free will but 'consciousness', human and human-manufactured. Also relevant and highly powerful are the images and sounds, starting with a player piano and Leonardo's Vitruvian Man[3] transmuted into an impressive moment in the manufacture of the Hosts of Westworld and moving into images of the androids after they have been killed — often with graphic violence — by the Guests at Westworld. We have, then, a kind of visual paradox of manufactured creatures as torn and bleeding flesh and blood, with the gore sometimes powerfully juxtaposed with — initially in the series — suggestions of mechanical innards. And these androids, with far more certainty than any human being, will be resurrected. Note that in episode 5, "Contrapasso" (30 Oct. 2016), Ed Harris's Man in Black reveals that when he first came to Westworld the Hosts were flesh-covered robots, beautifully made, but in updated versions were recreated as androids because flesh was "more cost-effective."

In interaction with the androids is Westworld itself in its many levels: the apparently vast spaces of the American West, contained within the figurative mechanism of the theme park and controlling corporation. And within this huge theatrical space, Hosts and Guests play out narratives that are repeated with variations for each new set of customers. Established then — and made explicit later in expository dialog — is memory as the basis of consciousness, and the androids as unconscious/non-human because they repeat the scenarios, with carefully limited possibilities for innovation, and then don't remember what has happened. That forgetting is in part a good thing, since what happens to them at the hands (etc.) of the Guests is often horrible, and that horror is what many of the Guests are paying for. More exactly, all of the androids, in theory, have no memories or real chance to break out of their assigned narratives: herein lies one set of plotlines for the series.

Related to the constraints on the androids of the looped and repeated narratives is their programmed inability to hurt Guests in any serious way, and the limitations of the effects on Guests of any of the weapons the Hosts can use (cf. the original film and Asimov's First Law of Robotics).[4] The possibility of the androids rebelling, like adolescent children, is raised in the dialog early on, although many in the audience will know the original movie or have enough experience of science fiction stories — or perhaps stories generally — to know that rebellion is likely, if not inevitable. Indeed, given the violence visited upon the androids, the absence of rebellion would be an ethical offense by the authors.

A technician and the audience are told explicitly by the creator of the androids that these androids feel neither shame nor pain, and when they are in sleep mode behind the scenes this seems to be the case. In the Westworld theme park and in character, however, the Hosts definitely appear able to feel pain and a wide range of emotions, and inflicting pain on them is part of the kick for Guests who want to go bad and seek sadistic thrills. Not feeling shame is inhuman — or pathological: see the conclusion of the myth of the Garden in the Biblical Book of Genesis (3.4-12). Feeling pain is a crucial difference between a human being and a hunter-killer Terminator in THE TERMINATOR, and feeling is crucial to the pathos of the death [sic] of HAL 9000 in 2001: A SPACE ODYSSEY (film); and the appearance of suffering is how we infer that nonhuman and human animals are sentient (sense 1)[5] and suffering. If we add our consciousness of our "willing suspension of disbelief, for the moment," and our background knowledge that in watching Westworld we are watching electronic images on a screen of actors on nested sets, the question of feeling by the androids becomes usefully complex as we in the television audience are both pained — let us hope — and entertained by the televised narratives in which the androids may be suffering.


EPISODES OF NOTE

The issue of free will is introduced explicitly in Episode 6, "The Adversary" (6 Nov. 2016), as is the distinction between Guests/humans and Hosts, common in the literature: "I was born; you were made." In this episode and passim, note everyday technology from various periods and the visual relations of that technology to humans and Hosts in frame. In addition to the 19th-c. train, with its passengers, note elevators and, in this episode, the use of a US Civil War Gatling gun and an impressive extended shot of an escalator from above, giving the impression of an M. C. Escher maze engulfing a human rider.
In Episode 7, "Trompe L 'Oeil" ("optical illusion" [13 Nov. 2016]),[6] there's a foregrounding of "playing God" in the creation of Westworld and gods vs. men as Hosts might see humans if/when they see the underground working of the park. What might have been a mere reiteration of a pious SF cliché is rendered significant in the excellent presentation of the Pride issue and return in the dialog and plot to the relationships of memory, improvisation, and what might be the possibility of free will and rebellion among the android Hosts. 
Episode 8, "Trace Decay" (20 Nov. 2016) is notable for a scene with Bernard and Ford, with the human Westworld co-creator Ford definitely in control and Bernard, here, clearly without effective free will. Balancing the criterion of free will for humanity is Bernard's show of emotions over a murder vs. Ford's psychopathic lack of response; and balancing this sequence is Maeve's recovering memories and achieving a good deal of apparently willed and willful control. This tension is expressed later in dialog with Ford commenting to Bernard on Bernard's unique situation: "A programmer who knows intimately how the machines work, and a machine who knows its own true nature." Bernard brings in feelings and memories — and Ford notes that Bernard's backstory isn't that different from the stories humans tell ourselves as to who we are: "The self is a kind of fiction, for Hosts and humans alike. It's a story we tell ourselves." The centrality of narratives is a motif in this version of Westworld relating its themes to academic debate of the late 20th century, as does the blurring of boundaries between the human and — in an odd but important formulation — the not "fully alive." Again, cf. and contrast that death [sic] of HAL 9000 in 2001: A SPACE ODYSSEY (film), where there is clearly no sense in which HAL has ever been biologically alive or the idea in SHORT CIRCUIT (1986) that "Number 5 is alive" when Number 5 is a thoroughly inorganic robot. 
Episode 9, "The Well-Tempered Clavier" (27 Nov. 2016). Note for standard move of android robot/cyborg peeling back skin and revealing mechanism, as in BILL & TED'S BOGUS JOURNEY and THE TERMINATOR, but done to a sympathetic female Host, with violence, and in a tragic mode. More literally classic is a Descent into the Underworld[7] but with Dolores taking an elevator from a confessional into the basement areas of the Westworld theme park; like Odysseus, Aeneas, or Odin, she gets crucial information. Important episode for development of the theme of the centrality of memory and pain for consciousness, and the suggestion both of "The Bicameral Mind" — handled briefly earlier and the title of the next Episode — and that the Hosts may have been created innocent by humans who, as a species, are no damn good. 
About two minutes into Episode 10, "The Bicameral Mind"[8] (4 Dec. 2016), we see a nearly fleshless early model of Dolores, showing her human(oid) form in her head and neck but, from the neck and sternum down, a skeleton of inorganic material: metal and/or composite of some sort; this gives a visual Metaphysical conceit of the Host equivalent of Homo duplex: instead of humans as flesh and spirit, we glimpse the early versions of the Hosts as living beings and machines, literal mechanisms. AI is never in question in Westworld, but artificial, or machine, consciousness is a major theme, and we're told in this episode explicitly that consciousness is achieved by a quest not upward but inward, through the suffering not of climbing a pyramid but of reaching the center of a maze: with the "objective correlative" of a maze, most concretely here the old child's game of Pigs in Clover, with the goal of tilting the maze to get one or more small balls through it to rest at the center.[9] (The maze here is a game, the pyramid form is most explicit in a buried church, and both the pyramid and the Labyrinth are ancient symbols that have retained resonance.) ¶ In what is minimally a theater piece — although the actors are living (sic) their parts — within a staged celebration, within the constructed world of the theme park, Dolores and Teddy reach where land meets water, which is where Dolores wants to die, and — in the playlet — does die; and then (apparently) she and Teddy shut down. For this climax and conclusion of the playlet, cf. and contrast the quest for the sea in one of the linked stories in Silverberg's The World Inside and the conclusion of THE TRUMAN SHOW, where Truman Burbank gets to a final artificial sea to escape his role within The Truman Show into the role of Romantic Hero, entering an outer world clearly labelled with the Hollywood sign. Driving home the point of the theatricality of the Dolores/Teddy love at this stage (or in this loop) is the repeated use of the Friar's line in Shakespeare's Romeo and Juliet, "These violent delights have violent ends." Decorously and ironically, given the use of that quotation, the episode ends with Dolores's shooting the Westworld co-creator Ford and then other Guests, which may or may not be on her part an act of will. Parallel to the action centered on Dolores, Maeve frees some Hosts at least enough to kill — bloodily — Westworld "butcher" technicians and security personnel and (apparently) to be able to escape herself without blowing herself up; for the last point cf. INSURGENT (film) and the works cited there. Having been given what's purported to be the location of her daughter, Maeve chooses not to leave Westworld. Since the death of her daughter (in one loop) is crucial to the identity established by her backstory, on the one hand, and suffering is crucial for consciousness and the potential for choice on the other, the degree of Maeve's freedom here is a difficult question, but a familiar one for humans. Suffering is important for consciousness in the tradition of tragic drama (with Shakespeare's King Lear a primary example); how there can be free will in a world of an omnipotent and omniscient God is also a standard question, with the possibility of the truest freedom consisting in bringing one's will into line with God's.
For the Hosts in Westworld, however, "As flies to wanton boys are we to the gods: / They kill us for their sport"[10], and the Guests and staff are not gods but humans often behaving very badly — wills frequently freed from conscience or compassion, acts greatly freed from consequence — making the question of Host free will and, in any event, political freedom, more interesting and relevant.
========================

Discussed by Maximiliano Jiménez, "Immersion and Fictionality in Westworld," which see.

Reviewed by Sonya Dyer, SFRA Review #319 (Winter 2017): pp. 25-26; see especially for issues of race and gender.[11]

Seasons 2-3 reviewed by Amandine Faucheux, SFRA Review 51.1 (Winter 2021), also very good on issues of race and gender — and revolutionary politics. One caution: there are (largely male) sub-cultures in which trading insults can be bonding ("flyting," "busting," "cutting," "signifying on"), so attend carefully to such exchanges, however correct Faucheux may be here.[12]


Note in the series Popular Culture and Philosophy, Westworld and Philosophy: Mind Equals Blown (Chicago: Open Court, 2019), reviewed favorably (with one qualm) by Robert J. Creedon, SFRA Review 50.2-3 (Spring-Summer 2020).[13][14]


RDE, Initial Compiler, 12-21 July, 1Aug18, 25May20; 28Oct21, 5Nov21