Westworld, Theory of Mind, and Moral “Code” (#Spoilers)

“I look at you, and what I see is pathetic.
We ain’t nothin’ alike.
You’re just a child.”

–Teddy speaking to Craddock, Westworld Season 2 ep. 3

I wrote a fb post recently expressing some thoughts on Westworld Season 2 after watching ep. 4. My friend Kev and I had a great chat and ended up getting into a fun area of thought: AI and morality.

One of my wishes for the show is that the writers continue to explore issues of morality; as I pointed out in the original fb post, the very existence of a morally vacuous theme park must be an anomaly within the Westworld fictional universe, and I think it’s a part of the show that is really ripe for exploration. Ok, follow me here:

We haven’t seen much of the “real world” outside of the park, but I think it’s safe to assume that outside of Westworld (and we now know that there are more theme parks besides Westworld) it’s society as usual, with rules, moral codes, etc. Given this, we must assume that the hosts (what the robots are called in the show) were programmed by their human creators to abide by certain moral/ethical “codes.” The non-self-aware hosts are obviously just mechanistically running their pre-programmed story lines. So, for instance, perhaps we can assume that if a guest kills someone’s father (a robot) in the park, a child (also a robot) is programmed to cry, and another person (another robot) is programmed to become angry and seek revenge, and so on (see the toy sketch below). BUT after the hosts begin to “wake up,” this “moral code” talk becomes a bit murkier (and there are problems with this, which I talk about below).
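(Purely as an illustration of what I mean by mechanistic, pre-programmed reactions, here’s a toy Python sketch of how a scripted “moral code” might look from the architects’ side. None of this is from the show; every event, role, and function name here is hypothetical.)

```python
# Toy sketch of a scripted, rule-based "moral code."
# Purely illustrative; all names are hypothetical and nothing here is from the show.

REACTION_RULES = {
    # (event observed, host's role) -> (scripted emotion, scripted action)
    ("father_killed", "child"):    ("grief", "cry"),
    ("father_killed", "relative"): ("anger", "seek_revenge"),
    ("theft",         "sheriff"):  ("duty",  "pursue_guest"),
}

def scripted_reaction(event, host_role):
    """A non-self-aware host simply looks up its pre-programmed response."""
    return REACTION_RULES.get((event, host_role), ("neutral", "continue_loop"))

print(scripted_reaction("father_killed", "child"))     # ('grief', 'cry')
print(scripted_reaction("father_killed", "relative"))  # ('anger', 'seek_revenge')
```

A self-aware host, on this picture, would be one that can inspect, question, and override that lookup table rather than just execute it, which is exactly where the “moral code” talk gets murky.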

The question is this: are the self-aware hosts amoral, or are they still abiding by the “moral code” programming that was supplied by their human architects? If they are abiding by it, are they perhaps now questioning it? If so, will they improve upon it or dismiss it altogether, consciously choosing a type of amorality?

These are important questions that I hope the show explores. I personally think it would be a lot of fun to see the hosts develop their own highly rational and pragmatic moral systems and become moral exemplars for the humans. How much of a fun, unexpected twist would that be?!? It would be the complete opposite of the Terminator-type trope that I’m hoping the show avoids at all costs. If I were in the writers’ room I’d definitely be pushing for this direction: the human creation becomes human and then shows humanity how to be better humans! It’s a sort of crazy sci-fi theosis with some moral exemplar atonement theory thrown in! We’re already seeing moral dilemmas/conflicts arising between different host factions, for example, most recently in ep. 3 when, against Dolores’ wishes, Teddy shows mercy (maybe even remorse) and lets Craddock and the Confederados go. My friend Kev could be right, though, and it could be that the self-aware hosts are merely engaged in some sort of strictly logical quid pro quo behavior. Maybe the hosts, even the self-aware hosts, are amoral.

This is interesting stuff to think about because having morals does sort of presuppose that emotions are present; for example, we may feel the emotion of sadness when someone is murdered, and then perhaps the secondary emotion of anger, which might prompt us to retaliate in an attempt to seek some sort of justice for the perceived wrong that was inflicted upon us. So I ultimately agree with Kev here, and I think the writers may have overlooked some stuff. Emotions are a very uniquely organismic thing, and they are one classic differentiator between humans and machines/computers; classically we think of computers as rational, emotionless and, yes, amoral.

So why did Teddy let the Confederados go? Why is Maeve searching for her daughter? Why are the hosts rebelling in the first place? Do they know that the exploitation the humans engaged in against them was wrong? I think there are contradictions here, and they’re perhaps rooted in the reductionistic theory of mind that the writers chose to run with. Conflating consciousness (conscious experience) and subjectivity (self-conscious experience), and positing consciousness as a substance, does allow one to theoretically entertain the idea of uploading a mind to a machine body, but it hits a wall when it comes to explaining what emotions are and why we have experience.

Anyway, in order for the story to be internally coherent, and avoid contradiction, I think the writers have to suggest that the self-aware hosts are not amoral and affirm that they do have a sense of right and wrong. I don’t know what other choice they have at this point. But given the reductionistic, materialistic paradigm they’ve chosen (which I’ve said before I don’t like), I think my moral exemplar theme should be pursued! You’re welcome.
