My Analysis of the Philosophy in Westworld
[ Published: November 24, 2016 ]
I think Westworld is fantastic. Fair warning: there are tons of spoilers in here, so don’t read this unless you are fully caught up and/or you don’t care about seeing/knowing things that might happen later.
The show is fundamentally about the nature of meaning. It’s asking questions like:
What is true meaning?
What makes meaning real vs. fake?
What meaning could we expect from coming technologies like VR, or even to a lesser extent from video games?
How do you pursue meaning?
How do you know if you’ve found it?
By about the third episode I could see where they were going with the themes, and I think they’re really good.
Here are the basic lessons that the show is teaching.
There’s fundamentally no difference between the meaning inside the game and outside
People are always chasing meaning. The people on the outside went inside to look for it, and the people on the inside want to get out so they can find it
You don’t have true meaning unless there are consequences to your actions
We don’t actually have free will any more than the hosts do; we were simply programmed by evolution the way the hosts were programmed by the creators of the game
And then one that I hope they get to but is one I wrote about myself:
Meaning comes from overcoming fundamental obstacles. If you don’t have challenge you can’t have meaning. And the more dire the challenge the better the meaning.
It seems like the final plays for the show will be the main characters realizing these things, and then settling into a loop that they can be happy with. Or dying to defend something they decided was important.
It’s all Absurdism in my mind. Once they see that meaning is ultimately constructed, and that it’s the same for humans and for hosts, they’ll fall back to position three in my breakdown from The Difference Between Existentialism, Nihilism, and Absurdism.
I find the search for meaning extremely interesting, and I write about it a lot.
My ultimate (so far) thoughts on this are captured in an essay of mine called The Future of Happiness as Digital Humans, written in August of 2015. It basically argues that the first inclination for designers will be to strip conflict and pain and suffering and desire from people, so that they’re not so unhappy all the time.
And it’ll be a major mistake, since happiness comes most from avoiding or overcoming unhappiness.
This isn’t purely philosophical. As I talk about in that piece, for humans it all goes back to evolution. We are programmed to value the things that make us 1) survive, and 2) reproduce. That’s how our incentives are built. The fundamental problem with human happiness is having those things handed to you without any effort.
If you want to have a meaningful life, you have to inject this depth of struggle back into it.
And if you think about Westworld, that’s what their loops are all about. It’s life and death. Kill or be killed. Save the damsel. Have lots of sex. Be the hero. Be the villain.
It’s returning people to a simpler, more primal time, where the animal self is revealed and required to act. And animal victory over existential threats leads to animal happiness of the deepest (natural) kind.
This is why Bertrand Russell argued (back in the 1950s) that we were so unhappy: our existential threats are basically solved for us, so our brains struggle to find meaning.
This is precisely the reason that Westworld exists within the story itself—to give someone that type of meaning again, even if it’s artificial.
This is also why Ed Harris’s character is trying to go deeper, to the real challenge. The more you have to lose, the higher the quality of the meaning you get if you win.
So that’s my analysis. It’s about how humans are on their own loops as well, how there’s no real difference between humans and the hosts, and how artificial reality itself really is.
If you want happiness you just have to find a loop you like and enjoy the ride.