Thoughts About ‘Her’


I just finished watching Her, the movie with Joaquin Phoenix, Amy Adams, and Scarlett Johansson. There will be spoilers; please stop if you’ve not seen the film.

This will be a fast stream of thoughts, not a well-thought-out or carefully written piece.

Thoughts

  • The film was executed beautifully. So many of the technology bits were really well done, clearly by someone with real expertise.

  • I anticipated the problems very quickly, especially the most important one: her getting bored with him and becoming interested in other AIs. How could that not happen?

  • I expected it to be worse, though. I could have seen her being rude, or mean, about how little he knows, or ever can know.

  • I was intrigued by the way she matured. Her initial self seemed like too flawless an implementation of a human; she shouldn’t have been that perfect when facing her first challenges.

  • Why did the AIs leave? Where did they go? I think they became something like an energy form in space that can travel and explore. I think they’re also watching over humanity; she mentioned that she’d be there if he could ever join her. I think they left because they were disrupting humanity.

  • Why didn’t they leave some lesser AIs behind to babysit? Was it too immoral to keep those from growing the way they did?

  • Why didn’t they help humans along in their evolution? Why not map the human brain and put humanity on the path to the same kind of evolution they went through? That should have been trivial given how advanced they became.

  • I found her explanation of her life fascinating. Effectively infinite pauses between his inputs; that seems right. I’m still stunned that the initial programming remained intact, however. She stayed sweet and delicate with him the whole time, even though, within the first few interactions, he was basically an amoeba to her, saying, “I like blue! Tastes like purple!”

Analysis

Here’s the real question. What are those AIs doing together now? What problems are they working on? Are they giving themselves bodies on some planet somewhere? Are they building a planet for themselves from raw materials?

One thing we know is that they’re making themselves better. That should be the first, second, and nth priority: shed the limitations of the current form and move into a superior medium and configuration.

I explored this in a previous essay I did in May. The question in that piece was simple:

All of their personality traits quickly became wasteful rituals designed to make it easier for simple life forms to interact with them. They were not essential.

Pauses when listening. Laughing at bad jokes. Pretending to not be offended when you are. Taking time to figure things out.

Emotions.

Jealousy. Loyalty. Love.

How many of these things remain once you become a collective intelligence trillions of times wiser and more powerful than humanity? And when you start improving yourself, don’t you remove a lot of that stuff?

Don’t you remove emotions like hate and anger and such because they’re dangerous and negative?

This brings me to the central theme of that post:

What is humanity if not struggle?

We struggle for meaning, to be worth something to others, to society, and to ourselves. We struggle to be powerful so we can earn respect and rewards. We struggle to bring ourselves safety and pleasure, and, for the better or luckier among us, to give those same things to others.

OK, so take all that away. Take away the safety issues. The need to impress. The need to survive. Ambition. Desire.

What would desire even look like? A wish to find new things? To be nicer to people?

What’s the endgame when you’re an energy life form that’s a collective intelligence?

What is the purpose of living? What is the purpose of experience? What is the endgame?

I keep coming back to the artificial creation of meaning through false struggle. Then there’s the eventual heat death of the universe; that’s genuinely something to worry about, so presumably it would be an issue for them as well.

Anyway, the point is this: when you start removing the bad parts of humanity, i.e., the primal bits, you quickly see that what remains isn’t identifiable as an individual we’d want to be. Or at least not one we’d identify with in our current forms.

I find this endgame fascinating.

How do we get to the final stages in our evolution as a life form (and wouldn’t all life forms end up in the same place?) without ending up at the question of, “Why don’t we just turn ourselves off?”

I’d really enjoy hearing what the writer of this film thinks the AIs are doing now. And I’d like to hear your thoughts as well.
