What if Our Problems Aren’t Tech Problems?

I just woke up with a troubling—yet perhaps freeing—thought.

What if all the Culture War™ problems we’re having right now aren’t tech problems at all, but rather the result of large groups of humans communicating with each other, enabled by technology?

In other words, what if technology is just a means of exposing the ugliness that exists within humans? And specifically, the ugliness that emerges when large groups of humans are exposed to each other.

A hundred thousand years ago, when we were all running around in small groups, we were human. But we were small-minded and tribal. We were sexist and racist. And we hated and distrusted out-groups.

That was our natural state, and we have somehow convinced ourselves that this is no longer the case, despite being shown over and over that it is.

What if Facebook and Reddit are nothing but megaphones and microscopes? Rather than creating ugliness, they merely show us how ugly we actually are, with increasing volume and resolution.

“Ok, sure…,” you might be thinking. “So what? Does that make it ok?”

No, it doesn’t make it ok. I’m not subscribing to the naturalistic fallacy, which says that anything natural is therefore ok. Nature is full of rape and murder, and that doesn’t make those things ok. The same applies to innate human bigotry.

But what I think this can do is help us admit to ourselves what the real problem is. Which is ourselves.

Right now the narrative is that we’re perfect. We’re amazing. It’s just that goddamn technology!

“Facebook is tearing us apart!” they say.

Well, no. What’s tearing us apart is exposing humanity to itself. More people seeing what others believe. More people seeing who others love. More people seeing how others behave. More people seeing who others are.

That exposure brings out innate ugliness and innate negativity. Until 2010 or so, that ugliness was mostly contained, like back in our hunter-gatherer groups. It was there, but it was small, private, and isolated.

What technology has done is expose that ugly truth to the world. And now we’re on fire.

We see the same thing with machine learning. You teach it about human culture by showing it a corpus of our behavior, and it comes back raunchy, sexist, and racist.

“Machine learning is raunchy, sexist, and racist.” Nope. It’s just a mirror. What we’re seeing is ourselves, reflected back.
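To make that concrete, here’s a minimal sketch of how people probe learned associations in a model trained on human-written text. It assumes gensim and its downloadable pretrained GloVe vectors; it’s not anything from this post, just an illustration of the mirror idea.

    # A small probe of word associations learned from a corpus of human text.
    # Assumes: pip install gensim (the model download is ~66 MB).
    import gensim.downloader as api

    # Load 50-dimensional GloVe vectors trained on Wikipedia + Gigaword.
    model = api.load("glove-wiki-gigaword-50")

    # Classic analogy probe: "man is to doctor as woman is to ___?"
    # Whatever comes back reflects associations in the training text.
    print(model.most_similar(positive=["doctor", "woman"], negative=["man"], topn=5))

Whatever the top results turn out to be, they came from the text we wrote, not from any preferences of the algorithm itself.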

Right, so this is horribly depressing. So what’s the plan?

I believe the truth is almost always the best way forward, and I think this is an example of where we need truth more than ever.

It’s right to ask ourselves what harm technologies like Facebook and machine learning might be doing to our society. They are absolutely helping to bring about harm.

But we need to differentiate between two separate things:

  1. The ugliness that’s being exposed in ourselves, vs.

  2. The idea that the tools themselves are dangerous

In other words, the dialog should change from:

  • INCORRECT: Humans are fluffy and nice, and tools like Facebook and ML find tiny pockets of negativity and magnify them to make it look like we’re bad. Therefore, those technologies are bad.

to…

  • CORRECT: Humans are primitive and ugly, with a streak of goodness within us that we are working to extract and magnify as we grow as a species. Therefore, we must be cautious about deploying technologies that highlight and magnify our ugliness at scale, and work to use that same tech to modulate the negative and magnify the positive.

The effects are often the same, but one comes from delusional scapegoating of tools, which is an unhealthy form of denial, and the other comes from an honest acceptance of our own flaws.

So, yes, we absolutely need to make sure the tools aren’t being used to magnify ugliness, and in turn to create more of it than would naturally exist. But we shouldn’t be under the delusion that the tools themselves are creating the negativity. That’s a cop-out, and it’s wishful thinking that will get us nowhere.

The tech isn’t making us bad; it’s showing us that we are bad, which is making us worse.

The distinction matters.