Most people think the way AI is going to significantly impact society is by taking all our jobs or creating robots that try to kill everyone. But while we focus on the distant or unlikely impacts of artificial intelligence, we're about to get completely blindsided by a very real and practical one.
The ability to imitate anyone’s voice or likeness in order to falsify evidence.
Once again, porn is the innovator.

Lyrebird is a service that lets you upload a minute of your own voice, after which you can hear an AI speak as you. Meanwhile, the new thing in porn is swapping celebrities' faces onto performers, and within a few days that evolved into people putting their friends into porn scenes.
It's a bit disorienting to hear about these things, as if they mess with reality in some way. And that's what I want to try to unravel here.
We are about to see an arms race between forensic analysis that can detect fakes and increasingly sophisticated AI that mimics the real thing ever more perfectly.
I think there are two main ways this will affect our reality in a major way.
- There will be lots of voice recordings and videos in the world that make it look like people did something they didn't. This is the obvious one, and given that this has only just started and the tech is already good, the fakes will soon be quite convincing.
- The bigger impact, however, is that the better the fakes get the more people will be able to deny horrible things they actually did and that were actually recorded.
You have a voice recording of a politician admitting to a murder? Fake. You have a video of someone committing adultery? Doctored. You have a picture of someone robbing a liquor store? I was framed.
And this is just the first second of the rest of human history.
A few years ago AI couldn't reliably find common objects in photographs; now it can spot certain medical conditions better than doctors with over a decade of training. And now it's giving people the ability to make it look (and sound) like others have done things they haven't.
People often believe false things with no evidence. Imagine when they have high-quality fake evidence.
Once we gain the ability to convincingly falsify the authorship of actions, we not only grant legitimacy to things that never happened, we also strip it from things that actually did.
It's hard to overstate this, because nothing is more fundamental to the infrastructure of human trust than seeing or hearing something for ourselves. This is the invalidation of our most basic truth-sensing abilities.
This is about to blindside us as a society. The only question is how much and how fast.
- Gives a whole new meaning to “fake news”. It really might be this time. Or not.