Being in security, I think a lot about whether things are tools or weapons.
The distinction applies to guns. It applies to encryption. It applies to offensive security tools. And it applies to technologies like machine learning and the use of AI-monitored cameras throughout society.
The link I’m highlighting here is:
Visibility plus Understanding → Tools and Weapons
- Visibility means you have the opportunity to observe a given object or behavior, like a message sent between people, or people traveling from place to place.
- Understanding means you can learn a lot about that thing once you see it: who it is, what it is, how it’s built, how it works, etc.
This applies to so many things. People, bridges, locks, neighborhoods, and populations.
- If you know precisely how DNA and human biology work, it’s easier to make DNA-based biological weapons.
- If you know how bridges work, and how to build them, it’s much easier to blow them up.
- If you know how web applications are built, it’s a whole lot easier to break into them.
- If you have visibility into everyone moving within a city, and you can see their faces, and you have ML algorithms and facial recognition software monitoring those feeds, you can learn a lot about everyone being watched.
- And if you know psychology, sociology, and neurobiology—and you combine all those disciplines with knowledge of people’s biometric data, their facial expressions, their body language, etc.—you end up with the ability not just to predict their behavior, but to influence it.
Medicine and Bioweapons. Civil Engineers and Terrorists. Personalized ads and Population Control. This is the curse of progress.
Weapons are mirror images of tools. The insight that gives you one gives you the other simultaneously. It’s only a question of time.
Humanity’s chances hinge on maturing fast enough to contain the weapons that emerge from the tools that we cannot help but create.
We’re so in love with new functionality—and the money that comes with it—that slowing down is not an option. We run in dark rooms carrying scissors because we don’t want to be outside when someone else finds the treasure.
The best we can hope for is a series of small mistakes that hurt us enough to pay attention, but not enough to end us. This seems to be what the Unabomber saw, and why he thought it was acceptable to kill.
He was wrong about that, and wrong to think what he did could make a difference.
There are billions of dollars being spent right now—by our planet’s smartest minds—to create the breakthroughs that will enable bioweapons, autonomous attack drones, and algorithms for population-scale manipulation.
I find it pleasantly ironic that an atheist like myself is resigned to faith as a strategy against despair.
I’m not sure it’s faith, actually. More like resignation—to the unfolding of the universe.
Here’s to hoping we master the tools, and ourselves, before civilization-ending weapons become too easy to create and use on each other.