A Future Without Privacy

If you listen to the main voices on privacy, they're all saying something like this:

  1. Privacy is possible in 2017 and beyond

  2. We’re losing ground right now

  3. But don’t worry–we can fix it

Well, 1 out of 3 ain’t bad.

The truth is that privacy is going away—permanently. To see how and why, let’s step through a few concepts.

In information security there’s a formula used for risk that looks like this:

risk = probability X impact

It’s multiplication, so as something either gets more likely to happen or hurts more when it does, the risk goes up.
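In code, that model is nothing more than a multiplication. Here's a minimal sketch, assuming the same 1-10 scale for each factor that the numbers below use (the risk() function itself is just my illustration, not a standard formula):

```python
def risk(probability: int, impact: int) -> int:
    """Toy risk score: probability and impact each on a 1-10 scale,
    so the result lands somewhere between 1 and 100."""
    return probability * impact
```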

With privacy we have a bad situation on both factors in this equation. For probability, we're quickly moving towards the Internet of Things, where billions and then trillions of networked systems will be online. And the electricity flowing through these systems will not be amps and volts, but data.

Your data, as it turns out. And mine.

This is a problem that's getting worse, not better. The plummeting cost of storage, the ubiquity of social media, and the rise of data science powered by big data are all pointing to a one-directional flow of data away from us and into the global internet.

It’s not a company that has your data, or 100 companies. It’s thousands, and that number is rising at a massive rate. Because of big data and machine learning we’re just now starting to see the power that this data holds. And now that industry has had a taste, they’re completely obsessed.

The way it works is there are many large and extremely well-funded companies who do nothing but collect, massage, use, and then sell your data—over and over again. From company to company, industry to industry, sliced and sorted in countless different ways depending on how they intend to use it.

It’s worse than pee in a pool—it’s tears in an ocean.

Your tears, as it turns out. And mine.

Stated another way, we cannot stop our data from getting out. It’s an inevitability. Freely floating information is better for shopping, it’s better for customization of your experiences, and it’s better for marketing. Like I said—inevitable.

So on a scale of 1 through 10, where 10 means a 100% chance that your data will be leaked or stolen and made available to some significant subset of the people who want it within the next 10 years, I'd put us at around an 8. And over 20 years it's probably more like a 9 or a 10.

So that’s probability.

On the impact side things don't look much better. It's currently quite damaging for someone to have a certain combination of your data. Most importantly, they can simply impersonate you in multiple ways: creating financial accounts in your name, purchasing things, not paying the bills, and ultimately harming your credit history and even your reputation.

It can be, and has been, a serious life disruption for millions of people.

It’s a sad reality that the most sensitive data we have is also the hardest to change. How are you going to change your date of birth? Changing your social security number isn’t easy. And good luck with your mother’s maiden name.

The point is that it takes just seconds to steal this data and share it with the planet, while the data itself is meant to exist unaltered for an entire lifetime. That's a wicked asymmetry.

Anyway, I'm making up numbers here, so let's call the current impact from identity theft an 8.

So that leaves us with:

risk = 8 X 8 = 64

Let's say that the average person is willing to tolerate a 50/100 risk score as it relates to their privacy, and we're currently sitting at 64. Again, these are quick numbers in a simple model, but they serve the purpose.

Now, if our data does in fact become more likely over the next decade to be leaked, shared, massaged, transferred, and profited from by thousands of companies, then let’s say that probability (of data leakage) goes to 9. Then we’re at:

risk = 9 X 8 = 72

Even with a simple model it’s easy to see that’s an unacceptable amount of risk.

So here’s where things get weird.

The best thing we can do to address this problem long-term is to assume probability is at 10 and focus on reducing the impact.

If we want to make privacy less of a concern, we simply have to find a way to become less concerned about it. That's a word game, yes, but it's also a numbers game.

Watch what happens when we bring down the impact number:

risk = 9 X 4 = 36

So if we could halve the impact we could get our overall risk down to a 36, which is well within tolerance.
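Pulling those three scenarios together, here's a small sketch of the whole numbers game; the 50/100 tolerance and every score in it are the made-up values from above, not measurements:

```python
TOLERANCE = 50  # made-up score (out of 100) the average person will put up with

# (probability, impact) pairs from the scenarios above, each on a 1-10 scale
scenarios = {
    "today": (8, 8),          # 64 -- already over tolerance
    "next decade": (9, 8),    # 72 -- leakage gets more likely
    "impact halved": (9, 4),  # 36 -- same leakage, but the data hurts less
}

for name, (probability, impact) in scenarios.items():
    score = probability * impact
    verdict = "within tolerance" if score <= TOLERANCE else "unacceptable"
    print(f"{name}: risk = {probability} x {impact} = {score} ({verdict})")
```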

Fine, you say, lower the number. How do we do that?

Great question, and I have an answer.

We need to (and will) migrate away from a world in which merely having information equates to authorization.

It’s a silly system, really.

PERSON 1: How do I know it’s really you?

PERSON 2: Because I have this series of numbers memorized.

PERSON 1: Sounds solid to me. Here’s a car.

It’s true that some authentication processes are stronger than others, but we don’t have billions in fraud and identity theft every year because the system is working.

The solution is to move towards a much stronger and multi-faceted system for authenticating activities and transactions. So instead of giving out your mystical unicorn numbers to authenticate a key purchase or life change, you'll be authenticated in three, five, or a dozen different ways, including by third parties, people in your network, and even random strangers.
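To make that concrete, here's one hypothetical shape such a system could take: a high-value action goes through only when enough independent parties vouch for it, and knowing someone's magic numbers counts for nothing. The attestation sources and the threshold of five are invented for illustration, not a real protocol:

```python
from dataclasses import dataclass

@dataclass
class Attestation:
    source: str      # e.g. "bank", "employer", "device", "friend"
    confirmed: bool  # did this party vouch for the person?

def authorize(action: str, attestations: list[Attestation], required: int = 5) -> bool:
    """Approve a high-value action only if enough distinct parties confirm it.

    Note what's missing: no date of birth, no SSN, no mother's maiden name.
    Stolen data alone gets an attacker nowhere."""
    confirmed_sources = {a.source for a in attestations if a.confirmed}
    return len(confirmed_sources) >= required

# Example: buying a car, backed by five independent confirmations.
vouchers = [
    Attestation("bank", True),
    Attestation("employer", True),
    Attestation("device", True),
    Attestation("friend", True),
    Attestation("notary", True),
]
print(authorize("buy a car", vouchers))  # True
```

The point isn't this exact design; it's that authorization becomes a function of live, multi-party confirmation rather than of static facts about you.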

Some of this will require tech we don’t have yet, and it will take time to build and implement, but it’s inevitable precisely because it’s about to become necessary.

The simple fact is that if we want to reduce the risk around privacy, we have to work on the impact side of the equation, not the probability side. We have to make the data itself less useful to bad guys, and less of a concern to us that it's out there.

This will mean losing something we currently care a great deal about, and that’ll be a bumpy ride for sure. At some point, however, we’ll realize that it was what could be done with the data that we were worried about, not the data itself.

In the meantime I’ll be playing both sides—protecting the privacy we currently have while preparing us for a future without any.

Your future, as it turns out. And mine.

Notes

  1. Thanks to Saša Zdjelar for talking through this topic with me many times over the years.

  2. This model doesn’t address all types of privacy concerns. There are many things people will still want to keep private even after this transition occurs, although they will be far fewer.

  3. One place privacy will remain critical is in countries where one’s beliefs (or even one’s very identity) can cause harm, such as for dissidents, atheists, and homosexuals in developing countries.

  4. The primary focus here is on information like date of birth, national ID numbers, residence, location information, product and experience preferences, etc. These are the types of things that 1) will often benefit the consumer by being quasi-public, and 2) we can limit the impact of bad actors knowing.

  5. As with many such changes, this transition will naturally occur as a result of generational change. Most people born in the 1950s aren’t going to embrace this in any real numbers. It will be most accepted by those who grow up in a world where it’s normal.

  6. Image from Pixabay—a legitimate and free online source of images.