My Response to Sam Harris on the Apple Encryption Debate

[ UPDATE: Much credit to Sam for engaging in the conversation. I’m not sure how people claim he’s closed on this topic when he is clearly open to exploring it. ]

I don't agree with all of it. But this is a very good response to my remarks about encryption. https://t.co/rMl8zgtuWN @danielmiessler

I’ve been planning on doing a podcast episode on the Apple encryption debate for some time, but I was unsure of the format I should use.

This problem was just solved for me when I listened to Sam Harris—who is someone I respect greatly—miss the mark significantly in a recent podcast.

The thing that compelled me to respond was the fact that I don’t often disagree with Sam. His logic is usually impeccable, and we often end up with nearly identical opinions.

So it was somewhat surreal to hear him be wrong about something. Or at least disagree with me (which, of course, may not be the same thing).

Anyway, being in information security myself, I felt a response was important.

This essay takes the form of a retort to his comments, followed by my own points and then a summary.

Sam’s points

[ The points are summarized, by the way, not necessarily exact quotes. ]

  • Apple built the lock, but didn’t build the key, and now they’re telling us that building the key would put us all at risk. Self-serving abdication of responsibility.

  • The tech community has been unduly swayed by Snowden. Even when the government gets a court order, they think they shouldn’t give access.

  • Gives cases where text messages could have helped solve a murder, but the texts are unread because the iPhone is unbreakable. Imagine being a family member!

  • Could someone build an impregnable room inside their own house?

  • What if you could take a drug that could make your DNA unanalyzable? So you could never be linked to any crime. The only people who would benefit would be criminals!

  • Apple could maintain the backdoor and it’d be fine, just like banks have your banking information. They’re trading on paranoia.

My responses

[ NOTE: This will come in the form of a podcast, which I may still record. I wrote it largely in the voice of a spoken conversation. ]

First, let’s start with where we agree.

You speak of a “Cult of Privacy”, where people are blindly saying that Snowden did nothing wrong whatsoever, that he didn’t set a dangerous precedent, that any violation of privacy in any case is always bad, etc., etc.

I absolutely agree with you that this is not an intelligent way to understand and discuss current events.

But there’s another cult on the other side, and it’s one that you’re coming dangerously close to membership in. And that’s “The Cult of Safety”. This one works like this: If there is any situation in which some amount of data could be used to help learn where a kidnapped girl is, or where a terrorist’s bomb will detonate, then it’s within the rights of a government to legally seize that data.

Notice that both sound good to their adherents. Privacy at all costs! Sure. Safety at all costs! Sure.

The problem is when they conflict.

Now, I’m partial to a spectrum model. On one side you have complete privacy, which we can call the Cult of Privacy, or the Snowden Extreme. That label, by the way, isn’t perfectly accurate: Snowden doesn’t believe the government should have ZERO access to our lives, even in the name of security. His position is far more nuanced than that.

But anyway, on that side you would not do ANYTHING to sacrifice privacy, even if it threatened the safety of billions.

On the other side of the spectrum you have the Cult of Safety. And here you are willing to do ANYTHING, no matter how much it violates the personal privacy of billions, if you think you can gain one iota of public safety.

So, can we first agree that both of these are too extreme? Yes. I think so. Perfect.

So now let’s move on to Sam’s examples.

He first says that Apple built the lock, but didn’t build the key, and now they’re telling us that building the key would put us all at risk. And he calls this a self-serving abdication of responsibility.

This is not correct, and the easiest way to see why is to ask where the lock came from. What lock? The lock is private conversation. The lock is two people interacting with each other in secret.

Apple did not invent this. They built no lock. If you are in a crowded room and you see a friend, and you want to tell him something deeply personal, or politically sensitive, or whatever, and you decide to walk outside under the stars and whisper this to your friend, that’s the lock.

It’s a natural lock. A human lock. And we’ve been using it for tens of thousands of years.

So technology has not created anything new. All it has done is allow that private whisper under the stars to cross long distances, and to be stored. But the conversation itself has not fundamentally changed.

So, no. It’s not fair to ask Apple to provide a key to that. Because it would be a key to private human communication, not a key to an iPhone.

Sam then talks about how the tech community is unduly swayed by Snowden. So even when the government gets a court order, they think they shouldn’t give access.

Well, there’s a reason for this. It’s not a perfect reason. It’s not an absolute reason. But it’s a solid reason.

Basically, there’s a difference between a government behaving well and a government behaving badly, and the Snowden case revealed that there was quite a bit of the latter happening.

I’m far left on government, in ideal situations. In a properly functioning society I don’t see the difference between the people and their government. I want to trust my government. I’m happy to give them all manner of powers to help protect my safety.

But I’m also a student of history and psychology. I know that groups can take on characteristics of evil that the individuals lack. I think Orwell taught these lessons best. If you doubt this, try to imagine a Ted Cruz presidency where he’s in charge of the NSA.

Would he be trying to do evil or trying to do good? He’d be trying to do good. What would he actually be doing? Evil. Or something close to it. And it gets even muddier when you have good people just following rules in an organization, as we learned with Germany in the ’30s and ’40s, as well as through countless psychology experiments.

People and organizations can become tainted by precedent, procedure, fear, and just the banal response of, “It’s my job.”

So when you see a government (which I actually see as dozens of distinct groups of various maturities and alignments, by the way) that continues to abuse its powers to violate people’s privacy, and you see that the scope is accelerating rather than slowing down, you have to think that maybe we’re approaching a Ted Cruz police state.

Not because surveillance is bad. Not because privacy always beats safety. Not because the Cult of Privacy is true.

Just because the government has a touch of mental illness right now, and that’s not the time to hand them weapons.

This does NOT mean government is bad. This does NOT mean healthy, responsible, and transparent governments that actually represent the people should not have some powers of this kind. They absolutely should.

But right now we are in the unfortunate position of having to defend against overreach from certain government groups. We’re in the position of having to push back because they’re slurring their speech, screaming “TERRORIST” at the top of their lungs, and asking for guns. More guns. Bigger guns.

Giant anti-privacy guns.

And that’s not a good thing.

If Jean-Luc Picard and Rachel Maddow and Rand Paul were in charge, I’d have a completely different perspective. And my perspective is temporary. We just need to wait until the slurring and stumbling stops before we can relax a bit. That’s all.

Ok, next point.

Sam then talks about cases where text messages could have helped solve a murder, but the texts are unread because the iPhone is unbreakable. Imagine being a family member!

I’m surprised he took this line, honestly. Almost identical logic applies to negotiating with terrorists.

You don’t negotiate with terrorists to get Sally back for 1 million dollars for a very simple reason: it will lead to the abduction of 100 more girls like Sally.

Now it’s true that bypassing privacy doesn’t lead directly to more abductions the way paying a ransom can, but there is a clear, unifying thread: the placing of the good of a few over the good of the many.

In the case of terrorist ransom, the family who gets their kid back benefits, while many other families are placed in more danger. That’s the good of the few over the good of the many.

And in privacy bypass, if you give the government powers to peer into private, encrypted communications every time they can loosely point to security or safety, then you will be exchanging the privacy of millions for the increased safety of relatively few.

That is a tradeoff that must be made with extreme caution, which is precisely the point that Apple is making. They see my previous point about the current sick state of some of our government groups with regard to overreach and slippery slopes, and they get that it could have horrible implications for privacy overall.

But that’s not even the interesting part.

Let me ask a scarier question regarding the safety issue. I talked about two people in an open field under the stars, sharing whispers. I said that those conversations are private today because they’re not being captured.

Well, they ARE being captured. They’re in the brains of the two people who had the conversation. So let’s change your statement from above, and put it 20 years in the future.

What if this man knows the location of the body but simply won’t give it to us? How would you feel if you were the family? Wouldn’t you want to have the data extracted from his brain?

Or to use a different analogy of yours, let’s say Apple built a brain scanning device so people can control their daily tasks without speaking. Now you could say that they built the lock but not the key.

Unfortunately the key is to the single most private thing that we have: our own thoughts and memories.

So, first they wanted to come after a private conversation between two people just because it used technology to cross distances, and now they could say it’s ok to read minds because technology allows that as well.

Let’s be very careful with this.

Let’s take a step back and redefine what privacy is, and what should be considered private between humans. I don’t have a great answer there (it’s a hard problem), but I’m willing to start with human interaction between two people, and the thoughts in one’s own mind.

Just because we develop technology to view and capture communication between humans (whether that’s from nano microphones that live in the air, or mics in every building, or whatever), or we achieve the ability to read people’s thoughts and memories through whatever means, this doesn’t mean that the stored, encrypted content of those conversations and thoughts now constitutes a lock that deserves a key.

We must realize that if the right to capture, store, and parse human communication applies across mediums, then it will likely extend to reading thoughts and intentions from the brain as soon as we’re able to do so.

That’s not sci-fi; it’s just a logical progression.

So we need to be very cautious about drawing privacy lines correctly, agnostic of technological capabilities, because our capabilities will soon be (and already are) extraordinary.

Next Sam gives another example of the impregnable room in the house.

I actually just covered that. The room is your own mind. And yes, you do have the right to have it be private.

The only reason we accept that now is because it’s not possible to read it. But as soon as it becomes possible, we will run into this issue head-on.

Now again, I’m not part of the cult of privacy, so do I think there could be situations where it’s ok to forcibly read someone’s mind to find a memory, or a thought?

Maybe. I can’t imagine a situation, though, because to do so is to open the door. You start with the situation of reading one guy’s mind to save a billion people. And soon that becomes reading the local perp’s mind to see if he stole a Twix at ShopMart.

I wish that were just an argument from absurdity, but it’s actually just the way systems work. And you have to work REALLY hard to counter the inertia in that direction.

And that’s entirely the point. At what line do we say “no more”? If there is such a line, it has to be the human mind, since it’s identical to the human soul.

Sam then asks whether it’d be ok to take a drug that makes your DNA unidentifiable.

It’s a great question, but he then goes on to make a very dangerous statement associated with the Cult of Safety: that this is something only criminals would do.

Right. Only criminals want anonymity. Only criminals want to use encryption. That’s the underlying sentiment here.

If we lived in Star Trek: The Next Generation and everything was super chill and the government was completely benign (which I believe it absolutely can be, by the way), then I’d absolutely agree with this.

It’s the same with guns. If there weren’t any reason to have guns then the danger would outweigh the benefit in overwhelming fashion. But it’s not that simple. There are situations every day where people have to defend themselves where there are no police to help them.

So they need guns.

And sometimes people need (or want) anonymity. And privacy.

And I’m ok with people wanting that at this stage of our development. I think it’s ok in concept.

Let’s say for example we have a futuristic society where you walk around and everything is controlled mentally. Every object and every other human knows exactly who you are. You simply wish something and it appears. And criminals are identified quickly via various methods, and crime is mostly unheard of.

Well, what if someone said they want to live in the countryside and raise sheep? They don’t want the brain and DNA scanner. They don’t want the mind-reader implants and the instant wish granting.

They want to be off the grid chopping firewood.

I say absolutely.

Now maybe they can’t come into town like that. Maybe they have to switch their stuff back on, and de-cloak, as part of the social agreement for using shared services.

But when they are done, and they go back to their cabin, absolutely. Cloak your DNA, turn off your DNA transponders, disconnect the Intention Prediction Engine.

Sam then says that Apple could maintain the backdoor and it wouldn’t be a problem, just like banks being able to see our banking details but not sharing them with the public.

This is just fundamentally different, as I’ve already covered.

If you imagine human attributes and interactions as a set of concentric circles, ask yourself what sits at the center versus on the outside.

How much money you have is something like ring 2 or ring 3.

What you say privately to other humans on the planet is perhaps ring 1.

And what you think and believe and fantasize about, in your own mind, is ring 0. It’s the center of your being. It’s your actual soul.

So banks having access to your balances is one thing. But governments having access to your private conversations is another. It’s sacred human interaction, and should be violated by only the most trustworthy and reluctant authorities in the world, and in VERY FEW situations.

And that’s fundamentally our problem.

Right now we have many government groups (not all, mind you) that are not trustworthy. They are eager rather than reluctant. And they are trying to dramatically increase how often they can make these intrusions.

That’s 3/3 wrong answers.

Analysis

The way I see the Apple case is that they perceive a giant mess of overreach and abuse and slippery slopes. I would wager that Tim Cook would agree that, in an ideal world, we should be able to trust government groups to break privacy in some situations.

I certainly believe that.

I certainly think that there are some situations where public safety is more important than personal privacy.

And I also believe that it’s possible for government to use their powers correctly, and even that there are some groups doing that today.

But it would absolutely be irresponsible, in today’s political climate, with clear evidence not just of abuse but of the acceleration of that abuse, to enable further unchecked intrusions into millions of people’s privacy.

Here are a few additional points:

  • The underlying challenge facing intelligence right now is not actually encryption. The problem is poor intelligence infrastructure. Our intelligence services don’t even have enough linguists to know what terrorists are saying to each other in the clear. Much terrorist coordination (including much of the coordination of the recent Paris attacks) is actually done without any encryption at all, simply because they don’t need it. This is from my recent piece titled Failing at the Basics in Intelligence and InfoSec:

If you’re getting successfully attacked by people on known terror lists, speaking in the open over unencrypted channels, then you don’t get encryption legislation. That’s for mature organizations who’ve used their existing tools to the fullest, and can be trusted with the additional powers.

  • There is an additional, well-known security principle which says that any weakness placed in a system for legitimate use will eventually be used illegitimately. If a friendly government has a backdoor, it’s likely only a matter of time before hostile governments and criminals also have access to it. This is yet another reason to avoid creating the opening in the first place.

  • There’s also a potential danger if the U.S. government starts prohibiting companies from making products that are perfectly secure (meaning secure from the companies themselves as well). If they do this, they’ll likely drive business to non-American companies, both from consumers and from partner countries. Cases in point: Europe and China. They are extremely focused on privacy, and if they knew that all U.S. companies were required to give the U.S. government access under certain circumstances, it would materially reduce the competitiveness of American products.

  • Expanding on that point, imagine U.S. companies not being allowed to make self-secure products, and the subsequent exodus to non-U.S. manufacturers who do offer that option. Because the government will still be facing some instances of terrorism using those new, foreign-made phones, the next logical step will be to prohibit U.S. citizens from PURCHASING and USING self-secure technologies. That would be an absolute mess, but the first step toward it seems perilously close.

  • Finally, the concept of authorities compelling entities to do things they don’t want to do, in the name of security, is rather troubling due to how many other situations it could apply to. Could they come to a private citizen and tell them to interact with and mislead a suspected terrorist? Could you be compelled to do so if asked? Could they go to a certificate authority and have them issue certain certificates, or sign certain ones, or grant access to private keys, all in the name of security? Perhaps. Once there is a precedent of, “Do this thing that violates your conscience…for security,” it seems obvious that this will become difficult to contain.

Summary

  1. The truth of this encryption debate lies somewhere between two incorrect extremes: “The Cult of Privacy” that believes no amount of public safety is worth losing one shred of privacy, and “The Cult of Safety” that believes any amount of personal privacy can be sacrificed to gain any degree of public safety

  2. Most people’s positions are somewhere between these, but it’s important to start by acknowledging that both sides are flawed and that the answer at any given moment lies on the spectrum in-between

  3. It’s sometimes ok to sacrifice some degree of privacy to gain some degree of safety

  4. It’s sometimes ok to sacrifice some degree of public safety to preserve some degree of personal privacy

  5. The amount of trust that should be given to public representatives (i.e. the government) should depend on how they’re handling that trust. When they’re representing our interests transparently, they should have strong capabilities to intrude on our privacy for public safety, because we know they will do so with extreme reticence and judgment, only when needed. When that trust is being significantly abused, the public should be able to dial back the powers of the government until they can regain their organizational and/or moral footing

  6. The current state of a number of government organizations, with respect to their treatment of the public’s trust to do the right thing with privacy-breaking powers, is poor. We can see this from multiple sources and examples over the last several years. For this reason, it is a strong and defensible position for companies to deny them backdoors and other attacks against privacy while there is a reasonable expectation that such powers will be abused

  7. When you place backdoors, they are as likely to be abused as used over long periods of time

  8. If a precedent is set such that governments can compel individuals or organizations to do things they are uncomfortable with, in the name of security, this will quickly expand to cover a great many situations

  9. Apple’s position reflects these observations, and that’s why I support it. If and when the behavior of our government changes, I will support expanded government powers to peer into private communications, as needed, in very limited circumstances, as I believe such behavior to be in the best interest of a global community

In short, there is a world where creating this backdoor, and other privacy bypasses like it, would be the right thing to do.

We simply don’t live in that world.

Notes

  1. In fairness to Sam, he has said very plainly that at the moment of recording this podcast, he hadn’t put that much thought into the issue, and that he could change his opinion with more information. Hopefully this response will serve that purpose.

  2. [ EDIT 28.02.16 ] Cleanup and addition of two other points to the close.
