The debate over privacy, safety, and security has been ongoing for longer than I can guess. I wouldn’t be surprised if cave dwellers used secret passwords to enter adjoining caves or to offer assistance in hunts. What were those codes worth to other tribes?

While our language skills have evolved and our technology is mind-boggling, the basic premise is unchanged. There is a perception that your privacy somehow compromises the security of the masses: if law enforcement cannot read your mail, how will they stop the next terrorist attack? Obviously, the discussion merits far more than a short CUBit on this humble blogger’s site, and I won’t argue that point. There is a place to strike a balance between the privacy rights of individuals and the security responsibilities of their government. But that balance should never tip excessively in favor of the latter. I’d argue it must always lean towards the individual. Even if that person has committed heinous crimes?

There’s the rub. To collect evidence against this one person would put the security of a billion others (most of whom are not citizens of this country, and therefore not beholden to its laws) at risk. Does that move the balance needle?

This precise situation came to a head yesterday. Remember when a shooter killed innocent people in San Bernardino? No love for the perpetrators, and deepest sympathies to the victims and their families. The shooter used an iPhone 5C, and the FBI wants to collect information from it. Unfortunately for the investigation, the phone is protected by a passcode. As you may know from your own devices, you only get ten wrong guesses before the device erases itself (if that setting is enabled). This protection is so effective that the FBI cannot bypass it. So they did what you’d expect: ask for a key. Beginning with iOS 8 (we’re on iOS 9.2 now, with 9.3 in beta), Apple stopped keeping copies of the keys that encrypt a device’s data. That means only the person with the passcode can access the phone’s data, not Apple. The FBI took the matter to court, and early this week a federal judge ordered Apple to provide a way for the FBI to access the phone.

They refused.

“So Apple sides with terrorists?” you may say. No, they side with their customers. You see, what the court is demanding isn’t a key Apple has lying around; it’s a new, Apple-signed version of the software that removes the ten-guess limit. And because every iPhone of that model trusts Apple’s signature, a tool built to open one device opens all of them. “But it can prevent another shooting or even a terrorist attack!” This is circular reasoning: it presumes the result at the outset. I could just as easily say it causes the next attack, once malicious actors use this same “backdoor” to access a government official’s phone. In that case, the argument would be that we should encrypt and secure our devices better. Not to mention all the cases where a suspect’s information could then be accessed by authorities with impunity. All that encryption and security would mean nothing. It would be akin to installing a state-of-the-art deadbolt on your door while leaving the hinge pins on the outside.
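For the technically inclined, here’s a toy sketch of why Apple can’t simply hand over a key. Everything in it is my own simplification: the names (ToyPhone, hardwareSecret) are invented, a single SHA-256 pass stands in for Apple’s actual slow, hardware-bound key derivation, and a real device stores a verifier rather than the passcode itself. But it captures the two properties that matter: the decryption key only exists when the right passcode meets a secret fused into the device, and ten wrong guesses destroy access for good.

```swift
import CryptoKit
import Foundation

// Toy model of a passcode-locked phone. Illustrative only: a real iPhone
// does this inside dedicated hardware with a deliberately slow,
// device-entangled key derivation, not one SHA-256 pass.
struct ToyPhone {
    // Stand-in for the device-unique secret fused into the chip at the
    // factory; no software, Apple's included, can read it back out.
    private let hardwareSecret = SymmetricKey(size: .bits256)
    private let passcode: String  // real devices store a verifier, not the passcode
    private var failedAttempts = 0
    private var wiped = false

    init(passcode: String) {
        self.passcode = passcode
    }

    // The data-protection key only comes into existence when the right
    // passcode is mixed with the hardware secret. There is no third
    // input Apple could supply instead.
    private func deriveKey(from guess: String) -> SymmetricKey {
        var material = Data(guess.utf8)
        material.append(hardwareSecret.withUnsafeBytes { Data($0) })
        return SymmetricKey(data: SHA256.hash(data: material))
    }

    mutating func unlock(with guess: String) -> SymmetricKey? {
        guard !wiped else { return nil }          // data is gone for good
        if guess == passcode {
            failedAttempts = 0
            return deriveKey(from: guess)
        }
        failedAttempts += 1
        if failedAttempts >= 10 { wiped = true }  // the ten-strike erase
        return nil
    }
}

// Each wrong guess burns one of the ten attempts.
var phone = ToyPhone(passcode: "1234")
for guess in ["0000", "1111", "2222"] {
    print(guess, phone.unlock(with: guess) == nil ? "denied" : "unlocked")
}
```

Notice what’s missing from that sketch: a master key. That absence is exactly what the court is ordering Apple to engineer around.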

Is there a solution? Yes, though it’s not great, and it hinges on a bug. Companies regularly offer “bug bounties”, cash rewards to hackers who find security issues in their software. If the FBI wants this information so badly, offer an enormous bug bounty, say, $5 million, to crack the iPhone’s encryption, stipulating that payment only occurs if the flaw is never publicly disclosed and is submitted to the FBI and Apple simultaneously. That way, the FBI gets what it wants (access to the suspect’s phone), Apple doesn’t compromise its values or its software (and gains a chance to fix the flaw, making every device more secure), and none of us loses security for the sake of one investigation. Perfect? No. It’s possible no one will figure out how to bypass the passcode lock. Then what?

What’s your take? Can you think of a better way to satisfy all parties? Is there a way to truly balance privacy and security? The comments are open.

PS – This affects your credit union and its members, too. Just swap “key to phone” for “key to member data”.