Sorry Mr. Snowden -- encryption isn't the only path to security

Last week, Google showed off a new messaging app called Allo. The reaction to that announcement was sharply divided, depending on who was speaking: general consumers liked the product because it built Google smarts into a messaging app, while privacy proponents decried the fact that end-to-end encryption was not a default feature of the app.

Edward Snowden even weighed in on the matter, calling the app “dangerous” and “unsafe.”

Yes, the way Allo is designed does leave a small point of access for a court order: Google’s servers can read messages in order to offer smart replies and contextual search data before immediately deleting them. But Snowden’s assertion that this somehow makes the app “dangerous” and “unsafe” is hyperbolic at best; at worst, it suggests Snowden has forgotten that not everyone on Earth is a fugitive from the law.

The choice doesn’t need to be a strict binary of safe versus unsafe depending on whether encryption is the default; if it were, there would be no way for messaging services to evolve. Google is in a unique position: the company is pushing artificial intelligence and machine learning, features that simply don’t work without access to data. Google may only want to add search results and suggestions to chat, but enterprise security relies on the same AI and machine learning for behavioral analytics and advanced malware detection. These features cannot exist in a world where encryption is the default.

Aside from that, the idea that a lack of encryption is the same as a lack of security ignores the fact that encryption was never designed to be the default. The aim of encryption was always to protect sensitive data, not to protect every word communicated between two parties. In this vein, Allo is the true expression of encryption — when you’re talking about restaurants, you can get Google suggestions because the chat is unencrypted, but when you’re talking about something sensitive (the definition of which is personal to everyone), you can switch to Incognito mode in order to be “safe” (as Snowden defines it) from the government’s prying eyes.
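To make that trade-off concrete, here is a minimal sketch in Python of the two modes described above. It is hypothetical, not Google's implementation: the suggest_reply, relay_default and relay_incognito helpers are invented for illustration, and a shared symmetric key (the Fernet cipher from the third-party cryptography package) stands in for real end-to-end encryption between the two clients; the relay never holds the key.

```python
# Hypothetical sketch of the two modes discussed above -- not Google's code.
# Default mode: the relay can read the plaintext, attach a suggestion, and
# forward the message without keeping a copy. Incognito mode: the relay only
# ever handles ciphertext it has no key for, so it can neither read nor
# enrich the message.

from cryptography.fernet import Fernet  # symmetric crypto as a stand-in for E2E


def suggest_reply(text: str) -> str:
    """Toy stand-in for the AI layer (smart replies, restaurant suggestions)."""
    return "Want restaurant suggestions?" if "dinner" in text.lower() else ""


def relay_default(plaintext: str) -> tuple[str, str]:
    """Relay reads the message, adds a suggestion, and stores nothing."""
    return plaintext, suggest_reply(plaintext)


def relay_incognito(ciphertext: bytes) -> bytes:
    """Relay passes ciphertext through unchanged; it cannot add anything."""
    return ciphertext


if __name__ == "__main__":
    # Default mode: readable by the relay, so the relay can add value.
    print(relay_default("Dinner tonight?"))

    # Incognito mode: sender and recipient share a key; the relay does not.
    key = Fernet.generate_key()
    sent = Fernet(key).encrypt(b"Something sensitive")
    print(Fernet(key).decrypt(relay_incognito(sent)))
```

The point of the sketch is the asymmetry: smart replies exist only on the path where the relay can read the message, which is exactly the trade Allo's default mode makes.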

The aim of the encryption debate should be to make users aware of how to protect themselves and the ways that security is vulnerable either to legal orders or hackers. Pushing the idea that encryption is the only form of safety is both antithetical to how the technology is supposed to work and a gross simplification of what users want and need from that technology.

Join the conversation

5 comments

Senior reporter Heller's response to Edward Snowden's concerns about an encryption hole is completely off the mark, to say the least, because such personal privacy issues do not require being a "fugitive from the law". Such issues are recognized by the majority of Americans, even when they do not know the full extent of surveillance.

Anyone remotely conversant with use and abuse of messaging privacy protections by the intelligence community understands this is not a debate about how technology is "supposed to work", as Heller puts it, but how it can be turned to the purposes of various agencies and actors.

And that vulnerability is what drew the attention of Edward Snowden, who has considerably more experience with real threats to personal privacy. After all, his day job was cracking privacy protections for a variety of official targets.

Admittedly, I have a more limited view of what actually needs to be private than others do, but I think that's really the point. Some people, like yourself, may want everything to be private and have no access for anyone but yourself and whomever you're conversing with. In that case, there are options for you.

However, even knowing that law enforcement could gain access to my chats on Allo will not stop me from using the service because I want to have the added value that Google's AI will bring, and that added value doesn't exist without allowing access and giving up some level of privacy. I understand that, and I'll gladly make that trade if Google adds enough value (and I'm not talking about my Social Security number or something I consider truly private, in which case I'd use the app's Incognito mode). 

The choice is there, so I think Snowden calling the app "dangerous" is misleading. The app is not dangerous; it simply has an option that trades some privacy for added value. Snowden may not want to make that trade, and many others may not either, but that's a choice that has value to some and needn't be written off completely.

Yes, you are quite correct -- this is a matter of choice. But millions of message senders may not understand even their own options and risks, and are unable to make that choice intelligently.

Of course, the encryption "hole" designed into Allo advances the interests of Google marvelously. The only question is how safely the benefit is distributed to everybody else.

And that seems to be why Snowden is not fully behind use of Allo, and why his recommendation is that those with similar concerns stay away from it -- for now. Again, Allo's "flexibility" contains a built-in risk and compromise most do not even suspect, much less understand.

Thanks for your clarification.

It's very true that many may not be able to make that choice intelligently; I'm just not convinced that the answer to that problem is for all apps to be encrypted by default, nor do I think the apps should take the blame for people not understanding privacy and surveillance concerns.

I think Google has done about as well as it could in protecting privacy in Allo while still serving its own interests. The only point where messages are readable is when they hit Google's servers, and even then messages aren't stored at all: just processed, sent on their way, and deleted.

It leaves a small window where law enforcement can get in, but they'd essentially have to be watching a conversation in real time in order to gather anything, as far as I can tell. Definitely a risk, but maybe the best solution to the problem of adding AI while protecting privacy, since AI and privacy don't mix very well.

I see no problem with the default encryption being set to off, as long as you can turn it on if you want. This may make it easier for law enforcement to catch the lazy and less-educated criminal. I believe the average citizen would have nothing going on that the government would want to see or spy on. If they want to see my messages asking my friend if he wants to meet for lunch on Sunday, or what I'm bringing to my niece's graduation party, then so what. That info is of no use to anyone but the parties involved. Does that really need to be encrypted? If I am doing things that a lot of people do, and know are against the law, like illegal downloads or other activities, then turn the encryption on.