Balancing the right to privacy with the duty to protect
A hack of Firefox and Google Chrome underlines the complexity of the encryption debate
There are few things nicer than presenting some ideas in my home city, Edinburgh, a place that is thriving through IT innovation and enterprise. I really enjoyed the Scot-Secure conference (www.scot-secure.com) last week: it is always a responsive audience, and a place where you can present on some of the major issues of our time. To be invited as an academic to an industry-focused event is always a good thing, especially to stimulate a bit of debate around key issues. I also sneaked in a bit of maths and a reference to one of Edinburgh’s finest sons, John Napier!
The development of a secure Internet faces a major dilemma: how to balance the right of the individual to privacy against the right of society to protect itself. In the firing line is cryptography, which is used to protect identities and secure communications but, on the other hand, can be used to hide terrorist activity. The debate around cryptography is thus one of the major debates of the 21st century, and it shows no signs of reaching a conclusion.
In the UK we have the Investigatory Powers Bill (IP Bill), under which ISPs will log the communications of their users. The scope of this power is likely to reduce over the next five years, as almost 99% of all traffic on the Internet will be encrypted and ‘tunnelled’, which means that the logs will only contain the destination IP address, with no details of the actual page visited or its content. Along with this, cloud service providers such as Microsoft, Google and Facebook are moving towards encryption by default, where it will not be possible to connect to the service unless the connection is encrypted.
In the US, we see the developing Burr-Feinstein legislation, which contains a Catch-22 clause: US companies must protect the privacy of US citizens, but must also support the legal system in breaking any communications if required.
To many technical people, this is an almost impossible situation, and could only really be achieved with a ‘back door’ in software. Such a back door could obviously leave flaws that compromise the whole of the Internet. Poor coding caused many of the current flaws, and a back door in software is likely to be discovered or leaked. This could lead to large-scale data leakages, on a scale that could encapsulate the whole world. Just imagine if Google’s private key were released to the world: every communication through Google would then be open to anyone holding that key, and they could even pretend to be Google and set up spoof search engines.
Many countries are looking at ways of breaking secure communications, including Kazakhstan, which plans to replace the digital certificates provided over the Internet with its own certificate, and thus be able to listen in on secure communications.
HTTPS, the secure communications method for the Web, typically works by the client (the user) receiving a digital certificate containing the public key of the Web server (such as Google.com), and then creating a new encryption key that the two will share for the session. This session key is encrypted with the public key of the server, and is sent back. The server (such as Google.com) then decrypts the encrypted session key with its private key. At the end of this process, both the client (the user) and the server (Google.com) have the same encryption key, and can now use it to secure the communications. We see here how important it is to keep the private key secret.
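The key exchange described above can be sketched in a few lines of Python. This is a toy illustration only, using a tiny textbook RSA keypair rather than the 2048-bit keys and padding schemes real TLS uses; the numbers chosen are purely for demonstration:

```python
import secrets

# Toy RSA keypair, for illustration only (real servers use 2048-bit+ keys).
p, q = 61, 53
n = p * q                     # public modulus (part of the certificate)
e = 17                        # public exponent (part of the certificate)
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)           # private exponent: the server's secret

# 1. Client picks a random session key (128+ bits in practice).
session_key = secrets.randbelow(n - 2) + 2

# 2. Client encrypts it with the server's PUBLIC key and sends it over.
ciphertext = pow(session_key, e, n)

# 3. Server recovers it with its PRIVATE key.
recovered = pow(ciphertext, d, n)

assert recovered == session_key   # both sides now share the same key
```

Anyone who learns `d` can perform step 3 on recorded traffic, which is exactly why a leaked private key is so damaging.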
The core of HTTPS is unique session keys, which are used only once and never stored. If there were some way to store these keys, law enforcement could easily go back and replay the encrypted communications.
So are there any back doors in the software that we use? At the event I demoed a simple method of getting the Chrome and Firefox browsers to create one: by setting a single environment variable, the browsers store all the keys that they use for their communications. Tools such as Wireshark can then easily read these keys and decrypt the communications.
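The environment variable in question is SSLKEYLOGFILE, which the TLS stacks of both Firefox and Chrome honour. A minimal sketch of the demo, launching the browser from Python so it inherits the variable (the `firefox` binary on the PATH is an assumption):

```python
import os
import subprocess

# SSLKEYLOGFILE is honoured by Firefox and Chrome: when it is set, the
# browser appends the per-session TLS secrets to the named file.
keylog = os.path.expanduser("~/tls-keys.log")
env = dict(os.environ, SSLKEYLOGFILE=keylog)

# Launch the browser with the variable set (assumes Firefox is installed).
subprocess.Popen(["firefox"], env=env)

# Wireshark can then decrypt a packet capture of the session:
# Preferences -> Protocols -> TLS -> (Pre)-Master-Secret log filename
#   -> point it at ~/tls-keys.log
```

The feature exists for legitimate debugging, but it shows how thin the line is between a diagnostic aid and a back door.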
The debate around cryptography is only just beginning, but it is one of the most fundamental debates of our time. Politicians think there is a magic wand that can be waved over secure communications, but there isn’t. For law enforcement, it is going to be a challenging time.
And for individuals, who knows? For our part, we will continue researching weaknesses in cryptography, and building systems that avoid public key infrastructure and move towards keyless cryptography.
Bill Buchanan is a Professor in the School of Computing at Edinburgh Napier University and a Fellow of the BCS and the IET. He currently leads the Centre for Distributed Computing, Networks and Security, and The Cyber Academy.