Director tells MIT that all must cooperate in response to crims’ use of ciphers
Robert Hannigan, director of GCHQ, told an audience at the Massachusetts Institute of Technology that encryption presents an ethical problem, and that industry's technical experts must help intelligence agencies work out a response to its use by criminals.
Published alongside the speech were two newly declassified papers by James Ellis, a cryptographer at GCHQ who, in the 1970s, secretly invented public-key cryptography.
The papers, titled The Possibility of Secure Non-Secret Digital Encryption (PDF) and The Possibility of Secure Non-Secret Analogue Encryption (PDF), had remained classified for almost 50 years.
Though Ellis got there first, public-key cryptography was only made public in 1976, in a famous paper titled New Directions in Cryptography (PDF) by Whitfield Diffie and Martin Hellman – who won the Turing Award this February, 40 years after the paper’s publication, for their contributions to the field.
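The core idea Diffie and Hellman published – two parties agreeing a shared secret over a public channel without ever transmitting it – can be sketched in a few lines. This is a textbook illustration with a deliberately tiny group (p = 23, g = 5, values not from the article); real deployments use groups of 2048 bits or more:

```python
# Toy Diffie-Hellman key exchange. The prime is far too small for real
# security; it only demonstrates the mechanics of the protocol.
p, g = 23, 5       # public parameters: prime modulus and generator

a = 6              # Alice's private exponent (kept secret)
b = 15             # Bob's private exponent (kept secret)

A = pow(g, a, p)   # Alice publishes A = g^a mod p
B = pow(g, b, p)   # Bob publishes B = g^b mod p

# Each side raises the other's public value to its own private exponent.
# (g^b)^a = (g^a)^b mod p, so both arrive at the same shared secret,
# which an eavesdropper seeing only p, g, A and B cannot easily recover.
alice_secret = pow(B, a, p)
bob_secret = pow(A, b, p)

assert alice_secret == bob_secret
```

The security rests on the discrete logarithm problem: recovering a from g^a mod p is believed infeasible for suitably large groups, which is precisely the reversal of "centuries of assumptions" that Hannigan described.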
Hannigan said he was publishing the Ellis papers for three reasons, key among which was how advances in 1970s public-key cryptography reversed “centuries of assumptions about how communications could be protected.” This gave the director “some hope that our current difficulties can be overcome.”
For nearly 100 years we have been intimately involved in strengthening encryption.
From traditional protection of military communications, through personal privacy online – including identity verification for Government digital services – through the security of domestic “smart” power meters – where the design principle is that homeowners are in control of their data – to the security of the nuclear firing chain, we understand the importance of encryption for the economy and for the individual.
That importance grows as more of our private lives move online and the economy becomes increasingly dependent on digital currency and block-chain systems.
Hannigan added: “But what the history of our cryptology teaches me above all is that the enduring problems in the debate over privacy and security are essentially moral rather than technical.”
The ethical issue presented by encryption “is the problem presented by any powerful, good invention, including the internet itself,” which could be used by miscreants, said Hannigan. “TOR is the most topical example: a brilliant invention that is still invaluable to those who need high degrees of anonymity, notably dissidents, human rights advocates and journalists; but an invention that is these days dominated in volume by criminality of one sort or another.”
Hannigan claimed he was not in favour of banning encryption, nor asking for mandatory backdoors. He stated he was “puzzled by the caricatures in the current debate, where almost every attempt to tackle the misuse of encryption by criminals and terrorists is seen as a ‘backdoor’.”
For those of us in intelligence and law enforcement, the key question is not which door to use, but whether entry into the house is lawful at all.
In our case that means applying the European Convention on Human Rights, as set out in UK domestic law: is what we are doing lawful (and appropriately sanctioned), is it necessary, and is it proportionate, notably in its infringement of privacy?
Proportionality, Hannigan argued, was central. Drawing a parallel with GCHQ’s celebrated wartime work at Bletchley Park, he said it would not make sense to see the breaking of Enigma as a “backdoor”. The cracking of the German Enigma cipher was an “easy example”: it clearly hastened Allied victory and an earlier end to the Holocaust, and could therefore be considered proportionate.
Hannigan stated that “what Turing and his colleagues recognised was that no system is perfect and anything that can be improved can almost inevitably be exploited, for good or ill.”
“That is still true today,” said Hannigan. “It does not follow that Enigma was a poor system, or ‘weak’, or easily broken by anyone.
In fact we continued to build and use improved derivatives of Enigma – ‘Typex’ – for some years after the war.
For Turing, I do not think there was any such thing as an impregnable door, whether front, back or side: there were strong doors and degrees of security.”
‘We want better security’. Oh yeah?
Hannigan lauded Turing for his work on encryption and decrypting others’ crypto, claiming that “strengthening and improving encryption has been the focus of our most brilliant mathematicians over the decades, and still is.”
This is interesting.
In his book The Hut Six Story, Gordon Welchman, who founded Bletchley Park’s Hut Six, approvingly cited an article by Professor Jean Stengers, of the University of Brussels, titled La Guerre des Messages Codés (1930-1945):
From pg. 17, The Hut Six Story, by Gordon Welchman
[Stengers] knows the reason for British reluctance to reveal the methods they developed. He has discovered that, after the war, Britain sold a considerable number of Enigma machines to certain countries, no doubt claiming that they were unbreakable.
But, says Stengers, British codebreakers had no trouble.
As I have made clear in this book, it is my firm conviction that what I have revealed should have been made available to our partners many years ago.
The over-prolonged secrecy has, in my view, already been prejudicial to our future national security.
The reason for the secrecy, made public at last, is hard to accept.*
* I, personally, have heard conflicting statements on this matter, but, on balance, I believe that Jean Stengers is correct.
Following the publication of his book, Welchman was infamously described as “a disastrous example to others” by GCHQ. He subsequently lost his security clearance and ability to work in the USA on secure communications systems.
There’s security and there’s security
Expanding on the operational difficulties posed by encryption, Hannigan said:
Turing also knew that human behaviour was rarely as consistent as technology: we are all to some extent too busy or careless in our use of it. Put more positively, we routinely make trade-offs not just between privacy and security, but privacy and usability.
The level of security I want to protect the privacy of my communications with my family is high, but I don’t need or want the same level of security applied to protect a nuclear submarine’s communications, and I wouldn’t be prepared to make the necessary trade-offs.
It is not inconsistent to want, on the one hand, to design out or mitigate poor user behaviour, but at the same time exploit that behaviour when lawful and necessary.
The debate, for Hannigan, is not about back or front doors, but about “where you risk letting anyone else in if you accept that the lawful authorities can enter with a warrant.”
All of this meant “some very practical cooperation with the industry. Whatever high level framework, whatever posture democratic nations decide upon will need to be implemented by commercial providers.
And this will get technical.
That is where we will need goodwill on both sides.” ®