Expert Speak Digital Frontiers
Published on Oct 26, 2016
Repeated calls for weakening industry encryption standards raise the question of whether the means employed by governments to protect national and personal security are well suited to achieving their end.
Keep the doors locked and turn on the light: Why less security online does not make us more secure offline


In the aftermath of the San Bernardino shootings in December 2015, the battle over encryption came back to the fore. Two cases stood out prominently in this regard: the first concerned the question of whether the Federal Bureau of Investigation (FBI) could oblige the American technology company, Apple, to bypass the passcode security of an iPhone 5s belonging to a New York resident suspected of drug trafficking;<1> the second, which received widespread media attention, was whether the FBI could force Apple to create and electronically sign new software to unlock the work-related iPhone 5c of Syed Farook,<2> who, together with his wife Tashfeen Malik, killed 14 people and injured 22 others in a shooting spree in San Bernardino, California.<3> The government argues that current industry encryption standards seriously undermine its ability to engage in legitimate law enforcement and national security investigations. For its part, Apple argues that it cannot undermine the security of one iPhone without undermining the security of all iPhones. These debates have meanwhile been echoed in Europe after rumours spread that the terrorists behind the Brussels<4> and Paris<5> attacks exploited secure communication channels to achieve their nefarious ends.<6> This raises the question: What is the relationship between offline and online security? Can less security online make us more secure offline? In order to explore these questions, this paper distinguishes between three types of security: technological security, national security, and personal security. Technological security protects infrastructures, networks and devices; national security protects territorially bounded populations; and personal security protects individuals. The contribution not only argues that personal security in the digital age depends on both technological and national security, but also that technological and national security depend on each other.
Undermining technological security thus not only weakens personal security but also national security – a conclusion that was already reached over 20 years ago in the context of the so-called Crypto Wars.<7> Most importantly, the recurrent national security obsession with criminals and terrorists “going dark” is misguided. First, if criminals want to “go dark,” they will “go dark,” regardless of whether the government has backdoor access to encrypted communications. Weakening industry encryption standards will merely compromise personal and technological security for the rest of us. Second, even if criminals want to “go dark,” this does not imply that law enforcement cannot continue doing their work as there is a host of data trails that criminals cannot avoid leaving behind if they continue to rely on digital communications. Indeed, the vast amount of information that intelligence services can access regardless of whether or not the content of communications is encrypted suggests that we are far from “going dark” but rather are living in a “golden age of surveillance.”<8> Finally, repeated calls for weakening industry encryption standards raise the question of whether the means employed by governments to protect national and personal security are well suited to achieving their end. More specifically, recent events in Orlando,<9> Nice<10> and Munich<11> call into question whether the ability to monitor the online communications of all is the most effective way of preventing crimes and terrorist attacks perpetrated by a few. This article argues against technological solutionism:<12> perhaps the best way to protect national, personal and ultimately also technological security is to renew our focus on investigative powers offline.


Over the course of 2015, Apple received nearly 2,000 requests from the US government to make the personal data of its customers available to law enforcement.<13> While Apple generally complies with government requests that it deems lawful, it has also occasionally challenged others, most notably in the case of the personal data stored on the handheld devices belonging to Jun Feng, a New York resident suspected of drug trafficking,<14> and Syed Farook, one of the suspects in the San Bernardino shootings.<15> In both cases, Apple received court orders seeking to compel the company to make personal data from its handheld devices accessible to law enforcement agencies under the All Writs Act (AWA).<16> While the relevant order in the former case, which “only” required Apple to bypass the passcode security of Feng’s iPhone 5s, was significantly less burdensome than that of the latter case—which would have required Apple to create and electronically sign new software to unlock Farook’s work-related iPhone 5c—Apple challenged both, on the grounds that the government could not force companies to undermine their own encryption standards. Setting aside the question of whether the government’s requests were good law – that is, “necessary or appropriate” and “agreeable to the usages and principles of the law” in the sense of the AWA – this piece will focus on the question of whether they are good policy – that is, whether less security online can make us more secure offline.

“A Rose By Any Other Name…”

Might not smell as sweet in the field of security (with apologies to William Shakespeare). Classifications matter. And distinguishing between different types of security is important, not least because one type of security might conflict with another. The following section will first distinguish between three types of security that are relevant to this conversation – technological, national, and personal security<17> – before making an assessment of the question of whether undermining one can ultimately reinforce another.
  1. Technological security
In this case, ‘technological security’ refers to the security of information technology (IT) infrastructure, networks and devices. According to this definition, the Internet as a whole constitutes the infrastructure, which is composed of a variety of networks—such as internet service providers (ISPs), operating systems, platforms, and intranets—which in turn depend on the accessibility and functionality of devices. This contribution follows the traditional IT security triad, according to which IT security is protected when the confidentiality, integrity and availability (CIA) of the system are preserved.<18>
  2. National security
The term ‘national security’, by contrast, refers to the protection of territorially bounded populations. The definition of what constitutes a threat to national security inevitably differs from one country to another and changes over time. But in the aftermath of the terrorist attacks of September 11, 2001, the focus of national security in the United States and in most of Western Europe has primarily been on the prevention of such attacks on domestic soil. At the same time, concerns about cyberattacks are growing,<19> demonstrating that protecting technological security is in fact a part of the responsibilities of national security.
  3. Personal security
Finally, the concept of ‘personal security’ is concerned with the wellbeing of individuals. This includes not only the protection of physical but also psychological integrity, as well as the ability of individuals to enjoy their basic rights, offline and online, such as privacy, freedom of expression and participation in a social and political community.

The Paradox of Security

It seems odd that these security goals could be at odds with each other. After all, as mentioned earlier, technological security is also an explicit goal of national security, and the meaningful protection of personal security in the digital age includes not only the protection of the physical person offline but also that of their communications and exchanges of personal data online. Similarly, national security depends on technological security, e.g., to protect critical infrastructure of both the public and private sector.<20> But the relationship can just as well be negative: in the name of national security, governments can be tempted to ask businesses to weaken technological security (e.g., by introducing backdoors into encrypted communications) which in turn threatens personal security (by making the personal data and communications of individuals more accessible to governments and criminals alike). After all, as soon as a security loophole is created, public and private actors (both good and bad) will rush to exploit it. At the same time, governments claim that they cannot guarantee personal and national security when they are among the actors that technological security protects against. In the words of FBI Director James Comey, “We all care about safety and security on the Internet — and I’m a big fan of strong encryption — we all care about public safety, and the problem we have here is those are in tension in a whole lot of our work.”<21> This raises the question: Would less security online make us more secure offline?

Crypto Wars: Back to the Future

This conflict is far from new. As several commentators have pointed out,<22> the debates we are witnessing today eerily resemble those surrounding the Clinton administration’s attempts to introduce a state-of-the-art encryption system with a built-in tamper-resistant backdoor for law enforcement access in the early 1990s. The Internet was not yet commercially viable back then and encryption was limited to a few government and private sector providers, but US and UK government and intelligence representatives already warned that, if encryption were to become more widespread, then the capability of law enforcement to prevent and solve crimes would be severely restricted. Accordingly, the US government suggested that technology companies should provide law enforcement access by design, e.g., by building a government backdoor into the security systems of their products. Specifically, the government advocated for the adoption of the so-called Clipper Chip, which, by means of key escrow, would send law enforcement a string of data that would allow it to decrypt encrypted communications, akin to providing the key to a lock.<23> The proposal met with strong resistance from technology companies and IT specialists, who criticised the substantial social, economic, political and, inevitably, security risks and costs that would be entailed by the implementation of such proposals.<24> Now that government and law enforcement officials are singing a familiar tune, the opposition of technologists has only increased.<25> The types of risks that unilaterally inserting vulnerabilities into the digital ecosystem would introduce have not changed. What has changed, however, is the magnitude of the havoc that such proposals could wreak.
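The key-escrow logic behind the Clipper Chip can be illustrated with a minimal sketch. This is not the actual Skipjack/LEAF design; the XOR “cipher,” key names and message below are hypothetical stand-ins chosen to show the structural point: every message carries a copy of its session key wrapped for an escrow authority, so whoever holds the escrow key can read everything.

```python
import os
import hashlib

def toy_cipher(key: bytes, data: bytes) -> bytes:
    """Toy XOR stream cipher (illustrative only, not secure)."""
    stream = hashlib.sha256(key).digest()
    while len(stream) < len(data):
        stream += hashlib.sha256(stream).digest()
    return bytes(d ^ s for d, s in zip(data, stream))

# Keys held by the recipient and by a hypothetical escrow authority.
recipient_key = os.urandom(32)
escrow_key = os.urandom(32)

# Each message is encrypted under a fresh session key, and that session
# key is wrapped twice: once for the recipient, once for the escrow agent.
session_key = os.urandom(32)
ciphertext = toy_cipher(session_key, b"meet at noon")
wrapped_for_recipient = toy_cipher(recipient_key, session_key)
wrapped_for_escrow = toy_cipher(escrow_key, session_key)

# The escrow authority can always recover the plaintext. So can anyone
# else who steals, leaks, or compels disclosure of escrow_key: the
# "backdoor" is a single point of failure for every message ever sent.
recovered_key = toy_cipher(escrow_key, wrapped_for_escrow)
print(toy_cipher(recovered_key, ciphertext))  # b'meet at noon'
```

The sketch makes the critics’ objection concrete: the security of all communications now rests on one additional secret whose compromise breaks everything at once.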
If the American government demands that multinational technology companies such as Apple provide backdoor access to secure communication channels, what is to keep other governments around the world, with varying commitments to human rights and the rule of law, from following suit? What if companies and nonprofit organisations abroad start offering alternative encryption and data storage solutions, which not only undermines the business interests of domestic companies but also risks contributing to a splintering of the infrastructure of the web into separate jurisdictions?<26> How can we protect particularly vulnerable communication channels, e.g., those of journalists,<27> activists,<28> whistleblowers,<29> refugees,<30> courts<31> and even the government<32> —when the government itself is chipping away at those protections? And yet, despite the obvious tradeoffs, government and law enforcement officials cannot rid themselves of the gnawing feeling that we cannot protect national and personal security if we enable criminals and terrorists to “go dark.” The final part of this contribution will therefore attempt to shed some light on the contentious debates surrounding “darkness.”

Keep The Doors Locked…

First, if criminals want to “go dark,” they will “go dark,” regardless of whether the government has backdoor access to encrypted communications. The kinds of criminals we should be concerned about are not that unintelligent. Further, in the unlikely case that they do not manage to circumvent government surveillance of their communications online, nothing prevents them from continuing to plan and execute their criminal wrongdoing offline. As an incredulous Anthony Soprano pointed out to one of his associates in the popular 2000s American television series, The Sopranos: “We’re supposed to leave phone calls about interstate hijacking now? How about faxes, e-mails, make it even easier for the cops? This is a face-to-face business, Christopher.”<33> Weakening industry encryption standards will thus merely compromise personal and technological security for the rest of us. Second, even if criminals want to “go dark,” this does not imply that law enforcement officials cannot continue doing their work as there is a host of data trails that criminals cannot avoid leaving behind if they continue to rely on digital communications.
Indeed, the vast amounts of information that intelligence services can access regardless of whether or not the content of communications is encrypted suggest that we are far from “going dark” but rather living in a “golden age of surveillance.”<34> For instance, data about communication transactions – or in common parlance: metadata – cannot easily be encrypted, if at all, and yet provides a rich and reliable source of information about a person’s private interests, habits and concerns.<35> Furthermore, as long as many businesses themselves depend on unencrypted access to their customers’ data, they have no incentive to engage in full-scale encryption<36> – unless, of course, the government antagonises them to the extent that they feel pressured into assuming the privacy-protective position.<37> Finally, the fact that the FBI ultimately dropped the case against Apple because it was able to break into the iPhone itself<38> undermines its own argument that law enforcement is lagging behind technological advancements.
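The point about metadata can be made concrete with a small sketch. The field names, addresses and values below are hypothetical, but the structure is generic: even when a message’s payload is encrypted end to end, the envelope that the network needs in order to route it remains readable to any observer.

```python
import json

# Hypothetical message envelope. The payload is opaque ciphertext, but
# the routing fields must stay in the clear for delivery to work.
envelope = {
    "from": "alice@example.org",
    "to": "bob@example.org",
    "timestamp": "2016-03-28T14:02:11Z",
    "size_bytes": 4096,
    "payload": "<ciphertext: unreadable without the session key>",
}

# An observer who never breaks the encryption still learns who talked
# to whom, when, and how much. Aggregated over months, such transaction
# records reveal interests, habits and relationships.
observable = {k: v for k, v in envelope.items() if k != "payload"}
print(json.dumps(observable, indent=2))
```

This is why encrypting content alone does not make communications “dark”: the transactional layer keeps generating the data trails described above.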

…And Turn On The Light.

Finally, repeated calls for weakening industry encryption standards raise the question of whether the means employed by governments to protect national and personal security are well suited to achieving their end. After all, encryption can only be a problem when the large-scale monitoring of communications is the solution to national security’s many woes. However, the inability of law enforcement and intelligence officials to predict and prevent the concerted terrorist attacks in Brussels<39> and Paris<40> calls into question whether monitoring the online communications of all is the most effective way of preventing crimes and terrorist attacks perpetrated by a few. The case of France, in particular, seems to suggest otherwise, considering that the country currently has some of the most intrusive government surveillance laws in Western Europe.<41> This article thus ultimately argues against technological solutionism:<42> perhaps the best way to enhance national, personal and technological security is to renew our focus on investigative powers offline. As one commentator suggested in the aftermath of the Brussels attack, “the answer to the scourge of homegrown terrorism in Europe is <…> found in the basic tools of routine police work: learning the ins and outs of a tightly knit neighbourhood where dozens of people could lend support to a plot, and only a few of whom would know, or care, that it was terrorism.”<43> This primarily requires boots on the ground, not analysts in a room full of cables. That is not to say that signals intelligence (SIGINT) does not contribute to the prevention of crimes. Of course it does. But at the same time, we should not lose sight of the importance of human intelligence (HUMINT) to explore and develop relationships with communities online and off.
Most importantly, the lone-wolf attacks in Orlando,<44> Nice<45> and Munich<46> suggest that the concerns of national security may increasingly have to include the phenomenon of contagion;<47> that is, that people participate in senseless mass killings not because of any particular ideological conviction but because the threshold for doing so has become dangerously low.<48> This problem will not be solved by introducing backdoors into secure communication channels online, but through a painful and potentially protracted offline conversation between policymakers, police officers, educators and community representatives. Preventing criminals and terrorists from “going dark” does not amount to much more than treating a symptom. What we need to do is shine light on the cause.<49>

This essay originally appeared in the third volume of Digital Debates: The CyFy Journal.
<1> In re Order Requiring Apple, Inc. to Assist in the Execution of a Search Warrant Issued by This Court, Case No. 1:15-mc-01902-JO, Memorandum and Order (E.D.N.Y. February 29, 2016), available at <2> In the Matter of the Search of an Apple iPhone Seized During the Execution of a Search Warrant on a Black Lexus IS1300, California License Plate 35KGD203, Case No. 15-0451M (C.D. Cal. February 16, 2016), available at <3> Christine Hauser, San Bernardino Shooting: The Investigation So Far, New York Times (December 4, 2015), <4> BBC, Brussels Explosions: What We Know About Airport and Metro Attacks, BBC (April 9, 2016), <5> BBC, What Happened at the Bataclan? BBC (December 9, 2015), <6> This was soon disputed, however. See Dan Froomkin, Signs Point to Unencrypted Communications Between Terror Suspects, The Intercept (November 18, 2015), See also Thorsten Benner and Mirko Hohmann, How Europe Can Get Encryption Right, Politico (April 13, 2016), (pointing out that “[i]n the investigations following the Brussels and Paris attacks, the problem was not encrypted data, but data that was unavailable to the appropriate agencies”). <7> Sean Gallagher, What the Government Should Have Learned About the Clipper Chip, Ars Technica (December 14, 2015), <8> Peter Swire, ‘Going Dark’ Versus a ‘Golden Age for Surveillance’, Center for Democracy and Technology (November 28, 2011), <9> Ralph Ellis et al., Orlando Shooting: 49 Killed, Shooter Pledged ISIS Allegiance, CNN (June 13, 2016), <10> Alissa J. Rubin et al., Scores Die in Nice, France, as Truck Plows into Bastille Day Crowd, New York Times (July 14, 2016), <11> Emma Graham-Harrison et al., Munich Attack: Teenage Gunman Kills Nine People at Shopping Centre, Guardian (July 23, 2016), <12> For an extensive critique of technological solutionism, see Evgeny Morozov, To Save Everything, Click Here: The Folly of Technological Solutionism (2014).
<13> Apple, Government Information Requests, Apple (last accessed August 13, 2016), <14> See In re Order Requiring Apple supra note 1. <15> See In the Matter of the Search of an Apple iPhone supra note 2. <16> 28 U.S.C. §1651(a) (“The Supreme Court and all courts established by Act of Congress may issue all writs necessary or appropriate in aid of their respective jurisdictions and agreeable to the usages and principles of the law”). <17> See also Helen Nissenbaum, Where Computer Security Meets National Security, 7 Ethics and Information Technology 61 (2005) (distinguishing between computer and national security). <18> See Chad Perrin, The CIA Triad, Tech Republic (June 30, 2008), <19> See James R. Clapper, Statement for the Record Worldwide Threat Assessment of the US Intelligence Community, Senate Armed Services Committee (February 9, 2016), (where “Cyber and Technology” and “Terrorism” feature as the first and second top priorities, respectively). <20> See also Susan Landau, The National-Security Needs for Ubiquitous Encryption, in Matt Olsen et al., Don’t Panic: Making Progress on the “Going Dark” Debate, Appendix A: Individual Statements from Signatories 1-3 Berkman Center for Internet and Society at Harvard University (February 1, 2016), <21> Giuseppe Macri, Comey on Encryption and Criminals ‘Going Dark’: ‘We’re Not Making it Up,’ Inside Sources (September 10, 2015), <22> See Gallagher supra note 7; Harold Abelson et al., Keys Under Doormats: Mandating Insecurity by Requiring Government Access to All Data and Communications, MIT Computer Science and Artificial Intelligence Laboratory Technical Report (July 6, 2015); Eric Geller, A Complete Guide to the New ‘Crypto Wars,’ Daily Dot (last updated May 5, 2016), <23> See Gallagher supra note 7. <24> See Harold Abelson et al., The Risks of Key Recovery, Key Escrow and Trusted Third-Party Encryption: A Report by an Ad Hoc Group of Cryptographers and Computer Scientists, 3 Digital Issues 1 (June 1998).
<25> See Abelson et al. supra note 22. <26> For criticism of various European proposals promoting technological sovereignty, see Mirko Hohmann et al., Technological Sovereignty: Missing the Point? An Analysis of European Proposals after June 5, 2013, Global Public Policy Institute (November 24, 2014), <27> See Freedom of the Press Foundation, Donate to Support Encryption Tools for Journalists (last accessed August 15, 2016), (describing the protection of secure communication channels as “one of the biggest press freedom challenges in the 21st Century”). <28> See Beats, Rhymes & Relief et al., Letter Re Apple v. FBI (March 3, 2016), available at Gathering_for_Justice_Justice_League_NYC_Opal_Tometi_and_Shaun_King.pdf (urging the judge presiding over the Apple v. FBI case “to consider the dire implications for free speech and civil liberties if the FBI is permitted to force Apple to create technology to serve its investigatory purposes. The FBI’s historically questionable surveillance procedures do not bode well for a precedent that allows the agency universal access to private smartphone data”). See also Jenna McLaughlin, The FBI Vs. Apple Debate Just Got Less White, The Intercept (March 8, 2016), (describing the opposition of racial justice activists against weakening encryption standards). <29> Micah Lee, Ed Snowden Taught Me to Smuggle Secrets Past Incredible Danger. Now I Teach You, The Intercept (October 28, 2014), (demonstrating that encrypted communications played an integral role in Edward Snowden’s whistleblowing efforts). <30> See Mark Latonero, Refugees’ New Infrastructure for Movement: A Digital Passage, Data & Society (February 1, 2016), (arguing that “[P]hones, social media, mobile apps, online maps, instant messaging, translation websites, wire money transfers, cell phone charging stations, and Wi-Fi hotspots have created a new infrastructure for movement as critical as roads or railways”). See also Paula Kift and Mark Latonero, On Digital Passageways and Borders: Refugees and the New Infrastructure for Movement and Control, talk at Data & Society (May 12, 2016), (arguing that the same digital technologies refugees have come to depend upon could just as easily be exploited, by public and private actors alike, for surveillance and control). <31> See In re Order Requiring Apple supra note 1 at 33, FN 28 (pointing out that, “[J]ust as the criminal Feng has done, the United States government has chosen to entrust extremely sensitive communications and secret documents – including those of many of the prosecutors and judges who work in this court – to the passcode protections and other robust data security measures available on a variety of Apple devices. That observation does not mean Apple should have any greater or lesser obligation, as a matter of law or morality, to accede to the demands of law enforcement agencies. But it does highlight the proposition that forcing Apple to compromise the data security measures it offers its customers may adversely affect many who rely on such technology for purposes the government would endorse”). <32> Id. <33> The Sopranos, “Walk Like a Man,” Season 6, Episode 17. <34> See Swire supra note 8. <35> See Olsen et al. supra note 20 at 3 (arguing that “[M]etadata is not encrypted, and the vast majority is likely to remain so”); see also ACLU v. Clapper, Case No. 13-cv-03994, Declaration of Edward Felten 11 (S.D.N.Y.
2013) (arguing that “it is practically impossible for individuals to avoid leaving a metadata trail when engaging in real-time communications, such as telephone calls or Internet voice chats”); see also Paula Kift and Helen Nissenbaum, Metadata in Context: An Ontological and Normative Analysis of the NSA’s Bulk Telephony Metadata Collection Program, 13 I/S: A Journal of Law and Policy for the Information Society (forthcoming) (arguing that changes in the social and technological environment have vastly increased the revelatory power of metadata). <36> See Olsen et al. supra note 20 at 10 (pointing out that “[C]urrent company business models discourage implementation of end-to-end encryption and other technological impediments to company, and therefore government, access”). <37> See Apple supra note 13 (explicitly responding to the Crypto Wars by claiming that “Apple has never worked with any government agency from any country to create a ‘backdoor’ in any of our products or services. We have also never allowed any government access to our servers. And we never will”). <38> See Kim Zetter, The FBI Drops its Case Against Apple After Finding a Way into that iPhone, Wired (March 28, 2016). <39> See BBC supra note 4. <40> See BBC supra note 5. <41> See Alissa J. Rubin, Lawmakers in France Move to Vastly Expand Surveillance, New York Times (May 5, 2016), <42> See Morozov supra note 12. <43> Joshua Hersh, What to Do About Brussels, New Republic (March 23, 2016), <44> See Ellis et al. supra note 9. <45> See Rubin et al. supra note 10. <46> See Graham-Harrison et al. supra note 11.
<47> See Benedict Carey, Mass Killings May Have Created Contagion, Feeding on Itself, New York Times (July 26, 2016), (arguing that “public, widely covered rampage killings have led to a kind of contagion, prompting a small number of people with strong personal grievances and scant political ideology to mine previous attacks for both methods and potential targets to express their lethal anger and despair”). <48> See Malcolm Gladwell, Thresholds of Violence, New Yorker (October 29, 2015), (citing Stanford sociologist Mark Granovetter’s theory of thresholds in the context of mass shootings in the United States). <49> See also Ramzi Kassem, France’s Real State of Emergency, New York Times (August 4, 2016), (arguing that, “[R]ather than extending a state of emergency that serves only to further marginalize them, the French government should address the root causes of alienation among its minority communities”).

The views expressed above belong to the author(s).