Next up is Nate Cardozo. He's going to go over some history up to the current state of cryptography and the law. So let's re-welcome Nate Cardozo.

Thanks. Thanks for coming, and thanks for almost filling the room. I am Nate Cardozo, a senior staff attorney at the Electronic Frontier Foundation in San Francisco. I was on the EFF panel just now and answered a couple of questions, but there may be some more time at the end of this talk. I am a lawyer. I am not your lawyer, unless you know that to be false, because I probably am the lawyer for some of you in this room. I've been working on crypto policy at EFF for the last couple of years, and it's been an extraordinarily busy time. In this talk I'm going to cover a little bit of the past, where we've come from in terms of crypto law, because of course what is old is new again, and where we're going. I'm going to talk about the legal challenges that face people who design, implement, and use crypto around the world. I'm a US lawyer, so my focus will be on the US, but I'm going to touch on some dumb things that other countries are doing besides the United States. We're doing dumb things here, but there are other countries doing dumb things as well. And I'm going to talk about the future and what we're likely to see.

On Wednesday, or maybe Thursday, I forget, at Black Hat, Jennifer Granick said end-to-end encryption is legal, period. All right, that's the state of the law. Questions? I mean, she's right. End-to-end encryption in the United States is legal, period. But there are still some places to go and some things to talk about.

The story I'm about to tell you isn't really particularly true, but I'm going to tell it anyway. From around 1784, when Joseph Bramah invented a particularly good lever lock, until the second half of the 19th century, there was such a thing as perfect security. You're looking at it. That was it. That safe was unbreakable. The lock couldn't be defeated, and with the advent of overlapping cast steel rather than forged steel, the overlaps of course hiding the rivets, you couldn't just break it open. You couldn't pick the lock, couldn't break it open, couldn't drill it, couldn't bash it. You could drop it from a very tall building, so the solution of course is to just build one big enough that you can't lift it. And that's exactly what they did. Of course, lock pickers have been around as long as locks, and locksmiths have been around for a long time. In 1851 a locksmith called Hobbs picked Bramah's unpickable lock. It took him 51 hours, but he did it. Right around the same time that Hobbs figured out how to pick this thing, TNT was invented, and that made it much, much easier to break into this thing. But of course, as I said, none of this is true: safes were broken into all the time in those intervening 67 years between Bramah and Hobbs, and not all of them by picking the lock; plenty were simply blasted open. As you all know, even with a perfect spec, there's no such thing as perfect security. The vulns are in the implementation. Well, sometimes they're also in the spec, but more commonly they're in the implementation. So you overlapped your cast plates, but you left the hinges exposed, and someone just knocked the hinges off the door.
One of the co-founders of the Electronic Frontier Foundation, John Gilmore, in something like 1993 (I cannot get an exact date for this statement) said: the internet interprets censorship as damage and routes around it. That statement is more true now than it was twenty-something years ago. In '93 there was no Tor, there were no VPNs, there were no anonymizing proxies, at least none to speak of. We barely even had the first inklings of transport layer security when Gilmore said this. But there were words, lots of them. And images and code, politics, you name it, it was online. And for more than two decades the internet has provided us with a truly global platform for expression. Today anyone can write an opposition party blog, post photographs of their cats (which, if you follow me on Twitter, you see), organize a street protest, contribute to open source crypto on GitHub, send 419 spam, search for extraterrestrial life, mine bitcoin, swap selfies, use PGP.

But in the 90s we had the first crypto wars. What I've put on the screen is actually an anachronism: it's a Perl implementation of RSA, so it's not exactly the right time period, but you get the picture. The first crypto wars were an attempt by the U.S. government to regulate that; you couldn't put it on the internet. If you were here for the last panel, there was a question about ITAR versus EAR. This was considered a munition in the same category as hand grenades or tanks, and you had to get the same permit to put that online as you did to export a tank. The fear was that this would become this. This is of course the Enigma machine, or rather a set of its code wheels. Raise your hand if you're familiar with Enigma. Okay, most people. Enigma was not invented as a military technology. Enigma was invented to protect the European banking system. It became famous after some modifications by the Nazis, when it went into service in World War II, and for a time it defeated all allied cryptanalytic attacks. For a time it provided what was, for its day, perfect security. It took a set of stolen code wheels, the invention of the computer, and the most brilliant cryptographic minds of their time, both in Poland and in the United Kingdom, to crack this thing.

But getting back to RSA: the US government's fear was that if we didn't regulate this, it would allow our adversaries this, perfect security. We'd end up with a situation where a cipher designed to facilitate banking, which both RSA and Enigma were, would instead be used by the Soviets for their nefarious plans for world domination, because that's what they did. Who here is old enough to remember this? Okay. A case in point: I'm certainly old enough to remember this befuddling option. Do you want Netscape Navigator for the US, or for the rest of the world? Do you want the version that only supports 40-bit RC4, or the full 128-bit-capable version? The strong version, of course, was only available if you lived in the United States or Canada, because of the inclusion of encryption in ITAR. But, you know, this was the 90s. There were no geo-IP blocks, there were no verification mechanisms, and all you had to do was check the box that said you were in the United States.
To get the strong version. That was it. The US implementation was that bad, and it was completely ineffectual. It didn't keep strong crypto out of anybody's hands. And it led to things like this: people put algorithms on t-shirts. You couldn't publish this on the internet, but you could print it on a shirt and wear it through the airport. It led to this. Theo de Raadt, of course, lives in Canada, so he wasn't subject to any of it. It led to this. Who here recognizes this? Awesome. Crypto moved literally offshore for a time. This is Sealand, the principality of. And it led to this. This is the Clipper chip. I'm going to talk a little bit about this later.

Of course, I'm a lawyer. My colleagues are lawyers. My boss is a lawyer. And if all you have is a hammer, then everything looks like a nail; if all you have is a JD, then everything looks like a lawsuit. In the late 90s, a grad student at the University of California, Berkeley walked into the EFF office. I don't think literally, but we can picture that. His deal was that he'd invented something and he wanted to publish a paper about it. His name was Dan Bernstein. He's now a professor at Chicago and Eindhoven, and one of the best cryptographic minds of our generation. He wanted to publish a paper about Snuffle; he didn't even want to publish the code at first, although he wanted to do that too. So we went to court for him. My boss, Cindy Cohn, who is now the executive director of the Electronic Frontier Foundation and at the time was a partner at a small firm down the peninsula in the San Francisco Bay Area, represented Dan. Code is speech. And we won. And crypto is now legal and exportable. The Ninth Circuit in Bernstein said the availability and use of secure encryption may reclaim some portion of the privacy we have lost. And that's still true. The UN Special Rapporteur on freedom of expression just last year agreed, and stated in his report that encryption protects not only security and privacy, not only free speech, but the right to hold opinions without interference. And that's exactly right. We depend on crypto in order to read, in order to write, in order to speak, all around the world and in the United States.

This was a button produced back in the first crypto wars against the Clipper chip, that really terrible little chip I showed you a minute or two ago. The Clipper chip was an NSA-developed chipset for secure voice communications. The thought was that it would be installed in all of our regular old telephone handsets and we would be able to make "secure" calls using it. It used something called the Skipjack encryption algorithm, and it included a backdoor with key escrow in something called the LEAF, the Law Enforcement Access Field. Matt Blaze at Penn, among others, showed that the escrow mechanism was broken, and thankfully the Clipper chip was defeated and key escrow appeared to be dead. Or at least requirements for key escrow appeared to be dead. And the internet was a safer place for it.
This is the cute little golden key. EFF had a campaign around the golden key, asking webmasters to put it on their homepages, because that's what we had back then, in support of strong crypto without key escrow. This was from 1996. I was 15 at the time, and my homepage had this key on it. The Clipper chip failed mostly because it sucked, but also because of the actions of cryptographers like Matt Blaze and his partners, who were able to show policy makers just how insecure it was. And thank your lucky PKIs for ECCN 5D002. This is the encryption exception to the export controls. This giant block of text, which I'm sure you can all read and have already entirely digested, is what makes strong crypto legal and exportable today.

And we thought we had settled this. We won. Our friends in Mountain View and Cupertino are free to ship products that actually protect their users' security to the best of their ability. People like Moxie Marlinspike and Adam Langley are free to publish free and open source tools for all of us to use. As Jennifer Granick said, end-to-end encryption is legal, period. But, thanks to Comey, more work remains. And we're back to exactly where we started. Everything that is old is new again.

In 1997, the director of the FBI at the time, Louis Freeh, said: we're in favor of strong encryption, robust encryption. The country needs it. The industry needs it. We just want to make sure we have a trap door and key so that we can get into anything that you look at. That sounds shockingly familiar to what the director of the FBI today, James Comey, is saying almost 20 years later. 2015 and 2016 brought us a new set of challenges. iOS 8 and Android M brought us full disk encryption by default. WhatsApp joined iMessage in offering actual end-to-end encryption to more than a billion people around the world. And that's to say nothing of Signal or GPG or Tor, or Pond if you're crazy enough to use Pond. And we're back into the crypto wars. We're calling it the next crypto wars, or Crypto Wars 2.0.

You heard what Louis Freeh said about crypto in 1997. In 2015, Jim Comey, the current director of the FBI, called crypto only a business model. The government has of course been downplaying companies' support for it, calling it just a marketing pitch and not a technical feature. This completely disregards the facts of strong crypto. IDG and Lookout did a survey in 2013, before iOS 8 and Android M, which found that of the something like 4 million phones lost or stolen in the United States, fully a quarter of those lost or stolen devices resulted in identity theft. A million Americans were victims of identity theft because we didn't have full disk encryption on our phones. And that's where Director Comey wants us to go back to. And I think that's crazy, because what could possibly go wrong with backdooring our crypto?

To this day, most actual technical proposals for weakening encryption are something like this. They are something like key escrow. Or maybe double key escrow, where you escrow the key once to a key held by the manufacturer and then again to a key held by the government, so that you need both to unlock it. Something like that. That's not really a good idea.
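Just to make that double-escrow idea concrete, here is a minimal toy sketch in Python. This is my own illustration of the general concept, not any actual proposal: the device key is split into two shares so that neither escrow agent alone learns anything, and in a real scheme each share would additionally be wrapped under the manufacturer's and the government's respective keys.

```python
import os

def xor(a: bytes, b: bytes) -> bytes:
    """XOR two equal-length byte strings."""
    return bytes(x ^ y for x, y in zip(a, b))

# The key that actually encrypts the user's data.
device_key = os.urandom(32)

# "Double escrow": split the device key into two shares such that neither
# share alone reveals anything about the key, but both together reconstruct
# it. In a real proposal, one share would be held (wrapped) by the
# manufacturer and the other by the government.
manufacturer_share = os.urandom(32)
government_share = xor(device_key, manufacturer_share)

# Neither escrow agent alone holds the device key...
assert manufacturer_share != device_key
assert government_share != device_key

# ...but the two acting together can recover it.
assert xor(manufacturer_share, government_share) == device_key
print("device key recoverable only with both escrow shares")
```

The point of the sketch is simply that a reconstructable copy of the key material has to exist somewhere, held by third parties, for any of these schemes to work, and that is exactly the property the paper I'm about to mention takes apart.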
Luckily, a whole bunch of academics wrote a really good paper telling us why it's not a good idea. This is Keys Under Doormats, published last year? The year before? It's either 2014 or 2015. Y'all should read it; the technical people should read the whole thing. For lawyers like me, read the executive summary; it's very good. The people who wrote this paper are some of the best cryptographic, computer security, and security engineering minds alive today. And they write that what Jim Comey is asking for "would pose far more grave security risks, imperil innovation, and raise thorny issues for human rights and international relations."

Keys Under Doormats identifies three major classes of problems with lawful intercept or lawful access capability. First, lawful access would necessarily abandon advances in crypto such as perfect forward secrecy. That's crazy. We barely know how to build secure devices and secure systems as it is; we're bad at it, we don't know what we're doing. And to abandon the state of the art and roll back to the bad old days, when a lost or stolen phone had a 25% chance of resulting in identity theft, strikes me as a really bad idea. It strikes them as a really bad idea too, and they're smarter than I am. Second, it would necessarily increase system complexity. The keys-under-doormats metaphor kind of breaks down here, but you see what they're getting at. There is no such thing as a back door that only the good guys can walk through. Remember back with the safe: you had an unpickable lock, you had cast plates that overlapped, but you left the damn hinges exposed for someone to knock off so they could open the door. That's the problem. The problem here isn't necessarily with the protocol but with the massive increase in complexity that any sort of lawful access system is necessarily going to cause. And finally, doing something like this is going to concentrate the attacker's focus onto one or two incredibly valuable points of failure. And by definition, the key material is going to have to be kept online, because as Jim Comey or the District Attorney of Manhattan, Cyrus Vance, have repeated over and over, they're going to use these capabilities a lot. They're not going to be okay with having the keys kept in secure offline storage. They want push-button access to our communications.

And so Comey has actually heard us. He's heard us and he's come around. He's not pushing for back doors anymore. He said last year that "we're not seeking a back door approach. We want to use the front door." Which of course is the same damn thing. The Washington Post put it a little more weirdly, I'll say. They wrote, and this is a quote: "A back door can and will be exploited by bad guys, too. However, with all their wizardry, perhaps Apple and Google could invent a kind of secure golden key." That's what the Washington Post called it. Of course, you know, sufficiently advanced technology is indistinguishable from magic. This thing is magic to people like Jim Comey, to people like the editorial board of the Washington Post. They don't know how it works.
It's obviously magic, right? So if the wizards at Mountain View or Cupertino can design this, then they should just nerd harder and invent the golden key. Or they'll beat us up and take our lunch money. Like, come on. But that's not the way the world works.

Okay, the slide you're about to see is false. NSLs are not magic. Only friendship is magic. There is no legal tool in place, at least in the United States, that is currently sufficient to require a provider or developer to maintain or create the ability to provide plaintext on demand. That is a much more verbose way of saying what Jennifer Granick said at Black Hat earlier this week: end-to-end encryption is legal, period. There is a perception in our community that NSLs are magic, and I'm here to hopefully help you rid yourself of that perception. National security letters and other types of national security process are terrifying. They're scary, and they operate with almost no oversight. National security letters get issued without even a judge's signature. But they're not magic. With an NSL, you can get subscriber information and maybe a little bit of transactional information. You can't get content. You can't get a backdoor. You can't force someone to build code. Jennifer and Riana at Black Hat the other day gave a great talk about technical assistance orders. Technical assistance orders may be a little bit more magic, but we don't know. I'll talk a little bit about that later today.

But there are things that might be magic around the world. Many countries are looking at or considering legislation that would mandate backdoors, or have already mandated access to plaintext or otherwise endangered encryption. The Investigatory Powers Bill just passed the House of Commons and is up in the House of Lords in the United Kingdom. Section 189(4)(c) of the IPB says that operators may be obligated to remove electronic protection at the sole discretion of the Home Secretary. What does that mean to you? Well, to Theresa May, who was the Home Secretary at the time the IPB was introduced and is now the Prime Minister of the United Kingdom, it means that the Home Secretary will have the capability to order providers to strip end-to-end encryption in the UK, if at the Home Secretary's discretion it's practicable. Note that the Home Secretary is not a cryptographer. The second major problem with this statute is that it would grant the UK the power to issue a national security notice, another secret instrument, even more vaguely drawn than removing electronic protection, that would require operators (and "operators" is construed very broadly, to include things that aren't UK entities, like Google and Facebook and Apple) to carry out conduct, including the provision of services or facilities, which the British government considers necessary in the interest of national security. They don't have a First Amendment in the UK. They don't have the arguments that won the day in the Apple-FBI litigation. And this scares the living hell out of us.
In Australia, the Australian Department of Defence (that's not a typo, that's just how they spell it down there) has already passed a regulation, the DSGL (I don't remember what that stands for), that prohibits intangible supply of encryption technology. This is terrifying to us. Many ordinary teaching and research activities may well be subject to unclear export controls under this statute. We don't know how Australian courts are going to interpret it, but it is certainly plausible, given just the plain reading of the law, that it is now illegal in Australia to teach encryption to students who aren't Australian citizens. That's horrifying. Other countries are doing crazy things as well. China passed an anti-terror law last year, the final version of which says, and this is the best translation I could find, that companies shall provide "technical interfaces, decryption and other technical support." End-to-end crypto is not legal in China. Period. To mangle Jennifer Granick's phrase from earlier.

Okay, now I'll turn back to the US. Thanks, Obama. In October of last year, the President said we will not, for now, call for legislation requiring companies to decode messages for law enforcement. Okay, there's a problem there. Can you spot it? I bolded it for you. A month later, the National Security Council issued a secret decision memo, thankfully leaked to Bloomberg, who published it, that said they were going to identify laws that needed to be changed to deal with "going dark." So "for now" lasted a month. Also at around the same time, we saw people like the Director of National Intelligence start thinking about what would be necessary to change the political climate in the United States in order to get those laws changed. And, of course, in March 2016 at South by Southwest, the President of the United States sat down to talk about crypto. And that's what we got: we went from "not for now" to "if we don't, we're fetishizing our phones." Bob Litt, who's the General Counsel at the Office of the Director of National Intelligence, one of the chief lawyers in the security apparatus of the United States, said that the encryption debate could turn in the event of a terrorist attack or criminal event where strong encryption can be shown to have hindered law enforcement. And that's what we got in San Bernardino in December.

Of course, I'm not even going to ask for a show of hands; I hope you're all familiar with what happened in San Bernardino and its aftermath. What was this case really about? The FBI wants the ability, and here I'm actually paraphrasing what Jim Comey said under oath before a hearing in the United States House of Representatives, to mandate that companies turn our devices into tools of surveillance. It wasn't about this one phone. If the question in San Bernardino had been limited to "should the FBI be able to unlock a single terrorist's phone," I'm comfortable with saying yes, the FBI should. But that's not what the case was about. We saw from the leaked National Security Council memo, and from statements by people like Bob Litt, that they were just waiting for a terrorist attack or criminal event to turn the public tide.
That's what the Apple-FBI case was about. It was about whether or not the FBI or the Department of Justice or the U.S. government can compel a company to change its practices. The only reason, I would submit to you, that the FBI pursued the case in the way they did was to set a legal precedent that would give them the ability to demand that U.S. tech companies stop providing end-to-end encryption or secure device storage. And they saw it as a win-win. The FBI thought that even if they lost the court fight in San Bernardino, they'd be able to take that loss to Congress and ask for a fix. There was also a very similar case in Brooklyn, in front of a magistrate judge named Judge Orenstein. That case was about an iOS 7 device, I think, so it was probably unlockable, and they got into it as well.

But the FBI's ask in both of those cases, in both San Bernardino and Brooklyn, was ill considered for three reasons. First, legally, what the FBI was asking for represented a fundamental shift in the way the All Writs Act has been interpreted. I have, in other contexts, gone a lot deeper into the All Writs Act and what it is. I'm not going to for this audience; you'd find it super boring. But in any case, it has never been used to compel an American company to sabotage its own products. The All Writs Act was passed in 1789, and it was certainly available to police back in the Joseph Bramah days, in the days of that first unbreakable safe. Brinks and Wells Fargo were never compelled to create a master key to their safes. That is something American courts had never done. Second, technically, the ask was flawed. As I said earlier, we don't know how to build secure systems, and the fact that the FBI was considering mandating that Apple undermine the security of an already not perfectly secure device was crazy. And it seemed crazy not just to the left-wing radicals at EFF but to the several dozen companies that submitted amicus briefs in support of Apple's position in San Bernardino. And finally, the FBI's ask was flawed for policy reasons. There's no way that an FBI backdoor would stay an FBI backdoor. The Russians, the Chinese, the Brazilians, the Turks, the French, the Germans, you name it, are going to want the same access. And the only reason that Apple has been able to say no to the Chinese, to the Russians, to the Brazilians, is because they don't give it to the FBI. Once that changes, the calculus all around the world changes very quickly. And that would be crippling not just to tech, but to American business generally.

There are other litigations happening around the country, or at least we think there are. Wiretap Act litigation may be ongoing. In March, Matt Apuzzo wrote a story in the New York Times about an order directed at WhatsApp: a United States federal court order directing WhatsApp to do something, we're not exactly sure what. We don't know which court it's in front of, and we don't know if the litigation is ongoing. If I had to guess, I'd say it probably isn't ongoing right now, but who knows? There may be FISA court orders. The FISA court, the Foreign Intelligence Surveillance Court, sits in the basement of a federal courthouse in Washington, D.C. and meets literally in a Faraday cage to issue its secret orders.
Litigation before the FISA court, generally speaking, is one-sided. The government stands in front of the judge alone and is unchallenged. That is changing a little bit; there's now an amicus provision, and an order directed at a company might be contested. So far as we know, only one provider has ever contested a FISA court order, and that was Yahoo in 2007 or 2008. We didn't, of course, learn about that until 2013 or 2014. 2014, I think. But we're in the middle of a FOIA case to get access to any decryption orders that might exist at the FISA court. One of the nice things that happened last year, and this is a minor win for us, is that the USA Freedom Act was passed by Congress and signed into law by the President. One section of USA Freedom says the government has to declassify significant FISA court opinions. Of course, it doesn't really define what "significant" is, and it's not clear whether it's retroactive, so we sued. And we're suing to get a hold of that.

Oh, remember when I said they were just waiting for a big terrorist or criminal something to update the law? That happened in San Bernardino, and we got the Burr-Feinstein bill. Luckily, the Burr-Feinstein bill seems to be dead right now, but it would have required providers of just about everything to decrypt on demand. It carried civil and criminal penalties and would have applied not just to communications, not just to storage, but also to licensing, which means it would have included app stores. If Burr-Feinstein had been passed in its original form, it would have required Apple and Google to decrypt on demand, and to censor the App Store and the Play Store to make sure that nothing had crypto in it. And, of course, not just end-to-end encryption but full-disk encryption would have been included. Actually, if you read the Burr-Feinstein legislation literally, if you take it to its extreme, it would have outlawed general purpose computing. That's just a hint of how out of touch the drafters of this legislation are.

Okay, 2016: what are we looking at? There could be a key escrow mandate. That's certainly something that China and India feel comfortable with. I don't think it's going to happen in the States, for a couple of reasons, all of which are enumerated in the Keys Under Doormats paper. The Burr-Feinstein bill may be redrafted and reintroduced. It definitely won't pass in its current form because, as I said, read literally it outlaws general purpose computing, and even Congress isn't that dumb. I mean, maybe. They might be. But a law that says "we don't care how you do it, just make plaintext available" is certainly plausible to me. That's the route the UK seems to be taking. The Investigatory Powers Bill is, as I said earlier, in front of the House of Lords. If the House of Lords passes it, it will become the Investigatory Powers Act, and that will be the end of end-to-end encryption in the UK. I gave a talk at Real World Crypto in January and made a lot of predictions, very few of which came true. I didn't anticipate the All Writs Act litigation at all. But I made a prediction that the government is going to focus on defaults, not primitives, and I think that's right. I think that's still right. The government knows.
They're not stupid. They know there's no way of keeping strong crypto out of the hands of people who are really determined to get it. But there is a way of keeping strong crypto out of the hands of everyone who just walks into the Apple store and buys an iPhone: they can force companies to change the defaults. We've seen a couple of states try it. In California and New York, a pair of nearly identical bills were introduced at the beginning of the year that would have made it a crime to sell a smartphone with encrypted device storage by default. They didn't even really try (California tried a little bit, but not very hard) to make it impossible to install full disk encryption. They really just care about the defaults. They know they're not going to get the terrorists. They know they're not going to get organized crime. They know they're not going to get the pedophiles. They're going to get us. They're going to get ordinary Americans. And that's what these two bills were about.

What's likely in 2016? Informal pressure. One of the things that I do along with my colleagues at EFF is represent developers, and sometimes small companies, who get a visit from their three-letter-agency friends. So I kind of know what this looks like. The FBI will request a meeting. They'll come down and sit in your office and say, you know, it'd be really great if you gave us a backdoor. And if you don't, blood will be on your hands, and they'll show you pictures of terrorists using your product. That happens. So they don't necessarily need to force you; they can just pressure you real hard. I don't think any ban we could possibly see in the United States is going to hit free and open source software. I don't think it's possible. We have the First Amendment here. They're not dumb enough to try. Well, Dianne Feinstein is dumb enough to try, but I don't think that's actually going to pass. Two slides ago I said it's about defaults, not primitives. I don't think we're going to see bans on primitives. I don't think we're going to see algorithms targeted. I think we're going to see defaults targeted.

We might see a CALEA-like mandate. CALEA is the Communications Assistance for Law Enforcement Act, passed in 1994, which requires telephone companies, landline and mobile, to have wiretap capabilities. This is a relatively decent possibility. A mandate like this would apply to providers doing business in the United States: as a condition of selling something, they must maintain the ability to turn over plaintext. This is only going to touch Apple and Google, maybe even the app stores, and it's not going to touch GitHub or your pet free and open source crypto project. And of course, countries around the world might continue to do dumb things. Kazakhstan appears to want everyone to install their certificate into your trust store so that they can man-in-the-middle all of your SSL (I'll sketch a minimal way to check for that kind of interception below). They de-published that requirement, so it's not clear how serious they were, but they were certainly thinking about it. China already has.
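As an aside, and purely as my own illustration rather than anything from the talk, here is a minimal sketch of how you might spot that kind of interception: connect to a site and look at who issued the certificate you were actually served. If a state-mandated root has been installed in your trust store and a middlebox is re-signing traffic, the issuer will name that root instead of a public CA; if the root is not installed, the handshake simply fails verification. The hostname used is just an example.

```python
import socket
import ssl

def print_cert_issuer(host: str, port: int = 443) -> None:
    """Connect to host over TLS and print who issued the certificate we received."""
    ctx = ssl.create_default_context()  # verifies against the local trust store
    with socket.create_connection((host, port), timeout=10) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()
            # "issuer" is a tuple of RDNs; flatten it into a dict for display.
            issuer = dict(rdn[0] for rdn in cert["issuer"])
            print(f"{host}: issuer = {issuer.get('organizationName', issuer)}")

if __name__ == "__main__":
    # An unexpected issuer here (for example, a national "security certificate")
    # is a strong hint that the connection is being intercepted.
    print_cert_issuer("www.eff.org")
```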
But it's not going to work any better this time than it did the last, right? The last time, all you needed to do was put it on a t-shirt and walk through an airport. Information doesn't give a crap about borders. These aren't centrifuges. These aren't Scud missiles. These aren't nerve gas precursors. You can't stop code at the border. We live in a world with strong cryptography, and there's nothing the US government or any other government around the world can possibly do to change that fact. We have Tor. We have GPG. We have Signal. And we're beginning to have real, accessible tools to evade censorship. WhatsApp is used every month by 1.1 billion people around the world, with strong crypto. That's amazing.

So what's to be done? What if you're a developer staring down the barrel of an order or a request or a demand or an NSL, or the NSA comes and sits in your office and says blood will be on your hands? Email info@eff.org and we'll help. What if you're just a regular person wanting to fight back a little against the surveillance state? We've got a site for that, in seven languages. We will show you how to install Signal or WhatsApp on your phone. We'll show you how to turn on full disk encryption on any device you might have that supports it. We will help you with threat modeling. SSD, Surveillance Self-Defense, is awesome and you should definitely go there. And what if you just have some questions? I don't know, ask them. That's it. I think, do we have a couple of minutes? Yes, we have a couple of minutes. There's a mic at the front. Line up if you want to ask a question or two.

Hello. Hi. How do you feel about the Democratic platform? Did you read the tiny little section, and I mean tiny, on cybersecurity? So, unfortunately, EFF is a 501(c)(3) nonprofit and we can't do anything about it. We can't get involved in election politics. Okay. Yeah, because they use that weird language about the false notion of a tension between privacy and security. Well, I can tell you what I think about privacy and security: you can't have one without the other. We need both. There's no tension between privacy and security. We need them both.

Hello, how you doing? Hey, pretty good. All right. I keep hearing that there's no such thing as perfect security. Would you say Bitcoin has perfect security, barring, you know, the unlikely quantum thing? I have no idea. Luckily, EFF at this point is big enough that I can trust other people to think about cryptocurrency and I can think about crypto without currency. So send an email to info@eff.org. Okay, thanks. Yeah, thank you.

So, the normal political argument for weakening crypto is that we need to catch the terrorists, plus if you have nothing to hide, why should you care? But we have an increasing number of terrorist events, at least hitting Western media. So where does that leave us? If each time there's a terrorist event it convinces a certain portion of the public to care less about privacy, are we doomed, or...? I sure hope not. If I were a pessimist, there'd be no reason for me to get up every day and go into work at EFF. I have to be an optimist on this. As I said earlier, there's nothing anyone can do to keep strong crypto out of the hands of someone determined to use it. In terms of the "I have nothing to hide, why should I care" argument, that's something we hear a lot at EFF.
We hear it from policy makers and from regular people, and my response is that it's not about you. It's literally not about you; it's about everybody else. I don't have anything to say, and yet I benefit from freedom of speech, because other people's speech benefits me. Their robust exchange of ideas benefits me. And privacy is the same. I benefit from you having privacy, because privacy is a prerequisite for social change. Privacy is a prerequisite for democracy. We couldn't have had a civil rights movement or a gay rights movement in the United States without privacy. You can't organize in public. If you're an LGBT teen in Saudi Arabia, you can't organize in public. You need privacy, and I benefit from privacy being available around the world.

I'm sorry, that's all the time we have. If you want to point somewhere else where you can take more questions, if you have the time. I'm going to be going to the contest area. I have a two-and-a-half-hour booth shift, so if you want to meet me at the EFF booth, I'll be there in like five minutes. That's where I'm headed, and I can continue answering your questions there. And while you go over to the contest area to talk to him, you can, at the same time, get a mohawk and donate to the EFF. And I haven't seen enough mohawks this year. Thanks, everybody!