>> So this is 'Slouching Towards Utopia: The State of the Internet Dream.' Please welcome Jennifer S. Granick. [Applause]

>> Thank you, thank you. Really? New College? Awesome! My alma mater. I should talk a little bit about New College in this talk. I want to thank everybody for coming. I think I got a really awesome room to speak in this year, with these totally cool computers behind me. And I want to also thank my parents, who are here, over here. [Applause] They've gotten to come see me talk a number of times here at DefCon. My dad's hacker name is 'The Eagle,' so you guys can look him up.

I want to talk today about something that I think a lot of us here at DefCon hold really dear. I spoke about this last year at Black Hat when I gave the keynote. Can I just see a show of hands, how many people saw my Black Hat keynote? Okay, so some, but not all. So the beginning, I apologize, might be a little bit repetitive for you, but I think it's important to set the stage. Because in that speech I talked about something that I called the dream of internet freedom. It's something that I think means a lot to many of us. And I said that the dream of internet freedom was endangered, and that we as a community needed to start taking that seriously if we wanted to ensure that we could have some of those things that animated us politically, animated us technologically, in the early days of the internet.

So today I want to revisit that idea of the dream of internet freedom, take a look at what's happened over the past year, and see whether we're getting closer to the dream or further away. And my goal is to end the talk early, before all of the time is up, so that we can start to have a conversation and talk together about what we might do if we want to take the dream seriously and try to see some of these things come to fruition.

Before we get there, though, I want to take a little time, for the people who didn't see the speech, to talk about what this dream of internet freedom is. And I'm going to talk about it from my experience, and how I came to believe in this utopian vision. For me it started when I read Steven Levy's book Hackers: Heroes of the Computer Revolution. In this book I learned that information wants to be free. I learned that computers would help us understand authority better, and be able to be more individualistic, to make our own decisions about what was right and what was wrong. I learned that computers could connect people so that we could talk to each other and share information.
Then I learned that individual rights were something that could be built in to the technological world around us. That decentralization was not just a principle of computing, but a principle of political freedom as well. And so this came to be what I started to understand as the dream of internet freedom: the dream of a free, open, interoperable, reliable internet, where people can speak their minds and anyone who wants to hear it can listen.

Not that long after, I also learned that hackers were a thing, and that hackers were really intimately tied to this dream. Steven Levy tells this story about old school hackers back in the 1950s at MIT, but I learned that there are people today who are trying to make this dream true. Because I discovered the Hacker Manifesto, written and published in 1986, which I guess has an anniversary this year, published in Phrack magazine. And there I learned that hackers were people who wanted free access to information and were willing to take the time to build the tools to make it so. The hackers wanted a world where curiosity was its own reward, and where people could explore the world around them and find their own truths, not just accept the conventional wisdom. As The Mentor explained it, with this background, the future could be a place where people would just meet mind to mind and exist, without skin color, without nationality, without religious bias. And I wanted this to be true.

But I learned, when I started to read The Hacker Crackdown by Bruce Sterling, that it was in danger. There were law enforcement agents who were using a law called the Computer Fraud and Abuse Act, my personal enemy, to go after people who were trying to explore this network, interpreting these laws in these huge ways that basically criminalized curiosity. Around about the same time, I was going to law school, and I was like, this is going to be the thing I really try to fight against.

And around about the same time, we learned about the horrible global scourge of online pornography.

>> [inaudible]

>> And... haha, some people here are pro-porn, I understand. I was pro-porn too. The idea was that the internet, which should be this nice place for people to hang out, was polluted with all of this terrible stuff, and that we needed to get rid of it. That was a very common point of view: there shouldn't be anything dirty on the internet. And another, alternative view was that, well, if there is going to be dirty stuff for those few people who are interested in it, it should be zoned.
Like in a city, how there's a red light district that's kind of slummy and run down, and you've got to go there, but everything else should be policed and kept clean. And this idea was terrible for those of us who wanted the internet to be a place for the free exchange of ideas. It was terrible because our dream was that the internet would be like a library, but better than a library, where every book that had ever been written was on the shelf and available. And here were these people coming in and prudishly saying, no, it's not going to be like the library, it's going to be like TV, or radio. And we didn't want that. So this galvanized a generation of activists who wanted to fight against this vision of the internet as TV or radio, in favor of the vision of the internet as free, freer even than a library.

So into this mix comes John Perry Barlow, lyricist for the Grateful Dead, founder of the Electronic Frontier Foundation, lovely man, very poetic writer, and he drafted this 'Declaration of the Independence of Cyberspace.' And in the Declaration of the Independence of Cyberspace he set forth this vision, and he said in it: "Governments of the industrial world, you weary giants of flesh and steel, I come from cyberspace, the new home of Mind. On behalf of the future, I ask you of the past to leave us alone. You are not welcome among us. You have no sovereignty where we gather."

Now, this vision wasn't just a reaction against the whole cyberporn hysteria, with Marty Rimm and the passage by Congress of laws saying it's a crime to put porn on the internet. It was reacting to those legislative proposals, but it was also a reaction to 'government and business as usual,' and a real expression of this dream of internet freedom: that the people, together, would govern themselves, and would be able to take care of, and provide for each other, this intellectual stimulation. Okay? So this galvanized me, motivated me in my legal career, and I think that it did the same for many, if not most, of you. That this was the thing that got you excited about computers in the first place and made you want to be a part of it.

So, the dream of internet freedom: that we would be able to overcome age, race, class, and gender. That we would be able to communicate with anyone, anywhere. That we would have free access to information, wherever in the world it was created. The hands-on imperative, that we wouldn't have to take the way things worked for granted, that we would be able to investigate and explore it for ourselves and find our own reality. And ultimately, the idea that computers would liberate us.
So how are we doing? Well, I think that this utopian vision is less and less true every day. And it's less and less true because of some dynamics that maybe we couldn't have predicted at the time. Right? Today, technology is generating more information about us than ever before, increasingly making a map of what we do, for governments and for companies to learn about us and to attempt to manipulate us, or to otherwise regulate us. It is a golden age for surveillance.

Today, there is a real appetite for censorship. Racism and sexism have proven more than resilient enough to thrive online, and people who use the internet don't want to hear this garbage. They want someone to take care of it for them. And so there's an appetite for censorship, not just in the United States, the home of the First Amendment, but in countries around the world that have an even less robust free speech tradition than we do.

And there's a real appetite for corporate control. A lot of that has to do with spam, malware, and bad user interfaces. People want their technology to work. They are attracted to a safe, closed ecosystem where spam and malware are taken care of, where software updates happen automatically. There's just not an appetite for individualized control of technology. There's no market for it.

So these are the things that are happening today: surveillance, censorship, and control. And they are enabled by three trends, three things that are happening that make this possible. One is centralization, the idea that all our services are now one thing: Google for search, Gmail for email, Facebook for social networking. The second is regulation: governments are getting involved here, regulating for these things that people want. And ultimately, globalization: internet companies are global from the get-go, and so it's not just the United States that's involved, or the United States and Europe, it's countries from around the world that are getting involved in the regulation game.

So that's my sad state of affairs. Now I want to talk about what's happened over the last year or so, and see if we can learn from recent experience about these things.

Let's take freedom of expression. All right. We have moved very far from the idea that people get to see whatever information they want to see. It's actually particularly scary because sometimes we don't even know, maybe most of the time we don't even know, what information is being removed from our purview. So here's an example. Facebook has been sued multiple times by families whose loved ones were killed by ISIS.
And these families have said that it's at least partially Facebook's responsibility, because Facebook has videos from jihadists online that are talking about and trying to foment allegiance to the group and foment jihadist activities. Okay? So Facebook is getting all this pressure to take things down. And it's not just from a few civil litigants. United States government officials go to these social networks and they say: we know we can't make you do this, under the First Amendment, but surely you can exercise some corporate responsibility and take down these beheading videos and such. Surely you don't need to carry this kind of terrible, inflammatory crap on your network. And companies start to take this stuff down.

But then you have something like the Philando Castile video, a video of an African American man who was shot and killed by police officers for no good reason. And initially, Facebook took the Castile video down. And then there was this outcry: why are you taking this video down? You're censoring this evidence of police brutality. People need to see how this man died. These are the same thing; they're not different. But we're putting this pressure on the intermediary, on a private company, to make these decisions for us about what we're going to get to see and what we're not.

All right? And it's not just Facebook. On Twitter, some guy basically harassed an African American actress, Leslie Jones, the woman who appeared in Ghostbusters, to the point where she was like, "Screw this, I'm not going to be on Twitter anymore, it's too filled with hate speech, I don't want to put up with this." And so Twitter blocked the guy, and he claimed, okay, it's a big violation of my free speech. He can go on the web, and he can still say whatever he wants; if people want to find it, maybe it will be a little bit harder, I don't know. But the web only really works as open if you can find the stuff you want when you search for it. And that means Google.

Google is the dominant search engine basically around the world. Some countries have their own, but in almost every country of the world, Google is the dominant search engine. Well, what does Google carry? They have a huge responsibility, because that's the venue, that's the path through which most people are going to find the information that they want. So Google is under an immense amount of pressure to change its search results in order to accommodate certain political demands. Google is supposed to demote torrents, because torrents are copyright infringing. And Google is now subject to orders in Europe about the right to be forgotten.
Now, the right to be forgotten is the idea that even true information about you at some point becomes outdated or outmoded, and if you don't want people to be able to find that thing about you and have it be the main thing they know, you can go and be delisted from the search engine, so that when they search for you, they don't see your DUI, or your terrible high school debate performance, or whatever it is that you don't want people to see. So it's this privacy right that has developed in the European Union, but the result of it is that when people search for you in Google, they're not getting truthful information, right? France and Germany are the countries that have been really advocating this, saying, well, it's not really even a conflict between privacy and free speech, because you don't have a free speech right to have this private information about people. A very different vision than we have here in the United States.

So ultimately the question is: Google has to comply with French law. They have to; they operate there. What does it mean, though, to comply with French law? Does it mean that if I'm a French citizen and I have information about me de-indexed, then nobody in France can get it? Or nobody in Europe can get it? Or nobody in the entire world can get it? (A toy sketch of this scoping question follows below.) And what France is doing now is pushing its case against Google, arguing that Google is legally obligated to remove these results not just for France, not just for Europe, which has the right to be forgotten, although it's different in different countries, but for everybody. Even here in the United States, where we would have a right to access that information. Sorry, technical difficulty. Why should France be able to tell us what we can and can't see? But because these are global companies being regulated by global governments, that's what's happening.

In another example, people are suing in Europe. They're suing Twitter and YouTube and Facebook for what is, in their opinion, an insufficient response to hate speech. But hate speech is legal in the United States. And there was a news story this year where the German Chancellor, Angela Merkel, was overheard pressuring Mark Zuckerberg, asking him what he was doing to prevent criticism of her open door immigration policies. In the law we call this a slippery slope, right? It starts with stuff that people agree on, like maybe copyright law and terrorist content, and then it goes to information about police brutality, or truthful facts about people that we're interested in, or government policies.
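As an editorial aside, here is a minimal sketch of that delisting-scope question in Python. Everything in it is hypothetical: the query, the URLs, the country codes, and the idea of representing an order's scope as a set of countries are illustrative only, not how any real search engine implements the right to be forgotten.

```python
# Toy model of right-to-be-forgotten scoping. All data here is hypothetical.

# A search index mapping a name query to result URLs.
index = {
    "jane doe": ["jane-doe-homepage.example", "old-dui-story.example"],
}

# Each delisting order names a (query, url) pair plus the set of
# countries where the order applies.
delistings = [
    ("jane doe", "old-dui-story.example", {"FR"}),  # a France-only order
]

def search(query, searcher_country):
    """Return results, hiding URLs delisted for the searcher's country."""
    hidden = {
        url
        for (q, url, scope) in delistings
        if q == query and searcher_country in scope
    }
    return [url for url in index.get(query, []) if url not in hidden]

print(search("jane doe", "FR"))  # the delisted story is hidden in France
print(search("jane doe", "US"))  # both results still visible in the US
```

France's position amounts to widening that scope set to every country, which is exactly the worldwide-removal fight described above.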
Oh, and I wanted to say just one other thing about this. My point is that these decisions about freedom of expression are inherently discriminatory. They're inherently discriminatory because it's all about the government's point of view, or the majority's point of view, of what's legitimate speech and what's not. We don't see Google and Twitter and Facebook under huge amounts of pressure to erase their sites of every iteration of the Confederate flag. But we do see them under that pressure to erase their sites of everything about ISIS. It's just a political question.

Okay, surveillance. How are we doing on surveillance? I'm just going to say, so you guys know: not very well. Technology proliferates all kinds of information about us, online and increasingly offline, as we use health devices, as we have the Internet of Things, as we have sensors everywhere, in our TVs and on the streets. It's a golden age for surveillance. And when technology has done what it's done, and made information collection about us so cheap, so easy, and so ubiquitous, then the law has a role to play. This is where the law should step up its game, get involved, and say: we're going to provide people with protection from suspicionless surveillance.

But the law has actually done the opposite. It's been really pretty pathetic. Basically, the United States Department of Justice argues, in many cases successfully, that the law doesn't require a warrant for vast categories of data about you. Not for your opened emails, not for your Dropbox files, not for information about your physical location, not for information about who you call or who you email, not for your health data. None of that. The Department of Justice says: no warrant.

And I want to stress, for people who are not lawyers, the importance of the warrant requirement, because the warrant does a couple of things. One thing it does is require probable cause. That's really important because it takes the suspicionlessness out of it; it's not, well, we're just going to spy on you for no good reason. And it gets a judge involved, somebody from the government whose job is to do more than just catch criminals, somebody who's got another job, so we can balance. When you put these two things together, what it basically means is that the warrant requirement is the enemy of mass surveillance. Right? The warrant requirement is about going after only individual people, for good reasons.
But we the people are at odds with our Department of Justice on this, which basically wants to be able to go after this information if it's reasonable to get it, which is in the eye of the beholder. In other countries, the legal protections for this kind of data are even lower than they are in the United States. As a general rule, there are examples to the contrary, but as a general rule, people in other countries are doing even worse.

Now, we are going to be fighting this battle this year and next year. The law that underlies the PRISM program that Edward Snowden revealed is going to expire in December of 2017. It's called Section 702, and that law basically says that if the government is looking at a foreigner, or its intention is to spy on a foreigner, then it doesn't need to get a warrant to do it, even when that foreigner, or maybe especially when that foreigner, is talking with Americans. So it's bad for people who care about American privacy, and for foreigners it means that if your data is stored in the United States, it's bad for you: there's no need for a warrant. So as this law starts to expire, those of us in the civil liberties community are going to be fighting to reform it, and you will be hearing from me and others about basically trying to make the law step up and protect people's data that's stored in the United States.

This year, there has been another move to make data even easier for law enforcement to get. One big issue is the idea of borders and internet jurisdiction. It's really complicated: if you have the law of one jurisdiction, what happens when another country wants to do things a different way? Similarly, if the data is here in the United States, can other governments get it? That friction, that uncertainty, used to be kind of a protection for people around the world. But now there is a legislative proposal being considered which would basically say that so long as the United States government certifies, does sort of a handshake with another government saying that it meets certain requirements, that it's fair and has adequate safeguards in place and those sorts of things, then United States companies will be able to turn over their customers' data to that other government without needing to go through a US legal process and without needing a warrant. Again, it's another way to set up a system where people's data will get turned over to government investigators and intelligence agents without judicial review and without a warrant requirement.
Okay, so this is another legal battle that we're fighting this year. Ultimately, we think about the internet as this very chaotic thing, where all these beautiful individual drops of water come together to make this wonderful cloud that we can investigate. But increasingly the cloud is getting locked down, right? It's becoming something that is very surveilled, very controlled, and very knowable.

The best thing that's happened in surveillance all year is encryption. Encryption adoption has been huge. And it's been that successful because companies have been able to implement it pretty much unilaterally. They've had to struggle against governments making arguments that they shouldn't encrypt, but they have been able to roll out encryption, so Gmail to Yahoo Mail is now encrypted, and WhatsApp is now encrypted. We're doing a lot better if you compare us to three or four years ago; a lot more of our stuff is encrypted. And encryption is good because even when it is not end to end, it frees us from mass surveillance. It means you have to go to somebody who holds the data and at least have legal process, right? So even encryption that's not end to end is really valuable for defeating suspicionless surveillance. (A sketch of the end-to-end distinction follows at the end of this passage.)

But there's all this pressure against that. The pressure comes from our government, which wants to pass rules or get voluntary cooperation with systems that ensure wiretap-ability. And these pressures come from other governments too, like, for example, Brazil. Brazil has jailed Facebook executives because WhatsApp is encrypted and they are unable to turn over information, and now Brazil is fining Facebook for that. So what are we going to do about that?

And then ultimately, there have been reports, one by Rapid7 and some others out there, that while we're doing better with encryption, we're doing far less than we actually need to do. There have always been unpoliceable spaces, right? There have always been things that the government didn't know, whether it's our thoughts, our behavior in the bedroom, what we do and say when we are in church, our ephemeral movements, how we move through space and time during the day. And yes, there are risks from having these things be private. But there are vast rewards from the fact that we take these risks. The fact that people can break the law is necessary for the evolution of our society. It is part of the natural growth of things, even when it includes crimes.
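Here is the promised sketch of the end-to-end distinction, as a hedged illustration: it assumes the PyNaCl library (pip install pynacl), and the key names and message are purely illustrative, not any product's actual protocol. With transport-only encryption (say, TLS between Gmail and Yahoo Mail), each provider still holds the plaintext, so legal process can reach it; end to end, only the endpoints can read the message.

```python
# Minimal end-to-end sketch using PyNaCl. Names and message are illustrative.
from nacl.public import PrivateKey, Box

# Each endpoint generates its own keypair; the private halves never
# leave the devices, so the relaying provider cannot decrypt anything.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts directly to Bob's public key.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"hi bob")

# The provider in the middle stores and forwards only this ciphertext.
# Bob alone can decrypt it, with his private key and Alice's public key.
plaintext = Box(bob_key, alice_key.public_key).decrypt(ciphertext)
assert plaintext == b"hi bob"
```

In the non-end-to-end case the same exchange works, but the provider performs the decryption itself and therefore holds plaintext it can be compelled to produce, which is the legal-process point made above.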
And over time, we've changed our minds about some things that were crimes, because people were able to do them, and eventually we saw that it was good. Homosexuality, marijuana, sedition, and more. So this idea that there should never be anything that you can hide, that we should live in this closed-down cartoon cloud world, is to me really quite terrifying. Because if governments are effective at it, it is basically a hindrance to our social and political development as different communities.

So finally, I want to talk about the freedom to tinker. Many of you understand why this is important. It sounds like a hobby, but it's really not. It's a phrase meant to capture our ability to study, modify, and ultimately understand the world around us. And interference with the freedom to tinker is part of this centralization movement, right? The Digital Millennium Copyright Act, Section 1201, is a great example. It basically gives legal protection to digital rights management software, so that people can't tamper with it, and in doing so it also controls the way people use the underlying copyrighted work.

But of course, the big enemy of the freedom to tinker, in my mind, is the CFAA, the Computer Fraud and Abuse Act, my personal least favorite statute. And this year we had a ruling from a court in a case called Facebook v. Vachani. It had been in the lower courts as Facebook v. Power, and it just came out in July. It's a case brought by Facebook against Power, and what Power was doesn't exist anymore. It was a social network aggregator. It allowed people to take all of their social networks, Facebook and, you may remember some of these, Myspace, your LinkedIn, your, what was the music one? Anyway, it could take all your social networks, some of which don't even exist anymore, and maybe this is why, and aggregate them into a single place so you could see them all at one time. And Facebook didn't like that, right? They didn't want that to happen. Initially, the company Power said, well, we can do this even if Facebook doesn't want us to, even if it's against Facebook's terms of service, because the Facebook users want us to do this; they want us to pull their information out and show it to them this way. So Facebook wrote a cease and desist letter to Power and said, you've got to stop doing this. And the court case was over whether Power had violated the CFAA, a criminal law, in continuing to provide its customers with this social network aggregator.
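As a hedged illustration of the mechanics at issue, here is a minimal sketch of what a social network aggregator does; everything here is hypothetical, not Power's actual code or any network's real API. The user hands the aggregator their own access tokens, and the service pulls each authorized feed and merges them into one timeline.

```python
# Hypothetical aggregator sketch; fetchers stand in for real network APIs.
from dataclasses import dataclass

@dataclass
class Post:
    network: str
    text: str

# Stand-in fetchers; a real aggregator would call each network's API
# (or scrape it) using credentials the user supplied.
def fetch_facebook(token):
    return [Post("facebook", "example post")]

def fetch_myspace(token):
    return [Post("myspace", "example post")]

FEEDS = {"facebook": fetch_facebook, "myspace": fetch_myspace}

def aggregate(user_tokens):
    """Merge every network the user has authorized into one timeline."""
    timeline = []
    for network, token in user_tokens.items():
        timeline.extend(FEEDS[network](token))
    return timeline

print(aggregate({"facebook": "user-supplied-token"}))
```

The legal fight was not over these mechanics but over authorization: whether continuing to do this after Facebook's cease and desist letter violated the CFAA.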
Now, those of us who hate the CFAA celebrated, years ago, when the court said a mere violation of the terms of service is not a crime, not a violation of the law. And we were like, yay, that's great! Because terms of service say all sorts of crazy things. That's the law in the Ninth Circuit, which covers California and Nevada, but not necessarily the law in other parts of the country. But we were like, this is leadership, we're going to show how it is, we're going to lead the way. Well, now the Ninth Circuit has said: okay, a terms of service violation may not be a crime, but if you get a cease and desist letter, and in the letter the company says stop doing this, and then you keep doing it, well, then that is a crime. Then you don't have authorization, and that is a crime. How can we let companies that write letters tell us what is and isn't a crime? Why are we letting companies that run computers say what people should and shouldn't be allowed to do with their own data? Yet that is the ruling from the Ninth Circuit, which I previously praised. This is a terrible decision for the freedom to tinker. It pushes us into living in a permission-based world, where if we don't have permission, we can't act. And if we act without permission, we can be sued, or we can be incarcerated.

So, what's at stake? Over the next 20 years, my fear is that if things keep going this way, things will happen and people really won't know why. Companies will make decisions; you'll see something or you won't; there'll be an algorithm that does something with your data, or shows you an ad, or otherwise tries to sell you something, or maybe you'll get a loan, or maybe you won't, and you won't be able to investigate the software that helped make those decisions. Increasingly, we'll see less controversial content on the internet, as companies either exercise their corporate responsibility or are pressured by governments around the world to take things off. There will be security 'haves,' the governments that are allowed to break the Computer Fraud and Abuse Act and will continue to go ahead and do so, and there will be security 'have-nots,' the people whose ability to tinker, whose ability to explore, is controlled, and who will just have to go along. We're headed towards a world that's less like the utopian dream I described, and more like a world where surveillance, censorship, and centralized control by companies and governments are the norm. That's my fear.
Last year, in my Black Hat talk, I proposed a number of things that I thought would start to help us avoid going too far into the horrible cartoon cloud world and bring us back closer to the dream of internet freedom, and I think we're starting to do some of these pretty well. But we don't necessarily have the support of everybody who uses the internet, because the reason it's headed this way isn't that people are stupid. The reason it's headed this way is that people have other values at stake. There are other things that people care about. It's not just going to be, oh, people are prejudiced against minorities or whatever. It's because of fear. Fear will start to drive our decision making. We're afraid of terrorists, we're afraid of pedophiles, we're afraid of drug dealers, we're afraid of crime. We don't like hate speech. We don't like malware, we don't like spam. So people will start to embrace this centralized control. They'll go for the walled garden. They'll cheer for pressure on social networks.

So, what do we need to do to fix it? How do we make the dream of internet freedom possibly become more real? This year I went to a decentralization camp at the Internet Archive, which is operated by Brewster Kahle, also a really wonderful man. The decentralization camp was Brewster's effort to jumpstart the conversation that I want to have here, about what can be done to try to retain, capture, enshrine the architecture of the internet that can lead to ensuring these things I described as part of the dream. And Brewster's camp had a lot of builders there, people who are currently in the process of building these decentralized technologies. That was really awesome to see, and I learned a lot and met a lot of really cool people who want technology to be this force for political freedom, this force for individual freedom, this force for free speech and free expression. But ultimately I think, and maybe you guys are going to convince me otherwise, that technology alone isn't really the answer, because I don't think the problem is that we don't have the tools for a decentralized network. I think the problem is that people maybe don't understand or value enough what a decentralized network can bring us, what openness can bring us, what this kind of technological freedom can bring us, and they want the bad things to be taken care of.
And so, from my perspective, I feel like not just technology but norms, the values that people have, need attention. We need to talk about it, and we need to explain to people and help them understand that when we get rid of this, we're also getting rid of that; that there are good things that go together with it. There's a saying we have in the law, that there are four regulators of human behavior: technology, markets, law, and norms. And I think that if we want to maintain what we have today of the dream of internet freedom, if there are things we want to get better, or even if we just want to keep the status quo, then we need to start talking about bringing those levers to bear on this problem, and start building the system for this revolutionary technology to either stay, or for us to think about what liberating technology is going to be the thing that replaces the internet we will have in 20 years.

So thank you for your attention. I promised I would leave time for questions and comments, and I think we actually have about ten minutes, so I did what I wanted to do. I'd like to invite people who have thoughts or questions to come to the microphone; you need to come to the microphone so that people can hear it. And I do want to let you know: even though I got to give a speech for a really long time, you don't. So please keep your comment pretty short, so that other people who want to say something have an opportunity. Okay, I'm going to go with, oh, thank you. [Applause] I'm going to start with this brave gentleman in the front here, in the blue shirt.

>> So, in the spirit of keeping it short, I've got two questions. One: your prediction on encryption becoming illegal at some point in the US. And two: is there any role for the dark web to save us?

>> Okay, great questions. I know I'm going to be quoted on these things, but I'm still going to tell you what I think. I don't think encryption is going to become illegal in the United States. I don't think that's going to happen. What I like to say about that is, you have law enforcement interests, not even intelligence, because the intelligence agencies, the NSA, have developed the skills and the techniques that they need to circumvent encryption. We've seen it in the Snowden slides. They have these great hackers with lots of great tools. They can get around encryption in lots of different areas. But the people who really want to circumvent encryption are law enforcement: the FBI, and ultimately local cops.
And usually I say that law enforcement wins. But law enforcement is up against all the money in the world. And when you put law enforcement, usually a pretty successful political player, up against Apple, Facebook, Twitter, Google, business interests that don't want to be hacked, et cetera, I think they are not going to win. But these American companies that are really dominant here are not so dominant in other countries, and just look at what's happening to WhatsApp in Brazil. The guy who's something like the vice president of sales spent a couple of days in prison, and now they're faced with being fined. Maybe a fine isn't as serious a penalty, but governments can bring very heavy fines to bear. And ultimately the question is: what are you going to do? Are you, as a company, going to abandon that country and just not operate there anymore, or are you going to try to make it work? And we've seen both. We've seen Google get out of China, and we've seen BlackBerry, RIM, make special devices for particular markets. So that's where I think the risk is really going to come from.

What do I think about the dark web? I think it's possible that the dark web is the future web; we just don't know. Every technology gets its start in crime. Pagers used to be a sign you were a drug dealer or a prostitute, and I could go on: the early adopters of technologies are usually criminals. But then, god bless them, they develop the technology, and then it finds legitimate uses, and then people around the world fall in love with it. Not pagers anymore, obviously, but for a while those were useful. Yes?

>> As an old internet fart, I'm going to play a little devil's advocate here, and hopefully you can provide a little bit of insight; you just did with your last comment. I always run my own servers, I run over other people's pipes, X.25, Bisync, all that stuff, I don't really care. I don't have any Facebook, I don't have any of this stuff, so why would I care about this? And I've worked on both sides, by the way. I've worked on the government side and the outside, and they both have their points. So...

>> Yeah, I think that, other than altruism, [laughter] the reason for people who are very technologically astute to care is that you want to talk to other people, not just yourselves. You want to be able to access information from people around the world who don't have that same technological expertise. If it's really going to be a global marketplace for ideas, there can't be an expertise price to pay to get in the door.
>> Can you help clarify what seems to be an apparent contradiction between freedom of information, freedom of access, those kinds of things, and, at the same time, your push for end-to-end encryption? And how do you compare the needs of freedom of information with individual property rights and intellectual property rights, protection of security information, those kinds of things? What's your view on how those fit together?

>> Yeah, that's a fantastic question. I'm just going to come clean and admit that there is a conflict there. If you talk about the free flow of information, then what's privacy, right? If you want to talk about incentives to create, which are part of intellectual property, then how does that impact the free flow of information? And I guess what I would say is that there are specific policy issues we could discuss, but I think that we don't value the internet-freedom utopia view enough as we have those debates. We have very strong intellectual property protection, like in the DMCA, and we have a lot of things that are prohibitive there; I think we go too far. On the surveillance versus freedom of information question, I think that's a really hard issue; you can see it in the right to be forgotten. I'm an American, so for me it's a little bit of an easier question: I feel like truthful information should not be suppressed, period. But that does mean that the privacy of some people, or their ability to keep information about themselves secret, is going to be a little bit less. To me, I value the freedom part more. But when we do policy, a lot of the time it's a negotiation, right? You're not really picking one or the other. You have these options where you're trying to optimize the two things, and you have to balance. But in order to do it correctly, you have to value both sides, and what I'm saying is that I think there is a side here that we are losing track of, that we are not valuing enough.

Now, I wanted to say something else about that, related to encryption, which is that I think one of the policy problems there is that we have policy people, lawyers, who are used to, on the one hand, on the other hand, trying to find a way to make it work for everybody.
Nobody is going to be entirely happy. And then we have: it's either encrypted or it's hackable. And I think the problem there is that policy makers have a very hard time understanding how difficult security is, and that efforts to undermine security, given our current state of knowledge, mean it's going to be insecure. So we're taking a nuanced way of looking at the world and trying to apply it to something that is a much more binary, on-or-off kind of thing. That's what I think: we don't have that nuance there. Thank you.

>> Hi. Pertaining to your comment about the Facebook suit, in regard to their cease and desist: how does that square with the First Amendment? And if I say "Fuck Facebook," does that mean I'm going to get arrested when I walk outside? Because fuck Facebook! [Laughter]

>> Nobody arrest that guy. You know, Facebook benefits politically, or economically, from being able to control the way that people use Facebook data. And Facebook was like, 'well, if we have aggregators...' I don't know what they thought, but certainly an aggregator allows you to use different social networks more, whereas if you have to pick one, then somebody is going to end up being the victor in the marketplace, and that's what ended up happening. We have this ongoing debate about what the CFAA can prohibit if you let it be up to a company that either puts it into terms of service or writes a letter, and this irrationality is not something courts are really understanding very well right now. They are considering it in a narrow case, but they're not thinking large enough about what the risks to free expression are. Yes, next please.

>> Can you remind us what your speech was called?

>> 'Slouching Towards Internet Utopia,' I believe.

>> Okay. And I recognize and respect that you're an intelligent woman, so I would like to believe that you understand that the principle of utopia is completely impossible.

>> Are you asking me whether I think utopia is possible?

>> No, what I'm trying to ask is: how can you say that there should be no regulations whatsoever, when the past has shown that regulations are necessary to form the utopia, and that without them it leads to anarchy and chaos?

>> Okay, I get it, thank you. So there are two views, I think, of what I'm saying. One view might be that we should have less regulation. I don't think no regulation is possible, but one view is maybe we should have less regulation. Another view might be that we should have more regulation.
That we regulate what people can do, we regulate what companies do, we regulate what the government can do, in order to further these values. The reason I call it the dream of internet freedom and say that it's utopian is that I don't think we're going to live like that. But it's a vision, a set of values, that I think drew me and a lot of people to the internet, and that I think we're slowly losing sight of. And if we want to preserve some of those aspects, if we want some of that to be true, then we need to start thinking about what to do. Now, whether we do it through regulation of government, or through less regulation of speech, these are going to be case-by-case things. [Applause] Thank you. I think I have one more minute, so I think this is going to have to be the last one.

>> Yeah, this is going to be quick. Basically this is about the end-to-end encryption we were talking about, and how end-to-end encryption can avoid being misused by organizations like ISIS and others. How do you separate organizations like ISIS from the good community? How do you differentiate them? How do you make sure those communities do not abuse the others?

>> So, my view of this, and I have an opinion about this, is that I don't think that more information about ISIS will make more people join ISIS. I think it's just as likely, if not more likely, that more information about ISIS is going to make people feel resolved against it. If people hear their friends or relatives talking about how ISIS is attractive in some way, it will make them resist it, or make them say something to their friends and fight against it. So I don't think that regulating the information about these social or political movements, about abusive governments, about terrorists, about these issues, I don't think controlling that information actually helps make the world a safer, better place. I think that more information, overall, makes the world a safer, better place. [Applause]

>> So, I'm looking for my boss, who is supposed to tell me if I'm not allowed to take any more questions, and until then I'll keep going. Okay.

>> So, I've always seen the US Fourth Amendment as making a space where people can explore the world, and as long as what they do does not reach the level of probable cause, they'd be left alone. It would be a so-called lawless space, but it would still be a constitutional space.
And I always thought that it's not going to generate anarchy and chaos, because if what you're doing, while you're left to your own devices, does not reach the point where you're harming someone else, there's no real probable cause against you, and you'd have the freedom to do that kind of stuff. Separately, real quick: you say that people want companies to essentially take care of everything for them, that they just want the bad things to go away. From my perspective, there doesn't seem to be much mystery about why there's so much centralization, or why the decentralized parts of the internet are shrinking. There's the loss of the ephemerality of communications online, where information is stored indefinitely and you can't just have casual conversations; a lot more consideration has to be given to everything you say online now, and the thought of exploring some random topic, especially at the extremes, now carries a lot more concern, because of that loss of ephemerality. That's compounded by fear of surveillance, where some random chatter online is more likely to be seen if you're talking about something, let me be extreme, against your current culture. And there's loss of trust: is Google a capable steward of your privacy? Not necessarily because of any purposeful action on their part, but because they still have to abide by a lot of legal processes, and a lot of people are aware that they have to abide by a lot of legal processes, so that, unfortunately, your search queries get seen.

>> So let me ask you a very quick question, okay? What do you think is the answer to that? Is technology the answer? Is law the answer? Do people have to change the way they think about the internet? All of those?

>> I would say: make it safer for the people who typically explore the fringes of society to do so. So yeah, keep fighting against surveillance, keep pushing for more ephemeral communications. Make it safe to keep exploring.

>> Thank you. And that's my time. [Applause] Thanks to all of you, I really appreciate you coming, and your attention. Thank you for having me.