So this is Slouching Towards Utopia: The State of the Internet Dream. Please welcome Jennifer S. Granick.

Thank you. Thank you. Really? New College? Awesome. My alma mater. I should talk a little bit about New College in this talk. I want to thank everybody for coming. I think I got a really awesome room to speak in this year, with these totally cool computers behind me. And I want to also thank my parents, who are here. They're over here. They've gotten to come see me talk a number of times here at DEF CON. My dad's hacker name is The Eagle, so you guys can look him up.

And I want to talk today about something that I think a lot of us here at DEF CON hold really dear. I spoke about this last year at Black Hat, when I gave the keynote. Can I just see, just a show of hands, how many people saw my Black Hat keynote? Okay, so some but not all. So the beginning, I apologize, might be a little bit repetitive for you guys, but I think it's important to set the stage. Because in that speech, I talked about something that I called the dream of Internet freedom. And it's something that I think means a lot to many of us. When I said that the dream of Internet freedom was endangered and that we needed to be able to do something about it, I meant that we as a community needed to start taking that seriously if we wanted to ensure that we could have some of those things that maybe animated us politically, animated us technologically, in the early days of the Internet.

So today what I want to do is revisit that idea of the dream of Internet freedom, take a look at what's happened over the past year, and see whether we're getting closer to the dream or getting further away. And my goal is to end my talk early, before all the time is up, so that we can start to have a conversation and talk together about what it is we might do if we want to take the dream seriously and try to see some of these things come to fruition.

So before we get there, though, I want to take a little time, for the people who didn't see the speech, to talk about what this dream of Internet freedom is. And I'm going to talk about it from my own experience, and how I came to believe in this utopian vision. For me it started when I read Steven Levy's book, Hackers: Heroes of the Computer Revolution. And in this book I learned that information wants to be free. I learned that computers would help us mistrust authority, be more individualistic, make our own decisions about what was right and what was wrong. I learned that computers could connect people so that we could talk to each other and share information. Then I learned that individual rights were something that could be built into the technological world around us. That decentralization was not just a principle of computing, but was also a principle of political freedom as well.

And so this came to be what I started to understand as the dream of Internet freedom. The dream of a free, open, interoperable, reliable Internet where people can speak their minds and anyone who wants to hear it can listen. I also learned, not that long after, that hackers were a thing, and that hackers were really intimately tied to this dream. And, you know, Steven Levy tells this story about old-school hackers from back in the 1950s at MIT.
But I learned that there were people now, today, who were trying to make this dream true. Because I discovered the Hacker Manifesto, written and published in 1986 in Phrack Magazine, which I guess means it's an anniversary this year. And there I learned that, you know, hackers were people who wanted free access to information and were willing to take the time to build the tools to make it so. That hackers wanted a world where curiosity was its own reward. And that people could explore the world around them and find their own truths, not just accept the conventional wisdom. So as The Mentor explained it, with this background, the future could be a place where people would just meet mind to mind and exist without skin color, without nationality, without religious bias. And I wanted this to be true.

But I learned, when I started to read The Hacker Crackdown by Bruce Sterling, that it was in danger. That there were law enforcement agents who were using a law called the Computer Fraud and Abuse Act, my personal enemy, to go after people who were trying to explore this network, and interpreting this law in these huge ways that basically criminalized curiosity. Around about that time I was going to law school, and I was like, this is going to be the thing I, you know, really try to fight against.

And around about the same time, we learned about the horrible global scourge of online pornography. And some people here are pro-porn, I understand. I think, you know, I was pro-porn too. The idea was that the internet, which should be this nice place for people to hang out, was polluted with all of this terrible stuff, and that we needed to get rid of it. That was a very common point of view: there shouldn't be anything dirty on the internet. And another, alternative view was that, well, if there's going to be dirty stuff, for those few people who are interested in it, it should be zoned. Like in a city, how there's a red light district that's kind of slummy and run down, and you've got to go there, but everything else should be policed and be clean.

And this idea was terrible for those of us who wanted the internet to be a place that would be a free exchange of ideas. It was terrible because, you know, our dream was that the internet would be like a library. But better than a library, where every book that had ever been written was on the shelf and available. And here these people were coming in and saying, prudishly, no, it's not going to be like a library. It's going to be like TV or radio. And we didn't want that. So this galvanized sort of a generation of activists who wanted to fight against this vision of the internet as TV or radio, in favor of the vision of the internet as free, freer even than a library.

So into this mix comes John Perry Barlow, lyricist for the Grateful Dead, founder of the Electronic Frontier Foundation, lovely man, very poetic writer. And he drafted A Declaration of the Independence of Cyberspace. And in the Declaration of the Independence of Cyberspace, he set forth this vision. He said in it, you know: Governments of the Industrial World, you weary giants of flesh and steel, I come from Cyberspace, the new home of Mind. On behalf of the future, I ask you of the past to leave us alone. You are not welcome among us. You have no sovereignty where we gather.
Now, this vision wasn't just a reaction against the whole cyber porn hysteria with Marty Rimm and the passage by Congress of laws that would make it a crime to put porn on the internet. It was reacting to those legislative proposals, but it was also a reaction to government and business as usual, and a real expression of this dream of internet freedom: that the people together would govern themselves and would be able to take care of and provide for each other this intellectual freedom.

So this galvanized me, motivated me in my legal career. And I think that it did the same for many if not most of you. That this was the thing that got you excited about computers in the first place and made you want to be a part of it.

So, the dream of internet freedom. That we would be able to overcome age, race, class, and gender. That we would be able to communicate with anyone, anywhere. That we would have free access to information wherever in the world it was generated. The hands-on imperative: that we wouldn't have to take the way things work for granted, that we would be able to investigate and explore it for ourselves and find our own reality. And ultimately, the idea that computers would liberate us.

So, how are we doing? Well, I think that this utopian vision is less and less true every day. And it's less and less true because of some dynamics that maybe we couldn't have predicted at the time, right? Today, technology is generating more information about us than ever before, increasingly making a map of what we do for governments and for companies to learn about us and to attempt to manipulate us or to otherwise regulate us. It is a golden age for surveillance.

Today, there's a real appetite for censorship. Racism and sexism have proven more than resilient enough to thrive online. And people who use the internet don't want to hear this garbage. They want someone to take care of it for them. And so there's an appetite for censorship, not just for the internet in the United States, the home of the First Amendment, but in countries around the world that have an even less robust free speech tradition than we do.

And there's a real appetite for corporate control. A lot of that has to do with spam, malware, bad user interfaces. People want their technology to work. They are attracted to a safe, closed ecosystem where spam and malware are taken care of, where software updates happen automatically. There's just not this appetite for individualized control of technology. There's no market for it.

So these are the things that are happening today: surveillance, censorship, and control. And they're enabled by three trends, three things that are happening that make this possible. One is centralization. Centralization is the idea that for each of our services there's now one dominant provider: Google for search, Gmail for email, Facebook for social networking. It's enabled by regulation: governments are getting involved here and are regulating for these things that people want. And ultimately, globalization: internet companies are global from the get-go, and so it's not just the United States that's involved, or the United States and Europe, it's countries from around the world who are getting into the regulation game.

So that's my sad state of affairs. Now I want to talk about what's happened over the last year or so and see if we can learn from recent experience about these things.
So, let's take freedom of expression, all right. We have moved very far from the idea that people get to see whatever information they want to see. It's actually particularly scary because sometimes, maybe most of the time, we don't even know what information is being removed from our purview.

So, here's an example. Facebook has been sued multiple times by families whose loved ones were killed by ISIS. And these families have said that it's at least partially Facebook's responsibility, because Facebook has videos from jihadists online that are trying to foment allegiance to the group and foment jihadist activities, okay. So Facebook is getting all this pressure to take things down. And it's not just from a few civil litigants either. United States government officials go to these social networks and they say, we know we can't make you do this under the First Amendment, but surely you can exercise some corporate responsibility and take down these beheading videos and stuff. Surely you don't need to carry this kind of terrible, inflammatory crap on your network. And companies start to take this stuff down.

But then we have something like the Philando Castile video. A video of an African American man who was shot and killed by police officers for no good reason. And initially, Facebook took the Castile video down. And then there was this outcry. Why are you taking the video down? You're censoring this evidence of police brutality. People need to see how this man died. These are the same thing. They're not different. They're the same thing. But we're putting this pressure on the intermediary, on a private company, to make these decisions for us about what we're going to get to see and what we're not, all right.

And it's not just Facebook, you know. On Twitter, some guy basically harassed an African American actress, Leslie Jones, the woman who appeared in Ghostbusters, to the point where she was like, screw this, I'm not going to be on Twitter anymore. It's too filled with hate speech. I don't want to put up with this. And so Twitter blocked the guy. And he claimed, okay, it's a big violation of my free speech. He can go on the web and he can still say whatever he wants. People who want to find it, maybe it will be a little bit harder. I don't know.

But the web only really works as open if you can find the stuff you want when you search for it. And that means Google. Google is the dominant search engine basically around the world. Some countries have their own, but in almost every country of the world, Google is the dominant search engine. Well, what does Google carry? They have a huge responsibility, because that's the venue, that's the path through which most people are going to find the information that they want. So Google is under an immense amount of pressure to change its search results in order to accommodate certain political demands. Google is supposed to demote torrent sites because torrents are copyright infringing. Google is now subject to orders in Europe about the right to be forgotten.

Now, the right to be forgotten is the idea that even true information about you, at some point, becomes sort of outdated or outmoded, and if you don't want people to be able to find that thing about you and have it be the main thing they know, you can go and be delisted from the search engine, so that if they search for you, they don't see your DUI or your terrible high school debate performance or whatever it is that you don't want people to see.
So it's this privacy right that was developed in the European Union. But the result of it is that when people search for you in Google, they're not getting truthful information, right? And, you know, France and Germany were the countries that have been really advocating this, saying, well, it's not really even a conflict between privacy and free speech, because you don't have a free speech right to have this private information about people. A very different vision than we have here in the United States, right?

So ultimately the question is, Google's got to comply with French law. They have to. They operate there. They have to. What does it mean, though, to comply with French law? Does it mean that if I'm a French citizen and I have information about me de-indexed, then nobody in France can get it, or nobody in Europe can get it, or nobody in the entire world can get it? And what France is doing now is pushing its case against Google that Google is legally obligated to remove these results not just for France, not just for Europe, which has the right to be forgotten, although it's different in different countries, but for everybody, even here in the United States, where we would have a right to access that information. Sorry, technical difficulty. Why should France be able to tell us what we can and can't see? But because these are global companies being regulated by global governments, that's what's happening. Okay?

In another example, people in Europe are suing Twitter and YouTube and Facebook for a deficient response, in their opinion, to hate speech. But hate speech is legal in the United States. There was a news story this year where the German Chancellor, Angela Merkel, was overheard pressuring Mark Zuckerberg, asking him what he was doing to prevent posts that were very critical of her open-door immigration policies. You know, in the law we call this a slippery slope, right? It starts with stuff that people agree on, like maybe copyright law and terrorist content, and then it goes to, you know, information about police brutality, or truthful facts about people that we're interested in, or government policies.

Oh, and I want to say just one other thing about this. My point is that these decisions about freedom of expression are inherently discriminatory. They're inherently discriminatory because it's all about the government's point of view, or the majority's point of view, of what's legitimate speech and what's not. We don't see Google and Twitter and Facebook under huge amounts of pressure to purge their sites of every iteration of the Confederate flag. But we do see them under that pressure to purge their sites of everything about ISIS. It's just a political question.

Okay, surveillance. How are we doing on surveillance? I'm just going to say, so you guys know: not very well, okay? Technology proliferates all kinds of information about us, online and increasingly offline, as we use health devices, as we have the internet of things, as we have sensors everywhere, in our TVs and on the streets. It's a golden age for surveillance. And when technology has done what it's done and made information collection about us so cheap, so easy, and so ubiquitous, then the law has a role to play. This is where the law should step up its game and get involved and say, we're going to provide people with protection from suspicionless surveillance, okay? But the law has actually done the opposite.
It's been really pretty pathetic. Basically, the United States Department of Justice argues, and in many cases successfully, that the law doesn't require a warrant for vast categories of data about you. Not for your opened emails, not for your Dropbox files, not for information about your physical location, not for information about who you call or who you email, not for your health data. None of that. The Department of Justice says no warrant.

And I want to stress, for people who are not lawyers, the importance of the warrant requirement, because the warrant does a couple of things. One thing it does is it requires probable cause, and that's really important because that takes the suspicionlessness out of it. It's not like, well, we're just going to spy on you for no good reason. And it gets a judge involved, somebody from the government whose job it is to do more than just catch criminals, somebody who's got another job, so that we can balance. And when you put these two things together, basically what it means is the warrant requirement is the enemy of mass surveillance, right? The warrant requirement is about going after only individual people for good reasons. But we, the people, are at odds with our Department of Justice on this, which basically wants to be able to go after this information, you know, if it's reasonable to get it, which is in the eye of the beholder.

In other countries, the legal protections for this kind of data are even lower than they are in the United States. As a general rule, there are examples to the contrary, but as a general rule, people in other countries are doing even worse.

Now, we're going to be fighting this battle this year and next year. The law that underlies the PRISM program that Edward Snowden revealed is going to expire in December of next year, December of 2017. It's called Section 702. And that law basically says that if the government is looking at a foreigner, if their intention is to spy on a foreigner, then they don't need to get a warrant to do it, even when that foreigner, or maybe especially when that foreigner, is talking with Americans. So for people who care about American privacy the most, it's bad for American privacy. For foreigners, it means if your data is stored in the United States, it's bad for you. There's no need for, you know, a warrant. So as this law starts to expire, those of us in the civil liberties community are going to be fighting to reform it. And so you guys will be hearing from me and others about basically trying to make the law step up and protect people's data that's stored in the United States.

This year, there's been another move to make data even easier for law enforcement to get. One big issue is the idea of borders and internet jurisdiction. It's really complicated. If you have the law of one jurisdiction, what happens if another country wants to do things a different way? Similarly, if the data's here in the United States, can other governments get it? That friction, that uncertainty, used to be kind of a protection for people from around the world.
But now there's a legislative proposal being considered which would basically say that so long as the United States government certifies, does sort of a handshake with another government to say that they meet certain requirements, that they're fair and that they have adequate safeguards in place and those sorts of things, then United States companies will be able to turn over their customers' data to these other governments without needing to go through U.S. legal process and without needing a warrant. Again, it's another way to set up a system where people's data will get turned over to government investigators and intelligence agents without judicial review and without a warrant requirement. So this is another legal battle that we're fighting this year.

So ultimately, you know, we think about the internet as this very chaotic thing, all these beautiful individual drops of water coming together to make this wonderful cloud that we can investigate. But increasingly the cloud is getting locked down, right? And it's becoming something that is very, very surveilled, very controlled, and very knowable.

The best thing that's happened in surveillance all year is encryption. Encryption adoption has been huge. And it's been that successful because companies have been able to implement it pretty much unilaterally. They've had to struggle against governments making arguments that they shouldn't encrypt, but they have been able to roll out encryption. So Gmail to Yahoo Mail is now encrypted. WhatsApp is now encrypted. You know, we're doing a lot better. If you compare us to three years ago or four years ago, a lot more of our stuff is encrypted. And encryption is good because even when it's not end-to-end, encryption frees us from mass surveillance. It means you have to go to somebody who holds the data and at least have legal process, right? So even encryption that's not end-to-end is really valuable for defeating suspicionless surveillance.

But, you know, there's all this pressure against that. And the pressure against that comes from our government, which wants to try to pass rules or get voluntary cooperation with systems that ensure wiretappability. And this pressure comes from other governments like, for example, Brazil. Brazil has jailed Facebook executives because WhatsApp is encrypted and they're unable to turn over information. And now Brazil is fining Facebook for that. So, you know, what are we going to do about that? And then ultimately, there have been reports, one by Rapid7 and some others out there, that while we're doing better with encryption, we're doing far less than we actually would need to do.

There have always been unpoliceable spaces, right? There have always been things that the government didn't know, whether it's our thoughts, our behavior in the bedroom, what we do and say when we're in church, our ephemeral movements as we move through space and time during the day. And there are, yes, risks from having these things be private. But there are vast rewards from the fact that we take these risks. The fact that people can break the law is necessary for the evolution of our society. It is part of the natural growth of things, even when it includes crimes. Because over time we've changed our minds about some things that are crimes, because people were able to do them and eventually we saw that they were good.
Homosexuality, marijuana, sedition, and more. So this idea that there should never be anything that you can hide, that we should live in this closed-down cartoon cloud world, to me is really quite terrifying. Because if governments are effective at it, it basically is a hindrance to our social and political development as different communities.

So finally I want to talk about the freedom to tinker. Many of you understand why this is important. It sounds like a hobby, but it's really not. It's a phrase that's meant to capture our ability to study, modify, and ultimately to understand the world around us. And interference with the freedom to tinker is part of this centralization movement, right? The Digital Millennium Copyright Act, Section 1201, is a great example. It basically gives legal protection to digital rights management software so that people can't tamper with it. And so it also controls the way people use the underlying copyrighted work.

But of course the big enemy, in my mind, of the freedom to tinker is the CFAA, the Computer Fraud and Abuse Act, my personal least favorite statute. And this year we had a ruling from a court in a case called Facebook versus Vachani. It had been in the lower courts as Facebook versus Power, and it just came out in July as a case brought by Facebook against Power. And Power, which doesn't exist anymore, was a social network aggregator. It allowed people to take all their social networks, Facebook and, you guys may remember some of these, MySpace, you know, your LinkedIn, your, what was the music one, anyway, you could take all your social networks, some of which don't even exist anymore, and maybe this is why, and aggregate them into a single place so you could see them all at one time.

And Facebook didn't like that. Right? They didn't want that to happen. Initially the company, Power, said, well, we can do this even if Facebook doesn't want us to, even if it's against Facebook's terms of service, because the Facebook users want us to do this. They want us to pull their information out and show it to them this way. So Facebook wrote a cease and desist letter to Power and said, you've got to stop doing this. And the court case was over whether Power had violated the CFAA, which is a criminal law, in continuing to provide its customers with this social network aggregator.

Now, those of us who hate the CFAA celebrated years ago when the court said a mere violation of the terms of service is not a crime, not a violation of the law. And we were like, yay, that's great, because, you know, terms of service say all sorts of crazy things. That's the law in the Ninth Circuit, where California and Nevada are, but not necessarily the law in other parts of the country. So we were like, this is leadership, we're going to show how it is, we'll lead the way. Well, now the Ninth Circuit has said, okay, a terms of service violation may not be a crime, but if you get a cease and desist letter, and in the letter the company says stop doing this, and then you keep doing it, well, then that is a crime. Then you don't have authorization, and that is a crime.

How can we let companies that write letters tell us what is and isn't a crime? Why are we letting companies that run computers say what people should and shouldn't be allowed to do with their own data? Yet that is the ruling from the Ninth Circuit, which I previously praised. This is a terrible decision for the freedom to tinker.
It pushes us into living in a permission-based world, where if we don't have permission, then we can't act. And if we act without permission, we can be sued or we can be incarcerated.

So what's at stake? Over the next 20 years, my fear is that if things keep going this way, things will happen and people really won't know why. Companies will make decisions. You'll see something or you won't. There'll be an algorithm that does something with your data, or shows you an ad, or otherwise tries to sell you something. Maybe you'll get a loan, maybe you won't get a loan, and you won't be allowed to investigate the software that helped make those decisions. Increasingly we'll see less and less controversial content on the internet, as companies either exercise their corporate responsibility or are pressured by governments around the world to take things off. There'll be security haves, the governments that are allowed to break the Computer Fraud and Abuse Act and will continue to go ahead and do so. And there'll be security have-nots, the people whose ability to tinker, whose ability to explore is controlled, who will just have to go along. And we're headed towards a world that's less like the utopian dream that I described and more a world where surveillance, censorship, and centralized control by companies and governments is the norm. That's my fear.

I proposed last year, in my Black Hat talk, a number of things that I thought would start to help us avoid going too far into the horrible cartoon cloud world and try to bring us back more closely to the dream of internet freedom. And I think that we are starting to do some of these pretty well. But we don't actually have the support, necessarily, of everybody who uses the internet. Because the reason why it's headed this way isn't because people are stupid. The reason why it's headed this way is because people have other values at stake. There are other things that people care about. It's not going to just be, oh, people are prejudiced against minorities or whatever. It's because of fear. It's because fear will start to drive our decision making. We're afraid of terrorists. We're afraid of pedophiles. We're afraid of drug dealers. We're afraid of crime. We don't like hate speech. We don't like malware. We don't like spam. So people will start to embrace this centralized control. They'll go for the walled garden. They'll, you know, cheer for pressure on social networks.

So what do we need to do to fix it? How do we make the dream of internet freedom possibly become more real? This year I went to a decentralization camp at the Internet Archive, which is operated by Brewster Kahle, also a really wonderful man. And the decentralization camp was Brewster's effort to jumpstart this conversation that I want to have here, about what can be done to try to retain, capture, enshrine the architecture of the internet that can lead to ensuring these things that I've described as part of the dream. And Brewster's camp had a lot of builders there, people who were currently in the process of building these decentralized technologies. And that was really awesome to see. And I learned a lot of stuff. And I met a lot of really cool people who wanted technology to be this force for political freedom, this force for individual freedom, this force for free speech and free expression.
But ultimately, and maybe you guys are going to be able to convince me otherwise, I think that technology alone isn't really the answer. Because I don't think the problem is that we don't have the tools for a decentralized network. I think the problem is that people maybe don't understand or value enough what a decentralized network can bring us, what openness can bring us, what this kind of technological freedom can bring us. They want the bad things to be taken care of. And so from my perspective, I feel like it's not just technology, but norms, the values that people have, that we need to talk about. We need to basically explain to people and help people understand that when we get rid of this, we're also getting rid of that, that there are good things that go together with it.

So there's a saying we have in the law, I think, that there are four regulators of human behavior: there's technology, there's markets, there's law, and there's norms. And I think that if we want to maintain what we have today of the dream of internet freedom, if there are things that we want to get back, or even if we just want to kind of keep the status quo, then we need to start talking about bringing those levers to bear on this problem, and start building the system either for this revolutionary technology to stay, or for us to think about what liberating technology is going to be the thing that replaces the internet we will have in 2020.

So thank you for your attention. I promised I would leave time for questions and comments, and I think we actually have about 10 minutes, so I did what I wanted to do. And I'd like to invite people who have thoughts or questions to come to the microphone. You need to come to the microphone so that people can hear you. But I do want to let you know, even though I got to give a speech for a really long time, you don't. So please keep your comment pretty short so that other people who want to say something have an opportunity. I'm going to go with this. Oh, thank you. Thank you very much. I'm going to start with this brave gentleman in the front here in the blue shirt.

So in the spirit of keeping it short, I've got two questions. One, what's your prediction on encryption becoming illegal at some point in the U.S.? And two, is there any role for the dark web to save us?

Okay, great question. So I know I'm going to be quoted on these things, but I'm still going to tell you what I think. I don't think encryption is going to become illegal in the United States. I don't think that's going to happen. What I like to say about that is you have law enforcement interests, not even intelligence, because the intelligence agencies, you know, NSA, they've developed the skills and the techniques that they need to circumvent encryption. We've seen it in the Snowden slides. They have these great hackers. They've got lots of great tools. They can get around encryption in lots of different areas. But the people who really want to circumvent encryption are law enforcement, the FBI, and ultimately local cops. And usually I say that law enforcement wins. But law enforcement is up against all the money in the world. And when you put law enforcement, usually a very successful political player, up against Apple, Facebook, Twitter, Google, you know, business interests that don't want to be hacked, et cetera, then I think they're not going to win.
But these American companies that are really dominant here are not so dominant in other countries. And, you know, just look at what's happening to WhatsApp in Brazil. The guy who's like the vice president of sales spent a couple of days in prison. Now Facebook's being fined. And I don't know, maybe this fine isn't a serious penalty, but governments can bring very heavy fines to bear. And ultimately then the question is, well, what are you going to do? Are you going to abandon that country as a company? You're just not going to operate there? Are you going to try to make it work? And we've seen both. We've seen, you know, Google get out of China, and we've seen BlackBerry, RIM, make special devices for particular markets. So that's where I think the risk is really going to come from.

What do I think about the dark web? I think it's possible that the dark web is the future web. We just don't know. You know, every technology gets its start in crime. Pagers used to be a sign that you were a drug dealer or a prostitute. You know, I can go on. So the early adopters of technologies are usually criminals, but then, you know, God bless them, they develop the technology, and then it finds these legitimate uses, and then people around the world fall in love with them. Not pagers anymore, obviously, but for a while, those were useful. Yes?

Hey, I'm an old Internet fart. I'm going to play a little devil's advocate here, and hopefully you can provide a little bit of incentive as you just did with your last comment. I've always run my own servers. I run over other people's podcasts. I know the types, X.25, Bisync, all this stuff. I don't really care. I don't have any Facebook. I don't have any of this stuff. So why would I care about this? And I've worked on both sides, by the way. I've worked on the government side and the outside, and they both have their points. So that's my only question.

Yeah. I mean, I think that, other than altruism, the reason for people who are very technologically astute to care is because you want to talk to other people, not just yourselves. You want to be able to access information from people around the world who don't have that same technological expertise. You know, if it's going to really be a global marketplace for ideas, there can't be an expertise price to pay to get in the door.

Can you help clarify what seems to be an apparent contradiction between freedom of information, freedom of access, and those kinds of things, and, at the same time, the push for encryption? How do you compare the needs of the freedom of information with individual property rights, intellectual property rights, protection of national security information, those kinds of things, and what's your view on how that all fits together?

Yeah. That's a fantastic question. So I'm just going to come clean and admit that there is a conflict there. You know, if you're talking about free flow of information, then what's privacy, right? If you want to talk about incentives to create, and those are part of intellectual property, then how does that impact the free flow of information? And I guess what I would say is there are specific policy issues we could discuss, but I think that we don't value the internet freedom utopia view enough as we have those debates. So we have very strong intellectual property protection, like in the DMCA, and, you know, we have a lot of things that are prohibitive.
And then we have this thing called the free flow of information that gets forgotten in there, and I think we go too far. On the surveillance versus freedom of information question, I think that's a really hard issue. You can see it in the right to be forgotten. If I'm really worried about personal privacy... well, I'm an American, so for me it's a little bit of an easier question. I feel like truthful information should not be suppressed, period. But that does mean that the privacy of some people, or their ability to keep information about themselves secret, is going to be a little bit less. But to me, I value the freedom part more. When we do policy, a lot of times it's a negotiation, right? You're not really picking one or the other. You have these options where you're trying to sort of optimize the two things, and you have to balance. But in order to do it correctly, you have to value both sides. And what I'm saying is I think there's a side here that we're losing track of, that we're not valuing enough.

Now, I want to say something else about that, which is related to encryption. I think one of the policy problems there is that we have policy people, lawyers, who are used to, on the one hand, on the other hand, let's, you know, make it work, try to find a way to make it work for everybody, nobody's going to be entirely happy. And then we have: it's either encrypted or it's hackable, you know? And I think the problem there is that policymakers have a very hard time understanding how difficult security is, and that efforts to undermine security, given our current state of knowledge, mean it's going to be insecure. And so they're taking a nuanced way of looking at the world and trying to apply that nuanced way to something that is a much more binary, on or off kind of thing. So I think we don't have that nuance there. Thank you.

Hi. Pertaining to your comment about the Facebook lawsuit, in regards to their cease and desist, yes. How does that apply to the First Amendment in that form? And if I say fuck Facebook, does that mean I'm going to get arrested when I walk outside? Because fuck Facebook. How about that?

Nobody arrest that guy. You know, Facebook benefits politically or economically from being able to control the way that people use Facebook data. And, you know, Facebook was like, well, if we have aggregators... I don't know what they thought, but certainly an aggregator allows you to use different social networks more. But if you have to pick one, then somebody's going to end up being the victor in the marketplace. And that's what ended up happening. We have this ongoing debate about what the CFAA can prohibit if you, you know, let it be up to a company that either puts it in terms of service or writes a letter. And this irrationality is not something I think courts are really understanding very well right now. They're considering it in a narrow case, but they're not thinking large enough about what the risks to free expression are. Yes. Next, please.

Can you remind me of what your speech was called?

Slouching Towards Internet Utopia, I believe it was.

Okay. And I recognize and respect that you're an intelligent woman. So I would like to believe that you understand that the principle of utopia is completely impossible.

Are you asking me if I think utopia is possible? No.
What I'm trying to ask is, how can you say that there should be no regulation or regulations whatsoever, when the past has shown that regulations are necessary to form the utopia, and without them it leads to anarchy and chaos?

Okay, I can totally answer that. Thank you. So there are two views, I think, of what I'm saying. One view might be that we should have less regulation. I don't think no regulation is possible. But one view might be that we should have less regulation. Another view might be that we should have more regulation, that we regulate what people can do, we regulate what companies do, we regulate what the government can do, in order to further these values. The reason why I call it the dream of Internet freedom and I say that it's utopian is because I don't think that we're going to live like that. But I think it's a vision. It's a set of values that I think we're losing sight of, that I think drew me and a lot of people to the Internet, and that I think we're slowly losing. And if we want to preserve some of those aspects, if we want to have some of that be true, then we need to start thinking about what to do. Now, whether we do it through regulation of government, or we have less regulation of speech, these are going to be case by case things. Thank you.

I think I have one more minute, so I think this is going to have to be the last one.

Yeah, this is going to be quick. Basically, this is about the end-to-end encryption that we're talking about, and how end-to-end encryption can be kept from being misused by organizations like ISIS and others. So how do we segregate organizations like ISIS from the good community? How do you differentiate? And how do you make sure that those communities do not abuse the others?

That they don't, yeah. So my view of this is that, you know, I have an opinion about this, but I don't think that more information about ISIS will make more people join ISIS. I think it's just as likely, if not more likely, that more information about ISIS is going to make people feel resolved against it. You know, if they hear their friends or relatives talking about how ISIS is attractive in some way, it will make them resist it, or will make them say something to their friends and fight against it. So I don't think that the regulation of the information about these social or political movements, about abusive governments, about terrorists, about these issues, I don't think controlling the information about it actually helps make the world a safer, better place. I think that more information overall makes the world a safer, better place.

So I'm looking for my boss, who's supposed to tell me if I'm not allowed to take any more questions, or when to stop. Okay. All right.

So I've always seen, you know, the U.S. Fourth Amendment as making a space where people can explore the world. And as long as what they do does not reach the realm of probable cause, they'd be left alone. That, you know, it would be a so-called lawless space, but it would still be a constitutional space.
And I always thought that, you know, it's not going to engender anarchy and chaos if, while you're left to your own devices, what you're doing does not reach the point where you're harming someone else and there's suddenly probable cause against you, that you'd have the freedom to do that kind of stuff. Separately, real quick, you say that, and I don't know if you've heard this before, but you say that people want companies to essentially take care of everything for them, that they just want the bad things to go away. From my perspective, that doesn't really seem to be too much of why there's so much centralization, or why the decentralized parts of the internet are shrinking. It's the loss of the ephemerality of communications online, where information is stored indefinitely, where you can't just have casual conversations. There's a lot more consideration that has to be made towards everything you say online now, and the thought of exploring some random topic, especially the extremes, now has a lot more concern to be had, because of the loss of that ephemerality. That's also compounded by the fears of surveillance, where, you know, some random chatter online is more likely to be seen if you're talking about something that may be extreme for your current culture. And there's a loss of trust that Google is a capable steward of your privacy, not necessarily because of any purposeful action on their part, but because they still have to abide by a lot of legal processes, and a lot of people are aware they have to abide by a lot of legal processes, which unfortunately means your search queries can be seen.

So, let me ask you a very quick question, okay? What do you think is the answer to that? Is technology the answer? Is law the answer? Do people have to change the way they think about the internet? All of those?

I would say make it safer for the people who typically explore the fringes of society to do so.

Yeah, so, keep fighting against surveillance, keep pushing for more ephemeral communications, make it safe to keep exploring. Thank you. And that's my time, so thanks to all of you. I really appreciate your coming and your attention. And thank you for having me.