Thanks to everybody for coming out for this talk. The FTC is kind of the federal agency everybody can actually love. Yeah? The FTC has been doing really cool stuff, and they're here to give everybody some really good news and talk about some new programs they've got going on. So let's give them our support and give them a big round of applause. Thank you all very much. This mic is on, right? I'm like echoing. We have an awesome PowerPoint, but it's not coming up right now, so maybe it'll come on during our presentation. If you could see it right now, it would say that the title of our talk is Research on the Machines: Help the FTC Protect Privacy and Security of Consumers. And then the next slide would say who we are, but we'll just cover that. I'm Terrell McSweeny. I'm a Federal Trade Commissioner. I'm an attorney, and I'm really interested in protecting consumer privacy and data security. And I'm Lorrie Cranor. I'm the Chief Technologist at the FTC. I've been there since January, and I've been doing a lot of security and privacy related work. So, the machines. You know, estimates vary; I see a wide range of different statistics. It looks like we have about 25 billion connected devices right now, and we're on our way to about 50 billion consumer-facing connected devices in 2020. Some people call this the Internet of Things. I think that term is a little bit overused; I think it's the internet of a lot of stuff. But really what's going on here is that we are connecting ourselves and the stuff in our lives in new and exciting ways. That's bringing a huge amount of innovation to consumers, and we want to make sure that consumers get the advantage of it. But I don't need to tell anybody in the room that it's also creating a huge amount of insecurity for consumers and raising a lot of privacy issues. 
So one of the things that's also happening, and we saw this on display terrifically yesterday in the DARPA competition, is that the machines are getting smarter as well. So at the FTC, we're really worried about trying to protect consumers in this increasingly interconnected environment. One of the things that we're very focused on is the potential security and privacy risks to consumers. And I'd also note that I think increasingly consumers themselves are very, very concerned about trusting these devices. So we see some survey data that really indicates... Oh, slides! Yes! They're ahead of where I am. Oh, there we go. Machines, right? Okay. Now. And so we see some consumer survey data that really indicates that consumers themselves are maybe not adopting some of these new technologies because they're worried about the security of them. I've been attending DEF CON for the last three years, and I see a lot of really creative, really interesting work presented here. And I think consumers are right to be a little bit concerned about the security of these devices. So we're starting to see that reflected as well. So what we're going to cover today is really how we're trying to approach this challenge of protecting consumers in this environment. It's easy to sort of adopt this attitude of, like, abandon all hope, ye who enter here; there's no way we're ever going to fix this; it's just a disaster for privacy and security. But I really prefer to approach this issue using the teachings of another great master: do or do not, there is no try. Right? So we're going to talk a little bit about the do part of this and what we are trying to do at the FTC today. So, quick overview. We can go... oh, right, I'm sorry. Issues of the day. In addition to bringing a bunch of enforcement cases, we are also really trying to focus on the broader policy debate. And we're going to talk about how we need your help. 
We're going to talk about some of the events that we're holding and some of the ways that you can help us. So how do we respond to the rise of the machines when the machines are everywhere? Well, the FTC (and we're using its acronym, the Federal Trade Commission) actually has almost nothing to do with trade policy, thank God, and everything to do with being a consumer protection and competition enforcer. So primarily what we do is bring cases against companies. These are civil cases. It means we sue people. We get settlements. We put them under order. And then we go back and make sure they comply with the orders that we put them under. That's really different than other parts of the government that are more focused on writing rules or regulations, which isn't so much what we do, with the exception of writing rules about children's privacy online under COPPA. So we primarily bring cases involving privacy and data security, and by last count, we've actually brought more than 400 of these cases since we began bringing privacy-related cases almost 25 years ago. So it's not a new issue for us at the Federal Trade Commission. We do it by using two authorities in the FTC Act. First, we look at whether a practice is unfair, and it can be legally unfair if it's going to create a substantial injury to consumers that's not reasonably avoidable by them and not outweighed by some other pro-competitive or consumer benefit. Or we bring cases in situations where something has happened that has deceived consumers in a meaningful and material way. And so those are the two primary ways in which we have really engaged in an active enforcement mission to protect consumer privacy and data security. So what does this mean? Example. Yes, so we're going to tell you about a few cases here. 
So, Facebook had settings for users to control their privacy, and they promised that if you limited access to some of the personal information you posted, it would not be viewed by people that you did not grant access to and it wouldn't be shared with third parties. And they also said that if you deleted your account, then the photos that you had posted would no longer be accessible. But as it turned out, some of the information people posted was accessible to other people and third parties beyond the settings that they had set, and some of the photos were accessible even after people deleted their accounts. All right. So that was deceptive. And we also brought an unfairness count in that case, because some of the data that had been designated private, Facebook sort of retroactively changed how it was handled and made it public. And we said, wow, that's super unfair. Again, consumers can't avoid that, and it can cause them a real harm. So that's the legal theory for that kind of case. So, in the case of Google, they had promised people that their Gmail contacts wouldn't be used for anything other than as part of Gmail. However, when they launched their new Buzz social media service, they populated Buzz with the contacts from Gmail. And so that exposed people's contact information on Google Buzz. Yeah, and that was, I would note, actually a broader case as well, involving a number of counts, but mostly they're all deception-based counts there. So the misrep is: you're sharing information under one set of terms, but actually they don't live up to that set of terms. So that's a misrepresentation case for us. It's deceptive to the consumers. 
And I guess I should note here as well that these are cases from like 2011. They're a little bit older, but this was the first case where the FTC remedy actually required a comprehensive privacy program to be implemented by the company. And the result of these cases are orders, we call them consent orders, resolving the claims, that put these companies under 20-year orders, and then we go back every couple of years and look at how they're doing. It also gives us an additional way in which to make sure they're complying with the orders, because sometimes things happen, and if they are in violation of the order, they're in contempt of it, and we can then penalize them monetarily, which can be meaningful in some cases. So that's a good point. So Snapchat had promised that the images that you send on Snapchat would disappear after a short period of time, and that if somebody tried to take a screenshot of them, you would get a notification. But actually there were a number of ways that you could save a Snapchat image, and you could also circumvent the notification feature. Yeah, so it doesn't disappear: deceptive. Pretty simple. So Wyndham, the hotel chain, had three data breaches that unfairly exposed consumers' payment card information. They had a number of security failures that led to these data breaches, including storing payment card information in the clear, not patching, and not monitoring their systems for malware. Yeah, so this is an important case, because we actually proceeded using our unfairness authority, saying that the data security practices were unfair to consumers. Wyndham disagreed with us, we engaged in extensive litigation, and this year we won at the circuit court level the use of our authority to bring data security cases to protect consumers. So that's a really important validation of the Federal Trade Commission being in this space and using our authorities. 
Okay, Oracle provided a Java update to correct important security flaws, and they promised consumers that if they installed this update they would be safe and secure. However, the installer did not automatically remove all of the old versions of Java, leaving users vulnerable. Again, an important data security case, and I think we'll transition now into another really important data security area for us, and that's the Internet of Things stuff. Yeah, so ASUS made routers, and they promised consumers that their routers would protect consumers' local networks. However, the routers were vulnerable to an exploit that effectively provided complete access to a consumer's connected storage devices without needing any credentials. They also did not address security flaws in a timely manner, which allowed attackers to change router security configurations without a consumer's knowledge. And I'll just note here, I mean, routers are just an incredibly important feature of protecting all the connected devices that you might have on your home network. So making sure that the companies that are making claims about the security of them are actually making valid claims is really, really important. So I think this was a super important case. Another feature of it, and we've seen this in a couple of our other enforcement actions, was the configuration of encryption: whether it was properly used and properly configured or not, and when it's not, we've actually brought cases as well. Fandango is another one of those. 
There are several other examples that we could use, but for the sake of time I think we'll just say these are examples of how we use our authority, and we thought they were important to share so that, as we have a conversation about how we can work with you to help bring cases, you understand the kind of legalese that goes along with them. So how do we bring cases? Well, we rely on researchers and research, so that's going to be an important part of our talk today. We also read media reports and find those very interesting a lot of the time. And we actually get cases through consumer complaints and other complaints that are filed with us. We have a whole network, actually; it's called the Sentinel Network, and it helps us bring in complaint data from consumers, and also from state law enforcement agencies, from Better Business Bureaus, and from a variety of places. This network actually works for our whole mission. A bulk of what we do is also protecting consumers from scams and frauds and things that are very low tech, but a lot of it has a tech component as well. So, you know, we've been spending the first part of our talk talking about enforcement. It is one of the most important things that the FTC does, but we're really mindful that all of this amazing connectivity in consumers' lives is raising a host of issues that go far beyond simply whether the security practices are unfair to them or whether they're being deceived about what the products are actually doing. 
So the FTC is not just an enforcer; it's also kind of an advocate, and we're trying to work with other government agencies and with other communities to make sure that we're putting in place the strongest possible policies and responses, both to help keep consumers informed and to make improvements to our laws as well, so that as all of this great tech kind of cascades over us in our daily lives, we have better and stronger protections for consumers. So I am going to talk in a little more detail today about an example of something that we worked on. And this started with a personal incident that happened to me. Shortly after I started working for the FTC, my mobile phone account was hijacked. And I discovered this when my phone stopped working, and on the same day my husband's phone stopped working. And we called our carrier, and our carrier said, oh, is that your new iPhones that stopped working? And we said, we don't have new iPhones. And they said, well, in our database, it says you have new iPhones. So they sent us to the phone store to get new SIM cards, and eventually they figured out that there had been fraud on our account. It turns out that somebody went into a phone store with a fake ID, claimed to be me, and asked to upgrade the phones. And the phone company happily gave them two brand new iPhones, charged them to my account, and put our phone numbers on them. So when this happened to me, I cleaned up the mess, but I was really interested in how often this happens to other people and what could be done to prevent it. So I talked to all of the major carriers about what they were doing to prevent it, and about the type of authentication procedures that they're using. 
They are relying mostly on that driver's license, and a phone store employee who is not necessarily well trained in how to spot fake IDs. I looked at our Consumer Sentinel database to try to understand how often this was happening. Now, Consumer Sentinel, you know, gets all these reports that people send in, and in this case these are mostly reports that come in through identitytheft.gov, and we know that this is just the tip of the iceberg, because most people don't even know that they can submit their identity theft complaints. We're trying to get the word out, so tell your friends. But we expect that what we see is maybe only 1% or so of the total identity thefts that are happening. So I went back through this data, and if you look three years ago, in a typical month, say January 2013, we got about a thousand reports of this mobile phone hijacking or a similar thing called SIM swap. And those thousand reports made up about 3% of all of our identity theft reports that month. Then we looked three years later, and we find 2,600 reports, and that is about 6% of all identity theft reports that month. So we're definitely seeing a trend here of this becoming an increasingly large problem. I also did a lot of looking for media reports, and saw that there were a lot of reports of people having similar things happen to them. Perhaps even worse, besides just using this to get free phones, some of the attackers are using this to get access to the victim's phone number so that they can intercept their two-factor authentication. So shortly after this happened to me, it happened to DeRay Mckesson, who is a well-known Black Lives Matter activist. He has something like 400,000 Twitter followers, and somebody wanted to get into his Twitter account so they could tweet as him. And this is something that is becoming increasingly common. 
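An editorial aside on that interception risk: it is specific to codes delivered over SMS, because whoever controls the phone number receives the one-time code. Authenticator apps instead derive the code on the device from a shared secret plus the current time, so a hijacked number reveals nothing. As a rough illustration only (not anything the FTC distributes), here is a minimal TOTP generator per RFC 6238, using just the Python standard library:

```python
import hashlib
import hmac
import struct
import time

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HOTP: HMAC-SHA1 over the counter, dynamically truncated."""
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(key: bytes, t=None, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP: HOTP keyed to the current 30-second time window."""
    if t is None:
        t = int(time.time())
    return hotp(key, t // step, digits)

# RFC 6238 test vector: ASCII key "12345678901234567890", t=59, 8 digits
print(totp(b"12345678901234567890", t=59, digits=8))  # → 94287082
```

Because the secret never travels over the carrier network after enrollment, a SIM swap alone does not compromise it, though phishing the code in real time of course still can.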
I understand that in Europe they're doing this to get access to people's bank accounts. And the attackers are successfully able to get in and actually clean out people's bank accounts. So is it any wonder that consumers have trust and security issues? I will note that our Consumer Sentinel complaint data, the data we've been talking about, reflects this. I don't know whether you've seen the report, but identity theft has been the number one consumer complaint for years. We get hundreds of thousands of these complaints. So it's not just this kind of spoofing; it's a wider problem as well. It doesn't show any signs, not surprisingly, of lessening, unfortunately. So obviously there's a huge amount of vulnerable data out there. We have this alphabet soup approach to our privacy protections in the US. Many of you in this room are probably familiar with it. The TL;DR version of it is: the FTC Act, FERPA, COPPA, HIPAA, the Communications Act, GLB, state laws, right? But there's no comprehensive privacy law. There's no comprehensive data security law. So that's the atmosphere that we're operating in. Which is why the FTC doesn't just do enforcement. It does a tremendous amount of education, convening, and trying to work broadly to address these issues. One of the initiatives that we've had in the last year is something called Start With Security, which is really trying to get our message out about what good security practices look like. I probably don't need to tell anybody in this room that a lot of the consumer-facing technology is pretty porous, and in fact many of the people who are creating it probably have no idea what starting with security actually looks like. So we're trying to get that message out as broadly as we possibly can. 
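For a sense of scale, the Sentinel numbers quoted a moment ago (about a thousand hijacking reports at roughly 3% of identity theft complaints in January 2013, versus 2,600 at roughly 6% three years later) also imply the overall monthly identity theft volume. A quick back-of-the-envelope sketch, using only the figures as stated in the talk:

```python
# Figures as stated in the talk; "share" is the fraction of all
# identity-theft reports that month attributed to phone hijacking / SIM swap.
months = {
    "Jan 2013": {"hijack_reports": 1_000, "share": 0.03},
    "three years later": {"hijack_reports": 2_600, "share": 0.06},
}

def implied_total_reports(hijack_reports: int, share: float) -> int:
    """Back out the implied total identity-theft reports for the month."""
    return round(hijack_reports / share)

for label, m in months.items():
    total = implied_total_reports(m["hijack_reports"], m["share"])
    print(label, total)  # roughly 33,000, then roughly 43,000
```

So hijacking reports grew 2.6x while the whole identity theft pool grew only about 30%, which is why the category's share doubled; that is consistent with the "hundreds of thousands of complaints" annual figure mentioned above, remembering these reports are maybe 1% of actual incidents.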
Some of the biggest problems we're continuing to see are ignored reports of vulnerabilities, slow response time to vulnerability reports, lack of data minimization where appropriate, failure to store passwords securely, lack of training of employees, and lack of proper configuration. You know, so we continue to see a host of problems in that space as well. We're also trying to increase our in-house capabilities and our in-house expertise, to understand how the technology is working and to be a better environment for people like you to bring research to us. So actually, we have some of our awesome Office of Technology Research and Investigation folks here today. Joe and Erin, if you want to raise your hands. They're wearing shirts like this. If you want to do an IoT deep dive, Joe and I are actually going to be in IoT Village later on this afternoon at four o'clock, so we would love to talk to you then, and also hear of any issues and research that you've already been doing in the IoT space. We also have an internship program, and we're trying to bring more technologists in through that as well. One of the things that the Office of Technology Research and Investigation, OTEC, is doing is putting together a fall technology series. Coming up in September, we have a workshop on ransomware; in October, a workshop on drones; and in December, a workshop on smart TVs. There's information about all of these workshops on our website. We're very interested: if you have expertise in these areas, research, reports, anything you'd like to share with us, there's information on how you can share that with us either before or after the workshops. If you're in the D.C. area, please come. The workshops are free and open to the public. If you're not in the D.C. area, or even if you are, you are welcome to watch our live webcast of the workshops, and the videos will also be archived. 
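An editorial note on the "failure to store passwords securely" item in that list: the standard remedy is a salted, deliberately expensive password hash rather than a plain digest or cleartext. A minimal sketch using the Python standard library's scrypt; the parameters and storage layout here are illustrative choices for this example, not an FTC recommendation:

```python
import hashlib
import hmac
import os

# Illustrative scrypt cost parameters: n (CPU/memory cost), r (block size),
# p (parallelism). These use about 16 MiB of memory per hash.
SCRYPT_N, SCRYPT_R, SCRYPT_P = 2 ** 14, 8, 1
MAXMEM = 2 ** 25  # allow up to 32 MiB so the n/r choice above fits

def hash_password(password: str) -> bytes:
    """Return salt || scrypt digest; store this instead of the password."""
    salt = os.urandom(16)
    digest = hashlib.scrypt(password.encode("utf-8"), salt=salt,
                            n=SCRYPT_N, r=SCRYPT_R, p=SCRYPT_P,
                            maxmem=MAXMEM, dklen=32)
    return salt + digest

def verify_password(password: str, stored: bytes) -> bool:
    """Recompute with the stored salt and compare in constant time."""
    salt, digest = stored[:16], stored[16:]
    candidate = hashlib.scrypt(password.encode("utf-8"), salt=salt,
                               n=SCRYPT_N, r=SCRYPT_R, p=SCRYPT_P,
                               maxmem=MAXMEM, dklen=32)
    return hmac.compare_digest(candidate, digest)
```

The per-user random salt defeats precomputed tables, and the memory-hard function makes bulk offline cracking of a breached database far more expensive than with a fast hash like SHA-1.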
So these are good ways for us to collect information on these topics, focused on the security and privacy issues, and to better understand what consumer protection issues there are in these spaces. Another workshop we have coming up, and this is one that I've been working a lot on, is Putting Disclosures to the Test. My interest in this started when I was doing work on privacy policies, which are a type of consumer disclosure. But I realized that there are a lot of other types of disclosures which the FTC is interested in, which have some of the same problems that privacy policies do, as far as being long and hard to understand, and we would really like them to be more effective. And so the purpose of this workshop is to bring in researchers who do usability and user studies and evaluate disclosures, to try to figure out how to make them actually communicate well with consumers. And so we'll be hearing from people who have done work on privacy notices, but also nutrition labels and drug facts and all sorts of other types of disclosures. I thought this was covered incredibly well this morning by Sarah and Mudge in their talk about the Cyber Independent Testing Lab that they're putting together: this need for consumers to have more transparency, so that they can make educated choices about the products that they're buying, the software that they're buying, the apps that they're buying, and just to understand what some of the risks might be associated with them. So we're trying to really expand our knowledge about the kinds of communications that work with consumers and are effective with them. Oh, PrivacyCon too. So this is our second annual PrivacyCon this year. This is also a forum for researchers, especially researchers who are doing privacy research, to come present it to us. We had an incredibly successful first PrivacyCon last year. We're going to do it again this year. 
And I, first of all, learned a huge amount, which was great. It definitely affects our enforcement, but it also, I think, really affects the broader policy discussion that we're having on privacy in the country as well. So that's coming up in January. Information about how to participate is currently on our website. And we are seeking research papers in the privacy area right now. The deadline, I believe, is sometime in October. So definitely think about submitting things, and think about coming or tuning in. This should be a really great event. So, research. We are going to wrap up this talk talking about the research wish list that Lorrie has been putting together, which I'm really excited about, because I feel like sometimes we have a very abstract conversation with the academic community, with the researcher community, about what it is that would really help us. So this is our attempt, and it's going to be an attempt that we keep pursuing, right, to refine the kinds of issues that we think are really going to be helpful for us to understand, and to really solicit research in academia and elsewhere on these kinds of topics. We also are going to make sure that we have time for questions too, so I'll let you get to it. Okay. So I spent some time working with the OTEC folks, and we went and talked to people in every division of the agency about their research needs, so that we could then go out and talk to researchers about ways you might be able to help us. I don't have time to go through the entire wish list, but we're going to focus on some of the security and privacy items here. So we're very interested in research on how to assess the risks that are posed by breaches and vulnerabilities. We know that there are risks, but we want to look at exactly what metrics we can use to assess them. 
We also are very interested in protecting consumers from ransomware, from malvertising, and other risks, and so we're interested in research that helps us protect consumers. We're also interested in being able to trace exposed data to specific breaches, and we're looking for research to help us do that. We're looking at research that is at the intersection of economics and security: how can we make certain types of attacks less profitable and therefore less desirable for an attacker to pursue? And then we're also very interested in protecting consumers from fraud, and so we're interested in ways that we can automate the process of spotting fraud and detecting fraud quickly. IoT devices are an emerging area, and we're very interested in research related to that. We would like to help IoT device manufacturers and platforms have better security, and so we're very interested in research along those lines. We're also interested in defensive measures, so that if there is a problem with an IoT device, it won't compromise the entire network. Among other emerging trends, there are increasingly devices that have sensors in them, including devices for children, Barbie dolls that talk to you, and things like that. We're very interested in how to prevent these devices from compromising consumer privacy and children's privacy. We're very interested in how to isolate critical systems, for example, in connected cars. Bots, that's a new thing. Increasingly we have bots and other artificial intelligence, and when consumers interact with them, we wonder, do they even know that they are interacting with a machine? And so we want research on how consumers can become aware of that and what they know about this. Virtual reality is a new area that we've seen a lot of progress in lately, with a lot more consumer devices available in the virtual reality space, and there hasn't been a whole lot of discussion of the security and privacy issues. 
You know, it's fun, it's entertaining, but we want to stay out ahead of that and try to make sure that we protect consumers as well. New tools and techniques: we're very interested in a variety of different types of tools. We're interested in hearing about tools that consumers can use to control their personal information, especially across contexts, as personal information is now increasingly shared across contexts; your phone, you know, shares with your TV and whatnot. We're also interested in tools that help consumers observe what data their devices are sharing. We are interested in tools that allow us to analyze apps and to understand the type of data that they are sharing and are associating with third-party libraries. We're interested in algorithms that are used to make decisions about people and may actually, either on purpose or inadvertently, discriminate against people. We're interested in identifying when cross-device tracking is occurring, and we're interested in tools that will help us identify vulnerabilities in IoT devices, and many, many more. This is just a quick sampling of some of the research areas that we're interested in, and we hope you will come talk to us if you have insights. So, what happens if you do, and what are some of the ways that you can come talk to us? Our OTEC folks will take a look at the research that we receive. They will look across the agency to find people for whom it is relevant, and try to direct it to them, so that we can see if it's going to be of use to the work that we're doing, or whether we should start a new project in an area that somebody brings to our attention. Yeah, and I'd just say that sometimes you bring something to us and then we actually end up bringing a case. So it can result in a lawsuit against a company as well. That's happened some of the time. So, that actually segues as well into the "we want you" slide with the creepy Uncle Sam. 
You know, if you have one takeaway from this talk, I want to make clear that we actually can't solve all of the challenges that are going to be confronting consumers in a hyper-connected environment without a lot of partnership, particularly with the security researcher community. So we're trying to do the most that we can to develop those partnerships, and have them inform not only our enforcement mission, but also the research that we do, the studies that we conduct, the workshops that we conduct, and the ways in which the FTC tries to make sure that policymakers and others in the broader space are seeing these issues that might harm consumers. Yeah. So we have set up the email address research@ftc.gov. Please use that to send us research, and it will be examined by our folks in OTEC. And for pointers to all of the workshops that I mentioned and all the other things that I've talked about here, please take a look at ftc.gov/tech. That's the Tech blog. There's lots of other interesting stuff there too, so check it out. I think we're ready for questions. Yeah. And you can follow us on Twitter, too. You're @lorrietweet. Yeah, I'm @lorrietweet. And I'm @TMcSweenyFTC. So, thank you. Awesome. All right. So we left plenty of extra time for questions, and we'd love to field some. I think we have, like, what, five or ten minutes? Hi. Great brief. You mentioned discriminatory algorithms that you are concerned about. We know in the news, I think it was two or three months ago, with Facebook and their news feed. There's also been, recently, the other major social media site, Twitter, banning people because they did a bad movie review of a Ghostbusters movie, and they had their account banned. From Breitbart News, it was Milo. 
Also, there's been other censorship: a talk show host who, for mentioning, repeating what happened in Germany, I won't mention the religion, because I don't want to be censored here, an attack, lost his Facebook account. What are you doing for situations like that, or is that in your swim lanes? Thanks. So, some of that raises a host of really interesting, sort of broader First Amendment concerns. You know, I think one of the things that we're trying to really focus on, when we're thinking about algorithms and data, and especially, like, machine learning on top of all of that, is the extent to which choices are being curated and offered to consumers in such a way that might limit their choice, or even result in a disparate impact on them. So one of the things that we haven't really gone into yet is the extent to which those algorithms are kind of manipulating the overall news that people are getting, which is, I think, your question. But we are interested in the extent to which it might be impacting the credit offerings that consumers are getting, the housing offerings, the employment offerings: core economic choices. Now, we do have laws on the books, comfortingly. Civil rights laws, equal opportunity laws, right? They already protect people in the brick-and-mortar world from this kind of discrimination. But one of the things that is really hard in the increasingly digital world is figuring out when that's even happening at all. And that's some of the work that we really need help with right now. Yeah. I heard a lot of emphasis in this presentation on regulation, basically, or actions against companies on behalf of consumers. But I feel that more and more, government is becoming a servicer of consumers. And it used to be you'd go into an office and deal with someone, and that was a real interaction. 
But now the services are so broad and dynamic, because government is trying to offer electronic services on the forefront. And I'd argue that it's not necessarily the most expert at it, and data breaches and such apply to government as much as they do to anyone else. So what's your regulatory involvement with government services?

Yeah. Well, we are the government, but we don't actually regulate the other parts of the government, so that's actually good for us, because, as you point out, that's a big challenge. I think you see this administration taking action to really try to improve both the privacy and security talent and the policies throughout the government. The question, for people in the back, was: what are you doing, FTC, about the government and its problems? And the short answer is, we're focused on protecting consumers. But we are collaborating with the other parts of the government. We have our own chief privacy officer, our own chief security officer. We're very mindful of these issues. And one of the other things you really see happening in this administration is a government-wide emphasis on bringing technologists into government. That's something the FTC has been a real leader on. We've actually been doing this for a number of years, because what we recognize is that when we're protecting consumers in an increasingly digital world, we need technologists to help us understand what is even happening in that world. That's why we have people like Laurie, and why we've also expanded to build an entire office staffed by researchers and technologists. I think we need to grow those resources, but we need to do it throughout the government as well. And when we're having big debates, like the encryption debates, we need to make sure that technologists are at the table.
A lot of the time, the policy talk in Washington isn't so well-informed. Probably no one here is surprised to hear that. Yeah, and there is now a government-wide privacy council, which the FTC participates in actively, and which is helping to educate other agencies.

Early in your talk, you mentioned Google and Facebook, and how you caught them for something involving changing their end-user agreements. These EULAs often say the company can change the terms anytime it wants. So how can you accuse them of unfairness if a user has agreed to those terms?

Yeah, so this is a great question. The question is: if you have a user agreement that covers everything, how can you then come back and bring a deception case about something that's covered somewhere in a 60- or 90-page user agreement? Well, the answer is that context matters. What we're trying to make very clear in our FTC enforcement is that if users share information under one set of rules, in a way that makes sense given the kind of thing they're doing with an app, then that's covered by the user agreement. But if you do something that's super tricky, or really impossible for consumers to figure out, or you change how that information is being handled without giving them a clear explanation of the changes, or if you set up your technology to defeat the settings they chose to begin with (that's a case we just brought, called InMobi), then we actually can bring a deception case in that situation.

Last fall you had a workshop about cross-device tracking. Yeah. And I know you sent some warning letters this spring to developers that were using a toolkit that might be used for cross-device tracking. It seems like this is an area that is probably going to grow rather than shrink. Is there additional activity going on at the FTC to continue to track this, and what are you doing in the future? Yeah, so thank you.
This is a question about cross-device tracking, which is an issue we're definitely trying to understand a lot more clearly. It's already informed our enforcement efforts a little bit. The InMobi case, which I was just talking about, involved a mobile ad company, an incredibly widely used one, that said it would only track you if you opted in. In fact, it tracked you whether or not the opt-in was set, and had created a whole system to get around the opt-in in order to track consumers using geolocation and other signals. So we said that's unfair and deceptive. You also noted the SilverPush letter. For those of you who didn't see it, because, I get it, we are out there in Washington: we issued a warning letter to app developers about SilverPush, a piece of software that can monitor a device's microphone and listen for audio beacons embedded in television advertisements. It's basically technology that lets them infer someone's viewing habits from what their phone's microphone picks up from those beacons. We said we were very skeptical that this kind of technology should be included in apps, so that should serve as a pretty bright-line warning that we're worried consumers really don't have adequate notice and transparency about what that tech is doing. We're also looking at many of the ways in which people are passively, I could say surveilled, though it's a bit loaded, but passively having information gathered about them. Last year we brought a case against Nomi, a company that was tracking people's locations in retail stores. Nomi said it would offer an opt-out in those retail locations, but it didn't in fact compel the retailers using the technology to offer one.
So there was no opt-out, and there's no way a consumer can know that kind of tracking is happening unless there's some clear notice that it's occurring and some kind of choice. So there we said: look, if you're going to say you'll offer an opt-out, you have to really offer the opt-out. Now, again, there's no comprehensive privacy law in this country, so there's nothing that says that kind of thing can't happen without consumers having a choice about it. It's an area we're continuing to monitor very carefully.

Springboarding off of the previous questions about consumer privacy and transparency between government agencies: is your commitment to transparency documented if, say, an exploit is discovered and, say, the NSA wants to hold on to that exploit for some use?

Do you want me to take this one? Go for it. So we are a civil enforcement agency, and I could imagine situations in which we wouldn't be in that dialogue, for a variety of reasons. If something is disclosed to us, what we then do is try to understand whether we have enough facts to actually bring a case, using our existing authorities, about the practices that led to the vulnerability, especially if it's been exploited. And in some cases we have brought cases where something was disclosed and the recipient didn't really react at all. So if you don't have a mature disclosure program in your company to receive vulnerability reports and respond to them, that can be a factor in our analysis of whether you have reasonable data security practices. But I'm not really answering your direct question, which is a broader national security question, where we're a little bit less advanced.

It's a broader civil liberties question as well, because what if I discover an exploit and then get slapped with a notice not to mention it because the government wants to use it for something else?
What's my protection, or the protection of consumers? So this is an area where, personally, I think we really need to work on the maturity of our laws in the U.S. and how we're handling this, because the FTC thinks we need really good, clear partnerships with security researchers, so that the people doing this work on behalf of consumers, helping us understand how the technology is actually working, are able to do it without fear of reprisal. Now, understand this is a balancing act: there are bad actors out there, and we want to protect against the bad actors. But yes, I think it's part of the broader conversation we need to have. And, maybe not speaking for all of the FTC at this point, I'm speaking on behalf of myself here, some of us really feel strongly that we need to modernize the Computer Fraud and Abuse Act and some other laws, so that we have a more mature system in place for how research is handled and how exploits are handled when they're disclosed.

Hi. It's great that the FTC is trying to get ahead of privacy risks in IoT and virtual reality, which are new technologies, but can you talk more about what you're doing with routers? I know about the ASUS case, but routers are so important to consumers, and many of them don't realize it. The router is the gateway into their private network, where everything is shared. And the security practices of router vendors have been so bad for so long, and many of the same vendors are now doing IoT as well. So what are you doing to convince those vendors to improve practices that have been bad for so many years?

So for starters, we're bringing cases. I don't want to talk about any pending cases, but I would say that we take the security of routers, and the claims being made around them, very seriously, and we are taking a careful look.
I don't know if you want to add to that. Yeah. All right. Well, I think we're out of time. So again, if there's one takeaway, it's that we really want to forge a good partnership. We want to hear from you. We want to work with you. If you think there are things we're missing, we would love to hear about it and add it to our call for research. So thank you for your attendance and time, and happy DEF CON. This is awesome. Yeah. Thanks for coming.