>> Well, thanks everybody for coming out for this talk. The FTC is kind of the federal agency everybody can actually love, yeah? >> Yeah! >> The FTC's been doing really cool stuff [applause] and they're here to give everybody some really good news [laughter] and to talk about some new programs they've got going on. So let's give them our support and a big round of applause! [applause] [cheering] >> Good luck, ladies.

>> Thank you all very much. Is this mic on? Right, like echoing... We have an awesome PowerPoint, but it's not coming up right now, so maybe it'll come on during our presentation. If you could see it right now, it would say that the title of our talk is "Research On The Machines: Help the FTC Protect the Privacy and Security of Consumers." The next slide would say who we are, but we'll just cover that. [chuckle] I'm Terrell McSweeny, I'm a Federal Trade Commissioner and an attorney, and I'm really interested in protecting consumer privacy and data security.

>> And I'm Lorrie Cranor, I'm the Chief Technologist at the FTC. I've been there since January, and I've been doing a lot of security- and privacy-related work.

>> So, the machines. Estimates vary, and I see a wide range of different statistics, but it looks like we have about 25 billion connected devices right now, and we're on our way to about 50 billion consumer-facing connected devices in 2020. Some people call this "the Internet of Things"; I think that term is a little bit overused, I think it's the "Internet of a lot of stuff." [laughter] But really, what's going on here is that we are connecting ourselves and the stuff in our lives in new and exciting ways. That's bringing a huge amount of innovation to consumers, and we want to make sure consumers get the advantage of it. But I don't need to tell anybody in the room that it's also creating a huge amount of insecurity for consumers and raising a lot of privacy issues. One of the things that's also happening, and we saw this on display terrifically yesterday in the drone flying competition, is that the machines are getting smarter as well.
So, at the FTC we're really worried about trying to protect consumers in this increasingly interconnected environment. One of the things we're very focused on is the potential security and privacy risks to consumers. And I'd also note that, increasingly, consumers themselves are very, very concerned about trusting these devices. So we see some survey data that really indicates... Ah! Slides! [applause] [cheering] Yes! Weird, because you're a slide ahead of where I am. >> There we go. >> Machines, right? Okay. [laughter]

So we see some consumer survey data that really indicates that consumers themselves are maybe not adopting some of these new technologies because they're worried about the security of them. You know, I've been attending DefCon for about the last three years, and I see a lot of really creative, really interesting work presented here, and I think consumers are right to be a little bit concerned about the security of these devices. So we're starting to see that reflected as well.

What we're going to cover today is really how we're trying to approach this challenge of protecting consumers in this environment. It's easy to adopt the attitude of "abandon all hope, ye who enter here": there's no way we're ever going to fix this, it's just a disaster for privacy and security. But I really prefer to approach this issue using the teachings of another great master: "Do or do not, there is no try." [laughter] So we're going to talk a little bit about the "do" part of this and what we're trying to do at the FTC today. A quick overview... Oh, right, sorry, "issues of the day." [laughter] In addition to bringing a bunch of enforcement cases, we're also really trying to focus on the broader policy debate.

>> And we're going to talk about how we need your help; we're going to talk about some of the events that we're holding and some of the ways that you can help us.

>> So, how do we respond to the rise of the machines if the machines are everywhere? Well, the FTC (and we're using its acronym; the Federal Trade Commission actually has almost nothing to do with trade policy, thank God!) [laughter] has everything to do with being a consumer protection and competition enforcer.
So, primarily what we do is bring cases against companies. These are civil cases, which means we sue people, we get settlements, we put companies under order, and then we make sure they comply with the order we've put them under. That's really different from the other parts of the government that are more focused on writing rules or regulations, which isn't so much what we do, with the exception of writing rules about children's privacy online under COPPA. We primarily bring cases involving privacy and data security, and by last count we've actually brought more than 400 of these cases since we began bringing privacy-related cases almost 25 years ago. So it's not a new issue for us at the Federal Trade Commission.

We do it using two authorities in the FTC Act. First, we look at whether a practice is unfair, and it can be legally unfair if it causes substantial injury to consumers, is not reasonably avoidable by them, and is not outweighed by some other benefit to competition or to consumers. Or we bring cases in situations where something has happened that deceived consumers in a meaningful and material way. Those are the two primary ways in which we have engaged in an active enforcement mission to protect consumer privacy and data security. So, what does this mean? Examples...

>> Yes! So we're going to tell you about a few cases here. Facebook had settings for users to control their privacy, and they promised that if you limited access to some of the personal information you posted, it would not be viewed by people you did not grant access to and it wouldn't be shared with third parties. They also said that if you deleted your account, the photos you had posted would no longer be accessible. But as it turned out, some of the information people posted was accessible to other people and third parties beyond the settings they had set, and some of the photos were accessible even after people deleted their accounts.

>> Alright, so that was deceptive. It also turned out that we brought an unfairness count on that, because some of the data that had been designated private, Facebook kind of retroactively changed how it was handled and made it public, and we said, "Wow, that's super unfair."
Again, consumers can't avoid that, and it can cause real harm. So that's the legal theory for that kind of case.

>> So, in the case of Google, they had promised people that their Gmail contacts wouldn't be used for anything other than as part of Gmail. However, when they launched their new social media service, Buzz, they populated Buzz with the contacts from Gmail, and that exposed people's contact information on Google Buzz.

>> Yeah. And I would note that was part of a broader case involving a number of counts, but basically they're all deception-based counts. The misrepresentation is "you're sharing information under one set of terms," but actually the company doesn't live up to that set of terms. So that's a misrepresentation case; it's deceptive to consumers. And I should note here that these were cases from around 2011, so they're a little bit older, but this was the first case where the FTC's remedy required a comprehensive privacy program to be implemented by the company. The result of these cases are orders (we call them consent orders) resolving the claims, which put these companies under 20-year orders, and then we go back every couple of years and look at how they're doing. That also gives us an additional way to make sure they're complying with the order, because sometimes things happen, and if they are in violation of the order, they're in contempt of it, and we can then penalize them monetarily, which can be meaningful in some cases.

>> So, Snapchat had promised that the images you send on Snapchat would disappear after a short period of time, and that if somebody tried to take a screenshot of them, you would get a notification. But actually there were a number of ways that you could save a Snapchat image, and you could also circumvent the notification feature.

>> Yeah. So, it doesn't disappear: deceptive. Pretty simple.

>> So, Wyndham, the hotel chain, had three data breaches that unfairly exposed consumers' payment card information. They had a number of security failures that led to these data breaches, including storing payment card information in the clear, not patching, and not monitoring their systems for malware.
>> Yeah, so this is an important case, because we proceeded using our unfairness authority, saying the data security practices were unfair to consumers. Wyndham disagreed with us, we engaged in extensive litigation, and this year we won at the circuit court level, validating the use of our authority to bring data security cases to protect consumers. So that's a really important validation of the Federal Trade Commission being in this space and using our authority.

>> Okay. Oracle provided a Java update to correct important security flaws, and they promised consumers that if they installed this update they would be safe and secure. [laughter] However, the installer did not automatically remove all of the old versions of Java, leaving users vulnerable.

>> Again, an important data security case. And I think we'll transition now into another really important data security area for us, and that's the Internet of Things.

>> Yeah. So, Asus made routers, and they promised consumers that their routers would protect local networks. However, the routers were vulnerable to an exploit that could provide complete access to a consumer's connected storage devices without needing any credentials. They also did not address security flaws in a timely manner, which allowed attackers to change router security configurations without a consumer's knowledge.

>> And I'd just note here, routers are an incredibly important feature protecting all the connected devices you might have on your home network. So making sure that the companies making claims about the security of them are actually making valid claims is really, really important; I think this was a super important case. Another feature of it, and we've seen this in a couple of our other enforcement actions, was the configuration of encryption: whether it was properly used and properly configured or not, and when it's not, we've actually brought cases as well. Fandango is another one of those. There are several other examples we could use, but for the sake of time I think we'll just say these are examples of how we use our authority, and we thought they were important to share so that, as we have a conversation about how we can work with you to help bring cases, you understand the kind of legalese that goes along with them.

So, how do we bring cases?
Well, we rely on researchers and research, so that's going to be an important part of our talk today. We also read media reports and find those very interesting a lot of the time. And we actually get cases through consumer complaints and other complaints that are filed with us. We have a whole network called the Consumer Sentinel Network, and it helps us bring in complaint data from consumers, from state law enforcement agencies, from Better Business Bureaus, and from a variety of other places. This network actually works for our whole mission; the bulk of what we do is also protecting consumers from scams and frauds and things that are very low tech, though a lot of that has a tech component as well.

So, we've been spending the first part of our talk talking about enforcement. It is one of the most important things that the FTC does, but we're really mindful that all of this amazing connectivity in consumers' lives is raising a host of issues that go far beyond simply whether the security practices are unfair to them or whether they're being deceived about what a product is actually doing. So the FTC is not just an enforcer, it's also kind of an advocate, and we're trying to work with other government agencies and with other communities to make sure that we're putting in place the strongest possible policies and responses, both to help keep consumers informed and to make improvements to our laws as well, so that as all of this great tech cascades over us and our daily lives, we have better and stronger protections for consumers.

>> So, I'm going to talk about an example of something that we worked on, and it started with a personal incident that happened to me. Shortly after I started working for the FTC, my mobile phone account was hijacked. I discovered this when my phone stopped working and, on the same day, my husband's phone stopped working. We called our carrier, and our carrier said, "Oh, is it your new iPhones that stopped working?" And we said, "We don't have new iPhones!" And they said, "Well, in our database it says that you have new iPhones!" So they sent us to the phone store to get new SIM cards, and eventually they figured out that there had been fraud on our account. It turns out that somebody went into a phone store with a fake ID, claimed to be me, and asked to upgrade the phones.
And the phone company happily gave them two brand new iPhones, charged them to my account, and put their phone numbers on them. So when this happened to me, I cleaned up the mess, but I was really interested in how often this happens to other people, and what could be done to prevent it. I talked to all of the major carriers about what they were doing to prevent it, and about the type of authentication procedures they're using, and they are relying mostly on that driver's license and a phone store employee who is not necessarily well trained in how to spot fake IDs.

I looked at our Consumer Sentinel database to try to understand how often this was happening. Consumer Sentinel gets all these reports that people send in, and in this case these are mostly reports that come in through identitytheft.gov. We know that this is just the tip of the iceberg, because most people don't even know that they can submit their identity theft complaints; we're trying to get the word out, so tell your friends. We expect that what we see is only maybe one percent or so of the total identity thefts that are happening. So I went back through this data, and if you look three years ago, in a typical month, say January 2013, we got about a thousand reports of this mobile phone hijacking or a similar thing called a SIM swap, and that made up about three percent of all of our identity theft reports that month. Then we looked three years later, and we found 2,600 reports, which is about six percent of all identity theft reports that month. So we're definitely seeing a trend here of this becoming an increasingly large problem.

I also did a lot of looking at media reports and saw that there were a lot of reports of people having similar things happen to them. Perhaps even worse, besides just using this to get free phones, some of the attackers are using this to get access to the victim's phone number so that they can intercept their two-factor authentication. Shortly after this happened to me, it happened to DeRay McKesson, who is a well-known Black Lives Matter activist. He has something like 400,000 Twitter followers, and somebody wanted to get into his Twitter account so they could tweet as him.
And this is something that is becoming increasingly common. I understand that in Europe they're doing this to get access to people's bank accounts, and the attackers are successfully able to get in and actually clean out people's bank accounts.

>> So is it any wonder that consumers have trust and security issues, right? [chuckle] I will note that our Consumer Sentinel data, the complaint data we've been talking about, reflects that identity theft has been the number one consumer complaint for the last five years. We get hundreds of thousands of these complaints, so it's not just this kind of spoofing, it's a wider problem as well, and it doesn't show any signs, not surprisingly, of lessening, unfortunately.

So, obviously there's a huge amount of defenseless data out there. We have this alphabet-soup approach to privacy protection in the US, and many of you in this room are probably familiar with it. The TL;DR version is the FTC Act, FCRA, COPPA, HIPAA, the Communications Act, GLB, and state laws, right? But there's no comprehensive privacy law, and there's no comprehensive data security law. So that's the atmosphere we're operating in, which is why the FTC doesn't just do enforcement; it does a tremendous amount of education and convening and tries to work broadly to address these issues.

One of the initiatives we've had in the last year is something called "Start with Security," which is really trying to get our message out about what good security practices look like. I probably don't need to tell anybody in this room that a lot of the consumer-facing technology is pretty porous, and in fact many of the people who are creating it probably have no idea what starting with security actually looks like. So we're trying to get that message out as broadly as we possibly can. Some of the biggest problems we're continuing to see are ignored reports of vulnerabilities, slow response times to vulnerability reports, lack of data minimization where appropriate, failure to store passwords securely, lack of training of employees, and lack of proper configuration. So we continue to see a host of problems in that space as well.
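To make the "store passwords securely" point concrete, here is a minimal sketch, assuming a Python-based service and using only the standard library's scrypt function; the parameter values and function names are illustrative assumptions, not FTC guidance:

```python
import hashlib
import os
import secrets

# Illustrative scrypt work factors (hypothetical; real deployments should
# choose values based on current guidance and available memory).
SCRYPT_N, SCRYPT_R, SCRYPT_P = 2**14, 8, 1

def hash_password(password):
    """Return (salt, key) to store instead of the plaintext password."""
    salt = os.urandom(16)  # unique random salt per user
    key = hashlib.scrypt(password.encode(), salt=salt,
                         n=SCRYPT_N, r=SCRYPT_R, p=SCRYPT_P)
    return salt, key

def verify_password(password, salt, stored_key):
    """Re-derive the key from the attempt and compare in constant time."""
    key = hashlib.scrypt(password.encode(), salt=salt,
                         n=SCRYPT_N, r=SCRYPT_R, p=SCRYPT_P)
    return secrets.compare_digest(key, stored_key)

if __name__ == "__main__":
    salt, key = hash_password("correct horse battery staple")
    print(verify_password("correct horse battery staple", salt, key))  # True
    print(verify_password("wrong guess", salt, key))                   # False
```

The point of the sketch is simply that the plaintext password is never stored; only a per-user salt and a slow, memory-hard derived key are kept.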
We're also trying to increase our in-house capabilities and our in-house expertise, to understand how the technology is working and to be a better environment for people like you to bring research to us. We actually have some of our awesome Office of Technology Research and Investigation folks here today, Joe and Aaron, if you want to raise your hands. >> They're wearing shirts like this. >> Yeah, shirts like this! If you want to do an IoT deep dive, Joe and I are actually going to be in the IoT Village later this afternoon at 4 o'clock, so we would love to talk there and also hear about any issues and research that you've already been doing in the IoT space. We also have an internship program, and we're trying to bring more technologists in through that as well.

>> One of the things that the Office of Technology, OTech, is doing is putting together a fall technology series. We have coming up in September a workshop on ransomware, in October a workshop on drones, and in December a workshop on smart TVs. There's information about all of these workshops on our website. We're very interested if you have expertise in these areas, if you have research reports, anything you'd like to share with us; there's information on how you can share that with us either before or after the workshops. If you're in the DC area, please come; the workshops are free and open to the public. If you're not in the DC area, or even if you are, you are welcome to watch our live webcasts of the workshops, and the videos will also be archived. These are good ways for us to collect information on these topics, focused on the security and privacy issues, and to better understand what consumer protection issues there are in these spaces.

Another workshop we have coming up, and this is one that I've been working on a lot, is putting disclosures to the test. My interest in this started when I was doing work on privacy policies, which are a type of consumer disclosure, but I realized that there are a lot of other types of disclosures the FTC is interested in which have some of the same problems that privacy policies do, as far as being long and hard to understand, and we would really like them to be more effective.
And so the purpose of this workshop is to bring in researchers who do usability and user studies and evaluate disclosures, to try to figure out how to make them actually communicate well with consumers. We'll be hearing from people who have done work on privacy notices, but also nutrition labels, drug labeling, and all sorts of other types of disclosures.

>> I thought this was covered incredibly well this morning by Sarah and Mudge in their talk about the cyber independent testing lab that they're putting together: this need for consumers to have more transparency so that they can make educated choices about the products they're buying, the software they're buying, the apps they're buying, and just to understand what some of the risks associated with them might be. So we're trying to really improve and expand our knowledge about the kinds of communications that work with consumers and are effective with them.

"PrivacyCon 2": this is our second annual PrivacyCon this year. This is also a forum for researchers, especially researchers doing privacy research, to come present it to us. We had an incredibly successful first PrivacyCon last year, and we're going to do it again this year. I, first of all, learned a huge amount, which was great. It definitely affects our enforcement, but I think it also really affects the broader policy discussion that we're having on privacy in this country as well. That's coming up in January, and information about how to participate is actually currently on our website.

>> It's currently on our website, and we are seeking research papers in the privacy area right now; the deadline, I believe, is sometime in October. So definitely think about submitting things, and think about coming or tuning in; this should be a really great event.

>> So, research. We're going to wrap up this talk with the research wishlist that Lorrie has been putting together, which I'm really excited about, because I feel like sometimes we have a very abstract conversation with the academic community and the researcher community about what it is that would really help us to understand better. So this is our attempt, and it's going to be an attempt that we keep pursuing, right?
To refine the types of issues that we think it's really going to be helpful for us to understand, and to really solicit research in academia and elsewhere on these kinds of topics. We're also going to make sure that we have time for questions too. So I'll let you run through it.

>> Yeah. So I spent some time working with the OTech folks, and we went and talked to people in every division of the agency about their research needs, so that we could then go out and talk to researchers about ways you might be able to help us. I don't have time to go through the entire wishlist, but we're going to focus on some of the security and privacy items here.

We're very interested in research on how to assess the risks that are posed by breaches and vulnerabilities. We know that there are risks, but we want to look at exactly what metrics we can use to assess them. We're also very interested in protecting consumers from ransomware, from malvertising, and from other risks, so we're interested in research that helps us protect consumers. We're also interested in being able to trace exposed data to specific breaches, and we're looking for research to help us do that. We're looking at research at the intersection of economics and security: how can we make certain types of attacks less profitable, and therefore less desirable for an attacker to pursue? And we're also very interested in protecting consumers from fraud, so we're interested in ways that we can automate the process of spotting fraud and detecting it quickly.

IoT devices are an emerging area, and we're very interested in research related to that. We would like to help IoT device manufacturers and platforms have better security, so we're very interested in research along those lines. We're also interested in defensive measures, so that if there is a problem with an IoT device it won't compromise the entire network. Other emerging trends: there are more and more devices that have sensors in them, including devices for children, Barbie dolls that talk to you and things like that, and we're very interested in how to prevent these devices from compromising consumer privacy and children's privacy. We're very interested in how to isolate critical systems, for example in connected cars.
Bots, that's a new thing: increasingly we have bots and other artificial intelligence, and when consumers interact with these bots we wonder, do they even know that they're interacting with a machine? So we want research on how consumers become aware of that and know about it. Virtual reality is an area where we've seen a lot of progress lately, with a lot more consumer devices available in the virtual reality space, and there hasn't been a whole lot of discussion of the security and privacy issues. You know, it's fun, it's entertaining, but we want to stay out ahead of that and try to make sure that we protect consumers as well.

New tools and techniques: we're interested in a variety of different types of tools. We're interested in hearing about tools that consumers can use to control their personal information, especially across contexts, as personal information is now increasingly shared across contexts; you know, your phone shares with your TV and whatnot. We're also interested in tools that help consumers observe what data their devices are sharing. We're interested in tools that allow us to analyze apps and understand the type of data that they are sharing and associating with third-party libraries. We're interested in algorithms that are used to make decisions about people and may, either on purpose or inadvertently, discriminate against people. We're interested in identifying when cross-device tracking is occurring, and we're interested in tools that will help us identify vulnerabilities in IoT devices. And many, many more! This is just a quick sampling of some of the research areas that we're interested in and that we hope you will come talk to us about if you have insights.

So what happens if you do come talk to us? Our OTech folks will take a look at the research that we receive, they will look across the agency to find the people for whom it is relevant, and they will try to direct it to them, so that we can see whether it's going to be of use to the work that we're doing, or whether we should start a new project in an area that somebody brings to our attention.
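One of the wishlist items above, spotting when cross-device tracking may be occurring, can be illustrated with a toy heuristic: look for third-party domains that show up in the traffic of more than one device in the same household. This is only a rough sketch with invented domain names; real analysis would start from actual traffic captures and a curated tracker list.

```python
# Toy heuristic for possible cross-device tracking: flag third-party
# domains contacted by more than one device in the same household.
# All domain names below are invented examples, not real measurements.

FIRST_PARTY = {"example-streaming.com", "example-mail.com"}  # sites the user visits on purpose

device_traffic = {
    "phone":    {"example-mail.com", "tracker-a.example", "ads.example-net.com"},
    "smart_tv": {"example-streaming.com", "tracker-a.example", "ads.example-net.com"},
}

def third_parties(domains):
    """Drop the domains the user knowingly interacts with."""
    return {d for d in domains if d not in FIRST_PARTY}

# Intersect the third-party domains seen on each device.
per_device = [third_parties(d) for d in device_traffic.values()]
shared = set.intersection(*per_device)

print("Third-party domains seen on every device:", sorted(shared))
# A large, stable overlap like this is one weak signal that a single
# entity may be linking the household's devices; it is not proof.
```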
>> Yeah, and I'd just say that sometimes you bring something to us and then we actually end up bringing a case, so it can result in a lawsuit against a company as well; that's happened some of the time. So, that actually segues to the "we want you" slide, the creepy Uncle Sam. [laughter] You know, if there's one takeaway we want to make clear here, it's that we can't solve all of the challenges that are going to confront consumers in a hyperconnected environment without a lot of partnership, particularly with the security researcher community. So we're trying to do the most that we can do to develop those partnerships and have them inform not only our enforcement mission but also the research that we do, the studies that we conduct, the workshops that we conduct, and the ways in which the FTC tries to make sure that policy makers and others in the broader space are seeing these issues that might harm consumers.

>> Yeah. So, we have set up the email address research@ftc.gov; please use that to send us research, and it will be examined by our folks in OTech. And for pointers to all of the workshops that I mentioned and all the other things I talked about here, please take a look at ftc.gov/tech. That's the tech blog, and there's lots of other interesting stuff there too, so check it out! I think we're ready for questions.

>> Yeah, and you can follow us on Twitter too; you're @lorrietweet... >> Oh, yeah, I'm @lorrietweet. >> And I'm @TMcSweenyFTC. [applause] So, thank you. Awesome, we left plenty of extra time for questions, so we'd love to field some questions. I think we have, what, five or ten minutes?

>> Hi, Greg Reef. You mentioned discriminatory algorithms that you are concerned about. We know what was in the news, I think it was two or three months ago, with Facebook and their newsfeed. There's also been, recently, the other major social media site, Twitter, banning people because they did a bad review of the Ghostbusters movie and had their account banned; from Breitbart News it was Milo.
There's also been other censorship against talk show hosts for repeating what happened in Germany; I won't mention religion because I don't want to be censored here [laughter], and in one attack he lost his Facebook account. What do you do for situations like that, or is that in your swimlane? Thanks.

>> So some of that raises a host of really interesting, broader First Amendment concerns. You know, I think one of the things we're trying to really focus on, when we're thinking about algorithms and data, and especially machine learning on top of all of that, is the extent to which choices are being curated and offered to consumers in such a way that might limit their choice or even result in a disparate impact on them. One of the things that we haven't gone into yet is the extent to which those algorithms are kind of manipulating the overall news that people are getting, which is, I think, your question. But we are interested in the extent to which they are impacting the credit offerings that consumers are getting, the housing offerings, the employment offerings, some of those pure economic choices. We do have laws on the books, comfortingly: civil rights laws, equal opportunity laws, right? They already protect people in the brick-and-mortar world from this kind of discrimination, but one of the things that is really hard in the increasingly digital world is figuring out when that's even happening at all. And that's some of the work that we really need help with right now. Yeah?

>> I heard a lot of emphasis in this presentation on regulation, basically, or actions against companies, but I feel that more and more the government is becoming a servicer of consumers. It used to be that you would go into an office and deal with someone, and that was a real interaction, but now the services are so broad and dynamic, because government is trying to offer electronic services at the forefront, and I'd argue that they're not necessarily the most expert at it. Data breaches and such, these are all things that apply to the government as much as to private companies. So, what's your regulatory role or involvement with government services?

>> Yeah, well, we are the government [laughter], but we don't actually regulate the other parts of the government, so that's actually good for us.
[laughter] That said, I think you do see this administration taking action to really try to improve both the privacy and security talent and policies throughout the government. So the question, for the people in the back, was "What are you doing, FTC, about the government and its problems?" The short answer is that we're focused on protecting consumers, but I think we are collaborating with the other parts of the government. We have our own Chief Privacy Officer and our own Chief Security Officer, and we're very mindful of these issues. And I think one of the other things you really see happening in this administration is a government-wide emphasis on bringing technologists into government. That's something the FTC has been a real leader on; we've actually been doing it for a number of years, because what we recognized is that when we are dealing with protecting consumers in an increasingly digital world, we need technologists to help us understand what is even happening in that world. Which is why we have people like Lorrie, but also why we've expanded to develop an entire office that is staffed by researchers and technologists as well. I think we need to grow those resources, but we need to do it throughout the government as well, and when we're having big debates, like the encryption debates, we need to make sure technologists are at the table for those debates, because a lot of the time the policy talk in Washington isn't so well informed. Probably no one here is surprised to hear that. [laughter]

>> Yeah, and there is now a government-wide privacy council, which the FTC participates in actively, and it is helping to educate other agencies.

>> Early in your talk you mentioned Google and Facebook and how you caught them for something, changing their end-user agreement. In these EULAs it often says that they can change the EULA terms any time they want to. So then how can you, kind of, accuse them of unfairness if the user has agreed to those terms?

>> Mm-hmm, yeah! So this is a great question. The question is, "If you have a user agreement that covers everything, how can you then come back and bring a deception case about something that's sort of covered in the 60- or 90-page user agreement?" Well, the answer is: context matters.
And I think what we're trying to make very clear in our FTC enforcement is that if users share information under one set of rules, in a way that makes sense given the kind of stuff they're doing with an app, then that's covered by the user agreement. If you do something that's super tricky, or really impossible for consumers to figure out, or you change how that information is being handled without really giving them a clear explanation of what those changes are, or if you set up your thing to defeat what their settings were to begin with (that's a case we just brought, called InMobi), then we actually can bring a deception case in that situation.

>> Hi, last fall you had a workshop about cross-device tracking... >> Yes! >> And I know you sent some warning letters this spring to developers that were using a toolkit that might be used for cross-device tracking. It seems like this is an area that is probably going to grow rather than shrink, so is there additional activity going on at the FTC to continue to track this? And what are you doing in the future?

>> Yeah, so thank you! This is a question about cross-device tracking, which is an issue we're definitely trying to understand a lot more clearly, and it has already informed our enforcement efforts a little bit, right? So the InMobi case, which I was just talking about, is a case where we had a mobile ad company, an incredibly widely used company, that said it would only track you if you opted in. In fact it tracked you whether you opted in or not, and had really created a whole system to get around the opt-in to begin with, in order to track consumers using geolocation and other things. So we said that that's unfair and deceptive. You also noted the SilverPush letters. So the SilverPush technology, for those of you that didn't see our letter, because I get it, we're, you know, out there in Washington...
[laughter] We issued a warning letter to app developers saying, in effect, "If you've installed SilverPush..." SilverPush is a piece of software that can use a phone's microphone to listen for the audio beacons coming off of advertisements on TV. Basically, it's technology that allows them to gather what someone's viewing habits are based on what their telephone microphone is picking up from these audio beacons that are embedded in the advertisements. We said that we were very skeptical that this type of technology should be included in apps. So I think that should serve as a pretty bright-line warning letter that we're worried consumers really don't have adequate notice and transparency about what that tech is.

We're also looking at many of the ways in which people are being passively... I could say "unveiled," that's a bit loaded [laughter]... passively having information gathered about them. Last year we brought a case called Nomi, which involved a company that was tracking people's locations in retail locations. They had said they would offer an opt-out, but they didn't in fact compel the retailers using the technology to offer the opt-out, so there was no opt-out. And there's no way a consumer can know that's happening, really, unless there's some kind of clear notice that it's occurring and some kind of choice. So there we said, look, if you're going to say you're going to offer an opt-out, you have to really offer the opt-out. Now again, there's no comprehensive privacy law in this country, so there's nothing that says that kind of thing can't happen without consumers choosing or having a choice about it. It's an area that we're continuing to monitor very carefully.

>> Springboarding off of the previous questions about consumer privacy and the transparency that goes on between other government agencies: is your commitment to transparency documented if, say, an exploit is discovered and, say, the NSA wants to hold onto that exploit for some use?

>> Hmmm... do you want me to take this one? Probably... [laughter] >> Yeah, go for it. >> So... [laughter] We are a civil enforcement agency, and I could imagine situations in which we wouldn't be in that dialogue,
for a variety of reasons. If something is disclosed to us, what we then do is try to understand whether we have enough facts to actually bring a case, using our existing authorities, about the practices that led to it, especially if it has been exploited. In some cases we have brought cases when something was disclosed and the recipient didn't really react at all, right? So if you don't have a mature disclosure program in your company to receive exploit reports and respond to them, that could be a factor in our analysis of whether you have reasonable data security practices in your company. But I'm not really answering your direct question, which is the broader national security question, because...

>> The broader civil liberties question is applicable too, because what if I discover an exploit and then I get slapped with a notice not to mention it because the government wants to use it for something else? What's my protection, or the protection of consumers?

>> So this is an area that I personally think is one we really need to work on: the maturity of our laws in the US and how we're handling this. Because the FTC thinks that we need to have really good, clear partnerships with security researchers, so that people who are doing the work on behalf of consumers, to help us understand how the technology is actually working, are able to do that work without fear of reprisals. Now, you understand this is a balancing act, right? There are bad actors out there, and we want to protect against the bad actors, but I think it's part of a broader conversation that we need to have. And the FTC, maybe not all of the FTC, I'll say at this point I'm speaking on behalf of myself [laughter], you know, I think some of us really feel strongly that we need to modernize how we're handling the Computer Fraud and Abuse Act and some other things, so that we can have a more mature system in place for how researchers, and the exploits they find, are handled when things are disclosed.

>> Hi, you described how the FTC is trying to get a handle on privacy risks, IoT, virtual reality, and new technologies, but can we talk more about what you're doing with routers? I know about the Asus case, but routers are so important to consumers.
Many of them don't realize it, but the router is the gateway to their private networks, where everything is shared, and the security practices of router vendors have been so bad for so long, and many of the same vendors are now doing IoT as well. So what are you doing there to convince vendors to improve those practices that have been going on for so many years?

>> So, for starters, we're bringing cases. I don't want to talk about any pending cases, but I would say we take the security of routers, and the claims being made around them, very, very seriously, and we are taking a careful look. I don't know if you want to add to that? >> Uhm, yeah... >> Yeah... [laughter] >> Alright...

>> Yeah, alright, I think we're out of time. So, again, if there's one takeaway, it's that we really want to forge a good partnership; we want to hear from you, and we want to participate with you. If you think there are things that we're missing, we would love to hear about it and add to our call-for-research list. So thank you for your attendance and your time, and happy DefCon; this was awesome. >> Yeah, thanks for coming! [applause]