We had a problem last night with some after-hours parties when we converted to the DJ zone, and apparently the petting zoo was full of strippers. So, if you have children, don't bring them to the after-hours parties. That's not a good idea.

This is going to be the last talk of the day. It's a 20-minute talk with a little bit of Q&A afterwards. So, the mic is here. Don't block the exits. The seat belt works: put the big end and little end together. We've got a few minutes between now and 6:00, so we're going to wait a few minutes to see if anybody shows up.

How many of you, is this your first DEF CON? Raise your hand. Two DEF CONs? How about you've been here five times? Fours? Great. Threes? Okay. So for you guys here for the first time, how is it? What needs improvement? Make sure it's legible. At one of the talks in here yesterday, they were sending up ideas written with a crayon, and they were happy. We'll start it off and Mr. Jim will take care of it. So, 20 minutes. Then we get to start the party.
JAMES DENARO: Thanks for coming to the last talk of Saturday evening. I'm fully aware of the fact that I'm all that is standing between you and a substantially better time. (Laughter) So we're going to do this quickly. This is basically a turbo-talk format: 20 minutes, and we're going to move quickly. All these slides will be made available online, so if you miss something, you can catch it on the slides later. Contact info will be on the last slide, with all the different parameters, so feel free to reach out to me afterwards. We'll be doing a little bit of Q&A after this for anyone who's still here at 6:20.

The topic for today is how to disclose or sell an exploit without getting in trouble. I'm Jim Denaro, an intellectual property attorney based out of Washington, D.C., and I focus my work on security technologies. Before I went to law school, I used to spend far too much time tweaking around in MacsBug on my PowerPC, and I figured there was no better way to keep doing that than to do this. So here we go.
Just because I'm an attorney and this does have some legal component to it, although this is not a law talk, really, I have to give the standard disclaimer that this presentation is not legal advice about your specific situation or your specific questions. Even if you ask a question, we're still talking about hypotheticals. If we develop an attorney-client relationship, then we can give legal advice; this talk alone does not create an attorney-client relationship. We can maybe do that later.

Just as a quick overview of what we're trying to accomplish in the next 20 minutes: we're going to cover the types of risks that are being faced by researchers; some risk mitigation strategies that researchers can take to try to reduce those risks; some options for disclosing a vulnerability that may have less risk; and then some of the risks that are associated with selling an exploit.

The overall goal of this is to make yourself a harder target. If someone asks you, well, can I be sued if I do this, or if this happens? The answer is always yes. You can always be sued by anybody for anything at any time. The only question is who's going to win?
And the goal is to make it more likely that you will win, which disincentivizes someone from actually suing you in the first place.

So let's start out with just some great examples of the kinds of research activities that might get somebody in trouble. These are generally real-life cases. You found out how to see other people's utility bills by changing the HTTP query string. I talked to somebody at a party the other night who had done exactly that, and he was wondering what to do about it. You discover your neighbor's Wi-Fi is not protected. How did you find that out? You broke the crypto that's protecting some media that you had. It's getting more serious; there's actual money at stake. Or maybe you wrote a better remote access tool. That sounds like you might make a lot of money.

So many of the same risks apply, surprisingly enough, whether you're just looking at changing HTTP strings or whether you're actually taking apart a DVD. So, in general, we're talking about techniques. I've sort of defined it here.
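The utility-bill example above can be sketched in a few lines. This is a hypothetical illustration only; the URL and the `account_id` parameter name are invented for the sketch:

```python
# Hypothetical sketch of the utility-bill example: an insecure direct
# object reference, where the account ID in the query string is the only
# thing standing between you and someone else's bill.
from urllib.parse import urlparse, parse_qs, urlencode, urlunparse

def change_account_id(url: str, new_id: int) -> str:
    """Return the same URL with the account_id query parameter swapped."""
    parts = urlparse(url)
    query = parse_qs(parts.query)
    query["account_id"] = [str(new_id)]
    return urlunparse(parts._replace(query=urlencode(query, doseq=True)))

# Your own bill...
mine = "https://utility.example.com/bill?account_id=1001"
# ...and the tiny change that may already count as access "without
# authorization" under the CFAA, even though it's just a URL.
theirs = change_account_id(mine, 1002)
print(theirs)  # https://utility.example.com/bill?account_id=1002
```

The point of the sketch is how small the act is: no banner was clicked through, no credential was bypassed, and yet the legal exposure turns entirely on whether that request was "authorized."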
It's a broad spectrum: everything from a technique that might be used for denial-of-service attacks to something that's just, you know, more akin to sort of investigatory web browsing.

First, when is there risk to a security researcher? There are three general areas where we see the risks starting to show up. One, there can be a threat of legal action before you go to a conference or make the disclosure; there are examples listed here. Two, you might be the recipient of a legal action seeking an injunction barring you from disclosing something before a conference. So now we've moved from merely saber-rattling to an actual lawsuit being filed against you. And then there's the possibility of a legal action being initiated against you after you make the disclosure. And these are all real examples. Declan McCullagh of CNET and his colleagues have written articles; I recommend them to you. Some of these seem to happen around Black Hat and DEF CON on a regular basis.

That's when it can happen. Your number one concern is typically going to be the Computer Fraud and Abuse Act.
You've probably heard about that lately, perhaps here or at other conferences. The main issue is that it prohibits access without authorization, or exceeding authorized access. The two times you're likely to run into possibly exceeding authorized access, or acting without authorization, would be in the investigatory phase of working on whatever technique it is that you've got, and when you actually create a tool that performs whatever this technique is. You might actually have a problem where that tool does the act that is prohibited.

So, in light of how much everyone's talked about how vague this notion of authorization in the Computer Fraud and Abuse Act is, I've created a handy checklist to figure out if you might have a Computer Fraud and Abuse Act problem. (Laughter) There we go. Are you connected to the Internet? Probably. Are you accessing a remote system? Probably. Do you have permission to access that system? This is the real hard question.
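The checklist above can be rendered as a minimal sketch. It is tongue-in-cheek in the same spirit as the slide, not a legal test; the function name and the three-question structure are just a restatement of the talk's checklist:

```python
# A tongue-in-cheek rendering of the talk's CFAA "checklist."
# Illustration only, not a legal test: the third question, permission,
# is the one that in practice nobody can answer cleanly.
from typing import Optional

def might_have_cfaa_problem(connected_to_internet: bool,
                            accessing_remote_system: bool,
                            have_permission: Optional[bool]) -> bool:
    """Return True if the checklist says you might have a CFAA problem.

    have_permission may be None, meaning "it's really hard to know",
    which the talk treats as the common (and risky) case.
    """
    if not (connected_to_internet and accessing_remote_system):
        return False  # the statute needs access to a computer
    # No clear permission (False) or unclear permission (None): risk.
    return have_permission is not True

# The typical researcher: online, touching a remote system, permission unclear.
print(might_have_cfaa_problem(True, True, None))  # True
```

Notice that the only branch that returns a comfortable answer is the one where permission is unambiguously True, which is exactly the speaker's point.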
It's really hard to know if you have permission. If you saw a banner go by that said you don't have access, you probably don't have access. But there are a lot of cases where it's not so clear. And that's really where you have the Andrew Auernheimer situation, where he's querying an API on a regular basis. There's no banner or clear prohibition against doing that; it was a public-facing API, after all. There are real risks in figuring out whether or not you have permission. But that's really all it takes.

Unfortunately, it's not just about what you do. The Computer Fraud and Abuse Act is also about what your friends do. I believe the risk of being caught up in a conspiracy to violate the Computer Fraud and Abuse Act is most certainly enhanced by the prevalence of social media today. If you're on Twitter or another easy-to-use social media platform, talking to your friends about how you might do something, or answering questions about how you might do a certain thing with a technique that you've developed, you're starting to head down the road of conspiracy. Conspiracy typically does require an overt act in order to really fulfill the conspiracy, and typically just discussing something with someone does not provide one. But, if you start providing technical support for something that someone else is doing, you're really definitely increasing the risk of being caught up in a conspiracy to violate the Computer Fraud and Abuse Act, if not violating it yourself.

We've got examples where the Computer Fraud and Abuse Act has been applied. That's how we see how it's being applied: we look at examples, compare what we're doing to things that happened to other people in the past, and see how close those comparisons are. And since we're in Las Vegas, we absolutely have to talk about the case of Nestor. Nestor was really into video poker, and he liked to play and play and play. He got really good at it.
He played it so much that he discovered a bug in the video poker software that enabled him to play one type of game, bet a bunch of money in that game, and switch to another game, and a multiplier would be applied to his bet, so when he won, he got this enormous payout. He figured out how to reproduce this bug, and he and his friends were doing it and getting a lot of money. Eventually, as these stories always end, he got caught, and he was charged with violating the Computer Fraud and Abuse Act.

He was looking at the computer; as we saw, it's mostly about unauthorized access, or exceeding the authorization that you had. And it's hard to imagine that here: he didn't access the firmware or take the game apart. He sat there and pushed the buttons on the machine. How you could exceed authorized access to a video poker machine is mind-boggling, but those charges were assessed against him. Ultimately, the Department of Justice did not pursue those charges; they went ahead with other fraud charges. But, nonetheless, for some period of time, he was facing the Computer Fraud and Abuse Act for doing exactly that.
It's also worth looking at the tragic case of Aaron Swartz, who spoofed his MAC address to download journal articles. That was a Computer Fraud and Abuse Act case. Andrew Auernheimer allegedly conspired to run an automated script to plug in identifiers for iPads and get e-mail addresses; he didn't even do it himself. He's doing several years in federal prison for that. It's also worth noting that the Department of Justice has said in its manual that conspiracy to hack a honeypot can violate the Computer Fraud and Abuse Act. There's no end to the sorts of things that can violate the Computer Fraud and Abuse Act.

So you're looking at a situation where the Computer Fraud and Abuse Act acts as an ex post facto law: the Department of Justice is able to look at what you did after the fact, and if they don't like it, or they don't like you for whatever reason (you may be trollish, for some reason), you're likely to be on the wrong end of a Computer Fraud and Abuse Act prosecution. And then there's the company: whoever the target of the exploit is can also pursue whoever accessed the system without authorization.
The question, then, is: is there anything we can do to try to reduce our chances of being on the wrong end of this type of lawsuit? Well, let's take a quick look; we don't want to go too far into the statute. This is not a continuing legal education conference. But let's look at the statute and see if there are key words we can at least identify. Here we are: "whoever having knowingly accessed a computer without authorization." Another part: "whoever intentionally accesses a computer without authorization."

So one of the things you can do is to try to avoid unintentionally creating knowledge and intent. It's a little bit hard to do this for yourself if you intend to do something, but you can avoid doing it in connection with other people. So, for example, I would suggest that you do not direct information about how to use some kind of technique to someone that you suspect, or have reason to know, is likely to use it illegally. Be careful when providing technical support for a new technique you've developed. If I were your lawyer, I would advise you not to answer that tweet if someone is tweeting in asking about how to make something more effective, perhaps.
The next slide is more detailed: some more approaches that you might take. Don't provide information directly to individuals, especially if you're not sure who they are or what they might be up to. Consider just posting things on a Web site only. Do not post information to forums where you suspect, or that are known, to generally promote illegal activity. If you publish it on your own Web site or have control of the post, consider disabling comments, so you don't have a situation of people discussing potentially illegal uses of your technique. Lastly, don't maintain logs. (Laughter)

So that's enough of the Computer Fraud and Abuse Act for now. There's not a lot you can do about it beyond just being careful. Let's move on to temporary restraining orders. This is particularly timely, actually, because you may have read the story about the VW Group and the Megamos Crypto encryption that was used on their immobilizers. European security researchers figured out how to bypass, or discovered a flaw in, the encryption used on the vehicle immobilizers on Porsche and Audi and Bentley vehicles.
And they were going to present this at a conference in Washington, D.C., in a few weeks, and they got themselves slapped with a temporary restraining order preventing them from making the disclosure at the conference. How did this happen, and how can we keep it from happening again? We've seen this here at Black Hat and DEF CON; talks have been stopped by a temporary restraining order. Let's take a quick look at the factors that courts look at when deciding whether or not to grant a temporary restraining order to prevent a researcher from disclosing information about a vulnerability. Will the VW Group suffer irreparable harm if the TRO is not issued?

Good evening, sir. Who knows how this works? Is he a new speaker? Yes. It's really hard to be selected as a speaker at DEF CON. You need to present talks to eventually have yourselves up here, right? I've been drinking all day. A big round of applause for Jim. All right. One more order of business. We need a new person, his first time at DEF CON. First hand up right there. Red shirt, we've got extra, let's get more.
Two people. First hand up over there. There we go. Cheers to our new speaker. Let's see if he can pick up where he left off. We're going to work on new material for tomorrow. Thank you.

JAMES DENARO: All right. So that was great. Thank you. So, just a quick look at some of the factors that the court is going to look at when deciding whether or not to grant a temporary restraining order to someone like the VW Group, who wants to prevent something from happening at a conference. Will they suffer irreparable harm? They've got an embedded system. People are going to figure out how to break it, and it's impossible to fix in any short period of time. Probably irreparable harm; money isn't going to fix it. That goes in the VW Group's favor. Will there be harm to the researcher? Your paper got delayed? You couldn't put in some part of the algorithm; you had to pare back.
It's hard to see that as a huge harm to the researcher. You might feel bad about it, but compared to the huge sums of money the VW Group is going to have to pay to fix this, it's not going to look good for the researcher there.

Public interest: this is a fun one, because we might think the public interest clearly favors disclosing the vulnerability so it can be fixed. The court is probably going to go the other way on that, and decide that, really, preventing the risk of all these BMWs or Porsches and Bentleys being stolen is much greater, much more in the public interest, than having your obscure crypto talk go forward.

The last factor is the likelihood the requester will ultimately prevail. This is the one we need to focus on. Because the VW Group has to have a cause of action; they can't just say "we don't like it." They have to say: here's why you need to stop, and it's because you did something bad to us. And in the case of the VW Group, with the Megamos case, and also in the case of the Cisco disclosure, what we had was the use of copyrighted material, and that was the hook that got the TRO to issue. So the obvious advice, then, is to avoid the use of copyrighted material.
So, if you include source code or object code from whatever it is that you're working on, that gives leverage to whoever wants to stop you from disclosing it. There is a fair use exception if you use pieces of code, but that's a case-by-case analysis. You can't just say "this is fair use." It depends on how much you use and other factors that are very specific to what's actually going on in your case. So just try to avoid it if you can. It may not be possible, but avoid it to the extent you can.

Also avoid darknet sources for wherever you're getting this stuff. In the Megamos case, the court talked about the fact that the researchers obtained some information about how the Megamos system worked through some sketchy channels. I don't recall it saying exactly where they got it, but it was some sort of BitTorrent, P2P-type thing, wherever it came from. It wasn't from the VW Group or Megamos.

So another thing you want to do is be aware of pre-existing contractual relationships that you as a security researcher might have with the target of whatever it is you're working on.
These contractual agreements could come in the form of terms of service, end user license agreements, nondisclosure agreements, or employment agreements. An end user license agreement might well have provisions in it that prohibit reverse engineering the software, for example, and that might be what you're doing as part of your exploration into your technique, and that could give someone leverage to try to stop you: oh, you've breached this. Nothing is for certain; it's just an argument they have, and there's not much you can do about it. Pretty much every piece of software you have is going to come with some agreement like this, assuming you've come to it legitimately. There's not a whole lot you can do about that, but you at least can be aware of the risk, if nothing else.

How far you need to go to mitigate the risk depends on the techniques you've used in your research. If you've done things that clearly look like some of the examples of what people have done that's gotten them prison time, that's something you need to be careful of, and maybe take more aggressive mitigation techniques in order to perhaps hide some of the information about what you're doing.
So, for example, in the Megamos case, if no one had identified that it was the VW Group whose crypto system had been compromised, the VW Group would not have been able to go after a temporary restraining order against the researchers. So perhaps there's an opportunity here for the conference-going community to create a track where people could present things that get a little asterisk or something next to them: this is something that had to be kept quiet. Confidential disclosure; trust the review board; this is going to be really cool, but we can't tell you what it is, because then you won't get to hear it. So maybe that's one approach.

I'd like to talk about some of the ways you might make a disclosure that are relatively less likely to get you in trouble. You can obviously disclose to the responsible party. That's what we'd like to do; that's what the responsible disclosure paradigm is all about. You have a problem with their system. This is actually, unfortunately, relatively high risk. And the risk scales with the questionableness of whatever technique it was that you used to find out about this vulnerability.
425 00:22:02,501 --> 00:22:03,834 So, if you connected to a remote system 426 00:22:03,834 --> 00:22:06,999 you didn't have permission to access, and that's how you did it, 427 00:22:07,292 --> 00:22:09,751 it may not be a great idea to tell them about it, 428 00:22:09,751 --> 00:22:13,959 because if they don't like it, they've got an action against you. 429 00:22:13,959 --> 00:22:16,501 If you've inconvenienced them, that's a problem for you. 430 00:22:16,501 --> 00:22:19,959 You might think you're doing them a favor; they might not agree that you're 431 00:22:19,959 --> 00:22:21,959 doing them a favor. 432 00:22:22,083 --> 00:22:23,959 If you're able to submit anonymously 433 00:22:23,959 --> 00:22:26,999 to whoever the vendor is, or whoever the responsible party is, 434 00:22:26,999 --> 00:22:28,751 that's great. 435 00:22:29,626 --> 00:22:32,999 Depends how good your opsec is, I guess. 436 00:22:33,292 --> 00:22:34,459 A lot of times you think you're anonymous 437 00:22:34,459 --> 00:22:36,083 but you're not as anonymous as you thought you were 438 00:22:36,083 --> 00:22:37,959 or hoped you were. 439 00:22:37,999 --> 00:22:41,999 That's a risk you need to consider yourself. 440 00:22:42,417 --> 00:22:46,209 If you're in a bug bounty, maybe you're at less risk. You can disclose 441 00:22:46,209 --> 00:22:50,083 to a government authority, perhaps, though maybe you don't believe it 442 00:22:50,083 --> 00:22:52,999 will ever get to the vendor. 443 00:22:53,417 --> 00:22:55,834 But again, if your techniques were perhaps 444 00:22:55,834 --> 00:22:59,501 questionable, you might not necessarily want to be submitting it 445 00:22:59,501 --> 00:23:01,417 to a government. 446 00:23:01,417 --> 00:23:02,999 With a governmental authority, 447 00:23:02,999 --> 00:23:06,626 you may have an interest in keeping your identity anonymous. 
448 00:23:07,083 --> 00:23:10,334 You may try to submit anonymously to the government, 449 00:23:10,334 --> 00:23:15,083 but I don't know how much we can really trust that any more. 450 00:23:19,999 --> 00:23:23,667 Unfortunately, this is a legal talk, somewhat of a legal talk, and you almost 451 00:23:23,667 --> 00:23:27,709 never get to a legal talk where someone will tell you something for sure, 452 00:23:27,709 --> 00:23:31,999 absolutely 100% you will not get in trouble if you do this. 453 00:23:32,125 --> 00:23:34,918 But fortunately, we are in a case here where there 454 00:23:34,918 --> 00:23:37,584 is one group of people who really don't have 455 00:23:37,584 --> 00:23:41,999 to worry about getting in trouble with the Computer Fraud and Abuse Act 456 00:23:41,999 --> 00:23:46,417 when they disclose a vulnerability, and here they are. 457 00:23:46,417 --> 00:23:48,918 It's okay to disclose if you're one of these people. 458 00:23:49,083 --> 00:23:50,417 Although she really should not have been 459 00:23:50,417 --> 00:23:53,999 hacking the palace computer, we're not going to hold that against her. 460 00:23:56,083 --> 00:24:00,876 So we're thinking about ways that we might be able 461 00:24:00,876 --> 00:24:05,501 to leverage opportunities for security researchers 462 00:24:05,501 --> 00:24:12,125 to make disclosures while keeping the risk as low as possible. 463 00:24:15,751 --> 00:24:21,417 So we're working on creating a pilot program where attorney-client 464 00:24:21,417 --> 00:24:27,584 privilege can be leveraged to hide the identity and the techniques used 465 00:24:27,584 --> 00:24:32,584 by a security researcher in making a disclosure. 466 00:24:32,626 --> 00:24:34,751 So the concept works like this. 467 00:24:34,751 --> 00:24:36,292 The researcher would disclose the vulnerability 468 00:24:36,292 --> 00:24:39,999 to a trusted third party, which would be an attorney. 469 00:24:40,125 --> 00:24:41,959 Only to the attorney. 
470 00:24:42,250 --> 00:24:45,999 It's critical that this be a completely confidential disclosure, 471 00:24:45,999 --> 00:24:48,959 to maintain the attorney-client confidentiality 472 00:24:48,959 --> 00:24:54,709 of that disclosure, so that other entities on the outside can't get to it. 473 00:24:54,709 --> 00:24:58,125 The trusted third party does not publish the vulnerability on behalf 474 00:24:58,125 --> 00:25:01,584 of the researcher; however, the trusted third party does disclose 475 00:25:01,584 --> 00:25:03,999 the vulnerability to the responsible party, 476 00:25:03,999 --> 00:25:06,709 whoever has this vulnerability. 477 00:25:07,125 --> 00:25:12,999 The researcher remains anonymous through the entire process. This 478 00:25:12,999 --> 00:25:17,999 is possibly of use if there's no better option. 479 00:25:17,999 --> 00:25:21,918 It's a little bit cumbersome, and there are some side effects, chiefly 480 00:25:21,918 --> 00:25:24,375 that because the researcher remains anonymous, 481 00:25:24,375 --> 00:25:29,083 they don't get public credit for whatever the research was. 482 00:25:29,375 --> 00:25:33,125 But it is one possible way for the researcher to be able 483 00:25:33,125 --> 00:25:36,292 to disclose and remain about as anonymous 484 00:25:36,292 --> 00:25:39,292 as one could possibly get. 485 00:25:39,292 --> 00:25:40,999 So this is a pilot program we're currently 486 00:25:40,999 --> 00:25:42,501 working on. 487 00:25:42,501 --> 00:25:44,876 We're working out the bugs right now. 488 00:25:44,959 --> 00:25:47,792 If anyone is interested in talking to us further about it, 489 00:25:47,792 --> 00:25:52,250 we definitely welcome your input, and please see me afterwards. 490 00:25:52,834 --> 00:25:55,999 We should now turn to selling, very quickly. 491 00:25:55,999 --> 00:26:00,083 Right now there is no law in the U.S. 492 00:26:00,083 --> 00:26:02,292 that prohibits the selling of an exploit. 
493 00:26:03,083 --> 00:26:07,167 And that is a situation that is probably likely to change 494 00:26:07,167 --> 00:26:11,667 in the not too distant future, but for now there's not too much 495 00:26:11,667 --> 00:26:16,584 to worry about, unless your techniques in developing your exploit were 496 00:26:16,584 --> 00:26:20,292 a problem, in which case you still have a problem. 497 00:26:20,292 --> 00:26:22,167 But the fact of the sale itself is not something that's going 498 00:26:22,167 --> 00:26:24,125 to get you in trouble. 499 00:26:24,459 --> 00:26:30,167 However, there's a lot of focus on this market now. 500 00:26:30,167 --> 00:26:31,876 And here are some recent articles from May 501 00:26:31,876 --> 00:26:36,792 of 2013: "Booming zero-day trade has Washington experts worried." 502 00:26:36,999 --> 00:26:39,292 My favorite: "The U.S. 503 00:26:39,292 --> 00:26:43,999 Senate wants to control malware like it's a missile." This stuff is dangerous. 504 00:26:44,292 --> 00:26:46,999 So every year, Congress has to pass 505 00:26:46,999 --> 00:26:51,417 the National Defense Authorization Act, which sets the budget for DoD 506 00:26:51,417 --> 00:26:56,501 and includes a bunch of other stuff that gets stuck in there. 507 00:26:56,501 --> 00:27:00,918 And this year, well, for 2014, the Senate version hasn't been 508 00:27:00,918 --> 00:27:02,751 passed yet. 509 00:27:02,751 --> 00:27:04,876 It's still in Congress. 510 00:27:04,876 --> 00:27:09,667 The Senate version has provisions that seek to begin the process 511 00:27:09,667 --> 00:27:13,709 of regulating the sale of exploits. 512 00:27:14,501 --> 00:27:18,083 The House version of the bill doesn't have this. 513 00:27:18,083 --> 00:27:19,834 This is still just in the Senate. 514 00:27:19,834 --> 00:27:22,083 But, you know, I think this is where it's headed. 
515 00:27:22,083 --> 00:27:25,250 The bill notes that the president shall establish a process 516 00:27:25,250 --> 00:27:28,709 for developing policy to control the proliferation 517 00:27:28,709 --> 00:27:31,209 of cyber weapons through a whole series 518 00:27:31,209 --> 00:27:33,626 of possible actions: 519 00:27:33,626 --> 00:27:37,334 export controls, law enforcement, financial and diplomatic engagement, 520 00:27:37,334 --> 00:27:38,999 and so on. 521 00:27:39,083 --> 00:27:45,584 The Senate Armed Services Committee, which had the bill before it was passed 522 00:27:45,584 --> 00:27:51,999 to the rest of the Senate, had some commentary on this. 523 00:27:52,083 --> 00:27:54,584 And they referred to dangerous software, 524 00:27:54,584 --> 00:28:00,083 a global black market, a gray market; it starts to look really bad. 525 00:28:00,209 --> 00:28:02,834 But they note that there needs to be a carve- 526 00:28:02,834 --> 00:28:05,250 out for dual-use software 527 00:28:05,709 --> 00:28:07,292 and pentesting tools. 528 00:28:07,667 --> 00:28:10,834 In Europe, the European Parliament recently 529 00:28:10,834 --> 00:28:15,501 passed a directive; they're a little ahead of us. This prohibition 530 00:28:15,501 --> 00:28:20,999 on the sale of "tools," as they call them, basically exploits, will be required 531 00:28:20,999 --> 00:28:23,999 to be enacted by all of the member states 532 00:28:23,999 --> 00:28:28,709 in short order, and this provision prohibits the production, sale, 533 00:28:28,709 --> 00:28:31,709 procurement for use, import, and distribution 534 00:28:31,709 --> 00:28:36,584 of these tools that can be used to commit the enumerated offenses, 535 00:28:36,584 --> 00:28:40,417 which is pretty much all the bad things you can think 536 00:28:40,417 --> 00:28:43,209 of doing with a computer. 537 00:28:43,209 --> 00:28:46,792 However, there's an exception, a very important exception. 
538 00:28:47,083 --> 00:28:49,999 For tools that are created for legitimate purposes, such 539 00:28:49,999 --> 00:28:54,125 as testing the reliability of systems. And it further notes that, 540 00:28:54,125 --> 00:28:57,999 in order to violate this law, you need to show 541 00:28:57,999 --> 00:29:03,083 a direct intent that the tools be used to commit some of the offenses. 542 00:29:03,501 --> 00:29:06,751 So in both cases, both in the U.S. 543 00:29:06,751 --> 00:29:08,667 and in Europe, we're seeing this trend. 544 00:29:08,667 --> 00:29:12,876 Well, it really goes back to the definitional problem: how do we 545 00:29:12,876 --> 00:29:16,626 define what an exploit is, and how do we make sure that a 546 00:29:16,626 --> 00:29:22,125 legitimate tool can still be bought and sold? This is all prospective. 547 00:29:22,125 --> 00:29:24,250 We don't know what the laws will look like, but I would start thinking 548 00:29:24,250 --> 00:29:25,667 about this. 549 00:29:26,083 --> 00:29:27,999 Think about dual-use tools. 550 00:29:28,083 --> 00:29:33,918 If you write something, don't put it out there as the next greatest hack. 551 00:29:33,918 --> 00:29:36,209 You're creating pentesting tools. 552 00:29:37,584 --> 00:29:39,876 This has gone on for a long time. 553 00:29:39,876 --> 00:29:42,626 If you look at software, and I'm sure you've all used it: 554 00:29:42,626 --> 00:29:43,626 Copy II Plus 555 00:29:43,834 --> 00:29:46,083 on the Apple II. Or Locksmith? 556 00:29:46,459 --> 00:29:48,125 Backup software. 557 00:29:48,125 --> 00:29:49,999 And the manuals for these programs had 558 00:29:49,999 --> 00:29:54,209 elaborate disclaimers: this is to be used to back up your floppies, 559 00:29:54,209 --> 00:29:58,083 this is not to be used to make illegal copies. 560 00:29:58,083 --> 00:30:05,167 And that is really the conundrum, and that's where exploits will go. 
561 00:30:06,417 --> 00:30:08,999 Some exploits will never be able to be looked 562 00:30:08,999 --> 00:30:11,918 at as a dual-use tool, for sure. 563 00:30:12,250 --> 00:30:14,292 If, you know, sort of, if you have 564 00:30:14,292 --> 00:30:18,334 the nuclear-missile equivalent of an exploit, it's hard to justify 565 00:30:18,334 --> 00:30:21,083 the pentesting value of that. 566 00:30:21,209 --> 00:30:27,083 But for a lot of tools, perhaps that's where they should go. 567 00:30:28,167 --> 00:30:29,959 If you are selling: 568 00:30:29,999 --> 00:30:31,501 know your buyer. 569 00:30:31,626 --> 00:30:33,292 To the extent you can. 570 00:30:33,375 --> 00:30:39,999 I think regulation is just one bad outcome away. 571 00:30:39,999 --> 00:30:41,709 Someone in the U.S. 572 00:30:42,375 --> 00:30:44,751 is going to sell an exploit, and it's going to go through some channel 573 00:30:44,751 --> 00:30:46,876 and get used against some U.S. 574 00:30:46,876 --> 00:30:48,667 interest. We may not hear about it. 575 00:30:49,501 --> 00:30:53,042 It may be secret, but this will happen, and then there 576 00:30:53,042 --> 00:30:58,083 will be a huge drive to stop this from happening, quickly. 577 00:30:58,209 --> 00:31:01,125 It's the same reason that, as soon as someone is murdered 578 00:31:01,125 --> 00:31:04,959 with a certain weapon, that weapon has to be banned. That's 579 00:31:04,959 --> 00:31:08,876 the way laws are created, very reactionary, and I expect that 580 00:31:08,876 --> 00:31:10,834 to happen here. 581 00:31:11,918 --> 00:31:13,999 Maybe you can prevent that from happening. 582 00:31:14,999 --> 00:31:16,792 Know your buyer. 583 00:31:16,999 --> 00:31:20,667 If you're selling something, don't sell into a channel that's likely to go 584 00:31:20,667 --> 00:31:23,999 somewhere under embargo by the United States. 585 00:31:23,999 --> 00:31:26,250 Maybe your best bet is to sell it to the U.S. 586 00:31:26,542 --> 00:31:28,834 Ask for assurances from your buyer. 
587 00:31:28,959 --> 00:31:33,584 So that you don't have knowledge of it going someplace it's not supposed to go. 588 00:31:33,584 --> 00:31:37,999 You can be lied to, but you can't control everything. 589 00:31:38,375 --> 00:31:41,999 At least you can get assurance that it's not going to be used 590 00:31:41,999 --> 00:31:44,834 in some illegitimate way. 591 00:31:45,167 --> 00:31:47,667 Also, you can always use disclaimer language. 592 00:31:47,667 --> 00:31:50,542 So I have some nice examples of disclaimer language here. 593 00:31:51,083 --> 00:31:55,167 This huge chunk of text on the top is actually from a software product that 594 00:31:55,167 --> 00:31:58,501 many of you have probably used many times. 595 00:31:58,501 --> 00:31:59,501 It's good stuff. 596 00:31:59,501 --> 00:32:02,501 I've highlighted the best of the operative language in it. 597 00:32:02,501 --> 00:32:06,667 But, if you're selling something, be sure to use disclaimer language along 598 00:32:06,667 --> 00:32:10,083 these lines; that would help keep you from being charged 599 00:32:10,083 --> 00:32:13,501 with being complicit in any sort of illegal use to which 600 00:32:13,501 --> 00:32:16,876 the software might eventually be put. 601 00:32:16,999 --> 00:32:20,083 And lastly, I'd just like to highlight this bottom little 602 00:32:20,083 --> 00:32:23,876 paragraph, which is actually from the Apple iTunes Store 603 00:32:23,876 --> 00:32:27,999 end user license agreement that comes with it. 604 00:32:27,999 --> 00:32:31,626 And it requires that you agree that you will not use these products 605 00:32:31,626 --> 00:32:35,250 for any purpose prohibited by U.S. law, including the development, design, manufacture, 606 00:32:35,250 --> 00:32:40,375 or production of nuclear, missile, or chemical or biological weapons. 607 00:32:40,375 --> 00:32:41,375 My God. 608 00:32:43,083 --> 00:32:45,999 Words With Friends, that's dangerous stuff. 609 00:32:46,501 --> 00:32:49,125 So thank you for coming. 
610 00:32:49,125 --> 00:33:01,167 This is my contact info. (Applause.) We have time here for questions, so, 611 00:33:01,167 --> 00:33:16,083 if people want to line up, I'm happy to entertain them as best we can. 612 00:33:16,083 --> 00:33:18,125 There are definitely free speech issues, 613 00:33:18,125 --> 00:33:21,083 especially in the temporary restraining order context. 614 00:33:24,584 --> 00:33:25,999 Second Amendment? Sorry. 615 00:33:25,999 --> 00:33:32,250 Come see me about that. 616 00:33:35,250 --> 00:33:36,959 Question back here? 617 00:34:00,209 --> 00:34:04,459 AUDIENCE: What about using a corporation to limit your liability 618 00:34:04,459 --> 00:34:07,501 for disclosure or selling? 619 00:34:07,501 --> 00:34:09,417 JAMES DENARO: Corporations can be held liable 620 00:34:09,417 --> 00:34:11,709 in many cases, even under the Computer Fraud 621 00:34:11,709 --> 00:34:16,709 and Abuse Act. It hasn't happened yet, but a corporation could be held liable. 622 00:34:16,709 --> 00:34:25,083 AUDIENCE: A question regarding full disclosure versus 623 00:34:25,083 --> 00:34:29,999 responsible disclosure. 624 00:34:30,083 --> 00:34:32,209 So when we do it, we do it via responsible disclosure: 625 00:34:32,209 --> 00:34:35,209 we contact the vendor, we give them 30 days, and we tell them 626 00:34:35,209 --> 00:34:37,834 our intent to publish, and we publish everything, so 627 00:34:37,834 --> 00:34:40,999 the actual vulnerability and how to do it, so people can replicate it 628 00:34:40,999 --> 00:34:43,209 and do whatever they want. 629 00:34:43,417 --> 00:34:47,667 In most cases the vendors get a hotfix in within a week, and then, 630 00:34:47,667 --> 00:34:52,250 if within 30 days they provide the hotfix, we write it up and say, 631 00:34:52,250 --> 00:34:55,918 to fix it, install hotfix whatever. 
632 00:34:56,083 --> 00:34:58,125 Sometimes vendors will say we need more time, 633 00:34:58,125 --> 00:35:02,250 and maybe we'll negotiate a couple of days, but sometimes they'll say we're not 634 00:35:02,250 --> 00:35:05,626 going to fix it, and you can't publish it. 635 00:35:05,751 --> 00:35:08,542 So I won't explain what we do for that. 636 00:35:08,542 --> 00:35:11,459 But Google recently published the fact that they plan 637 00:35:11,459 --> 00:35:15,083 to disclose vulnerabilities within 7 days. 638 00:35:15,083 --> 00:35:16,083 All right? 639 00:35:16,083 --> 00:35:17,501 To have a 7-day turnaround. 640 00:35:17,501 --> 00:35:20,999 So what happens if a company like Google, I don't want to use 641 00:35:20,999 --> 00:35:24,667 the word threaten, but intends to publish a vulnerability 642 00:35:24,667 --> 00:35:28,501 within the 7-day turnaround, and the company says don't, 643 00:35:28,501 --> 00:35:30,751 or we'll sue you. 644 00:35:30,834 --> 00:35:31,999 What happens? 645 00:35:31,999 --> 00:35:36,999 JAMES DENARO: Google is at risk if they have some kind 646 00:35:36,999 --> 00:35:40,083 of obligation not to. 647 00:35:41,584 --> 00:35:44,542 It depends on the specific circumstances of it. 648 00:35:44,999 --> 00:35:47,459 But in this case, if no law has been broken, 649 00:35:47,459 --> 00:35:51,999 then Google could publish without any... AUDIENCE: The case 650 00:35:51,999 --> 00:35:56,334 for me, for example, where I contact a vendor and say I've got 651 00:35:56,334 --> 00:36:00,459 ten vulnerabilities I intend to publish. 652 00:36:00,459 --> 00:36:03,209 And they come back and say I'm going to sue you. Same 653 00:36:03,209 --> 00:36:07,792 with Google, where Google says we're going to give you 7 days, not 30, 654 00:36:07,792 --> 00:36:12,626 and the company comes back and says, Google, I'm going to sue you. 655 00:36:12,999 --> 00:36:17,667 It's not the same as when they're going to sue me. 656 00:36:19,083 --> 00:36:22,959 JAMES DENARO: That's the unfortunate part. 
657 00:36:22,959 --> 00:36:25,375 AUDIENCE: Is it a case of how good your legal team 658 00:36:25,375 --> 00:36:28,999 is, or how expensive your legal team is? 659 00:36:28,999 --> 00:36:30,334 JAMES DENARO: Exactly. 660 00:36:30,334 --> 00:36:32,125 AUDIENCE: How much do you charge? 661 00:36:37,584 --> 00:36:41,999 Wherever you want to meet him to carry on. 662 00:36:41,999 --> 00:36:44,083 There are about 5 or 6 other people. 663 00:36:44,083 --> 00:36:45,709 So this talk is over with. 664 00:36:45,834 --> 00:36:47,709 We've got to get ready for the evening. 665 00:36:47,999 --> 00:36:50,999 Down the hallway, he'll answer the rest of your questions. 666 00:36:50,999 --> 00:36:51,125 Unfortunately, there's not a Q&A room 667 00:36:51,125 --> 00:36:53,417 because it's been disassembled too. 668 00:36:54,292 --> 00:36:55,709 Thank you, all.