[00:07.510 --> 00:14.310] Welcome, everybody, to DEF CON 30's DEF CON Group's VR Village presentations, where you're [00:14.310 --> 00:23.470] getting to see some virtual DEF CON presentations from people all around the world. In fact, [00:23.470 --> 00:30.170] our next speaker hails from Tijuana. Jabels is going to talk to us about Pwning the Lazy [00:30.170 --> 00:35.510] Admin. He's a co-founder of the Tijuana DEF CON Group and former intelligence consultant [00:35.510 --> 00:44.170] for the Mexican government, now working in IT service security full-time. So please welcome [00:44.170 --> 00:47.310] to our stage, Jabels. [00:51.390 --> 00:53.230] Hey, guys, how are you? [00:55.850 --> 00:56.770] Excellent. [00:56.770 --> 00:59.390] Can everyone hear me okay? [00:59.630 --> 01:01.570] Yeah, make sure you pick up the microphone. [01:01.570 --> 01:02.690] Yes. [01:04.350 --> 01:07.510] Ah, yeah, there. Okay. [01:07.510 --> 01:14.750] All right, but how do I look up? [01:15.310 --> 01:16.490] Okay. [01:17.190 --> 01:19.430] All right. [01:20.410 --> 01:28.310] Yeah, let me put it down because I can't... [01:28.310 --> 01:31.010] If you right-click, it goes... [01:33.970 --> 01:41.370] If you right-click, it goes between movement of your body and movement of your head. [01:42.110 --> 01:48.370] Okay. Now, yeah, so we have to turn to the screen, right? [01:48.770 --> 01:49.850] Yep. [01:51.940 --> 01:54.860] Because I don't see the other one. [01:57.610 --> 01:59.490] We got to see those lights. [02:00.350 --> 02:01.370] Those lights. [02:03.010 --> 02:12.550] Yeah, the presentation software that we're using, which is the explore function, broke. [02:12.550 --> 02:15.890] The service went down right in the middle of our thing. [02:15.890 --> 02:18.990] So what we had to do is export all the slides to JPEGs. 
[02:18.990 --> 02:22.970] And we're using a different type of viewer, but it doesn't allow us to do embedded videos, [02:22.970 --> 02:24.850] but we can at least share the slides. [02:24.850 --> 02:27.150] Okay, yeah, no matter. [02:27.150 --> 02:29.910] Yeah, I think I have... I'm in position now. [02:29.910 --> 02:30.550] Okay. [02:33.270 --> 02:35.610] All right. Thanks, guys. [02:38.210 --> 02:44.150] So for the first slide, as it was just mentioned, yeah, my name's Jabels. [02:44.610 --> 02:46.790] Others say Jabels, JB. [02:47.430 --> 02:49.550] It doesn't really matter which one. [02:49.550 --> 02:50.390] My name's Juan. [02:50.390 --> 02:53.670] You can use whatever nickname you want. [02:53.670 --> 02:57.390] I'm the co-founder of the Tijuana DEF CON Group. [02:57.950 --> 03:04.970] And, yeah, I like breaking into stuff, but absolutely all the time I try to do it with permission, [03:04.970 --> 03:10.150] because I'm, as someone else mentioned here, a law-abiding citizen, right? [03:10.490 --> 03:12.890] And next slide, please. [03:17.920 --> 03:19.520] And the next one. [03:22.240 --> 03:25.240] All right, so that's the purpose of my talk. [03:25.240 --> 03:26.520] I'm going to give you guys... [03:26.520 --> 03:28.600] I'm not going to get too technical. [03:28.600 --> 03:40.540] I just want to give you guys some real-world examples of the stuff I found while working as a pen tester for the last, like, five years or so. [03:40.780 --> 03:49.620] You're going to notice that most of these are really just dumb things that could have been avoided easily. [03:49.900 --> 03:55.780] And all of these examples are from ISO 27001-certified companies. [03:55.780 --> 03:59.100] Now, that's not to say that the certification doesn't work. [03:59.100 --> 04:07.040] There are others that I'm obviously not going to use as an example, and they are doing security right in their processes.
[04:07.280 --> 04:20.520] But these are for just the ones that don't, either because they don't care or they're not taking the, like, due care to maintain security in their organization. [04:20.520 --> 04:27.320] Okay, so that's just to say that a piece of paper isn't going to get you secure. [04:27.380 --> 04:28.820] Next slide, please. [04:31.830 --> 04:34.010] Okay, so what's the problem? [04:34.010 --> 04:39.410] Mainly here in Mexico, we have a cultural problem regarding security. [04:39.410 --> 04:49.930] Everything's being looked at as if it was just introducing more bureaucracy or stopping the business from flowing when that's really not the case, right? [04:49.930 --> 04:57.510] You should embed security into your process, into your technology, into everything you're doing, even into your policies. [04:57.850 --> 05:06.790] So the thing here is that I've detected or I've tried grouping these cultural problems into three main categories. [05:06.790 --> 05:08.090] Next slide, please. [05:10.130 --> 05:19.930] These categories are, well, first of all, your leadership. Leadership, most of the time, doesn't really care about what IT is doing. [05:19.930 --> 05:27.170] They just look at them as an expense, and since they don't care about what IT is doing, they're not going to care about security, right? [05:27.170 --> 05:35.610] So they just see everything from buying a firewall, or buying antivirus software, or buying some other type of endpoint protection. [05:35.650 --> 05:38.430] Implementing security controls is just an expense, right? [05:38.430 --> 05:46.590] They just care about the bottom line, and there's an image out there on the internet where you have two cases, right? [05:46.590 --> 05:53.470] One, let's say that it's the boss for a company saying, hey, everything works. [05:53.550 --> 05:56.270] Why am I paying you, right, if everything's running smoothly? 
[05:56.270 --> 06:00.450] And then, on the other hand, why am I paying you if everything's broken? [06:00.450 --> 06:05.490] So that's the thought process where that type of leadership takes you. [06:06.050 --> 06:13.910] The second one that I've noticed is their security strategy is pretty much patch and pray, where patching is really optional. [06:13.910 --> 06:16.270] They just don't do it. [06:16.270 --> 06:26.890] They think that their small business or the industry that they work in doesn't really matter that much to hackers. [06:26.890 --> 06:29.670] So why should they implement all those security controls? [06:29.670 --> 06:34.130] Why should they work to such a high standard for their security? [06:34.370 --> 06:42.350] So with that type of company, what I usually find is that they still have EternalBlue or BlueKeep on their networks. [06:42.530 --> 06:45.850] And that's as old as WannaCry, right? [06:45.850 --> 06:56.010] So you would think that most of these companies that fell victim to the WannaCry ransomware attack way back a couple of years ago [06:56.550 --> 07:01.370] would want to patch so that they don't fall victim to that type of cyber attack again. [07:01.370 --> 07:04.470] But yet, still, they don't do it, right? [07:04.830 --> 07:11.750] And lastly, I think this is the biggest problem at most companies. [07:11.750 --> 07:14.610] They have IT staff with a terrible attitude. [07:14.610 --> 07:20.630] And what I mean by a terrible attitude is that you have your... pretty much your know-it-all, right? [07:20.630 --> 07:24.530] Like, hey, you're missing... and why aren't you patching? [07:24.530 --> 07:33.690] And then they make some snarky remarks about how patches break everything, or what they say goes within their company regarding technology, [07:33.690 --> 07:37.550] because no one else in their organization understands technology. [07:37.690 --> 07:40.050] And that's, I think, really dangerous, right?
[07:40.050 --> 07:46.230] Because your IT staff is supposed to be your in-house experts regarding security and technology, [07:46.230 --> 07:51.010] and you should usually trust them to make the best decision for your business, [07:51.010 --> 07:58.290] because IT is a business enabler, and they will help you mitigate risks, especially for the business, right? [07:58.310 --> 08:01.390] And most of the time, they don't. [08:01.410 --> 08:03.510] So these are the three main categories. [08:03.510 --> 08:11.310] For most of the presentation, I'm going to talk about stuff from IT that I found. [08:11.310 --> 08:17.870] I might jump between the other categories, but IT is mostly the one I'm going to focus on here. [08:17.870 --> 08:19.230] Next slide, please. [08:22.700 --> 08:27.040] Okay, so how do I exploit these cultural problems? [08:27.040 --> 08:32.180] Well, for this talk, I'm not going to focus on the hacker stuff. [08:32.180 --> 08:41.940] I'm going to focus on the really stupid stuff that you don't even need to be properly trained in pentesting or cybersecurity or even IT to do. [08:42.040 --> 08:45.460] You just need to poke around. [08:45.840 --> 08:47.480] So next slide, please. [08:48.600 --> 08:53.260] The first thing I'm going to focus on is password cracking. [08:53.260 --> 09:06.160] So whenever you're on a pentesting engagement, usually when you want to capture credentials, you would use a tool like Responder or Wireshark, put your computer on premises, [09:06.160 --> 09:12.180] and do something like that to start sniffing the network, capturing hashes, and then trying to break them. [09:12.180 --> 09:22.400] You can do that either with John the Ripper, Hashcat, or whatever other method you want to use to crack your captured hashes. [09:22.920 --> 09:24.280] Next slide, please.
[09:24.560 --> 09:41.020] What I'm going to do most of the time, whenever I'm in an engagement with a company for the first time, is just walk around checking their whiteboards, because it's unbelievable the amount of information they leave there. [09:41.020 --> 09:42.420] Next slide, please. [09:44.380 --> 09:53.200] Here on the left, you'll see that it says Contraseña, Pechugon, and then beneath it, you'll see that it says router. [09:53.240 --> 09:56.600] So that's pretty much, in Spanish, right? [09:57.080 --> 10:01.160] Password: Pechugon, and by router, they mean their router. [10:01.160 --> 10:08.020] So they left their router password written on a whiteboard in some meeting room. [10:08.920 --> 10:13.140] So the problem here is that they're telling me their password, and they're telling me what it's for. [10:13.260 --> 10:17.380] So just by that, they're saving me a lot of time in the engagement. [10:17.580 --> 10:24.640] And the funny thing about this one is that the image on the right, that's the name of a chicken place here in Mexico. [10:24.660 --> 10:28.660] That's a food chain, right? [10:28.660 --> 10:37.440] And for this particular company, that meeting room did have a window that looked right across the street into one of these places. [10:37.440 --> 10:41.380] So that seemed pretty stupid to me. [10:41.780 --> 10:43.060] Next slide, please. [10:43.580 --> 10:45.480] And that's a regular thing, right? [10:45.540 --> 10:48.340] It seems like that's how they pick their passwords most of the time. [10:48.340 --> 10:57.840] They just look around and say, like, okay, whatever store is right across from my business, that's what I'm going to name it. [10:57.840 --> 11:08.980] So for physical intrusion, you usually want to do lockpicking or clone badges or do some RFID stuff or do something interesting, right? [11:09.020 --> 11:12.260] Tailgating or other types of social engineering. [11:12.720 --> 11:16.580] But the other thing you can do is just...
next slide, please. [11:16.580 --> 11:18.260] Turn the doorknobs. [11:19.180 --> 11:21.560] That's another thing that I've noticed. [11:21.560 --> 11:25.800] A lot of the time, people don't lock their offices, don't lock their sites. [11:25.800 --> 11:40.460] Or if they do lock them, they put the bar that goes into the doorframe backwards, so you can poke it with the guest badge that they provide you once inside the building. [11:40.460 --> 11:43.260] You can just slide it in between the door and the frame. [11:43.540 --> 11:45.260] And that will open it right up. [11:45.640 --> 11:46.760] Next slide, please. [11:47.940 --> 11:55.000] So what I've been able to get access to is just laptops that are left there unattended. [11:55.000 --> 12:00.860] On the left side of the screen, I show a cheap laptop with a LAN Turtle plugged into it. [12:00.860 --> 12:05.420] The thing about that case was that the person who left it there came back from lunch. [12:05.680 --> 12:10.580] And I'm not sure if they noticed that it was plugged in or not, but they didn't report it. [12:10.580 --> 12:19.160] And I use that LAN Turtle example a lot because it's bulky and I want to have something visible there, [12:19.500 --> 12:22.240] just to see if they report it when they see something. [12:22.240 --> 12:24.300] In this case, they didn't. [12:24.300 --> 12:36.520] And then on the right side, that image is from the office of, I think, the personal assistant for one of the managers there. [12:37.500 --> 12:41.380] It would be kind of a C-suite kind of deal. [12:41.380 --> 12:47.240] But if you notice, there's a couple of routers in between the cabinets. [12:47.600 --> 12:50.580] There's another Wi-Fi router. [12:50.580 --> 12:54.380] And then on the wall, all the way at the back, there's a switch. [12:54.760 --> 12:56.360] So why is that there? [12:56.360 --> 12:59.220] Why is it visible and accessible to everyone? [12:59.220 --> 13:00.120] I don't know.
[13:00.120 --> 13:08.080] But, again, since it was an office, the access controls weren't the best, right? [13:08.080 --> 13:18.440] And they didn't want to change it, because that requires cabling, moving everything from that office up to the proper site. [13:18.520 --> 13:23.860] So, yeah, that's, again, pretty lazy from the IT perspective. [13:24.000 --> 13:25.480] Next slide, please. [13:27.540 --> 13:29.660] Okay, this one. [13:29.660 --> 13:34.640] This is, like, the main goal when you're doing some physical stuff, right? [13:34.640 --> 13:39.720] On the left side, there is a site that was left, like, completely wide open. [13:40.060 --> 13:45.280] It was in the same building as the folks with the chicken-place password. [13:45.400 --> 13:53.120] And on the right side, this was a site for a really big building, like a 12-story building. [13:53.120 --> 13:57.520] They had one of their sites being, like, painted. [13:57.520 --> 14:03.360] You can see the stairs in the background of the image. [14:03.640 --> 14:14.700] And they were painting the plumbing for their sprinkler system, because it needs to be color-coded according to a norm here in Mexico. [14:14.700 --> 14:19.760] So they were doing that, and they left the contractors unattended. [14:19.760 --> 14:25.560] And since they were unattended, they didn't have a way to lock the site. [14:25.660 --> 14:29.360] So I just went in, and I opened the door because it was wide open. [14:29.360 --> 14:32.400] And I took a couple of pictures for that engagement. [14:32.400 --> 14:35.540] I couldn't plug anything in, so I didn't. [14:35.540 --> 14:40.620] But at least I made the mention that, hey, your site was left wide open. [14:40.620 --> 14:49.200] And for that building, running through that network there were, like, two financial startups. [14:49.200 --> 14:56.820] One of them is widely used here in Mexico, so that one was really interesting to check out.
[14:57.320 --> 15:01.920] It was a very big risk for this building. [15:01.980 --> 15:07.540] Now, for the one on the left-hand side, I did get found out messing around with it. [15:07.540 --> 15:09.980] On that one, I did plug some stuff in. [15:10.020 --> 15:18.640] They checked the cameras, and then, like, an hour or so into the engagement after that, they did look for me and ask why I was poking around. [15:18.640 --> 15:23.720] They didn't know who I was, the security guard and a couple of IT people. [15:23.720 --> 15:29.800] So on that end, they did a really good job, because they did find out that, hey, someone's messing around with our site. [15:30.140 --> 15:33.940] But again, they did leave it open and unattended. [15:34.240 --> 15:35.640] Next slide, please. [15:38.140 --> 15:42.100] Okay, so the other thing for physical intrusion, the lunchroom. [15:42.100 --> 15:43.000] Next slide. [15:45.470 --> 15:50.410] Okay, so right there next to the fridge, you can see a LAN cable. [15:50.410 --> 15:55.230] And again, a LAN cable with a USB battery plugged in just to keep it going. [15:57.670 --> 16:02.590] That one was found completely by accident during pre-engagement. [16:02.590 --> 16:10.470] I just noticed, when I went to their site and went to get some water in their lunchroom, [16:10.470 --> 16:14.850] that they had cables lying around in a couple of weird places. [16:15.090 --> 16:22.350] So once the engagement started, I went right to them to see if the cables were live and whether I could get some DHCP. [16:22.350 --> 16:26.850] I could, and I did get a reverse shell from that device. [16:27.190 --> 16:36.630] So, I mean, I'm not sure why you would physically leave that cable there if the room is going to be turned into a lunchroom. [16:36.630 --> 16:40.250] It seems like there's a lot of things that need to go wrong.
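A device dropped on a live jack like that typically just takes its DHCP lease and phones home to the operator, who then has a foothold inside the network. A minimal sketch of that callback pattern, run entirely against localhost for safety (the listener, port, and payload here are all illustrative, not what was used in the engagement):

```python
import socket
import threading

def operator_listener(server: socket.socket, results: list[bytes]) -> None:
    """Operator side: wait for the dropped device to call back."""
    conn, _ = server.accept()
    with conn:
        results.append(conn.recv(1024))

def implant_beacon(host: str, port: int) -> None:
    """Device side: once it has network access (e.g. a DHCP lease),
    connect outbound and identify itself."""
    with socket.create_connection((host, port)) as s:
        s.sendall(socket.gethostname().encode())

# Demo entirely on 127.0.0.1; port 0 lets the OS pick a free port.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

results: list[bytes] = []
t = threading.Thread(target=operator_listener, args=(server, results))
t.start()
implant_beacon("127.0.0.1", port)
t.join()
print(results[0].decode())  # prints the beaconing machine's hostname
```

The design point is that the connection goes *outbound* from the implant, which is why an unused but live LAN drop in a public room is such an easy entry: nothing on the inside has to accept an inbound connection.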
[16:40.250 --> 16:49.170] I'm not sure they just woke up one day and decided, hey man, let's turn this office into a lunchroom and then bring everything back in. [16:49.170 --> 16:52.330] I'm pretty positive that's not the way that worked. [16:52.330 --> 17:04.370] So there had to be planning and there had to be a way for them to, you know, like inventory their network nodes and see what needed to be unplugged or disabled. [17:04.370 --> 17:10.670] And that gave me an easy entryway into their network from a public place, right? [17:10.690 --> 17:12.170] Next slide, please. [17:18.250 --> 17:21.130] Okay, so what kind of information? [17:21.130 --> 17:29.170] So that was a way that I got in just opening doors and plugging into stuff, right? [17:29.170 --> 17:37.990] So what was the information that I was able to get from those practices that were being performed on their sites? [17:37.990 --> 17:39.810] Next slide, please. [17:41.890 --> 17:46.190] The main one was like private pictures, right? [17:46.190 --> 17:56.450] This is going to have the least amount of impact on the business, but it is really bad from a personal... [17:57.770 --> 18:02.990] I hear a voice in the background. [18:19.290 --> 18:22.930] Hey, X-Ray, that's Charmaine with his mic on. [18:31.510 --> 18:32.710] We're good. [18:32.750 --> 18:34.770] Okay, I muted him, you can go ahead. [19:18.400 --> 19:20.140] Anybody else hearing audio? [19:20.860 --> 19:24.780] No, apparently it looks like he's muted, but I'm not sure why. [19:30.150 --> 19:33.590] It looks like he's muted. I'm trying to figure out why he's muted. [19:34.590 --> 19:36.250] Can you hear me? [19:36.610 --> 19:38.810] Yes, we can hear you now. [19:39.110 --> 19:41.490] Okay, where did I lose you guys? [19:42.970 --> 19:44.570] Beginning of the slide. [19:44.570 --> 19:51.790] Right at the beginning of the slide, somebody started streaming junk in here, so we had to mute it, and then your mic went dead. 
[19:51.790 --> 19:54.410] So if you could start the slide over, it'd be great. [19:56.850 --> 20:05.630] All right, yeah, so one of the first findings was private pictures of one of the people employed at the company. [20:05.630 --> 20:11.970] Now, this is going to have the least impact on the business, but it is very important on a personal level, right? [20:11.970 --> 20:14.810] Because you don't want your pictures out there. [20:15.130 --> 20:22.190] Now, the problem here was that as part of the engagement, whenever I find something like this, [20:22.190 --> 20:26.670] I'm usually required to tell the customer right away: someone's streaming porn, [20:26.730 --> 20:30.810] your IT guys are playing World of Warcraft during the engagement. [20:30.910 --> 20:35.690] There's a bunch of different situations that might arise during the engagement, right? [20:35.690 --> 20:37.070] This is one of them. [20:37.070 --> 20:49.890] So I go to the security guy and tell him, hey, there are some weird pictures stored on this computer [20:49.890 --> 20:55.210] that's labeled as belonging to a department that I knew was all men. [20:55.970 --> 21:07.130] Turns out that the computer was just mislabeled, and that it was assigned to the girl at the reception. [21:08.030 --> 21:12.670] So those pictures were hers, but I didn't know it at the time, right? [21:12.670 --> 21:15.330] Like, I didn't recognize her. It was the first time I saw her. [21:15.450 --> 21:18.770] And the IT guy said, hey, where did you get them? Can I get a copy? [21:18.770 --> 21:22.550] Instead of actually going like, oh, no, that's bad, right? [21:22.550 --> 21:25.970] You know, he was requesting a copy, which I think is pretty stupid. [21:27.450 --> 21:36.590] And yeah, this is one of the main things that I would like everyone to take back to their workplace, right? [21:36.590 --> 21:44.370] We do have a lot of issues with telling people, hey, don't plug your phone into the computer.
[21:44.430 --> 21:51.730] Now, one of the things is that, yeah, you don't want the user or your employee to steal your information. [21:51.730 --> 21:58.530] But you also don't want their information to be stolen because of your lack of security controls, right? [21:58.530 --> 22:02.590] Like, in this case, they had EternalBlue all over. [22:02.750 --> 22:06.970] These were all Windows 7 computers. They were, like, really old. [22:07.050 --> 22:11.470] And they were jeopardizing the privacy of their employees' information. [22:11.470 --> 22:18.730] So that was, like, really bad. And the way they responded to it didn't sit right with me. [22:18.730 --> 22:20.990] Next slide, please. [22:22.910 --> 22:31.950] Okay, now, stepping it up a little bit to a riskier level for your business: identity theft, right? [22:31.970 --> 22:43.070] On one of those engagements, I was able to find a shared drive that allowed anonymous or guest access. [22:43.070 --> 22:52.510] And in it, they had a financial folder with their scanned security cards and other PDFs that just said stuff like "passports". [22:52.590 --> 23:02.390] So they had every employee that would travel for that corporate office scanned right there in a PDF that was left unprotected, right? [23:02.390 --> 23:06.810] And, again, why they would do that, I'm not sure. [23:08.670 --> 23:12.210] It's one of the problems here in my country, right? [23:12.210 --> 23:19.110] We don't really have any regulation for protecting privacy, other than: whatever you're going to do with that information, [23:19.110 --> 23:24.830] you need to tell the person you're collecting it from. [23:24.950 --> 23:28.710] And if they ask you to remove it, go ahead and remove it. [23:28.710 --> 23:33.790] That's pretty much the scope of our privacy regulation. [23:34.390 --> 23:35.690] Next slide, please.
[23:39.040 --> 23:47.600] Okay, now, this was one of my favorite cases, because a bunch of things lined up for it. [23:47.720 --> 23:55.620] What you're looking at is the administrator console for one of the most popular news media companies here in the region. [23:55.620 --> 24:01.560] I wouldn't say, like, the best, but one of the most popular ones, or the ones with the most user interaction. [24:03.460 --> 24:08.080] These guys had a SQL injection in one of their search bars, [24:08.080 --> 24:13.100] and it was brought to our attention by a member of the group, right? [24:13.100 --> 24:19.540] Like he mentioned, he had found a vulnerability on one of their websites, for a radio station. [24:20.320 --> 24:22.240] So we asked him about it. [24:22.240 --> 24:27.840] He gave us a proof of concept, and then we tried contacting the radio station. [24:27.840 --> 24:32.840] But while we were trying to pull the report together, [24:32.840 --> 24:39.760] we noticed that their database of usernames and passwords was exposed due to that SQL injection, [24:39.760 --> 24:46.800] and that they didn't separate the user databases for the radio stations, [24:46.800 --> 24:50.060] the news channels, and the other news media that they have. [24:50.060 --> 24:54.300] So everything was in just one database, unencrypted. [24:54.300 --> 25:02.020] And once we dumped those credentials, what we noticed was that every user, [25:02.020 --> 25:04.560] every single user, had the same password. [25:04.720 --> 25:06.880] And it was something really stupid. [25:06.880 --> 25:12.340] Something like 1234abcd. That's the type of password they all had. [25:12.560 --> 25:16.380] So one of those accounts was the admin account, [25:16.380 --> 25:23.080] and what we were able to do with that information was go back into their news articles [25:23.080 --> 25:26.060] and edit them, every single one of them. [25:26.060 --> 25:30.220] We could write whatever we wanted on a legitimate source.
[25:30.300 --> 25:32.740] We could create a fake news article. [25:33.000 --> 25:39.440] This was during the election, so this was potentially harmful. [25:41.100 --> 25:43.740] And so, yeah, that's pretty much it, right? [25:43.740 --> 25:49.580] So the thing here was that they didn't know how to encrypt their database. [25:49.580 --> 25:55.560] They just gave everyone the same password that I'm telling you about, right? [25:55.560 --> 25:57.140] So it's something really simple. [25:57.140 --> 26:01.720] It doesn't even follow, like, your standard, you know, special characters, [26:01.720 --> 26:03.160] lowercase, uppercase, numbers. [26:03.160 --> 26:07.660] It was just a really plain password, like, six characters long. [26:07.800 --> 26:13.800] And they didn't restrict where you could access this console from, [26:13.800 --> 26:17.320] and I think that was really terrible. [26:17.320 --> 26:22.200] Once we started working out how to patch this, it took us five minutes. [26:22.880 --> 26:30.660] Like, really five minutes just to have one person go through encrypting the database, [26:30.660 --> 26:38.580] and then it was, like, five minutes to find what function the search bar was using [26:38.580 --> 26:41.560] that was different from the rest of the websites. [26:41.600 --> 26:45.920] And it was just missing, I remember it was something like SecureString, [26:45.920 --> 26:48.880] and that was the only thing that they were missing for that, [26:48.880 --> 26:52.880] because it wasn't performing the proper input sanitization. [26:53.420 --> 26:57.360] So, yeah, it was something, like, really stupid that they didn't check, [26:57.360 --> 27:01.140] because they thought, oh, man, we have so many different websites. [27:01.260 --> 27:03.380] How are we going to go through all of them? [27:03.420 --> 27:07.940] And then I just told them, well, why don't you just diff the repositories, [27:07.940 --> 27:09.000] and that was it.
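The two fixes described in this story, sanitizing the search input and not keeping passwords in the clear, can be sketched generically. This is not the site's actual code; it's a minimal illustration using Python's `sqlite3` as a stand-in database, with a parameterized search query and salted PBKDF2 password hashes:

```python
import hashlib
import os
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, salt BLOB, pw_hash BLOB)")

def add_user(name: str, password: str) -> None:
    # Store a per-user salted PBKDF2 hash instead of the plaintext password,
    # so a dump of the table doesn't hand out every credential.
    salt = os.urandom(16)
    pw_hash = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    conn.execute("INSERT INTO users VALUES (?, ?, ?)", (name, salt, pw_hash))

def search_users(term: str) -> list[str]:
    # Parameterized query: the driver escapes `term`, so input like
    # "' OR '1'='1" is matched literally instead of rewriting the SQL.
    rows = conn.execute(
        "SELECT name FROM users WHERE name LIKE ?", (f"%{term}%",)
    ).fetchall()
    return [r[0] for r in rows]

add_user("admin", "1234abcd")
print(search_users("adm"))          # → ['admin']
print(search_users("' OR '1'='1"))  # injection attempt matches nothing → []
```

Had the search bar bound its input like this from the start, the dumped database would never have been reachable, and with hashed passwords the dump itself would have been far less useful.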
[27:09.220 --> 27:10.620] That's how we found it. [27:11.260 --> 27:12.520] Next slide, please. [27:14.680 --> 27:17.900] And this is the stupidest one ever. [27:18.360 --> 27:21.960] I really fell into this one by accident. [27:21.960 --> 27:26.160] We were performing an on-site pen test on the PBX [27:26.160 --> 27:29.760] that they used to recruit police officers down near TJ. [27:29.760 --> 27:34.760] And the thing here is that they forgot about us in the building, [27:34.760 --> 27:37.380] so we were left there alone. [27:37.380 --> 27:41.340] So what we did was just start wandering around, [27:41.340 --> 27:47.360] and all of a sudden we see this closet full of, like, [27:47.360 --> 27:49.640] swag stuff and police uniforms. [27:49.880 --> 27:53.860] So I, of course, put one on and took a picture, right, [27:53.860 --> 27:56.520] to let them know that, hey, guys, this should be locked. [27:57.940 --> 28:01.980] And now in hindsight, if you look at what's going on right now [28:01.980 --> 28:04.460] in, like, Ciudad Juárez or Guadalajara, [28:04.460 --> 28:09.620] you see a lot of people walking around in military uniforms, [28:09.620 --> 28:13.240] burning stuff, just in the [28:13.240 --> 28:14.940] last two or three days. [28:14.940 --> 28:16.540] They shouldn't have access to this. [28:16.540 --> 28:19.280] And in hindsight, I figured that, yeah, [28:19.280 --> 28:25.400] we have a big problem with not controlling access to this type of gear [28:26.060 --> 28:27.500] properly, right? [28:27.820 --> 28:31.560] So, yeah, I mean, if you notice, it's [28:31.560 --> 28:35.640] something as simple as: lock it, or inventory your stuff, [28:35.640 --> 28:39.440] or put a camera on the site where you have that stuff. [28:39.780 --> 28:44.420] And don't forget about your contractors when they're doing something for you, [28:44.420 --> 28:44.580] right?
[28:44.580 --> 28:47.240] Like, be there with them so they don't wander around [28:47.240 --> 28:51.580] and get access to the uniforms as well. [28:51.580 --> 28:56.080] Having their backpack with them can really help. [28:56.080 --> 29:00.940] I mean, all of this was just really terrible practice. [29:01.520 --> 29:05.340] And, I mean, it's, again, just being lazy, [29:05.340 --> 29:09.780] because as I mentioned, all of these companies are certified to, [29:09.780 --> 29:14.060] I would say, like, a good-enough standard. [29:14.240 --> 29:15.980] It gives you, like, the basic stuff, [29:15.980 --> 29:17.820] and if you were following it to a T, [29:17.820 --> 29:19.700] this shouldn't have happened. [29:20.180 --> 29:24.660] So, yeah, this is just, like, laziness or, like, really bad practice, [29:24.660 --> 29:29.000] because I cannot attribute this to lack of knowledge, [29:29.000 --> 29:33.100] since they should know that this isn't the way they should be doing [29:33.100 --> 29:34.800] their security. [29:35.660 --> 29:37.060] Next slide, please. [29:39.910 --> 29:42.990] Okay, now, I know that physical access [29:44.430 --> 29:48.510] usually isn't part of what most IT people do, [29:48.510 --> 29:50.070] but we can help, right? [29:50.070 --> 29:55.310] There's a bunch of standards that can help us achieve better security, [29:55.310 --> 29:57.170] even if you don't want to go to, like, ISO [29:57.170 --> 30:00.690] or something that you have to pay for and maintain as a fee. [30:00.690 --> 30:02.270] There's a lot of free resources. [30:02.270 --> 30:06.630] There's stuff like CIS and the NIST CSF that can really help you [30:06.630 --> 30:10.290] and give you a lot of guidance on how to implement security.
[30:10.290 --> 30:16.310] If you're new to this stuff, I would really advise you to just read [30:16.310 --> 30:20.530] through these and see what you can do or what's helpful [30:20.530 --> 30:26.830] for your organization, as well as a bunch of other stuff, right? [30:26.830 --> 30:33.310] There's SANS, Infosec Institute, DEF CON groups, DEF CON talks, [30:33.310 --> 30:36.890] the DEF CON YouTube channel has a lot of useful information, [30:37.570 --> 30:39.870] and it's free, right? [30:39.870 --> 30:42.550] That's the best part of it. [30:43.010 --> 30:44.170] That's pretty much it. [30:44.170 --> 30:47.910] On the next slide, you will see my contact information, [30:47.910 --> 30:54.230] and my friend, who is also a co-founder of the DEF CON group, [30:54.230 --> 30:56.030] that's his email as well. [30:56.030 --> 31:02.450] If you want to come over to TJ, you want to have a talk, [31:02.450 --> 31:07.210] you just feel like you have some questions, you want to go over some other stuff, [31:07.210 --> 31:08.670] we're glad to help. [31:09.130 --> 31:11.270] And yeah, that's pretty much it. [31:11.690 --> 31:13.750] Do you have any questions, guys? [31:30.080 --> 31:36.200] What's your advice for getting people past that laziness threshold? [31:36.200 --> 31:38.580] Because every company has it to some extent. [31:38.580 --> 31:42.420] A lot of companies make their [31:42.420 --> 31:44.580] employees go through trainings, [31:44.580 --> 31:49.400] but it seems like a lot of the time there's no battling the apathy. [31:49.600 --> 31:51.780] Has that been your experience as well? [31:52.500 --> 31:59.840] Yeah, and one of the things that I found works best is to give them live demos, [31:59.840 --> 32:03.720] like put something together, even something really simple, [32:03.720 --> 32:07.740] something like, oh, here's how they spoof your Facebook, right?
[32:07.740 --> 32:11.240] Just something that hits them on a personal level, [32:11.240 --> 32:13.620] something they can relate to their everyday usage. [32:13.620 --> 32:21.540] And that's usually the place where I find that they shift their attitude towards security. [32:21.540 --> 32:23.840] And that's when they start caring about MFA. [32:23.840 --> 32:25.980] That's when they start caring about secure passwords. [32:25.980 --> 32:29.480] Once you start telling them how easy it is to get spoofed, [32:29.480 --> 32:33.800] and how their private information or personal information can get out there, [32:33.800 --> 32:35.200] that's when they start noticing. [32:35.200 --> 32:38.660] And then it starts turning into, like, a habit. [32:38.760 --> 32:44.320] And then that, like, rolls over to the way they work. [32:45.000 --> 32:48.860] That's the one I found that's most useful for awareness. [32:56.990 --> 33:02.150] Yeah, one thing I found is that people rely on what are called folk models. [33:02.350 --> 33:09.510] And that's a set of rules of thumb that they believe are true for them and their environment. [33:09.550 --> 33:11.630] And it doesn't matter what credentials you have, [33:11.630 --> 33:13.490] they just won't accept it from you. [33:13.490 --> 33:16.350] And what I found gets past that is exactly what you're saying. [33:16.350 --> 33:19.630] You show them something personal, where it impacts them personally, [33:19.630 --> 33:21.490] and all of a sudden they start listening. [33:22.830 --> 33:30.930] Yeah, I know, people are like that, but yeah, it seems to work. [33:39.750 --> 33:42.730] Feel free, if you have questions, ask questions now. [33:53.260 --> 33:56.180] Well, thank you, Juan, for an excellent presentation. [33:56.660 --> 34:00.720] And people, please give our speaker a round of applause here. [34:01.800 --> 34:03.680] And feel free to ask him questions. [34:03.680 --> 34:05.940] You know, if we're here, we want to talk to people.
[34:05.940 --> 34:10.140] So feel free if you have questions or just want to talk, walk up to people and talk to them. [34:10.540 --> 34:17.660] I really appreciate this particular presentation because having worked at a university, [34:17.660 --> 34:21.560] this kind of problem, this lackadaisical attitude is rampant, it's everywhere. [34:21.560 --> 34:25.320] So thanks, thank you for doing this presentation. [34:26.180 --> 34:31.780] Okay, we've got about 27 minutes to our next speaker. [34:31.960 --> 34:35.060] So wander around, talk to each other, get something to eat. [34:36.060 --> 34:37.900] Maybe a bathroom break might be nice. [34:38.180 --> 34:40.640] We'll see you back here in about 20 minutes.