Oh, I should have done the training. Oh, I see. Okay. Yeah, there's two laser pointers. Is that better? Can everyone hear me? Yes, that's much better. Okay. I don't know what's going on with that, Michael. Thank you. Sorry about that. Okay. No, that's fine. All right. So, essentially, I was just talking to Michael. He got rejected from talking at DEF CON, so we don't even need people to listen to that.

All right. So, what this talk is all about. All right. Let's see. Yeah. Okay. Can we move to the next slide, please? And... All right.

So, essentially, I've got a brain that is always trying to consume as much as it possibly can, all the time. And sometimes I just don't have the hours in the day to actually do everything that I want to do. And this is especially true on Twitter. I'm on Twitter, as the introduction said, pretty much all the time; I'm pretty much addicted to it. I follow, basically, the cleverest people in the entire world, as far as I'm concerned.
And some of them drop some information about stuff that they're working on, or something that they find interesting, all the time. And I just put it into bookmarks because I just really can't get to it. And so I have, and I checked this out this morning, over 500 bookmarks sitting, waiting for me to address them. And 10 more have come in just since yesterday, when I created the slides. So it's a huge issue, but it's a good one to have. And I think that's probably what the whole subject of this talk is all about.

So, to produce this talk, just behind the scenes, I used a piece of software called Dewey. It's a way of arranging bookmarks. I know that if you pay money to Twitter each month, you can do the same thing. I have a problem with paying money to Twitter each month, especially since this Dewey program is free. I don't know anything about the organization at all. For all I know, my bookmarks are being looked at by some sort of weird government agency, or some company, and they're selling them so that whenever I jump onto Twitter, it advertises something that makes sense to me, which doesn't seem to be the case. But that's maybe why they're happy to do something for free. Who knows? Anyway, so I quickly got a hold of this app, just so that I could arrange my bookmarks and have a look and see exactly what was available to me. And for the first time, I actually jumped back into bookmarks that I started with about two or so years ago.
There's actually good stuff in there. So next slide, please.

So we're going to take a bit of a look through these. I've skipped some bookmarks because they just don't really make sense to the general public. And yeah, let's jump in. So this is one of my oldest ones, February 27, 2019. So it's a good two and a half years old. And yeah, it comes from Caitie McCaffrey. And what they were talking about here is running an experiment where, instead of using Word, and using track changes in Word and comments and stuff, they decided to abandon Word altogether and just make their documents in a way that they could store them in GitHub. So for any changes that you made to documents, once you committed those changes, you could give a change reason. And so anyone that's following up on your documents can just go through, essentially, the changelog and see exactly what those changes were, which is an interesting way of doing things. The reason why I'm showing you this is that a lot of these were just me thinking, well, how can we do things better? So, as my introduction said, I'm a GRC consultant, but I also consider myself to be a GRC hacker. And so I'm always trying to see how we can hack the processes that we're doing. So instead of just accepting the fact that reports are done with Word: what is better? What can we do? What's a better way of us doing things? And this one just really appealed to me.
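As a rough illustration of that docs-in-GitHub idea (this is my own sketch, not anything from the tweet), you can treat a document like code: every revision carries a change reason, the way a git commit message would, and the changelog is just the reasons plus the diffs. Here Python's difflib stands in for what git would actually give you:

```python
import difflib

# Hypothetical sketch: each revision of a document is stored together
# with a "change reason", the way a git commit message records one.
history = []  # list of (reason, text) tuples, oldest first

def commit(reason, text):
    """Record a new revision of the document with its change reason."""
    history.append((reason, text))

def changelog():
    """Return each change reason plus a diff against the previous revision."""
    log = []
    for i, (reason, text) in enumerate(history):
        prev = history[i - 1][1] if i > 0 else ""
        diff = list(difflib.unified_diff(
            prev.splitlines(), text.splitlines(), lineterm=""))
        log.append((reason, diff))
    return log

# Invented example content: two revisions of a one-line report finding.
commit("Initial draft", "Patching was not considered.\n")
commit("Fix finding wording", "Patching must be scheduled monthly.\n")

for reason, diff in changelog():
    print(reason)
    for line in diff:
        print("  ", line)
```

With git itself, `git log -p report.md` gives you the same reason-plus-diff view for free, which is the appeal of the experiment.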
So should you do this? Should you use Word? Who knows? Up to you. But at least it's something that you should be thinking about, if you want. And that's what these bookmarks are all about. It's just stuff for me to think about, and maybe you can think about it as well and get something out of it. So yeah, that's one thing. If you're looking at me for answers in this talk, I'm sorry, I don't have answers. These are all just ways of making me think and making me question. All right, next slide, please.

Okay, so it wouldn't be complete without a tweet from Dominic White, also known as Singe, who's one of the smartest people that I've ever had the privilege of meeting. He and I are both originally from Johannesburg. So we used to kind of move around in the same circles and debate and think together. And so I consider him to be one of the smartest guys that I know. And then Sholder Vets, who he's quoting here, also fits into that category. So yeah: businesses can get away with being technically bankrupt in terms of security debt, because it isn't something that is measured. Yeah, I just love the idea of security debt. Again, not something that I can claim to know everything about, and certainly not something I can do justice to in a talk that's supposed to be 15 minutes long.
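To put toy numbers on the security-debt idea (the figures here are entirely made up by me, not from the tweet): deferring a fix is like letting debt accrue interest, so the same fix costs more the longer you wait.

```python
# Illustrative only: invented numbers showing the "security debt" analogy.
# fix_cost is what it costs to do it right at design time; interest_rate
# is how much harder the fix gets with each release you defer it.

def deferred_cost(fix_cost: float, interest_rate: float, releases: int) -> float:
    """Cost of the same fix after deferring it for `releases` releases."""
    return fix_cost * (1 + interest_rate) ** releases

now = deferred_cost(10_000, 0.25, 0)    # pay the debt immediately
later = deferred_cost(10_000, 0.25, 6)  # defer it for six releases

print(f"fix now: ${now:,.0f}, fix later: ${later:,.0f}")
```

At a made-up 25% "interest" per release, the deferred fix ends up costing nearly four times as much, which is the debt-plus-interest point in the tweet.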
But it's something that I like to think about: the idea that if you take shortcuts when you're developing new systems or new services or anything to that effect, it's going to cost you down the line. And that's just something to think about in terms of security. And when you're working with organizations, and you look at them and say, listen, surely when you were putting a solution together, patching was something that you considered. And they're like, well, no, not really. And it's like, well, okay, well, you're going to have to consider it now. And it's going to be more difficult for you to consider it now. And they're like, well, we just can't afford the time. And it's like, well, then in that case, you're pretty much technically bankrupt. And it's not a security issue. It's a development issue. You should have done it right at the beginning. And I love that concept of thinking of it in terms of debts, where if you start it now and do it now, it'll be a good thing. But if you do it in the future, you're going to not only be paying the debt, but the interest on top of that. So, yeah, this is an interesting tweet. Next slide, please.

Okay, so this is one that I found quite interesting, just basically because it quotes one of the people that I really, really like. And so Dr.
Nicole Forsgren, who's basically written a book about, essentially, DevOps, including the ways she investigated how companies do it successfully. She's basically the one person that's scientifically tested whether DevOps is a good idea, and come out with the answer that it is. Now, I disagree with her on this one very specific point, about the fact that she talks about maturity models. And I think her definition of maturity models is different to mine. So essentially, what she's talking about here is the idea of something like PCI, where you're not working out what you should be doing based on your own environment, but literally based on a set of best practices. And I actually really like PCI. And I like the idea that you should use best practices. Because in a lot of organizations, that's really all you've got, you know. And I probably shouldn't say best practices, I should probably say good practices, because in a lot of organizations, the minimum that you should be doing is exactly what they do. And so I guess I kind of agree with this, that you should customize your controls for your organization. But a lot of organizations don't even have the minimum. So this is something that I'd like to think about, like to work out more, and then be able to come up with a good solution as to what organizations should do. But yeah, so this is one that I thought I'd just pull out and highlight. Next slide, please.

Okay, so going from a tweet about Dr.
Forsgren to a tweet by Dr. Forsgren about someone else. So this is from Camille Fournier. And it says: if the most insightful thing that you can say about metrics and measures is that people will game them, you don't have anything insightful to add to the conversation. So we know that, we know that people will try and game these things. It doesn't mean that you should just abandon them and go with, you know, gut feel or something. You should lean more into collecting good information. And absolutely. So I think...

Which continues this idea. So Dr. Forsgren carries on and says: metrics will be used against people? Because, trust me, a lack of metrics is also used against people. And the idea being that metrics can be used against certain types of people. And absolutely they can. And especially when you look at AI or ML. So, you know, artificial intelligence, which is all machine learning, which is based on the past, essentially. And you'll see that a lot of them have issues. The reason why they have issues is because the source material is not great to start with. And the idea being, hang on a sec.
But, you know, even though it's a bad thing, it doesn't mean that we should stop there, or ignore it and just go with gut feel. Because gut feel is probably based on the same data, the same information, as it has been. So, yeah, that's an issue. But we should actually work harder to get that source information better. We should work harder to understand what the stuff is that we're looking at. Don't abandon it just because it doesn't make sense to you. Work harder. And I think that's what I get out of this particular tweet. Next slide, please.

Okay, this is just one that I thought I'd throw in. So this is from Rachel Tobac, who is one of the best social engineers out there, and has competed many times at the Social Engineering Village. And I just thought this was a good one because it kind of shows how, if you look at things more deeply, they make a lot more sense. So if you look at the laziness of information security, we always say, listen, don't click on links in emails, which is actually not helpful advice at all. I always think of kind of a Seinfeld explaining that to Gary, and then coming back and saying, but that's what emails are for. Like, literally, half the point of an email is you send something to someone so they can click on the link and carry on their daily business. It doesn't make sense to just say something like: don't click on links in email.
You know, that person was compromised because, like, what the hell were they thinking? They should have just not clicked on the link. But this takes it one step further and has a look at exactly what is going on with these kind of spam emails. And what she's got here is a principle created by someone whose name I'm going to try and pronounce: Robert Cialdini. And that's a principle of persuasion. But like, okay, now we're looking at the philosophy behind emails. So not just, you know, don't click on links in emails. But hang on a sec: is this email trying to increase the amount of urgency that I feel? And if it is, do I need to actually be as urgent as the email says that I should? So it's like, okay, if you don't contact us within the next 24 hours, something bad is going to happen to your account. Whoa, wait a minute, someone is trying to cause me to bypass my thinking, to try and be quick and rash. Why are they doing this? Why is it in their best interest? Does it make sense that my bank would give me a certain amount of time to respond to something? If not, hang on. So maybe someone's trying to con me and trying to bypass my actual way of thinking. This to me is so much better advice than don't click on links in emails. It's kind of... understand why someone's doing something, why they're talking to you in a certain way.
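As a toy illustration of that advice (my own sketch; the phrase list is invented, and real phishing detection is far harder than keyword matching), you could flag the urgency cues in a message rather than just telling people not to click:

```python
# Toy sketch: flag Cialdini-style urgency cues in a message.
# The phrase list below is invented for illustration only.
URGENCY_CUES = [
    "within 24 hours",
    "immediately",
    "act now",
    "account will be suspended",
    "final notice",
]

def urgency_cues(message: str) -> list[str]:
    """Return the urgency cues found in a message (case-insensitive)."""
    text = message.lower()
    return [cue for cue in URGENCY_CUES if cue in text]

email = ("If you do not contact us within 24 hours, "
         "your account will be suspended.")
found = urgency_cues(email)
if found:
    print("Slow down and think. This message is pushing urgency:", found)
```

The point isn't the matching, it's the framing: teach people to notice that someone is deliberately trying to rush them past their own judgment.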
Next slide, please.

Okay, with this one I'm getting close to breaking the rules about strong language. There's no swearing in this, but it's close. So this is a risk matrix, which I thought was quite cute. It's the Australian risk matrix. So if anyone is interested in knowing how risks work: you basically take the likelihood of something happening, and you take the consequences to you, and multiply them together, or add them together, and you come up with a number. And that's how much risk there is in something happening. So, yeah, so in Australia, you've got a chance of something that's either nah, or yeah, nah, or yeah, or nah, yeah, or dead set. And that's the likelihood of something happening. And then, of course, your consequences: lower than a lizard's, don't be a sook, she'll be apples, or fair dinkum, or rooted. And then basically, either something will be "she'll be right", or "fuck". So those are your different ones. Have I seen this used in business? No, actually. But I really like this one. And then the same chap, he's actually from Britain, came out with the British one, which is on the next slide, please.

And here's the UK one. So their chances are: once in a blue moon.
Not likely, on occasion, a fair chance. And then momentarily, which in the UK means roughly the opposite of what it means in the US. And then your consequences are trifle, piffle, hoo-ha, jagged, or royally leaped. And then, of course, yeah, you can see the different ones. It's like biscuits, or pop the kettle on. And then it gets worse. So it's like bloody hell, and then bugger, and then the worst possible one is like: bugger, we better pop the kettle on. And so, yeah, there's your five-by-five risk matrix for the UK.

I think, in the interest of time, and also because I don't know if it's technically possible, can we skip the next slide and go on to the one after that one, please?

That's... yeah, this one was just a bit of fun. But also, I like the quote at the top there: when someone sacrifices the main feature to ship on time. Yeah, a big part of DevOps. And you'll see a lot of these things are concerned with the idea of DevOps and all of that, because I think that process is something that IT has totally skipped over in previous years. And we've done all the worst possible processes, the worst possible ways of developing things, the worst possible everything. And manufacturing was like that. But they were like that a good 60, 70 years ago. And they've worked really, really hard to improve things, and come up with all the best ways of doing things.
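Going back to those risk matrices for a second: jokes aside, the scoring behind them is exactly the likelihood-times-consequence calculation described earlier, and it's simple to sketch. This is a generic 5x5 example of my own (the level names and rating bands are a common convention, not the Australian or UK joke versions):

```python
# Generic 5x5 risk matrix sketch: likelihood and consequence are each
# rated 1-5 and multiplied into a 1-25 risk score. Level names and
# band thresholds here are illustrative conventions, not a standard.
LIKELIHOOD = {"rare": 1, "unlikely": 2, "possible": 3,
              "likely": 4, "almost certain": 5}
CONSEQUENCE = {"negligible": 1, "minor": 2, "moderate": 3,
               "major": 4, "severe": 5}

def risk_score(likelihood: str, consequence: str) -> int:
    """Multiply the two 1-5 ratings into a 1-25 risk score."""
    return LIKELIHOOD[likelihood] * CONSEQUENCE[consequence]

def risk_rating(score: int) -> str:
    """Bucket a 1-25 score into a rating band."""
    if score >= 15:
        return "extreme"
    if score >= 8:
        return "high"
    if score >= 4:
        return "medium"
    return "low"

score = risk_score("likely", "major")  # 4 * 4 = 16
print(score, risk_rating(score))       # prints: 16 extreme
```

Swap the level names for "yeah, nah" and "she'll be apples" and you have the Australian version.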
And we totally ignored that. We just thought we were better than them. We just decided that we'd do things in our own way. And it's a problem. And I think now we're actually starting to understand that, hang on, these guys actually spent a lot of time doing the right thing. And we can learn from them. And so I like to think that a lot of this has been worked out for us. And yeah, this is exactly what DevOps is all about, actually. The idea that you need to ship something on time, and that is the biggest, most important thing that you can do. But in reality, the whole point of it is to get a good working system. And that's what you should be working towards. So I just really liked the picture, because it really doesn't make sense. But I also like the quote that they've got at the top. Next slide, please.

Yeah, this is a fun one about consulting. So it says: at GM, if you see a snake, the first thing you do is hire a consultant on snakes. And then you get a committee together on snakes. And then you discuss it for a few years. And then the most likely course of action is nothing, because, you know, if the snake hasn't bit anyone yet, then it probably won't be a problem. And then so you just leave him to crawl around on the factory floor.
A better way to do things is to build an environment where the first guy who sees a snake kills it. Well, not necessarily kills it. Having worked as a consultant for so many years, this is absolutely what I see in a lot of organizations. And as a consultant, I'm not complaining, because this pretty much is what puts food into my kids' mouths. Because of the amount of times that I've been hired by an organization because they just didn't want to, you know, look at an issue and deal with it. They'd rather bring consultants in to tell them, hey, listen, you have an issue. And then they still do nothing about it. It is crazy. I guess if you're in an organization, the best thing that you can do is say: hang on, what is practical? What can we do? Let's get this thing sorted out. As opposed to, you know, let's get consultants in, and then get a committee together, and then put this together. At the end of the day, are you actually solving what you set out to solve? Hopefully. And if you're not, then you probably need to take a deeper look at the work that you're actually doing. Next slide, please.

This comes from GossiTheDog, also someone really smart and someone I enjoy following. This is quite an interesting one.
Again, so what they're sharing here is Google data showing this. Okay, this is not even, you know, application-driven two-factor authentication. This is SMS-based two-factor authentication: 99% effective against bulk phishing. So those are pretty good figures, actually. And then 100% effective against credential stuffing. So what this is saying, essentially, is that multi-factor authentication can help you out against, you know, people trying to attack your authentication, which is great information in the first place. But hang on a sec: it also, if you think about it, means that those phishing emails that you're sending out and spending a lot of time on, they're good, they're effective. So I wouldn't say stop doing that, but also be aware that, you know, if you're just doing that, all you're really showing is that there's an issue; you're not really fixing it. Whereas, you know, if you actually have multi-factor authentication, you're actually fixing the issue. So sometimes it's good to actually look practically at what there is and what you should be doing. Next slide, please.

Okay, so this is just to show you an example of what there is in my bookmarks. This is just a picture of a control that absolutely can be bypassed easily. And you can see that in the practical, in the real world, but you can't really see that in IT. Sometimes we just put things in place without understanding them.
The reason why I've actually bookmarked this, if you have a look at the next slide, please, is because it comes from SwiftOnSecurity. If you don't follow them, you absolutely should. They just said: reply with security memes. And this whole thread is absolutely packed full of lots of fun and games. And so I bookmarked it in case I ever needed to use it in a presentation. Which I did. And here it is.

Next slide, please. I'm running very low on time here, so we're just going to go through the next few quite quickly. And I think there are slides that we can get through quite quickly. So this is just a picture of when Kubernetes goes wrong. Just lots of fun and games. I just love Kubernetes kind of visual puns. Next one. Next slide, please.

This is just to show you that my bookmarks are not all just information security and process-driven and very serious topics and stuff. Some of them are just quite crazy. So this is just a way of making it look like you have a head in a jar. Who knows how long it'll take, if you put it on your desk, for the rest of the IT team to take you seriously. Next slide, please.

This is actually a really interesting story, which, if I had more time, I would jump into.
It's just basically the whole idea of how processes are broken. If you send an email to this organization, it gets sent off to an email address that belongs to a person. So this is if you unsubscribe from the email list: it actually goes to a single person that works at the organization. Someone on the IT team then picks it up. It goes to them. They do their research as to whether the person is supposed to be on the list or not. And then that goes into an Excel spreadsheet, which then gets emailed to someone else, who then checks it against the database, manually sends it back, logs a call, goes to a third person, etc., etc. It's a beautiful way of showing just how broken organizations can be. Next slide, please.

This is just a document that I kept because it's a cybersecurity style guide, which I think is really something good to use. Next slide, please.

A recipe on how to make the best bread. And I have tried it, and it is the best bread. It's absolutely amazing. So next slide, please.

Vendor documentation. It can definitely be not all that useful. And that's what the slide is showing. Next slide, please.

This is just good advice. I think, you know, always be thinking about what your customer wants, what they're trying to do.
How do you fit into their processes? And, yeah, just remember: the customer doesn't want a quarter-inch drill. They want a quarter-inch hole. I think that's absolutely good, and something that I consider every single time. Next slide, please.

Next slide, please.

Let me check if there are technical issues. Yeah, it's absolutely fine. The next couple of slides are not all that important. One thing I can leave you with, I guess, from that last slide I just had there: even though I've got so much stuff, the rest of the slides are just basically fun and games, and it's probably good to keep some of that around too. I think the one thing I just wanted to let you know is: keep a sense of humor about you. Life is always serious, especially in the industry that we're in. But also, you should just sometimes go outside, have fun, take a break, relax, and let your mind just take a break itself. Really, this collection of bookmarks is just a bunch of things that I found quite interesting once upon a time. I guess if there are any takeaways: yeah, besides the fact that you should always relax and take life easy from time to time, there's also a lot of good stuff on Twitter. There's a lot of smart people out there. And I highly recommend that you take a look around and see. And sometimes, people are real people.
They don't only talk tech. They talk about their lives. They talk about how they do things. And sometimes, that can be very useful. And you can learn a lot from that. So that's pretty much the takeaways. Yeah, thank you all very much. If there are any questions, or anything that you'd want to discuss further about any of my slides, please, please let me know. One thing I'm going to try and do is get hold of all the different tweets that I actually collected.