Oh, I should have done the training. Oh, I see. Okay. Yeah, there's two laser pointers. Is that better? Can everyone hear me?

Yes, that's much better.

Okay. I don't know what's going on with that, Michael. Thank you. Sorry about that.

Okay. No, that's fine. All right. So, essentially, I was just talking to Michael. He got rejected from talking at DEF CON, so we don't even need people to listen to that.

All right. So, what this talk is all about. All right. Let's see. Yeah. Okay. Can we move to the next slide, please? And... all right.

So, essentially, I've got a brain that is always trying to consume as much as I possibly can, all the time. And sometimes I just don't have the hours in the day to actually do everything that I want to do.

And this is especially true on Twitter. I'm on Twitter, as the introduction said, pretty much all the time; I'm pretty much addicted to it. I follow basically the cleverest people in the entire world, as far as I'm concerned. And some of them are always dropping information about stuff they're working on, or something they find interesting. And I just put it into bookmarks, because I really can't get to it all. And so I have, and I checked this morning, over 500 bookmarks sitting there, waiting for me to address them. And I've added another 10 since I created the slides yesterday. So it's a huge issue, but it's a good one to have. And I think that's probably what the whole subject of this talk is about.

So, to produce this talk, just behind the scenes, I used a piece of software called Dewey. It's a way of arranging bookmarks. I know that if you pay money to Twitter each month, you can do the same thing. I have a problem with paying money to Twitter each month, especially since this Dewey program is free. I don't know anything about the organization. For all I know, my bookmarks are being looked at by some sort of weird government agency, or some company, and they're selling the data so that whenever I jump onto Twitter it advertises something that makes sense to me, which doesn't seem to be the case. But maybe that's why they're happy to do something for free. Who knows?
Anyway, so I quickly got hold of this app, just so that I could arrange my bookmarks and have a look and see exactly what was available to me. And for the first time, I actually jumped back into bookmarks that I started collecting about two or so years ago. There's actually good stuff in there.

So next slide, please.

So we're going to take a bit of a look through these. I've skipped some bookmarks because they just don't really make sense to the general public. And yeah, let's jump in.

So this is one of my oldest ones, February 27, 2019. So it's a good two and a half years old. And yeah, it comes from Katie McCaffrey. And what they were talking about here is running an experiment where, instead of using Word, with track changes and comments and all of that, they decided to abandon Word altogether and just produce their documents in a way that they could store them in GitHub. So for any changes you made to a document, once you committed those changes, you could record the reason for the change. And anyone that's following up on your documents can just go through the changelog, essentially, and see exactly what those changes were, which is an interesting way of doing things.

The reason why I'm showing you this is that a lot of these bookmarks were just me thinking, well, how can we do things better? As my introduction said, I'm a GRC consultant, but I also consider myself to be a GRC hacker. And so I'm always trying to see how we can hack the processes that we're using. So instead of just accepting the fact that reports are done with Word: what is better? What can we do? What's a better way of doing things? And this one just really appealed to me. So should you do this? Should you use Word? Who knows? Up to you. But at least it's something that you should be thinking about, if you want. And that's what these bookmarks are all about. It's just stuff for me to think about, and maybe you can think about it as well and get something out of it.

So yeah, that's one thing: if you're looking to me for answers in this talk, I'm sorry, I don't have answers. These are all just ways of making me think and making me question.

All right, next slide, please.

Okay, so it wouldn't be complete without a tweet from Dominic White, also known as Singe, who's one of the smartest people that I've ever had the privilege of meeting. He and I are both originally from Johannesburg, so we used to move around in the same circles and debate and think together.
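[Editor's note: to make the docs-in-GitHub idea above a bit more concrete, here is a minimal sketch of what the review side of that workflow could look like. This is not from the talk or the original tweet; it assumes git is installed and that the report lives at a hypothetical path such as docs/report.md, with commit messages carrying the reason for each change.]

```python
# Minimal sketch (illustration only, not the speaker's or the tweet's method):
# reading a document's "track changes" history when reports live in git
# instead of Word. Assumes git is installed and docs/report.md is a
# hypothetical tracked file in the current repository.
import subprocess

def change_log(path: str) -> str:
    """Return every commit that touched `path`: hash, date, author, reason."""
    result = subprocess.run(
        ["git", "log", "--follow", "--date=short",
         "--pretty=format:%h %ad %an: %s", "--", path],
        capture_output=True, text=True, check=True,
    )
    return result.stdout

if __name__ == "__main__":
    # Reviewers read the commit messages instead of Word comments.
    print(change_log("docs/report.md"))
```

[The point is simply that the change reasons become queryable data in the repository's history, rather than comments buried in a .docx.]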
And so, as I said, I consider Dominic to be one of the smartest guys that I know. And then Sholder Vets, who he's quoting here, also fits into that category.

So, yeah: businesses can get away with being technically bankrupt in terms of security debt, because it isn't something that is measured.

I just love the idea of security debt. Again, it's not something that I can claim to know everything about, and certainly not something I can cover in a talk that's supposed to be 15 minutes long. But it's something that I like to think about: the idea that if you take shortcuts when you're developing new systems or new services or anything to that effect, it's going to cost you down the line. And that's just something to think about in terms of security.

When you're working with organizations, you look at them and say, listen, surely when you were putting this solution together, patching was something that you considered. And they're like, well, no, not really. And it's like, well, okay, you're going to have to consider it now, and it's going to be more difficult to consider it now. And they're like, well, we just can't afford the time. And it's like, well, then in that case, you're pretty much technically bankrupt. And it's not a security issue; it's a development issue. You should have done it right at the beginning.

And I love that concept of thinking of it in terms of debt, where if you start it now and do it now, it'll be a good thing, but if you do it in the future, you're going to be paying not only the debt but the interest on top of it.

So, yeah, this is an interesting tweet. Next slide, please.

Okay, so this is one that I found quite interesting, basically because it quotes one of the people that I really, really like: Dr. Nicole Forsgren, who's written a book about, essentially, DevOps, including her research into how companies do it successfully. She's basically the one person that's scientifically tested whether DevOps is a good idea, and she came out with the conclusion that it is.

Now, I disagree with her on this one very specific point, about the fact that she talks about maturity models. And I think her definition of maturity models is different to mine. So essentially, what she's talking about here is the idea of something like PCI, where you're not working out what you should be doing based on your own environment, but literally based on a set of best practices.

And I actually really like PCI. And I like the idea that you should use best practices. Because in a lot of organizations, that's really all you've got. And I probably shouldn't say best practices, I should probably say good practices, because for a lot of organizations, the minimum that you should be doing is actually all that they do.
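[Editor's note: to make the debt-plus-interest framing from the security debt tweet a little more concrete, here is a toy illustration. The effort figures and the growth rate are made up for the example; the speaker does not quantify anything.]

```python
# Toy illustration of "security debt plus interest" (assumed numbers, not the
# speaker's): deferring work such as patching tends to cost more later,
# because more code, config and process gets built on top of the shortcut.
def remediation_cost(initial_effort_days: float,
                     yearly_growth: float,
                     years_deferred: int) -> float:
    """Estimated effort if the fix is deferred and the work compounds yearly."""
    return initial_effort_days * (1 + yearly_growth) ** years_deferred

# Designing patching in at the start: roughly 10 days of effort.
print(remediation_cost(10, 0.50, 0))   # 10.0
# Retrofitting it three years later, if the effort grows ~50% a year:
print(remediation_cost(10, 0.50, 3))   # 33.75
```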
Coming back to the maturity model point: I guess I do kind of agree with this, that you should customize your controls for your organization. But a lot of organizations don't even have the minimum. So this is something that I'd like to think about and work out more, and then be able to come up with a good answer for what organizations should do. But yeah, this is one that I thought I'd just pull out and highlight.

Next slide, please.

Okay, so going from a tweet about Dr. Forsgren to a tweet by Dr. Forsgren about someone else. This is from Camille Fournier. And it says: if the most insightful thing that you can say about metrics and measures is that people will game them, you don't have anything insightful to add to the conversation. So we know that people will try and game these things. It doesn't mean that you should just abandon them and go with, you know, gut feel or something. You should lean more into collecting good information.

And absolutely. So I think...

...which continues this idea. So Dr. Forsgren carries on and says: metrics will be used against people, because, trust me, a lack of metrics is also used against people.

And the idea being that... metrics can be used against certain types of people. And absolutely, they can, especially when you look at AI or ML. So, you know, artificial intelligence, which is really all machine learning, which is based on the past, essentially. And you'll see that a lot of these systems have issues. The reason why they have issues is because the source material is not great to start with. And the point is: hang on a sec, even though that's a bad thing, it doesn't mean we should stop there, or ignore it and just go with gut feel. Because gut feel is probably based on the same data, the same information, as the model is. So, yeah, that's an issue. But we should actually work harder to get that source information better. We should work harder to understand the stuff that we're looking at. Don't abandon it just because it doesn't make sense to you. Work harder. And I think that's what I get out of this particular tweet.

Next slide, please.

Okay, this is just one that I thought I'd throw in. This is from Rachel Tobac, who is one of the best social engineers out there and has competed many times at the Social Engineering Village. And I just thought this was a good one because it shows how, if you look at things more deeply, they make a lot more sense.
So if you look at the laziness of information security, we always say, listen, don't click on links in emails, which is actually not helpful advice at all. I always think of it kind of like Seinfeld explaining that to Gary, and then coming back and saying, but that's what emails are for. Like, literally, half the point of an email is that you send something to someone so they can click on the link and carry on with their daily business. It doesn't make sense to just say don't click on links in email. You know, that person was compromised because, what the hell were they thinking? They should have just not clicked on the link.

But this takes it one step further and has a look at exactly what is going on with these kinds of spam emails. And what she's got here is a principle created by someone whose name I'm going to try to pronounce, Robert Cialdini. And that's a principle of persuasion. So now we're looking at the philosophy behind emails. Not just, you know, don't click on links in emails, but: hang on a sec, is this email trying to increase the amount of urgency I feel? And if it is, do I actually need to be as urgent as the email says I should be? So it's like, okay, if you don't contact us within the next 24 hours, something bad is going to happen to your account. Whoa, wait a minute, someone is trying to get me to bypass my thinking, to be quick and rash. Why are they doing this? Why is it in their best interest? Does it make sense that my bank would give me a certain amount of time to respond to something? If not, hang on, maybe someone's trying to con me and trying to bypass my actual way of thinking.

This, to me, is so much better advice than don't click on links in emails. It's about understanding why someone is doing something, why they're talking to you in a certain way.

Next slide, please.

Okay, this one, I'm getting close to breaking the rules about coarse language. There's no swearing in this one, but it's close. So this is a risk matrix, which I thought was quite cute. It's the Australian risk matrix. So if anyone is interested in knowing how risk works: you basically take the likelihood of something happening and the consequences to you, multiply them together, or add them together, and you come up with a number. And that's how much risk there is in that thing happening.

So, in Australia, the chance of something happening is either nah, or yeah, nah, or yeah, or nah, yeah, or dead set. That's the likelihood of something happening. And then, of course, your consequences are lower than a lizard's, don't be a sook,
she'll be apples, fair dinkum, or rooted. And then, basically, the outcome is either she'll be right, or fuck. So those are your different ones. Have I seen this used in business? No, actually. But I really like this one.

And then the same chap, who's actually from Britain, came out with the British one, which is on the next slide, please.

And here's the UK one. So their chances are once in a blue moon, not likely, on occasion, a fair chance, and then momentarily, which is used in the opposite sense to the way the US uses momentarily. And then your consequences are trifle, piffle, hoo-ha, jagged, or royally leaped. And then, of course, you can see the different outcomes. It's like biscuits, or pop the kettle on. And then it gets worse: bloody hell, and then bugger, and then the worst possible one is bugger, we better pop the kettle on. And so, yeah, there's your five-by-five risk matrix for the UK.

I think in the interest of time, and also because I don't know if it's technically possible, can we skip the next slide and go on to the one after that one, please?

That's... yeah, this one was just a bit of fun. But also, I like the quote at the top there: when someone sacrifices the main feature to ship on time. Yeah, a big part of DevOps. And you'll see a lot of these bookmarks are concerned with the idea of DevOps and all of that, because I think that process is something that IT has totally skipped over in previous years. We've done all the worst possible processes, the worst possible ways of developing things, the worst possible everything. And manufacturing was like that too, but they were like that a good 60, 70 years ago. And they've worked really, really hard to improve things and come up with the best ways of doing things. And we totally ignored that. We just thought we were better than them. We just decided that we'd do things our own way. And it's a problem. And I think now we're actually starting to understand that, hang on, these guys actually spent a lot of time doing the right thing, and we can learn from them. And so I like to think that a lot of this has been worked out for us.

And yeah, this is exactly what DevOps is all about, actually: the idea that you need to ship something on time, and that that is the biggest, most important thing that you can do. But in reality, the whole point of it is to get a good working system.
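[Editor's note: as an aside, the likelihood-times-consequence arithmetic behind any of these matrices, joke or not, is simple enough to sketch. The 1-to-5 numeric mapping below and the choice to multiply rather than add are assumptions for illustration, not anything from the slide.]

```python
# Minimal sketch of a 5x5 likelihood x consequence matrix, using the joke
# Australian labels from the slide. The 1-5 numeric mapping and multiplying
# (rather than adding) the ratings are assumptions made for illustration.
LIKELIHOOD = {"nah": 1, "yeah, nah": 2, "yeah": 3, "nah, yeah": 4, "dead set": 5}
CONSEQUENCE = {"lower than a lizard's": 1, "don't be a sook": 2,
               "she'll be apples": 3, "fair dinkum": 4, "rooted": 5}

def risk_score(likelihood: str, consequence: str) -> int:
    """Return a 1-25 risk score: likelihood rating times consequence rating."""
    return LIKELIHOOD[likelihood] * CONSEQUENCE[consequence]

print(risk_score("nah", "don't be a sook"))  # 2: she'll be right
print(risk_score("dead set", "rooted"))      # 25: the worst cell on the slide
```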
And a good working system is what you should be working towards. So I just really liked the picture, because it really doesn't make sense, but I also like the quote that they've got at the top.

Next slide, please.

Yeah, this is a fun one about consulting. So it says: at GM, if you see a snake, the first thing you do is hire a consultant on snakes. Then you get a committee together on snakes, and then you discuss it for a few years. And then the most likely course of action is nothing, because, you know, if the snake hasn't bitten anyone yet, it probably won't be a problem. And so you just leave it to crawl around on the factory floor. A better way to do things is to build an environment where the first person who sees a snake kills it. Well, not necessarily kills it.

Having worked as a consultant for so many years, this is absolutely what I see in a lot of organizations. And as a consultant, I'm not complaining, because this is pretty much what puts food into my kids' mouths. The number of times that I've been hired by an organization because they just didn't want to look at an issue and deal with it themselves. They'd rather bring consultants in to tell them, hey, listen, you have an issue. And then they still do nothing about it. It is crazy.

I guess if you're in an organization, the best thing that you can do is stop and ask: hang on, what is practical? What can we do? Let's get this thing sorted out. As opposed to: let's get consultants in, and then get a committee together, and then put all of this together. At the end of the day, are you actually solving what you set out to solve? Hopefully. And if you're not, then you probably need to take a deeper look at the work that you're actually doing.

Next slide, please.

This comes from GossiTheDog, also someone really smart and someone I enjoy following. This is quite an interesting one. Again, what they've shared here are some figures from Google. Okay, this is not even application-driven two-factor authentication; this is SMS-based two-factor authentication: 99% effective against bulk phishing. That's a pretty good figure, actually. And then 100% effective against credential stuffing.

So what this is saying, essentially, is that multi-factor authentication can help you out against people trying to attack your authentication, which is great information in the first place. But hang on a sec, if you think about it, it also means that the phishing emails that you're sending out and spending a lot of time on, they're good, they're effective.
So I wouldn't say stop doing that, but be aware that if that's all you're doing, all you're really showing is that there's an issue; you're not really fixing it. Whereas if you actually have multi-factor authentication, you're actually fixing the issue. So sometimes it's good to look practically at what's out there and what you should be doing.

Next slide, please.

Okay, so this is just to show you an example of what's in my bookmarks. This is just a picture of a control that absolutely can be bypassed easily. You can see that in the physical, real world, but you can't really see that in IT. Sometimes we just put things in place without understanding them. The reason why I've actually bookmarked this, if you have a look at the next slide, please, is because it comes from SwiftOnSecurity. If you don't follow them, you absolutely should. They just said, reply with security memes, and this whole thread is absolutely packed full of lots of fun and games. And so I bookmarked it in case I ever needed to use it in a presentation, which I did. And here it is.

Next slide, please.

I'm running very low on time here, so we're just going to go through the next few quite quickly, and I think they're slides that we can get through quite quickly. So this is just a picture of when Kubernetes goes wrong. Just lots of fun and games. I just love Kubernetes visual puns.

Next one. Next slide, please.

This is just to show you that my bookmarks are not all information security and process-driven and very serious topics and stuff. Some of them are just quite crazy. So this is just a way of making it look like you have a head in a jar. You can put it on your desk, for however long it takes to get the rest of the IT team to take you seriously.

Next slide, please.

This is actually a really interesting story, which, if I had more time, I would jump into. It's basically about the whole idea of how processes are broken. If you send an email to this organization, it gets sent off to an email address that belongs to a person. So if you unsubscribe from the email list, it actually goes to a single person that works at the organization. Some other IT team has picked it up; it goes to them. They do their research as to whether the person is supposed to be on the list or not. And then that goes into an Excel spreadsheet, which then gets emailed to someone else, who then checks it against the database, manually sends it back, logs a call, it goes to a third person, etc., etc.
It's a beautiful way of showing just how broken organizations can be.

Next slide, please.

This is just a document that I kept because it's a cybersecurity style guide, which I think is really something good to use.

Next slide, please.

A recipe for how to make the best bread. And I have tried it, and it is the best bread. It's absolutely amazing.

So next slide, please.

Vendor documentation can definitely be not all that useful, and that's what this slide is showing.

Next slide, please.

This is just good advice. I think, you know, always be thinking about what your customer wants, what they're trying to do, how you fit into their processes. And, yeah, just remember: the customer doesn't want a quarter-inch drill, they want a quarter-inch hole. I think that's absolutely good, and something that I consider every single time.

Next slide, please.

Next slide, please.

Let me check if there are technical issues. Yeah, it's absolutely fine. The next couple of slides are not all that important.

One thing I can leave you with, I guess, from that last slide: even though I've got so much serious stuff in here, it's probably also good just to keep some of it light, and the rest of the slides are basically just fun and games. I think the one thing I wanted to leave you with is: keep a sense of humor about you. Life is always serious, especially in the industry that we're in. But you should also sometimes go outside, have fun, take a break, relax, and let your mind take a break as well.

Really, this collection of bookmarks is just a bunch of things that I found quite interesting once upon a time. I guess if there are any takeaways, besides the fact that you should always relax and take life easy from time to time, it's that there's a lot of good stuff on Twitter. There's a lot of smart people out there, and I highly recommend that you take a look around and see. And sometimes, these people are real people. They don't only talk tech; they talk about their lives, they talk about how they do things. And sometimes that can be very useful, and you can learn a lot from it.

So those are pretty much the takeaways. Yeah, thank you all very much. If there are any questions, or anything that you'd want to discuss further about any of my slides, please, please let me know.
One thing I'm going to try and do is get hold of all the different tweets that I actually collected up.