The Voice of Cyber®

Episode 145: Nathan Wenzler
First Aired: November 23, 2022

Nathan Wenzler is the Chief Security Strategist at Tenable, the Cyber Exposure company. Nathan has over two decades of experience designing, implementing and managing both technical and non-technical security solutions for IT and information security organizations. He has helped government agencies and Fortune 1000 companies alike build new information security programs from scratch, as well as improve and broaden existing programs with a focus on process, workflow, risk management and the personnel side of a successful security program.

As the Chief Security Strategist for Tenable, Nathan brings his expertise in vulnerability management and Cyber Exposure to executives and security professionals around the globe in order to help them mature their security strategy, understand their cyber risk and measurably improve their overall security posture.

See also: 2022 Ponemon Cost of Insider Threats Global Report


Episode Transcription

These transcriptions are automatically generated. Please excuse any errors in the text.

Introduction (00:27) You're listening to KBKast, the cyber security podcast for all executives, cutting through the jargon and hype to understand the landscape where risk and technology meet. Now, here's your host, Karissa Breen. Karissa (00:41) Joining me today is Nathan Wenzler, Chief Security Strategist from Tenable. Today we're discussing the great reshuffle, which, combined with already strained security resources, shines an even brighter spotlight on the need for organisations to have an effective insider threat management programme. The 2022 Cost of Insider Threats report by the Ponemon Institute found that insider-led cybersecurity incidents have increased by 44% over the last few years, with average annual costs of known insider-led incidents up more than a third to $15.38 million. So today we're going to be talking about all of this. So, Nathan, thanks for joining. I'm really keen to get into some of the Ponemon report, and, yeah, I'm keen to also start with how you're seeing insider threats change. Nathan Wenzler (01:32) Yeah, first, thanks for having me, and I really appreciate the time here to do that. Insider threat is an interesting one. I have been really focused on the people side of security for a good chunk of my career, both as a former CISO and security leader and as a practitioner, so this topic is near and dear to my heart. We've watched this evolve quite heavily, not just over the last couple of years since the great reshuffle happened, but even longer than that. Initially, insider threat was really seen as only malicious, disgruntled employees who were trying to take some kind of revenge action against their company, or soon-to-be former company. So there was a lot of concern about intellectual property loss, about data theft, these kinds of things.
But I think what's perhaps most telling about how it's all changed in the last two years is that we're seeing a lot more accidental cases, or even cases where employees are simply not aware that they are participating in some form of intellectual property or data theft. It can be as simple as policies just not being clear. Nathan Wenzler (02:45) And so folks think that it's okay to take their laptop home, or they're working from their own devices and they copy things down, not realising that they're breaching policy. The other half of this, moving on from the accidental standpoint, is that we're seeing organisations face a lot of challenges trying to understand, when they see an employee's credentials being used to commit some sort of theft, whether it's really the employee they think it is. Right. We know that cyber attackers have been going after credentials as part of their attack process forever and a day. It's really one of those common things that they move towards. But if I'm an attacker and I can get someone's credentials, I'm already inside the company. We might look at it after the fact, as the humans involved, and say, well, obviously that wasn't Nathan, that was someone who compromised his credentials. But from a technology standpoint, the systems have no way to really know who the user behind Nathan's account is. As long as they've authenticated and done their pieces correctly, that's Nathan's credential, and it's allowed to do things. So it's gotten much more complicated, right. We can't just assume that it's a malicious actor, because we know it's often, again, accidental, or perhaps even more importantly, incidental. Nathan Wenzler (04:06) And we're seeing a lot more focus from organised criminal groups who are compromising credentials and then quietly behaving like the user to perform their theft of either data or intellectual property.
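Nathan's point that an authenticated credential proves nothing about who is actually holding it is why many teams layer simple behavioural baselines on top of authentication. A toy sketch of that idea follows; the baseline fields, event shape, and thresholds are all invented for illustration, not any particular product's detection logic:

```python
# Toy behavioural check: flag successful logins that look out of character
# for the user, since a valid credential alone proves nothing about who
# holds it. Field names and thresholds are invented for this example.

def is_suspicious(event: dict, baseline: dict) -> bool:
    """Return True if a successful login deviates from the user's baseline
    on at least two independent signals."""
    reasons = []
    # Signal 1: login from a country the user has never worked from.
    if event["country"] not in baseline["usual_countries"]:
        reasons.append("unusual country")
    # Signal 2: activity well outside the user's normal working hours.
    start, end = baseline["work_hours"]
    if not (start <= event["hour"] <= end):
        reasons.append("outside usual hours")
    # Signal 3: download volume far above the user's historical average.
    if event["bytes_downloaded"] > 10 * baseline["avg_bytes_downloaded"]:
        reasons.append("abnormal download volume")
    # Require two signals to cut down on false positives from travel,
    # late nights, or a single large legitimate transfer.
    return len(reasons) >= 2

baseline = {"usual_countries": {"AU"}, "work_hours": (8, 18),
            "avg_bytes_downloaded": 50_000_000}
event = {"country": "RO", "hour": 3, "bytes_downloaded": 900_000_000}
flagged = is_suspicious(event, baseline)  # all three signals fire here
```

Real user and entity behaviour analytics tools are far more sophisticated, but the principle is the same: the credential authenticates, and the behaviour around it is what raises the question of who is behind it.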
So it's a real big challenge for these organisations, because it's not as clear as it used to be and it's happening much more frequently. Karissa (04:27) So at the start of the interview, when I said it's up 44% over the last two years, that's pretty high, not like 10%. So why do you think that jump is so significant within two years? Is it because, like you said before, with BYOD people are sort of doing things from home now, and there's an easier environment for people to potentially do the wrong things? Nathan Wenzler (04:50) I think you're going right down the right path here. The last few years, obviously, between the pandemic times and how that sort of spawned, if you will, the great reshuffle and people switching jobs quite frequently, that sort of process helps to encourage, or at least creates the opportunity for, more of that incidental and accidental data loss. Right at the beginning of the pandemic, a lot of organisations weren't prepared for a work-from-home state. They didn't have laptops, they didn't have equipment they could send home. So it was, sure, have your employees use their own systems and we'll make the best of it for now. But that also opens up the opportunity for that same employee to now be storing corporate data on that system. So the chaotic nature of what we've been going through the last few years started that process, just creating more places where data could be moved to areas that weren't controlled very well. We couple that with employees who are just moving from job to job to job. Again, they may not realise they're doing anything wrong, but they're taking data with them from place to place, and that leaves companies in a place of really severe risk, because, again, they're trying to keep control over their property, over their data, and it gets really hard to do.
Nathan Wenzler (06:14) So I think the two pieces together are likely what's been driving the bulk of that over the last couple of years. But I'd also be remiss if I didn't say that we've seen an increase in targeted attacks along these lines too. So criminal actors know that this chaos is happening. They know either the endpoints are not being well protected or controlled, so organisations are not able to deal with these kinds of data problems, or users are bringing their own devices in, which are often less secured and easier to compromise. They know that, and so they've been taking advantage of this as well. So there's definitely going to be some contribution from attackers just taking advantage of the situation. But from the trends we see, and from my speaking with organisations that have struggled with this, I would say the majority of that increase comes from the complexity we've been dealing with from users as an incidental problem. Karissa (07:13) When you say users as an incidental problem, what do you mean specifically? Nathan Wenzler (07:16) Well, I mean, it's not a malicious sort of state from the employees, the end users, right. They're moving data, often without even knowing it, by accident. It's just incidental to the fact that they are working from home in an uncontrolled environment or moving to a new job, and not necessarily in a malicious sort of capacity. Karissa (07:36) So it's not like John didn't get his bonus and then all of a sudden he sells all the data and it's on the dark web, so to speak. There's less of that. Nathan Wenzler (07:45) Yeah, I think over time we've been seeing less of that, which may seem counterintuitive for a lot of folks, especially if you're an organisation that has had to deal with it. But it's a frequency problem, right? Do malicious actors internally still exist? Absolutely.
Should organisations be concerned about, are we treating our employees fairly, are we creating a good culture so we don't have a lot of angry folks? Are we being fair? Those are all things that help create a place where people may not be so angry about their situation that they take out some sort of revenge on the company, but those incidents still do happen. The trick here, though, is that the vast majority of what's happening is the rest of what we've been talking about: the accidental kind of thing, or the criminal actors who've taken over credentials and are using them as if they were the employee. Those happen far more frequently. So if we're trying to get our arms around the problem, that's where you need to focus, because that's what's happening more often than not. Karissa (08:47) So we've got sort of two buckets of people. There's the oversight from users, actual staff that made a mistake because, like you said, they're not in controlled environments, and then it's easier now for attackers to hide behind, potentially, your profile, for example, so it's easier to frame you, so to speak. Okay, so that's really interesting. So how does that go? So just say, hypothetically, you've been called up because they think that you are a malicious insider and they're like, well, all the fingerprints are on you, Nathan. It's all you, it's all you. How does a company navigate that? Because what happens at the end of the day if you've been fired from your job because all roads led to you, and in fact, it wasn't you? Nathan Wenzler (09:33) That's an awesome question. We could probably spend another hour talking about the HR and legal ramifications around how you work with security organisations to deal with these kinds of security incidents. The answer, quite frankly, is it takes a lot of work.
And this is why this is such a major risk factor for a lot of organisations in the security space. Because in order to investigate all of the data points, to look at all the event logs, to go back and do the forensic analysis of all the pieces, for a lot of companies that means hiring an outside firm. It can take weeks, and it can be incredibly expensive. In the middle of this, you're already dealing with the data breach itself, right? You've already had a loss, so the company is impacted, you're spending more money, it's taking a whole lot of time. Meanwhile, it all points to an employee. So you've got a lot of HR and cultural problems to deal with, with an employee that's got a dark cloud over their head that they're worried about. By and large, when we look at the evidence in these things, you can absolutely go back in time and figure out what happened. Nathan Wenzler (10:40) But that's a really extensive and time-consuming effort. So, credit where credit is due, most organisations I've ever worked with don't generally jump the gun and just immediately fire someone, especially with a pretty serious data breach. They're going to do an investigation, they're going to look into it to find out what actually happened, especially if they're building a legal case. You've got to have that evidence, you've got to have that data. So none of that happens very quickly. But it is incredibly expensive, and it's so time-consuming that you might clear the employee, but boy, you are going to go through a lot more headaches getting there than if you had just done the work up front to protect their credentials and make sure they couldn't be used to do anything abusive or steal data they shouldn't have been able to access, that kind of thing. Karissa (11:27) Wow, this is really interesting. Okay, so I'm curious to know, as you said before, Nathan, they don't just sort of come up and go, you're fired, it was you, you're out. So what do you do then?
Just say I've found out all roads lead to you. Or we can use someone else's name in the mix: John. How do you handle it? What do you do? Do you put them in quarantine? Do you put them in, like, work jail? You can't have them there on site, you don't know what you're dealing with. What do you do? How do you handle this? Nathan Wenzler (11:59) Yeah, again, it's kind of a complicated answer. So you're diving into what the incident response process looks like when you suspect you've got a credential compromise, and you've really got a couple of moving parts here. You've got, potentially, an attacker that's compromised the credentials. So you're not just looking at the actual user, you're also dealing with a potential attacker. And if you do anything to modify the user account they're using, if you were to delete it or quarantine it or whatever, you could alert the attacker that you're onto them. Right. And that could cause more damage. Containment is the key up front. You've got to start to do some investigation work, figure out the scope of the problem, see if there are ways you can contain or limit the damage, not only the damage that's already been done, but the damage that the attacker, again, if it is an outside entity, could potentially do. Once you've got some boundaries set for containment, then, quite frankly, you're spot on about the next steps. You're going to look at quarantining that user account, revoking some of the access permissions so that it can't continue to do things it shouldn't be doing. Nathan Wenzler (13:12) You're also, of course, going to be monitoring your network very heavily to make certain that other accounts don't start behaving in a strange manner. So it's a lot of permissions and rights containment that starts to happen. For the user themselves, the employee, in most of these situations it's a little annoying, I would say.
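The containment sequence Nathan outlines (revoke live sessions, strip permissions rather than deleting the account, then keep watching) can be sketched as a simple runbook. The account and session structures below are illustrative stand-ins for a real directory or IAM service, not any specific product's API:

```python
# Hypothetical containment runbook for a suspected credential compromise.
# The Account structure is an illustrative in-memory stand-in for a real
# directory service record; a real runbook would call IAM/directory APIs.

from dataclasses import dataclass, field

@dataclass
class Account:
    username: str
    enabled: bool = True
    permissions: set = field(default_factory=set)
    active_sessions: list = field(default_factory=list)

def contain_account(account: Account, quarantine_group: str = "quarantined") -> list:
    """Quarantine a suspect account and return the ordered list of actions
    taken, for the incident log."""
    actions = []
    # 1. Kill live sessions first, so a logged-in attacker is cut off.
    if account.active_sessions:
        actions.append(f"revoked {len(account.active_sessions)} session(s)")
        account.active_sessions.clear()
    # 2. Strip rights rather than deleting the account: deletion would tip
    #    off the attacker and destroy forensic context.
    if account.permissions:
        actions.append(f"removed rights: {sorted(account.permissions)}")
        account.permissions = {quarantine_group}
    # 3. Leave the account present but disabled, so investigators can watch
    #    for further authentication attempts against it.
    account.enabled = False
    actions.append("account disabled pending investigation")
    return actions

suspect = Account("jdoe", permissions={"domain-admin", "file-share"},
                  active_sessions=["vpn-114", "owa-7"])
log = contain_account(suspect)
```

As Nathan notes, the employee typically gets a fresh credential to keep working while the quarantined account is investigated; the ordered action log matters because the forensic timeline may later have to stand up in a legal case.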
What I see most commonly is they just issue a new credential that the employee uses during this process, so they can get right back to work and do what they need to do if they're not legitimately the direct suspect of what's going on. But, yeah, containing that damage is really the key first step. And that can be a really complicated piece, between rights and permissions on the account, between network access, between the data sets that you're trying to control and protect. It could be application access. So a lot of work has to be coordinated and done so that when you throw the switch, so to speak, you can really isolate that compromised account, start to push the attacker out of the environment and begin the recovery process. Karissa (14:11) Wow. Yeah. Okay. I do hear what you're saying. It makes sense. So it's very convoluted. It's quite an arduous process as well. So what happens when you do all this process and then you get to the end of the line and you've worked out it wasn't John? How do you handle that? Because then this person is probably ostracised. It's probably obvious, the guys haven't seen John for a while, something's happened. Nathan Wenzler (14:30) Sure. Karissa (14:30) How do you handle that from a leadership perspective? And then you're thinking about it, like, if you're John and you've been, well, I don't want to say the word accused, but you've had to go through this whole process and it wasn't you, do you just leave in the end? What have you seen in your experience? Nathan Wenzler (14:45) Well, I really appreciate this question, I've got to tell you, because this is such a massive, massive piece of leadership culture and organisational culture and how you deal with your people. At this point, this is not really even a security problem anymore. This is a people problem. So how you go about it initially from a leadership perspective is going to set the tone for everything that happens after that. And that's really the key.
So, yes, if your leadership gets the first note from their team, hey, we think we've been compromised and we think it's Joe, and that leader's first instinct is to call Joe and start screaming and say, what are you doing, we're going to investigate you, we know what's happening, yeah, you've set the tone for the whole thing. Karissa (15:38) People have definitely done that, am I not wrong? Nathan Wenzler (15:42) No, absolutely. The challenge there is that it becomes a self-fulfilling prophecy when you jump the gun like that. You don't go through the real work to understand what really happened. You have created a malicious employee. You now have a disgruntled person working for you. And, yeah, if they don't quit of their own accord, they're not going to be really willing to work with you in a cooperative manner, they're not going to be a great employee. You're going to have massive, massive business problems on top of the potential for, well, I mean, just human psychology, right? If I was Joe, in that case, there might be a point six months down the road where I say, I'll show them. They thought I did it, I'll do it for real this time. Now you have that problem. Right. It sounds a little ridiculous, but this... Karissa (16:37) Is why it's true. Nathan Wenzler (16:39) It's why, on the security side of things, I reiterate this with a lot of people constantly, in talks that I give and consulting work that I've done in the past: your security programme is not a function of your IT department. Your security programme is part of your risk management function in the organisation. And that means you have to look at things like, what is the risk of the kind of culture our leadership has set? Because if it's a really negative culture, we might be encouraging people to become disgruntled, malicious insiders. That's a risk. That's something we have to address somehow.
So as companies get a little more mature about these processes, they start to realise that security isn't just a bunch of techies sitting in the basement wearing hoodies and trying to hack the world. When you see it as a risk management function, it starts to make more sense to ask these broader business questions around, well, how do we encourage people to work with us? How do we approach an incident like this in a way that doesn't single out the employee, so that we can do this in a positive manner and not set ourselves up for future problems? Nathan Wenzler (17:50) This is very much a security discussion alongside the business, HR, legal, everybody else. So how leadership sets the tone is critical in these kinds of things: not accusing the employee out of the gate, being aware that the compromise of credentials is a real and very common thing, letting the investigation do what it needs to do so you have the facts about what happened before you come to any conclusion. All of these kinds of factors are really, really critical in the response process, and it still doesn't always happen. I mean, some folks still do it the way you described, but we're trying to get more and more people to take that data-centric approach to the problem before they do anything too rash. Karissa (18:35) So I want to focus now on the response side of things. Now, in the Australian market there's been a large data breach. You've probably heard about it, and people are sort of claiming, oh, how they responded is really bad. I mean, I would say more often than not people would say how companies respond is bad, and it's probably because they're not doing it every day. Like, if you're doing something once in a blue moon, you're not going to be too sharp on how you respond.
So from your perspective, hopefully this type of compromised credential behaviour is not happening every day, and they're not dealing with insider threats every day, but it may be a thing. So how would you go about making sure that when it does happen, people are on their game? They're not running around not knowing what's going on, they're not calling Joe and going off their heads. People act in panic, and it's so easy to sit back on our couches when nothing's going wrong and accuse the company of being in chaos. It's just because it's not habit; they're not doing it every day. You know how if you don't go to the gym very often, you're not great at it, but if you're going every day you're a lot better at it? Karissa (19:43) And this is the same thing in incident response. We're not doing incident response to this level every single day, so of course people are not going to be as sharp with it. But what is some advice you can give to people to make sure that when it happens, you are on your A game, man? Nathan Wenzler (19:58) This is a fantastic question. I love this, by the way. I remember early in my career, when I was an analyst working for some government agencies, I did a lot of training around formal incident response processes: how to build a programme, how to execute the programme, that kind of thing. And back then, 20 years ago, the number one piece of advice was practise, practise, practise. To your point, if you only do it every once in a while, it's hard; when you do it all the time, the muscle memory kicks in, you go through it and you do it. But where practise is a problem is that no one ever practises an incident while under the stress of an actual breach. Everyone kind of knows it's just practise, it's not really happening: I'll get the playbook out, I'll run through the steps, and then we can just check the box and tell management we did our business here.
So companies have some different ways of going about this. And I would tell anyone listening: if you heard what I just said and your reaction is, oh great, we'll actually cause a data breach that we control, and not tell anyone it's practise, and we'll really freak them out. Nathan Wenzler (21:12) And we will do this as a live stress exercise. Please do not do that. Please do not do that. Your employees will resent you. It's not a fun exercise. I've been in those situations, I have consulted for companies that have done that, and the reaction you get from staff can be anything from outright resentment to a mass walkout of your people. So you do have to practise, you do still have to go through the exercise, but let people know, schedule the time, and build out a good test plan so that it's as real as it can be. Do not make people go through an actual live-fire exercise, even if you control it. Now, that said, the way that we get through this is bringing in the right stakeholders to help people coordinate in an effective manner when the real stressful situation happens. And this is probably the place where I see most programmes fail, because they see incident response as a purely technical exercise. Get the security team in, get the IT folks in, the folks who run the firewalls, get in a room, we'll lock the environment down and we'll get rid of whatever the problem is. Nathan Wenzler (22:28) Ransomware, a cybercriminal, whatever it is. What they're missing is people from that table. You don't have HR involved at that table, you don't have legal counsel involved in that conversation, you don't have someone designated for communication, both internally and potentially publicly. And what I typically find is that the incident response programmes that work are the ones that, as part of the exercise and all their training, bring in the business stakeholders that will be involved.
As I said, HR, legal, and then, depending on the kind of company you are, marketing perhaps, to make sure you can restore confidence with your customers, these kinds of things. Those people have a stake in this, and they can be really critical parts of how well you deal with the response. And when you have those people in place and they know what they're supposed to be doing, they know what their part is, it actually does alleviate a lot of stress from the technical folks, because they don't have to worry about those kinds of pieces; they're someone else's responsibility. If I'm a security analyst trying to lock down a credential before it does more damage, I can trust that there is somebody already building a PR response. Nathan Wenzler (23:45) They're going to get out in front of the public, we're going to have a coordinated, well-thought-out sort of message. That's not something I'm going to have to do or worry about. I can just focus here. It seems like a small thing, but it's a really, really critical part of helping negate some of that stress that's going to happen when you're actually trying to put out a live fire from a data breach or some other type of cyber attack. Karissa (24:09) Yeah, that's really interesting, and you're absolutely right. It's all well and good to practise these things. The question that I'm really interested in now is about practising it. So I used to work for a shopping mall and we used to have to do tabletop exercises every week. Like, what happens if someone comes into the mall with a machete, which absolutely happens? How do we respond? And we had to go around and say, this is what we do, and all of that. But the thing is, when you're in that moment and someone's coming into a shopping mall in the middle of the day with a machete, it's very different to sitting in a tabletop exercise.
So it's all well and good to do all the practises and all of that when, like you said, you're not under duress, you're not under stress. But how do you replicate the same type of environment without going, like you said, full-fledged, which is going to create problems? Because when we're in a controlled, relaxed setting, it's not really happening, and you can think a little bit more logically. But when you're in the moment and you've got something live happening, it's a lot harder to respond. Karissa (25:22) How do you mentally prepare people? Because I've often seen people that, for example, prepare for four years for the Olympics, they get there and, I don't know, in the first second they fall over and break their leg and it's all over, type of thing. It's easy to think about it, but on the day it changes how we operate. Nathan Wenzler (25:39) It's a great question, and there is no real simple answer to this, frankly. I've seen people do it. I mean, I've seen organisations who have said, well, we're just going to shut off power to the data centre and see what everybody does. And they literally shut off the power to their data centre to create the panic. That's an incident. That's a data breach. You've had an attack; you just attacked yourself. So it's not even practise at that point, it's a live, real attack that just happens to be self-inflicted. That's not the way to practise. It doesn't really benefit anything in the long run. I think we have to acknowledge that there really is no way to fully, 100% prepare people for the moment when it actually happens. The best you can do is be absolutely prepared: have a lot of training, a lot of documentation, do the tabletop exercises so people are really comfortable. And a lot of folks, when they are in panic mode, will often resort to the comfort of structure.
They'll blindly follow the list or the checklist they've got in front of them, because that's how they can keep their minds focused on something other than the negative thing happening around them. Nathan Wenzler (26:48) Some of this is a leadership question, right? If you have a good culture and positive leadership, and your people know that their managers and upper management have their back, that infuses a little more comfort into your employees, so that when, again, they're in the midst of a fire, it's one less thing to panic about. If they see management involved just like they are, trying to help them through the process and protect the company, everybody's in it together. That's a valuable piece to reassure folks so they may not panic quite as much. There are a lot of small things like that you can do to help as much as possible. There really is no way, short of shooting yourself in the foot and attacking yourself, that you can ever really recreate that same level of stress. So you've just got to be ready to mitigate in every way you can, help people where you can, and have backups for folks. If somebody really can't handle it, bring in someone who's okay, or maybe slightly more okay. There's a lot of change and chaos that happens when you're in the midst of doing those things, and insider threats especially, because you may still be in the mode of, is it one of us? Nathan Wenzler (28:02) You're still trying to figure that out. So there's a lot of additional stress, but you've got to do as much preparation as you can and then be ready on the day to help people through it, knowing that it's chaos. Karissa (28:14) So in terms of being ready, what are some of the things, more broadly, that you would recommend for people? And the other question I'd love to ask you as well is about tabletop exercises: how frequently should people be doing them? I guess that depends, but I've had various answers, so I'm keen to get yours.
Nathan Wenzler (28:32) Yeah, let me talk about the preventative part first, because I think that's really the key here. We've just spent the last several minutes talking about the scariness of when the fire happens, right? It's terrible, the worst thing, everyone's in a panic, awful. Here's what's more awful than that: companies who don't do the work up front, and who are dealing with that kind of situation where everything is on fire multiple times a year. Right. You're concerned about staff retention, you're concerned about culture. If you're a security person or an IT person, you're going, this is the seventh breach we've had this year, nobody's trying to solve this stuff, I'm out of here, I don't want to deal with this. So we talk a lot about incident response as an after-the-fact kind of thing, and that is exactly what it is; it's right there in the name, response. But what we're not focusing on is, what are the kinds of things we should be doing to ensure that as few attacks are successful as possible, so that we don't have to execute the incident response programme as often, or at all? I'm not going to say 100%, there's no such thing, but can we start to get it down? Nathan Wenzler (29:54) So that the response programme only triggers once every so often, not once a month? The only way you get there is by really focusing on the preventative measures. And the preventative part is, I think, frankly, where we've been lacking; in security, we've lived after the fact for a long time. Responsive tools like your endpoint security products and SIEM tools are in a lot of ways event-based, after-the-fact tools. They're critical, we have to have them; again, we have to deal with the contingency of an attack getting through. But then look at all the tools we have for gaining visibility into the environment, for understanding how complicated our attack surfaces are, where an attacker could actually breach us. Patch management.
We've been doing patch management for 30-plus years, right? That's a critical piece of the preventative measures. Close the holes that an attacker might use to get into your environment by removing vulnerabilities from the organisation. Credentials, like we've been talking about this whole time: get your arms around the credential problem, right? Do that up front, not when your house is on fire. You want to make sure that your employees have appropriate rights defined for their accounts. Nathan Wenzler (31:13) They haven't been given domain admin access to your whole organisation just because it's easier. We have to do these kinds of cyber hygiene, preventative steps up front to understand the lay of the land, mitigate as much risk as we possibly can, and close as many of the vulnerabilities and places we can be exploited as possible. That is going to make your entire incident response programme much more effective, because you're only going to really need it when it's something very, very serious, and it saves a lot of people from having to do it all the time. There's also something to be said for the fact that the more you fix up front, not only is it cheaper, easier and quicker to do, but if you know what you fixed, then you also know what's left. And that also helps you refine the containment process during incident response. You know that the attacker can't take advantage of certain exploitable vulnerabilities or domain admin credentials, because you've done the work up front to protect them. That is a really powerful way to help your containment strategy. It's going to really make your response process more effective. Nathan Wenzler (32:25) So don't lean on incident response as the only mechanism, right? The preventative piece is key. And the more that you do to gain visibility of your environment, the more you do to understand how it all connects, in terms of: if this perimeter system is breached, what can they do from there?
Whose credentials could they compromise? What other vulnerabilities might be out there that they could move to? Understanding all of that, and then starting to build a security plan around it up front, that is really what your insider threat management plan is. That's where you've got to focus. Then, when the inevitable happens, you've got your response plan to start to mitigate after the fact. Karissa (33:09) Yeah, that's interesting, that's excellent. I love what you're saying, Nathan, but when it comes to the reality of doing this, do you think this type of stuff just gets pushed down the chain because people have got other things to do? Like, oh my gosh, we have a thousand other things that we've got to do, or we've got to make sure that we're compliant, and then we're going to get audited, so we have to prepare for the audit. And because this isn't a live thing that's happening, it's just so easy for it to go by the wayside. Do you think that's why, when things do become live, people are in panic mode? Because they haven't had the time to prepare and be preventative, because other things are taking place? Nathan Wenzler (33:51) I think that's a huge part of it. If we're talking about panic, part of the load is, yeah, they're not prepared, they haven't done the work. You have no idea what your environment looks like. You don't really know the lay of your own land. You don't know what other compromising things are out there: misconfigurations, credentials, vulnerabilities, or whatever. You've made the response part that much more complicated in a lot of ways. It's sort of like car maintenance, right? Why do we get oil changes in our cars instead of just letting the engine burn itself up and then getting a new engine? Because it's cheaper and easier and simpler to do the maintenance up front, and not have to go through the damage and the hassle and the weeks of time replacing the engine. We're kind of talking about the same thing here.
Is replacing your oil a fun process? No. We'd all rather be doing other things with our time, but I guarantee you it's less of a problem than getting your whole engine replaced. Karissa (34:53) That is so true. Nathan Wenzler (34:55) There's a lot of that kind of thing going on, and I would say that what I'm seeing change in the industry, and this is a really big shift that has been happening even in the last five years, but probably more so in the last two or so, is that organisations are really starting to understand the value of better communication about technical problems to a non-technical audience. So you mentioned, and I'm going to use your own words, we have 100,000 things to do. It just seems like too much. What am I going to do about it? Right? That's exactly the problem the security industry has had for years. We go to our C-suite executives, we go to business leaders and other teams and say, hey, we found 300,000 vulnerabilities last month in the environment and we fixed 100,000 of them. Now, if you're a security practitioner and you live in that world, that sounds pretty awesome, right? Fixing 100,000 vulnerabilities in one month? I've done a really good job, I'm proud of myself. But you know what your chief financial officer is going to say? So you only fixed a third of the things. That's an F, you guys are failing. Nathan Wenzler (36:10) Why am I budgeting your team when you only did one third of the work? You told me you have 300,000, and you only fixed 100,000. They don't have the context. They don't understand the meaning behind the kind of volume-based metrics that security tools are really good at producing. So we have to start becoming more mindful about how we communicate. Volume-based metrics like that are really important for understanding workload. Right? I may need to go to my IT team and say, listen, we've got a lot of problems, we have X hundred patches to deploy, it's a lot more work this month.
Let's talk about how we do that. But when I go up to the leadership folks, and again, we're talking about budget approvals, we're talking about more staffing, we're talking about even just culturally having them see security's value to the organisation, you've got to communicate that kind of information differently. And when you can do that, when you can come to your boards or your C-suites and say, listen, I can show you we have a letter grade for our environment right now, and over the last nine months you can see that our letter grade has gone from a D-plus to a B-minus; we're still working, we're heading in the right direction, but things are getting better. Nathan Wenzler (37:30) I may not understand a thing about cybersecurity, but I can understand a grade or a score or a trend line. These are things I can easily understand, and at least intuitively know the direction we're heading. If you take that to its next step, and we're talking here about attacks, insider attacks and that kind of thing, trying to create visualisations of those kinds of problems is an even more powerful form of communication, at least I find. So one of the talks I was giving recently was about this concept of attack path analysis. It's a very common forensic tool. We see it used after the fact for data breaches all the time, where they go in and map out all the data, and they can put together a nice little flow line that says, this is the system the attacker broke into, they moved to this system, they performed a privilege escalation attack, they moved to this system. It's literally like a Visio diagram, and I'm dating myself with my Microsoft tools there. But it really is a visual representation of how an attacker moved through the organisation to attack them.
Nathan Wenzler (38:47) What I've been starting to recommend to people is to look at that up front, right? Don't look at those types of things after the fact. Let's go back and look at: what does my environment look like from a security posture standpoint? Where are all the vulnerabilities, the misconfigurations, the potential problems? Where could it all go wrong? And if I can correlate that data and give it context against each other, I can visualise the same thing. I can actually say, well, we see these vulnerabilities here; that means an attacker could do lateral movement to these other systems or get access to credentials. And we sort of hypothesise what they could do. What the visualisation can help with, though, and I've seen a couple of organisations go through this exercise, is that inevitably there are one or two choke points, where they start to say, I have this one server in the middle of my environment that doesn't require any elevated access to get to, but it's been given, because of service accounts or a legacy application, full domain admin access to everything on the back end. And so you'll see literally hundreds of potential attack paths flowing through that one server. Nathan Wenzler (40:00) It's a visual. I can go to my board and say, hey, I need to protect this server because I found 47,000 vulnerabilities on it, but it's really more compelling when I can show them a quick screenshot and say, out of all the systems in our environment, with this one you can see visually how many potential routes an attacker could use to completely compromise us in every way. That visualisation of data takes it away from the big numbers and puts it into a context that business leaders can understand. That's the thing they can understand. And now you can start to talk about it from a risk perspective, and they're going to be more engaged: oh yeah, that's bad, what do we need to do about it? All right, let's talk about the plan.
I need this much budget, I need these tools, I need these people. It changes the nature of the whole conversation. So the communication piece, right, is so key, in terms of how we prioritise, how we get buy-in, how the leadership folks understand where to start in the midst of all that noise. Don't give them the hundred thousand options, give them the one or two options. Nathan Wenzler (41:06) These are the areas we've got to focus on. That changes the whole game for security folks, and that's where I really do see the industry moving, more and more in that direction, as we try to help make better decisions about all this stuff. Karissa (41:19) I hope so. I'm glad, because I think that's definitely a key component that a lot of people have not focused on. They've definitely been focused on the technology side of things, which is critical, but I think it's how we communicate, and the discourse in which we speak to people like CFOs, for example, and what things are going to be important to them. So I definitely second your opinion, and I do hope to see that there's a change in the space. But one of the things I'd like to maybe close the interview on, Nathan, is this: we started with the 44% increase over the last two years. Do you have a hypothesis on what we can expect, in terms of whether insider threats will increase over the years? I mean, that 44% is quite significant, based on what you said today, whether it's an oversight or a credential compromise. Will it get worse, or do you think that we've sort of got it under control? Nathan Wenzler (42:09) I think right now my sense of it is that we're kind of in a plateau state, right? It's still growing at about the same kind of rate, but frankly, the global economic situation is becoming what it is, and there's a lot of fear about that.
I think we're seeing that the sort of great reshuffling activity is starting to come down a little bit, which is going to, again, limit the number of opportunities for those kinds of accidental potential compromises or data losses. The key, though, is that it could very easily ramp back up, because the core of the problem hasn't changed. Again, it's not just about people moving, it's about the fact that we as organisations don't do a good job of controlling our virtual spaces. We don't do a good job of controlling endpoints for things that are outside of our corporate environments. We don't have bring-your-own-device policies. There are a lot of things that organisations are not doing. It's really not the employees' fault in that case. So until folks take that part of it seriously and start to make the changes accordingly, right, to get away from this sort of free-for-all thing that we did at the early part of 2020 when the pandemic started, where we just said, doesn't matter what we do, let's just keep the business running, keep the lights on. That time is done, right? Nathan Wenzler (43:33) We kept the lights on, we worked through it. If you are an organisation that's still operating under the mindset of, doesn't matter what we do, let's just keep everything going, you're behind the curve, and you're going to be seeing a growth in these kinds of attacks and these kinds of problems, because you haven't really started to build the controls around all of those moving parts that you need to. But I think it's going to be an interesting thing for the next year or two, to see what that number does. And I agree with you, it was a huge, huge increase over the last few years. If people have gotten the message, and we start to see some work towards that basic management of the things that are often compromised, that number should start coming down.
If people have decided that it's just, frankly, not very fun cybersecurity work, and I want to just buy the next hot tool and have some fun with something, a bright, shiny technology, yeah, that'll drive it right back up again, because the underlying problem isn't going to go away. Karissa (44:32) Yeah, most definitely, I understand what you're saying. So hopefully now, because people aren't at panic stations, like you said, keeping the lights on when we were going through the COVID crisis, people can sort of take a step back, have a moment of clarity, reassess, understand what they need to do, and work towards fixing a lot of the problems and the issues that they've got. In terms of any sort of closing comments, do you have any final statements you'd like to leave with our audience today, Nathan? Nathan Wenzler (44:56) Well, I feel like we both, collectively here, had a good conversation about the panic and what you're doing in response. I think that's the thing I'd want to reiterate, right? Be prepared. This is really what this is all about. It's not meant to scare you, it's not meant to be about invoking a whole lot of fear, uncertainty and doubt. This problem can be managed. It's not impossible. It does require some work, right? We've got to understand our environment. We've got to do the work to put good controls in place, controls that still allow the business to function. There's a lot of things we've got to do. But be prepared. The more you do up front, the more you have, as part of your programme, a more preventative approach to this problem, understanding that there's no 100%; you still need to have your response tools in play, you still have to have a recovery mechanism. All really critical, but preventative is the place we need to live. The more we do up front, the easier it is, the cheaper it is, the faster it is, and we're going to limit the amount of damage overall over time. Nathan Wenzler (45:58) It's, again, the oil change analogy.
Do that work up front as much as you possibly can, so that you're not in that panic state as often, or the panic won't be as severe when it happens. And that's ultimately the best way you're going to get through all of this. Karissa (46:15) Most definitely. And I think this is a good reminder for people to go and check their vehicles immediately, whether that's a metaphor for security or an actual physical vehicle as well. So thanks very much for your time, Nathan. I've really appreciated your honesty and your thoughts and opinions, and I can't wait to get you back on the show. Thanks for joining. Nathan Wenzler (46:34) Appreciate it. Thank you so much for your time today. Karissa (46:36) Thanks for tuning in. We hope that you found today's episode useful and you took away a few key points. Don't forget to subscribe to our podcast to get our latest episodes. This podcast is brought to you by Mercsec, the specialists in security search and recruitment solutions. Visit them to connect today. If you'd like to find out how KBI can help grow your cyber business, then please head over to KBI Digital. This podcast was brought to you by KBI Media, the voice of cyber.