You're listening to KBKast, the cybersecurity podcast for all executives, cutting through the jargon and hype to understand the landscape where risks and technology meet. Now, here's your host, Karissa Breen.
William Makdessi. Welcome to the show. I feel rude, but I haven't actually had you on here before, so I apologise, but I'm keen to get into your thoughts because you've got a very interesting background that you walked me through, but when we had that chat, you sort of spoke a little bit more about your experience with local and state government. So I'm keen to jump into that and start with that straight away. So tell me a little bit more about your experience with working with local and state government.
William Makdessi (01:05)
Yeah, sure, Karissa. I've worked with InConsult for about two years now. InConsult as a whole has worked with over 90 of the 128 New South Wales local government councils, including New South Wales state government: everything from your busy metropolis councils to the rural councils that rely on diesel generators and wireless antennas just to get an internet connection, you name it. We have a team that performs audit as well, internal audit and such. And then we have our risk team, which is essentially where I sit, doing pretty much everything for local and state government, including traditional risk management, climate risk, modern slavery, but my focal area, which is cyber risk, and that's where I sit.
So when people think about risk management, what's your definition of risk management, and then, more importantly to what you do today, the cyber stuff? Because I think there are always people just floating around with, oh, we've got to decrease our risk, but what does that actually mean to you?
William Makdessi (02:04)
Yeah, that's a good question. Risk management is not what it was traditionally. I'll speak of it traditionally from the council perspective, since we have a lot of experience with councils. For councils, typically you have your regulatory, financial, operational, reputational and service delivery risk. Now, service delivery risk typically didn't include much IT; there was very little emphasis on information technology. Services were hosted on a single on-site server, with air conditioning in the room, moisture sensors, you name it. It was very simple back then, and web threats were also not a very common discussion. So risk management has been around for some time, but it isn't what it is today. Nowadays, risk management kicks off discussions about starting up full-time security teams, consulting with external cyber risk experts, and performing independent reviews and gap analyses just to stay up to date with regulations and standards. The landscape has completely changed and there are plenty of threats.
Yeah, you raise a good point. The landscape is obviously always changing. Do you think that's hard to handle as well? Because, I mean, at the end of the day, it's a bit exhausting. It's like, oh, we just mitigated that risk, and another five pop up. So do you think it's overwhelming for people?
William Makdessi (03:16)
Definitely. Our approach to that, and how we can help people manage it, is that we're essentially coming into these local governments in an area that's uncharted territory for them. For them, cyber risk is the new kid on the block. So what we do is act as a bit of a mediator in what is typically a technical process. I do have experience working as a network admin in the past, working with threats firsthand. So using that experience I had as a network admin, I translate technical findings or areas of improvement into something that's easily interpreted by the risk teams and leadership teams. It's not often that you would have a leadership team that knows why the FTP protocol is unsafe. So we explain the consequences, the cost to remediate, give an example of maybe a client we know that's actually been exploited by FTP vulnerabilities, and it just makes a lot more sense for them. And because of our large client base, it does help when we try and relate these potential issues to real world cases, because real world cases will always trump theory in that situation.
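As a quick aside on why plain FTP keeps coming up as unsafe: the protocol's login commands travel over the network as readable text. A minimal sketch (the host name, user and password below are made up for the example, not from the episode):

```python
# Illustration of the FTP weakness discussed above: the authentication
# commands an FTP client sends (per RFC 959) are plain ASCII, so anyone
# who can observe the traffic can read the password directly.

def ftp_login_bytes(user: str, password: str) -> bytes:
    """The raw bytes an FTP client sends to authenticate: no encryption."""
    return f"USER {user}\r\nPASS {password}\r\n".encode("ascii")

# Hypothetical credentials for demonstration only.
wire = ftp_login_bytes("council-admin", "S3cret!")

# No decryption needed: the password appears verbatim in a packet capture.
print(b"S3cret!" in wire)  # True
```

This is the gap that SFTP or FTPS closes: the same exchange happens inside an encrypted channel, so the credentials never appear on the wire in the clear.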
Yeah, that's so true. Practice trumps theory, yes. Do you think that perhaps people in our industry talk more about theories rather than practical experience? Maybe that's the disconnect between people actually understanding risk at a fundamental level: people talk in higher level theories and strategies rather than practical, on-the-ground knowledge. Would you say there's a bit of that involved?
William Makdessi (04:42)
Definitely. That's something we see the most of, actually, particularly when it comes to cyber risk. Traditionally, a lot of risk management is discussing frameworks, putting documentation in place, and maybe some business continuity testing. But that's about it. When it comes to cyber risk, there's a lot more requirement for actual testing of controls and implementing plans that have been exercised in real world scenarios. That's something we actually have quite a bit of experience in, and it seems to really benefit our clients. There really is nothing better than experiential learning. Being part of a real life exercise is so much better than just reading through the document yourself. Even if you read through it four or five times, the difference is profound.
So I want to understand why people are still attached to this mentality of reading some document five times versus real world experience. Why is that the case? Because you are right. I mean, I'm definitely more of an on-the-job learner than anything else. But I'm just curious to know, from your perspective, with your experience, why are people still operating like that?
William Makdessi (05:49)
Well, if we think about Australia in particular, I think the main problem has got to do with the general definitions in a lot of the standards that we have here. So if we look at APRA CPS 234, then we look at the ACSC Essential Eight guidelines, a lot of them are very general, and there are no specific guidelines around how specific testing or control testing should be performed, or how often it should be performed. In fact, one of the most common terms I see in APRA CPS 234 Information Security is the use of the word regularly. Now, what is the definition of the word regularly? And that's a big problem, because the definition is open to interpretation. And that's one thing I've learned over the years: the interpretation of an individual is something that cannot be controlled or defined. It is up to them.
So what would you define as regular?
William Makdessi (06:39)
When it comes to cyber risk? Good question. When it comes to cyber risk, I tend to tell people regular is typically twice a year.
And so what do you think other people's definition of regular is?
Once every ten years?
William Makdessi (06:52)
It does depend on the context. These are standards such as APRA CPS, which are quite high level. We're talking about major control testing here. So twice a year is probably the most frequent I would put as a definition for regularly. But then you've also got some councils or organisations that are smaller; they might like to put in place a multi-year plan to tackle cyber security. So their multi-year plan might be testing over a two to three year period, and by the time they reach the end of that two to three year period, that's the completion of the testing of the entire framework. And that's acceptable as long as it's documented. One of the biggest problems when it comes to these definitions of regularly is that organisations don't actually define what their roadmap or their plan looks like.
So if I'm a customer and I'm reading this and it says 'regularly', do you think people skim over that? Do you think they don't take it with any seriousness? Or is that why they would then engage companies like yours, to actually define what regularly means? Because it may be once a year, twice a year. There's the gap then.
William Makdessi (07:57)
Yeah. That's the question we get all the time: what does it mean by regularly? Too often we have clients, or even new clients, who come across and say, hey, we just wanted to reach out to someone with experience to understand: what does regularly mean according to APRA? What does regularly mean according to the Australian Cyber Security Centre? It's a common question, but then you do have situations where a highly mature organisation might use that to their advantage. They know it's a bit of a loophole, and they know they're lacking resources this year or the last couple of years. So they'll take regularly, in their own interpretation, to be: you know what I think that means? Once every two years.
I was just about to ask you that. That would then mean that there are loopholes, because it is their interpretation and there's no definitive, this 'regularly' is once per annum, or whatever it may be. So if a company is low on resources or they've got other stuff going on, which I can understand and I have a level of empathy for, doesn't it then create other problems? So hypothetically, a company could get away with that, because APRA could be like, well, I guess we didn't define regularly and how often it should be performed or not performed. So then what does that mean for these guys that perhaps are just doing it once every couple of years or once in a blue moon? What's your experience with that?
William Makdessi (09:12)
It's just a delay in their cyber resilience and their maturity, which in the end only benefits their lack of resources. It does not benefit their posture; it does not reduce the potential for them to actually experience a cyber attack. It's increasing the likelihood of them experiencing a cyber attack. And as we know from a lot of the other podcasts we've heard, a lot of things we've seen in the news and a lot of stats that are out there, the cost of a cyber attack is so much more than the investment in the cyber strategy to prevent it. So there is really no benefit besides the fact that they are just pushing out the inevitable. It has to happen eventually. And if there is any kind of audit or tripartite review that takes place, if there isn't sufficient testing, if there isn't regular enough testing, you can't just put forward an excuse. It's just not going to work; it's not going to hold up with a reviewer and all that.
So I'm just really fascinated by this. With the APRA stuff, hypothetically, if it says 'regularly', why don't they have guidelines, perhaps? I know it's going to depend, and it always depends and all this and that, but surely there should be something there, because there is that loophole and then there's no real accountability. Then it's like, oh, regularly could mean once every six years. I'm curious to know, how do we then keep people accountable?
William Makdessi (10:23)
Yes. No, I totally agree. And I think what needs to happen is there needs to be more guidance around two things. One is the implementation of a statement of applicability, which is something that's typically used with ISO standards, but I'm trying to encourage its use in other frameworks and standards as well. And what that does is basically set the context from a high level, coming straight from the leadership team: we are going to commit to these parts of the framework because they are commensurate with our infrastructure. There is no way you can create a standard that is commensurate with every infrastructure around the globe. It doesn't work.
No. So one of the things I'm curious to know as well is, with these long frameworks, NIST or whatever it is, they're quite lengthy, right? You've got to really trawl through stuff. Don't you think that's a little bit counterintuitive? If you've got to read some super long document of all the things you've got to cover, people just don't read it, they can't be bothered, because it's all too hard, they feel overwhelmed, and then nothing gets actioned. So why are we not trying to reduce it? And I get that you've got to have these long frameworks and all that, but what about trying to condense some of the key points? Because it is frustrating, it is arduous, and people don't want to do it, because it's another thing they've got to do in terms of compliance and keeping their head above water and keeping the lights on. I'm curious to hear, from your point of view, what's going to happen about condensing these long frameworks.
William Makdessi (11:47)
Yeah, so this is actually where a statement of applicability would come in. This is a statement that you make when it comes to your commitment to a framework. So if we look at the NIST Cybersecurity Framework in particular, for example, it is not something you can be accredited for or certified against. It is generally a voluntary commitment to the framework. So in that case, you put forward a statement of applicability and you say, at a high level, at the executive or leadership level: I'm going to commit to these areas of the framework as they are commensurate with our infrastructure. And that's how it will work. You will only be assessed against those areas of the framework. And that's something that is lacking in people who are implementing the NIST Cybersecurity Framework. If you look at anyone who's using ISO 27001, for example, which is a less technical but broader framework, they tend to put an SoA in place. And the reason behind that is that the entire ISO standard won't necessarily apply to the organisation. We can't just go and implement this entire new framework, because it is profound, especially when we're working with small to medium enterprises that may not have adequate wireless security or may not even know what boundary protection is or anything like that.
William Makdessi (12:54)
There are quite a few different things that may not be necessary for an organisation of that size.
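In data terms, a statement of applicability like the one described above is just a per-control record of whether it applies and why, with assessment scoped to what leadership committed to. A minimal sketch, with illustrative control entries (the IDs and justifications are examples, not from any council's actual SoA):

```python
# Sketch of a statement of applicability (SoA): each framework control is
# marked applicable or not, with a justification, and only the applicable
# controls are assessed against. Entries below are illustrative.

soa = [
    {"control": "PR.AC-1: Identity management", "applicable": True,
     "justification": "Core requirement for all council systems"},
    {"control": "PR.PT-4: Communications network protection", "applicable": True,
     "justification": "Council operates its own network boundary"},
    {"control": "DE.CM-8: Vulnerability scanning", "applicable": False,
     "justification": "Outsourced to managed service provider this period"},
]

def in_scope(soa: list) -> list:
    """Return only the controls the organisation has committed to."""
    return [entry["control"] for entry in soa if entry["applicable"]]

for control in in_scope(soa):
    print(control)
```

The value is exactly what William describes: the scope decision is explicit and documented, so a reviewer assesses the two committed controls, not the whole framework.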
Yeah, I understand that and I think that's a really, really great starting point because, again, it's overwhelming for people and then as a result, they just don't do it. So thanks for clarifying that. So one of the things I want to sort of go back on now and to get into this a little bit more, is your approach to cyber risk management when it comes to local and state government, because I think people have these preconceived notions in their minds. If you wouldn't mind sort of sharing some of your insight and your experience with working with these government agencies.
William Makdessi (13:25)
Yeah, so one thing we see with local government when it comes to risk management in general is that they do tend to be a bit on the front foot. And this is primarily because of the regulation of risk management statewide; there are requirements by the Office of Local Government and the Audit Office of New South Wales. Because of this, generally speaking, LGAs are a little bit more mature when it comes to risk management. There are still your typical fumbles, like there might be some outdated testing, things like that, but overall the maturity is higher. On the other hand, when it comes to cyber risk, it's not uncommon to see it develop slower in local government. In our experience, other industries are implementing rapid transformation projects with considerable investment in long term cyber strategy, so it's happening a lot faster in other industries. It's this area in local government that's the most lacking. It's the new kid on the block. And when you have heavily formalised regulation of your risk management framework, you can't necessarily skip steps to get things done faster. So when it comes to that, that's where we step in again as the mediators: we try and simplify the process for them.
William Makdessi (14:23)
So a lot of these findings, a lot of these gaps that are found, are sometimes new even to the information technology teams, because they're areas of security that have only been around for the last five or ten years. A good example of that would be something like DMARC policy security in email. That was something that was profoundly lacking in local government until recently, as part of a study we conducted. Essentially, when something is new, the progress is slow in local government, unfortunately, and that's where we step in to explain the benefits of it. And when we look at DMARC policy security in particular, it's free of charge; it actually costs nothing to do. It's a little bit of assessment, the setting up of a domain record, and you're good to go. You're basically preventing any would-be threat actor from performing a hijack or a spoof of an email address. So it's a very strong control to implement.
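The "domain record" mentioned here is a DNS TXT record published at `_dmarc.<domain>`. A minimal sketch of what assembling and reading one looks like (the domain, report address and policy choice are illustrative, not from the episode):

```python
# Sketch of the DMARC record discussed above: a single DNS TXT value made of
# tag=value pairs. "p" is the policy receivers apply to mail that fails
# SPF/DKIM alignment: none, quarantine, or reject.

def build_dmarc_record(policy: str, report_addr: str, pct: int = 100) -> str:
    """Assemble a DMARC TXT record value."""
    if policy not in ("none", "quarantine", "reject"):
        raise ValueError(f"unknown DMARC policy: {policy}")
    return f"v=DMARC1; p={policy}; rua=mailto:{report_addr}; pct={pct}"

def parse_dmarc_record(record: str) -> dict:
    """Split a DMARC record into its tag=value pairs."""
    tags = {}
    for part in record.split(";"):
        part = part.strip()
        if "=" in part:
            key, _, value = part.partition("=")
            tags[key.strip()] = value.strip()
    return tags

# Hypothetical council domain: tell receivers to reject spoofed mail outright
# and send aggregate reports to the nominated mailbox.
record = build_dmarc_record("reject", "dmarc-reports@example.gov.au")
print(record)
tags = parse_dmarc_record(record)
print(tags["p"])  # reject
```

Publishing that value as a TXT record on the domain is the whole job, which is why it is described as a near-zero-cost control.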
You said before you don't want to skip steps to try to go faster. Do you think that often happens, though? Because people just want to get this done: I'll just skip steps five, six and seven to get to step 20. Does that happen?
William Makdessi (15:21)
Well, in other industries it's not really skipping steps when there's no step, is it? When we're talking about local government, there are steps that have to be taken. There are formal processes, approvals, budget allocation. When we're talking about other industries, they may not have those particular steps: you may not have to get approval from a particular leadership team, whereas in local government it may have to be passed on to councillors with a business case. It's quite different. So it's not necessarily literally skipping steps; it's just that those processes in other industries are a lot simpler, and maybe simpler is a little bit better when it comes to cyber risk, because we're talking about a threat landscape that is evolving so rapidly. It's not like traditional risk management; we need to move a little bit faster.
Well, I guess that's what sort of leads me to my next question, which you've sort of already identified, is the differences in the approach that you found towards cyber risk management. So we just said before that of course there's a little bit more set in stone policies and standard operating procedures that government then have to follow versus perhaps another company that doesn't need to get approval and all that type of stuff. Do you have any sort of other differences that come to mind when I'm asking this question?
William Makdessi (16:26)
Yeah, definitely. I would say it's the maturity. The maturity is something that's a huge difference. So when we're talking about different sized councils, for example, or even just organisations in general, there is a great difference in maturity versus size. It's actually not linear, it's not what you would think.
So is that people's perception then, that bigger in size means more mature? Is that where that fallacy comes from?
William Makdessi (16:50)
Yes. So essentially there is that assumption. A lot of the security maturity we see is typical of a good security culture; it's not what the assumption out there would suggest. So, good security culture, not just in IT teams but across the entire organisation. The maturity of frameworks can only be defined by a budget so much. And that's probably one of the biggest problems I see in our industry: it seems to be a very budget-centric discussion, especially when there are numerous security controls that don't even cost a dime. Why is budget the biggest concern? The drive of the IT staff to stay in the know of the latest trends, the positivity and relationship that they share with the business side, the enthusiasm of general staff and the understanding they have of their own responsibilities when it comes to secure practice: there are so many different factors. We've seen incredibly small organisations, we've seen incredibly small local governments in rural areas with less than 50 staff, sometimes with greater cyber resilience and maturity than even ASX-listed multinationals.
William Makdessi (17:48)
And we've done work for
Well, that's wild. I'm curious to know, you said the budget, so why is that sort of like a roadblock for them? Like you just said that there are controls that are free that people can implement, so why are they so focused and hung up on the budget side of things?
William Makdessi (18:03)
That's a good question, and I think that's where my technical background helps out quite a bit. Working in cyber risk, there is that lack of understanding of the technical side. Also, one of the hottest topics a couple of years ago was the issue of budget allocation to invest in cyber risk. That was such a hot topic in the media that it became the focus, and I think that made it feel like the primary cause of this issue on the surface. But working with organisations and government to create, improve and test their information security frameworks, we always see otherwise. I'll run a phishing campaign across an organisation, and the data from a phishing campaign alone, so not even the data from the entire framework, just from a simple phishing campaign, shows that staff education has the greatest impact on user compromise. So it doesn't necessarily have anything to do with budget. It doesn't mean we need to do more phishing campaigns, bigger phishing campaigns, spend more. You could spend millions on next generation end user security and suddenly a staff member legitimises a BEC attack, which is a business email compromise attack. Now, what's unique about a BEC attack is they don't necessarily have cues that can be picked up by AI or advanced threat mechanisms.
William Makdessi (19:14)
They won't have an attachment, they won't have a suspicious phishing link, they won't have any cue demanding an urgent response from the individual. They'll start off as a basic email requesting a change of bank account details because they're overseas, pretending to be a staff member. Something very simple. The final backstop to that email is the human behind the screen. And sometimes all it takes is a quarterly email to staff to just remind them of this ever-present threat. Or even if you were to invest in cyber risk awareness training, that is hands down one of the best bang-for-buck investments that you can make. It's not something that would prompt a careful budget allocation or break the bank. It's something that's incredibly low cost and will greatly improve the cyber resilience of an organisation. So why is budget allocation the biggest topic of concern when there are so many things that are very good bang for buck and very cheap? And as I was saying earlier, when it comes to DMARC policy, for example, there are a lot of protocols or configuration changes that can be made to a domain, to an infrastructure, that can completely change its resilience.
William Makdessi (20:15)
And it doesn't actually cost anything; it's just the time and effort of the IT team or the individuals involved in making those changes. Nothing more than that.
Yeah, so I think that's potentially the bottleneck: it's the time and effort, which they don't have. I think it's that. It's like, okay, we need budget to go and hire Will, for example, or whoever, to come in and help us. Right? I think it's that then, don't you think?
William Makdessi (20:36)
That's a good point. And that's actually why vulnerability management is a big topic, especially in the cybersecurity framework. Vulnerability management itself, the definition of it, isn't very well understood. Essentially, what vulnerability management is, is getting those IT teams to understand the issues they have and prioritise them appropriately. There is what I like to call IT debt, that pretty much every IT team has. It's a debt of tasks they need to complete, and almost always they have too many tasks, not enough resources, and it just keeps growing. So vulnerability management is an assessment of those tasks and identifying what has the biggest impact on the organisation. Now, getting IT teams to understand risk management, to do a risk assessment in the first place, is usually something outside the scope of their role. There needs to be more of a blend between risk management and IT roles. There needs to be that collaboration between risk teams and IT teams to understand how these vulnerabilities could impact the business and how to prioritise them, so that even though they may have a lack of resources, they can identify the higher probability and higher impact risks and address them in a time frame that is appropriate to the amount of risk they pose.
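The prioritisation described above is, at its simplest, a likelihood-times-impact score over the IT team's backlog. A minimal sketch, with illustrative scales and made-up backlog items (no particular framework's scoring is implied):

```python
# Sketch of likelihood-x-impact vulnerability prioritisation: score each item
# in the IT team's "debt" and address the highest-risk items first.
# The 1-5 scales and the backlog entries are illustrative.

from dataclasses import dataclass

@dataclass
class Vulnerability:
    name: str
    likelihood: int  # 1 (rare) .. 5 (almost certain)
    impact: int      # 1 (minor) .. 5 (severe)

    @property
    def risk_score(self) -> int:
        return self.likelihood * self.impact

def prioritise(backlog: list) -> list:
    """Order the backlog by risk score, highest first."""
    return sorted(backlog, key=lambda v: v.risk_score, reverse=True)

backlog = [
    Vulnerability("Legacy FTP service exposed", 4, 5),
    Vulnerability("Missing DMARC record", 3, 4),
    Vulnerability("Stale guest Wi-Fi password", 2, 2),
]

for v in prioritise(backlog):
    print(f"{v.risk_score:>2}  {v.name}")
```

The point of the exercise is the ordering, not the exact numbers: with limited resources, the team knows to spend its time on the score-20 item before the score-4 one, and to agree remediation time frames that scale with the score.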
Yeah, that's so true, isn't it? And I think that's probably where that issue comes from, because they've got enough things on their plate, and then it's like, well, now you're asking me for, like you said, time and resources, which they don't have. So I think that's probably a broader conversation to have. But one of the things I want to talk to you about now, Will, is going back to the assumptions people have about big companies having their security on point versus companies that don't. And you said that's not the case, so just talk me through it a little bit more. You just referenced a few examples, the local government with under 50 staff whose cyber resilience is pretty on point. So do you think it comes from, they have that lack of budget, we've just got to do the best we can with limited funds? Big companies have probably got buckets and buckets of money; it's like, we'll get round to it when we can. So what do you think it is?
William Makdessi (22:44)
That's a very good question, and I think the lack of budget is definitely a driving force to find better ways to manage cyber risk. And that's when you see smaller organisations doing better, because they have that well developed culture. And culture sometimes doesn't cost much. Obviously there are investments in staff days and culture building exercises, things like that; there is a little bit of investment in stuff like that. But when you've got a good culture around cyber risk in an organisation, it's not like investing in end user security, it's not like investing in SOC infrastructure. It's something that you've developed through your relationships with people, something that can benefit the cyber resilience of an organisation. Again, as I said, the backstop at the end of the day is the human on the other side of the screen. If all of your formal controls or your technical controls fail, if a phishing email still gets through, if malware still gets through, that person at the end, the person in front of the screen, needs to be the one to have the vigilance to understand that it's a risk. And a lot of that comes back to that good culture, that positive culture around reporting.
William Makdessi (23:47)
Now, something we see too often is staff not reporting, especially with phishing campaigns.
Why wouldn't they report though?
William Makdessi (23:53)
It's the fear of getting in trouble.
Fear of getting in trouble for what, though? For not reporting, or running the phishing campaign, or people not doing the right thing and it's like, oh, we've got pretty low standards here? In terms of people being compromised in the phishing simulation, what is it?
William Makdessi (24:07)
It's particularly being compromised. Let's just say you had 20% of the organisation compromised in a phishing campaign. Of those 20% that were compromised, you might only see 10% actually report it. And that is common with a poor culture, and that's what we typically see. Actually, in a recent phishing campaign I ran, it was just a small portion of the organisation: eight people were compromised, and one person out of the eight reported that they actually clicked on the campaign. Wow. That is highly reflective of a fear around reporting. They don't want to be pulled up on the fact that they failed the test. They think they might be reprimanded. And that is why a positive culture is so important. And one thing I've been trying to tell organisations to do is, instead of communicating by email to your staff, did you know 20% of our staff failed this phishing campaign last month, you should be focusing on how many people actually reported it. Make it positive; maybe even reward the people that report it. Give them a voucher, I don't know, give them a day off, far out.
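The metric being pointed at here is simple to compute: of the staff a phishing campaign compromises, what share self-report? A minimal sketch using the figures mentioned in the conversation (eight compromised, one reported):

```python
# Sketch of the phishing-campaign reporting-rate metric discussed above:
# the share of compromised staff who self-reported clicking the lure.

def report_rate(compromised: int, reported: int) -> float:
    """Percentage of compromised staff who reported the click."""
    if compromised == 0:
        return 100.0  # nobody fell for it, so there is nothing to report
    return 100.0 * reported / compromised

# The campaign described in the episode: 8 compromised, 1 self-report.
print(f"{report_rate(8, 1):.1f}% of compromised staff reported")  # 12.5%
```

Tracking that 12.5% over successive campaigns, and celebrating when it rises, is the positive-culture framing being advocated, as opposed to publicising the failure rate.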
People reporting it last night in Cinnamon.
Okay, but don't you think we've sort of bred this culture? I mean, in my experience of running reports is what I used to do, we used to even compare is not the right word. But we would look at each function of the business and say like, oh, these guys are up this month, these guys are down this month. So it's like. We've bred this culture a little bit like, oh, well, yeah, look, 20% of people in your area failed the simulation and don't you think we've sort of got ourselves into this mess?
William Makdessi (25:37)
That's very true. That same organisation where I ran that last phishing campaign, a month prior they ran a campaign with the IT team, and the stats were twice as good as the stats of any other area. So obviously, that does help in the development of that smug culture. Like, IT did a little bit better than you guys. But of course they did better; they're educated on this.
You'd hope they did better.
William Makdessi (26:01)
Exactly. I mean, even if we're talking about finance, finance should do better because they're dealing with digital fraud all the time.
Yeah. In case it happens.
William Makdessi (26:10)
Yeah, exactly. So there are certain areas that will do better; it's just in their nature, it's inherent for them to do better. But we can't go around comparing, because there's no benefit to that. What we need to do is focus on conducting organisation-wide cyber risk awareness training. The same level of training for everyone. And that's the issue. We don't go out and select the privileged user access accounts and just give them a little bit more training; that's not how it should be done. You don't want to help develop that smug culture. Everyone should get the same training. It should be, I'll use the word regular, because that's a funny word in this conversation at the moment, but regular, which in the case of cyber risk awareness training is at least once a year. At least once a year, you have to train every single staff member, including contractors and temps. They need to be on the same level. That education is critical to how responsive staff are, but it also helps to develop that positive culture. Make it fun, have the IT team involved, develop that relationship between general staff and the IT team.
William Makdessi (27:10)
Because of the age-old issue of IT being a different breed: they were always singled out as a different type of person or a different type of team. Getting in touch with them is typically only through the service desk or a support ticket, and even then, you wouldn't really go beyond that. It's something that needs to be broken down.
It's still like that, though. Oh, you'll have to go through the service desk. Oh, sorry, you're ticket number seven. It's still like that, then. So we talk about this, but are we doing anything about it, though, really?
William Makdessi (27:36)
Yeah. That's why we need to involve IT in these things. That's where I like to talk about my experience as an IT admin in the past. People actually like that. I mean, an IT admin giving a presentation on cyber risk awareness, speaking in a language I can understand, making it fun. It's actually something that anyone could be a part of, anyone can learn from. We're just not approaching it correctly.
And do you think that will change over time? Because I feel like people say it is, but then I feel like it's not, and in some instances it is. What are your thoughts on that?
William Makdessi (28:03)
I think it will get better. The only problem I see is there are a lot of these cheap and easy solutions online for doing training and phishing campaigns. Don't get me wrong, they're great and the cost benefit is fantastic because a lot of them are very low cost. But it still doesn't help with developing that team, developing a relationship with the rest of the organisation. That's something that can only happen if people stand up and actually have presentations together, talk to each other, start developing that relationship themselves. You can't just go and sign up for a training module online and tell everyone, okay, go and do this or you're going to get reprimanded. That doesn't build a positive culture. Is it a lot easier to do it online? Heck yeah. But it still needs to be personable. You need to make it fun, you need to involve people from IT.
Yeah, I guess I hear what you're saying, and it's a lot easier to say, you go ahead and do it. But the reality of people actually doing it is slim. I mean, I've been on those outstanding non-compliant lists multiple times in security myself. I was on those lists and I just didn't care. That's the fact of it, I didn't care. It was boring, it was basic, and I was like, why are we doing this? And I was in that role. So imagine someone else sitting in a completely different arena. They're probably like, what on earth is this?
William Makdessi (29:13)
Yeah, I actually went through the onboarding process and did some e-learning modules recently for another organisation, and it took me a solid day, and this was me absolutely whizzing through it, actually guessing some questions to try and make it go faster. And by the time I hit about 3:30, 4 o'clock, I was completely brain dead. I actually don't even remember half the stuff I went through, to be honest. I just wanted to get it done, because if you don't get it done, you don't move on to the next part of your employment. Some people actually make it a requirement in the first one or two weeks to complete those learning modules. But what we see in local government, which is quite common there and something I don't really see in other industries, is there are typically e-learning modules implemented, so in the first week or two you have to complete acceptable use courses, security courses, fraud and corruption courses, things like that. There are e-learning modules that run you through the policies of the organisation. But then what they also do, even though the staff members have completed those modules, is they'll consult with someone like us and get some risk management or cyber risk awareness training done once a year and actually make it personable.
William Makdessi (30:19)
And local government is probably one of the best at that, really pushing the cyber risk awareness training. So I'll definitely have to commend them on that.
Can you say make it more personable? What do you mean by that? What does that look like, having someone...
William Makdessi (30:29)
Having someone from IT actually talk to the staff, having someone present. So what I tend to see is, in your average-size local government council, you might have anywhere from 18 to 20 one-hour sessions to cover the entire staff of the council. And I'll have the IT manager show up to every single session out of those 18 to 20 sessions.
Do they show up willingly, though?
William Makdessi (30:51)
Of course, yeah. Not because it's compulsory or anything.
So this is a difference, then, between local governments and just another regular company out there, where the IT guys would be showing up begrudgingly. Is that what you're saying the difference is?
William Makdessi (31:05)
Well, that's what I'm saying. In my experience, I see IT managers show up even if it's just for half of the sessions. The fact that they showed up to that many when they're the IT manager of a local government council with 500 staff, it's fantastic. And the staff appreciate it. They see the IT manager there. Sometimes the IT manager would jump in and drop a sentence or two, make a comment or have a laugh with one of the staff members, because they're going through an exercise where they're picking up red flags in an email and one of them might say something funny, so the IT manager would jump in, and it helps build that relationship. It's fantastic. There's really nothing that can replace that.
So what do you think that stems from? So this stems from, again, the culture, how they've gone about things initially, which is sort of bred into the people working there. Right. Then there's a company that's like, we don't have anything in place, the IT guy can't be bothered showing up half the time. So do you think this stems from the leadership perspective, engendering the right culture, which then permeates down, and then you do have the IT guy willingly showing up to conduct these sort of training sessions? Is that correct? And then I want to understand the theory around why, if it's so easy, this hasn't been done in other companies.
William Makdessi (32:15)
I think that has to do with the fact that a lot of organisations outsource parts of their infrastructure. There's a lot of use of third-party vendors. If we look into, let's say, the insurance sector, for example, one thing we see in insurance and reinsurance is there's a lot of dispersion or diversification of their IT vendors. So their IT team is not made up of one single IT team. It's made up of three or four different IT teams across different organisations that have one service desk each, one method of contact each. They have different lead times for different priority tickets. It's just this shambles of different kinds of teams that you have to deal with. So you kind of lose touch with the IT team in that way. One thing I see with local governments is they typically have one IT team, and the IT team shows up to the office. They know them personally. They're people that have stuck around since the legacy mainframe infrastructure days. That's probably one of the contributing factors to that, I think.
Okay, I hear what you're saying, but if we zoom out for a second, the reality is, in a big company or whatever, you have to engage third parties. You have to have external vendors and service providers, so it's hard to get around. It's very rare nowadays to have everyone in-house versus the outsourced model, which they have to use for certain technologies or whatever it is. And people don't hang around for like 20 years anymore. So then how do you manage that?
William Makdessi (33:35)
Yeah, that's a good point. It's difficult. You just have to work on that culture. I can't stress it enough. The only way to develop your cyber resilience is to break down that wall between IT and other staff. And the only way that's going to happen is if you have some involvement of IT in risk management, in planning, in the cyber strategy, making it a little bit more layman-friendly as well for other staff, helping them understand the technical issues that we face. Okay, don't tell me exactly what the technical issue is because I'm not going to understand it, but tell me why we need to do this. What benefit is it going to be? And that's the only way it's really going to improve.
So what I'm hearing you say is, I guess there's a myriad of ways to go about doing this and this is just an example. But what you're sort of saying is, it sort of stems from the top, and if we have external vendors and service providers and whoever else, maybe they've got to get a plan and a strategy to say, okay, once a year or regularly, and define regularly first and foremost, these guys that are external come in and have that face to face. Is that sort of what you're saying? So people are a bit more familiar, there's a face to the name, versus, I don't know, a company they just call up when they've got some service desk issue or whatever it is.
William Makdessi (34:49)
Yes, that's why they have sales managers and account managers, right? But that's what we always say. You see an organisation sign up with a vendor, and the minute the sale's made, that account manager disappears. Nowhere to be seen. So that's something we need to make sure is built into the service delivery by that vendor. Are they going to provide that after-sales support, and is it actually part of the contract? Because it should be.
But is it, though? That was a good point. I was actually just about to go there and you read my mind. Should it be stipulated in a contract to say, you regularly have to come in, like, once a quarter, show your face, get in front of my team, talk a little bit more about your technology to make sure we're using it effectively?
William Makdessi (35:25)
It's very specific for a contract, but I don't see why it should be an issue. We provide a GRC solution as well, and part of our contract is a guaranteed three to six months of post-implementation support. No charge. Whatever you want, we'll do it. And that's how it should be.
How much support do you get, though?
William Makdessi (35:41)
Whatever they want, that's fine. It's making sure that they're happy with the product, making sure that they're actually happy with the communication we've delivered and the transfer of knowledge. That's something that we also like to focus on. We like to transfer as much of our knowledge to the client as we can, help them out, make sure that they understand what they need to do, what the responsibilities are. Same goes for culture, though, when it comes to these vendors. There needs to be that transfer of that culture, of that understanding, of that collaboration between the organisation and the vendor, to make sure that staff understand how they can work together, how they can actually build this positive culture, to build up cyber resilience. Because it's not just the technical controls, because technical controls can only go so far. You need to make sure the person on the other side of the screen understands.
Yeah, that's so true. But don't you think it seems a bit obvious that that should be happening? I mean, if you're paying a million bucks or $5 million a year, I'd be wanting people in there all the time, getting in front of these people. But don't you think that that's the case now? I'm not going to say these people don't care, because I don't think that's true, maybe for some of them, but I also think maybe they just haven't thought about it, perhaps.
William Makdessi (36:43)
Yeah, it's probably just that. I mean, maybe it's happening in other countries. Maybe in America this is something they do all the time, I don't know, but this is just what we see in Australia. There really isn't that fun, enjoyable cyber risk awareness training that brings together collaboration and an understanding of why risks exist and how they actually affect everyone. Why is it important that the lady in the cemetery department is careful when she clicks on emails? Just speaking from government, sorry.
I was like, what?
That's a department?
William Makdessi (37:12)
Yes, it is. And it is important, because there was a council that was affected by a phishing attack through that department, which is particularly why it came to mind. But everyone has an important role, even if they don't think so. Everything is internet-facing now. We look at Australia, we think about critical infrastructure. There are so many systems that were not designed to be internet-facing in the first place, and now they're internet-facing for remote monitoring and such. That's a huge problem. The same goes for those departments or individuals who don't necessarily think they're important. They may not have been exposed to that level of internet activity or that level of responsibility. In fact, I was actually speaking to a CIO recently, and one of the things he said was that he believes position descriptions are absolute crap. They're hybrid, they're dynamic, they're constantly changing in our new environment. And that's a good point, because if you have someone who's worked in an organisation for ten years in a particular role and all of a sudden their responsibilities change,
William Makdessi (38:07)
has their understanding of cyber risk around those new responsibilities also improved? That's a big problem when we've got these roles that are constantly evolving and new responsibilities coming through the door. Does that new responsibility introduce new risks that the person is not necessarily educated on?
So the question that I have is, if I'm a leader or whoever, and I've got all these vendors and service providers, how come they're not standing up and saying, okay, we never see these people, let's get these people in our office in front of us? Like, are people asking these questions? Because I don't know if they are. So why is that?
William Makdessi (38:36)
I don't really know. I mean, that is something that we focused on, and it's why we provide that highly personable level of cyber risk awareness training. It's something that we tried out, and it took us quite a few years to work out, once we realised that making it personable was the best way to actually deliver the message, to actually get staff to understand and listen. We've stuck to that now, and I know a lot of organisations are not having this conversation. And when we explain our cyber risk awareness training, it may not sound like it's going to be any greater benefit than doing an online module. It's not until you really experience it and go through it that you see the benefit. How do you raise the question when it's only something you realise is beneficial after you've done it? That's the problem. You need to raise awareness of it.
Well, I think that really concludes our interview today, because I wanted to really get into your experience working in state and local government and to understand where these guys are doing it rather well, and where other people outside of those government agencies can look to some of the insights that you've shared with us today and implement some of your strategies and theories. So I really appreciate your time today, Will.
You've definitely given me some knowledge, and I'm certain you will have imparted some knowledge onto our listeners too. So thanks very much for your time.
William Makdessi (39:53)
Thanks, Karissa. Really appreciate it.
Thanks for tuning in. We hope that you found today's episode useful and you took away a few key points. Don't forget to subscribe to our podcast to get our latest episodes. If you'd like to find out how KBI can help grow your cyber business, then please head over to KBI Digital. This podcast was brought to you by KBI Media, the Voice of Cyber.