Lisa Black [00:00:00]:
Everybody’s going to criticize everything you do. In the United States, we call it Monday morning quarterbacking, right? It’s a football term. The game was on Sunday and everybody’s talking about it on Monday, saying, oh, that one should have done that, or that one should have done that. It’s great. But the people in the business, this is what they do, right? Your incident commanders, your incident response team, this is what they do. So it depends on how long that questioning is going to go on. Again, if your communication is tight and you are communicating regularly and with authority, that should mitigate the rumor mill, because the rumor mill is going to happen regardless.
Karissa Breen [00:00:58]:
Joining me now is Lisa Black, Director of Public Sector at Aeon Nexus, and today we’re discussing what happens when systems fail and how leaders actually survive a crisis. So Lisa, thanks for joining and welcome.
Lisa Black [00:01:08]:
Thanks for having me.
Karissa Breen [00:01:09]:
So Lisa, I met you at the 2020 conference down in Miami at the tail end of last year. Now, you’ve obviously got a crazy awesome background, so I was just like, I have to have you on the show, because you provide a different perspective. Whilst it’s not full-blown cyber stuff, the same principles and approach apply, and I believe it’s really important to get multiple views from people like yourself so people can draw correlations. So maybe let’s start there. When a major incident hits, given your experience and your background, what do you believe leaders misunderstand about what’s actually at risk?
Lisa Black [00:01:49]:
I think that’s a great question, because given the nature of the incident, if it’s a cyber attack, most people think it’s technical, and that’s a knee-jerk reaction, right? Most CEOs and leaders, however, aren’t really proficient in cyber, and it’s not tangible, so they can’t see it. You can delegate operations, and you can have someone report to them on containment and mitigation and restoration and all the other phases that come. But what’s actually at risk is more personal: it’s your ability to lead coherently under that pressure. And frankly, I think most leaders underestimate how fast trust inside your own organization erodes. Your ability to make decisions with integrity is really important, because IT is basically the spine of everything we do. And when it’s not working, the very thing that helps people do their jobs is not working. So they have to pause, they have to reevaluate, and in doing so they’re trying to figure out who makes those decisions and what information is reliable. And so threat actors, I think, exploit confusion more than code.
Lisa Black [00:02:51]:
The loss of confidence in an organization really moves faster than any of the malware that they can drop in months or weeks leading up to it.
Karissa Breen [00:02:58]:
Do you think people are getting better at the risk side, given what’s happening out there across the world? Whether it’s in aviation, where we’ve seen a few incidents happening, or tech and cyber for sure, just general things going on. Do you think the veil is coming down a little bit more? Is there a little bit more, and I hate to say it, awareness? I think that’s an important question, because people do learn from other unfortunate incidents in order to make their practice or their business better.
Lisa Black [00:03:29]:
I think everybody should be not only aware of it at this stage in the game, it’s 2026, but you should just be waiting for when it’s gonna come to you. You know, this isn’t just about drilling, figuring out when something’s coming, and identifying what that response is gonna look like. You really have to embed all of this in your daily operation. It has to become part of the fabric of what you do every day. It’s a matter of when, not if, this is going to happen to you.
Karissa Breen [00:03:53]:
And when you say embedding into the fabric, what does that look like in your eyes?
Lisa Black [00:03:57]:
Oh, well, I mean, it really is more than just drilling once or twice a year. In fact, if you’re going to drill, if you’re going to make it a formal environment, you should be doing it in a way where you’re just calling everybody in on a random Tuesday and not letting them prepare for it. I would like to do after-action reports monthly: this is what we saw coming into this, and this is what the reaction was. And rather than applaud people on how they reacted, why don’t we introduce consequences instead of applause? Why don’t we just say, hey, based on what you did, if this had escalated, this is the lawsuit that would’ve come from that. These are the regulatory findings that would’ve hampered our ability to move forward. This is the reputational fallout. So that they can start thinking about that the next time they’re trying to mitigate or deal with whatever is in front of them on the technology side.
Karissa Breen [00:04:52]:
So, going down the risk rabbit hole just for a moment. Even if you just zoom out and look at tech, you said before it’s the spine of the majority of companies out there nowadays, with agentic AI and all these things that are out there. What we’re seeing in the market is that the risk profile is changing day by day. So this is where I find it interesting, because like you said, you can’t just do things once a year, et cetera. But how often should people be doing what you’re saying, making it part of the fabric and maybe testing it? Because again, things are changing day to day, so what may have been relevant last week may change the following week. And then how do you tie that in with people leaving the company, someone else coming in, maybe a key person being on holiday and, who knows, not answering their phone? I’m curious to get that, because these are real-life things that happen. And sometimes what I often see in the space is people talking too theoretically about things.
Lisa Black [00:05:45]:
Well, I think that they just learn the jargon, right? And I think the unique value I bring to this is that I’m not a practitioner by any means. I led government operations in a variety of different leadership capacities over almost 30 years. So I’m not a huge fan of traditional, predictable crisis training. What I like to do is uniquely position people on your staff where they will have value in a holistic way. That forces discipline. For example, I was the head of operations for a county with a $4 billion budget and 12,000 staff. And when we had a cyberattack, we established a command structure that defined roles and reporting lines, because this isn’t what you’ll be doing on most days. That’s why I think the element of surprise really helps when you’re navigating risk.
Lisa Black [00:06:34]:
You don’t train for the event, you train for a failure of that scenario. Because if your cyber training is comfortable, it’s definitely lying to you. Everybody says, and I think this is part of the jargon you probably hear, worst-case scenario, you plan for the worst-case scenario. But that worst-case scenario, if you’re doing it right and you’re positioning people out of their element and making them almost uncomfortable, sometimes, or most times, it teaches prioritization under pressure. And that’s what’s hard for people to grasp, but that exercise will force leadership into the difficult decisions. In most cases, as you know, when you’re in the middle of a cyber event, you may not know the whole story, but you have to decide based on the facts that you have in front of you. And that may be turning off systems that have implications for people who rely on them for benefits access. How are they going to get their food that month? How are they going to get the heating that is subsidized by government? These are all things that leaders have to weigh, when we’re hardwired for shortcuts and doing something different is really hard.
Lisa Black [00:07:44]:
And disproving assumptions really does take energy. But that’s what I mean by embedding it. You have to really upend the apple cart everywhere you go. Look at your systems. When we were hiring for certain positions in government, we hired for skill. Of course, we wanted to make sure you had some kind of competency if you were going into a financial system or a legal area. But we also knew that our job was to serve the public in those positions that I held. And one of the things we liked to do was have as many different positions and opinions in the room at one time, because the outcome is always better, right? Truth always emerges from exhaustive exploration of things that might not be true.
Lisa Black [00:08:31]:
So if you have no assumptions and everybody’s sitting there and bringing their best arguments to a scenario, it breaks assumptions. And, you know, introducing that ambiguity and forcing trade-offs really is a blessing at the end of the day, because that’s the muscle memory. It’s not just practicing for digital failure, it’s really practicing for human failure. And if you can become resilient as a team, I think your outcomes will be better for it.
Karissa Breen [00:08:58]:
One of the things I’m curious about, and what was coming to my mind as you were speaking there, Lisa, is something I’ve heard people say over the years: the people in corporates who just haven’t really been exposed to real-life incidents, compared to, say, a paramedic going out to scenes, car accidents, all sorts of horrible things they’ve got to deal with day in, day out, are, to use your words, a little bit less resilient. The paramedics have got the muscle memory there. Do you think that perhaps in some of these big businesses and corporations, they haven’t been exposed to this sort of stuff? So perhaps they just don’t have that in their DNA as much as people who are doctors and frontline workers and so on. How do people start to draw on some of the parallels that people in other fields have, so they’re not so confronted when something happens and they can be calm in a serious situation?
Lisa Black [00:09:47]:
Well, I think again, that’s a little bit of a cross-training element, right? One of the things I did when I worked for the mayor of the city of New York, and this was under his direction, was, you know, he took the first month out of, I don’t remember which term it was, first or second term, right? He started a new administration. He said, for the next month, I’m gonna take the head of corrections and I’m gonna put them at the head of social services. It was always the number two; you never replaced the commissioner. So they were the number two, and they went in knowing what they knew through the eyes of their own content area. I know corrections, but now I’m going into the Department of Social Services, and I’m going to have to figure out what I know and what I can bring to bear on this agency within the organization. And that cross-training really is helpful. I know it’s difficult when a lot of these leaders are constantly working and they have their own pressures.
Lisa Black [00:10:37]:
But if you could do it for a week, if you could do it for a day a month, anything you can do to bring that exposure to those other realms, I think, is helpful for that consistency across the organization.
Karissa Breen [00:10:49]:
You mentioned before, you’re not a big believer in traditional, predictable sort of training. What does that look like? Is it just sort of like, oh, we’re going to sit around a bit of a PowerPoint, we’re going to do this once a year, then we’re going to forget it, and then we’ve got our annual check each year and we’ve ticked the box. Is that what you mean by that?
Lisa Black [00:11:03]:
Absolutely not. I hate checking the box. You can’t learn in a predictable environment. You have to have competing crises. Real life is not about one cyberattack or one power failure. There are injects. And what we did, if we started to train for things, for example, I will share that during our cyberattack, we wound up having a weather emergency and, you know, we lost power. So here we are serving a population of 1.5 million.
Lisa Black [00:11:37]:
We reverted back to paper and all of our traditional redundancies to serve our public, but then some of them were losing power. So how do you deal with that? How do the media inquiries change? Your staff, they get tired, and they get tired fast. They’re working 10, 12, 18-hour days. So how are you looking at relieving them, right? We know that nobody needs a tired or cranky employee. They might be the ones answering the phone when a board member calls or a legislator calls or a media inquiry comes in. You can’t have them take that tone on that first touch in your organization. So I would say, think about what guides you in your day-to-day and use that as your kryptonite. These are the resources I use on a day-to-day basis; what is the worst thing that could happen to them? And if it starts with customer service, you can’t have somebody cranky on the phone.
Lisa Black [00:12:31]:
It may be a labor dispute, right? Inject those into your decision-making. What happens if you’re running a hospital and there’s a nursing shortage? What are you gonna do? How do you negotiate under that intense pressure? It really is just a form of continuous improvement, and I think a lot of organizations look at continuous improvement, but they look at it through an HR lens. I would start looking at it through a crisis management or risk management lens as well.
Karissa Breen [00:12:58]:
So, I just want to shift gears slightly and go through perhaps the first 60 minutes of a major disruption. That’s when the real panic sets in. What do you believe are the main decisions leaders must make, regardless of technical competency? And this is really important, because from what I’ve observed and from the people I’ve spoken to over the years dealing with this sort of stuff, even if they’ve dealt with an incident themselves, it’s how they set off on the right foot. So I think this is really important.
Lisa Black [00:13:32]:
It is really important. I think that first hour after a major disruption really shapes the next several months. It’s the tone set from the start that establishes either the calm or the chaos. It sets direction, it protects trust, it conveys a level of authority. But I would say, during those first 60 minutes, there are a lot of very pressurized decisions that decision makers have to make. And we’re not just talking about the executive or the CEO; your COO, all of those professionals in that tree, they have to take on different roles than they normally would. Of course they’re concerned about the legal aspects or the financial aspects. But the first thing, you have to verify it: is it real? Is this a verified breach, or is it a drill? And then you assess that information, you identify what’s affected, and you try to come up with some solutions for containment.
Lisa Black [00:14:28]:
Generally, your technical folks are advising you on all of that. That’s not the CEO’s job; the CEO just needs to have the language to understand what’s happening, so they can drill down on next steps at some later point. But right in those first few minutes: is this real? And then that CEO has to determine who’s in charge. And in the United States, we assign a command structure and organize, I would say organize or guide, the chaos, right? We assign tasks through what we call the NIMS structure. I don’t know if you’ve heard of that before. It’s the National Incident Management System. It’s the US standard for command and control.
Lisa Black [00:15:07]:
Most emergency managers are trained in it; unfortunately, most IT professionals are not. One of the things I did within the first day or so of my event was bring our emergency management teams in and have them be a resource to our IT individuals, saying, okay, this is what we’re gonna do. We’re gonna establish situation reports, right? So what does that mean? It means documenting all of the things that you’re doing and providing daily incident action plans. So we had both sitreps and IAPs. We managed logistics, et cetera. But beyond who’s in charge and whether this is real, you then have to set the priorities. And I know in the business community, that means what’s most at risk as a business priority. But this is not a tech call, right? This isn’t about technology.
Lisa Black [00:16:01]:
This is the leader using values and risk to make determinations. So I’ll give you an example. In government, when you have a cyber attack, your first instinct is life and safety. And I know for many global corporate organizations, that’s the same; it really depends on what content or service they’re providing. But if your police department can’t function, what does that mean for the responders? What does that mean for the community? How does that affect public safety? If you are a water treatment plant operator, or infrastructure, utilities, transportation, bridges, how is the loss of that bridge, lighting, or heating affecting your customers? And then you obviously set your priorities based on restoration times and critical services. Financial, obviously; your data folks, your tech people are looking at your exfiltration, what’s been exfilled, and what do I have to look at by way of restoration? But I would say, what is most at risk to you? That’s one of the conversations we continued to bring to each other daily. Because of real-life injects, we might have said that this is a priority today, but tomorrow the comptroller comes in and says, I have to leverage these payments today.
Lisa Black [00:17:20]:
So that may take priority over replacing something in a clerical office. You have to continually reevaluate those priorities. While it’s something you do in the first 60 minutes, you continually refine it. And then, of course, controlling messaging. I don’t think there has been an event I’ve spoken at or participated in that doesn’t talk about communication and the knowns versus the unknowns, facts versus what people assume to be facts. And then I would say more important than the communication itself is establishing a cadence for that communication. I think if people can rely on when that information is coming to them and build that into their structure, their new structure of what they’re working on, that certainly helps. Establishing that cadence and making it regular also alleviates silence.
Lisa Black [00:18:14]:
And we all know that silence creates panic and certainly leaves room for adversaries or talking heads on TV or radio to fill the vacuum. A cadence really will keep the rumor mill to a minimum. And of course, again, this is the first 60 minutes. So if you can establish that cadence: this is what we’re gonna do in an emergency, we’re gonna brief our board every day at noon and 6, whatever it is. But you need to have that for both internal and external communication. You need to identify a person who will be doing all of that communication, so there’s one person, one voice in charge. And you have to establish that if they’re hearing something else from somebody who isn’t that identified spokesperson, your board member still has to check in with that spokesperson to ensure its integrity.
Lisa Black [00:18:58]:
And then you have to decide who’s being notified, whether it’s your departments of justice, your attorneys general, ministries, et cetera. What are your required communications regarding the breach? And then, of course, the last thing, and I think I mentioned this earlier: your leader needs to know how you’re mitigating this. They don’t need to know the technical ins and outs of it, but this is probably the most expensive decision, right? They will be advised by your technical professionals as to the state of affairs, and the leader has to decide on isolation or shutdown. They’re the ones to call in, hey, we need continuity of operations. We call them COOP plans in government. Is any service being suspended? What resources are needed? Those are really where I think you’ll see the financial impacts of a decision. But hopefully, if you did the other things before all of that, you’ll be in a better place to determine what you’re going to be doing and how long you’re going to need to be doing it for.
Karissa Breen [00:19:57]:
Okay, this is really interesting. So I want to go back a moment to when you said you’ve got to set the tone at the start, whether it’s calm or chaotic. Generally speaking, what sort of camp would you say people are in at the moment? More on the calm side, or more on the chaotic side? And even if it is chaotic, are they moving more towards being calm, because perhaps these incidents are becoming a little bit more prevalent nowadays?
Lisa Black [00:20:22]:
So I think that’s a mixed bag, for sure. I think, you know, when we’re looking at divisiveness in rhetoric and politics right now, everybody’s a little agitated when it comes to cyber. Again, I think everybody is kind of waiting for it to happen to them. So I wouldn’t mistake that for calm. I would still say, when I talk about tone and setting the direction: if we have an event, don’t have your number two or number three running around the office making rash phone calls telling everybody to unplug their computers. That is not helping when you’re getting a call from the boss saying unplug everything in your office. Setting the tone is bringing them all into one room and saying, this is the message we are gonna share with our subordinates and each level down, and making sure that authority is established. Now, if they’re running around with their hair on fire, they shouldn’t be in the room.
Lisa Black [00:21:21]:
You need to remove them. That’s plain and simple. But I wouldn’t mistake that calm for, you know, comfort. If that’s the case, that’s also a problem. I think if you’re calm in a scenario, I’m hoping it’s because you’re well trained, because time is not on your side. So there’s nothing really to be calm about. You have to be deliberate, you have to be forceful, you have to be, unfortunately, short with people. There’s a time for empathy, and it’s not usually within that first 60 minutes or even the first day.
Karissa Breen [00:21:52]:
Okay, I’m going to move into the communication side of things. This is a big one, and I’ve got so many questions for you, but I’m going to start with one thing I’ve observed in any incident: you’re going to have these adjudicators. People say, oh, they should have done that, oh, they should have done this. And I get that it’s people with no experience, but even if you did everything right, even if you came in, Lisa, and said that what this company, organization, or government did on paper was correct, there’s always going to be someone who’s critical. So how do you deal with that? Because there is a bit of that backlash where everyone’s getting on Twitter and complaining, oh, they should have done this, putting their two cents in. And I get that everyone has an opinion they’re entitled to, and they’ve got mechanisms nowadays to share those opinions. But does that come into it when people are making decisions? Because I’ve often seen, in breaches with PII, for example, personally identifiable information, that it gets really interesting online and it changes the dynamics. I’m just curious to hear what your sentiments are towards this.
Lisa Black [00:22:49]:
That’s a great question. I think more importantly, everybody all of a sudden becomes a technical expert via a Google search, right? We don’t want to forget about the media, who are always looking for something salacious, and opportunistic vendors who want business, right? They think this is their opportunity. And in the case of government, there’s always a political adversary who wants to use your event to their advantage. That may even be true in the corporate world as well, with competitors. So I think you’re right. Everybody’s going to criticize everything you do. In the United States, we call it Monday morning quarterbacking, right? It’s a football term.
Lisa Black [00:23:23]:
It’s just like the game was on Sunday and everybody’s talking about it on Monday, saying, oh, that one should have done that, or that one should have done that. It’s great. But I think the people in the business, this is what they do, right? Your incident commanders, your incident response team, this is what they do. So it depends on how long that questioning is going to go on. Again, if your communication is tight and you are communicating regularly and with authority, that should mitigate the rumor mill, because the rumor mill is going to happen regardless; it’s just the digital age that we’re in. The benefit, I would say, of having a cyberattack where you shut down your systems is that, hey, maybe nobody’s looking at Twitter or whatever social media feed is out there right now. Hopefully they’re just doing their job and focusing on the task at hand.
Karissa Breen [00:24:11]:
Okay. And talking through communication again, we’ve had some really bad breaches in Australia, and everyone has their opinion, sure. One of the things that comes up a lot is that they handled the comms badly. Oh, but the CEO didn’t care, or wasn’t empathetic enough. With comms, and I do think there are companies who probably have handled it wrong, what would be your approach to this? Because sometimes legal wants to stay silent, because if they say something, oh, we’re going to get in trouble. But then if they don’t say enough, they get hit for that too. So it’s a very difficult conundrum.
Karissa Breen [00:24:48]:
I’m aware of that, but I’m keen for you to walk me through how this looks, because I don’t think it’s as easy as what people are saying out there either.
Lisa Black [00:24:58]:
It’s not. And at the heart of this, I think a lot of the trust really erodes internally, quickly. And if you’re paying attention to that, as you should be, you’ll be feeling it pretty quickly. I would say that any kind of change for any of those leaders implies loss. They can’t do what they were trained to do or what they were hired to do, right? They’re in a new position now because of the change in context. Hey, we’re under attack now. What does that mean? And let’s not forget that they are actually the victim of a crime. And I think that’s really hard for most people to understand.
Lisa Black [00:25:34]:
Anybody that’s had a cyber attack is a victim of a crime. They are in the middle of surviving it. We wouldn’t do that to a domestic violence victim. We wouldn’t do that to others. And it’s because cyber is silent. You can’t see it. Well, not silent exactly; it’s just not tangible.
Lisa Black [00:25:50]:
So we have to support all of our staff along the way, even the leaders. We have to acknowledge their stress. We have to acknowledge their fears. We have to remember to feed our staff. We have to remember to pay them; they’re working beyond a certain amount of hours, so you pay double time or time and a half, whatever it is. And we have to allow rest. That’s where the established structure comes in: if we’re using the NIMS system, or any of the systems we have in place through the federal government for emergency planning, we are following that so that we have redundancy, so that our staff don’t feel like they’re solely in charge of, I don’t know, restoring the code, or taking the backups from tapes, whatever their role is. They are not in it alone.
Lisa Black [00:26:34]:
And we have to support them in that, because some of these events can take months. I mean, our event could have been over in six weeks, and unfortunately it wasn’t. But we never closed government at all, I wanna be very clear. Governments, you know, their mission is to serve people. We could do that without computers. We just had to return to 1990. We had to remember what it was like to purchase a house and file all the paperwork by hand and manually input it into other systems offsite and off-prem. So there is a way to get things done.
Lisa Black [00:27:06]:
It just may take a little longer, and if you’re in business, time is money. And I understand that is one of the biggest competing issues, certainly for boards. You know, how much is it going to cost me? What am I going to get back? Less, probably, about the reputational harm. And frankly, I believe that’s why we’re seeing such distrust in government: the reputational harm of constant failure. You know, they messed up this or they messed up that, and everybody’s so quick to judge. At the end of the day, there are people working so hard, and in most cases in such genuine ways, to restore services and really restore the public’s trust. And that’s unfortunately where I think you see the dichotomy of decision-making, or decision recommendations, from your leaders in finance and legal and HR. I don’t know that they’ll ever be aligned, right? They were all trained in one vertical, and that’s great.
Lisa Black [00:28:05]:
That’s why it still lies with that one authority at the top. If it’s not that one authority, it’s a unified command, where you have two or three individuals making decisions together. So I think you just have to support every one of your employees and let them know that we understand what they’re going through, but we’re going through it too. So, you know, kind of get in line.
Karissa Breen [00:28:27]:
So I want to press a little bit more on the cadence of the comms, because I do believe this is important. Would you say people who aren’t regularly updating customers or regulators or whoever they need to update, and I know this is obvious, obviously that is a problem. But the other thing I’ve thought about is, even if there’s no update and nothing’s changed, should that still be an update? Hey, we’re still working on it.
Lisa Black [00:28:48]:
Nothing’s changed.
Karissa Breen [00:28:49]:
It’s just that what I’ve seen cause angst for people, from the research I do online, is a protracted amount of time where nothing’s changed, but there’s no update to say, hey everyone, this is what’s happening. I’ve seen that as a potential gap. The legal counsel comes in and says, well, we can’t say anything now because we’ve got the regulator in. And I’ve seen that as well, which then prohibits comms from going out, or the comms get really muddy. And then when you’ve got media people who specialize in my field wanting very specific answers, it’s like, oh, I can’t give you that answer. And so it just starts to fuel a completely different narrative.
Lisa Black [00:29:26]:
You’re right, and I’ve seen both sides of that, and I can give you examples too. We were working with an incident response team, and we wanted to give a preliminary readout of our key findings, right? Just three pages. This is when the adversary came in. This is when the threat actor did this. This is when they did that. They were in for X amount of months before they were identified. This is when they took control of the DCs, the domain controllers.
Lisa Black [00:29:49]:
As soon as they got into the AD, it was over, right? Their legal team and the incident response team went crazy. No, we don’t give out this information until we have a final report. I said, listen, your job and our job are very different. We are servicing the public. They need answers. So we just usurped them, and I’m so thankful that we did.
Lisa Black [00:30:08]:
And they actually took that back and they said, oh, you know, wow. When we’re dealing with governments, you really have a different perspective than corporate. They were all worried about being sued and X, Y, and Z. And I went through the actual risk with them and said, we’re still gonna do it. So, you know, it really is a decision point. So I’ll start with that. And then on the other side of that, the silence creates panic.
Lisa Black [00:30:29]:
You know, we went out very quickly and said, this is what we’re facing, and did not give a lot of details thereafter. And what happened was that left room for other information to fill that vacuum, and rumors surfaced. And once it snowballs, you know, I think you’re right, it’s hard to get out of. The problem with that, however, is that from my perspective, we’re a victim of a crime, and we continued to say that. And it’s almost like the media didn’t really care too much. We’re in New York, proximate to New York City. We have the biggest media market in the world, right?
Lisa Black [00:31:03]:
They want answers. And the problem was we were trying to educate those personalities, you know, reporters, etc., that, hey, maybe you shouldn’t answer that person, or don’t click links, because you could be being phished, or you could be lured to the dark web. And they don’t understand the risk because they’re just trying to beat their competitor to make it to the news at 6:00 PM or whatever. And we were trying to educate the press that some of the information they want us to provide would basically create a blueprint for the bad guys. Hey, they wanna know how we’re remedying X, Y, and Z. I’m not gonna go tell them which vendor I’m using to do that, because it’s pretty easy to see, you know, what that vendor supplies. So once you do that, it’s easy to put that patchwork together and say, X vendor’s doing this, you know, this is who we’re using for firewall, etc. It becomes nothing but a challenge for us then to keep that information confidential so that, God forbid, if there’s another breach, they don’t have the blueprint of our entire system mapped right there.
Karissa Breen [00:32:09]:
So I recently interviewed the Chief Information Security Officer for Medibank, which is the largest health insurer in Australia. They had a huge breach in 2022, the biggest that we’ve ever seen. I interviewed him, and it took a while to get the thing published because of all the lawyers, but we got there. One of the things that he said to me was, KB, when the external mainstream media, which isn’t what I do because obviously I’m more specialized, misconstrue the terms, then we’re trying to correct it because they misinterpreted it. Because they’re not tech people, they’re not cyber people, they’ll run with some major crazy headline. And then, he goes, now we’re trying to overcompensate for the dis- and misinformation that’s gone out there. Maybe it was intentional. Maybe it wasn’t intentional. They’re not tech people.
Karissa Breen [00:32:50]:
So I kind of don’t blame them. Then they’re doing extra legwork to try to reclaim that narrative. So do you see that happening as well? If someone creates a crazy headline that is false and fabricated, or they’ve exaggerated it, then other media get ahold of it and then it just spirals. And I just see this as a problem. And then this company or government is trying to correct those facts, which is actually detracting from the incident itself.
Lisa Black [00:33:16]:
So I have not only felt this, I’ve experienced this. There are real consequences to disinformation, right? Especially in this space. Traditional media outlets are getting smaller by the day, so they don’t have, you know, somebody that focuses on cybersecurity or somebody that focuses on finance or somebody that focuses on healthcare. So I understand, especially in the larger organizations, you’re used to dealing with what we would call a beat reporter, right? I started my career out in media a long time ago, and as a beat reporter, you would cover this and this, and then you became somewhat of an expert in it. You knew the right questions to ask and the right people to go to to get the information you needed. But when you’re sending somebody that handles tribal affairs to cover a cybersecurity incident, there needs to be a little bit of education on your end. And I think that’s important for the media industry to understand: there are real consequences, not only for your systems down the road, but for all of the customers that are affected. So as quickly as we want to get information out, it’s almost disingenuous for a media personality to say, oh, you didn’t give that to me fast enough, we’re going to just print XYZ without having our take included in that package.
Lisa Black [00:34:32]:
And that’s unfortunate. And even if our take is included in the package, they don’t find validity in it because, you know, we were silent for 4 days, so now they’re offended. And if we could remove the bias from everywhere in the world, I mean, that would be one of the ones I’d quickly identify, because that affects regulatory issues, that affects banking and credit issues. We had it happen: once this hit the media for us, I got a call from several of our creditors, and they wanted to put us on credit watch, ’cause that’s one thing the banks do in the US every week, they look at, okay, what’s in the news, what’s happening, and why weren’t we identified? Why are we reading about this? And depending on your regulation, you don’t have to notify anybody within a certain period of time until you are firm with your information. You have to make sure you’re providing accurate information. And so I’ll never forget when they put me on the phone with the bankers, and I was like, I’m not the finance person, but I can tell you what we’re doing to mitigate and to, you know, compensate for the systems that were down. And it was very helpful. We didn’t have to go on credit watch, but it was that slight thing they read somewhere that alerted them to the question, oh no, should we be holding back capital to continue operations? And that’s what I mean by real consequences.
Karissa Breen [00:35:52]:
The other part I want to explore now is the over-reliance on technology. What I mean by that is, if your systems lock up, phones go down, I don’t have your number written down, Lisa, and I guarantee you 99% of people don’t carry phone books where they write people’s numbers. And I know people would know 911, but outside of that, how do people start to memorize this? And to use your vernacular from before, muscle memory. Because I’ve heard people saying, oh, we’ve got this document, we print it out, but then it’s not updated. Maybe half the people that you’re supposed to contact don’t even work there anymore. And if you can’t contact someone, what do you sort of do then? Because we are so reliant on technology that it’s going to be hard to start to communicate with people.
Lisa Black [00:36:33]:
You’re right. In the middle of an event, a cyber attack or a breach, it’s not really about the data we lost, but more about the loss of ability to function. How do we still do our jobs without that technical spine of the work that helps us do our job? And I had that exact experience during our event, because it was related to life and safety. The systems went down. Police officers have computers in their patrol cars, and we had the 911 operators on the phones or on the radios directing them, you know, where to go. This is Officer X, you have to go to this location. And, you know, some of them have been so over-reliant on GPS or mapping, you know, a digital map, that they couldn’t find their way there, or their response time was a little slower. And we immediately noticed that because we always monitor our response times.
Lisa Black [00:37:28]:
And once we saw a 1-minute delay, and I know it’s only 60 seconds, but 60 seconds is life or death, once we saw that delay was basically system-wide, that’s when we started calling in for additional resources. Throughout the state, we have a plan where we call for state assets to supplement what we’re doing. So we did some of that, but it highlighted that reliance on IT. So what we did in that scenario is we highlighted it in our academy. We started better training on judgment, having fewer priorities, but ones that were clear, right? Hey, what if you don’t have something digital to tell you where you’re going? We can’t afford recovery time or response time to be longer if we don’t incorporate our simple fallback rules. So we put paper maps in vehicles, right? I mean, yes, it meant going back to the basics, but I would say, you know, in a post-COVID environment, when everybody was over-reliant on technology, this was a scenario for us to come back and say, okay, let’s get back to some of the basics. And I described it before: we used to purchase homes and buy title insurance and search for records in microfiche systems.
Lisa Black [00:38:48]:
All of those systems still exist. They’re not all digitized. And if those still exist, we’re still paying for the actual footprint, right? We’re still paying for facilities. Now, are they frequented? Probably not, but they certainly were during the time of our outage. And I think that really is a dichotomy between the COVID and post-COVID environments, because during COVID, we had to separate and not be near each other, and then in a cyberattack, we couldn’t use those systems in a remote way. So we had to come back to the office. We had to actually come and communicate with each other.
Lisa Black [00:39:22]:
And frankly, I’m Gen X. That was welcome for me. I had to teach some of our staff what a fax machine was. They were like, what is that? How do I use it? They just thought it was the copier. And, you know, fortunately we still had a system where we could use the copier as a fax machine or, you know, break out landline phones. It all meant going back several decades, but I think it really did start to train judgment. You didn’t have to rely on these things to get the job done. So, how can you get it done? If there are any fans of Friends, the television series, out there: you have to pivot.
Lisa Black [00:40:00]:
What is your pivot and how can I accomplish what I need to without those tools?
Karissa Breen [00:40:04]:
One other quick comment around going back to the olden days: what I’m hearing certain psychologists say is that the younger generation, you know, Gen Z and friends, their memory is atrophying now, so they can’t just remember stuff. I remember getting around without a GPS back in the day, and you start to memorize things. But the younger generations are going to be the ones responsible; the older generation is going to retire, they’re going to move on. These people are going to be the ones looking after New York City in the future. So how is that going to pan out? I mean, they’re so reliant on ChatGPT and these systems and technology that, and I hate to say it, do they have the capability to think, well, I need to memorize this stuff in case something goes down?
Lisa Black [00:40:54]:
Look, I think that’s a good question and one for probably somebody that even has children. I don’t. So look, you, you can rebuild your workforce. That’s the bottom line. You train your employees. So if you set those values from day one, that is incumbent upon the organization to do that. We can’t just overgeneralize that one generation does something better than the other. You train judgment.
Lisa Black [00:41:17]:
You have a skillset that may include judgment, and we can always continue to refine that. But designing for these, call them failures, right? Even cognitive failures. It is our responsibility to bring that education, and it has to start with leaders, and it has to start with making your organization build resilience. I know in cyber we talk about backups and redundancy, but what can people do if their digital memory essentially disappears? I don’t know. I think design a no-tech disruption exercise. Hey, this is your job. How do you figure that out if there’s no tech? If everything you know is disrupted? You get people to start to think differently. And look, that’s something that should have been happening in your education, and it should have been happening at home.
Lisa Black [00:42:06]:
But it is our job and our responsibility, if we are the owners of these companies, if we are the operators of these companies and these governments. You could tie it directly to cyber. You can put it into your leadership readiness. You can make it part of your continuity of operations. It’s probably one of the most under-discussed risks in modern operations, I think, but you have to surface it. And I’m certainly no expert in how to combat it, but I think if you add it and make it part of your continuity and leadership readiness, you’ll have a head start on the others that don’t.
Karissa Breen [00:42:39]:
And then, Lisa, what do you think, moving forward? Obviously we’re in the early stages of 2026, but how do you think the year’s going to pan out? What do you think you’ll look back on at the end of it, thinking, maybe they could have done that better, that company or that government? And I know you don’t have all the answers. I’m just more curious to see and get inside your mind a little bit more.
Lisa Black [00:42:53]:
So as somebody who’s been through an attack, it’s really hard for me to sit there and judge somebody else. Now, I will read as much as I can about it, but I always understand that there are nuances to that response that I was not made aware of, for whatever reason, probably some lawyer out there saying, you know, you can’t say it. I would say that, looking forward, this is going to be an exciting year. I think we have to do more to collaborate, and we have to do more with public-private partnerships, because that’s what keeps everything going. Government for too long was siloed and did not allow the private sector to influence its decisions, or at least it said it didn’t, or to be part of its decision-making.
Lisa Black [00:43:37]:
But I think we have to start recognizing that they have a very specific purpose and a very specific skillset, in a very coordinated way. They’re dealing with the same content, and they’re doing something the government can’t. So if you get everybody together, I believe the consequence for the threat actors will be that they have to try even harder, right? Try even harder to infiltrate. There are certain sectors of the world, like industry, that are seeing things happen, and maybe they should be alerting the public sector, the governments, sooner than they’re required to, just as a courtesy. So, you know, I think they’re always worried, and this is probably where the lawyers come in again, they’re always worried about regulatory implications. Hey, what if I tell them this and they assume that I knew XYZ? So stop with all of those machinations and stop with those assumptions. What we have to do is basically just start looking at what everybody is doing and how we can work together.
Lisa Black [00:44:41]:
And if we work together, I mean, is that just a kumbaya moment, or am I being a little naive? I don’t know. I do think that the best consortiums I’ve ever been in have been collaborative. And again, having people in the room with diverse opinions and diverse consequences and the ability to advance things in a different way is always helpful.
Karissa Breen [00:45:03]:
And lastly, Lisa, what would you like to leave our audience with today? Any final thoughts?
Lisa Black [00:45:06]:
Yeah, so I would just say that, you know, the work that I get to do with data security and information security, helping government leaders across our nation and beyond become more aware of the things that they can be asking, potentially even less technical things, and helping them make the right judgments and develop the right solutions, is something that I’m always really excited to do. And although I’ve left public service, my heart is always in it. So anything I can do to continue to help in that realm is something I’m going to champion.