The Voice of Cyber®

KBKAST
Episode 275 Deep Dive: Shannon Sedgwick | Geopolitics and Cyber Risk
First Aired: September 04, 2024

In this episode, Shannon Sedgwick, Partner – National Cybersecurity Practice from MinterEllison, comes back on the show to talk about cyber warfare and its potential to precede physical warfare, especially targeting critical infrastructure. He also shares his insights on the intersection of geopolitics, technology, and cybersecurity, exploring the potential for AI to exacerbate global divisions and influence economic landscapes. The conversation also dives into the impact of increasing cyber threats, the challenges of AI regulation, and Australia’s position in the international technology landscape.

After two decades of working globally, consulting on risk and cybersecurity, Shannon has keen insight into what makes an organisation both protected from and resilient to cyber threats. Shannon’s focus is on cyber risk governance and providing strategic advice to executive leadership and boards. Shannon works with government and corporate clients to develop solutions that incorporate cyber risk into their strategies. Shannon helps clients meet risk-reduction and compliance objectives and advises on the implementation of new and evolving technologies by ensuring they are secure, fit-for-purpose, scalable, and continually driving efficiencies. By employing his unique blend of experience in finance and cybersecurity, he assists in uplifting internal due diligence capabilities, focused on reducing risks and increasing return on investment.


Episode Transcription

These transcriptions are automatically generated. Please excuse any errors in the text.

Shannon Sedgwick [00:00:00]:
Cyberattacks will be a precursor to kinetic warfare and almost an advanced warning of kinetic warfare should the saber rattling progress beyond what is still largely words and minor skirmishes in cyberspace. It’s now apparent that most organizations where an attack would cause a material impact to the populace are now classed as critical infrastructure, and they’re expected to allocate sufficient capital to cybersecurity. But, you know, from a critical infrastructure perspective, if we really view it from the CIA triad, which is confidentiality, integrity, and availability of systems, most attacks on critical infrastructure as a precursor to kinetic warfare will be on the availability of systems, not so much around confidentiality or integrity of data.

Karissa Breen [00:00:58]:
Joining me today is Shannon Sedgwick, partner, National Cybersecurity Practice for MinterEllison. And today, we’re discussing the rise of geopolitics and cyber risk. So, Shannon, a very warm welcome back on the show.

Shannon Sedgwick [00:01:14]:
Thanks, KB. It’s been a while.

Karissa Breen [00:01:16]:
It has. So on that note, I wanna start with, you know, giving your background in the military. I think we were discussing before that you were one of the very first people I’ve ever interviewed on the show, and we’ll hit 300 episodes this year. People may not be aware that you have that military background, but you’ve also brought forward your pedigree and your knowledge and your experience in the military for what you do now. So maybe that would help shape the narrative. Give us a bit of an overview of your thoughts on geopolitics and sort of what’s happening today, if you can sort of make sense of it.

Shannon Sedgwick [00:01:48]:
I didn’t do a great deal in the military. A lot of my overseas experience came with the work that I did post military, you know, working independently in the Middle East and Central Asia and Africa. A lot of that work had a cyber angle to it, so it gave a unique lens at the coalface of geopolitics and the intersection between that and technology and cybersecurity and data privacy, particularly for the clients that we were working with. But from a geopolitics perspective, we live in a fantastic age right now. There’s a lot happening at the moment between, you know, geopolitics and geoenvironmental issues and ESG and cyber and workplace issues and the rise of nationalism and populism versus the old trend of globalism. These types of things have been building up quite a steady momentum, particularly since COVID, and none more so, I would argue, than AI. And, you know, governments are racing to regulate AI to reduce the potential of sociopolitical risks, but what they’re simultaneously trying to do is foster that domestic AI innovation to compete geopolitically. So as a result, AI has the potential to deepen already existing divides both within and between countries as a result of the distribution of the related benefits.

Shannon Sedgwick [00:03:16]:
You look at some reports, and North America and China are likely to be home to 70 to 75% of the global economic impact of AI, while other developed regions, like Europe and Asia, will probably take much of the rest of it. But, you know, America, for example, will have, like, a 15% GDP boost by 2030 just from AI. In China, it’s gonna be 20 to 30% by that point. So this situation risks spawning a competitive race between countries for AI dominance, but the widening of that knowledge gap will leave much of the rest of the world behind because they just don’t have the funds or the capability to keep up. And it’s not only battles for talent and computing infrastructure. This is the cyber lens to it: it’s also access to and control of the data that’s required to feed AI. The ability of data to flow across borders means that early movers in AI can gain a global influence that makes it really difficult for initiatives elsewhere to catch up. We’ve spoken about it before individually, you and I, that it’s the first mover principle.

Shannon Sedgwick [00:04:24]:
You did the same with the work that you’re doing now. You were a first mover, and it’s very difficult for people to catch up. But a second concern around AI from a geopolitical standpoint, and it’s both unintentional and intentional, is what I mentioned before: it’s exacerbating those political divisions and polarizing societies. You know, we’re now very aware of the way that social media can contribute to polarization, but AI driven algorithms will play a significant role in addition, potentially keeping users trapped in those bubbles of content that match their own world view, confirmation bias that people don’t realize they’re susceptible to, and it limits access to other perspectives. And it hardens misperceptions, again, confirmation bias, but they have the unanticipated effect of actively pushing users towards increasingly extreme content. So, you know, certain social and digital media platforms have drawn a lot of criticism for the ways in which video streaming services’ algorithms can push users in the direction of extremist political views and also conspiracy theories based on their browsing behavior, and AI is gonna take that to a whole other level. It’s frequently being used intentionally by nation states and also people domestically to manipulate and polarize viewpoints, particularly around the issue of deepfakes.

Shannon Sedgwick [00:05:47]:
You know, deepfake video and audio content designed to deceive the public and targets, and also to denigrate public figures. You could imagine a large scale historical event similar to, you know, something out of a Clancy novel where a certain nation state leader thinks that the US has declared war on them or launched missiles because of a very convincing deepfake they saw on a news site. So, you know, AI is a significant geopolitical issue that we need to keep tabs on, and I’m sure a lot of investment is going to be going into that, but whether regulation can keep up is another issue. And then you’ve got Russia’s continued invasion of Ukraine. Tech production, energy prices, you know, grain supply, it’s all been affected, and it creates these risk exposures in capital flows as well, you know, trade and commodity markets. And the semiconductor shortage is another issue that’s exacerbated by Ukraine and Russia. Supply chain resilience is a massive concern for most countries these days, particularly given the rise of the electric vehicle and lithium battery industries and things like that. And the majority of production is in Taiwan, which is a contested area, but, you know, hopefully Japan getting involved with semiconductor production like they used to will alleviate some of those pressures.

Shannon Sedgwick [00:07:10]:
But it’ll be interesting to see how China acts over the next 1 to 2 years, particularly around the South China Sea remaining that flash point. They’re annoying a lot of different countries, to put it lightly, like Vietnam and the Philippines and Taiwan and Malaysia, because it’s a major shipping route. I think it’s, like, one fifth or more of global trade that transits those waters, so you can see why they have a vested interest in controlling those waters. So it’ll be interesting to see what happens. And then you’ve got climate change, which is massive on every developing country’s radar, and that transition from fossil fuels to renewable energy. And, you know, climate change is now inseparable from most energy issues, so there’s a competition there between governments to secure access to resources, particularly oil and gas, and the Paris Agreement is causing a lot of capital investment into removing carbon and greenhouse gases from energy systems and broader economies. But there has got to be consideration of other issues that will come off the back of it, like oil exporting economies having to deal with stranded assets, which means assets that lose value or generate new debts or liabilities before they reach the end of their planned life. So it’s usually in the oil, gas, and coal industries.

Shannon Sedgwick [00:08:29]:
There’ll be a lot of stranded assets. And then, of course, probably the biggest one on my and your radar is cyberattacks becoming more frequent and severe and increasingly being used as a tool of statecraft. You know, the human and financial impacts rise in line with digitization and the adoption of AI and automation. So, yeah, I’ve gone on for a bit there, but there are a lot of geopolitical issues at present that are affecting global economies.

Karissa Breen [00:08:57]:
No. I appreciate that, and I know we can’t go into all of them, but a couple of things were coming to my mind to ask you about and to do a few follow on questions, going back to the global economic impact. You mentioned before that the GDP boost for the US was, like, 15%, and then China was, like, 20 to 30%, and then the rest are sort of gonna cop it. So where would you sort of see Australia sitting in that race?

Shannon Sedgwick [00:09:20]:
Very little. We might get a bit of a bump up from AI, but I think there was a story recently about the Australian government doing a tender for investing in an AI capability, and they didn’t choose a domestic capability. They chose one based out of the US, and probably rightly so. It might have been a bit more advanced than where Australian based start ups are, but it was still disappointing to see that they weren’t investing in domestic capabilities and were, again, reliant on international start ups. You know as well as I do, having spoken to a lot of tech start up leaders in this space, and myself having been a non executive director and a chair of various successful start ups, it’s very difficult to commercialize ideas in Australia because of the lack of access to funding. So I don’t have high hopes for government’s involvement in AI. They’ll probably seek to regulate more than they will invest, as is Australia’s way. I hope I’m wrong about that, but, you know, that’s not a criticism.

Shannon Sedgwick [00:10:22]:
It’s just the way we operate, historically. But I do have hope for, you know, the large institutions that are investing a lot in AI, particularly on the cyber side of things. Even from a MinterEllison point of view, we’re investing significantly in AI, and it’s playing a huge role in our company. We’re not gonna be left behind. We’re a first mover from a legal and consulting industry perspective in the adoption of AI to, you know, assist with rote tasks and allow us to focus on the more complex problems and issues that require that human factor. So from an Australian perspective, I don’t think we’re gonna have anywhere near as large a part to play, but that doesn’t mean we should stick our heads in the sand by any means.

Karissa Breen [00:11:04]:
Okay. So following that example before where the investment went overseas, which you and I have spoken about before, that often happens. When you said that, it goes against that whole sovereign capability push, because remember how that was sort of a big thing, you know, 12, 18 months ago? So whatever happened to that theory? It seems like it’s out the window.

Shannon Sedgwick [00:11:21]:
I think there’s still a lot of discussion around it. And from a vendor procurement perspective, when they’re looking at their due diligence around risk management of adopting vendors into environments, data sovereignty and data residency is still a significant issue. It’s one of the main issues, whereas, you know, if you’re a major technology supplier and you don’t have your data centers located in Australia, it is very difficult, particularly in government, whether it’s state, local, or federal, to land those contracts. So you’re finding a lot of investment from international players in setting up data centers here just so they can win some of that, you know, lucrative defense or federal government work. So it’s still a concern, but it seems to pass by the wayside when it comes to investment, which I don’t quite understand fully. I’m sure there’s a reason behind it. They wanna invest in, you know, what’s likely to be the biggest success. And I wasn’t part of, or privy to, the decision making or the assessment done to choose that particular AI provider.

Shannon Sedgwick [00:12:20]:
It did ruffle a lot of feathers. There were a lot of people very disappointed that that was the choice that was made, and you can understand why. There’s a history of underinvestment in technology start ups in Australia.

Karissa Breen [00:12:32]:
And then going back to the whole GDP thing, so you said before the US was 15%, and with what’s happened just here, it’s almost like we’re allowing them to win. We are putting our heads in the sand, and it’s like, oh, well, you know, we’ll just give it to a US player. It feels like we’re sort of almost, you know, not backing ourselves as much.

Shannon Sedgwick [00:12:48]:
I think there’s certainly intent to, but whether the infrastructure allows it, you know, in Australia, from a tech start up point of view, we’ve largely had to rely on equity funding to fund technology start ups rather than debt funding from banking institutions. It has historically been deemed overly risky for mainstream financial institutions to invest in because, you know, there are no tangible assets which they can take control over should that company fail. How would they get their money back? And, statistically, most start ups fail, as we know. So it is difficult to get funding. It’s not impossible to get debt funding, but, you know, other countries make it a lot easier. And there’s a way for, obviously, a mix of debt and equity funding to make sense, but government grants, I think, have been lacking. There are people in government that we have spoken to that are very tech friendly, but it’s just about getting that big machine moving. And I think, as is usual with technology and how we progress as a nation, it has to be led by private industry.

Shannon Sedgwick [00:13:57]:
Well, it doesn’t have to be, but it just usually is. So it comes down to investment by the large corporations that reside here, particularly the big banks. They’re usually quite advanced in terms of their cybersecurity and technology adoption, and oftentimes government regulation or investment falls into line because of that influence from, you know, those larger corporations. So it’ll be interesting to see what the future holds, but there’s the recent investment from the Labor Party, and this should be a nonpartisan issue in my opinion. Hopefully, the new cyber strategy for 2023 to 2030 remains that way, where funding isn’t lost or redirected because a political party changes in the future. It must be political pressure proof, but I have hope for it. You know, we have increasing funding, not just in ASD’s REDSPICE program, where usually most of the money goes to defense, but it’s actually going quite a bit into private industry and helping SMBs. So I have hope for the future for investment in not just cybersecurity, but also technology start ups. So I’m cautiously optimistic.

Karissa Breen [00:15:00]:
Okay. You said something before as well around, you know, deepfakes. I mean, you said a lot of interesting things, and we could go on for hours, but you said it before around polarizing society. Now this is interesting. Would you say, with what you know and your experience, that we’re at a point now, as in today, where things have never been more polarizing than they are, or do you think they were more polarizing earlier? I have a little bit more to add to that, but I wanna hear your response first.

Shannon Sedgwick [00:15:27]:
That’s something I’m quite excited to talk about, actually, because what comes to mind is the risk of deglobalization. There’s a variety of factors that have given rise to questions around the benefits of increased international movement of services, people, capital, tech, and ideas. The growth of nationalism, which is what you’re talking about, that division, that sequestering of ideas and people and tech and capital within your own nation to protect yourself, the creation of both literal and figurative borders, what they call protectionism, and populist movements in recent years have created an environment of increasing uncertainty, and it could potentially lead to deglobalization. That is, like, a reversal or a slowdown of globalization, and nothing sped that up more and exposed that vulnerability than the COVID 19 pandemic. Many countries are heavily reliant on imported goods, and with nations worldwide enforcing border closures and restrictions at an international level over COVID, it made things increasingly difficult. Since then, governments have been increasingly keen to diversify their sources of imports as a protective measure to reduce their dependence on a single trading partner, and Australia has taken significant steps to do that as well. But despite this, a lot of businesses do remain interested in cross border economic engagement, even amid heightened tensions and that saber rattling between Western society, particularly the US and Australia, and China. We continue to engage in that bilateral trade with them.

Shannon Sedgwick [00:17:04]:
The anti globalization movement does pose a threat to economic growth and international relations. But I do believe there’s a bit of a renewed trend, even in the last 6 months, towards increased pragmatism and collaboration, and a lot of that has been fueled, as I mentioned before, by that reaction to Russia’s invasion of Ukraine. And that new pragmatism that we’re seeing is gonna counter the movement towards a more closed off world of self contained economies where, you know, governments are protecting their own industries and citizens from foreign competition, like implementing subsidies and other incentives that favor domestic producers over foreign ones. All behavior stems from incentives, and there are incentives to bolster domestic production and domestic capital flows, but it doesn’t mean that we should throw the baby out with the bathwater and do away completely with bilateral trade and international cooperation, particularly on issues such as cyber. As I said, that needs to be nonpartisan, and, you know, regulation of AI as well, I would argue, falls into that same bucket. So, you know, it is extremely divisive. You only have to look at the US to see the divisive nature between political parties. It doesn’t happen as often in Australia, but it has had an impact here, where people attribute their entire personality to a political party’s beliefs whether they believe them or not.

Shannon Sedgwick [00:18:38]:
They’re not making up their own mind about issues. They’re sort of doing what they’re told, and mainstream media on both sides of the political spectrum do nothing to help that situation. It’s in their best interest to polarize people. But then you look at it from the perspective of nation state enemies of the US: it suits them perfectly. They’re feeding into their agenda perfectly. You only have to look at the attacks on the Democratic National Committee back in 2015 and 2016. They achieved their aims, and they’re still achieving their aims with that division.

Karissa Breen [00:19:12]:
Okay. So I wanna expand on that. Obviously, the US election is coming up. Now I spoke to the head of research from Tenable about deepfakes, who had this whole example around how they can influence certain beliefs of people, as you touched on. So with everything with deepfakes and, obviously, like you’ve mentioned, the gaps widening and polarization between different parties and different countries, etcetera, would you say things will just keep getting worse, and they’re probably not gonna get better? And the reason why I ask that is because, I don’t know, back in the day, there wasn’t social media. Now you open your phone, and it does feel like you are being fed certain bits of information whether we like it or not. It’s just the way things are now. There was no, you know, Twitter and things like that back in the day.

Karissa Breen [00:20:03]:
There was literally, like, one newspaper and 5 channels on television, so you didn’t have a lot of choice. Whereas now, like you said before, around confirmation bias, etcetera, where do you sort of see this problem going now?

Shannon Sedgwick [00:20:16]:
Well, without significant change to those platforms, and I think regulation is the only way to do it. Australia’s regulatory agencies, particularly the Privacy Commissioner, have taken significant steps, even towards the large social media players, to address these issues around, you know, the promotion of hate speech and discriminatory videos and even violent and extremist propaganda and conspiracy theories on mainstream platforms, but they’re still a long way behind because those platforms just wield so much power. You know, there’s a reason why these platforms are free. You are the product as the user, and it’s never going away, so we have to adapt to that. It’s similar to the long contested issue of gun control in the US. It’s never going away. There’s a middle ground that has to be found, and it’s the same with social media and the proliferation of extremist views because of these algorithms that spin up. You know, if I watch a lot of cooking shows, all I’m gonna see is cooking show content, not just on one platform but across all of my platforms, because it’s collecting information about me, and soon enough, in ads and Google searches, I’m gonna be seeing things about cooking products.

Shannon Sedgwick [00:21:33]:
It’s the same way with extremist content. If I start watching Fox News videos and things like that, I’m gonna start seeing more along the lines of, you know, Fox News content and things from the Republican Party and the likes of Mitch McConnell and these types of content. And, you know, I’m not criticizing one party or another, because if I watched Russia TV, I’d get the same. I’d be being fed Russian propaganda mixed in with real news. And it becomes increasingly difficult to discern truth from fiction. We live, and I’ve said this before in our conversations, in a post truth age where someone’s emotional reaction to information matters more than whether that information is true or not. And there’s this saying that we hear regularly from conspiracy theorists and extremists on both sides of the political spectrum: do your own research. Well, I agree in principle that one should do their own research, but how do you know what you’re researching is providing a critical view of a topic and not just feeding confirmation bias? How do you know it’s true? How do you validate it? How do you fact check? Are you reading both sides of an argument, or are you just considering your own, and it’s validating you and everybody else is stupid? Because that’s what happens.

Shannon Sedgwick [00:22:51]:
And unless you’ve done critical research, you know, typically at the university level, where you understand that necessity for viewing both sides of an argument and researching critically and challenging your own beliefs, most people don’t do that. They just see what is fed to them, and particularly for people who didn’t grow up with access to technology like many young people have now, it is easy to believe whatever you read. It’s a significant issue. It’s a massive issue. It’s creating a huge divide. However, it’s extremely profitable for certain large players. So, yeah, it’s a thorny problem, and I don’t have the answers to it.

Shannon Sedgwick [00:23:30]:
But the only way, in my mind, is through government regulation of those social media platforms.

Karissa Breen [00:23:38]:
So following that a little bit more, and this seems like a really basic question, but do things just appear worse because, again, like I mentioned, there is social media now and things are more ubiquitous? Like, back in the day, there was none of that. There was one radio station, 5 channels. Are things actually worse, or do they just feel worse because we can see more of it?

Shannon Sedgwick [00:24:00]:
I don’t think it’s as bad as it’s made out to be. Sure, there are certain things, such as the rising cost of living and the issues around housing affordability and things like that, that cause a real issue amongst young people and are deeply concerning, as well they should be. You know, people can’t afford to live even on the average wage or beyond the average wage. That divide between the haves and the have nots is quite frankly disturbing. So there is the bad side of things, but there’s a lot of good being done as well. You know, the adoption of technology is a good thing. And, as I mentioned before, the counter to those populist and nationalistic movements is not doing away with globalization. There are people with power that wanna see that collaboration, and Five Eyes is a good example of that, the intelligence sharing framework set up between the Five Eyes countries, you know, which Australia is a part of.

Shannon Sedgwick [00:25:03]:
There’s a lot of good happening in the world, but good doesn’t sell. Bad news is what sells. And if you are a constant imbiber, for lack of a better word, of mainstream media and social media, it can become quite poisonous because, as we spoke about earlier, the algorithms spin up exactly what you’ve been watching lately. And it can be quite harmful to your mental health because you see all of these, you know, these filters for everything. You see these seemingly perfect people who live their lives traveling the world, and they seem to have no cares, and you’re working a 9 to 5 in an office. It can cause mental health issues. And, you know, the complaint around Gen Z not wanting to work, I think, is utterly wrong. They just grew up in a different time. There are different priorities now.

Shannon Sedgwick [00:25:54]:
Work is not the priority that it used to be. It’s more of a way to provide the life that they really wanna live. So it’s just a differing priority. It’ll be interesting to see where future generations go as, you know, the boomer generation and Gen X generation exit positions of leadership and millennials come into positions of power, and then Gen Z after us. It’s gonna be interesting how that worldview is shaped by people who lived through drastically different generations. It’s gonna be quite interesting to see, but I don’t think the news is as bad as we think it is. I think it just seems worse because of the media that we ingest.

Karissa Breen [00:26:35]:
You also mentioned before around keeping tabs on AI. Would you say that we are doing that? Again, I know it’s a big thing, a big problem. It’s not an easy fix. It’s not a flick of a light switch. I get that. But does it just feel like we’re keeping tabs, or are we in actuality keeping tabs, would you say?

Shannon Sedgwick [00:26:55]:
I think we are keeping tabs. You know, from my conversations with people in government, those with a vested interest in this, you know, defense, and particularly our intelligence agencies, have a close eye on this issue, but also from a regulatory and a policy point of view. The Department of Home Affairs, rightly so, are keeping a close eye on this, and regulatory and legal frameworks are being considered around how to govern AI. But it is a difficult problem, because how do you regulate something that’s so fast moving? You have to future proof any regulation because of how long it takes to pass through government systems. So I think that they’re certainly not sticking their head in the sand. They’re aware of the risks, but I think we’re also aware of the benefits of AI. And, again, that’s being led by private industry in my experience, like property and infrastructure and construction groups, insurance in a massive way, financial services, that is banking, critical infrastructure, and marketing in particular. And from a manufacturing point of view as well, AI and automation, you know, just assisting with maintenance of plant and equipment, has been a game changer. And I think people are coming to the realization that AI isn’t the threat to workers’ jobs that they once thought it was.

Shannon Sedgwick [00:28:13]:
It’s more about bolstering their capabilities. So what government is doing right now is not only looking at, you know, the risks of AI being used in an adverse sense, but also looking at how we can use it, and educate, and provide that continual training for staff so that they’re not replaced by AI but are allowed to be more effective with AI. AI is a long way from being autonomous or being able to have the same level of quality outputs that a human would in most tasks. Obviously, for some things which require drawing inferences from large sets of data, nothing beats AI. It’s what it’s made for. It’s what it’s built for, and that’s why it’s perfect for the likes of law firms and those who are gleaning intelligence, or actionable intelligence, out of large sets of data. So there are benefits and risks, and I think we have some very smart people in government that have their finger on the pulse of this issue; it just depends on how well we can craft that language. But their push towards collaboration between private industry and the public sector is something that should be lauded, and it’s been great recently.

Karissa Breen [00:29:22]:
Okay. So, Shannon, you said as well, and I’m trying to go through all the things you said at the start and touch on all of them because it was all interesting, you said it’s gonna be interesting to see how China acts over the next 2 or so years. So how do you expect them to act if you sort of had to hypothesize what they’re gonna do?

Shannon Sedgwick [00:29:41]:
A lot of what they do will be largely reactive. You know, obviously, they’re pushing for more control over major shipping lanes in the South China Sea and going hard after Taiwan, and they make very aggressive overtures in public, but then again, so does the US and so do we. It’s become a back and forth of saber rattling, which sort of heightened around COVID time with the beef cattle industry being impacted dramatically, and I think grain got impacted dramatically as well, when they weren’t accepting our exports. Most of our exports from an agricultural perspective head out to China, and that was severely impactful to our economy, which is not very mature compared to other countries in terms of diversification. We’re reliant on 3 or 4 different industries as a whole rather than, you know, 10 or more like a lot of other countries. Largely agriculture, the tertiary education sector, and tourism are probably the main three that come to mind, at least. I think we need to be very careful there. We have to play the hesitant maiden, for lack of a better term, a bit of a Switzerland role where we’re neutral. Whilst we obviously have taken the side of the US on most issues, we need to be very careful about toeing that line when it comes to China because we are economically reliant on them, and we need to ensure that that bilateral trade continues.

Shannon Sedgwick [00:31:11]:
So, you know, we have to tread carefully with this. We can publicly denounce some of the actions that they take, but I think nobody wants kinetic warfare or even any form of warfare because it’s just bad for business, to be frank. But we’ve already seen from a cyber perspective almost a cold war happening behind the scenes, with increasing attacks by nation states on critical infrastructure assets. So I think that will increasingly happen, and our intelligence agencies would be aware of that and shoring up those protections, and ensuring that private industry owners of critical infrastructure assets, through the Security of Critical Infrastructure Act and the new obligations coming in there, are doing their part to shore up those assets. But we’re still a long way from being where we need to be from a maturity level at a national perspective.

Karissa Breen [00:32:04]:
So going back to your comments before around hesitant maiden, which I agree with.

Shannon Sedgwick [00:32:10]:
Yeah.

Karissa Breen [00:32:11]:
Would you say, with your experience and knowledge, that it’s a better position to be in, like, for Australia specifically? Because, again, we’re not like the US, where we’re gonna come out and polarize people, you know, hardcore. We probably sit on the fence a lot more. Would you say that is advantageous to us? And I know you mentioned before around economically relying on China and friends. Would you say that would be the better position to be in as a nation, given how it is today?

Shannon Sedgwick [00:32:37]:
At a simplified level, yes. I think we need to, as I mentioned, tread very carefully and tread that line. I think if it did come to kinetic warfare, and knock on wood that doesn’t happen, we would obviously have to choose a side, because Australia would be a strategic stepping ground for other nation states to use from a warfare perspective. But I think what will be a precursor to that are cyberattacks. Increasingly, as we saw in Ukraine, cyberattacks are a precursor to kinetic warfare. We call it preparing the battlespace in military parlance, which is where you’re softening a target and preparing a target before conducting kinetic activities. And what I mean by kinetic warfare, if people don’t understand that reference, is what you think of as warfare: soldiers, tanks, artillery, navy, aircraft, physical warfare. Whereas cyber warfare, as we know, targets, you know, critical infrastructure assets and anything which will have a material impact on a nationally significant target. And I think with satellite communications and technology and Internet enabled capabilities, and targeting those of your opponent, they’re essentially using cyber warfare as a proxy war fighting capability.

Shannon Sedgwick [00:33:59]:
But it also allows them deniability, which is interesting, because attribution is incredibly difficult in cyberspace, attributing a particular attack to a threat actor, unless they outright own it. And even then, it’s difficult to be sure. Unless you’ve got offensive capabilities like the ASD and the NSA and other intelligence agencies, which it’s illegal for anybody in Australia other than the ASD to use, it’s almost impossible to identify who is who. So there are gonna be significantly more attacks stemming from nation states, and how we can prove attribution is difficult. You know, there’s a good recent example of that, actually, from a China based perspective. I don’t know if you recall, but I think it was a couple of months ago, an Australian mining company that works in rare earth minerals disclosed a cyber incident to the ASX. They’re publicly listed. The day before the attack, Australia’s treasurer ordered 5 foreign persons of a certain nationality to divest their shares in the business, and that decision was to protect national interest and ensure compliance with probably something set out by the Foreign Investment Review Board.

Shannon Sedgwick [00:35:12]:
That is, you know, their investment framework. And in the ASX incident disclosure, the mining company that got affected said that the exfiltrated data had been released on the dark web, but it hadn’t had a material impact on their operations or broader systems. But this is one of those cases where the threat actor group, which is called BianLian, claimed responsibility on its dark web site and said that it was personal data relating to employees as well as financial data. And although there is an identified link between BianLian and China, the timing of the publication of that attack from BianLian is notable. Obviously, there’s no definitive proof. Maybe intelligence agencies have it. I don’t have access to that intelligence, but you can see that there is recompense sought against perceived slights from nation state actors.

Shannon Sedgwick [00:36:05]:
So even seemingly innocuous decisions of getting people out of the country, or getting them to sell their interest in a strategic Australian asset, can have ramifications at a national level for us. So it’s very interesting to see that type of proxy warfare. It’s not quite warfare; it’s more a sort of one upping one another. So it’ll be interesting to see where that goes, and it won’t be the last time we see that.

Karissa Breen [00:36:32]:
So going back to your military parlance around preparing the battlespace, wouldn’t you say this is going to be the future of war now? So to your point, cyber warfare is a precursor, which then could lead into kinetic warfare. Isn’t this going to be how it goes now? Because like you said, the cyber stuff is gonna soften that battleground to then, you know, get the target ready to lead into that kinetic warfare.

Shannon Sedgwick [00:36:58]:
Yes. Certainly. And the targets will almost certainly be critical infrastructure, which is why critical infrastructure security, particularly in Australia, is a significant concern, because this is the way wars are fought now. Cyberattacks will be a precursor to kinetic warfare and almost an advanced warning of kinetic warfare should the saber rattling progress beyond what is, you know, still largely words and minor skirmishes in cyberspace. It’s now apparent that most organizations where an attack would cause a material impact to the populace are now classed as critical infrastructure, and they’re expected to allocate sufficient capital to cybersecurity. But, you know, from a critical infrastructure perspective, if we view it from the CIA triad, which is confidentiality, integrity, and availability of systems, most attacks on critical infrastructure as a precursor to kinetic warfare will be on the availability of systems, not so much around confidentiality or integrity of data. So from an industrial infrastructure perspective, you know, there was the Triton malware attack on a Saudi petrochemical plant that nearly caused a huge explosion because it allowed the hackers to take over the plant’s safety systems. Likewise, during COVID 19, there were the Israeli water systems, and we don’t have to think very hard about who the potential attacker might have been there, or at least who funded them.

Shannon Sedgwick [00:38:20]:
They endured multiple cyberattacks designed to compromise the industrial control systems at pumping stations, water and wastewater plants, and agricultural pumps. And then there was a successful one in the US with Colonial Pipeline. Do you remember that? That’s the largest pipeline in the US, and it was hit by a massive ransomware attack. They supply almost 50% of the US East Coast’s gas, diesel, and jet fuel, and they were forced to shut down their operations entirely for 11 days. And even after 11 days, they only partially recovered, and they ended up having to pay $5,000,000 USD in ransom. And even from an espionage point of view, with the evolution of technology, trying to get your hands on secrets in a digital environment has become increasingly more advantageous, and there’s that deniability there. Whereas, you know, to gather secrets before, of course, there was still SIGINT, that is signals intelligence you could gain from satellites and things like that or overhearing radio communications, but we didn’t have cyberspace to tap into, which is where most information resides these days.

Shannon Sedgwick [00:39:28]:
You had to rely on human intelligence sources and verifying those and developing, you know, human assets in positions of power in target countries, whether they be allied or what we see as potential adversaries, from an intelligence gathering perspective. The digital environment has flattened or even collapsed geography as we understand it because we can reach out and touch somebody on the other side of the planet. Now our data and assets are increasingly being stored not just on on prem systems and networks but in the cloud. So the core concepts around intelligence and clandestine operations and espionage still persist, but the growth of SIGINT and intelligence gathered through cyberspace has exploded well beyond that of human intelligence gathering.

Karissa Breen [00:40:22]:
So, Shannon, this does seem like a very broad question, but where do you think we go from here? What do you think happens now?

Shannon Sedgwick [00:40:29]:
We keep going as we have been. Australia’s made some significant steps with legislation, a few changes to the Privacy Act 1988 and the uplift of that to be more in line with the likes of GDPR, with a particular focus on the consent of data owners to the use of their information, things like that, and the storage and governance of data, which is a significant issue for most of our clients. You know, 50% of the work we’ve been doing lately has been data mapping and data governance services, and the use of AI as well in terms of that data governance question, around privacy impact analysis. So, you know, the adoption of AI and automation will continue to increase unabated. I think that we will be playing catch up from a regulatory perspective. That speed of digitization was accelerated further during COVID 19, and it hasn’t really slowed down. And as a result, I think, you know, with espionage, data theft, and attempts to disrupt our day to day lives as they go digital, malicious cyber activities and cybercrime will only increase, and cyber risks will become more pronounced. But I think from a private industry perspective, at least, we’re taking the steps necessary, particularly with the support of the government with its new cyber strategy.

Shannon Sedgwick [00:41:44]:
I’m hopeful that that will continue to spur on investment and capital allocation to not only the adoption of technology, but the addressing of risk from both a data and a system perspective in that adoption of technology. Malicious actors will continue to target key assets in critical infrastructure, like disabling health care services and stealing research, while inflicting reputational costs on corporations and governments, and the online world’s gonna continue to have a significant impact on geopolitics and vice versa. In both sectors, intelligence is a hugely important factor. Gathering and verifying actionable intelligence allows decision makers in government and industry to address risk effectively and predict what’s most likely to occur in the future.
