The Voice of Cyber®

KBKAST
KB On The Go: ISACA Beyond Tomorrow
First Aired: October 04, 2024

In this bonus episode, KB is on the go at ISACA's Beyond Tomorrow Conference in Melbourne. KB sits down with industry leaders, including Erik Prusch, CEO of ISACA, who discusses the organization's expanding global influence and its pivotal role in career development for its 180,000 members. They also delve into the critical topics of AI's transformative power across sectors, the intricacies of third-party risk management, and the indispensable importance of mastering basic cybersecurity practices. Erik is joined by fellow experts Jamie Norton, Chirag Joshi, Francine Hoo, Kate Raulings, Richard Magalad, Sam Mackenzie, and Wayne Rodrigues, who bring their own expertise and stories to the table, sharing the newest developments and challenges in cybersecurity and critical infrastructure.

Erik Prusch

Erik is an experienced CEO and board director for major tech companies. Prior to joining ISACA, he was most recently chief executive officer at Harland Clarke Holdings Corp., a provider of integrated payment solutions and integrated marketing services. He has also served as CEO for Outerwall, Lumension, NetMotion Wireless, Clearwire and Borland Software Corporation. Additionally, he has been a board member for RealNetworks, WASH, Calero Software and Keynote Systems. Previously in his career, Erik served as chief financial officer for a number of public companies, such as Identix and Borland, and for divisions of public companies, such as Gateway Computers and PepsiCo. He began his career at Deloitte & Touche (then Touche Ross). Erik holds a bachelor’s degree from Yale University and an MBA from NYU’s Stern School of Business.

Jamie Norton

Jamie Norton, CISA, CISM, CGEIT, CISSP, CIPM is a Partner at McGrathNicol, a specialist Advisory and Restructuring firm committed to helping businesses improve performance, manage risk, and achieve stability and growth. He also serves on the Advisory Board at Avertro, a cybersecurity startup enabling informed and defensible data-driven decisions about organisational cyber resilience and AI safety. He has over 25 years' experience in managing security resilience for State and Federal Government agencies and commercial organisations. He is the former Chief Information Security Officer (CISO) at the Australian Taxation Office (ATO), one of Australia's largest federal government agencies, where he led the security governance, risk, intelligence & operations, testing and forensics teams. He has chaired and supported several senior industry and interdepartmental committees on cyber strategy and resilience, and was the senior Australian representative at international government forums on cybercrime. He has previously held leadership roles at NEC, Tenable, Check Point, and the World Health Organization.

Jamie has been involved with ISACA for nearly 20 years, serving on the local chapter board, as a conference organiser and, most recently, with the CISM Certification Working Group. He holds degrees in accounting and information technology from the Australian National University and is an affiliate member of Chartered Accountants Australia and New Zealand. Jamie is a regular and accomplished industry speaker and media commentator on cyber security. He is based in Australia.

Chirag Joshi

Chirag Joshi, a multi-award winning cyber security executive, brings extensive experience in leading cyber security and risk management programs across various industries, including critical infrastructure sectors such as financial services and energy. His expertise in both IT and OT environments, coupled with his experience in managing cyber security through mergers and acquisitions, makes him uniquely qualified to address the challenges of the SOCI Act. As the author of bestselling books on cyber security and a recognised thought leader, Chirag offers valuable insights into practical implementation strategies and behavioural aspects of security awareness. His role as Founder and CISO at 7 Rules Cyber, combined with his experience in leading multi-million-dollar cyber transformation initiatives, positions him to provide actionable advice on navigating the complex landscape of critical infrastructure protection, supply chain resilience, and cyber risk management in the context of the SOCI Act and beyond.

Francine Hoo

Francine is a Director with KPMG's Data team focusing on building trusted data practices. She has helped build, assure and audit multiple frameworks including governance, data management, data analytics practices, privacy, risk and compliance. Having started in audit, she leverages her combined experience to help build evidence-based, human-centric, ethical and trustworthy data practices. She has helped teams build AI assurance frameworks to ensure safer and more reliable deployment of AI-based outcomes. She passionately believes that humans are accountable for the right use of data and therefore for the sufficient and appropriate risk management of data operations in all its forms, including AI and automation. The future of data-driven outcomes, including AI, is dependent on and strengthened by the partnership of a diversity of thinking, where humans collaborate with tech.

Kate Raulings

Kate didn't start her career in cyber security. Computers and internet connected devices weren't common at the time. She has a deep understanding of business imperatives developed over a decade's experience in senior communications and innovation roles before focusing on IT strategy, governance and cyber security for the last 8 years. She regularly briefs senior executives, audit and risk committees, and boards on privacy and cyber security matters and has supported numerous organisations through notifiable data breaches. Kate has a Master's in Marketing and an MBA from the University of Melbourne as well as CISM certification. She has won local and global recognition for her success in digital communication and was a finalist in the Women in ICT Awards in 2022 and 2023. She is a member of the Australian Women in Security Network and an ISACA member. Kate is the CISO at EPA Victoria.

Richard Magalad

Richard is a 30-year veteran of the ICT industry, starting at the Commonwealth Bank, and from 2010 spent 10 years as IT director at a mining company with gold and diamond projects in Australia, Laos and Canada.
His current projects include systems integration for two of the large telcos and several agencies in the Australian Federal Government. He is a hands-on technologist with a philosophy of never separating cyber security from information technology, just as he was trained in the highly secure arenas of banking and government.
He has consulted on and trained cyber security in South East Asia for governments and critical infrastructure enterprises on missions for the Department of Foreign Affairs and Trade and with RMIT University, where he now lectures in cyber security to professional students.
He was an executive committee member and secretary at the Australian Computer Society (Victoria) until 2022 and is the current chairperson of the Cloud Branch of the Australian Information Security Association.

Sam Mackenzie

Sam Mackenzie is a cybersecurity committee member with the ACS Victoria Branch and brings 25 years of experience speaking straightforward cybersecurity and technology with business leaders.
Having worked with global brands overseas and household names in Australia, he’s known for creating high-performance teams across the sectors of health, telecoms, energy and more recently local government. His approach is characterised by structured thinking, simplifying complexity and developing culture as a catalyst for change.

Wayne Rodrigues

Wayne Rodrigues is currently a Security Architect at Insignia Financial and an active member of the cybersecurity community.

Having been involved with ISACA Melbourne in the very early stages of his career, he has remained an active member and volunteer for the past 12 years. He is also part of various other initiatives such as the Purple Team Australia mentoring program and the EC-Council Career mentoring program. Being a keen advocate for continuous learning and growth, he loves mentoring others in the industry. Wayne believes these initiatives are an excellent opportunity to give back to the community and help mould the next generation of industry professionals.

Help Us Improve

Please take two minutes to write a quick and honest review on your perception of KBKast, and what value it brings to you professionally. The button below will open a new tab, and allow you to add your thoughts to either (or both!) of the two podcast review aggregators, Apple Podcasts or Podchaser.

Episode Transcription

These transcriptions are automatically generated. Please excuse any errors in the text.

Karissa Breen [00:00:16]:
Welcome to KB On the Go. And today, I'm on the go in Melbourne at ISACA's Beyond Tomorrow Conference. I'll be reporting on the ground here at Collins Square Event Centre. Beyond Tomorrow focuses on technology, process, and people with 5 keynote sessions, 24 presentations, panel sessions, fireside chats with over 30 thought leaders and subject matter experts, plus so much more. So for this bonus episode, I've lined up a few interviews with members, including the CEO of ISACA, Erik Prusch. So please stay tuned. Joining me now in person is Erik Prusch, chief executive officer of ISACA. So, Erik, thanks for joining all the way from the US, and welcome.

Erik Prusch [00:00:57]:
Thank you very much.

Karissa Breen [00:00:58]:
So Erik, you've recently joined ISACA, just over a year ago, as the CEO. So maybe talk us through your vision.

Erik Prusch [00:01:05]:
You know, I joined ISACA to lead an organization, you know, that's been around since 1969, 180,000 members across the globe, 225 chapters affiliated with ISACA, and

Erik Prusch [00:01:19]:
we have members in 188 countries. And the opportunity for me was a great one, which is I wanted to find an organization that not only had a reason for being, but did it better than anybody else. And one of the things that ISACA has really showed me is what the capability of 180,000 people all moving together in a direction to create opportunities for all of them can mean. How does it work? And ISACA is one of those rare instances where an organization is so passionate, so engaged not only about their professions, but about the membership in ISACA, that it's astounding. There's nothing like it. And that's demonstrated by not only our returning members, we have a very high retention rate on members, but how long they stay members with us and how active they are in the chapters and in really leading ISACA forward. And the vision is really about those members. It's about bringing capability to all of those members to allow them to pursue their careers, to pursue their journeys, and make certain that they are the experts at what they do, that they are the leaders and they are the knowledge centers within their enterprises and are able to deliver outsized impact into their organizations.

Erik Prusch [00:02:39]:
That's what we do every day. That's what we wake up for, and that's what I get excited about.

Karissa Breen [00:02:42]:
Do you sort of go on the ground a lot and speak to the members, so you're sort of hearing it at the coalface around, you know, what people are up to, what's happening, what's their viewpoint? Are you doing that a lot?

Erik Prusch [00:02:52]:
Consistently. I think that's part of what we think about as our advantages, that 180,000 across the globe. I liken it to being crowdsourced. Right? We're crowdsourcing not only where this industry should go, they're also crowdsourcing the ways that we're conveying information and feeding them in terms of where they're going in their careers and how they're getting there. What that allows us to be is real time in what we do. So we can be fastest to create solutions. We can also be the most instructive in terms of having the ability to not only certify or develop content or develop training, but be end to end for those members; where others may just focus on certification or they may just focus on training, we're able to provide, I think, an end to end for those members. But 180,000 people are touch points into all of their enterprises and all of the geographies that we represent.

Karissa Breen [00:03:53]:
The reason why I asked you that question is, like, you've just touched on that 180,000 people, and that's a lot of people. And a lot of, you know, CEOs and executives, they have a security detail, they're in their car, they're not going out and talking to the person that's on the front line or at the front desk. So I think it's important that people who are listening to this, who are ISACA members or willing to, you know, become an ISACA member, they understand that, because I think it's important. And I've watched a lot of that Undercover Boss show, and I think that sometimes when those people go undercover and actually see at the coalface what people are dealing with, it really changes their perspective. And I think that's an important thing because it is built on members.

Erik Prusch [00:04:32]:
That’s correct.

Karissa Breen [00:04:32]:
So I appreciate you sharing that. So I'm aware that AI is a focus for you, so maybe tell us more. I know that it's being discussed a lot here today, and I've had a few sort of sound bite interviews with a few people touching on AI, but I'd love to hear your thoughts. Wherever that takes you, please start.

Erik Prusch [00:04:50]:
AI is a new frontier. Everything's changing as a result of it. How we do our jobs, what areas of opportunity there are, what areas of threat there are. Everything that we've touched is changing with AI, so much so that it's becoming a part of our common speak. ISACA has delivered 6 new courses in record time in order to make certain that our members understand AI. But we're uncovering what the needs are. We're uncovering what the level of understanding with our experts is, and making certain that we're helping to guide our members going forward. I've said it before.

Erik Prusch [00:05:28]:
I think one of the biggest challenges for us is that we don't even understand the problem. Cybersecurity, IT audit, risk assurance, governance don't even understand yet the quantity of the problem, the proliferation of the problem, much less develop the solutions. And we are doing that every day, not only using our 180,000 touch points, but making certain that we're providing, as real time as possible, the understanding to help guide our folks to be able to deliver for their enterprises. And to me, that is something that's unique because of our network. And it's something that creates an opportunity for us to influence regulatory bodies, influence governments, influence enterprises along the way. And that's what we need: a collective force around that. But AI is a new frontier and we're excited about it. We think we've got a major role to play in it.

Erik Prusch [00:06:27]:
The 6 courses are just scratching the surface of it, but we're evolving to a point where we're speeding up the knowledge. We're getting it to market faster than we've done in the past, which is also making certain that we're responsive to the needs of our members. And remember, our members aren't our customers. Our members are our stakeholders. And when we think about what we're trying to equip them with, it's not to sell them a product. It is to make certain that they're informed, and the most informed of anybody within their enterprise. That's a much more aspirational goal.

Karissa Breen [00:07:02]:
So a couple of things in there. You said we, as in ISACA, play a major role. What does that look like?

Erik Prusch [00:07:09]:
It's end to end understanding of how this is coming into the enterprise. We need to know where threats are. We need to understand the procedures that we're going to deploy. We need to understand the risk that enterprises are taking on when they're introducing AI into their companies. It is understanding it end to end as we've done before. And the only difference is that the landscape changed 9 months ago, significantly, and it wasn't that AI was created 9 months ago. It came to such a significant role within enterprise 9 months ago that all of a sudden we now are into that crunch phase of making certain that we understand it end to end, and it's still evolving.

Erik Prusch [00:07:55]:
So we’re trying to get out in front of something that is constantly evolving. The only way you’re going to do that is with scale. You’re going to do that with those touch points and you’re going to do it by being responsive to those stakeholders, in the ways that they need to be informed. But we still have so far to go when we ask our members how well do they understand it. It’s not enough. When we ask enterprises how well do they understand it, it’s not enough. We’ve gotta go further, faster, and in a much more significant and scalable way than we’ve ever done before.

Karissa Breen [00:08:25]:
And how would you know if you understood something enough?

Erik Prusch [00:08:28]:
So I don't know that there's ever enough. Right? But there are degrees of confidence. Right? Maybe the incidences go down through time as a percentage of total. Right? Maybe our confidence of being able to anticipate problems gets better. But I don't know that there's ever enough. I don't think that there's ever been a time that we said, yeah, we're good from a cybersecurity standpoint. Right? Because we've continued to have breaches, and it's not AI related. These are old school breaches, whether it's updates or other problems that get thrust into our environments.

Erik Prusch [00:09:05]:
The dependence on working with other institutions or other enterprises ends up impacting ours. Right? So if you think about what we have from a digital trust perspective, it is to make certain everybody is working in a coordinated manner, that you understand all of your vulnerabilities, whether inside your firewall or outside of your firewall. Suppliers, employees, customers. Every aspect is a potential vulnerability. What we have to do, though, is we have to evolve faster than the problems that are evolving in order to make headway on it. And we're not at that point yet, meaning as an industry, as cybersecurity, IT audit, risk, governance, we haven't evolved faster than AI is evolving currently.

Karissa Breen [00:09:52]:
Do you think as well, from my understanding of interviewing people like yourself on the show, the parallels that I'm drawing is, like you said, it's a new frontier. It's a new sort of thing in terms of how familiar people are with the problem. Do you think this is gonna take time? And I feel like that's such a cop out of an answer because everything, of course, takes time. But what I mean by that is, look at when the Internet, like, came out in the nineties. Right? People were saying it wasn't gonna go anywhere. It took a bit of time for people to understand how it works. How can people leverage it, whether it's for good or bad? Do you think the same sort of approach is going to happen with AI?

Erik Prusch [00:10:25]:
I think that AI has the ability to proliferate well beyond what the Internet started out as. And while there is definitely a parallel, which is there's always going to be bad guys, there's always going to be malfeasance, there's always going to be some challenge that is identified with everything that we do, I think the fact of the matter is what we've got is a different scale of problem than we had when the internet started. In the nineties, when you had a bad website or you had a bad actor, it was pretty easy to identify, or it was easier to identify and it was easier to discontinue, because it wasn't everywhere all at the same time. I think with AI, it is in more aspects than the internet is. It's not just in one medium. It's not just through one access point.

Erik Prusch [00:11:20]:
There are now infinite numbers of access points. And I think that makes the scale of the problem greater. I think that means that our knowledge and capability has to be much greater than it's been in the past. We've gotta retool, reskill, along the way.

Karissa Breen [00:11:35]:
So going back to your comment around, like, people don't quite, you know, understand sort of maybe the fidelity of the problem. Do you think anyone on this earth really does understand it though?

Erik Prusch [00:11:46]:
No. I don't. And even if people understand it more than others, the question is whether they're gonna be using it for good purposes or whether they're gonna be using it for bad purposes. I think what we've got to do is make certain that we're setting those standards, that we're having those discussions, we're making those determinations, how society wants to use AI. I think that is as important as anything else. I think there's a lot of money that's behind AI. There's not only the money around developing AI or incorporating AI, but there's also market capitalizations that are reflecting whether or not AI is within. And we're talking about trillions of dollars, right? We've seen Nvidia grow so dramatically at the possibility of AI.

Erik Prusch [00:12:35]:
And that's just from a chipset standpoint. We haven't even thought about the downstream consequences. So you've got a lot of money that's moving in to try and exploit AI. And now the question is, how do we make certain that that is all being done consistent with what society wants to be done? Whether that's ethical, or whether that's our ethics, I'll say, or whether that's understanding vulnerabilities or whether that's understanding IP. I mean, you think about the problems. They're significant. None of them are easy.

Erik Prusch [00:13:08]:
There's not one problem in this that's easy. Ethics is not easy. While an enterprise can determine the ethics that they want to deploy, the fact of the matter is what matters equally as much is what the other competitors to that company are doing and how they're doing it. And then on top of that, it's what does society or the government, local governments or federal governments of countries, wanna do on top of that. And what you see is lots of layers that we've gotta permeate, or penetrate through, I'll say, in order to make certain we've got those common structures necessary in order to make this productive and utilized in the best possible way.

Karissa Breen [00:13:46]:
I’ve spoken to other ISACA people, Mary Carmichael, Jenai Minkovich, about regulation, what ISACA is doing in that space. But I wanna hear maybe your your perspective on how do we get to that point of regulation, but also in my discussions with people, they’re sort of just saying, well, the government needs to regulate it. Do you think people are perhaps just throwing that problem over the fence? We’re like, okay. Well, you guys deal with it. And do you think anyone really knows how to deal with it effectively?

Erik Prusch [00:14:16]:
Yeah. I think we're still learning. I think advocacy is certainly an important element of it, but I can go back to some reactions that were really difficult, that were not guided well. The adoption of SOX was a really good one, in the early 2000s, where it was in response to a problem. It had little guidance to it. It had marginal effectiveness, and it had a very long tail that required evolution through time to get better. What was the purpose of it? You know, how did the regulation come to be? And that was a very narrow problem that it was trying to solve, or relative to AI, a very narrow problem. Yes. I don't think it's good enough just to say the government will take care of it.

Erik Prusch [00:15:02]:
Much the same way as we have regulation today around anti-competitiveness. That doesn't seem to be working in terms of preventing problems from occurring. We've got to do better. We've got to engage in the conversations. We've got to get the adoption of a set of standards that we can all understand and benefit from, and self regulate as much as outside regulation is gonna do. If we don't, then I think the problem is gonna continue to proliferate unchecked, as it is, I think, today.

Karissa Breen [00:15:35]:
So I like to live with the philosophy that the majority of people out there in the world are good people, and there are bad people. So to your earlier point around some people may use AI for good and for bad. So with that, you know, following that sort of track a bit more, would you say that the majority of people will use AI for good? Or do you think that maybe more people may use it for bad? If you had to hypothesize.

Erik Prusch [00:15:58]:
Yeah. I wouldn't even wanna speculate on it, because I believe the same thing. I think more people are good. I think the bad guys are in smaller numbers. And I think that acting for good will stand the test of time. And if you're acting for bad, it will be short lived before somebody figures out that problem. I think we are paid to make certain that we enable organizations to grow. We enable organizations to grow the right way.

Erik Prusch [00:16:28]:
And if you believe that people are good and they want to be a part of good, long standing economic value creation, and that the organizations behind it want the same things, I think we'll get to the solutions in short order. I think the problems will be very clear over time and there'll be a long tail that we're having to overcome. Just like we had a long tail on the internet, just like we had a long tail on cloud, just like we had a long tail on personal information and security. It still persists. It's just getting more infrequent, just like breaches in cybersecurity are becoming less frequent, but they're still occurring. They're still coming along at a pace.

Karissa Breen [00:17:13]:
So maybe just following the good side of AI, I know that, you know, people are adopting it in good ways. So is there anything, like, any sort of insights or observations you'd like to share about how businesses are adopting AI?

Erik Prusch [00:17:26]:
Yeah. I mean, I think that what AI is doing is maximizing the contribution of individuals. I think we're able to steer to higher value, rather than necessarily low end monotony. And that doesn't displace people. What it does is it effectively makes them more productive, possibly more creative, possibly more efficient. And to me, that's good. That's continued evolution. We haven't seen the capacity of people yet, and therefore I have 100% confidence that people matched with AI are gonna contribute a lot more than just people or just AI.

Erik Prusch [00:18:11]:
So to me, it's fascinating to watch people evolving around what these possibilities are.

Karissa Breen [00:18:17]:
Now I know you have spoken at your fireside chat. Maybe share a little bit more about what that was about.

Erik Prusch [00:18:25]:
Oh, so, you know, what we're trying to do is make certain that, one, we're informing folks about what we're doing as ISACA. While our members are engaged, I think there's always a question about what it is that we're doing in order to take it forward. So we wanna make certain that we're having a great conversation with our members and that they know the direction that we're heading and why we're heading in those directions. We've certainly got some compelling growth initiatives scheduled for 2025. We're certainly gonna continue to build on making certain that we're bringing more members in, that we're bringing in newer people, not only from academic institutions, but making certain that we've got a good on ramp into our membership. We also want to make certain that we're adapting to this AI environment and that we're thinking through our certifications, our training and our knowledge bases in a productive way, but we're building infrastructure too that makes it easier for members to find what they need all along the journey. And given that we handle all of these domains, and we're somewhat unique in terms of the domains that we handle as an association, that allows for a lot of customization too. So we're thinking ahead on what may be required of our members, and always with a common backdrop that we want them to be the experts.

Erik Prusch [00:19:45]:
We want them to be the go to agents within their enterprises to be able to drive the solutions to these problems that persist.

Karissa Breen [00:19:59]:
Joining me now in person is Chirag Joshi, founder, 7 Rules Cyber. Chirag, thanks for joining me back on the show, and welcome.

Chirag Joshi [00:20:07]:
Thanks, KB. Always such a pleasure to see you.

Karissa Breen [00:20:09]:
So, Chirag, I’m aware that you presented today. So please tell us, what did you present on?

Chirag Joshi [00:20:13]:
So my talk is going to be on the Security of Critical Infrastructure Act and the practical challenges in implementation that have come about with the SOCI Act, as we call it. And it's also beyond that, right, in terms of the lessons that organizations can glean when it comes to looking at their cybersecurity postures in a more holistic manner. So that's our talk. But beyond the cybersecurity, we're also gonna cover a very important part of SOCI, that is, supply chain and physical hazards.

Karissa Breen [00:20:41]:
Okay. So when it comes to critical infrastructure, do you think it's one of those things that isn't, like, front of mind? It sort of feels a little bit relegated; even from a media point of view, there's not a lot of content out there about critical infrastructure, and it is in fact critical because we rely on it every day, but it just seems like one of those things that maybe isn't as prominent as perhaps it should be. What are your thoughts on that?

Chirag Joshi [00:21:03]:
It's a bit of an education journey. So the act itself applies to regulated entities; there's a certain number of entities and industries covered by the SOCI Act. Within that, there is a specific cohort who are deemed systems of national significance. And so the requirements that apply change based on how critical you are to the entire economy, to our society and stability. Now, to your point about us not hearing as much, I think part of that is the government trying to change that. They're trying to engage more with the industry and educate them. Now, there are some organizations who might be critical infrastructure but are going through a maturity journey. And there is an aspect of what I call: even if you put SOCI aside for a second, go back to good cyber risk management practices.

Chirag Joshi [00:21:51]:
There are certain frameworks that are advocated as being good practices, which most of our listeners might be familiar with if they work in cybersecurity: the NIST Cybersecurity Framework, the Essential Eight, the Australian Energy Sector Cyber Security Framework, AESCSF. So that's why I almost put SOCI aside: good cyber risk management first, and then SOCI can be part of your overall hygiene. And then the requirements that come along beyond that can be accommodated based on your business.

Karissa Breen [00:22:21]:
Okay. Do you think, and I'm gonna ask a real basic question, do you think people in general just get the SOCI Act? I mean, there's a lot. Have you read it? Like, it's like 400 plus pages. Like, it's a lot to wrap your head around. Right?

Chirag Joshi [00:22:31]:
I've had the pleasure of enjoying that read, KB. But you're right. There is the aspect of there's a lot there. And I tell people not everything applies to everyone. Right? So I think there's an education aspect. At a very basic level, people need to get a handle on, you know, what their asset inventories are, you know, what key controls they've implemented for things like incident response planning, incident response notifications, vulnerability management. So it goes back to stuff that's not groundbreaking. It is still, you know, good practices now that are required.

Chirag Joshi [00:23:02]:
So there are certain aspects of the SOCI Act which were talked about a lot initially, like, you know, government stepping in to assist when required. That creates some concerns. But I think, by and large, the industry has found its feet with regards to working with the government on that front. So, yeah, look, it's by design. It's not a very exciting topic. I just think if you put the confusion aside and go back to proper cyber risk management, that will solve most of your problems.

Karissa Breen [00:23:31]:
Joining me now in person is Francine Hoo, director from KPMG. So, Francine, thanks for joining. Welcome.

Francine Hoo [00:23:37]:
Thank you. Thanks for having me.

Karissa Breen [00:23:39]:
So I know that you've recently had your session where you presented on AI, but tell us, what did you present on?

Francine Hoo [00:23:46]:
So I did present on AI. We did talk a little bit about, you know, the war stories, what we think are the concerns of boards and people at the moment. But I think the most important part is how it actually interacts with how we protect people and planet. And just talking about my experiences and what I'm seeing and how we're helping people build that infrastructure out so, as they develop, what are the questions to ask? What are the measures that they can implement?

Karissa Breen [00:24:12]:
So in terms of what you're seeing now in terms of adoption, you know, would you say people are still trying to get their head around, like, how to leverage it properly in organizations?

Francine Hoo [00:24:24]:
Yes, I would say so. I'd say it depends on people's appetite for how much they're willing to fail, how much they can afford to fail. It's at different maturities. It's funny because AI has been around for a long time. I think gen AI has actually just made it more popular. So now the real questions are being asked, you know, about what is this? What are we doing? Are we happy about it? You know, and I think, combined with just everyone's awareness of how your personal information matters in here, it's just, you know, everything's come together. And I think that's why there's a lot more awareness of the fact that it's around. What do I wanna do? And do I actually wanna be on that journey?

Karissa Breen [00:25:02]:
Do you think people are happy about AI or gen AI now that it really is, you know, emerging quite hard in the market?

Francine Hoo [00:25:10]:
It's an interesting question. And, you know, I've been around the country with the team actually hosting panel events. And the best question that's been asked is, you know, who uses the Internet daily? And hands go up. Who uses it, you know, hourly and within the hour? And hands are still up. And then it's, who remembers when the Internet started? Like, my hand's up, that's showing my age. Yeah. Totally. You know? Ding, ding, ding.

Francine Hoo [00:25:38]:
And, like, I couldn't even make that noise properly if I wanted to. But we are so willing to do that. And we are so willing to actually fly on planes with autopilot. And yet we're so nervous about all the development now. I think it's more the fact that people don't understand it. And there is a whole lot of misunderstanding that it is Terminator. So I remember when I was asked the question, I wasn't at KPMG, or was it another place? And I bumped into a friend who was in a bit of a tizz, and we were doing this fire drill, and I'm like, you alright? And he's like, do you know what AI is? I thought, oh my god.

Francine Hoo [00:26:14]:
I knew this question was gonna come to me. And I kept really quiet because we were playing with blockchain at that time. And he goes, you're thinking of the Terminator, aren't you? I said, not gonna lie, yes, the image is in my head. And he goes, I'll make it very simple for you. And look, this is pre gen AI, by the way. He goes, AI is on steroids.

Francine Hoo [00:26:35]:
Is it a sushi? It's not a sushi. Is it a sushi, is it a hot dog? So if you think about humans and how we make our decisions, we're just really fast at distilling if-yeses, if-nos, if-cans, if-nots, and we can do it all in parallel, you know, and the machine just keeps moving that way. And that blew my mind because I went, that's what it is. And then I started thinking about how we orchestrate and engineer. You know, we'd orchestrated ways of assuring people of the value versus, you're gonna waste a million dollars, by the way, investing in something you don't understand. And that's when it changed the landscape for me.

Francine Hoo [00:27:11]:
So my experience with that was probably quite a few years ago, and I think a lot of people are going through that similar experience now.

Karissa Breen [00:27:20]:
Where do you think people get this view of the whole Terminator thing from? Like, I get it, but, like, obviously, you know, if you've got mainstream media out here that is sort of pushing that narrative, that's one thing. But when people do think about AI, they do often refer to pop culture, Terminator.

Francine Hoo [00:27:37]:
I think it's because if you actually look at the stats and the numbers, like, you know, Terminator, Jaws, they came out when the baby boomers were around. Everyone is so scared of sharks, including me, for good reason. Terminator and, you know, robots and that whole narrative, early nineties. So we were all watching that. You know, we didn't have YouTube. We didn't have streaming devices. It was all about TV. It was all about movies.

Francine Hoo [00:27:59]:
So it's etched in our brain. So we would think that that is the case. And I mean, it's not wrong. It's just, it never got distilled what that was, you know? And there's a lot of us that loved Star Wars growing up as well, you know, with the droids and everything. So there is this, I think, misconception in a lot of people's minds that when you look at robots, things that organically move, they say it's AI. What they don't realize, and this has been a great education process for me, is that you're looking at a whole series of automation, integration of data, a good user experience, you know, and then there's a piece of artificial intelligence that generates from that. And in some of the very basic stuff, it's more the automation and the UI than it is the AI.

Karissa Breen [00:28:47]:
Joining me now in person is Jamie Norton, who has recently been appointed to the ISACA board of directors. So, Jamie, thanks for joining and welcome.

Jamie Norton [00:28:55]:
Thank you. Have a good one.

Karissa Breen [00:28:57]:
Okay. So you've recently been appointed to the board. So maybe, you know, share what ISACA has sort of meant to you. I know you've got, like, a long tenure with, you know, the chapters, etcetera, but, you know, perhaps share a little bit more about your story.

Jamie Norton [00:29:10]:
Yeah. So I've been involved with ISACA now for nearly 20 years. Started off early career, you know, focused on certifications and, you know, attending training and industry updates, and it's really, ISACA's really been with me, I guess, for most of my career, being involved with the community, you know, certifications, the international board. So for me, it started out very much about certification and training and helping me get experience. But as I've become sort of more, you know, aged and become more senior, it's now just as much about the community and having good people around me and, you

Jamie Norton [00:29:47]:
know, being able to help the industry

Jamie Norton [00:29:48]:
and give back. So it's really been a good journey for me.

Karissa Breen [00:29:51]:
So in terms of, you know, the last 2 days there have been multiple sessions, and AI's been one of them. Obviously, it's a concern for people, but with your background being the previous CISO of the ATO, like, perhaps, what are some of your concerns then for the cybersecurity sector?

Jamie Norton [00:30:04]:
Yeah. I think, you know, as we've seen here in the last couple of days, AI is definitely a big one and it's very much an emerging threat. You know, we can already see the potential challenges, you know, facing us both from a governance perspective, but also, you know, in terms of just malicious actors using AI and being able to harness that to, you know, just increase scale and also, you know, sophistication, I guess, of attacks on individuals. But I think also at the same time, we still have this double speed happening in cybersecurity where basics are still really tough for a lot of organizations. So just getting that basic hygiene and doing the key things we need to do is still very much a challenge whilst at the same time we're dealing with emerging threats like AI and sophisticated threats. So it's why, I guess, our industry is so challenging to stay ahead in. It's sometimes hard to do the basics and keep focus on, you know, the emerging threats at the same time.

Karissa Breen [00:30:55]:
So then on that point, you are right about the basics and the patch management. I always go on about that in my interviews, but we're talking about companies that are still trying to get those basics right, as you would know. So now we're talking about AI, which is, like, obviously, spawning into this new sort of area and new terrain for people. So would you say that perhaps, like, we need to be able to crawl first? Now we're sort of, like, running in the Olympics with AI, that's my analogy for AI in comparison. What's your view then on that?

Jamie Norton [00:31:23]:
Absolutely. I think you can't run before you crawl. So getting those basics right at least gives you a fighting chance to then look at the next level of sophistication and how that might impact us. Unless we have our, you know, basics done, we're gonna struggle to even, you know, combat some of these more advanced threats. Even the simple threats are gonna take us out. So definitely getting those done, you know, patching, as you say, multifactor, some of these kinds of things, the Essential Eight essentially from government, that kind of governance is really necessary, I guess, first, and then that's really an enabler, I suppose, for targeting more sophisticated threats.

Karissa Breen [00:31:56]:
And then with my previous discussion with Erik Prusch, the CEO of ISACA, who flew here for this event, he was sort of saying, like, no one really understands, like, AI, like, fully. Would you agree with that statement?

Jamie Norton [00:32:08]:
I think so.

Jamie Norton [00:32:08]:
I mean, we're seeing some people starting to emerge that have certainly been involved with it now for a few years, but it is a really new phenomenon. Unless you're an academic, you know, coming out of the university sector, most of us have only really had the last couple of years where we've just been paying attention to it, perhaps only the last 3 to 6 months where it's become more and more prominent. So I think we're all in a pretty similar boat. We're all sort of grappling with what this means. We're at a bit of an evolutionary stage where we don't quite know where this is gonna go. So we're just trying to get the governance right on it, make sure that we're as prepared as we can be, but also get the governance right for AI in our own organizations too, because increasingly, with security solutions, as an industry we're adopting AI to help combat what we think is coming. So just trying to keep pace with it at the moment, I think, is the key initiative for us.

Karissa Breen [00:32:56]:
And where would you say AI is going?

Jamie Norton [00:32:58]:
Without a doubt, I think from a threat actor perspective, we're already starting to see AI being used in terms of, you know, scams and the cybercrime element. Without a doubt, that'll become mixed in with the nation state as well, so we're gonna see sophistication there, and a scale too, so we're gonna see the scale of these attacks just increasing and increasing; scams are already just ubiquitous. So I think that's certainly gonna happen. On the defensive, or the sort of blue team side, we're gonna see AI. Most vendors have already got some AI in their product. I think that's gonna increase. And then the amount of decision making, I guess, in terms of what AI will do for us on a daily basis will grow as well, and so we'll have to start to balance those two elements.

Karissa Breen [00:33:43]:
Joining me now in person is Kate Raulings, ISACA member and speaker today at the conference. So, Kate, thanks for joining and welcome.

Kate Raulings [00:33:49]:
Thank you so much.

Karissa Breen [00:33:50]:
Now Kate, tell us a little bit more about what you discussed today.

Kate Raulings [00:33:53]:
Well, my session was called Do the Basics or Beware, and it's looking at my time consulting and supporting organisations through incident response. And it's really the lessons that I've learned, the 10 lessons that I've taken away from supporting those organisations. It's the basics because there are some really practical, simple things in there, such as patching applications. Everyone goes, oh, but that's basic, everyone should just do it, but it's how to actually make sure you're doing it properly and doing it well and you've got full coverage. That's one of the lessons that we've talked about today.

Karissa Breen [00:34:27]:
So Kate, would you say, we often in cyber talk about the basics, but the basics don’t appear basic. It appears difficult and hard and complex. What are your thoughts on that?

Kate Raulings [00:34:38]:
Yeah. I would completely agree. I think the basics are there because everyone knows they should be doing them, but getting it 100% right 100% of the time is the challenge. An attacker has only got to get it right once, whereas when you're on the inside of an organisation it's 100% of the time, 100% of the coverage, and that's where it becomes challenging and problematic and difficult. Some of the other incidents and the lessons that I have learned: technology alone won't save you. You know, it really is, as the conference title states, the combination of technology and process and people.

Karissa Breen [00:35:18]:
Joining me now in person is Richard Magalad, managing director at ITR Australia. Richard, after 3 years, you've finally joined the interview, although it is a small little snippet, but here we are. I'll take it. So tell us, what did you present on today at the conference?

Richard Magalad [00:35:35]:
Hi, KB. Yes. 3 years in the making. Good to be here and good to be at the ISACA conference. I just took part in a panel on how to manage your third-party risk, and we were going through some of the challenges that a lot of people in industry are experiencing right now, with the recent high profile incidents with the likes of Medibank and Optus and others as well. It just reminds us that the third party does form part of your attack surface, and therefore adequate management really needs to be put through just to make sure that we can trust the suppliers to be doing the right thing for us and that we are also doing the right thing for our suppliers.

Karissa Breen [00:36:21]:
Yeah. Do you think, in terms of, like, the whole risk around, you know, third-party suppliers and supply chain, would you say people are now worried because they need these suppliers to make their business run? So do you think there is this fear or this anxiety to be like, well, I kinda can't do my business without them, but also I'm now fearful that if there's an outage or there's an incident, I'm now being impacted by it?

Richard Magalad [00:36:43]:
Yeah. There definitely is. There's this realization, especially in the last couple of years, because prior to these high profile incidents, we have always been using suppliers, for decades now. And the thing is, we're always under the impression that our suppliers will always do the best for us because they have a motivation to look after us. They want to earn the income that they derive from looking after our business. But now that we're learning more and more, some element of doubt is coming into all industries. We're all now starting to look at what our suppliers are doing, or are they doing enough? And there's definitely a bit of paranoia that's coming in there. The other aspect of that as well is some of the suppliers are then responding and doing the right thing by making the business more resilient.

Richard Magalad [00:37:39]:
But by the same token, unfortunately, there's a small number of suppliers who are then starting to duck for cover, and they're the ones that we need to weed out because they're the ones that could be the next high profile breach as well.

Karissa Breen [00:37:51]:
What do you mean by duck for cover?

Richard Magalad [00:37:53]:
So in some of

Richard Magalad [00:37:54]:
the instances, I've seen some supplier cyber questionnaires that have been published and distributed by the organization. And I've been on some of those panels where I then look at the response from the supplier. And just looking at the response, knowing what I know in the industry, I start to doubt it: this doesn't sound true, because of what I know of a particular supplier. So then what that triggers is we go back to them and say, okay, well, thank you for the response, but we want evidence now. And next thing you know, they stop taking our phone calls. And then a small number of suppliers have actually terminated the contract. And some of them then wanted to recontract, and they watered down the terms and conditions.

Richard Magalad [00:38:44]:
So that's a red flag, because why would a supplier of mine that's been dealing with me for 10 years all of a sudden leave me after I challenged them for evidence, or why are they watering down the terms and conditions with a lot more exit clauses? And look, it's a good thing, because we now know what we didn't know. If we can weed out some of the bad operators, then that's better for the industry as well.

Karissa Breen [00:39:09]:
So what was the result of that? Which one was it?

Richard Magalad [00:39:12]:
So some of the suppliers, we had one where there were some software suppliers that were providing some sort of, kinda, not a CRM product, but a product that ties in with CRM. And there were some privacy questions that were asked within the questionnaire, and when it was borne out, they could not attest that they were looking after the privacy properly. And after failed phone calls, they said, you know what? This is route. So at that stage, if your supplier does not communicate with you, the customer, after, like, 2 weeks, a month, then, you know, it's time to change suppliers as well. So some have left, whereas others are watering down. And in those cases, there's still negotiation that's happening. It's like kinda negotiations where I think there was a mea culpa moment where the supplier goes, look, we weren't doing as good as we thought we were.

Richard Magalad [00:40:03]:
We still would like to service you, but we'd like to change our terms and conditions, which means that they're doing less work for the business. But in the end, by the supplier doing less work for the business, less risky work, that means they continue to be a good supplier, a good third-party supplier, for us. But it also then allows us in the organization to then go to another supplier that is better, that manages the risk much better as well.

Karissa Breen [00:40:30]:
And would you also say that people just ghosted you? Never heard from them ever again?

Richard Magalad [00:40:34]:
Yeah. Look. It's one of those things in the industry, and I think it's not just the cybersecurity industry or the tech industry for that matter. It's the good old, I think I just got caught, so time for me to go running for cover. So, yes, some suppliers have done that. And sadly, some of them have even, in very small cases, and it's the worst, closed shop and then been reborn into another one. So they've pretty much done that. So what happens is you'll have this business that was delivering goods and services to other companies, then they disappear, then in the new one the board of directors, or rather the managing directors and the shareholders, are the same. So different ABN, different name.

Richard Magalad [00:41:23]:
In some cases, they've even registered in different states. But then you do a background check and due diligence on the owners through ASIC, which, hello, is public knowledge, and then you see it's the same operators. So red flag for that one. And some of the other ones that we found, which is actually a good outcome, is some of the small providers have actually been bought out through this process. So a larger mothership, we'll call it, just bought the small operators and put them under the wing. And because the bigger company had good controls, good policies, good everything, then everyone's happy, because we're managing the risk better again. It's interesting how the focus is not so much, we don't expect the supplier to be the best super secure business. We want the supplier to show us that they're managing their risk best.

Karissa Breen [00:42:18]:
So joining me now in person is Sam Mackenzie, cybersecurity committee member from ACS. So, Sam, thanks for joining me back on the show, and welcome.

Sam Mackenzie [00:42:25]:
Hi. Good to be here.

Karissa Breen [00:42:26]:
So I know that you have had a chat today and you’ve presented something, so please tell us. What did you present on today?

Sam Mackenzie [00:42:32]:
I'm really interested in cybersecurity for critical infrastructure, and I had some guests join me on a panel talking about control rooms. Control rooms are a key part of how we manage and coordinate our critical infrastructure so that we can get good outcomes, better availability of the services, and those services might be the power grid. They might be the transport network, or other areas like airports, which obviously have control rooms as well.

Karissa Breen [00:42:57]:
Yeah. And I know that you and I did a deep dive focusing on critical infrastructure. So what do you think people sort of miss when it comes to critical infrastructure? Is it more so just they think, well, I can just turn on the light and the power works, and I don't have to think about the mechanics of, you know, how it functions? Do you think it's that, and people aren't actually focusing on, well, what happens if the thing isn't working and we can't have access to power, for example?

Sam Mackenzie [00:43:21]:
Yeah. I think, and even when I talk to people about it at social events, they don't really have an understanding of what it is, and I think for the general population, like, it's not really up to them to know. I think they're just expecting to rely on those services. Like you say, you turn on the tap, you want the water to come out; you turn on the light switch, you want the power to work. I actually grew up off grid for power and water, and so I have experienced that through my childhood, and it drives me to have a keen interest in making sure that people don't have to experience that. So that's one of my passions: helping secure it, helping make it available and supporting the public in not needing to worry about it. I think the challenge is that the threat landscape has changed over time and the threats are actively targeting and attacking this infrastructure that we've got, and we've walked into a situation where the technology used across these most essential of our services has expanded significantly.

Karissa Breen [00:44:18]:
So in terms of your presentation, what would be some of the key takeaways that you'd like people to leave with? I know that not everyone could be here in person today, and that's why we wanted to put together this interview, to discuss maybe some learnings that people can sort of walk away with.

Sam Mackenzie [00:44:29]:
I guess the key thing that I'm trying to drive, through the conversations where I'm bringing people together across discipline and across sector, is better security, better cyber safe outcomes in regards to higher availability and harder targets for the bad actors to attack. And so I would encourage people to collaborate across discipline and across sector, because I think that's where better outcomes occur.

Karissa Breen [00:44:55]:
Okay. So joining me now in person is Wayne Rodrigues, security architect from Insignia Financial. So, Wayne, thanks for joining, and welcome. It's wonderful to finally meet you in person.

Wayne Rodrigues [00:45:05]:
Likewise, Karissa.

Karissa Breen [00:45:06]:
So tell us, you've got a bit of a journey with ISACA and how you got involved. So talk us through, what does that look like and, you know, how did you get involved with ISACA?

Wayne Rodrigues [00:45:14]:
Yes. Definitely. So my journey started about 12 years ago, back in 2012. Interestingly, my dad at that time was a heavily involved member of ISACA. He was part of the professional development subcommittees and held various director positions in there. He really encouraged me to join on as part of a student membership. So I just attended some of those, got my head around it, and it seemed really interesting and I was really keen. And then when he moved on to his presidency at the ISACA Melbourne chapter, I just got further involved there.

Wayne Rodrigues [00:45:48]:
So it’s been about 12 years now and got myself heavily involved following in his footsteps, helping out the various other directors over here as well.

Karissa Breen [00:45:57]:
And so when you say following his footsteps, do you mean, in terms of the trajectory or in terms of just, you know, being involved with the chapter, for example?

Wayne Rodrigues [00:46:06]:
Pretty much both. I think he was an inspiration for my career as well as volunteership and giving back to the community. Following on in terms of his career, I really love the aspect of cybersecurity and securing businesses and processes. It really piqued my interest, and hence I went straight into cybersecurity. And in terms of volunteering and giving back to the community, he's always been a core part of the community, just always giving back and creating various openings and entries through his leadership. So just following on to that, I'm just trying to give back wherever I can as well and helping out.

Karissa Breen [00:46:41]:
So you said before, giving back to the community. So I wanna focus on that a little bit more. So there's obviously, you know, ISACA. You've got these other sort of, you know, independent sort of groups and memberships and all sorts of things. But what does giving back to the community look like for you?

Wayne Rodrigues [00:46:57]:
I currently have about 10 or so years of experience in the field. I definitely believe that I can help the new entrants into cybersecurity. I can help the current candidates and professionals in upskilling and moving on to the next stage of their career. So just giving them some of that guidance and insight into my personal journey, any tips and tricks involved in that, and just sharing some professional advice as well along the way.

Karissa Breen [00:47:23]:
So do you have any tips or tricks then you wanna leave our audience with today?

Wayne Rodrigues [00:47:26]:
I think definitely, events like this definitely help. They not only spark your interest in certain areas, they also provide you an insight into aspects which you might not even be aware of. Not only that, you get to meet wonderful people, and this all helps build that.

Karissa Breen [00:47:46]:
And there you have it. This is KB On the Go. Stay tuned for more.
