Karissa Breen [00:00:15]:
Welcome to KB On the Go. And today, I returned to the Tech Leaders 2024 conference here in the Hunter Valley, 2 hours north of Sydney. Tech Leaders is the premier network and education event for journalists and technology vendors in the IT and cybersecurity sector. Running across the 2 days, the event brings together government representatives, industry analysts, and technology journalists to hear from technology companies. I’ve lined up a few guests to appear on the show today, so please keep on listening. I’m joined now by Chris Diffley, senior director for client security from Optus. So, Chris, thanks for joining and welcome.
Chris Diffley [00:00:45]:
Thank you. Pleasure to be here in the sunshine of Hunter Valley.
Karissa Breen [00:00:49]:
So just before Optus announced an update, so maybe talk through what is that update and then the benefits of it. Let’s start then and we can go into a few other things after that.
Chris Diffley [00:00:58]:
Yeah. Sure. So, yeah, today, we announced that we’re launching the Optus managed threat monitoring service, which is powered by Devo Technology. Very excited about this platform whenever we sell to our customers, be it a contact center, be it a SD WAN service, whatever it is. We wanna be able to say that we can back that up with the absolute best in class cybersecurity in the back end. It will be the right technology automated, leveraging the latest analytics engines using AI and ML to really put the focus more on the technology looking through reams and reams of data rather than relying on human beings to to search through a lot of events.
Karissa Breen [00:01:43]:
Just go back to AI component. Now, obviously, now we’re seeing a trend in the space with we are combating AI in terms of, you know, the defense side of it because the criminals are using AI. Where do you sort of see this going now moving forward? You’ve sort of touched on a little bit throughout your presentation, but maybe if you could sort of elaborate more on we are seeing this trend, but then what does this mean now moving forward with, you know, how how AI is gonna be used in the future?
Chris Diffley [00:02:07]:
Yeah. So it it’s so embedded into this system, and we ran a a pocket proof of concept for about 18 months to make sure we chose the right technology partner to give us that benefit and to be able to meet those AI challenges in the threat landscape head on.
Karissa Breen [00:02:25]:
What do you mean by AI challenges?
Chris Diffley [00:02:27]:
We’re seeing trends within the cybersecurity landscape. There’s a lot more state actors that are causing the threats. There’s a lot more technology being used to influence how those threats come in, and we’ve seen a lot of compromises across the Australian landscape as well. You can’t tackle that with human beings. You you’ve got to have the right technology investment. So this platform comes with an inbuilt seam and saw capability. So not only do you have your security information event management, but it’s also the same platform, low code, no code, cloud native platform that my technology guys can use that gives us that ability to orchestrate and automate defense against that. So to give you an example, Devo platform comes with a 400 day data.
Chris Diffley [00:03:18]:
Its analytics engine is really, really strong. That’s that’s what we’re partnering with here. And that amount of data allows us to see some of the more complex longer term threats that we we know is in the landscape there. So research shows us that threat actors can be in compromised organizations up to 200 days. If you’re only looking at the last 90 days worth of data, you’re not picking up on some of that intel that that’s happened previously. So once you’ve got that longer term view, you can start to work out that there is threats taking place across, a longer time period. That’s just one example of the the technology that it uses.
Karissa Breen [00:04:05]:
Just go back a step. You said before the analytics engine being strong, what makes it strong?
Chris Diffley [00:04:10]:
So look, it’s it’s cutting edge technology, Cambridge, Massachusetts built, but it’s 3 things that they ingest in there to to make it work for us. There’s the use cases out of the box, which are really strong and based on actual breaches around the world globally. There’s a lot of regional use cases that we bring to bear, not just within Optus, but within our, regional partners in the Asia Pacific region via Singtel. And then on top of those use cases, we have the AI and the ML engine that’s constantly evolving and learning what it needs to, not have human beings having to tie the links together all the time using that technology base to throw up scenarios every day. We’re good in in my teams at threat hunting. We really invest a lot of time in that. When we find those behaviors, those anomalies, we like to then orchestrate them and automate them. We don’t have to go looking for them again.
Chris Diffley [00:05:10]:
We can then put them in the dashboard, put them in the tool, and we can be confident that the technology is then working through what that looks like. We’re very excited, about those uplifts and changes that to how we work in that space.
Karissa Breen [00:05:26]:
And just sort of pressing a little bit more, do you mean sort of, like, critical attack path? That’s what was it what you meant before?
Chris Diffley [00:05:31]:
Correct.
Karissa Breen [00:05:31]:
So why would you say, just more generally, why do you think this is important in terms of this announcement? Because, you know, yes, that you might be, like, 1st in region, like, other players are doing similar things. So what makes this sort of different would you say?
Chris Diffley [00:05:46]:
So we know we’re we’re not selling this individually. We we are selling contact centers to big clients, big 4 banks, to federal government agencies. We’re selling managed, network services, SD WAN technology. We’ve got to be able to show that we can, protect all of that data, all of those services in the very strongest way possible. They’ve invested a lot of time in the POC, a proof of concept with this vendor, and they really did come out on top of all of the the different partners we we could have chosen to be able to give us that technology advantage to place on top as a layer over our security operation centers that we run through our Sydney Macquarie Park facility. We’ve got the people. Optus has always invested heavily in training, making sure we retain the right staff within our business. Got deep mature partnerships with our partner ecosystem, Devo being one of them that’s coming on board.
Chris Diffley [00:06:46]:
You pull all of those things together, the services that we’re trusted on by government, by enterprise, by the financial services industry, we have the right security back end to ensure that we’re looking at all the the latest, you know, threat paths.
Karissa Breen [00:07:02]:
So I don’t have time to go into it too much today, but I do would you say people still get confused between the difference between a SOC and a SOAR, And then what would be the main difference?
Chris Diffley [00:07:12]:
Yeah. So the SOC is to us, it’s our facilities. We have 2 of them in Sydney. We have our GSOC, which is our federal government. SOCs that comes with different controls very much aligned to the central 8, the ISM protective security policy framework, no less secure, heavy defense in-depth to make sure that all of that data is protected, almost air gapped as best we can, but also there’s a lot of compliance and regulatory that we need to go through be it ISO 27,001 or SOC 2 or the federal government industry. All of that forms part of our overarching SOC. There’s a lot of training into the people within that space, a lot of focus on GRC and the governance risk and compliance, making sure that we’ve got that as absolutely clean as we can. The most important thing for me would definitely be the discipline and rigor in your security hygiene and making sure you have that lens on all of your vulnerabilities, making sure that you patch in the shortest possible time, making sure when you have a critical vulnerability that comes in almost daily, now, which is, you know, a huge ramp up from even where we were last year, that we’re patching that.
Chris Diffley [00:08:29]:
We’re working with our vendors, and getting that fixed within, 48 hours. All of that’s your holistic SOC. The same component is the 24 by 7 cybersecurity event management piece. But what Devo gives us is that SOAR capability on top to really orchestrate and automate those events to make sure that we’re we understand in real time what the threats are. Really, in a nutshell, to summarize all of that, it’s about looking deeper into our customers’ data and identifying and partnering with them through the methods that they’re they’re accustomed to today when they have a security event. That’s what builds trust within the customer base.
Karissa Breen [00:09:17]:
I’m joined now by Chris Gonzo Gondek, solutions engineering manager from NetApp. So thanks for joining, Chris, and welcome.
Chris โGonzoโ Gondek [00:09:23]:
Thanks for having me on the show.
Karissa Breen [00:09:25]:
Okay. So today in your presentation, you discussed flexible environment. So maybe what do you sort of mean by that term?
Chris โGonzoโ Gondek [00:09:32]:
Flexible environment. I guess that comes down to the ability to have freedom of choice in infrastructure. If we think about all workloads, operating systems, virtual machines, applications, databases, they all constitute effectively data at the end of the day, and data lives on storage. So, being the intelligent data infrastructure company, we provide storage capabilities on premises in data centers, as well as in all of the major clouds, like Microsoft, Azure, Google Cloud Platform, and Amazon Web Services. And so, flexibility means the omnipresence of leveraging these infrastructures, the appropriate workload, appropriate cloud, but it’s also flexibility in the functions of the storage technology itself, multiple storage protocols, multiple data activities like classification, security, data protection, and disaster recovery, and things like that. It’s a flexible use of storage technologies for different data outcomes.
Karissa Breen [00:10:30]:
Would you say storage is one of those things that seems to get a bit relegated by people? Like, it’s like, okay, well, you know, it’s just storage. We put it back in our mind and have to think about it again?
Chris โGonzoโ Gondek [00:10:41]:
Absolutely. I think it’s it’s probably not front of mind. You know, when people think AI, they’re probably thinking about GPU systems and, you know, the number crunching that goes on when you’re doing generative AI. They’re not thinking about the fact that 85% of AI machine learning projects fail because of data access issues. It’s data that’s being fed into these, large language models. In other scenarios, when we think about security, there’s a lot of focus on cybersecurity, not so much on cyber resiliency as a result. So, network centric thinking versus data centric thinking. And when you think about compliance and governance, needing to turn data into information so that we can classify it, These are storage conversations.
Chris โGonzoโ Gondek [00:11:28]:
These are data storage conversations. They’re not network conversations or, you know, user access control. And, lastly, it’s the data itself that is the attack surface area in a security conversation. So, being able to create fast resilient copies and fast recovery, is a storage conversation as well.
Karissa Breen [00:11:46]:
And I ask that question because I do speak across multiple different disciplines and it’s something that I think people just forget about.
Chris โGonzoโ Gondek [00:11:52]:
Absolutely. And we’re very supportive of that.
Karissa Breen [00:11:55]:
And it’s sort of like out of sight, out of mind. That’s why I used the word relegate before. Okay. So there’s a couple of things on there that I wanna sort of speak about a little bit more. So you mentioned, Gonzo, security sorry, storage security by design. So, mate, talk me through it.
Chris โGonzoโ Gondek [00:12:09]:
So security by design means that when you take piece of hardware, for example, in our scenario, we make data center storage appliances, we fill it with high performance slash disk, that’s just storage by itself. It becomes intelligent data infrastructure when we put our storage operating system on there, which we call ONTAP. ONTAP, by design, services many different, I guess, workloads Yep. Through various different storage protocols, but inherently built into the system is something we call autonomous ransomware protection. Autonomous ransomware protection looks at data and the usage of data and creates a normal pattern of behavior and that normal pattern of behavior becomes what the autonomous ransomware protection is looking for to stop cyber threat activity in its tracks and then create what we call a tamper proof snapshot. The snapshot is tamper proof because it’s locked, it’s kind of logically air gapped, and it cannot be removed or deleted unless multiple admins verify and multiple admins multifactor authenticate and agree that it can be removed. This would be a scenario like an honest mistake or a false positive and we can retrain the model to improve the accuracy of the results. So, security by design means that it’s there, it’s built in.
Chris โGonzoโ Gondek [00:13:29]:
Whenever you deploy a new volume, we have snapshots enabled by default, we’re just going to assume you’re going to want to protect your data. The autonomous ransomware protection works in addition to that regular data protection cycle.
Karissa Breen [00:13:40]:
Definitely familiar with security wide design. Is this like you mentioned before, you’ve obviously explained the storage component. Would you say that people don’t embed the security element through the whole storage life cycle, if you wanna call it that? Why do you think that is?
Chris โGonzoโ Gondek [00:13:53]:
I think it’s because it’s never been really thought about at the security layer. Sorry. At the storage layer. It’s only been thought about at the security layer in terms of perimeter.
Karissa Breen [00:14:03]:
So does that go back to my earlier question around storage is being relegated? People are forgetting about it out of sight, out of mind?
Chris โGonzoโ Gondek [00:14:10]:
Correct. They they’re they’re not making the connection and the assumption that storage plays a critical role in cyber resiliency. They’re just thinking network centric cybersecurity. And so, a lot of emphasis on firewalls, intrusion, identity, etcetera. Right. But inside the perimeter, where we’re doing our work, where we’re passing all of the, you know, perimeter security parameters, We’re live next to the data. Now, if it’s my user credentials that have been phished, then it’s activity that’s, again, bypass the perimeter happening on the inside. How do you detect that? How do you detect, you know, anomalous behavior or how do you detect when data is being exfiltrated without these storage mechanisms in place? These are storage functions.
Karissa Breen [00:14:53]:
Because then you touched on today in your presentation as well around data at rest, right, as well. So would you say with your experience, people don’t really talk about, like, field level encryption?
Chris โGonzoโ Gondek [00:15:03]:
That’s a very good point. Applying storage security principles across the data life cycle is very important to us. That goes in conjunction with efficiencies as well because we don’t just keep a primary copy, we keep a secondary and a tertiary, which means that if we’re applying encryption and we’re blowing that encryption down the line to secondary and tertiary copies, it’s very hard to get efficiencies out of encrypted data already. We solve that problem by doing all the efficiencies prior to the encryption and then keep it encrypted through its lifecycle. Where we’re using things like snap locking in our primary storage, we’re also using things like object locking in object storage down the tertiary, path to add to that logical air gap and to the recoverability factor. When it comes to finding and redacting specific bits of information, that becomes more sophisticated technology at the application layer. What we can do from a classification perspective is find that sensitive personal information so that it can be redacted by another process. That process, so what we call data classification, uses content indexing technology.
Chris โGonzoโ Gondek [00:16:11]:
It means if some data hits our storage, the classification engine using AI will open it, read it, contextualize it, and classify and categorize it into things like sensitive and personal information, non business categories, which you can fine tune to determine what constitutes non business. Once we’ve got that there, then you know through our dashboards what is sensitive personal information to then go and do something about it. Should we put it in a more secure location if it isn’t already or do we need some redacting technology to play a role here?
Karissa Breen [00:16:46]:
And so you’d be leveraging AI to do that, I’m assuming?
Chris โGonzoโ Gondek [00:16:49]:
It’s built in. Yes. Our classification engine uses AI.
Karissa Breen [00:16:52]:
What if you have more specific requirements? So would you be able to leverage your own sort of protocols to to say, hey, this is
Chris โGonzoโ Gondek [00:16:59]:
Where we’re at with the technology today
Karissa Breen [00:17:01]:
Yep.
Chris โGonzoโ Gondek [00:17:01]:
Is we’re getting as far as over 99% accuracy in classifying and categorizing the data. What you do with it afterwards is subsequent activities. Right now, it’s kind of rudimentary when we want to quarantine data, for example. So, identifying gives us the results, what’s the first thing we do with the results? We may want to immediately quarantine them because they’re exposed, they’ve got sensitive personal information on a public, cloud storage environment, let’s move it to a more secure storage environment, so quarantining. More sophisticated and advanced processes beyond that, like redacting certain bits of information from within documents. Salaries and that. Yeah. That that would be an application process not currently within NetApp capability.
Karissa Breen [00:17:48]:
Okay. The other thing you spoke about as well is data gravity. So what what does that mean?
Chris โGonzoโ Gondek [00:17:54]:
Data gravity or data having gravity means that every time that we create data, it will consume some magnetic storage somewhere which has ones and zeros, which needs to be powered and that power is coming from somewhere. Data will attract more gravity to it because I don’t just have one copy, I’ll make a secondary copy and a tertiary copy and I’ll be mandated by governance and compliance laws to hold on to it for long periods of time. This ultimately has a knock on sustainability impact and effect. So, that data gravity, as it grows, creates a bigger carbon emission associated with it. We’ve already got some metrics like every email you send emits 0.3 grams of carbon. So, we’ve got down to the point where in our storage solutions, our sustainability dashboard doesn’t just report on the metrics that the energy regulators want to see, like kilograms of carbon per terabyte, these new metrics, or heat BTUs for cooling. We’re also offering ways to fix and improve the sustainability score for reducing that data gravity. Happens in a number of ways.
Chris โGonzoโ Gondek [00:19:01]:
1 is data efficiency, straight away. If you apply compression, compaction, deduplication, FIM provisioning, tiering, you’ll make a smaller data footprint, smaller carbon footprint. But we maintain those efficiencies when we make copies. So, your Doctor copy off-site is also compressed, compacted, deduplicated. Doesn’t require as much network bandwidth, doesn’t require as much storage in the destination. And then, its ultimate life cycle in the end where we’re holding on to it for 7 years, efficiencies maintained. You get a 3 x, 10 x, 15 x reduction over the life cycle of data reducing gravity, reducing emissions.
Karissa Breen [00:19:36]:
That was what I was gonna ask you next because you’ve just touched on sustainability. And so you’d be familiar with the UN Global Compact, that whole regulatory framework and how they’re assessing companies and
Chris โGonzoโ Gondek [00:19:47]:
Yes. Yes. And we are part of it. The name escapes me right now. We have an ESG report, that publicly talks about it. NetApp contributes to that global consortium on sustainability and part of our commitment to it isn’t just the fact that our technology helps customers reduce their carbon footprint, we’re also very conscious about our packaging. We’re also very conscious about our manufacturing processes using green energy and stuff like that. Our score is multidimensional in contributing to that global consortium.
Karissa Breen [00:20:16]:
It’s just something that’s starting, in terms of, like, a theme I’m starting to see more of this. The only thing is I think there are still companies out there that are doing the whole greenwashing.
Chris โGonzoโ Gondek [00:20:25]:
Greenwashing. Absolutely. Yeah. So two things that our sustainability dashboard really helps with. If the ACCC is coming after you to say demonstrate how you’re making a commitment to better sustainability, the dashboard shows trending over time and how you can achieve and improve on your score through these data reduction activities, as well as maybe using greener energy sources. You can actually input if you’re using wind power versus coal power in your specific environments or the hyperscaler that you choose. The other side of this, there is a monetary one, Australian carbon credit units. There are rebates based on achieving better sustainability.
Chris โGonzoโ Gondek [00:21:05]:
That’s our situation in Australia. What we’re doing globally in places like the EU and Singapore, there is literally no more data. It is mandated that you have your sustainability metrics reported in the language that the energy regulators want to see. Normally, when we think about data, we measure it in terabytes and megabytes and gigabytes. We can actually give you a kilograms of carbon per terabyte metric. We can actually give you a watts per hour metric and this is really important to the energy regulators or if you’re a service provider who wants to deliver a greener service or a green SLA and guarantee that you won’t go above these thresholds of carbon emission. That’s what’s considered by.
Karissa Breen [00:21:50]:
Joining me now is Gavin Jones, area vice president and country manager Australia and New Zealand from Elastic. So, Gavin, thanks for joining and welcome.
Gavin Jones [00:21:57]:
No. Great to be with you and great to be here at Tech Leaders Summit in the Hunter Valley.
Karissa Breen [00:22:01]:
So maybe, Gavin, let’s start there. Talk to us a little bit more about what you’ve presented on today.
Gavin Jones [00:22:06]:
Sure. Firstly, thank you very much for for the opportunity. It was actually first up an opportunity to let people know a little more about what Elastic is. A lot of people aren’t aware of what Elastic does, and so I was able to share that. I was able to share our vision for generative AI, which is not only an area that probably offers the biggest opportunity and potential for, probably the next generation of of, people in Australia, but also surprisingly and and concerningly, it’s it where Australia is lagging behind in the adoption. We actually commissioned a a report that I can talk to some of the stats, or we’ve just talked about some of those stats, where we are lagging behind some of our competitors both globally and also domestically. And then we shared some of the real opportunities and benefits of what generative AI offers. That’s some really exciting development.
Karissa Breen [00:22:53]:
So wouldn’t you say generally Australia lags in a lot of things? So, yes, technology is a big one, other things as well, but I feel like that’s a common theme as an undertone is Australia’s lagging in this, this, and this. Why do you think that’s the case?
Gavin Jones [00:23:07]:
There’s a lot of lot of reasons. Firstly, just to give the the statistics that our report turned up, our this report we conducted on a global level to look at the adoption of generative AI showed that Australia is lagging in the adoption and only has has embraced generative AI to the tune of about 40% 42% of companies. That compared to some of our counterparts in the Asia Pacific region, Singapore at 63% and India at 81%, which is putting Australia at a real competitive disadvantage versus our trading partners. There are lots of barriers to adopting generative AI. I think everyone knows about some of the biases that come in with generative AI. Some of the, false positives or, hallucinations. Had to remember the term. There’s some of the the security, the privacy, and the, regulatory concerns.
Gavin Jones [00:23:58]:
But one of the things that we’re hearing constantly from organizations is how they bridge that gap between the local context around your private and confidential company data and the multitude of generative AI capabilities that exist in LLMs and copilots out in the public Internet. You know, I don’t think anyone thinks they should be going and putting their confidential company data into chat gpt. And so where Elastic sits, and this is what I shared in our presentation this morning, is we provide that trusted bridge between confidential company information and the context that’s so critical to leverage the best of Gen AI and the multitude of LLMs and Copilots in the market to best enhance that customer and employee experience for using Gen AI.
Karissa Breen [00:24:42]:
Okay. Alright. This is interesting. So you’re right. I’ve done a lot of interviews around even, like, government people leveraging Gen AI to, you know, increase their productivity, increase their speed. Why would we wanna do a monotonous task when we can leverage AI and tooling to do it for us? So going back to the point before, why would you say Australia is slow? Would you say we’re a reserves market?
Gavin Jones [00:25:03]:
I I think there’s been a lot of there’s been a lack of awareness of the benefits for the ROI and the urgency for generative AI. I think a lot of people’s initial experiences have been leveraging chat gpt to kind of doctor an image or, you know, build out a document or respond to an email. That’s people’s first impressions. But going from that embedded use case that’s in an application to actually looking at what your enterprise strategy is and how that aligns to your company vision and underpins your most strategic priorities is where there’s probably been a gap in terms of understanding the return on investment and the urgency to drive that. And so we see that kind of bucketing in 3 broad areas. How you actually improve customer experience to drive revenue outcomes, some really good use cases that we’re working with our customers on for those ends, How we actually improve customer efficiency and effectiveness and improve productivity, and that’s been one of the ones that’s been high on top of mind for many people. And then finally around cyber resilience. How do we actually leverage gen AI to improve your ability to respond to security threats and incident?
Karissa Breen [00:26:06]:
So just going back to one of the stats that you presented today, the 87% are considering increasing their investment in Gen AI. So, again, do you think that 87% is because, like you said before, people think it’s about, you know, images and those types of things? Is it the the awareness is still getting there? So doing interviews like this sort of encourages more the adoption or
Gavin Jones [00:26:26]:
Yeah. I think part of it’s the awareness and an awareness not only of the benefits of Gen AI and why it should be an urgent priority, but also the awareness of how to overcome those challenges. Because they are serious challenges. You talk about government. I don’t think anyone would be comfortable with government employees sharing citizen data on public gen AI use applications. And so, I think there’s a real barrier in terms of broader adoption because companies are struggling with how to actually overcome those barriers to adoption. And that’s what we shared in our presentation. There’s probably 7 or 8 layers that are important before you actually embed this into your technology strategy.
Gavin Jones [00:27:04]:
And so we kind of address it both at the application layer through our security and observability tools, but also importantly provide that platform that allows them to connect confidential data and the multiple LLMs, and especially ones that are relevant for the use cases they have because that may be different for each different use case. One of the examples I was sharing earlier today as well was that if you’ve got LLMs that you’re asking a question as whether the earth is flat and Reddit is one of the data sources, you could actually get a very credible response that says it is.
Karissa Breen [00:27:33]:
That’s where the hallucination comes into it.
Gavin Jones [00:27:35]:
That’s where the hallucinations come in. We allow organizations to work out which are the best LLMs and copilots to use for their use cases so they get a trusted response.
Karissa Breen [00:27:43]:
So I was recently in, the US, and I interviewed the head of AI for Zscaler, and he’s part of the World Economic Forum for AI, and he spoke a lot about hallucination. So my question to him was, to your point, the whole flat earth thing, and I said if I went out there and ran multiple media sites claiming that the sky is purple and then you integrated LLM in, you know, into some of these sources, you would start to come up with that theory. So my question to him and maybe to yourself, Gavin, would be, who gets to decide what’s credible or not credible? Because maybe I’m color blind and the sky actually is purple.
Gavin Jones [00:28:17]:
I I think it’s it’s going to vary on every use case in every company. I think what we’re seeing in the beauty of Elastic is that some workloads are best run-in a private l l m, and they may choose the data stores, or sometimes it’s actually being referred to instead of a large language model, as a small language model. It’s only serving or gathering data, and still billions of data points, but it’s very restricted to trusted data stores. That could be from the public internet, but some companies may choose that that is too sensitive a a use case and they want to run that on prem. Elastic allows them to run it self managed, on premise, or in their cloud of choice, or on any of the 3 hyperscalers, or any combination of the 3 of those. And this is the beauty of Elastic’s model. We allow companies to choose which is right for them. Could be purely on prem, but never neglect the power of external large language models for things like cyber resilience because that’s where the threats are usually detected.
Gavin Jones [00:29:13]:
Ahead of when the actual company themselves detect it, it’s the broader ecosystem that recognizes there’s an issue and those reports start flooding in.
Karissa Breen [00:29:21]:
So going just focusing on the small language model. So I mentioned before it, like, government, you know, government agencies. Right? So they would probably use that as a use case sensitive information. They’re not gonna just, you know, pull it from wherever. It needs to be accurate and have these hallucinations that start to come into it. Is this something that government agencies will start to adopt, would you say, in your experience?
Gavin Jones [00:29:41]:
This is where we, as Elastic, help our customers work out which is the right model for them. Small language models may be better suited to use cases where there's a very finite number of responses that would be trusted, and they may be a better use case to be run on premise. We help customers work that out. Other use cases, where they're trying to look at where there's security incidents and they want to leverage the world wide web and incidents and breaches that may be impacting thousands or tens of thousands of companies in real time, that's a much broader use case. So we allow them to blend the best of both of those worlds, but use that in a single data store and combine that with their confidential data that Elastic has already indexed. That's kind of the perfect storm of using SLMs and LLMs.
Karissa Breen [00:30:25]:
So going back to the flat earth example, have you seen a lot of this? And are we gonna get to a stage where we're really, like, delusional and thinking, well, is this true? Is it not true? What are your thoughts on that? Like, how does this sort of progress forward?
Gavin Jones [00:30:39]:
Look, I think this comes down to the ethical considerations barrier. I think there's a lot of ethical considerations that need to factor into the way that you're gonna use Gen AI. Not just its alignment to your vision as a company and how you're gonna serve citizens if you're a government, but also the ethical considerations. How do you actually provide trusted responses? And that needs to be factored in as people are thinking about how to adopt this more broadly. I think the other consideration that we're hearing come up often, and there have been antitrust lawsuits around this, is the biases that come with some of the large language models. All of that needs to be factored in as you build out a strategy, but importantly, you need to be cross referencing multiple sources and working out which is appropriate for your customer and employee base.
Karissa Breen [00:31:17]:
We're not quite there though in terms of, like, the ethics around it because it's a relatively new sort of concept. We're still trying to get our heads around it. So what sort of happens between now, as in now, you know, people are leveraging AI, Gen AI, and then getting to the point where we have a north star for, you know, ethical standards and, you know, regulation around it, what is the in-between part? What are we gonna see happen?
Gavin Jones [00:31:40]:
Yeah. It's a good question. Look. The truth is this will evolve over not just the next 2 to 3 months, but over the next decade. I think what we are seeing is there are some really strong, powerful use cases that Australian companies need to be embracing now. Things like fraud protection, where you can actually do multi factor authentication of users based on their social media profiles so they don't defraud the government of funds that are meant for disaster relief or social welfare programs. There are so many use cases around how we get a better experience with live chat and chatbots. Things that we can actually help with in workplace search.
Gavin Jones [00:32:15]:
We've got a multitude of use cases across those three domains that I talked about before that companies should be leveraging now. The broader, more ethically centric use cases, I think, will evolve over time, but we can still drive massive competitive advantage for companies leveraging technologies and approaches that have already been validated and tested today.
Karissa Breen [00:32:36]:
Okay. I’m joined now by Geoff Schomburgk, regional vice president APJ from Yubico. So, Geoff, lovely to have you back on the show.
Geoff Schomburgk [00:32:43]:
Great to be here. Thanks, Karissa.
Karissa Breen [00:32:45]:
So I have done some reconnaissance in the space, which I often do. And today, I messaged a CISO to ask more about Yubico. So one of the questions they asked me, so this is specific from them. They work in a retailer. And they said, what would be your suggestion to get the business to transition into MFA even though they should be doing it? So perhaps if you can steer clear of an obvious answer. So we can say awareness, we can say this, but what really is gonna move that needle?
Geoff Schomburgk [00:33:17]:
What's gonna move the needle? So to move into MFA, the first question would be which MFA? Not all MFA is created equal. So we would encourage them on that journey to move to something that is strong, that is phishing resistant. That's kind of a starting point. And users are the challenge. It's about encouraging adoption. Most of what we see, unfortunately, is it's a change program of how do you encourage your humans to adopt a different approach. Now there's 2 ways you can do that, and this is the nonstandard answer. It's the carrot or the stick. So we're adopting MFA because we, as an organization, want to be more secure.
Geoff Schomburgk [00:33:58]:
So we can take the stick approach, and that says, I'm from IT. This is what you're going to do. You're gonna have to adopt this because it's good for the organization. That generally doesn't sit so well. And maybe the approach is the carrot, which is encouraging and showing the user the benefit of why they're doing this to make their life easy. The fact that they can log in and authenticate, which they have to do every day, easily, quickly, and without a password. Now when you say that, without a password or passwordless, people start to prick up their ears and go, excuse me, what did you say? Passwordless. I don't need a password.
Geoff Schomburgk [00:34:37]:
I never have to change my password again. That's right. So that benefit automatically starts to encourage curiosity and, hence, interest to adopt something that actually makes their life easier. Then, as we talked about today, the phishing-resistant user in all aspects of their life. MFA, from the business point of view, has been seen as a business tool. And, you know, if you've got a rental car, you don't put air in the tires because it's not your car. You don't worry about putting oil in the engine because it's not your car. If it's your car personally, you do, because it's of value to you.
Geoff Schomburgk [00:35:12]:
So if we can make MFA, authentication, more broadly acceptable for all parts of their life, in terms of the consumer services they access, the government services, then it becomes really valuable to the individual because they know that their organization is helping them secure their personal world. So there's a real value that can be attached to that. And once you've got over that hurdle, then it's about the traditional sort of change program and transformations of, you know, communicating, educating, training, finding the key adopters, encouraging those early wins, and all the stuff that comes out of the transformation and change textbooks. But try and find the carrot or the stick, and we generally find that in doing this, the carrot approach works much, much better than the stick approach.
Karissa Breen [00:36:01]:
So would you say as well, just from a frustration point of view, in my experience working in corporates and enterprises, even resetting your password, how much time that takes? Or I don't know my password, I gotta go to the IT help desk, you know, the desk dude downstairs. Even that takes time in terms of productivity.
Chris "Gonzo" Gondek [00:36:15]:
Absolutely.
Karissa Breen [00:36:16]:
Do you see that's a key driver, though?
Geoff Schomburgk [00:36:18]:
Yes. And just what you described, I was thinking then, I know what I'm like when I've got a login and I can't remember the password. My frustration level goes up. My tolerance level goes down. My anger level goes up. This is frustrating. So how do we take that away? Yes. We can.
Geoff Schomburgk [00:36:35]:
And, you know, there are business benefits of productivity depending on what your environment is. If it's my mom who's logging in, that productivity is not important. But if I'm in a retail environment or manufacturing environment where time is critical, then our research, and that of others, will show that this authentication method is at least 4 times faster. So think about that as a productivity benefit for the user and then productivity in the IT support team, because 90% of their calls to the support desk have gone away because they're not being asked to reset passwords. They're not being asked to, I've been locked out of my account, please reset. So there's a productivity gain at both ends.
Karissa Breen [00:37:16]:
We don't have time if you're on a POS, you know, point of sale terminal and you're having to reset your password or something's happened and you've got a whole line of people. What do you do then?
Geoff Schomburgk [00:37:24]:
You’ve lost a sale and you’ve got a disgruntled customer. That wasn’t great customer service. So it is important at that front end to make it as easy and seamless as possible.
Karissa Breen [00:37:37]:
And there you have it. This is KB on the go. Stay tuned for more.