I recently headed down to TechLeaders, which was hosted at Kirkton Park in the Hunter Valley.
TechLeaders Australia is dedicated to the Australian technology industry. It serves as a platform for professionals, businesses, and organisations to connect, collaborate, and stay informed about the latest trends and developments in the tech sector. This is my third year attending, and I always enjoy meeting new people and, of course, sitting down for an interview – followed by a glass of wine afterwards. The environment encourages a more personable approach than a typical “conference”; you can really get to know people on a personal level.
Phil Sim, Founder of Influencing and the pioneer behind TechLeaders, commented,
“It’s really important for media to occasionally step back from the grind of day-to-day reporting, and just devote time to talking, learning and digging into what are the emerging tech challenges and issues.”
Mr. Sim went on to say,
“That’s why we bring together people from every part of the tech ecosystem. Media, analysts, executives, users and industry groups, all come together to engage in discourse that really gets to the heart of what’s happening in the tech sector across Australia.”
I sat down with a number of interesting and knowledgeable people at the conference: Chris Diffley from Optus, Chris “Gonzo” Gondek from NetApp, Gavin Jones from Elastic, and Geoff Schomburgk from Yubico, to discuss their views on topics including Australia’s journey with generative AI, data security, and sustainable practices.
Generative AI: A Work in Progress
Australia has been slow to adopt generative AI, largely due to a lack of awareness of its benefits and a reluctance fuelled by concerns over return on investment and urgency. Gavin Jones from Elastic observes that generative AI is often relegated to simpler tasks like image doctoring or email responses. However, there is underexplored potential in using AI for enterprise strategy and aligning it with company visions.
The resistance also stems from significant challenges related to data security, ethical considerations, and biases prevalent in large language models (LLMs). Elastic plays a role in helping companies determine suitable models for specific use cases, such as employing small language models (SLMs) for handling sensitive information and LLMs for broader applications.
Ethical Standards and Security Measures
Ethical considerations are major checkpoints when integrating generative AI. Companies need to carefully vet the sources and appropriateness of the models they use, reinforcing trusted responses and minimising biases. The establishment of ethical standards and regulations for generative AI is still in progress, requiring a multidisciplinary approach and cross-referencing multiple sources.
Jones explained,
“Some workloads are best run in a private LLM, and they may choose the data stores, or sometimes it’s actually referred to, instead of as a large language model (LLM), as a small language model (SLM). It’s only serving or gathering data – still billions of data points – but it’s very restricted to trusted data stores.”
Storage and Security by Design
Chris “Gonzo” Gondek from NetApp spoke with me about the often overlooked and at times relegated world of storage. In fields such as AI, cybersecurity, and governance, there’s a palpable need for discussions around data storage and security. NetApp’s flexible environment supports both on-premises and cloud storage.
A concept called “data gravity” refers to the idea that creating data consumes storage and power, which in turn leads to carbon emissions. NetApp’s commitment to sustainability includes reducing data gravity and supporting the UN Global Compact’s sustainable practices. Sustainability can be one of those things perceived as smoke and mirrors, so NetApp combats greenwashing with a sustainability dashboard that measures and improves its sustainability efforts.
Gondek commented,
“Data gravity, or data having gravity, means that every time that we create data, it will consume some magnetic storage somewhere which has ones and zeros, which needs to be powered, and that power is coming from somewhere.”
Threat Monitoring
Chris Diffley from Optus announced their latest launch – the Optus Managed Threat Monitoring service, powered by Devo Technology. This service leverages AI and machine learning for threat detection and monitoring, acknowledging the increasing use of AI by cybercriminals. The Devo platform offers a 400-day data view, enhancing the detection of long-term, complex threats with strong analytics based on global breaches and regional use cases. Optus places great emphasis on maintaining rigorous security measures and compliance processes in its operations.
Diffley commented,
“We’re seeing trends within the cybersecurity landscape. There’s a lot more state actors that are causing the threats. There’s a lot more technology being used to influence how those threats come in, and we’ve seen a lot of compromises across the Australian landscape as well.”
Multi-Factor Authentication (MFA): A Positive Approach
Geoff Schomburgk reiterated the importance of strong, phishing-resistant MFA. User adoption remains a hurdle, and a positive reinforcement (carrot) approach is seen as more effective than a forceful one (stick). Encouraging MFA adoption can yield benefits like enhanced productivity and reduced IT support needs, easing the frustration – and occasional employee rage – around frequent password resets.
Schomburgk added,
“So we can take the stick approach and that says, I’m from IT. This is what you’re going to do. You’re gonna have to adopt this because it’s good for the organisation. That generally doesn’t sit so well. And maybe the approach is the carrot which is encouraging and showing the user the benefit of why they’re doing this to make their life easy.”