AI for Cybersecurity: Use With Caution
Posted: Tuesday, Jan 23

Artificial intelligence (AI), according to Forbes, "is the fastest-adopted technology in history." More specifically, in February 2023 the generative AI tool ChatGPT became the first tech application to reach 100 million unique users within two months of launch.

AI in all its forms is making a significant impact in many areas of human endeavour, for good or ill, and cybersecurity is no exception. There are certainly many ways in which AI can aid those charged with keeping IT systems and data safe from attack, but AI is not a panacea for cybersecurity, and use of AI brings its own challenges.

Here we look at some of the pitfalls of using AI for cybersecurity, leaving aside the ways in which it might be used by attackers, and then at how AI can be deployed to aid defenders.

The Challenges of AI

Despite the 'intelligence' in its name, AI does not yet have the intelligence to counter an attack: that still requires human intelligence, skilled cybersecurity professionals and all the other tools at their disposal. AI is not a substitute for human expertise. On the contrary, the information and insights it produces can increase the demand for human expertise.

While AI might be able to identify the vulnerability of an asset, and hence the risk of it being attacked, it cannot identify an organisation's most important assets, or assess the importance of the asset in the context of the business: how willing the business is to accept the level of risk the AI has identified. Only humans can do that.

Perhaps one of the biggest downsides of AI is not a problem with the technology per se, but one of perception. Such has been the hype around AI that it is easy for the unwary and ill-informed to be seduced by the claims and believe AI to be a panacea for cybersecurity challenges, which it is not.

Often the cybersecurity 'achievements' of AI that gain the most publicity are those developed and implemented by large organisations with massive resources: they do not readily translate into achievable benefits for the great majority.

For example, Australia's Commonwealth Bank has touted the benefits of an AI model it developed to help identify fraudulent digital payment transactions. It says this AI-based tool has blocked nearly one million transactions since being implemented in 2020. However, the bank also says it has spent $30m since 2015 on tackling such financial abuse.

There are other, generally acknowledged, problems with AI that are equally applicable to AI security tools, namely lack of transparency and potential for bias in AI models. In October researchers at Stanford University issued a report assessing the transparency of AI models, warning that while the capability of AI was "going through the roof", transparency was declining.

AI, as another weapon in the cybersecurity arsenal, can itself represent another possible vulnerability. AI systems require access to large volumes of data to be effective: they are 'trained' using huge datasets. These could include sensitive data, the privacy and integrity of which must be rigorously maintained. Also, if some of this data is later found to be inaccurate, it is almost impossible to correct any AI model built using that data.

Finally, as any assistive technology becomes more widely used it can lead to over-reliance and complacency (think of self-driving cars). The cost of AI will likely decrease rapidly and its functionality increase. This is good in that it will free up scarce cybersecurity resources to focus on cognitive tasks only humans can manage, but it could also foster over-reliance on AI, especially when dealing with the unique requirements of an organisation, where AI is less likely to be correct.

The Opportunities of AI

One thing that AI, and machine learning (ML), can do far better and faster than any human is scan huge volumes of data and identify patterns, or changes. Trillions of network packets, millions of website calls and thousands of file downloads can all be scanned in seconds by AI models that learn continuously and apply that collective knowledge to judge whether or not something is a threat.
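
As a simple illustration of the kind of pattern-spotting involved, the sketch below trains an unsupervised anomaly detector on network flow records and flags outliers for human review. It uses scikit-learn's IsolationForest; the feature set and figures are invented for the example and are not drawn from any particular product.

```python
# Illustrative sketch only: anomaly detection over network flow features.
# The features and numbers below are assumptions made for the example.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Stand-in for historical flow records: [bytes_sent, packets, duration_s]
normal_flows = rng.normal(loc=[5_000, 40, 2.0],
                          scale=[1_500, 10, 0.5],
                          size=(10_000, 3))

# Train an unsupervised model on "known good" traffic.
model = IsolationForest(contamination=0.01, random_state=0)
model.fit(normal_flows)

# Score new flows; -1 flags a potential anomaly worth an analyst's attention.
new_flows = np.array([
    [5_200, 38, 1.9],       # looks like routine traffic
    [900_000, 4_000, 0.3],  # large, fast burst -- possible exfiltration
])
print(model.predict(new_flows))  # e.g. [ 1 -1 ]
```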

AI with ML can also be used to build a comprehensive picture of website traffic and distinguish between benevolent probes such as search engine crawlers and malicious probes, whether generated by bots or human attackers. Such AI-generated insights can be particularly helpful to security teams struggling to understand and counter an attack in real time.
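
A toy version of that distinction might look like the following sketch, which trains a classifier on a handful of hand-crafted request features (request rate, error rate, whether robots.txt is honoured). The features and labels here are assumptions for illustration; a real system would learn from far richer behavioural signals.

```python
# Minimal sketch, assuming hand-crafted per-visitor features; a production
# system would use far richer signals (ASN, reverse DNS, history, etc.).
from sklearn.linear_model import LogisticRegression

# Feature vector per visitor: [requests_per_minute, error_rate, honours_robots_txt]
X_train = [
    [10, 0.01, 1],   # search-engine crawler
    [8,  0.02, 1],   # search-engine crawler
    [300, 0.65, 0],  # aggressive scanner hitting non-existent paths
    [450, 0.80, 0],  # credential-stuffing bot
]
y_train = [0, 0, 1, 1]  # 0 = benign probe, 1 = malicious probe

clf = LogisticRegression().fit(X_train, y_train)

# Score a new visitor and surface the probability for the security team.
new_visitor = [[250, 0.55, 0]]
print(clf.predict_proba(new_visitor)[0][1])  # probability it is malicious
```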

AI/ML can also be used to analyse networks and generate insights that can be used to configure networks for greater security and improved controls and processes.
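
By way of illustration, the sketch below clusters hosts by coarse traffic features so that hosts with similar behaviour can be considered for the same network segment. The feature choices are assumptions made for the example, not a recommended schema.

```python
# Illustrative sketch only: grouping hosts by behaviour to suggest candidate
# network segments for tighter controls.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Stand-in per-host features: [distinct peers contacted,
#                              share of traffic on admin ports,
#                              bytes out per day (MB)]
hosts = np.array([
    [3,   0.00,  20],    # workstation-like behaviour
    [5,   0.01,  35],
    [120, 0.40, 900],    # server / jump-host-like behaviour
    [110, 0.35, 850],
])

X = StandardScaler().fit_transform(hosts)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)  # hosts sharing a label are candidates for the same segment
```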

AI can also be used to aid software testing. Software applications are routinely tested for quality, performance and functionality before deployment, and those tests are designed to identify any vulnerabilities. AI can be deployed to automate and accelerate all aspects of this process.
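
The sketch below gestures at one way this can work: an AI model proposes edge-case inputs for a function under test, and anything that crashes unexpectedly is flagged for investigation. The suggest_inputs helper is a hypothetical stand-in for whatever model or service generates the payloads.

```python
# Hypothetical sketch: AI-suggested test inputs for a parser under test.
# suggest_inputs is a placeholder, not a real API.
import json
from typing import Callable, Iterable

def suggest_inputs(spec: str, n: int) -> Iterable[str]:
    """Placeholder for AI-generated edge-case payloads for the given spec."""
    return ['{}', '{"user": "a" * 10000}', '{"amount": -1}', '[1,', '\x00'][:n]

def run_tests(parse: Callable[[str], object], spec: str) -> list[str]:
    failures = []
    for payload in suggest_inputs(spec, n=5):
        try:
            parse(payload)           # the function under test
        except json.JSONDecodeError:
            pass                     # expected rejection of malformed input
        except Exception as exc:     # unexpected crash -> worth investigating
            failures.append(f"{payload!r}: {exc}")
    return failures

print(run_tests(json.loads, spec="JSON payment payload"))
```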

Balancing Both

There is no doubt that AI is already a very useful aid to cybersecurity and, given the extraordinary pace at which it is evolving, can only become more useful. But the same approach must be adopted as with any other potentially transformative technology: become a cautious early adopter, so as to avoid being behind the eight ball when the technology goes mainstream, but avoid being seduced by claims of great achievement, and in particular be wary of giving a nascent technology a critical role.

The development of AI has been extraordinarily rapid, and there is no indication the pace will slow, which makes appropriate adoption and deployment a particularly difficult balancing act.

Darren Reid
An innovative and entrepreneurial executive with extensive domestic and global experience, Darren Reid is the Senior Director of Asia-Pacific and Japan at Carbon Black. From large enterprise organisations to smaller start-up and scale-up companies, he has delivered growth in areas as diverse as sales, marketing, professional services, product development and operations.