ISACA PROVIDES NEW GUIDANCE TO HELP ORGANISATIONS MANAGE GENERATIVE ARTIFICIAL INTELLIGENCE RISK
New whitepaper states: ‘AI can quickly amplify a “controlled” risk to a chaotic level, potentially derailing an unprepared business.’
Posted: Tuesday, Sep 19

Sydney, Australia (19 September 2023): ISACA has launched a new resource – The Promise and Peril of the AI Revolution: Managing Risk – which acknowledges the benefits of generative artificial intelligence (AI), but also explores the rapidly evolving risk landscape and the steps that risk professionals should take to keep up with it.

While excitement around the benefits of generative artificial intelligence applications like OpenAI’s ChatGPT and Google’s Bard has grown, so have the notes of caution from many in the industry, who point to a range of potential risks that could come with the tech.

The paper examines several types of potential risk that enterprises could face with generative AI, including invalid ownership, weak internal permission structures, data integrity, and cybersecurity and resiliency impact, as well as larger societal risk. As AI will likely affect businesses in every industry, organisations must take four important steps to maximise AI value while installing appropriate and effective guardrails as part of a continuous risk management approach:

  1. Identify AI benefits.
  2. Identify AI risk.
  3. Adopt a continuous risk management approach.
  4. Implement appropriate AI security protocols.

 

Following these steps will allow leaders to strike the right balance of risk and reward as AI-enabled tools and processes are leveraged in their enterprises. In addition to breaking down the four steps above, the ISACA paper details eight protocols and practices for building an AI security program as part of the fourth step, including:

  • Trust but verify.
  • Design acceptable use policies.
  • Designate an AI lead.
  • Perform a cost analysis.

 

“While some leaders may prefer to wait to adopt AI tools, it can be a risk to your organisation to delay the implementation of proper security and risk management plans; managing AI risk isn’t just a precaution – it’s a necessity,” says Jason Lau, Chief Information Security Officer of Crypto.com and ISACA Board Director. “It is imperative that leaders prioritise establishing the correct infrastructure and governance processes for AI in their organisations, ensuring they align with core ethics, sooner rather than later.”

To download a complimentary copy of The Promise and Peril of the AI Revolution: Managing Risk, visit https://www.isaca.org/promise-peril-ai-revolutions.

About ISACA

ISACA® (www.isaca.org) is a global community advancing individuals and organisations in their pursuit of digital trust. For more than 50 years, ISACA has equipped individuals and enterprises with the knowledge, credentials, education, training and community to progress their careers, transform their organisations, and build a more trusted and ethical digital world. ISACA is a global professional association and learning organisation that leverages the expertise of its 170,000 members who work in digital trust fields such as information security, governance, assurance, risk, privacy and quality. It has a presence in 188 countries, with 225 chapters worldwide. Through its foundation One In Tech, ISACA supports IT education and career pathways for under-resourced and underrepresented populations.

Twitter: www.twitter.com/ISACANews
LinkedIn: www.linkedin.com/company/isaca
Facebook: www.facebook.com/ISACAGlobal
Instagram: www.instagram.com/isacanews/

 

Contact:

Karen Keech, karen@establishedpr.com.au, 0411 052 408
