ISACA, a global organisation focused on digital trust, has introduced the Advanced in AI Risk (AAIR) certification. The credential is aimed at professionals working in audit, risk, security, privacy, and compliance, with a focus on developing skills for governing AI systems.
The launch comes amid uneven levels of AI adoption and preparedness across European organisations. According to ISACA’s 2026 AI Pulse Poll, 59% of digital trust professionals are unsure whether their organisation could shut down an AI system during a security incident. These results point to gaps in operational readiness and accountability for AI systems.
AI risk is increasingly treated as a business and governance issue rather than a purely technical one, and it is drawing attention at board level. Without established governance structures, organisations face regulatory and reputational exposure, particularly under the forthcoming EU AI Act, which introduces requirements for oversight and accountability.
ISACA’s findings highlight gaps in governance practices. For example, 33% of organisations do not require employees to disclose AI usage, and only 38% assign responsibility for AI risk to a board- or executive-level role, indicating wide variation in how organisations oversee AI activity.
The AAIR certification is designed for experienced IT risk professionals looking to build expertise in AI governance. It covers areas such as governance framework integration, lifecycle risk management, and risk programme management, with an emphasis on assessing AI-related risks and communicating them to stakeholders and regulators.
To be eligible, candidates must already hold one of 25 specified certifications, including CISA, CISM, or CRISC. Supporting materials such as an online review course and a review manual are available for preparation.
As AI becomes more embedded in business processes, certifications such as AAIR are positioned to help organisations build the governance and risk management capabilities needed to address the associated challenges.