At The ODP Corporation, the board has directed the business to develop an artificial intelligence (AI) governance plan so that appropriate reviews and approvals are in place. The internal audit function is involved in the process, with the CAE being a member of the AI Governance Committee, according to Sarah Morejon Rodriguez, senior manager of Internal Audit. The discussions delve into the evolving landscape of AI deployment, addressing implementation milestones, emerging risks, and the controls necessary to help safeguard stakeholder trust.
As organizations race to harness the transformative potential of AI and generative AI, internal audit functions are stepping into a critical role. Far beyond compliance, they are becoming strategic partners — helping boards and executive teams navigate the ethical, operational, and reputational dimensions of AI.
In this era of accelerating innovation, AI governance is not optional; it is essential. And internal auditors are uniquely positioned to support governing bodies through both advisory and assurance services.
Having a Seat at the Table
As a member of the AI Governance Committee, The ODP Corporation’s CAE has “a big role in making sure that risks are being identified, that they’re being addressed appropriately, and that the right groups are involved in AI governance discussions,” Rodriguez says. The stakeholders vary based on the AI use case, but generally will include IT, legal, risk management, information security, privacy, compliance, and relevant business owners. The CAE’s presence also helps ensure that the internal audit function is thoroughly up to date on current AI processes and uses throughout the organization.
Internal audit is a key player because it provides independent assurance to the governing body that the organization’s approach to managing AI, and the internal controls around it, is robust, regularly updated, and effectively implemented across business units, says George Barham, director of Standards and Professional Guidance for technology at The IIA. In doing so, internal auditors employ the same rigor and approaches that they use in reviewing policies and procedures in other areas, he says.
When it comes to AI roles and responsibilities, Barham emphasizes the need for robust checks and balances and segregation of duties, so that those who design and develop AI programs are not the ones who test and deploy them. In terms of compliance, internal audit can examine any related regulations and, in collaboration with the legal and regulatory teams, provide independent assurance that the organization is following the rules.
To manage risks before they cause operational or reputational damage, internal auditors can consult with product and engineering teams on risk management issues and recommend the best ways to address them, according to Richter. Often the people implementing new technologies may already be aware of related risks, but they might not have the right controls in place or may not be updating them as needed.
Internal auditors should keep in mind that the core technology is no different from machine learning tools that have been available for years, Richter says. Greater complexity does add new challenges, however. Agentic AI, for example, can make decisions and gain access to a variety of systems and tools. Companies will have to be vigilant about the access these systems are allowed so that they do not inappropriately disseminate sensitive human resources information, for example, he says.
Setting Strong Guardrails
There is often an assumption that governance, policies, procedures, and controls can hinder innovation, but Barham says appropriate guardrails and parameters support AI innovation and the people involved in it. He compares them to the brakes on a car, which give drivers greater confidence because they know how to slow down or stop in any situation.
AI leaders may find it easier to innovate when they know controls and processes are in place to prevent serious missteps. Internal audit, acting as a strategic partner, can help establish valuable guardrails by working to ensure the governing body has the information it needs to provide good governance.
Barham emphasizes the importance of training for those responsible for AI operations and governance. The governing body should receive updates on AI and should be prepared to ask incisive questions and be engaged in the process.
Building an organizationwide culture of AI accountability and transparency also begins with education for the entire company on what AI tools are available to them and how to put them to best use, Rodriguez says. Training should encompass AI use guidelines, such as security, privacy, and ethical concerns.
To promote a culture of AI accountability and transparency, internal auditors should encourage communication among involved teams and assess whether people at all levels can report a problematic process or result.
Being a Partner
The IIA’s Three Lines Model provides a clear framework for delineating internal audit’s role in relation to both the governing body and organizational leadership. This structure can serve as a guidepost for how internal auditors contribute meaningfully to AI governance.
As AI adoption accelerates, so too does the need for trusted partners who can assess emerging risks and ensure regulatory alignment. “I would love to have a team that can help me keep up with new regulations and risks,” says Richter, who, while not an internal auditor himself, underscores the growing demand for such expertise.
Far beyond a compliance function, internal audit is well-positioned to play a strategic advisory role. Auditors can help their organizations navigate the complexity of AI systems, ensuring that innovation proceeds responsibly. In a landscape shaped by rapid technological change, internal auditors have an opportunity — and a responsibility — to lead.
Disclaimer
The IIA publishes this document for informational and educational purposes only. This material is not intended to provide definitive answers to specific individual circumstances and as such is only intended to be used as peer-informed thought leadership. It is not formal IIA Guidance. The IIA recommends seeking independent expert advice relating directly to any specific situation. The IIA accepts no responsibility for anyone placing sole reliance on this material.