These are the 4 skills to look for when building an AI governance team

 

This article was written by Allan Millington, Global Data Policy Leader at EY, and was originally published by the World Economic Forum.

  • AI governance is rising quickly to the top of agendas in businesses worldwide.
  • No single compliance function within a business can oversee AI governance in its entirety — it must be a collaborative effort.
  • Here are 4 key skills that anyone responsible for AI governance in a business needs.

The question of AI governance has arrived in the C-suite. Top executives, along with technology, data privacy, risk, legal and security professionals, are seeking insight into how other companies are tackling the thorny question of governing AI and determining which model is best for them.

It’s becoming clear that sweeping change is needed — and urgently. AI laws coming into force don’t offer an end-to-end compliance blueprint that addresses all risks; they add to an existing list of established compliance obligations. What is different is that risk assessment and mitigation can no longer be implemented inside traditional silos.

At the core of this is data, and we all know there is no AI without data. While the collection, use, processing, storage and disposal of personal data are already highly regulated, organizations need to stretch further to address how AI upends the traditional data risk landscape.

AI governance needs to be multidisciplinary

One of the factors complicating AI governance is that compliance specialisms typically sit in multiple parts of the business. As a result, their existing governance mechanisms and processes may be quite siloed, focused on identifying and mitigating specific risks without assessing the bigger picture outside of their narrow remit.

For example, data protection professionals are not equipped to identify potential intellectual property infringement during their privacy review. Equally, intellectual property lawyers are not best placed to opine on risks associated with personal data. But it is precisely these functions — data privacy, risk, legal and security — that must work together to address risks associated with AI.

Collaboration is essential. The best way to tackle this is to assign ownership of AI governance to the function that has the skill set to pull together a seamless and integrated approach, and to wrap an appropriate operating model around it to ensure accountability.

Start now with a skills-based approach to assigning AI governance

It’s better to start early. Don’t wait until AI laws are passed in your jurisdictions to start the internal conversations you need to have about AI risks. The current challenges around intellectual property, data protection and product liability appear in the press repeatedly, and there are still a lot of grey areas. New governance models will take time, as will carefully navigating organizational culture and politics and securing the funding to operationalize and scale a new governance model in line with strategic ambitions for AI.

4 skills for an effective AI governance function

A skills-based assessment of who should hold the reins of AI governance can offer clarity and inform an organization’s decision. These four skills are essential for an effective AI governance function:

  • Can explain how AI technologies work: given the range of stakeholders involved, the ability to translate the key points of how AI technologies work for a broader governance audience is critical. Technical terms must be understood and communicated to mostly non-technical colleagues and senior leadership in language they understand, together with the ability to go into detail as required. This requires an element of reskilling, being proactive in understanding AI and the underlying data, and working closely with the business.
  • Can establish and manage a governance programme: the ability to influence leaders and secure appropriate budgets is critical for implementing an AI governance programme. The programme must align with the governance mechanisms, processes and operating models already in place within an organization, and it must be managed effectively as a new programme in its own right, including organizational training. Attention needs to be paid to whether an existing compliance programme, for example a data protection programme built for GDPR compliance, can incorporate AI governance requirements, or whether a new framework or overarching governance model aligned to existing frameworks is better positioned to support it.
  • Understands the organization’s structure and how work really gets done: it’s critical to know how current compliance functions operate and where in the business they sit, in order to pull together the multidisciplinary team that is needed. For example, a robust framework that captures all the challenges around the use of personal data but fails to identify contractual risks on the use of AI or copyright protections will fail everyone. The ability to navigate across functions with knowledge of existing frameworks (identifying which team assesses copyright risks related to open-source software, for example) will be an enormous asset. Creating a common framework that addresses all these needs for AI is no small task — you need both influencers and doers.
  • Solution- and outcome-focused: laws, regulations, website terms of use and commercial contracts all bring restrictions, questions of interpretation and ambiguity, so it’s critical that the function driving AI governance is solution-focused and proactive, and helps the business navigate these challenges while addressing risk. Otherwise, governance will paralyze the business and cultivate a culture that is less likely to comply with any guardrails established by the AI governance programme. AI governance should provide a way forward that accelerates and enables innovation, not one that prevents it.

Do things differently and build confidence in AI

AI will undoubtedly disrupt existing industries, and roles will evolve across the business, requiring a reskilling of the workforce. It is essential to view AI governance as more than a tick-box exercise — it’s a chance to use AI to do things differently while providing strong oversight so that people can trust the outcomes. As technology continues to evolve at pace, and as laws and regulations continue to move, albeit at a slower pace, creating a multidisciplinary and agile team that is ready to enable the business will set you apart from the competition.

Wherever AI governance lands in your organization, mindset is everything. The more you recognize the skills needed and understand that it’s a team sport, the more likely it is that this new shape of governance will be embedded in your organization, providing the right guardrails to accelerate innovation, which is what everyone really wants to achieve.

The views reflected in this article are the views of the author and do not necessarily reflect the views of the global EY organization or its member firms.

 
