Stakeholder interactions

Clear and effective communication with staff, customers, and other impacted groups builds and maintains trust and confidence in artificial intelligence use and development.

A stakeholder impact assessment can help identify and understand your organisation’s artificial intelligence (AI) stakeholders. This includes steps to:

  • brainstorm individuals, groups, communities or other organisations that have an interest in, can affect, or can be affected by what your business does (and what the AI system does). This could include internal employees, management, customers and clients, shareholders, connected providers, Māori customers and users, and other interest groups. Stakeholders may include, for example, individuals whose personal information and/or IP are used in training or input data; data annotation workers; and anyone being monitored by AI systems.
  • determine the level of impact the project will have on identified stakeholders
  • determine the relative impact of each stakeholder on the success of project implementation
  • define known rights, needs, expectations, concerns, risks, and areas of interest they may have, and if or how organisational-level AI policies and procedures will or can address these
  • understand what engagement approaches will best suit different stakeholders – including frequency, channel, and relationship ‘owner’.
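The assessment steps above can be sketched as a simple stakeholder register. This is a hypothetical illustration only: the field names, impact levels, and example entries are assumptions, not a prescribed format, and any real register should be adapted to your organisation's own processes.

```python
from dataclasses import dataclass

# Hypothetical stakeholder register entry capturing the assessment steps:
# the project's impact on the stakeholder, their influence on the project,
# known concerns, and the preferred engagement approach and owner.
# All field names and levels are illustrative assumptions.

@dataclass
class StakeholderEntry:
    name: str                  # individual, group, community, or organisation
    interest: str              # why they have a stake in the AI system
    impact_on_them: str        # "low" / "medium" / "high"
    influence_on_project: str  # "low" / "medium" / "high"
    concerns: list[str]        # known rights, needs, expectations, risks
    engagement: str            # channel and frequency
    owner: str                 # who owns the relationship

def highest_priority(register: list[StakeholderEntry]) -> list[StakeholderEntry]:
    """Return stakeholders who are both highly impacted and highly influential."""
    return [s for s in register
            if s.impact_on_them == "high" and s.influence_on_project == "high"]

register = [
    StakeholderEntry("Data annotation workers", "label training data",
                     "high", "medium", ["fair pay", "wellbeing"],
                     "monthly check-in", "operations lead"),
    StakeholderEntry("Customers", "use the AI-assisted service",
                     "high", "high", ["privacy", "accuracy"],
                     "release notes and surveys", "product manager"),
]
print([s.name for s in highest_priority(register)])  # → ['Customers']
```

A register like this can feed directly into an engagement plan by sorting or filtering on the impact and influence fields.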

AI systems can affect different people in different ways. Special consideration needs to be given to the impact that AI systems may have (either directly or indirectly) on specific and/or underrepresented communities such as non-English speakers, disabled people, older people, LGBTIQ+ communities, and Māori or Pacific peoples. For example, some speech recognition systems might not understand different accents or languages. This may stop impacted people from engaging with those systems, limiting your customer/user base and exacerbating inequities. Facial or image recognition poses similar challenges.

Engagement and consultation

Considering the needs of impacted stakeholders, and involving them in decision-making as appropriate, helps you build and offer products and services that people actually want and need. Meaningful stakeholder engagement can inform all steps of the AI life cycle.

Stakeholders (identified as part of any stakeholder impact assessment) can be involved through internal discussions deciding what the AI system should do, designing how it works, selecting the right training data, and/or testing and improving the system. Where you can’t speak directly with stakeholders, independent experts, peak representative industry groups, unions, and civil society groups could be consulted.

It is especially important to be mindful of specific and/or underrepresented communities who may have unique considerations. For example, where Māori communities are impacted by a project, businesses could engage with Māori expertise through channels such as your customer base or governance board. This helps you identify and respond early to any problems that affect service or product offerings, and/or disproportionately affect certain groups.

A stakeholder engagement plan can help businesses stay organised - outlining who to talk to, when to talk to them, and what to share.

Effective engagement is equitable, safe, two-way, value-adding, and conducted in good faith. Careful and strategic planning about the most effective method and timing of engagement is recommended, particularly for larger companies that may be deploying several different systems in different areas simultaneously, with overlapping stakeholder groups.

Transparency

People want to know when and how your business uses AI. Being clear builds trust and helps avoid problems.

You can share information on what your AI systems do, how they work, and what risks they might have. Check out the AI Transparency Checklist to help think this through. 

Different groups may need different levels of detail. Consider and document the level of transparency required, method and type of communication, for different audiences. Consultation with representatives from relevant stakeholder groups can help to understand what type and level of engagement might be best.

In general, it is good practice to let people know when and where a GenAI system is being used or included in functionality (for example, a chatbot on a website). Labelling AI-generated content, or providing disclaimers about AI use, supports transparency and helps identify GenAI use to customers. AI watermarking is an emerging technology for identifying AI-generated images in a more robust way that can support authentication, though it is not immune to manipulation or watermark removal.

Consider digital accessibility and readability needs. GenAI tools can themselves support improved communication, for example with people who have speech or hearing impairments.

Reflect any transparency requirements of third-party suppliers and other partners in relevant documentation and contracts.

Feedback and complaints

Businesses may already have ways for people to ask questions or make complaints about AI use or development. If not, a channel could be as simple as a contact email address and/or phone number, plus a clear plan for how information received will be responded to.

Guidance is available on business.govt.nz regarding the steps in a complaints process and training staff to handle complaints.

Complaints process 7-step checklist [PDF 563KB](external link) — business.govt.nz

Training staff to handle complaints(external link) — business.govt.nz

Effective channels for stakeholder queries/feedback/complaints:

  • are easy to find and use - think about groups with specific accessibility needs and how they can be supported to provide meaningful feedback
  • explain how decisions or outcomes of the AI system can be reviewed
  • let people talk to a real person if needed (a human-in-the-loop)
  • support timely and effective responses
  • tell people when they can expect to receive a response.

Recordkeeping, proportionate to the risk, is crucial to support effective handling of stakeholder issues.
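As one way of thinking about proportionate recordkeeping, a minimal feedback record might capture when an issue was received, which AI system it relates to, and whether human review was requested. This is a hypothetical sketch: the field names and example values are assumptions, and what you actually record should reflect the risk of the system involved.

```python
import datetime
from dataclasses import dataclass

# Hypothetical minimal record for AI-related feedback or complaints.
# Field names are illustrative assumptions; capture only what is
# proportionate to the risk of the system involved.

@dataclass
class FeedbackRecord:
    received: datetime.date
    channel: str               # e.g. "email", "phone", "web form"
    summary: str               # what the person raised
    ai_system: str             # which AI system the issue relates to
    response_due: datetime.date        # supports telling people when to expect a response
    escalated_to_human: bool = False   # human-in-the-loop review requested
    resolution: str = ""

record = FeedbackRecord(
    received=datetime.date(2025, 3, 1),
    channel="email",
    summary="Chatbot could not understand the caller's accent",
    ai_system="customer-support chatbot",
    response_due=datetime.date(2025, 3, 8),
    escalated_to_human=True,
)
print(record.escalated_to_human)  # → True
```

Tracking a response-due date in each record makes it straightforward to report on whether the channel is meeting the response expectations communicated to stakeholders.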