Algorithms at MBIE: Transparency and accountability for our algorithm use

The Ministry of Business, Innovation and Employment (MBIE) is a founding signatory to the 2020 Algorithm charter for Aotearoa New Zealand and is committed to applying its principles in our use of algorithms.


MBIE uses data to help inform, improve, and deliver the important services we provide to people in Aotearoa New Zealand and internationally. While MBIE decisions are largely made and reviewed by people, algorithms play a role in distilling information from large or complex data sets to support human decision-making and to surface insights that could not easily be obtained through human analysis alone. We maintain an appropriate level of human oversight over our use of algorithms to ensure data is used safely and effectively.

MBIE defines an ‘algorithm’ as an automatic process that identifies patterns in data to assess criteria or predict outcomes.


Algorithm charter for Aotearoa New Zealand

The Algorithm charter for Aotearoa New Zealand is a commitment to ensuring New Zealanders can have confidence in how government agencies use algorithms, with a focus on transparency and accountability in the use of data.

Agencies that commit to the charter are obliged to self-assess the algorithms they use to deliver their services and identify the level of risk associated with each one. This helps us prevent the perpetuation of bias, protect privacy, and ensure alignment with the principles of the Treaty of Waitangi.

For more information about the charter, supporting documentation and algorithm use across government, see:

Algorithm charter for Aotearoa New Zealand (external link)

Algorithms at MBIE

As part of our commitment to the charter, we are implementing an MBIE Algorithm Use Policy. The purpose of this policy is to enable accountability for decisions at MBIE, increase transparency about our algorithm use, and strengthen support systems for our staff who work with algorithms.

Algorithms are used at MBIE for one or more of the following reasons:

  • Improving efficiency
    Computers can process large quantities of data much faster than our staff can manually. We may use algorithms to keep up with large application volumes so our staff can focus on more complex work and respond to the public more quickly. For example, the Immigration New Zealand Advance Passenger Processing (APP) system performs validation matching and screens for risks at the border.
  • Limiting human bias in a process
    As humans, we bring our own subconscious biases into all decisions we make. Through automation we can limit the impact of these biases and provide more consistent decisions and responses. For example, algorithms that assist with recruitment, finance and procurement allocation.
  • Predicting future behaviour
    Predictions about the future are never guaranteed, but they are useful tools for planning, weighing up different options and taking proactive action when risky situations arise. Our algorithms help us understand what the future may look like so that we can stay prepared. For example, using predictive analytics to estimate business health.

We will apply the charter where the use of algorithms can significantly impact the wellbeing of people, or where there is a high likelihood that many people will suffer an unintended adverse impact. We will assess our algorithms for risk and apply the charter accordingly.

Data Science Review Board

We have established a Data Science Review Board to provide MBIE with strategic and practical direction, guidance and leadership on matters relating to data science and algorithm governance.

The Board is made up of MBIE staff and external members, who provide advice on algorithm use and development, and ensure that algorithms undergo robust review by qualified experts and adhere to accepted standards.

More information

For more information, please contact us:

Contact us

Last updated: 30 November 2023