More than 275 AI and automated decision tools in NSW public sector

Number likely to be conservative.

A NSW government survey has uncovered the presence of over 275 automated decision-making tools (ADMs), raising questions about how the tools are administered and their usage is disclosed.

“Members of the public whose rights and interests have been materially affected by a decision made with the use of ADM, are entitled to be informed of the role ADM played in that decision,” the NSW Ombudsman Paul Miller said in a statement late on Friday.

The list [pdf] of tools covers operational systems and planned projects with a go-live date within three years, such as an artificial intelligence model, still in its pilot phase, that predicts risks to children in out-of-home care.

The figure of more than 275 ADMs is likely a conservative estimate: the survey used to collect the data was voluntary, and only a quarter of NSW’s 439 public sector entities, including councils, departments and agencies, participated.

Moreover, a review of public procurement records, government websites and other public information, which supplemented the survey, identified an additional 702 potential ADMs.

In his statement accompanying the report, Miller said that the lack of mandatory ADM reporting or a public register had stifled “informed debate about what assurance and regulatory frameworks may be appropriate for ADM use now and into the future.”

The lack of mandatory reporting also made it impossible to know whether the systems were “legally validated, or tested, and whether and how it is subject to ongoing monitoring for issues such as accuracy and bias,” Miller said in the report [pdf].

“Of particular concern to us, it was reported that less than half of the ADM systems reported to the research team had any legal input at the design stage,” Miller added.

Breaking down ADM usage

The ADMs are broken down by the technologies they use, ranging from decision trees, used for example to classify inmates [pdf], to advanced applications of machine learning, such as detecting people on watchlists entering hospitals.

The report [pdf], which the Ombudsman commissioned, estimated an “increase of 50 percent in the next three years” in the number of ADMs used by departments and agencies.

“While structured decision-making tools, risk assessment and geo-location tools continue to be priorities into the future, there does appear to be a growing emphasis on NLP (e.g. chatbot), automated data gathering, recommender, and optimisation tools into the future,” it found.

“Compliance” was the most common purpose of the ADMs reported by the 77 department and agency respondents, followed by “enforcement” ADMs, such as Transport for NSW’s image recognition solution for detecting drivers using their phones.

The list distinguished between “fully automated” systems that “make” decisions, and “partially automated” systems that surface insights or predictions to inform or recommend decisions that still require human approval.

The Ombudsman’s report said that even with a human in the loop, ADMs still impact people’s rights, for example when they recommend which communities should be allocated more or fewer resources.

“Agencies sometimes erroneously assume that if a human is present at some points in a decision-making process, then the system is not an ADM system and the issues relevant to ADM use…are not relevant.” 

The need for legal consultation

The initial survey was followed by a second, smaller survey of 26 entities, which asked whether legal consultation, privacy impact assessments and other safeguards were used before, during or after deploying the ADMs.

Eleven respondents sought legal advice internally, and two externally, before deploying their ADMs; only nine confirmed that “legal experts were included in the system design team,” and two were “unsure.”

Regarding the basis of the ADMs' authorisations, 10 respondents cited "legislation, regulations or other legislative authorisations", 12 cited "organisational policy and procedures," two cited "ministerial direction or guidelines" and one said that there was "no explicit authorisation/directive."

The Ombudsman’s report said that “agencies in a number of cases reported assessment of ADM systems that include cyber testing, internal legal advice, privacy impact and risk assessment. 

“However, there was little evidence of any external or independent assessment, such as external audit or external legal review or advice.”

Backdrop for report release

The report was released at the first hearing of the NSW Parliamentary Inquiry into AI; the Ombudsman’s scrutiny came against the backdrop not just of Robodebt at the federal level, but of two of the state government’s own controversial ADM case studies.

Between 2016 and 2019, Revenue NSW used an ADM that issued illegal, and often inaccurate, garnishee orders to recover debts.

In addition, last year NSW Police discontinued its suspect prediction ADM that over-represented First Nations people. 
