Cybernetics standout enters digital front in fight for justice

As a boy growing up on his grandparents’ orchard in Armidale, NSW, Ned Cooper got on well with classmates from First Nations communities. When he moved with his parents to Sydney, he connected with organisations and communities there.

“I noticed that Aboriginal kids got a lot more attention from the government, including the police,” Cooper recalled. “The legal system had a big influence on our diverging pathways, and I became interested in why laws come to be the way they are, and how they shape people’s lives.”

Ten years later, he was a lawyer doing policy research for the Aboriginal Legal Service when he was confronted with a nightmarish glimpse of Australia’s future — one that transformed his career path and the trajectory of his life.

“I’d heard about machine learning and artificial intelligence in movies and science fiction, but I hadn’t seen it at work in a system making really important social decisions,” Cooper said. “I thought that was the domain of humans.”

Rise of the machines

It was 2017. Cooper was preparing a submission for the Australian Law Reform Commission about criminal laws and policies that impact First Nations people when he learned about software being employed “to predict future criminals” in the United States.

As reported in a ProPublica exposé called “Machine Bias”, a team of machine learning engineers had designed an automated risk assessment tool that drew upon personal data to expedite judicial decisions on whether a person charged with a crime should be released on bail.

The designers had not intended to use race or ethnicity as deciding factors, but the data points they fed to the machine learning algorithms produced alarming results.

The formula was “particularly likely to falsely flag black defendants as future criminals”, wrongly labelling them at almost twice the rate of white defendants, ProPublica reported. Those falsely flagged stayed behind bars while awaiting trial. White defendants, meanwhile, were more likely to be mislabelled as low risk and then go on to commit crimes after release.
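The disparity at the centre of that finding can be expressed as a gap in false positive rates: among people who did not go on to reoffend, how often was each group still flagged as high risk? The following Python sketch shows the calculation on invented records; the group labels and data are hypothetical illustrations, not the COMPAS tool or its dataset.

```python
# Minimal sketch of the disparity ProPublica measured: false positive
# rates compared across groups. All records here are hypothetical.
from collections import defaultdict

# Each record: (group, predicted_high_risk, reoffended)
records = [
    ("group_a", True, False),
    ("group_a", True, False),
    ("group_a", False, False),
    ("group_b", True, False),
    ("group_b", False, False),
    ("group_b", False, False),
]

def false_positive_rate_by_group(records):
    """FPR per group: flagged high risk among those who did not reoffend."""
    flagged = defaultdict(int)
    negatives = defaultdict(int)
    for group, predicted_high_risk, reoffended in records:
        if not reoffended:              # only people who did not reoffend
            negatives[group] += 1
            if predicted_high_risk:     # ...but were flagged anyway
                flagged[group] += 1
    return {g: flagged[g] / negatives[g] for g in negatives}

print(false_positive_rate_by_group(records))
# {'group_a': 0.67, 'group_b': 0.33} -> one group is falsely flagged
# as a future risk at twice the rate of the other.
```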

At the same time, New South Wales was considering reform of the bail-setting process. Cooper was concerned that any introduction of automated risk assessments would have an unfair impact on Aboriginal communities in Australia.

“There is so much data on these communities because they’ve been tracked and traced their whole lives,” he said.

Any data informing automated risk assessments in Australia would carry inherent bias, he argued, because the records it drew on were produced by a system that was itself biased.

“Although bias and marginalisation are not created by technology, they can be amplified or accelerated by technology”, Cooper said.

New South Wales has not, to date, implemented a machine learning system for bail decision-making. Cooper went on to become Manager, Network Strategy at the National Broadband Network (NBN), but he continued to work for the Aboriginal Legal Service as a volunteer and to think about how to prevent the dystopian future he feared.

“I became a lawyer because I thought laws were the best point of intervention to address social harm,” he said. “But then I realised, oh wait, actually technological systems are equally important, and increasingly important. That’s where my focus shifted.”

His lens shifted as well. He saw machine learning systems everywhere: smart devices in people’s homes, apps on phones that never left people’s sides. Employers were using computer algorithms to narrow hiring searches down to just the most desirable candidates. The NSW Police Force’s Suspect Target Management Program (STMP) was placing teenagers and even pre-teens under police monitoring because they had been predicted to commit crimes in the future. Those kids were disproportionately Aboriginal.

“I knew these kinds of systems were going to be disseminated more broadly in society,” Cooper recalled.

So, he found himself reading articles about AI on weekends.

“You can’t just critique these systems without understanding how they operate. To learn how they operate, you have to learn how to build them yourself,” he said. “It was my grandfather who suggested I should probably think about a career shift if I was that fascinated by the topic.”

“I looked around in Australia, and I was like, Who is studying this?”

The precursor to The Australian National University (ANU) School of Cybernetics, the 3A Institute, was announced at the end of 2017. Cooper applied as soon as he felt he had gained enough skills in data science and computing and joined the second cohort.

“I was especially interested in the Master of Applied Cybernetics at ANU because it was situated in a College that really thought about how systems operate in a physical and social environment,” he said.

Creating a new branch of engineering

Cooper arrived at ANU in 2020, drawing immediate inspiration from the academic approach.

“It doesn’t focus purely on optimising an algorithm for some sort of metric,” he said. “It does focus on how to make systems — the interactions they have with the world — responsible and safe and sustainable.”

Cooper said the most exciting part of his experience at the School of Cybernetics stemmed from the diversity of the cohorts and the School’s collaborative approach to learning. 

“There was a real mix within the School of committed people from all parts of the world and totally different disciplinary backgrounds,” he said. “We had computational neuroscientists, we had computer scientists, we had engineers, artists, a few people from government and policymaking.”

“Our discussions were fruitful across the disciplines. We were trying to develop a new branch of engineering.”

And how might we define that new branch of engineering?

“Cybernetics is the science of purposeful systems — systems that learn from feedback received from the environment in which a system operates,” Cooper said. “So, cybernetic systems are those that correct or adjust their behaviour based on that feedback towards some sort of goal.”
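That definition can be made concrete with a toy example. The Python sketch below, a hypothetical illustration rather than anything built at the School, implements the loop Cooper describes: sense the environment, compare the reading against a goal, and adjust behaviour to close the gap.

```python
# A toy cybernetic system: a thermostat-style loop that adjusts its
# behaviour based on feedback from its environment, toward a goal.
# All values here are hypothetical illustrations.

GOAL = 21.0         # target temperature in degrees Celsius
GAIN = 0.3          # how strongly each correction responds to the error

temperature = 15.0  # initial state of the environment

for step in range(10):
    feedback = GOAL - temperature   # feedback: gap between goal and environment
    correction = GAIN * feedback    # adjust behaviour based on that feedback
    temperature += correction       # acting on the environment changes it
    print(f"step {step}: temperature = {temperature:.2f}")

# Each pass through the loop shrinks the error: the system corrects
# itself toward its goal, which is the pattern Cooper describes.
```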

Although the global pandemic impacted his studies, Cooper said that remote learning didn’t get in the way.

As part of the program, he participated in a capstone engineering project with the Google People + AI Research team. Cooper reviewed two decades of community-based participatory research and proposed how it might be incorporated in the machine learning development process.

“To avoid injecting bias into machine learning data sets, we need to engage more with the lives behind those data points,” Cooper said. “We can’t just look at datasets divorced from the social settings in which they were captured. We need representation from those communities during the development process.”

Following on from the capstone project, Cooper collaborated with Google on a highly regarded research paper, which he presented at the CHI Conference on Human Factors in Computing Systems in New Orleans in 2022.

He now works part time for Google, based in Sydney.

Big data a crucial battleground in fight for justice

After obtaining his Master’s degree, Cooper decided to continue as a PhD candidate at the ANU School of Cybernetics, one of four students from his Master’s cohort to do so.

“We work together pretty closely,” Cooper said. “We’ve produced podcast episodes together, written papers together.”

His PhD research is still in an early phase, but he said it continues the work he began during his Master’s degree, and indeed, the work he had been doing long before that.

Law and computer science are similarly dependent on architecture, Cooper explained, citing the legal scholar Mireille Hildebrandt.

“If you change something in a computing system, that has effects elsewhere in that system. Similarly, changing one law or one piece of legislation has effects in other parts of the legal system.”

Cooper’s interest in studying law had been spurred by a desire to prevent historical biases from being codified into law. Part of that, he learned, depends upon who writes the laws, and with what degree of awareness.

He is now applying the same principle to building cybernetic systems.

“Much of my career has been about trying to get the participation of people in the decisions, the systems, and the policies that affect them,” he said. “I’m looking at engaging with people who haven’t been included in the past in determining how data is represented and how it is acted upon by systems.”
