EASTON, Pa. – Officials with the Northampton County Department of Human Services on Thursday demonstrated software used to help decide whether to investigate some reports of child abuse or neglect.
The demonstration, during a county council committee meeting, marks the first time the county’s system has been shown to the public.
- The AI-based software is meant to help decide which reports of child abuse and neglect to investigate
- The technology is currently used for "quality assurance," a county director said
- A similar system in Allegheny County built by the same researcher showed signs of possible bias, the Associated Press found
The predictive risk model is meant to guide decisions about whether to investigate a report that comes into the county from the state’s Childline abuse and neglect hotline, county Director of Human Services Sue Wandalowski told the committee.
The software takes in data from the report and runs it through an AI model designed to predict a child’s risk for being placed in out-of-home care within the next year.
It is trained on the department’s case records, and takes into account factors like previous removals, parental mental health and referral history. The model does not consider factors like race or criminal history.
The idea is to give more context to social workers responsible for deciding which reports to pursue further, while reducing bias in the process as much as possible.
“We know that humans are biased. We know that data is biased. Our data, we like to think, is less biased,” said Wandalowski.
An Associated Press investigation found that a version of the software used in Allegheny County may flag parents with disabilities at a higher rate; the tool has also drawn scrutiny from the U.S. Department of Justice.
When an agency employee types in a case number, the system generates a score from 1 to 20, displayed on a speedometer-like dial. The dial’s face is broken into five color-coded sections, from light blue (“very low” risk) to bright red (“very high”).
According to Wandalowski, “very low” risk amounts to a roughly 1% chance a particular child will end up in out-of-home care within the next year, compared to 38% for “very high risk” youth.
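The dial described above amounts to a simple score-to-band lookup. The sketch below is illustrative only: the article gives the 1-to-20 scale, the five color-coded bands, and the roughly 1% and 38% endpoint probabilities, but the cutoff scores separating the bands are assumptions made for this example.

```python
# Illustrative sketch of the five-band risk dial. Only the 1-20 scale,
# the five band labels, and the endpoint probabilities (~1% and ~38%)
# come from the county's description; the cutoffs are assumed.

BANDS = [
    (4, "very low"),    # roughly 1% chance of out-of-home placement
    (8, "low"),
    (12, "moderate"),
    (16, "high"),
    (20, "very high"),  # roughly 38% chance
]

def risk_band(score: int) -> str:
    """Map a 1-20 risk score to one of five color-coded bands."""
    if not 1 <= score <= 20:
        raise ValueError("score must be between 1 and 20")
    for upper, label in BANDS:
        if score <= upper:
            return label
    raise AssertionError("unreachable")

print(risk_band(3))   # "very low"
print(risk_band(19))  # "very high"
```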
Currently, the front-line staff who first review a Childline report don’t see the system's risk score. Instead, the score is only used for “quality assurance,” she said.
When a Childline complaint comes in, a screener evaluates whether it warrants further action. If the first screener decides not to refer a report for further investigation, the matter goes to their supervisor, who approves every decision not to investigate.
This supervisor runs each screened-out file through the decision aid. If it returns a score between 1 and 8, the supervisor doesn’t have to read the file, and the matter is dropped.
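The review step described above reduces to a single threshold check. In the sketch below, the function name is an assumption made for illustration; only the 1-to-8 cutoff and the 1-to-20 score range come from the county's description.

```python
# Illustrative sketch of the screen-out review step: the supervisor runs a
# screened-out report through the model, and a score of 8 or below means
# the file is dropped without a full read. Only the threshold and score
# range come from the article; everything else is assumed.

def supervisor_must_review(risk_score: int) -> bool:
    """Return True if the supervisor must still read the screened-out file."""
    if not 1 <= risk_score <= 20:
        raise ValueError("risk score must be between 1 and 20")
    return risk_score > 8  # scores 1-8: matter is dropped without review

print(supervisor_must_review(5))   # False: dropped
print(supervisor_must_review(12))  # True: file still gets read
```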
Some council members expressed unease with the program, including committee chair Lori Vargo Heffner.
“It just brings up so many questions,” she said.
If the trial run now underway goes well, the software will play a more substantial role in screening decisions.
Further discussion of the new tool is slated for the next health and human services committee meeting in August.