
In August 2016, Allegheny County, Pennsylvania (which includes Pittsburgh), became the first US jurisdiction to use a predictive algorithm to screen every call to its child abuse and neglect hotline. In a brilliant article for the New York Times Magazine, science writer Dan Hurley clearly explains how the tool works and how it changes current practice. Hurley’s account suggests that Allegheny’s experience is a hopeful one for the county and for children nationwide.
Hurley introduces the Allegheny Family Screening Tool, an algorithm developed by leading child welfare researchers in concert with DHS policymakers. To develop the algorithm, the authors analyzed all referrals made to the county child abuse hotline between April 2010 and April 2014. For each referral, they combined child welfare data with data from the county jail, juvenile probation, public welfare, and behavioral health programs to build a model predicting the risk of an adverse outcome for each child named on the referral. (A more technical description is provided by the authors here.) The end product is an algorithm that calculates a risk score between 1 and 20 for each child included in a referral.
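To make the description above concrete, here is a minimal sketch of how cross-agency history could be turned into a 1-to-20 score. The feature names, the outcome label, the choice of logistic regression, and the ventile-based mapping are all illustrative assumptions, not the county’s actual specification:

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical cross-agency features; the real tool draws on far more fields.
FEATURES = ["prior_referrals", "jail_contacts",
            "juvenile_probation_contacts", "behavioral_health_visits"]

def fit_risk_model(referrals: pd.DataFrame) -> LogisticRegression:
    """Fit a model predicting a binary adverse outcome from agency history."""
    return LogisticRegression().fit(referrals[FEATURES],
                                    referrals["adverse_outcome"])

def risk_scores(model: LogisticRegression,
                referrals: pd.DataFrame,
                train_probs: np.ndarray) -> np.ndarray:
    """Map predicted probabilities onto a 1-20 scale by ventile,
    matching the 1-20 score the article describes."""
    probs = model.predict_proba(referrals[FEATURES])[:, 1]
    # 19 quantile cut points from the training distribution split the
    # probability range into 20 equal-population buckets.
    cuts = np.quantile(train_probs, np.linspace(0.05, 0.95, 19))
    return np.digitize(probs, cuts) + 1  # scores in 1..20
```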
The policymakers and developers chose to use the algorithm to supplement, not supplant, the clinical judgment of hotline workers. Only if the score exceeds a certain threshold does it trigger a mandatory investigation; below that level, the risk score is simply another piece of data to help the hotline worker decide whether to assign the case for investigation.
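The decision flow this implies is simple enough to sketch. The threshold value, function name, and return strings below are illustrative assumptions; the county’s actual cutoff is not given in the article:

```python
MANDATORY_THRESHOLD = 18  # hypothetical cutoff on the 1-20 scale

def screening_decision(score: int, worker_screens_in: bool) -> str:
    """Combine the algorithm's score with the hotline worker's judgment."""
    if score >= MANDATORY_THRESHOLD:
        # At or above the threshold, discretion is removed:
        # an investigation is mandatory.
        return "screen in (mandatory investigation)"
    # Below the threshold the score is one more data point; the
    # worker's clinical judgment decides.
    return "screen in" if worker_screens_in else "screen out"
```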
Among the most important takeaways from Hurley’s article are the following:
- Before the development of the new algorithm, Allegheny County had experienced a series of tragedies in which children died after maltreatment reports had been made to the hotline but screened out. The problem was not incompetence or poor training. Hotline workers simply cannot, within the 30 minutes to one hour allowed for decision making, review all the historical data on all family members from the numerous agencies with which they may have had contact.
- Evaluation data shared with the reporter show that implementation of the Allegheny Family Screening Tool resulted in more high-risk cases being screened in and more low-risk cases being screened out. Hurley provides a real case example. A teacher reported that a three-year-old child had witnessed a man dying of an overdose in her home. Department records showed numerous reports to the hotline about this family dating back to 2008, including allegations of sexual abuse, domestic violence, parental substance abuse, inadequate food, poor physical care and hygiene, and medical neglect. Nevertheless, the hotline worker was poised to screen out the case as low risk. The tool, however, calculated a risk rating of 19 out of 20, prompting an investigator to go out to the home. Eventually, the mother was found to be unable to care for the children because of her continuing drug abuse, and they were placed with family members, where they are doing well.
- County officials were astute in awarding the contract to develop a predictive algorithm. Several other jurisdictions have gone with private companies such as Eckerd Connects and its for-profit partner Mindshare, which market a predictive analytics tool called Rapid Safety Feedback (RSF). The details of RSF are closely held by the companies, and the state of Illinois recently terminated its contract because the owners refused to share those details, even after the algorithm failed to flag some children who later died. The Allegheny Family Screening Tool, by contrast, is owned by the county. Its workings are public and have been published in academic journals. Moreover, its developers, Emily Putnam-Hornstein and Rhema Vaithianathan, are acknowledged worldwide leaders in their field, with extensive publications and experience doing similar work.
- County officials were also astute in developing and rolling out their model. They held public meetings before implementing the tool, giving advocates a chance to interact with the researchers and policymakers. Choosing to use the tool at the hotline stage rather than at a later step such as investigation made it less threatening: the tool is not used as input on whether to remove a child, only on whether to investigate. In addition, the county commissioned an ethics review by two experts before implementing the tool. The reviewers found not only that the tool was ethical but that it might be unethical to fail to implement it, writing that “It is hard to conceive of an ethical argument against use of the most accurate predictive instrument.”
- Many opponents of predictive analytics argue that it institutionalizes racial bias by incorporating data that are themselves biased. Supporters counter that predictive algorithms reduce bias by adding an objective input to subjective worker judgments. Preliminary data from Pittsburgh support the proponents, suggesting that the algorithm has resulted in more equal treatment of black and white families; one simple way such parity can be measured is sketched after this list.
- Other jurisdictions are already emulating Allegheny County. Douglas County, Colorado, has commissioned Putnam-Hornstein and Vaithianathan to develop an algorithm, and California has contracted with them for preliminary statewide work.
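As promised above, here is one simple way a claim of “more equal treatment” could be quantified, under the assumption that screen-in decisions and group labels are recorded per referral. The column names and group labels are hypothetical:

```python
import pandas as pd

def screen_in_disparity(referrals: pd.DataFrame) -> float:
    """Ratio of screen-in rates between groups; 1.0 indicates parity."""
    rates = referrals.groupby("race")["screened_in"].mean()
    return rates["black"] / rates["white"]

# Comparing this ratio before and after the tool's rollout would show
# whether treatment became more equal, as the preliminary data suggest.
```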
Given the Allegheny County algorithm’s promising results, one cannot help wondering whether a similar algorithm should be used at later stages of a case as well. Such a tool could help investigators decide on the next step in a case. The proposal would of course trigger an outcry if the tool were used to decide whether to remove a child from home. But like the Allegheny Family Screening Tool, a later-stage algorithm could supplement clinical judgment rather than replace it. Policymakers need not set any level that would trigger a mandatory removal; they could, however, set a risk level that requires opening a case, be it out-of-home or in-home. Many children in many states have died when agencies failed to open a case despite high risk scores on existing instruments. Algorithms can also be used to monitor ongoing in-home cases, as Rapid Safety Feedback has demonstrated. Perhaps, if and when predictive algorithms are proven effective at protecting children, they will be integrated into multiple stages and decision points, like the actuarial risk assessments many states use today.
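One way to picture this proposal is as a policy table that varies by decision point: mandatory floors for investigating and for opening a case, but no score that ever mandates removal. Every stage name, action, and threshold below is a hypothetical illustration, not an existing policy:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class StagePolicy:
    stage: str
    mandatory_action: Optional[str]  # action required at/above threshold
    threshold: Optional[int]         # None means judgment alone decides

# Hypothetical policy table for the multi-stage use proposed above.
POLICIES = [
    StagePolicy("hotline screening", "investigate", 18),
    StagePolicy("investigation", "open case (in-home or out-of-home)", 16),
    StagePolicy("removal decision", None, None),  # clinical judgment only
]

def required_action(stage: str, score: int) -> Optional[str]:
    """Return the action mandated at this stage for this score, if any."""
    for policy in POLICIES:
        if (policy.stage == stage and policy.threshold is not None
                and score >= policy.threshold):
            return policy.mandatory_action
    return None
```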
Identifying the children most at risk of harm from their parents or guardians has been one of the knottiest problems in child welfare. Allegheny County’s experience, as portrayed in Dan Hurley’s excellent article, provides hope that emerging predictive analytics techniques can improve government’s ability to identify these most vulnerable children and keep them safe.