In my last post, I discussed the tragic case of the six children adopted by Jennifer and Sarah Hart. The entire family is presumed dead in the crash of their SUV off a cliff in California on March 26. Multiple system gaps resulted in the failure to rescue these children before their tragic death. Below are some suggestions for filling these gaps so that children do not continue to suffer and die in abusive homes.
Improve Vetting of Potential Adoptive Families. States that are desperate to find adoptive parents for large sibling groups or other children with special needs should not overlook obvious red flags. Clearly, a past investigation for abuse of an adopted child, as in the Hart case, should have prompted serious reconsideration of the Harts' application to adopt the sibling group then living with them for a trial period. But the home study process should also be sophisticated enough to identify more subtle problems. These might include parents with a "white savior" complex who are adopting for the wrong reasons and are not suited to parent traumatized children.
Monitor adoption subsidy recipients. The Harts received almost $2,000 a month in adoption subsidies, but the children were never monitored to ensure that all was well. All agencies paying adoption subsidies should verify periodically that the children are alive and well and still living in the adoptive home. Submission of an annual doctor visit report and/or an annual visit by a social worker could serve as such verification. There has been little support in the past for monitoring families receiving adoption subsidies, on the grounds that adoptive families should be treated the same as biological families. But the addition of money to the arrangement changes the picture. Adoptive families sign contracts with the state, which could include a requirement that they cooperate with monitoring. When taxpayers are financing the care of our most vulnerable children until they reach adulthood, they should demand that the well-being of these children be regularly monitored.
Regulate homeschooling. The Harts removed all their children from school after their child abuse case closed in Minnesota. The Coalition for Responsible Home Education (CRHE), an advocacy group for homeschooled children, recommends barring from homeschooling parents convicted of child abuse, sexual offenses, or other crimes that would disqualify them from employment as a school teacher. CRHE also recommends flagging other at-risk children (such as those with a history of CPS involvement) for additional monitoring and support and requiring an annual assessment of each homeschooled child by a mandatory reporter. Unfortunately, the powerful homeschool lobby has beaten back attempts to impose such requirements in many states. But the climate may be changing, with a raft of horrific cases around the country (most recently the Turpins) resulting in proposals to require regulation.
Adopt universal mandatory reporting and educate the public about reporting child maltreatment. If a friend who witnessed abuse by the Hart parents in 2013 or their Washington neighbors had reported their disconcerting observations earlier, the children’s lives might have been saved. Eighteen states already require all adults to report suspected child abuse; the rest impose this requirement only on specified professional groups. All states should adopt universal mandatory reporting, but more importantly they should inform their residents about the signs of child maltreatment and the need to report. Public information campaigns should emphasize that the reporter need not have proof that there is maltreatment before making a report. As one child advocate puts it, “a reasonable suspicion that a child is at risk” warrants a call to the child abuse hotline. Better safe than sorry.
Make investigations more child-friendly. A family friend who reported that the Harts deprived their children of food as punishment was told that CPS could not verify the allegation because the children had apparently been coached to lie. We need to rectify the pro-parent bias that allows many true allegations of abuse to go unsubstantiated or even to be screened out without investigation. Investigators must be required to interview children before they can be "coached" by parents. If children appear to have been coached, the case should be kept open until enough information is gathered to ensure they are safe.
The Hart children can be seen as victims of a "perfect storm": adoption by unqualified parents, homeschooling, neighbors who failed to report, history not shared between states, and inadequate investigations. But it only takes one system failure to kill a child or scar one for life. All of these systemic gaps must be addressed, so that all children can have a real childhood and grow to be happy, productive adults.
In August 2016, Allegheny County Pennsylvania (which includes Pittsburgh) became the first US jurisdiction to use a predictive algorithm to screen every call to the child abuse and neglect hotline. In a brilliant article for the New York Times Magazine, science writer Dan Hurley clearly explains how the tool works and how it changes current practice. Hurley’s account suggests that Allegheny’s experience is a hopeful one for the county and for children nationwide.
Hurley introduces the Allegheny Family Screening Tool, an algorithm developed by leading child welfare researchers in concert with DHS policymakers. To develop the algorithm, the authors analyzed all referrals made to the county child abuse hotline between April 2010 and April 2014. For each referral, the authors combined child welfare data with data from the county jail, juvenile probation, public welfare, and behavioral health programs to develop a model predicting the risk of an adverse outcome for each child named on each referral. (A more technical description is provided by the authors here.) The end product was an algorithm that calculates a risk score between 1 and 20 for each child included in a referral.
The policymakers and developers chose to use the algorithm to supplement, not supplant, the clinical judgment of hotline workers. Only if the score exceeds a certain threshold does it trigger a mandatory investigation; below that level, the risk score provides another piece of data to help the hotline worker decide whether to assign the case for investigation.
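In rough pseudocode terms, the decision logic described above might look like the following minimal Python sketch. The threshold value, function name, and return labels are illustrative assumptions for exposition only, not Allegheny County's actual implementation:

```python
# Illustrative sketch of a screening tool used to supplement, not replace,
# a hotline worker's clinical judgment. The threshold and all names here
# are hypothetical, not the county's actual code or cutoff.

MANDATORY_THRESHOLD = 18  # assumed cutoff; real scores run from 1 to 20

def screening_decision(risk_score: int, worker_would_screen_in: bool) -> str:
    """Return the screening outcome for one referral.

    At or above the threshold, investigation is mandatory regardless of
    the worker's judgment; below it, the score is only one more piece of
    data, and the worker's clinical judgment decides.
    """
    if risk_score >= MANDATORY_THRESHOLD:
        return "screen in (mandatory investigation)"
    return "screen in" if worker_would_screen_in else "screen out"
```

The key design point is that the algorithm can only add investigations at the high end; it never overrides a worker's decision to investigate a low-scoring referral.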
Among the most important takeaways from Hurley’s article are the following:
Before the development of the new algorithm, Allegheny County had experienced a series of tragedies in which children died after maltreatment reports had been made to the hotline but screened out. The problem was not incompetence or poor training. Hotline workers simply cannot, within the 30 minutes to one hour allowed for a decision, review all the historical data on every family member across the numerous agencies with which the family may have had contact.
Evaluation data shared with the reporter show that implementation of the Allegheny Family Screening Tool resulted in more high-risk cases being screened in and more low-risk cases being screened out. Hurley provides a real case example. A teacher reported that a three-year-old child witnessed a man dying of an overdose in her home. Department records showed numerous reports to the hotline dating back to 2008 about this family, including allegations of sexual abuse, domestic violence, parental substance abuse, inadequate food, poor physical care and hygiene, and medical neglect. Nevertheless, the hotline worker was poised to screen out the case as low risk. The tool, however, calculated a risk rating of 19 out of 20, causing an investigator to go out to the home. Eventually, the mother was found to be unable to care for the children due to her continuing drug abuse, and they were placed with family members, where they are doing well.
County officials were astute in awarding the contract to develop a predictive algorithm. Several other jurisdictions have gone with private companies such as Eckerd Connects and its for-profit partner Mindshare, which has a predictive analytics tool called Rapid Safety Feedback (RSF). The details of RSF are closely held by the companies, and the state of Illinois recently terminated its contract because the owners refused to share those details, even after the algorithm failed to flag some children who later died. The Allegheny Family Screening Tool, by contrast, is owned by the county. Its workings are public and have been published in academic journals. Moreover, its developers, Emily Putnam-Hornstein and Rhema Vaithianathan, are acknowledged as the worldwide leaders in their field, with extensive publications and experience in doing similar work.
County officials were also astute in developing and rolling out their model. They held public meetings before implementing the tool, giving advocates a chance to interact with the researchers and policymakers. Choosing to use the tool at the hotline stage rather than a later step such as investigation made it less threatening, as the tool is not used as input on whether to remove a child, simply whether to investigate. In addition, the county commissioned an ethics review by two experts before implementing the tool. The reviewers concluded not only that the tool was ethical, but that it might be unethical not to implement it: "It is hard to conceive of an ethical argument against use of the most accurate predictive instrument."
Many opponents of predictive analytics argue that it institutionalizes racial bias by incorporating data that is itself biased. Supporters have argued that predictive algorithms reduce bias by adding objective algorithms to subjective worker judgments. Preliminary data from Pittsburgh supports the proponents, suggesting that the algorithm has resulted in more equal treatment of black and white families.
Other jurisdictions are already emulating Allegheny County. Douglas County, Colorado has already commissioned Putnam-Hornstein and Vaithianathan to develop an algorithm and California has contracted with them for preliminary statewide work.
Given the Allegheny County algorithm’s promising results, one cannot help wondering whether a similar algorithm should be used at later stages of a case as well. A similar tool could be very useful in aiding investigators in deciding the next step in a case. Such a tool would of course trigger an outcry if used to decide whether to remove a child from home. But like the Allegheny County screening tool, such an algorithm can be used to supplement clinical judgment rather than replace it. Policymakers need not set any level that would trigger a mandatory removal. However, they could set a risk level that requires opening a case, be it out-of-home or in-home. Many children in many states have died when agencies failed to open a case despite high risk scores on existing instruments. Algorithms can also be used to monitor ongoing in-home cases, as Rapid Safety Feedback has demonstrated. Perhaps if and when predictive algorithms are proven to be effective at protecting children, they will be integrated into multiple stages and decision points, like the actuarial risk assessments that many states use today.
Identifying the children most at risk of harm by their parents or guardians has been one of the knottiest problems of child welfare. Allegheny County’s experience, as portrayed by Dan Hurley’s excellent article, provides hope that emerging predictive analytics techniques can improve government’s ability to identify these most vulnerable children and keep them safe.