News Analysis: Anthony, Noah, Gabriel and beyond: How to fix L.A. County DCFS
From left, Anthony Avalos, Gabriel Fernandez and Noah Cuatro.
In the long, troubled history of L.A. County child abuse cases, certain names stand out as avatars of how the system can go terribly awry. Anthony Avalos. Gabriel Fernandez. Noah Cuatro.
But since the spring of 2020, another name has wielded outsize influence over national perspectives and policies related to child welfare, and energized activists to push for sweeping reforms: George Floyd.
The murder of the Black Minneapolis resident by a police officer in May 2020 set off a national soul-searching over the country’s racist past and the prejudices that still haunt its institutions. In L.A. County, that process has focused intense scrutiny on what a number of racial justice advocates and elected officials say is an implicit bias that may make some Department of Children and Family Services workers more prone to regard poor families and parents of color as unfit to raise their children.
In 2020, three-quarters of children removed from their homes in L.A. County were Latino or Black, according to a motion — authored by Supervisor Holly Mitchell and passed in July by the Board of Supervisors — to begin implementing a controversial pilot project called “blind removal.”
The program, first adopted in Nassau County on Long Island in New York, redacts all race and race-related factors from the dossiers used by social workers and supervisors in determining child welfare cases. And it is gaining popularity, despite critics who say the evidence for its efficacy is thin and that it adds one more task to an overtaxed workforce.
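The redaction step at the heart of "blind removal" can be illustrated with a short sketch. The field names and case data below are hypothetical — neither Nassau County's nor L.A. County's actual dossier format is public in this form — but the idea is simply that race and race-correlated identifiers are stripped before the case reaches the review committee:

```python
# Illustrative sketch of a "blind removal" redaction step.
# Field names are hypothetical, not the counties' actual dossier schema.

# Fields that could directly or indirectly signal a family's race.
REDACTED_FIELDS = {"race", "ethnicity", "name", "neighborhood", "home_address"}

def blind_dossier(dossier: dict) -> dict:
    """Return a copy of the case dossier with race-related fields removed."""
    return {k: v for k, v in dossier.items() if k not in REDACTED_FIELDS}

case = {
    "case_id": "A-1042",
    "name": "Jane Doe",
    "race": "Black",
    "neighborhood": "Lancaster",
    "prior_hotline_calls": 3,
    "allegation": "neglect",
}
# Reviewers would see only case_id, prior_hotline_calls and allegation.
print(blind_dossier(case))
```

The committee then decides whether to remove the child based only on the remaining, race-blind facts of the case.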
“For decades, Black and brown children have been substantially overrepresented in L.A. County’s child welfare system, and it’s past time for us to change that,” Mitchell said at the time.
(Mitchell will take part this evening at 7 in a free Los Angeles Times virtual event, “Ask a Reporter: The future of L.A.’s child welfare system,” with investigative reporters Garrett Therolf of UC Berkeley and Matt Hamilton of The Times.)
Then, just two weeks ago, the Board of Supervisors unanimously passed a motion to step up efforts to provide interpreters for Indigenous families in the foster care system and bolster its Asian Pacific and Native American programs.
But while officials scramble to address these race-related concerns, other child welfare experts assert that another, relatively new methodology using machine learning and algorithms is more likely to yield race-neutral and reliable results that, among other benefits, will enable social workers to accurately identify incidents of child abuse at an early stage and move swiftly to intervene.
The challenge for each of these tools is to narrow the front door of the system so that the agency no longer interferes with Black and Latino families whose situations don’t warrant it, while at the same time ensuring that children living in real peril get the attention they need.
In an average month in L.A. County, at least one child whose family has a history of involvement with DCFS dies of abuse or neglect. And with nearly equal frequency, the Board of Supervisors orders a set of recommendations to prevent such a tragedy from happening again — until it does.
The tragic pattern repeated itself with Noah Cuatro, the Palmdale 4-year-old whose parents are accused of torturing and murdering him. After the boy’s death in 2019, Bobby Cagle, then director of DCFS, finally moved toward implementing machine learning and algorithms, which some experts had advocated for years.
Rather than relying on caseworkers’ limited ability to weigh a family’s full recorded case history — due to limited time and cumbersome technology — experts had urged the county to partially automate risk analysis with a new generation of predictive analytics tools to scan and evaluate hundreds of known variables regarding families, including prior hotline calls and the child’s age when the first hotline call was received.
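At its core, a predictive risk tool of this kind combines weighted case-history variables into a single score that can flag a case for closer review. The sketch below is a minimal illustration only: the two input variables (prior hotline calls, the child's age at the first call) come from the description above, but the logistic form, the weights and the flagging threshold are assumptions, not the county's actual model:

```python
import math

# Minimal, illustrative risk-scoring sketch — NOT the model L.A. County uses.
# Weights and bias are made-up values chosen only to show the mechanics.
WEIGHTS = {"prior_hotline_calls": 0.6, "age_at_first_call": -0.3}
BIAS = -2.0

def risk_score(features: dict) -> float:
    """Logistic score in (0, 1); a higher score flags the case for review."""
    z = BIAS + sum(w * features.get(name, 0.0) for name, w in WEIGHTS.items())
    return 1.0 / (1.0 + math.exp(-z))

# A very young child with several prior hotline calls scores higher
# than an older child with none.
print(risk_score({"prior_hotline_calls": 6, "age_at_first_call": 1}))
print(risk_score({"prior_hotline_calls": 0, "age_at_first_call": 10}))
```

In practice, the county's tool reportedly draws on hundreds of such variables, and — as Putnam-Hornstein notes below — its output is advisory, to be set aside when an investigation finds no significant safety threat.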
The use of such tools to predict which children are at greatest risk has attracted controversy because of the chance that they might exacerbate racial disparities in child welfare. That’s because Black and Latino families tend to interact more frequently with entities such as public hospitals and mandated reporters who generate the data that are used to train the algorithm on how to detect risk. Pushback from the American Civil Liberties Union and others had stalled the development of the program for years, contributing to the decision by Cagle’s predecessor, Philip Browning, to retire in frustration.
But Cagle gained the majority support of L.A. County’s supervisors that had eluded Browning, and the tool was piloted last summer to help flag children like Noah who may be at the highest risk. (Cagle resigned in November, as DCFS faced mounting criticism over a series of fatalities and abuse of children under the agency’s care.)
A lead designer of predictive analytics, Emily Putnam-Hornstein at the University of North Carolina, emphasizes that the tool is designed to be advisory and can be easily set aside by caseworkers if their investigation verifies that no significant safety threats exist.
In a slide show presentation explaining the need for tools like L.A. County’s, Putnam-Hornstein wrote that her work stems from a “growing appreciation that current tools are inadequate, clinicians are poor at weighting factors (and time is scarce!).” An independent evaluation team will ultimately decide whether the tool helps.
The pilot program was implemented at DCFS’ offices in Lancaster, Santa Fe Springs and Belvedere, and used in three major ways, according to a news release issued last year by the Centre for Social Data Analytics, the New Zealand-based team that helped develop the model for L.A. County.
One tool focuses on complex cases and provides prompts to social workers that may assist an investigation, such as involving additional staff or consulting a more senior supervisor. Another tool provides an “investigation overview report,” which offers a visual rendering “of related child welfare history as a reference point” during an investigation.
A third component is a “quality assurance strategy” rather than an aid to specific investigations; it is aimed at identifying patterns in child abuse hotline screening practices that may lead to racial or socioeconomic bias.
Years of criticism
For decades, watchdogs and county insiders, such as the state auditor and a special counsel hired by the county to review child fatalities, have filed reports that largely agree on the root causes behind DCFS’ inability to use the power of its $2.4-billion budget to markedly reduce child fatalities. There have been promises of reform, but little action.
Other systemic flaws — not explicitly related to racial or socioeconomic factors, but rather to procedural myopia and institutional inertia — are regularly cited by critics to account for the county’s failures in preventing child abuse.
One near-universal assumption among social workers is that children belong with their own families in the absence of serious safety threats. But critics contend that this can lead caseworkers to reflexively declare success whenever a child remains with their parents, without taking full account of whether the child could be at risk.
“DCFS should change its messaging from Do Not Detain/Keep The Numbers Down,” wrote the Board of Supervisors’ special counsel in a secret internal 2012 report. Yet workers in the cases of Noah Cuatro, Gabriel Fernandez and others went on to finalize decisions to leave children in dangerous homes without even reading their own agency’s case file.
Research has shown that people who are not trained to assess and investigate child abuse are more successful than DCFS workers at predicting the most serious harms to children. Researchers have found that the number of calls made by the public to child protection hotlines was a better indicator of deadly risk than the conclusions of caseworker investigations.
Finally, the county’s own internal reports have repeatedly revealed that underperforming supervisors and executives often manage to hold on to their jobs without penalty.
In 2019, the state auditor concluded that, “although [DCFS] reviews the circumstances surrounding child deaths, the department does not have a process for ensuring that it implements the recommendations resulting from such reviews.”
The families of the dead have pressed the need for reform most fervently.
“We keep in constant contact. I call them my warrior family,” said Maria Barron, aunt of Anthony Avalos, the 10-year-old boy who was killed in 2018. “They know the pain I’m feeling. I feel the same way.”
Over the last decade, as the Board of Supervisors remained largely paralyzed in enacting effective reforms, hundreds of children whose cases cried out for help have died at the hands of their caregivers.
Those deaths were disproportionately concentrated in the Antelope Valley, including those of Gabriel, Anthony and Noah.
The region’s two offices, in Lancaster and Palmdale, are responsible for an area that is home to 1 out of 20 children in the county. Yet nearly 1 out of 5 victims of fatal abuse and neglect die in their jurisdiction, according to a review of case records for the more than 100 victims who died between 2011 and 2016.
One reason is that all of the agency’s core problems exist in this desert terrain. What’s more, the individual family cases tend to be more perilous: Rather than involving only one major risk factor (severe mental illness limiting a parent’s ability to provide care, domestic violence, refusal to respond to a child’s medical needs), often two or more are present.
Similarly, protective factors — such as nearby relatives who can provide support, and robust community resources for day care, mental health services and drug rehabilitation — tend to be scarcer, and the agency has fewer experienced caseworkers.
Complicating matters further is that the region has one of the highest shares of Black residents, so any misstep exacerbates concerns about racial justice.
Even when highly valued resources are available, success often depends on intangibles in short supply: a cooperative relationship between parents and professionals, as well as commitment by all parties to follow through.
In the case of Noah’s family, for instance, the child protective services workers successfully linked the family to clinicians trained in a practice called Parent-Child Interaction Therapy that has significant evidence to support its efficacy in protecting children at risk of abuse. The therapy focuses on the fractured bond between parent and child, as clinicians hidden behind a one-way mirror coach the parent through an earpiece on how to respond to their children in healthier ways.
But, like so many potential fixes, the therapy is difficult to carry out in situations like Noah’s where the relationship is adversarial between the parents and caseworkers, and the therapy course is ordered during infrequent court hearings and is not mutually designed by the family and clinicians.
Instead of feeling like a helping hand, “it can feel pretty punitive coming from” child protective services, said Susan Timmer, a psychologist at UC Davis who helped design the therapy.
“People, if they’re not happy with an intervention, they’ll drop out,” Timmer said. And Parent-Child Interaction Therapy, “like in a lot of more intensive parenting interventions, has a pretty high dropout rate: like 50% to 60%,” she continued.
Now, as the county prepares to select its next DCFS director, that person will have to confront a central problem that has undermined PCIT and so many other initiatives of the past: trust.
At a recent forum held by Fordham’s School of Law to discuss Los Angeles County’s use of machine learning, Ron Richter, the former director of New York City’s child welfare system, said that even “when we talk about a tool that may help reduce disproportionality and family regulation, sincere issues of trust surface, especially for those of us who have witnessed firsthand what child welfare looks like on the ground.”
That’s also true, Richter added, for “those who have been historically judged by this system and feel strongly that many children and families have been misjudged.”
Therolf is a reporter at the Investigative Reporting Program at UC Berkeley’s School of Journalism. IRP reporter Daniel Lempres contributed to this report.