Colin Lecher, a reporter at The Markup, says the city's Administration for Children's Services uses factors such as a family's neighborhood, mental and physical health history, and whether the family has had previous reports to flag families for closer scrutiny.
The New York City Administration for Children's Services, or ACS, has been using predictive artificial intelligence to flag some families for greater scrutiny, according to a recent investigation by The Markup.
It found the agency uses machine learning algorithms to score families for risk. The system weighs 279 variables, ranging from past involvement with ACS to socioeconomic factors such as neighborhood, and the department keeps closer tabs on families deemed highest risk. Colin Lecher reported the story and tells Marketplace’s Meghan McCarty Carino that, like all AI systems, it can encode historical biases.
“The NYC algorithm deciding which families are under watch for child abuse” from The Markup
“Not all AI is, well, AI” from Marketplace Tech