Launched in August 2016, the AFST is a predictive risk model designed by New Zealand researchers and implemented through independent corporate strategists in Allegheny County, Pennsylvania, replacing a system that relied on human decision-making with one that rigidly adheres to algorithmic rules and sidelines social work professionals. The AFST uses data graded along a risk/severity continuum to determine whether a family will be investigated for child abuse by a county social worker. Since its implementation, the AFST has proved inaccurate at best and biased at worst. Its algorithm equates symptoms of poverty with those of child abuse: “A quarter of the predictive variables in the AFST are direct measures of poverty: they track use of means-tested programs such as TANF, Supplemental Security Income, SNAP, and county medical assistance” (156). By treating these measures of poverty as indicators of risk, the AFST builds bias directly into its own dataset.
Los Angeles’s coordinated entry system (CES) collects, stores, shares, catalogs, classifies, and numerically ranks information about the unhoused. Designed to triage those in immediate need of housing, CES should be “a standardized intake process to reduce waste, redundancy, and double-dipping across agencies” (85), but in practice it often falls short and is difficult to navigate. Moreover, CES poses a threat to the very people it claims to help because it so deeply invades individuals’ privacy.