From pretrial decision-making to decisions about which families are separated from their children, algorithms have been embedded in public and private decision-making systems for years. In this time of crisis, those algorithms pose an even greater risk of embedding massive bias into the systems they power.
Building on the work we and many others have done in the criminal legal world around pretrial risk assessment algorithms, MAP is now centering our attention on how predictive analytics amplify the racism and oppression inside "family regulation," commonly known as the child welfare system. Black and Brown people, particularly poor moms and children, are disproportionately impacted by and overrepresented within the family regulation system. The People's Algorithmic Power Project, or P3A, seeks to support the fight to end family regulation by building power with those most directly impacted by these systems and by shining a light on the abuses of the underlying structures, challenging unjust automated decision-making practices within the system.
Despite claims to the contrary, increasing and refining methods of surveillance and automating judgment does not create a future where these systems no longer participate in racial and economic oppression. It simply obscures these injustices behind the black box of often-proprietary software. Tools are often carelessly deployed, and even when best practices are followed and ethical considerations are made, the tools produced still work toward increasing the efficiency of separating poor, largely Black and Brown families. Through our work, we are developing a rich understanding of the state of child welfare and algorithms, working with impacted communities, organizers, lawyers, and data scientists to build local power over the use of automated decision-making, first in the family regulation system and then in other key human decision-making spaces.