The Scored Society: Due Process for Automated Predictions
Big Data is increasingly mined to rank and rate individuals. Predictive algorithms assess whether we are good credit risks, desirable employees, reliable tenants, valuable customers—or deadbeats, shirkers, menaces, and “wastes of time.” Crucial opportunities are on the line, including the ability to obtain loans, work, housing, and insurance. Though automated scoring is pervasive and consequential, it is also opaque and lacking oversight. In one area where regulation does prevail—credit—the law focuses on credit history, not the derivation of scores from data.

Procedural regularity is essential for those stigmatized by “artificially intelligent” scoring systems. The American due process tradition should inform basic safeguards. Regulators should be able to test scoring systems to ensure their fairness and accuracy. Individuals should be granted meaningful opportunities to challenge adverse decisions based on scores miscategorizing them. Without such protections in place, systems could launder biased and arbitrary data into powerfully stigmatizing scores.
Executive Summary
This article advocates for due process safeguards in the automated scoring systems used across many areas of life, such as credit, employment, and housing. The author argues that reliance on opaque, unregulated scoring systems can produce biased and arbitrary outcomes that stigmatize individuals. To address these concerns, the author proposes that regulators be empowered to test scoring systems for fairness and accuracy, and that individuals be given meaningful opportunities to challenge adverse decisions based on scores that miscategorize them. Drawing on the American due process tradition, the author seeks to establish basic safeguards against the misuse of automated scoring.
Key Points
- ▸ Automated scoring systems are pervasive and consequential, yet lack oversight and transparency.
- ▸ The existing regulatory framework for credit focuses on credit history, rather than on how scores are derived from data.
- ▸ Individuals should have meaningful opportunities to challenge adverse decisions based on scores miscategorizing them.
Merits
Strength
The article effectively highlights the need for due process safeguards in automated scoring systems, drawing upon the American due process tradition to inform its arguments.
Strength
The author's proposal for regulatory testing and individual challenge mechanisms offers a clear, actionable remedy for the concerns the article raises.
Demerits
Limitation
The article draws its central regulatory example from credit scoring, raising questions about how readily its arguments generalize to other domains of automated scoring, such as employment and tenant screening.
Limitation
The article does not provide a comprehensive analysis of the current regulatory landscape, which may limit its impact and effectiveness in influencing policy changes.
Expert Commentary
While the article raises important concerns about the lack of oversight and transparency in automated scoring systems, its proposed due process safeguards face practical hurdles. In particular, establishing regulatory testing and individual challenge mechanisms would require significant changes to existing regulatory frameworks and would likely meet resistance from industry stakeholders, who may claim scoring models as trade secrets. Nevertheless, the article's call for a more active regulatory role in governing automated scoring is timely and necessary, and its proposals offer a valuable starting point for future debate.
Recommendations
- ✓ Regulators should establish clear guidelines and standards for the development and deployment of automated scoring systems, including requirements for transparency, explainability, and bias testing.
- ✓ Individuals should have meaningful opportunities to challenge adverse decisions based on scores that miscategorize them, including appropriate access to the scoring models and testing processes behind those decisions.