Georgetown Law Journal, No. 113-1, October 2024

Less Discriminatory Algorithms

EMILY BLACK*†, JOHN LOGAN KOEPKE**†, PAULINE T. KIM***, SOLON BAROCAS****
& MINGWEI HSU*****
In discussions about algorithms and discrimination, it is often assumed
that machine learning techniques will identify a unique solution to any
given prediction problem, such that any attempt to develop less discrimi-
natory models will inevitably entail a tradeoff with accuracy. Contrary to
this conventional wisdom, however, computer science has established
that multiple models with equivalent performance exist for a given pre-
diction problem. This phenomenon, termed model multiplicity, suggests
that when an algorithmic system displays a disparate impact, there
almost always exists a less discriminatory algorithm (LDA) that performs
equally well. But without dedicated exploration, developers are unlikely
to discover potential LDAs. These observations have profound ramifica-
tions for the legal and policy response to discriminatory algorithms.
Because the overarching purpose of our civil rights laws is to remove ar-
bitrary barriers to full participation by marginalized groups in the
nation’s economic life, the law should place a duty to search for LDAs
on entities that develop and deploy predictive models in domains covered
by civil rights laws, like housing, employment, and credit. The law should
recognize this duty in at least two specific ways. First, under disparate
impact doctrine, a defendant’s burden of justifying a model with discrim-
inatory effects should include showing that it made a reasonable search
for LDAs before implementing the model. Second, new regulatory frame-
works for the governance of algorithms should include a requirement
that entities search for and implement LDAs as part of the model building
process.
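The core technical claim here, that many models with equivalent performance can differ in their disparate impact, can be illustrated with a small synthetic sketch. Everything below is hypothetical and constructed solely for illustration: the lending data, the candidate threshold rules, and the one-point accuracy tolerance are assumptions, not the authors' methodology.

```python
# Toy illustration of model multiplicity. A family of simple linear threshold
# rules is evaluated on synthetic lending data; several rules achieve
# near-identical accuracy yet approve members of two groups at different
# rates, so a developer who searches the near-best set can select the least
# discriminatory one (an "LDA").
import random

random.seed(0)

def make_data(n=2000):
    """Synthetic applicants: (income, tenure, group, repaid)."""
    rows = []
    for _ in range(n):
        group = random.choice("AB")
        income = random.gauss(55 if group == "A" else 50, 10)
        tenure = random.gauss(5, 2)
        repaid = income + 4 * tenure + random.gauss(0, 8) > 75
        rows.append((income, tenure, group, repaid))
    return rows

data = make_data()

# Candidate models: approve if w1*income + w2*tenure >= threshold.
candidates = [(w1, w2, t)
              for w1 in (0.9, 1.0, 1.1)
              for w2 in (3.0, 4.0, 5.0)
              for t in range(60, 91, 5)]

def evaluate(model):
    """Return (accuracy, approval-rate disparity between groups A and B)."""
    w1, w2, t = model
    correct, approved, count = 0, {"A": 0, "B": 0}, {"A": 0, "B": 0}
    for income, tenure, group, repaid in data:
        pred = w1 * income + w2 * tenure >= t
        correct += pred == repaid
        approved[group] += pred
        count[group] += 1
    acc = correct / len(data)
    disparity = abs(approved["A"] / count["A"] - approved["B"] / count["B"])
    return acc, disparity

results = [(m, *evaluate(m)) for m in candidates]
best_acc = max(acc for _, acc, _ in results)

# "Equally performing" set: models within one accuracy point of the best.
near_best = [(m, acc, d) for m, acc, d in results if acc >= best_acc - 0.01]
lda = min(near_best, key=lambda r: r[2])  # least disparity among them

print(f"near-best models: {len(near_best)}, best accuracy: {best_acc:.3f}")
print(f"disparity among near-best models ranges from "
      f"{min(r[2] for r in near_best):.3f} to {max(r[2] for r in near_best):.3f}")
```

The sketch shows only that the least discriminatory near-best rule is no worse on disparity than any other near-best rule; in practice, the search spans many more choices in the model-development pipeline than a grid of thresholds.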
* Assistant Professor, Department of Computer Science and Engineering and the Center for Data
Science, New York University. © 2024, Emily Black, John Logan Koepke, Pauline T. Kim, Solon
Barocas & Mingwei Hsu.
** Project Director, Upturn.
*** Daniel Noyes Kirby Professor of Law, Washington University School of Law, St. Louis,
Missouri.
**** Principal Researcher, Microsoft Research; Adjunct Associate Professor, Information Science,
Cornell University.
***** Senior Quantitative Analyst, Upturn.
† Equal contribution.
†† The authors would like to thank the following individuals for their helpful feedback: Olga
Akselrod, Elizabeth Edenberg, Talia Gillis, Stephen Hayes, Daniel Jellins, Cynthia Khoo, Michael
McGovern, Paul Ohm, Catherine Powell, Manish Raghavan, Matthew Scherer, Andrew Selbst, Ridhi
Shetty, Eric Sublett, Dan Svirsky, Suresh Venkatasubramanian, and the staff of Upturn. The authors are
also grateful to the participants at the 2023 Privacy Law Scholars Conference, the participants at the
Law & Technology Workshop, the Washington University School of Law faculty workshop for their
comments, to Julia Monti and Kelly Miller for excellent research assistance, and to the Editors at The
Georgetown Law Journal for their careful and thoughtful editorial work to bring this Article to print.
TABLE OF CONTENTS
INTRODUCTION . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
I. A MULTITUDE OF POSSIBLE MODELS . . . . . . . . . . . . . . . . . . . . . 61
   A. MODEL MULTIPLICITY . . . . . . . . . . . . . . . . . . . . . . . . . 62
   B. DEFINING LESS DISCRIMINATORY ALGORITHMS . . . . . . . . . . . . . . 65
   C. MODEL MULTIPLICITY IN PRACTICE . . . . . . . . . . . . . . . . . . . 67
      1. Searching Through the Pipeline . . . . . . . . . . . . . . . . . 67
      2. Practical Examples of Searching for LDAs . . . . . . . . . . . . 69
      3. Joint Optimization of Fairness and Performance . . . . . . . . . 70
II. DISPARATE IMPACT DOCTRINE AND LESS DISCRIMINATORY
   ALTERNATIVES . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
   A. THE DISPARATE IMPACT FRAMEWORK . . . . . . . . . . . . . . . . . . . 72
   B. WHO BEARS THE BURDEN? . . . . . . . . . . . . . . . . . . . . . . . 74
   C. WHAT IS A LESS DISCRIMINATORY ALTERNATIVE? . . . . . . . . . . . . . 79
III. WHAT MODEL MULTIPLICITY MEANS FOR THE LAW . . . . . . . . . . . . . . 85
   A. DUTY TO SEARCH . . . . . . . . . . . . . . . . . . . . . . . . . . . 85
   B. MODEL MULTIPLICITY AND DISPARATE IMPACT DOCTRINE . . . . . . . . . . 88
   C. MODEL MULTIPLICITY AND REGULATORY GOVERNANCE . . . . . . . . . . . . 94
IV. A CASE STUDY: THE UPSTART FAIR LENDING MONITORSHIP AND
   MODEL MULTIPLICITY . . . . . . . . . . . . . . . . . . . . . . . . . . 96
V. THE DUTY TO SEARCH FOR AND IMPLEMENT LDAS IN PRACTICE . . . . . . . . . 99
   A. BASIC REQUIREMENTS OF THE DUTY . . . . . . . . . . . . . . . . . . . 99
   B. REASONABLE STEPS . . . . . . . . . . . . . . . . . . . . . . . . . . 101
   C. SEARCHING FOR LDAS IN PRACTICE . . . . . . . . . . . . . . . . . . . 103
      1. General Methodology to Search for LDAs . . . . . . . . . . . . . 103
      2. Examples of Interventions . . . . . . . . . . . . . . . . . . . 104
   D. COSTS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 109
VI. LIMITATIONS AND POTENTIAL OBJECTIONS . . . . . . . . . . . . . . . . . 110
   A. CONTEXT-SPECIFIC CONSIDERATIONS . . . . . . . . . . . . . . . . . . 111
   B. QUESTIONS OF ACCURACY . . . . . . . . . . . . . . . . . . . . . . . 113
   C. LEGAL CONCERNS . . . . . . . . . . . . . . . . . . . . . . . . . . . 115
CONCLUSION . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 119
INTRODUCTION
Companies now routinely deploy algorithmic systems as part of their basic
business operations to determine who gets access to critical opportunities and
resources. These developments have worried legal scholars and civil rights advo-
cates, who are concerned that algorithms may reflect or reinforce existing societal
biases.1
1. See, e.g., Pauline T. Kim, Auditing Algorithms for Discrimination, 166 U. PA. L. REV. ONLINE 189,
196 (2017); Haley Moss, Screened Out Onscreen: Disability Discrimination, Hiring Bias, and Artificial
Intelligence, 98 DENV. L. REV. 775, 783 (2021); Alicia Solow-Niederman, Administering Artificial
Intelligence, 93 S. CAL. L. REV. 633, 689 (2020); William Magnuson, Artificial Financial Intelligence, 10
HARV. BUS. L. REV. 337, 354–55 (2020); Kristin Johnson, Frank Pasquale & Jennifer Chapman, Artificial
Intelligence, Machine Learning, and Bias in Finance: Toward Responsible Innovation, 88 FORDHAM L.
REV. 499, 502 (2019); Crystal S. Yang & Will Dobbie, Equal Protection Under Algorithms: A New
Statistical and Legal Framework, 119 MICH. L. REV. 291, 294 (2020); Margaret Hu, Algorithmic Jim
Crow, 86 FORDHAM L. REV. 633, 638 (2017); Civil Rights Principles for the Era of Big Data, LEADERSHIP
CONF. ON CIV. & HUM. RTS. (Feb. 27, 2014), https://civilrights.org/2014/02/27/civil-rights-principles-era-
big-data/ [https://perma.cc/N4E2-BBEP]; ACLU et al., Principles, CIV. RTS. PRIV. & TECH. TABLE (2020)
[https://perma.cc/9REC-JDU7]; Letter from Upturn, ACLU & Leadership Conf. on Civ. & Hum. Rts. to
Dr. Eric S. Lander, Dr. Lynne Parker & Dr. Alondra Nelson (July 13, 2021), https://www.upturn.org/work/
proposals-for-the-biden-administration-to-address-technologys-role-in/ [https://perma.cc/L8JU-VMXY];
Letter from Leadership Conf. on Civ. & Hum. Rts to Ambassador Susan Rice (Oct. 27, 2021), https://
civilrights.org/resource/letter-to-ambassador-rice-on-civil-rights-and-ai [https://perma.cc/5UE7-WXPZ].
From tenant screening systems to employment assessment and hiring
technologies to credit underwriting models, reliance on these tools raises con-
cerns that they will discriminate against or exclude historically marginalized
groups.2
2. See, e.g., Khari Johnson, Algorithms Allegedly Penalized Black Renters. The US Government Is
Watching, WIRED (Jan. 16, 2023, 7:00 AM), https://www.wired.com/story/algorithms-allegedly-
penalized-black-renters-the-us-government-is-watching/ (discussing biases in tenant screening systems);
Miranda Bogen, All the Ways Hiring Algorithms Can Introduce Bias, HARV. BUS. REV. (May 6, 2019),
https://hbr.org/2019/05/all-the-ways-hiring-algorithms-can-introduce-bias [https://perma.cc/7MB9-2568]
(discussing bias in employment assessment and hiring technologies); Will Douglas Heaven, Bias Isn’t the
Only Problem with Credit Scores – and No, AI Can’t Help, MIT TECH. REV. (June 17, 2021), https://www.
technologyreview.com/2021/06/17/1026519/racial-bias-noisy-data-credit-scores-mortgage-loans-fairness-
machine-learning/ [https://perma.cc/47YW-X5JH] (discussing bias in credit underwriting models).
These concerns have stimulated a vast scholarship on how the law
should respond. One strand of the literature focuses on existing civil rights law
and how it applies to new technologies, debating whether disparate impact
doctrine is adequate to meet the challenges posed by algorithmic tools.3 Another
3. See, e.g., Solon Barocas & Andrew D. Selbst, Big Data’s Disparate Impact, 104 CALIF. L. REV.
671, 701–12 (2016); Pauline T. Kim, Data-Driven Discrimination at Work, 58 WM. & MARY L. REV.
857, 903–16 (2017); Michael Selmi, Algorithms, Discrimination and the Law, 82 OHIO ST. L.J. 611,
