Discrimination In The Age Of Algorithms

NBER Working Paper No. 25548

Issued in February 2019

NBER Program(s): Children, Development Economics, Economics of Education, Health Care, Health Economics, Law and Economics, Labor Studies, Public Economics



The law forbids discrimination. But the ambiguity of human decision-making often makes it extraordinarily hard for the legal system to know whether anyone has actually discriminated. To understand how algorithms affect discrimination, we must therefore also understand how they affect the problem of detecting discrimination. By one measure, algorithms are fundamentally opaque, not just cognitively but even mathematically. Yet for the task of proving discrimination, processes involving algorithms can provide crucial forms of transparency that are otherwise unavailable. These benefits do not happen automatically. But with appropriate requirements in place, the use of algorithms will make it possible to more easily examine and interrogate the entire decision process, thereby making it far easier to know whether discrimination has occurred. By forcing a new level of specificity, the use of algorithms also highlights, and makes transparent, central tradeoffs among competing values. Algorithms are not only a threat to be regulated; with the right safeguards in place, they have the potential to be a positive force for equity.

Document Object Identifier (DOI): 10.3386/w25548

Published: Jon Kleinberg, Jens Ludwig, Sendhil Mullainathan & Cass R. Sunstein, 2018. "Discrimination in the Age of Algorithms," Journal of Legal Analysis, vol. 10.