The ACLU has long argued that the era of big data could exacerbate existing societal discrimination, and last week, the Federal Trade Commission acknowledged this risk in a report titled “Big Data: A Tool for Inclusion or Exclusion?” While the report shows the important role the federal government must play in enforcing civil rights laws online, the need for robust auditing by companies and external groups remains as critical as ever.
The report recognizes that while big data analytics can create opportunities for low-income or marginalized communities by more effectively matching them with goods and services, it is not inherently a force for good, and can in fact reinforce racial, gender, and other disparities.
We are seeing the early stages of research on big data and the new possibilities for discrimination it brings, including in areas covered by civil rights law such as employment, housing, lending, and education. For example, behavioral targeting is one way that businesses decide which ads to show a given person online, but such targeting runs the risk that people will be shown different (and potentially inferior) deals and products on the basis of race. This could happen either through intentional targeting or because the data used to target is closely linked to race. These concerns are not theoretical: a recent study of Google ads found that ads for high-paying jobs were disproportionately shown to men over women. As more companies employ sophisticated, often opaque algorithms to determine who gets a house, a job, or a loan, it becomes all the more important to enforce laws against discrimination online as well as off.
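To make the proxy problem concrete, here is a minimal Python sketch, using entirely synthetic, hypothetical data, of how a targeting rule that never looks at race can still produce racially skewed outcomes when it keys on a correlated feature such as zip code:

```python
import random

random.seed(0)

def make_person():
    """Synthetic person: race is correlated with zip code, a proxy feature."""
    race = random.choice(["A", "B"])
    if race == "B":
        zip_code = "2" if random.random() < 0.9 else "1"  # 90% of group B in zip 2
    else:
        zip_code = "1" if random.random() < 0.9 else "2"  # 90% of group A in zip 1
    return race, zip_code

population = [make_person() for _ in range(10_000)]

def shown_premium_offer(zip_code):
    """A nominally race-blind targeting rule: premium offer only in zip 1."""
    return zip_code == "1"

# Audit: what share of each group actually sees the premium offer?
for group in ("A", "B"):
    zips = [z for race, z in population if race == group]
    rate = sum(shown_premium_offer(z) for z in zips) / len(zips)
    print(f"group {group}: {rate:.0%} shown the premium offer")
# Prints roughly 90% for group A and 10% for group B, even though
# the rule never consulted race directly.
```

The rule is "blind" in the narrow sense that race never appears as an input, yet the outcome is nearly as skewed as if it had been.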
The ACLU, in its comment to the FTC on this issue, highlighted the necessity of enforcing the Equal Credit Opportunity Act (“ECOA”) in online advertising and marketing practices. It is heartening to see the FTC take these concerns seriously by calling on companies doing business online to pay heed to laws such as the ECOA, the Fair Housing Act, the Americans with Disabilities Act, and other civil rights laws that protect against discrimination. It is not enough for companies to say they don’t intend to discriminate when designing their data analytics models, because a variety of factors could nonetheless lead to discriminatory outcomes, including flawed or unrepresentative data, inaccurate predictions, and hidden biases in the model itself.
The FTC’s stated commitment to monitor big data practices for violations of civil rights laws and bring enforcement actions where appropriate is an important first step in ensuring big data is not a force for discrimination. But it is still incumbent on companies to incorporate best practices and conduct robust audits of their use of big data and algorithms, given how much about these systems remains unknown.
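One simple, widely used audit of the kind called for above is a disparate impact check modeled on the EEOC’s four-fifths guideline: compare the rate of favorable outcomes across groups and flag large gaps for review. A minimal sketch, with hypothetical group labels and decision data:

```python
def disparate_impact_ratio(outcomes_by_group):
    """Ratio of the lowest group selection rate to the highest (1 = favorable)."""
    rates = {g: sum(v) / len(v) for g, v in outcomes_by_group.items()}
    return min(rates.values()) / max(rates.values()), rates

# Hypothetical audit data: 1 = favorable decision (e.g., ad shown, loan approved).
decisions = {
    "group_A": [1, 1, 1, 0, 1, 1, 0, 1, 1, 1],  # 80% favorable
    "group_B": [1, 0, 0, 0, 1, 0, 0, 1, 0, 0],  # 30% favorable
}

ratio, rates = disparate_impact_ratio(decisions)
print(rates)                         # {'group_A': 0.8, 'group_B': 0.3}
print(f"impact ratio: {ratio:.2f}")  # 0.38, well below the 0.8 benchmark
if ratio < 0.8:
    print("flag for review: below the four-fifths benchmark")
```

A check like this cannot prove discrimination on its own, but it is the sort of inexpensive, routine measurement that makes hidden disparities visible before regulators, or affected communities, find them first.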