
Apr 20, 2021

FTC warns it could crack down on biased AI

Posted in categories: economics, employment, habitats, information science, law enforcement, robotics/AI

AI systems can lead to racial or gender discrimination.


The US Federal Trade Commission has warned companies against using biased artificial intelligence, saying they may be breaking consumer protection laws. A new blog post from the agency notes that AI tools can reflect “troubling” racial and gender biases. If those tools are applied in areas like housing or employment, falsely advertised as unbiased, or trained on data gathered deceptively, the agency says it could intervene.
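To make that concern concrete, one common way to surface this kind of bias is to compare a model's approval rates across demographic groups. The sketch below is purely illustrative and does not come from the FTC post: the group labels, data, and 0.8 threshold are hypothetical, loosely following the informal "four-fifths" disparate-impact rule used in employment contexts.

```python
# Hypothetical sketch: compare a model's approval rates across groups.
# All names, data, and thresholds here are illustrative assumptions.
from collections import defaultdict

def selection_rates(decisions):
    """decisions: iterable of (group, approved) pairs -> approval rate per group."""
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += int(ok)
    return {g: approved[g] / totals[g] for g in totals}

def disparate_impact_flags(rates, threshold=0.8):
    """Flag groups whose approval rate falls below `threshold` times the
    highest group's rate (the informal 'four-fifths rule')."""
    best = max(rates.values())
    return {g: rate / best < threshold for g, rate in rates.items()}

# Toy example with made-up numbers
decisions = [("A", True)] * 80 + [("A", False)] * 20 + \
            [("B", True)] * 50 + [("B", False)] * 50
rates = selection_rates(decisions)
print(rates)                          # {'A': 0.8, 'B': 0.5}
print(disparate_impact_flags(rates))  # {'A': False, 'B': True}
```

A check like this only measures outcome gaps; it says nothing on its own about why the gaps exist, which is part of why regulators focus on how the tools are marketed and what data they are trained on.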

“In a rush to embrace new technology, be careful not to overpromise what your algorithm can deliver,” writes FTC attorney Elisa Jillson — particularly when promising decisions that don’t reflect racial or gender bias. “The result may be deception, discrimination — and an FTC law enforcement action.”

As Protocol points out, FTC chair Rebecca Slaughter recently called algorithm-based bias “an economic justice issue.” Slaughter and Jillson both mention that companies could be prosecuted under the Equal Credit Opportunity Act or the Fair Credit Reporting Act for biased and unfair AI-powered decisions, and unfair and deceptive practices could also fall under Section 5 of the FTC Act.
