<h1>FTC warns it could crack down on biased AI</h1>
<p><em>Published 2021-04-20 at <a href="https://lifeboat.com/blog/2021/04/ftc-warns-it-could-crack-down-on-biased-ai">lifeboat.com</a></em></p>
<p>AI systems can lead to race or gender discrimination.</p>
<hr>
<p>The US Federal Trade Commission has warned companies against using biased artificial intelligence, saying they may break consumer protection laws. <a href="https://www.ftc.gov/news-events/blogs/business-blog/2021/04/aiming-truth-fairness-equity-your-companys-use-ai">A new blog post</a> notes that AI tools can reflect &#8220;troubling&#8221; racial and gender biases. If those tools are applied in areas like housing or employment, falsely advertised as unbiased, or trained on data that was gathered deceptively, the agency says it could intervene.</p>
<p>&#8220;In a rush to embrace new technology, be careful not to overpromise what your algorithm can deliver,&#8221; writes FTC attorney Elisa Jillson, particularly when promising decisions that don&#8217;t reflect racial or gender bias. &#8220;The result may be deception, discrimination, and an FTC law enforcement action.&#8221;</p>
<p><a href="https://www.protocol.com/ftc-bias-ai">As <em>Protocol</em> points out</a>, FTC chair Rebecca Slaughter recently called algorithm-based bias &#8220;an economic justice issue.&#8221; Slaughter and Jillson both note that companies could be prosecuted under the Equal Credit Opportunity Act or the Fair Credit Reporting Act for biased and unfair AI-powered decisions, and that unfair and deceptive practices could also fall under Section 5 of the FTC Act.</p>