{"id":20704,"date":"2020-08-03T11:18:57","date_gmt":"2020-08-03T18:18:57","guid":{"rendered":"https:\/\/www.pbs.org\/independentlens\/?post_type=films&#038;p=20704"},"modified":"2025-01-14T11:17:23","modified_gmt":"2025-01-14T19:17:23","slug":"coded-bias","status":"publish","type":"films","link":"https:\/\/dipsy.pbs.org\/independentlens\/documentaries\/coded-bias\/","title":{"rendered":"Coded Bias"},"content":{"rendered":"<p><span style=\"font-weight: 400;\">In an increasingly data-driven, automated world, the question of how to protect individuals\u2019 civil liberties in the face of artificial intelligence looms larger by the day. <\/span><b><i>Coded Bias<\/i><\/b><span style=\"font-weight: 400;\"> follows M.I.T. Media Lab <\/span><span style=\"font-weight: 400;\">computer scientist<\/span><span style=\"font-weight: 400;\"> Joy Buolamwini, along with data scientists, mathematicians, and watchdog groups from all over the world, as they fight to expose the discrimination within algorithms now prevalent across all spheres of daily life.\u00a0<\/span><\/p>\n<p><span style=\"font-weight: 400;\">While conducting research on facial recognition technologies at the M.I.T. Media Lab, Buolamwini, a &#8220;poet of code,&#8221; made the startling discovery that some algorithms could not detect dark-skinned faces or classify women with accuracy. 
This led to the harrowing realization that the very machine-learning algorithms intended to avoid prejudice are only as unbiased as the humans and historical data programming them.<\/span><\/p>\n<p><i><span style=\"font-weight: 400;\">Coded Bias<\/span><\/i><span style=\"font-weight: 400;\"> documents the dramatic journey that follows, from discovery to exposure to activism, as Buolamwini goes public with her findings and undertakes an effort to create a movement toward accountability and transparency, including testifying before Congress to push for the first-ever legislation governing facial recognition in the United States and starting the Algorithmic Justice League. The film also includes data journalist Meredith Broussard; Silkie Carlo, director of Big Brother Watch, who is monitoring the trial use of facial recognition technology by U.K. police; Virginia Eubanks, author of <\/span><i><span style=\"font-weight: 400;\">Automating Inequality<\/span><\/i><span style=\"font-weight: 400;\">; Ravi Naik, human rights lawyer and media commentator; Dr. Safiya Umoja Noble, author and expert on algorithmic discrimination and technology bias; and Zeynep Tufekci, author of <\/span><i><span style=\"font-weight: 400;\">Twitter and Tear Gas<\/span><\/i><span style=\"font-weight: 400;\">.<\/span><\/p>\n","protected":false},"excerpt":{"rendered":"<p>In an increasingly data-driven, automated world, the question of how to protect individuals\u2019 civil liberties in the face of artificial intelligence looms larger by the day. Coded Bias follows M.I.T. 
Media Lab computer scientist Joy Buolamwini, along with data scientists, mathematicians, and watchdog groups from all over the world, as they fight to expose the [&hellip;]<\/p>\n","protected":false},"author":13,"featured_media":21371,"comment_status":"open","ping_status":"closed","template":"","meta":{"_acf_changed":false,"footnotes":""},"topic":[1260,1239,1264,1983,2125],"class_list":["post-20704","films","type-films","status-publish","has-post-thumbnail","hentry","topic-civil-rights-2","topic-identity","topic-race-ethnicity","topic-science","topic-technology"],"acf":[],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v27.2 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Coded Bias | A.I. Bias &amp; Facial Recognition Discrimination | Documentary | PBS<\/title>\n<meta name=\"description\" content=\"Explores racial prejudices that exist in facial recognition algorithms, and the threat artificial intelligence poses to civil liberties\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/dipsy.pbs.org\/independentlens\/documentaries\/coded-bias\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Coded Bias | Films | PBS\" \/>\n<meta property=\"og:description\" content=\"Coded Bias exposes prejudices and threats to civil liberty in facial recognition algorithms and artificial intelligence.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/dipsy.pbs.org\/independentlens\/documentaries\/coded-bias\/\" \/>\n<meta property=\"og:site_name\" content=\"Independent Lens\" \/>\n<meta property=\"article:modified_time\" content=\"2025-01-14T19:17:23+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/dipsy.pbs.org\/independentlens\/wp-content\/uploads\/2020\/08\/CODEDBIAS_PubStills_34-e1614705746804.jpg\" \/>\n\t<meta 
property=\"og:image:width\" content=\"1280\" \/>\n\t<meta property=\"og:image:height\" content=\"720\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/jpeg\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:title\" content=\"Coded Bias | Films | PBS\" \/>\n<meta name=\"twitter:description\" content=\"Coded Bias exposes prejudices and threats to civil liberty in facial recognition algorithms and artificial intelligence.\" \/>\n<meta name=\"twitter:label1\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data1\" content=\"2 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"WebPage\",\"@id\":\"https:\/\/dipsy.pbs.org\/independentlens\/documentaries\/coded-bias\/\",\"url\":\"https:\/\/dipsy.pbs.org\/independentlens\/documentaries\/coded-bias\/\",\"name\":\"Coded Bias | A.I. Bias & Facial Recognition Discrimination | Documentary | PBS\",\"isPartOf\":{\"@id\":\"https:\/\/dipsy.pbs.org\/independentlens\/#website\"},\"primaryImageOfPage\":{\"@id\":\"https:\/\/dipsy.pbs.org\/independentlens\/documentaries\/coded-bias\/#primaryimage\"},\"image\":{\"@id\":\"https:\/\/dipsy.pbs.org\/independentlens\/documentaries\/coded-bias\/#primaryimage\"},\"thumbnailUrl\":\"https:\/\/dipsy.pbs.org\/independentlens\/wp-content\/uploads\/2020\/08\/CODEDBIAS_PubStills_34-e1614705746804.jpg\",\"datePublished\":\"2020-08-03T18:18:57+00:00\",\"dateModified\":\"2025-01-14T19:17:23+00:00\",\"description\":\"Explores racial prejudices that exist in facial recognition algorithms, and the threat artificial intelligence poses to civil 
liberties\",\"breadcrumb\":{\"@id\":\"https:\/\/dipsy.pbs.org\/independentlens\/documentaries\/coded-bias\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/dipsy.pbs.org\/independentlens\/documentaries\/coded-bias\/\"]}]},{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/dipsy.pbs.org\/independentlens\/documentaries\/coded-bias\/#primaryimage\",\"url\":\"https:\/\/dipsy.pbs.org\/independentlens\/wp-content\/uploads\/2020\/08\/CODEDBIAS_PubStills_34-e1614705746804.jpg\",\"contentUrl\":\"https:\/\/dipsy.pbs.org\/independentlens\/wp-content\/uploads\/2020\/08\/CODEDBIAS_PubStills_34-e1614705746804.jpg\",\"width\":1280,\"height\":720,\"caption\":\"A woman with dark skin tone, short hair, and round, black-rimmed glasses works on a Mac laptop computer. On the laptop screen is her reflection as well as an image of another woman with dark skin tone.\"},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/dipsy.pbs.org\/independentlens\/documentaries\/coded-bias\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/dipsy.pbs.org\/independentlens\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Coded Bias\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/dipsy.pbs.org\/independentlens\/#website\",\"url\":\"https:\/\/dipsy.pbs.org\/independentlens\/\",\"name\":\"Independent Lens\",\"description\":\"Independent Documentary Films\",\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/dipsy.pbs.org\/independentlens\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Coded Bias | A.I. 
Bias & Facial Recognition Discrimination | Documentary | PBS","description":"Explores racial prejudices that exist in facial recognition algorithms, and the threat artificial intelligence poses to civil liberties","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/dipsy.pbs.org\/independentlens\/documentaries\/coded-bias\/","og_locale":"en_US","og_type":"article","og_title":"Coded Bias | Films | PBS","og_description":"Coded Bias exposes prejudices and threats to civil liberty in facial recognition algorithms and artificial intelligence.","og_url":"https:\/\/dipsy.pbs.org\/independentlens\/documentaries\/coded-bias\/","og_site_name":"Independent Lens","article_modified_time":"2025-01-14T19:17:23+00:00","og_image":[{"width":1280,"height":720,"url":"https:\/\/dipsy.pbs.org\/independentlens\/wp-content\/uploads\/2020\/08\/CODEDBIAS_PubStills_34-e1614705746804.jpg","type":"image\/jpeg"}],"twitter_card":"summary_large_image","twitter_title":"Coded Bias | Films | PBS","twitter_description":"Coded Bias exposes prejudices and threats to civil liberty in facial recognition algorithms and artificial intelligence.","twitter_misc":{"Est. reading time":"2 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"WebPage","@id":"https:\/\/dipsy.pbs.org\/independentlens\/documentaries\/coded-bias\/","url":"https:\/\/dipsy.pbs.org\/independentlens\/documentaries\/coded-bias\/","name":"Coded Bias | A.I. 
Bias & Facial Recognition Discrimination | Documentary | PBS","isPartOf":{"@id":"https:\/\/dipsy.pbs.org\/independentlens\/#website"},"primaryImageOfPage":{"@id":"https:\/\/dipsy.pbs.org\/independentlens\/documentaries\/coded-bias\/#primaryimage"},"image":{"@id":"https:\/\/dipsy.pbs.org\/independentlens\/documentaries\/coded-bias\/#primaryimage"},"thumbnailUrl":"https:\/\/dipsy.pbs.org\/independentlens\/wp-content\/uploads\/2020\/08\/CODEDBIAS_PubStills_34-e1614705746804.jpg","datePublished":"2020-08-03T18:18:57+00:00","dateModified":"2025-01-14T19:17:23+00:00","description":"Explores racial prejudices that exist in facial recognition algorithms, and the threat artificial intelligence poses to civil liberties","breadcrumb":{"@id":"https:\/\/dipsy.pbs.org\/independentlens\/documentaries\/coded-bias\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/dipsy.pbs.org\/independentlens\/documentaries\/coded-bias\/"]}]},{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/dipsy.pbs.org\/independentlens\/documentaries\/coded-bias\/#primaryimage","url":"https:\/\/dipsy.pbs.org\/independentlens\/wp-content\/uploads\/2020\/08\/CODEDBIAS_PubStills_34-e1614705746804.jpg","contentUrl":"https:\/\/dipsy.pbs.org\/independentlens\/wp-content\/uploads\/2020\/08\/CODEDBIAS_PubStills_34-e1614705746804.jpg","width":1280,"height":720,"caption":"A woman with dark skin tone, short hair, and round, black-rimmed glasses works on a Mac laptop computer. 
On the laptop screen is her reflection as well as an image of another woman with dark skin tone."},{"@type":"BreadcrumbList","@id":"https:\/\/dipsy.pbs.org\/independentlens\/documentaries\/coded-bias\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/dipsy.pbs.org\/independentlens\/"},{"@type":"ListItem","position":2,"name":"Coded Bias"}]},{"@type":"WebSite","@id":"https:\/\/dipsy.pbs.org\/independentlens\/#website","url":"https:\/\/dipsy.pbs.org\/independentlens\/","name":"Independent Lens","description":"Independent Documentary Films","potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/dipsy.pbs.org\/independentlens\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"}]}},"_links":{"self":[{"href":"https:\/\/dipsy.pbs.org\/independentlens\/wp-json\/wp\/v2\/films\/20704","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/dipsy.pbs.org\/independentlens\/wp-json\/wp\/v2\/films"}],"about":[{"href":"https:\/\/dipsy.pbs.org\/independentlens\/wp-json\/wp\/v2\/types\/films"}],"author":[{"embeddable":true,"href":"https:\/\/dipsy.pbs.org\/independentlens\/wp-json\/wp\/v2\/users\/13"}],"replies":[{"embeddable":true,"href":"https:\/\/dipsy.pbs.org\/independentlens\/wp-json\/wp\/v2\/comments?post=20704"}],"version-history":[{"count":7,"href":"https:\/\/dipsy.pbs.org\/independentlens\/wp-json\/wp\/v2\/films\/20704\/revisions"}],"predecessor-version":[{"id":30635,"href":"https:\/\/dipsy.pbs.org\/independentlens\/wp-json\/wp\/v2\/films\/20704\/revisions\/30635"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/dipsy.pbs.org\/independentlens\/wp-json\/wp\/v2\/media\/21371"}],"wp:attachment":[{"href":"https:\/\/dipsy.pbs.org\/independentlens\/wp-json\/wp\/v2\/media?parent=20704"}],"wp:term":[{"taxonomy":"topic","embeddable":true,"href":"https:\/\/dipsy.pbs.org\/i
ndependentlens\/wp-json\/wp\/v2\/topic?post=20704"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}