DLD Video
The Coded Gaze: Bias in AI
In her DLD Munich 2020 talk, Joy Buolamwini, founder of the Algorithmic Justice League, paints an alarming picture of the racial, gender and class bias present in current AI systems. “Machines are not flawless, they reflect something,” she says. They “are the reflection of those that have the power of technology.”
When Buolamwini investigated the accuracy of facial analysis systems from technology giants such as Microsoft, IBM and Amazon, she discovered that the “typical trend works best on lighter skin men and works worst on darker skin women.”
The consequences of bias in AI technology can be detrimental, exacerbating issues such as racial profiling, Buolamwini points out. But accuracy should not be the only concern, she argues. The question for society should also be, “What kind of systems do we want?”
The goal of the Algorithmic Justice League is not just to identify shortcomings but to actively shape the use of AI in society, Buolamwini says. “We don’t want to name and shame. We want to name and change.”
Joy Buolamwini
Algorithmic Justice League
Joy Buolamwini founded the Algorithmic Justice League to create a world with more ethical and inclusive technology. Her work was the subject of the Netflix documentary Coded Bias, and her research has been covered in over 40 countries. Buolamwini serves on the Global Tech Panel, convened to advise world leaders and technology executives on ways to reduce the harms of AI.