Are computers racist? No, but people still are.

Feb 13, 2018

Facial recognition software has made huge advances in accuracy, but it still has a long way to go, particularly when it comes to recognizing people of color. Commercially available software can identify a person's gender from a photograph. According to researcher Joy Buolamwini of the MIT Media Lab, that software is correct 99 percent of the time when it's looking at a white male, but it is less than half as accurate when looking at a darker-skinned female. Marketplace Tech host Molly Wood spoke with Buolamwini about her research and the human biases that creep into machine learning.
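To make the kind of gap Buolamwini describes concrete, here is a minimal sketch of how an accuracy audit by demographic subgroup can be tallied. The data and subgroup labels below are hypothetical placeholders, not from her study, which used its own benchmark of labeled photographs.

```python
# Illustrative sketch: measuring a gender classifier's accuracy per subgroup.
# The records below are made up for demonstration only.
from collections import defaultdict

# (subgroup, true_label, predicted_label) -- hypothetical audit records
records = [
    ("lighter-skinned male", "male", "male"),
    ("lighter-skinned male", "male", "male"),
    ("darker-skinned female", "female", "male"),
    ("darker-skinned female", "female", "female"),
]

correct = defaultdict(int)
total = defaultdict(int)
for subgroup, truth, prediction in records:
    total[subgroup] += 1
    correct[subgroup] += int(truth == prediction)

# Large accuracy gaps between subgroups are the signal of bias in the audit.
for subgroup in total:
    print(f"{subgroup}: {correct[subgroup] / total[subgroup]:.0%} accurate")
```

The point of such an audit is that an impressive overall accuracy number can hide a wide spread between subgroups, which is exactly the disparity the research highlights.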