Indie Film Unpacks Bias in Algorithms
By C.J. Thompson
“Coded Bias” is a new independent documentary about the omnipresence of artificial intelligence — and the myriad ways the biases of its creators are baked into its performance.
The film centers on Massachusetts Institute of Technology doctoral candidate Joy Buolamwini, 31, whose key epiphany occurs when she uses computer-vision software for a project — and it fails to identify her dark-complexioned face.
It engages, however, when she dons a white face mask.
The “Coded Bias” poster features an image of that soulless, synthetic white mask with brown eyes staring from behind it — and the film’s implications are equally haunting.
Pop Culture, Social Media
Over nearly 90 minutes, director Shalini Kantayya argues that the public has been groomed to accept artificial intelligence and surveillance technology via years of sci-fi in pop culture, public cameras, smartphones — and through more recent social media platforms like Snapchat.
The downside of the tech behind these tools emerges when it’s used by law-enforcement, corporate or even political interests, a use that is expanding with virtually no regulation.
As “Coded Bias” — which premiered at this year’s Sundance Film Festival — illustrates, facial-recognition systems are most accurate on white faces because they are trained on data sets assembled with little regard for diversity.
“We teach machines to see by providing examples,” explains the Canadian-born Buolamwini, who also is a Rhodes scholar. “If I want a machine to see a face, I’m going to provide many examples of faces — and of things that aren’t faces.
“I started looking at the data sets … and what I discovered is many of (them) contained majority men and majority lighter-skinned individuals.
“That’s when I started looking into issues of bias that can creep into technology,” she says.
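Her point about skewed data sets can be made concrete with a toy tally. The sketch below uses invented, purely illustrative numbers (not figures from the film or her research) to show how a benchmark can be dominated by one demographic group:

```python
from collections import Counter

# Hypothetical composition of a face-image training set; the group labels
# and counts are invented for illustration only.
dataset = (["lighter_male"] * 600 + ["lighter_female"] * 210 +
           ["darker_male"] * 120 + ["darker_female"] * 70)

counts = Counter(dataset)
total = len(dataset)
for group, n in counts.most_common():
    print(f"{group:>15}: {n:4d} images ({n / total:5.1%})")
```

A model trained on such a set sees six times as many examples of the largest group as of the smallest, which is the kind of imbalance Buolamwini describes finding when she audited real data sets.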
The film follows Buolamwini, trailing her through different locales, from MIT’s campus to Brooklyn to Washington, as she shares her discoveries about how AI infringes on people’s rights.
One thing “Coded” does well is show regular people worldwide being affected by the technology.
It’s sobering to see a white man in the U.K. navigate being confronted by police for trying to avoid facial-recognition cameras on the street, contrasted with a visibly bewildered and shaken Black male teen who’s stopped and detained by plainclothes cops who’ve falsely identified him in the same sting.
What’s heartening? Seeing the many efforts in different countries to thwart AI being used in this way.
It is, however, mind-boggling to learn that Amazon and other companies planned to make facial-recognition tech available to police departments despite its significant deficiencies in identifying minority and female faces.
The implications are deeply troubling, given the law-enforcement inequities people of color disproportionately face.
When Buolamwini’s research findings gained traction in the media in June, the negative spotlight was so bright that Amazon, Microsoft and IBM voluntarily halted selling the tech for a year.
Buolamwini’s clout skyrocketed.
But while it’s satisfying to see her and fellow researchers chuckling at Amazon’s initially weak rebuttal, she acknowledges a nagging, underlying fear: “If you do work that challenges (these big companies) and makes them look bad, you might not have opportunities in the future.”
‘We Need Laws’
It’s ironic that AI is widely perceived as more impartial because it relies on math-based algorithms, yet evidence is growing that the technology excludes women and people of color in employment, insurance, credit and health care.
“The progress that was made in the civil rights era could be rolled back under the guise of machine neutrality,” says Buolamwini, who since has formed The Algorithmic Justice League advocacy group. “We need laws.”
“Coded” consults a powerful roster of experts, but it rightfully leans on the unassuming star power of Buolamwini, who delivers truisms about AI’s harms in a disarmingly cheerful tone — whether she’s talking to residents in a Brooklyn apartment complex or answering questions from Rep. Alexandria Ocasio-Cortez, D-N.Y., during a congressional hearing in May.
Though the film spends a lot of time with Buolamwini — she even recites her tech-flavored poetry while getting her hair done in a salon — there’s only surface attention paid to her upbringing or any background showing how she developed into the person leading this charge.
But all in all, “Coded Bias” is a worthwhile watch. It poses some heavy-hitting questions from refreshing sources.
The film has a gravity not unlike a climate-change documentary: You’ll feel angry at the sheer scope and implied intentionality of these problems, but you’ll feel a measured amount of hope that the light Buolamwini and others are shining on them is the best disinfectant.
“Coded Bias” is now playing in the U.S. at www.codedbias.com.
C.J. Thompson is a New York writer.