For so long, we have looked to artificial intelligence technology as our key to the future, but what happens when that technology is informed by a problematic past? This question, along with many others, was just what filmmaker Shalini Kantayya set out to answer in her 2020 documentary “Coded Bias,” which was screened for all incoming first-year students this year.
On Sept. 15, as part of the Fall 2021 Common Academic Experience, first-year Clarkies were joined by Kantayya for a virtual discussion around the complex issues covered in her film and the implications of an increasingly technological world. Moderated in part by Dean of Faculty Esther Jones and Dean of the College Betsy Huang, the discussion was led by Jonathan Hoff ’22 and Dilasha Shrestha ’22, leaders of the student-led Center for Technology, Innovation, and Entrepreneurship.
A discussion that began with the accelerating pace of technological advancement quickly shifted into conversations about racism, policing, and the reach of business practices into everyday life. Kantayya explained that her film — which was inspired by MIT researcher Joy Buolamwini’s discovery of racial discrepancies in facial recognition software — evolved into an exposé of the ways in which artificial intelligence (AI) has come to serve as a sort of invisible gatekeeper of opportunity, particularly for people of color.
From insurance policies to prison sentences, algorithms increasingly determine human outcomes, Kantayya explained. While they might seem void of human error and bias, “these systems that we have been trusting so implicitly have not been vetted for racial or gender bias, not even vetted for a shared standard of accuracy.”
Referring to tech as the “wild, wild West,” she critiqued the lack of regulation in this area, emphasizing that new technologies are being integrated with little to no government oversight. Flawed facial recognition software has significant repercussions for the fair enforcement of immigration and criminal law, she said.
“In many senses, coded bias is misnamed,” Kantayya said. “This is not just a problem with bias. Bias is not where it ends. It is really about creating more humane systems, and asking if algorithms should be used at all, at least without education around them and diversity within them.”
Addressing the Clark first-year class directly, Kantayya emphasized the importance of both diversity and an interdisciplinary understanding of the world. “We need a major campaign to address the inclusion crisis in Silicon Valley,” she said, calling on computer science departments to mandate women’s and ethnic studies and encourage diversity. She hopes that a more intense effort to foster inclusion may serve as a first step toward addressing the disconnect between technology and the greater world, noting that “you can’t program for society if you don’t know anything about society.”
Jones echoed this sentiment, giving a nod to Clark’s Program of Liberal Studies. “You need these diverse, broad perspectives that enable you to engage in these ethical concerns, social concerns, and legal implications of whatever it is you’re studying,” Jones said.
Shrestha concluded the panel, thanking Kantayya and urging Clarkies not to be discouraged by these biases in tech. “I want us all to leave feeling encouraged to explore this field,” she said. “If you are coming from a minority background, the tech industry is most in need of your diverse representation. The world is moving very rapidly toward being dependent on these technologies, and we must all see ourselves represented in this future, or risk repeating our flawed past.”
As part of this year’s Orientation, first-year students also were invited to a virtual panel discussion in which Clark faculty members discussed the issues raised in “Coded Bias.” Participants included Provost Sebastián Royo and professors John Magee (computer science), Paul Cotnoir (Becker School of Design & Technology), Nadia Ward (Mosakowski Institute), Valerie Sperling (political science), and Arden Ali (philosophy). The event was moderated by Betsy Huang.