
Examining algorithmic injustice

Coded Bias Screening & Discussion

Coded Bias film synopsis: When MIT Media Lab researcher Joy Buolamwini discovers that most facial-recognition software misidentifies women and darker-skinned faces, she, as a woman of colour working in a field dominated by white males, is compelled to investigate further. Centering the voices of women leading the charge to ensure our civil rights are protected, Coded Bias asks two key questions: what is the impact of Artificial Intelligence's increasing role in governing our liberties? And what are the consequences for people stuck in the crosshairs due to their race, colour, and gender?

Panel Discussion: 
Examining algorithmic injustice
Dec 2, 3:00 – 4:30 pm PST (Panel)
Dec 2, 4:30 – 5:00 pm PST (Networking)

Panellists:

  • Craig Martell
    Head of Machine Learning, Lyft
    Part-Time Lecturer, Khoury College of Computer Sciences, Northeastern University, Seattle
  • Ricardo Baeza-Yates
    Director of Data Science Programs, Northeastern University, Silicon Valley
  • Alexandra To
    Assistant Professor, Northeastern University
Khoury College of Computer Sciences + College of Arts, Media and Design
  • Matt Kopec
    Associate Director, Ethics Institute, Northeastern University


The panel discussion will be moderated by Bethany Edmunds, Director of Computer Science and Teaching Professor at Northeastern University – Vancouver.

Watch the video.
