ACORN ANALYTICS

all things analytics

Documentary Review: Coded Bias

4/26/2021

By: Anna Cave

If you’re interested in data and social justice, this one’s for you.
Joy Buolamwini is an MIT Media Lab researcher who was using AI facial recognition for a mirror she was developing when she realized that the mirror couldn't pick up her face. Buolamwini is a Black woman, and the only way she could get the software to recognize her was to put on a solid white mask.

The facial recognition didn't work on her face because the training sets of photographs used to build the software were made up overwhelmingly of white men. Many of our preconceived ideas about AI come from science fiction: the notion that AI will become smarter than humans and ruin the species. The reality is that AI can only act on what we give it.

The phrase the documentary uses is “data is destiny”: AI learns from patterns, so if the data sets are skewed, the results will be too, which is exactly what happened to Buolamwini. Part of the problem is the assumption that algorithms are unbiased. We can't assume that when a machine develops and carries out an algorithm, it sheds the biases humans usually carry. In reality, algorithms just use historical information to make predictions about the future, and historical and even recent data sets are still rooted in cultural biases. What happens instead is that systemic biases get hardwired into the technologies we are quickly adopting.

One place this lack of accuracy shows up is when AI is used for surveillance. Facial recognition was used to identify protesters in Hong Kong, and it was deployed for police surveillance in London with severe inaccuracies. There is a clear social justice concern when technology this underdeveloped is already being put to use.

Even knowing that biases exist within different types of AI technology, hardly anyone understands how these algorithms work, especially those outside the group developing them. The documentary poses the question, “how do we get justice in a system that we don’t know how it is working?” Even though it will be difficult, algorithmic justice has become one of the most pressing civil rights concerns of recent years.

This documentary was a really interesting find and the perfect watch for anyone who is passionate about data and social justice.


