By: Anna Cave

If you're interested in data and social justice, this one's for you. Joy Buolamwini is an MIT Media Lab researcher who was building a mirror that used AI facial recognition when she realized the software couldn't pick up her face. Buolamwini is a Black woman, and the only time she could get the software to recognize her was when she put on a solid white mask.
The reason the facial recognition wasn't working on her face was that the training sets of photographs used to build the software consisted overwhelmingly of white men.

A lot of people's preconceived ideas about AI come from science fiction. There's this idea that AI will become smarter than humans and ruin the species, but the reality is that AI can only act on what we give it. The phrase the documentary uses is "data is destiny": AI learns from patterns in its training data, so if the data sets are skewed, the results will be skewed too, which is exactly what happened to Buolamwini. (A small sketch at the end of this post illustrates the idea.)

Part of the problem is the assumption that algorithms are unbiased. We can't assume that because a machine develops and carries out an algorithm, the result is free of the biases humans usually carry. In reality, algorithms are just using historical information to make predictions about the future, and both historical and even recent data sets are still rooted in cultural biases. What happens instead is that systemic biases get hardwired into the technologies we are quickly adopting.

One place this lack of accuracy shows up is surveillance. Facial recognition has been used to identify protesters in Hong Kong and for police surveillance in London, with severe inaccuracies. There is a clear social justice concern when technology this underdeveloped is already being put to use.

Even knowing that these biases exist across different types of AI, almost no one understands how the algorithms actually work, especially anyone outside the group developing them. The documentary poses the question, "how do we get justice in a system when we don't know how it is working?" It will be difficult, but algorithmic justice has become one of the most pressing civil rights concerns of our time.

This documentary was a really interesting find and the perfect watch for anyone who is passionate about data and social justice.
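For anyone who wants to see the "data is destiny" point in miniature, here is a small, self-contained sketch (Python with NumPy, using entirely synthetic numbers I made up for illustration; it is not code from the documentary or from Buolamwini's research). It "trains" a toy detector on a sample where one group makes up 95% of the data and then checks how often the detector recognizes members of each group.

```python
# Illustrative sketch only: synthetic data, NumPy only. Shows how a model
# trained on a skewed sample performs worse on the under-represented group.
import numpy as np

rng = np.random.default_rng(0)

def make_group(center, n):
    """Synthetic 'face feature' vectors clustered around a group-specific center."""
    return center + rng.normal(scale=1.0, size=(n, 2))

# Two groups with different feature distributions; the training set is skewed
# 95% group A vs 5% group B (mirroring datasets dominated by one demographic).
center_a, center_b = np.array([0.0, 0.0]), np.array([4.0, 4.0])
train_a, train_b = make_group(center_a, 950), make_group(center_b, 50)

# "Train" a toy detector: it accepts anything close to the mean of its training data.
model_center = np.vstack([train_a, train_b]).mean(axis=0)
threshold = 3.0  # accept if within this distance of the learned center

def detects(points):
    return np.linalg.norm(points - model_center, axis=1) < threshold

# Evaluate on balanced test sets: the detection rate differs sharply by group.
test_a, test_b = make_group(center_a, 1000), make_group(center_b, 1000)
print("detection rate, group A:", detects(test_a).mean())
print("detection rate, group B:", detects(test_b).mean())
```

With these made-up numbers, the script prints a detection rate near 1.0 for the over-represented group and near 0 for the under-represented one, even though nothing in the "algorithm" mentions either group by name; the skew in the training data alone produces the skewed outcome.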