PSA: Be conscious of biased algorithms!
- S

- Jan 29, 2023
- 3 min read
- Updated: Mar 18, 2023

I recently read "A Sea of Data: Apophenia and Pattern (Mis-)Recognition", a chapter in the book Duty Free Art by Hito Steyerl. She discusses the growing issue that data analysts face in a world where there is an overwhelming amount of data to go through - how to differentiate between signal and noise. She argues, “Vision loses importance and is replaced by filtering, decrypting, and pattern recognition”.
I thought that her idea applied to us as media consumers and social media users on an everyday scale in a similar way. We’re being hit with so much information at all times that our brains sometimes go on autopilot, constantly deciding what’s worth paying attention to and what has to go. On social media, we end up curating a stream of content that perfectly caters to our interests and shows us what we believe is important - that’s us separating signal from noise.
Who does social media actually care about?
Steyerl references a mythical Ancient Greek story in which “affluent male locals” produced actual speech, while “women, children, slaves, and foreigners” were just noise - annoying and irrelevant. I thought of our last Digital Humanities class, where we talked about how social media platforms function and whose interests they prioritize. While we do have some control over what we want to see on our feeds, we concluded that biased algorithms maintain feeds that are dominated by people who are white, able-bodied, conventionally attractive, etc. Some of my classmates noted how marginalized communities, like LGBTQ+ folks, are being censored and shadow-banned on social media. It sounds an awful lot like a modern-day equivalent of that Ancient Greek story.

TikTok users speak out about their posts being wrongly shadowbanned and flagged for violating Community Guidelines.
Not to burst your bubble…
“Dirty data” is a term that Steyerl uses, which can mean inaccurate and inconsistent data, but which should also be understood as “real data” that “documents the struggle of real people with a bureaucracy that exploits the uneven distribution and implementation of digital technology”. What she means is that entire groups of people are ignored - “not taken into account” - because these digital and social structures simply do not work in their favor. We’re at fault here, too - the downside of having that control over curating our feeds is that we tend to trap ourselves in a bubble. We make connections that reinforce our existing worldviews and cry “dirty data!” at anything that doesn’t fit the mold we’re comfortable with. Organizations like Logic are recognizing this issue and making plans to use their platform to amplify the voices of typically silenced groups, like trans and Indigenous writers. Action like this is important for broadening our perspectives and making room for content outside of the bubbles that we and our biased algorithms have worked together to create.
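As a thought experiment (this is just my own toy sketch, not any real platform’s ranking system), here’s how fast that bubble can form when a feed simply ranks content by what we’ve engaged with before:

import random

# A toy filter-bubble loop (illustrative only - not how any actual
# platform works): the feed ranks topics by our past engagement,
# we mostly click what’s on top, and the bubble tightens.
random.seed(1)
topics = ["art", "politics", "sports", "science", "music"]
engagement = {t: 1.0 for t in topics}  # start with no preference

for day in range(50):
    # Rank topics by past engagement; the "feed" shows the top one.
    feed = sorted(topics, key=lambda t: engagement[t], reverse=True)
    shown = feed[0]
    # We usually click what we’re shown, occasionally something else.
    clicked = shown if random.random() < 0.9 else random.choice(topics)
    engagement[clicked] += 1.0

share = {t: round(engagement[t] / sum(engagement.values()), 2) for t in topics}
print(share)  # one topic ends up dominating the feed

Within a few simulated weeks, one topic crowds out everything else - and nobody programmed “censorship” anywhere. The feedback loop did it on its own.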
Data vs. reality
What are the dangers of relying too heavily on these algorithms to analyze data? To what extent do the patterns they find correspond to actual reality? In the chapter, Steyerl talks about automated apophenia: computers perceiving connections in data where there aren’t any. She prompts us to consider the real-life consequences of making decisions based on these phantom patterns. In recent years, tenant screening technology used in selling and renting homes has been threatening housing equality due to its programmed bias. This article from Curbed explains how the problem boils down to tech experts designing these systems “in a vacuum”, without any knowledge of civil rights or social implications. The NSA’s SKYNET program that Steyerl references in her chapter is the same type of issue on a larger, deadlier scale.
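To see how easily phantom patterns appear, here’s one more toy sketch (again my own illustration with made-up numbers, not SKYNET’s or any screening company’s actual method): scan enough random, meaningless features and some will look like strong signals purely by chance.

import numpy as np

# Automated apophenia in miniature: everything below is synthetic
# noise, yet the search still "finds" patterns.
rng = np.random.default_rng(0)
n_people, n_features = 100, 2000
features = rng.normal(size=(n_people, n_features))  # noise posing as "behavioral data"
outcome = rng.normal(size=n_people)                 # noise posing as a "risk score"

# Correlate every feature with the outcome.
r = np.array([np.corrcoef(features[:, j], outcome)[0, 1] for j in range(n_features)])

# A handful of features will show |r| > 0.3 despite there being no signal at all.
phantom = np.flatnonzero(np.abs(r) > 0.3)
print(f"{len(phantom)} 'strong patterns' found in pure noise")

An algorithm that acted on those “patterns” - flagging a tenant or a traveler - would be making confident decisions about real people based on literally nothing.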
I won’t deny how important technology is for our everyday functioning, but it isn’t perfect or limitless by any means. Supposedly objective and fact-based algorithms often become reflections of our own human biases. But if we’re the ones who wrote them, then we can be the ones to recognize their flaws and work towards fixing them!


