On TikTok, all is not as it seems. Engage with dance videos and you’ll start seeing more people doing the Renegade. If you linger on a TikTok dog, it will give you puppies galore.
But TikTok’s algorithmic obsession with giving you more content that it thinks you will like is having an unintended consequence: it’s started recommending people new accounts to follow based on the physical appearance of the people they already follow.
This week Marc Faddoul, an AI researcher at the UC Berkeley School of Information, found that TikTok was recommending him accounts whose profile pictures matched the race, age or facial characteristics of the people he already followed.
He created a fresh account to test his theory and followed people he found on his ‘For You’ page. Following the account of a black woman led to recommendations for three more black women. It gets weirdly specific – Faddoul found that hitting follow on an Asian man with dyed hair gave him more Asian men with dyed hair, and the same thing happened for men with visible disabilities.
TikTok denies that it uses profile pictures as part of its algorithm, and says it hasn’t been able to replicate the same results in its own tests. But the app uses collaborative filtering – recommending accounts based on the behaviour of users with similar tastes. And this has the potential to introduce unconscious bias into the algorithm.
“The platform is very appearance driven, and therefore collaborative filtering can lead to very appearance specific results even if the profile picture is not used by the system,” says Faddoul. TikTok’s algorithm will think it is creating a personalised experience for you, but actually it is just building a filter bubble – an echo chamber where you only see the same kind of people with little awareness of others.
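To see how this can happen without any image data, here is a minimal sketch of item-based collaborative filtering over a follow graph. The account names and follow data are hypothetical, and TikTok has not published its actual recommender – this only illustrates the mechanism Faddoul describes: no profile picture is ever inspected, yet accounts that share an audience end up recommended together.

```python
# Minimal sketch of co-follow recommendation (hypothetical data, not
# TikTok's real system). Accounts whose follower sets overlap with the
# account you just followed rank highest -- no image is ever examined.
from collections import defaultdict

# Hypothetical follow data: user -> set of accounts they follow.
follows = {
    "user1": {"creator_a", "creator_b", "creator_c"},
    "user2": {"creator_a", "creator_b"},
    "user3": {"creator_a", "creator_c", "creator_d"},
    "user4": {"creator_d", "creator_e"},
}

def jaccard(a, b):
    """Overlap between two follower sets (0 = disjoint, 1 = identical)."""
    return len(a & b) / len(a | b)

# Invert the graph: account -> set of followers.
followers = defaultdict(set)
for user, accounts in follows.items():
    for account in accounts:
        followers[account].add(user)

def recommend(just_followed, k=3):
    """Rank other accounts by how much their audience overlaps with the
    audience of the account the user just followed."""
    target = followers[just_followed]
    scores = {
        account: jaccard(target, fans)
        for account, fans in followers.items()
        if account != just_followed
    }
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("creator_a"))  # accounts co-followed with creator_a rank first
```

If the followers of one creator disproportionately follow creators who look like them, those look-alike creators dominate the top of the ranking – reproducing the appearance-specific clustering Faddoul observed, with no image data involved.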
This isn’t the first time TikTok’s algorithm has been accused of racial bias. In October 2019 TikTok users of colour called for better representation on the For You page, where users go for recommendations and new tailored content. In January 2019, Whitney Phillips, a professor of communication and online media rhetoric at Syracuse University, told Motherboard that the way TikTok works could lead users to replicate the community with which they identify.
To test the findings we created a new account, went on the ‘For You’ page, swiped left to view a profile and followed it to see who was recommended. The first account that came up was that of KSI, an internet personality and rapper with 1.2 million followers on TikTok. We followed KSI, and the next three recommended accounts were one with a profile picture of a ghostly-looking man sitting too far from the camera to even guess his race, a blurry picture of what looks like a teenager at a festival, and a very close-up picture of a white man’s face. All three are verified, but none look particularly similar.
After that, results started to appear that were similar to what Faddoul found. An account a pet owner had set up for their dog produced recommendations for other dog accounts. Following a young black man led to recommendations for two other black men and one cartoon of a black man. Following an 87-year-old man led to recommendations for three more older men. Following a white woman with brown hair led to three more white women with brown hair. Finally, following an account with the Union Jack as its profile picture threw up three more Union Jacks, one with a trollface superimposed on top.
If you like one elderly man on TikTok, the app assumes that you will enjoy watching others. But this goes a step too far when racial bias is factored in. “People from underrepresented minorities who don’t necessarily have a lot of famous people who look like them, it’s going to be harder for them to get recommendations,” says Faddoul.
On social media we follow people whose opinions we agree with. Algorithms then throw up more of the same content, creating a separation where you don’t see opinions that differ from yours, allowing you to forget that they exist. On a highly visual platform such as TikTok, this applies to how a person looks. Faddoul’s findings may not indicate how TikTok intends its algorithm to work, but they show how user biases may have resulted in these very specific filter bubbles.
TikTok has economised people’s attention, using the mountains of data it holds on how long people spend watching videos and how they interact with them to hyper-personalise each user’s feed, says Jevan Hutson, a human-computer interaction researcher at the University of Washington School of Law. The data from users across the globe feeds into an algorithm that ends up encouraging a segregation of content.
This extraction of data can create patterns that solidify assumptions about particular races or ethnicities, says Hutson. He compares it to dating apps, where the unconscious bias of thousands of users creates an assumption within the algorithm about racial preferences – Coffee Meets Bagel, for example, drew controversy when users kept being recommended people of the same race as themselves even though they had never indicated a racial preference. On TikTok, when the app recommends more of the same content and users, it risks leading to radicalisation and segregation.
“I think there’s no ethical consumption under surveillance,” says Hutson. He is an avid TikTok user, and believes that people trade away the data the app collects about them in exchange for content they enjoy more.
The data TikTok gets from its millions of users feeds a cycle that those users get trapped in: even if you make an effort to diversify your feed, everyone else’s biases mean the algorithm will keep trying to channel you into a bubble.
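That reinforcing cycle is easy to reproduce in a toy simulation – again with entirely hypothetical data and a simple co-follow recommender of the kind sketched earlier, not TikTok’s actual system. Start with two separate clusters of creators, let every user repeatedly follow the account most co-followed with their existing follows, and nobody’s recommendations ever escape their starting cluster.

```python
# Toy feedback-loop simulation (hypothetical data). Two clusters of
# creators; each round, every user follows their top recommendation.
# Because recommendations come only from co-follow overlap, the two
# audiences never mix.
import random
from collections import defaultdict

random.seed(0)
clusters = {"a": [f"a{i}" for i in range(10)], "b": [f"b{i}" for i in range(10)]}

# Each user starts by following two creators from a single cluster.
follows = {f"user{u}": set(random.sample(clusters["a" if u < 50 else "b"], 2))
           for u in range(100)}

def top_recommendation(user):
    """The creator most often followed by users who share follows with us."""
    scores = defaultdict(int)
    for other, their_follows in follows.items():
        if other != user and follows[user] & their_follows:
            for creator in their_follows - follows[user]:
                scores[creator] += 1
    return max(scores, key=scores.get) if scores else None

for _ in range(5):  # five rounds of "follow whatever is recommended"
    for user in follows:
        rec = top_recommendation(user)
        if rec:
            follows[user].add(rec)

# Count users whose follows all share the first letter of their cluster.
in_bubble = sum(
    all(creator[0] == next(iter(f))[0] for creator in f) for f in follows.values()
)
print(f"{in_bubble}/100 users never left their starting cluster")
```

In this toy setup the answer is always 100 out of 100: because the recommender scores by aggregate co-follow counts across all users, one individual adding an out-of-cluster follow barely dents the ranking, which is the dynamic the article describes.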
Maria Mellor is a writer for WIRED. She tweets from @Maria_mellor