
Does TikTok push graphic content onto its users?

TikTok has steadily risen in popularity over the last few years, and many people with an account have heard the claims or seen the videos of content creators complaining that their videos were taken down for bogus reasons. Even if you don’t have a TikTok account, major publications, such as NPR, have written stories about it.

So I wanted to see if the claims were true. Does TikTok really push graphic and hateful content to the forefront? I made a new account and began scrolling, keeping track of the types of videos that came across my For You Page. I didn’t interact with any of the videos. I didn’t like, comment or search for videos. I simply watched and scrolled, allowing TikTok to show me what it wanted me to see.

I ended up scrolling for five hours. 

The first hour I saw several dancing videos, but there were still videos that gave me pause. A video featuring a man running over a child. Videos that threatened violence. A video with an elderly couple threatening to kill one another with the caption “relationship goals.” Another with a man singing a song about how men don’t want women as friends, they want women to suck... Well, I’ll let you finish that sentence.

In the second hour there was a video featuring a KKK member breaking into a house. A video of a man walking in on a nude couple (and let me answer your unspoken question: you could see everything). A few videos of white men using racist terminology. A video of a group of monkeys beating a raccoon. 

After two hours, the videos started getting quite misogynistic. Several videos of men giving “advice.” Some of the advice was about the best way to cheat on your partner and not get caught; the best places to take a date to see if they would be any good at certain sexual acts; videos comparing women to cars and why men don’t purchase “used cars.” There were a few videos of a boy – who says in his videos that he is 12 – going around trying to convince adult women to have sex with him. When they refuse or comment on his age, he berates them and spews profanities.

By the fifth, and final, hour there were two stand-out videos. The first shows a woman, clearly intoxicated at a baseball game, dancing. She stumbles into a man, who then proceeds to knock her to the ground. In the video she does not get back up or regain consciousness. The video then pivots to the reaction of another man, who is laughing and clapping. Before the video cuts off, he says “equal rights, equal fights.”

The other video showed two men trading insults back and forth. One seems to be winning, so the other threatens to rape him if he doesn’t stop talking.

After five hours of scrolling, I felt gross and dirty – like I needed a shower. 

And I didn’t even know why, because all of the videos I mentioned were framed as jokes. Several of them had the hashtag “darkhumor.” And I only scrolled for five hours; according to Xiaolei Huang, a computer science professor at the University of Memphis, the average American student spends around seven hours a day on the app.

But what does this say about TikTok? Does it push graphic content onto its users?

Well, TikTok runs on an algorithm. In simple terms, an algorithm is a set of instructions a computer follows to solve a problem. In TikTok’s case, the algorithm’s goal is to tailor your For You Page to your wants and interests. Whenever you like, comment, post or search for something, you are instructing the algorithm on what to feed back to you. For example, if you like a dancing video, TikTok will start showing you more dancing videos.
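To make that feedback loop concrete, here is a minimal, purely illustrative sketch in Python. The `Video` structure, the tag names and the scoring rule are my own assumptions for the sake of demonstration; TikTok's real system is far more complex and has not been made public.

```python
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    tags: list          # e.g. ["dancing"] or ["darkhumor"]
    popularity: float   # overall engagement across all users

class ToyForYouPage:
    """Toy recommender: each interaction raises the weight of a video's tags."""

    def __init__(self, catalog):
        self.catalog = catalog
        self.interest = defaultdict(float)  # tag -> learned weight for this user

    def record_interaction(self, video, strength=1.0):
        # A like, comment, share or search nudges every tag on that video upward.
        for tag in video.tags:
            self.interest[tag] += strength

    def score(self, video):
        # Personal interest in the video's tags, plus a little general popularity.
        return sum(self.interest[t] for t in video.tags) + 0.1 * video.popularity

    def next_videos(self, n=3):
        return sorted(self.catalog, key=self.score, reverse=True)[:n]

catalog = [
    Video("cat dance", ["dancing"], popularity=90),
    Video("edgy prank", ["darkhumor"], popularity=95),
    Video("cooking tip", ["food"], popularity=60),
]

feed = ToyForYouPage(catalog)
feed.record_interaction(catalog[0])           # "like" one dancing video...
print([v.title for v in feed.next_videos()])  # ...and dancing now ranks first
```

The point of the sketch is only the shape of the loop: every interaction tilts the weights, and the tilted weights decide what you see next.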

But what about my case? I didn’t interact with anything. So what did the algorithm feed me?

“If there is no background or historical knowledge of the user’s interest, they will do something like random recommendations – they’ll recommend the top videos across their app,” Huang said. 
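Huang's point about a brand-new account maps onto a simple cold-start fallback. Continuing the hypothetical sketch above (again, my own assumption for illustration, not TikTok's actual code): when there is no interest data at all, the only signal left is overall popularity, so the app-wide top videos are what get served.

```python
def next_videos_with_cold_start(feed, n=3):
    # Brand-new account: no likes, comments or searches recorded yet,
    # so fall back to the most popular videos across the whole catalog.
    if not feed.interest:
        return sorted(feed.catalog, key=lambda v: v.popularity, reverse=True)[:n]
    return feed.next_videos(n)

fresh = ToyForYouPage(catalog)  # no interactions at all
print([v.title for v in next_videos_with_cold_start(fresh)])
# -> the top videos app-wide, roughly what a fresh account gets shown
```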

However, algorithms can reflect human biases, since they are programmed by people. According to Huang, being able to program without inserting the programmer’s biases is something the computer science community is working on correcting.

“It’s never the machine’s fault,” Huang said. 

So while bias is entirely possible, it’s not what was at play here. In my case, TikTok showed me the most popular videos on the app. So it would seem the majority of users enjoy “dark humor.”

According to Dr. Roger J. Kreuz, associate dean for the College of Arts and Sciences at the University of Memphis and a researcher in linguistics and psychology, there is a correlation between seeing darker content and further interaction with it. Basically, the more questionable stuff you see and interact with online, the more likely you are to share darker content. And vice versa, of course.

But is seeing genuinely disturbing content similar to seeing dark humor?

“Well, anytime you see something dark, or gross, whether it’s done in humor or not, it’s going to stick with you,” Kreuz said. 

Several U of M students have also reported seeing disturbing content online. 

“Some of the people I follow are not very political, but very vocal on what’s going on in Ukraine,” said journalism major Maddison Philip. “So sometimes that’s hard to see, but at the same time you kinda need to see it.” 

Carmen Darden, another journalism major, said she will sometimes see darker content and wonder why it showed up on her For You Page, since it is not the kind of content she regularly sees.

“Sometimes, one or two things will sneak in, and it will be because someone I follow would have liked it,” said Darden. 

Social media can have negative effects on your mental health, but when used correctly, it can also have positive effects. It’s all about managing what you see, and knowing to take a step back when you do see dark content.

“You can be influenced by the content you see on social media,” Kreuz said.

