It seems that TikTok’s all-powerful, all-knowing algorithms have decided that what I want to see is some of the most depressing and disturbing content the platform has to offer. Despite TikTok’s claim that its mission is to “bring joy,” I feel no joy at all.
What I’m getting is a glimpse of how proactive TikTok is when it comes to deciding what content its users want to see and pushing it on them. That dynamic is potentially harmful on any platform for users who are fed triggering material, extremist content, or misinformation. But TikTok may make it even worse: the platform’s algorithm-centric design pulls users into its content in a way its rivals never do, and those users tend to be younger and to spend more time on TikTok than anywhere else.
To give you an idea of what I’m talking about: my For You page, TikTok’s gateway to a stream of videos personalized to what its algorithm thinks you like, is filled with people’s stories about the worst things that have ever happened to them. Sometimes they speak to the camera themselves; sometimes they dance while text overlays tell their stories; sometimes they post photos or videos of loved ones who are injured and hospitalized. Sometimes it’s Ring camera footage of a person accidentally knocked down by their own dog. Parent deaths, child deaths, pet deaths, domestic violence, sexual assault, suicide, murder, electrocution, illness, overdoses: if it’s bad and someone has a personal story to tell about it, it’s probably in my For You feed.
The videos often have that distinctive TikTok style, which adds a surreal layer to the whole thing and frequently incorporates the latest musical memes. One video is edited so that Bailey Zimmerman sings “I lost it” at the exact moment a woman reacts to learning that her mother has died, her tears streaming down cheeks rendered perfect and radiant by a beauty filter. With TikTok’s text-to-speech feature at creators’ disposal, a comically robotic female voice may be narrating the action. “Algospeak,” the coded language used to get around TikTok’s moderation of certain topics or keywords, is how a boyfriend “unalived” himself or a father “$eggsually a[B emoji]used” his daughter.
Oh, and you see a lot of ads for mental health services, too.
TikTok is designed to pull you in and keep you there, starting with the For You page. The app opens to it automatically, and videos autoplay. There is no way to make the app open to a feed of accounts you follow, or to disable autoplay. You have to opt out of seeing what TikTok wants to show you.
“Algorithms exploit the human psychological vulnerability of curiosity,” Emily Dreyfuss, a journalist at Harvard Kennedy School’s Shorenstein Center and co-author of the book Meme Wars, told me.
Watch time is believed to be a major factor in what TikTok decides to show you more of: watch a video for long enough, and the algorithm considers it interesting enough to feed you more like it. That makes it a measure not of what you want to see but of what you do see. Those aren’t always the same thing, but it doesn’t matter as long as you keep using the app.
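TikTok hasn’t published how its recommendation system actually works, but the watch-time dynamic is easy to sketch. Here is a toy Python model (every name and number in it is hypothetical, not TikTok’s real system) in which completion ratio alone decides what gets ranked higher, whether the viewer enjoyed a video or simply couldn’t look away:

```python
from collections import defaultdict

# Toy model of watch-time-driven ranking. All names and numbers are
# hypothetical illustrations; TikTok's real system is not public.

class ToyForYouRanker:
    def __init__(self):
        # Running "interest" score per topic, learned only from watch time.
        self.topic_scores = defaultdict(float)

    def record_view(self, topic, watch_seconds, video_seconds):
        # Completion ratio stands in for interest. A horrified,
        # rubbernecking full watch counts exactly the same as a delighted one.
        self.topic_scores[topic] += watch_seconds / video_seconds

    def rank(self, candidates):
        # candidates are (video_id, topic) pairs; highest-scoring topics first.
        return sorted(candidates, key=lambda c: self.topic_scores[c[1]], reverse=True)

ranker = ToyForYouRanker()
ranker.record_view("grief", watch_seconds=58, video_seconds=60)  # watched nearly all of it
ranker.record_view("dogs", watch_seconds=5, video_seconds=45)    # scrolled past quickly

print(ranker.rank([("v1", "dogs"), ("v2", "grief")]))
# [('v2', 'grief'), ('v1', 'dogs')]: more grief content, wanted or not
```

The point of the sketch is the failure mode described above: a watch-to-the-end that comes from horror scores exactly like one that comes from delight, so the feed drifts toward whatever holds your gaze.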
This ability to figure out who its users are and target content based on those assumptions is a big part of TikTok’s appeal. Some say the algorithm knows you better than you know yourself. One reporter said she realized she was bisexual thanks to TikTok’s algorithm, and she isn’t the only one. So even though I thought I didn’t like what TikTok was showing me, I had to consider that the algorithm had found something buried in my subconscious: a genuine desire to watch other people’s misfortunes. I don’t think that’s true, but I’m a journalist, so … maybe?
I’m not the only TikTok user who wonders what the TikTok algorithm thinks of them. A recent study of the relationship between TikTok users and the platform’s algorithm found that most users are well aware that the algorithm exists and of the major role it plays in their experience on the platform. Some even try to craft specific versions of themselves, what the study’s authors call their “algorithmic selves.” It’s similar to how users of other social media sites try to present themselves a certain way to the people who follow them; on TikTok, they’re doing it for the algorithm.
Aparajita Bhandari, a co-author of the study, told me that many of the users she spoke to would like or comment on certain videos in order to signal to the algorithm that they were interested in them and wanted to see more of the same.
“They had these interesting theories about how they thought the algorithm worked and how they could affect it,” Bhandari said. “It’s almost like they’re having a dialogue with themselves.”
To be fair to TikTok and my algorithmic self, I haven’t given the platform much to work with. My account is private, I have no followers, and I only follow a few accounts. I don’t like or comment on videos, nor do I post my own. I don’t know how or why TikTok decided I wanted to watch other people’s tragedies, but it was right that I’d keep doing it. I guess I rubbernecked one too many times.
I should also mention that some of this content is uploaded and shared for good reason. It’s clear that some of these videos are meant to raise awareness and help others, or to share a story with a community that wants to understand and support. And we all heal in our own way, so some people just want to make memes out of their tragedies.
All of which made me wonder what this algorithm-centric platform is doing to people who might be harmed by falling down a For You page rabbit hole. I’m talking about people who are served content related to eating disorders, which the Wall Street Journal recently reported on. Or extremist videos, which aren’t too hard to find and have been shown to play a role in radicalizing viewers on platforms far less addictive than TikTok. Or misinformation about the Covid-19 vaccines.
“The actual design choices of TikTok make it very intimate,” Dreyfuss said, describing how easily people lose themselves in the feed. “Then they realize they’ve been watching TikTok for two hours.”
TikTok is also becoming an app that people use for more than entertainment. Gen Z users appear to be turning to it as a search engine, though the accuracy of its results is an open question. They also use it as a news source, which can be problematic for the same reason, and because TikTok doesn’t give you context as simple as the date a video was posted. You often end up hunting for additional information in a video’s comments, which, again, may not be accurate.
TikTok says it is currently testing ways to diversify the content on users’ For You pages. I recently got a prompt, after watching a video about someone’s mother dying from gastric bypass surgery, asking how I “felt” about what I’d just seen. TikTok also has rules about sensitive content: subjects like suicide and eating disorders can be shared as long as they aren’t glorified, while content featuring violent extremism, for example, is banned outright. And it employs moderators to keep the really bad stuff from surfacing, sometimes at the expense of their own mental health.
There are a few things you can do to make your For You page a more pleasant place, but they take considerably more effort than it took to get the content I was trying to avoid in the first place. You can tell TikTok a video doesn’t interest you, though no matter how many times I did this, I didn’t notice much of a change. You can also seek out topics you find interesting, watch and engage with those videos, and follow their creators, which is what some of the participants in Bhandari’s study did. I also uploaded a few videos to my own account, and that seems to have made the difference: all of my videos feature my dog, and soon enough I started seeing dog-related videos in my feed.
Granted, many of those were tragic, too, like a final photo shoot for a dying dachshund, and a warning not to feed dogs corn cobs posted by an owner who had given her dog one. But maybe, over time, the happier dog videos I’ve started watching will creep up my For You page and outnumber the sad ones.
This story was first published in the Recode newsletter. Sign up here so you don’t miss the next one!