Emotional Vortexes and Rabbit Holes: New Research on TikTok and Platform Trust

May 25, 2024

An interview with Franziska Roesner, ARTT Co-Principal Investigator.

“Does the TikTok algorithm know me better than I know myself?” “Why was I shown this content?” Anyone who has ever lost track of time on TikTok, a video platform with more than 1.5 billion monthly active users and currently the most popular “must-have” iPhone app, has likely asked themselves these questions.

Along with several collaborators, ARTT Co-Principal Investigator Franziska Roesner, a University of Washington associate professor in the Paul G. Allen School of Computer Science & Engineering, set out to research how TikTok’s recommendations are personalized and how users engage with the platform based on them. Roesner and her collaborators have recently published two papers that examine TikTok’s recommendation algorithm and its impact on the people who use it.

“Platform designs are not neutral, and they influence how long you watch and what you watch, and what you’re getting angry or concerned about,” says Roesner in an interview with UW News about the research. “The algorithm is such a black box, to the public and to regulators. And to some extent, it probably is to TikTok itself. It’s not like someone is writing code that’s targeting a person who’s vulnerable to an eating disorder.”

We asked Roesner what the seemingly irresistible pull of TikTok’s recommendation engine might mean for trust among people spending time on the platform.

Q) In 2023, the Wall Street Journal found that TikTok can have an intensifying negative emotional effect upon its users, especially young ones. What does current research show about this? Are there any insights or concerns about possible effects upon conversations, such as potentially intensifying polarization?

Franziska Roesner: In terms of how TikTok users may be taken down an emotional vortex or rabbit hole, our team has had anecdotal experiences like that as users ourselves. But our research actually shows that a large part of what’s going on is that TikTok is showing its users videos about new topics and from new creators that aren’t obviously influenced by their viewing history (what we called “explore” rather than “exploit” videos).

We may have found something different from the Journal in part because the TikTok algorithm has changed since the 2023 story. But it could also be because in 2022, when we conducted our research, we were looking at real user behaviors rather than the bots that an earlier 2021 Journal investigation of the TikTok algorithm relied on. Recent research on YouTube has similarly found that the story on rabbit holes may not be (or may no longer be) as dire as once feared. If anything, newer research suggests that YouTube’s recommendations push users towards more moderate content.

How users’ interactions with other users shape the content they view is a really interesting question that, as far as I know, has not yet been studied in depth.
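To make the “explore” versus “exploit” distinction concrete, here is a minimal Python sketch of how one might label a recommended video against a viewer’s watch history. This is purely illustrative and is not the methodology from the papers; the signal fields (creator, hashtags, sound) and the overlap rule are assumptions made for the example.

    # Purely illustrative sketch: a toy labeler that marks a recommended video as
    # "exploit" when it overlaps with signals from the viewer's watch history,
    # and "explore" otherwise. Not the papers' actual methodology; the fields and
    # the overlap rule are assumptions for illustration.
    from dataclasses import dataclass, field

    @dataclass
    class Video:
        creator: str
        hashtags: set = field(default_factory=set)
        sound: str = ""

    def label_recommendation(video: Video, watch_history: list) -> str:
        """Return "exploit" if the video matches the user's history, else "explore"."""
        seen_creators = {v.creator for v in watch_history}
        seen_hashtags = set().union(*(v.hashtags for v in watch_history))
        seen_sounds = {v.sound for v in watch_history}
        overlaps = (
            video.creator in seen_creators
            or bool(video.hashtags & seen_hashtags)
            or video.sound in seen_sounds
        )
        return "exploit" if overlaps else "explore"

    # Example: a video from an unseen creator, with unseen hashtags and sound,
    # counts as an "explore" recommendation.
    history = [Video("chef_ana", {"cooking", "pasta"}, "kitchen_beat")]
    print(label_recommendation(Video("hiker_bo", {"trailrunning"}, "mountain_song"), history))  # explore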

Q) What does this mean for trust on TikTok?

Roesner: All recommendation algorithms – even really simple algorithms – embed values, such as whether or not users experience a chronological timeline. So, it’s important to understand and think critically about what those embedded values are, who benefits from them, and what the alternatives might be.

For users, I think it’s important to remember that TikTok and other platforms are not neutral “participants” in conversations: what videos you see, what comments you see, and which creators and hashtags get more attention are the result of a complex set of factors that are not necessarily fully understood even by the platform developers, but that optimize metrics the platform cares about (like engagement).

So, don’t get too angry, don’t overgeneralize from the videos you see about what people at large think about a topic, seek out alternative perspectives, and use more than one platform or source to learn about what’s going on in the world.
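As a purely illustrative aside (not drawn from the papers), the short Python snippet below contrasts a chronological feed with an engagement-ranked one, making concrete the earlier point that even a very simple ranking rule embeds values. The post fields and scoring weights are invented for the example and are not TikTok’s actual logic.

    # Two toy feed-ranking policies over the same posts. The choice between them
    # is a value judgment: recency-first versus engagement-first. All data and
    # weights here are invented for illustration.
    from datetime import datetime

    posts = [
        {"id": 1, "posted_at": datetime(2024, 5, 20, 9, 0),  "likes": 12,  "comments": 3},
        {"id": 2, "posted_at": datetime(2024, 5, 24, 18, 0), "likes": 4,   "comments": 1},
        {"id": 3, "posted_at": datetime(2024, 5, 22, 12, 0), "likes": 950, "comments": 410},
    ]

    # Value choice A: recency matters most (a chronological timeline).
    chronological = sorted(posts, key=lambda p: p["posted_at"], reverse=True)

    # Value choice B: engagement matters most (an attention-optimized feed).
    engagement_ranked = sorted(posts, key=lambda p: p["likes"] + 5 * p["comments"], reverse=True)

    print([p["id"] for p in chronological])      # [2, 3, 1]
    print([p["id"] for p in engagement_ranked])  # [3, 1, 2]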

Q) Are there ways to make time on TikTok healthier, or is that fundamentally not possible?

Roesner: Some values embedded in algorithms and platform design are likely fundamentally hard to align between users and the company. For example, TikTok wants and needs more user time and engagement, which may not ultimately be healthy for users – I don’t think that’s fixable by expecting TikTok to change.

But other values may be more in alignment, like exposing users to a variety of viewpoints on a topic, or helping them explore new potential topics of interest, or rewarding positive rather than negative engagement, or ensuring that important public service announcements are seen in a timely manner.

Read more about Roesner and her collaborators’ research methodology and other findings in UW News.

Roesner’s two new research papers can be found here:

https://franziroesner.com/pdf/zannettou-tiktok-chi24.pdf  

https://franziroesner.com/pdf/vombatkere-tiktok-webconf24.pdf