
TIKTOK

Getting Acquainted

My sister convinced me to download TikTok sometime in December 2019. Its main feature is the “for-you” page – an endless queue of algorithmically curated videos, each less than a minute long, suggested entirely on the basis of your user data. When I first made an account, the only information I provided was my location and my phone number. My user data didn’t exist yet, and I didn’t fill out a questionnaire profiling my interests, so TikTok didn’t have much to go on in approximating what content I would want to see. When I first encountered my own personal for-you page, I was confronted with cat video after cat video. On instinct, every time one came up on my feed, I flagged it with the “not interested” button and kept scrolling. After just a few days on the app – tapping the heart icon for videos I liked, quickly scrolling past the ones I didn’t care for, and marking the content I really didn’t want to see as “not interested” – the content of my for-you page started to accurately reflect my interests.


TikTok’s accuracy has crossed the border into creepy territory plenty of times. One night, over dinner, my mom and I had a brief conversation about Marriage Story, specifically the scene where Laura Dern delivers a long monologue railing against the unfair expectations placed on mothers and their basis in what she calls “our Judeo-Christian whatever.” The whole exchange couldn’t have been longer than a minute or two. The next morning, I opened TikTok. Just a few videos into my for-you page, there was a clip of the exact scene from Marriage Story we’d talked about the night before, down to the one line I’d said out loud in the exchange with my mom: the Judeo-Christian whatever.
 

I can try to rationalize this eerie encounter with some more context about my TikTok and online activity: for one, the movie I referenced isn’t exactly niche. I follow a couple of film accounts; in the past I’ve liked videos featuring actors who are in Marriage Story; I’ve liked TikToks about Little Women, in which Laura Dern also appears, and which was directed by Greta Gerwig, the partner of Marriage Story’s director. If I don’t want to jump to the conclusion that TikTok is listening in on my private conversations, there are plenty of dots for it to connect that would eventually lead to that one specific scene in Marriage Story, just by following the chain of connections in my data.

 

When I downloaded my TikTok data, I expected to find at least some evidence of the inferences the platform has made to populate my for-you page. Instead, of all the platforms I’ve downloaded data from, TikTok’s is the most opaque and the least forthcoming with explanations.
 

The Download

To download my TikTok data, I had to go through the app’s Privacy settings, where I found a Personalization and Data section. From there, I found the option to request a data file. TikTok warns that it can take up to thirty days to process data requests, but my file was ready less than forty-eight hours later:

tiktok-home.png
tiktok-activity.png
tiktok-liked.png

A log of every TikTok that I've marked as "liked."

The information that is there is neither exciting nor different from what the other platforms have provided. There’s a log of every time I’ve logged into the app. A record of my privacy settings. Every comment I’ve ever left on a video. "Like List" is a log of every video I’ve ever liked. Even the Hashtag document under Activity, which I thought might contain a list of the hashtags from every video I’ve liked or commented on, is just a list of the hashtags I’ve used in my own posts. However, unlike the other platforms, TikTok doesn't provide any of the inferences it's made about me in order to generate content for my for-you page.

 

In a recent blog post, TikTok gave a rundown of how their for-you ‘recommendation system’ works: 

Recommendations are based on a number of factors, including things like:

  • User interactions such as the videos you like or share, accounts you follow, comments you post, and content you create.

  • Video information, which might include details like captions, sounds, and hashtags.

  • Device and account settings like your language preference, country setting, and device type. These factors are included to make sure the system is optimized for performance, but they receive lower weight in the recommendation system relative to other data points we measure since users don't actively express these as preferences.


All these factors are processed by our recommendation system and weighted based on their value to a user. A strong indicator of interest, such as whether a user finishes watching a longer video from beginning to end, would receive greater weight than a weak indicator, such as whether the video's viewer and creator are both in the same country. Videos are then ranked to determine the likelihood of a user's interest in a piece of content, and delivered to each unique For You feed. 6
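The weighting scheme the post describes can be sketched in a few lines of Python. Everything below is invented for illustration (TikTok publishes neither its real signals nor its weights); it only shows the general shape of a weighted ranking, where a strong indicator like finishing a video counts far more than a weak one like sharing a country with the creator.

```python
# Toy sketch of a weighted recommendation ranking. Signal names and
# weights are hypothetical -- they stand in for whatever TikTok measures.

def score_video(signals: dict, weights: dict) -> float:
    """Combine engagement signals into a single ranking score."""
    return sum(weights.get(name, 0.0) * value for name, value in signals.items())

# Hypothetical weights: strong interest indicators (finishing a video,
# sharing it) dominate weak ones (viewer and creator in the same country).
WEIGHTS = {
    "watched_to_end": 5.0,
    "shared": 4.0,
    "liked": 3.0,
    "same_country": 0.5,
}

candidates = {
    "cookware_video": {"watched_to_end": 1.0, "liked": 1.0, "same_country": 1.0},
    "dance_challenge": {"liked": 1.0, "same_country": 1.0},
}

# Rank candidates by score, highest first -- in this toy model, the top
# of the list is what would land on the for-you page next.
ranked = sorted(candidates, key=lambda v: score_video(candidates[v], WEIGHTS),
                reverse=True)
```

In this sketch, the video that was watched to the end outranks the one that was merely liked, which is the whole substance of TikTok's "weighted based on their value to a user" claim.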

So it's from all this simple, raw data that they have been able to approximate my interests. Since none of those approximations appear in my data file, I can only guess what inferences they've made based on the ever-evolving content of my own For You page.


For Me

After almost a year of regular visits to the app, the TikTok algorithm has gotten eerily specific in its understanding of what content I’m most likely to watch. My for-you page has become increasingly personalized and niche. There is no scenario outside the context of TikTok in which I would watch twenty-second videos matching Le Creuset cookware to different Hozier songs and accompanying kitchen remodels. It’s not something I would ever think to seek out myself, but one day, one of those videos showed up on my for-you page. I was mildly entertained, so I hit the like button, and when more videos of the same nature subsequently showed up, I liked those too. The algorithm has gotten to know me and my digital habits so well that it knows what videos I’m going to like before I do.

From my for-you page. TikTok user @ave.abe matches Le Creuset's meringue set with Spanish hacienda kitchens and sets the mood with a song by Irish folk artist Hozier. Why is this so calming?

While TikTok doesn't provide interest categories itself, its users do. To be asked, "What side of TikTok are you on?" is to be asked what kind of person TikTok has approximated you to be.


Typically, the users asking this question aren’t thinking about data at all. They’re asking what subcommunity of the app you’ve been grouped into based on the type of content that's been algorithmically placed on your for-you page. The side of TikTok that you’re on — and you can be on more than one “side” at once — says something about you, your social standing, your taste, and how interesting you are as a person. In this realm, there’s no greater insult than accusing someone of being on "Straight TikTok," which means your for-you page is full of conventionally attractive influencers, dance challenges, and lifestyle vlogs, occasionally sponsored by Dunkin' Donuts. In other words, it is content that is dull, unoriginal, and uninteresting, which in turn means that you are also dull, unoriginal, and uninteresting, at least according to the people who are not on Straight TikTok.

Charli D'Amelio, currently the most followed account on TikTok, promotes her new Dunkin Donuts drink, 'the Charli.'  

There was about a three-week stretch in September where I found myself grouped into a more niche TikTok community, WitchTok, which is exactly what it sounds like: my for-you page was overwhelmed with videos of people in black pointy hats, people in flowy white dresses explaining how to cleanse bad energy from your home by burning sage while Stevie Nicks played in the background, and one guy who’d made more than a dozen videos dressed as a seventeenth-century Puritan woman reinventing scenes from The Crucible.

My only explanation for why this might have happened is that sometime at the beginning of the semester, I was looking for an elective on my college’s course guide and came across a class on the History of Witchcraft. Suddenly, TikTok was crying witch. I might as well have been Goody Proctor dancing with the devil. While TikTok has denied having access to users' Google data, it's the only online activity I can think of that would link me to an interest in WitchTok.   

WitchTok. TikTok user @bequietjoe puts a modern spin on an old classic. 

Of course, the repercussions of miscategorization in this instance were minimal. It’s not like I was burned at the stake because of my status as a data witch. I was just stuck in a particular corner of the app for a few weeks. In other cases of mistaken data identity, the consequences cross over from the digital to the real world in damaging and alarming ways. 


In 2019, the NYU Law Review published a study by race, technology, and law scholar Rashida Richardson that called for skepticism toward data-driven predictive policing practices. Predictive policing is a system that uses past data in order to predict how likely it is that a person will be a perpetrator or victim of a crime, or how likely it is that a crime will occur in a specific location. The study found that the use of data by law enforcement to practice predictive policing in jurisdictions with a history of flawed, racially biased, and unlawful practices and policies has resulted in inaccurate predictions and reified existing biases in the prediction model 7.

 

Because data has mathematical and scientific connotations, it's often viewed as objective. In practice, that's not the case. Datafying something, whether it's the probability that an individual will like a video or the likelihood that they will commit a crime, will always leave gaps in understanding and lead to inaccuracies. While data can be informative, it is never the whole, objective truth, even when it's properly and accurately collected. Bad data, when fed into prediction models, often leads to the magnification of its flaws and biases.
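A toy simulation makes that magnification concrete. The numbers and the model below are made up and drastically simplified; the point is only that when a model's predictions decide where new data gets collected, an arbitrary skew in the historical records can compound rather than wash out.

```python
# Minimal sketch of a prediction feedback loop (not any real system):
# patrol the neighborhood with the most recorded incidents, and record
# new incidents only where patrols are looking.
import random

random.seed(0)  # fixed seed so the run is reproducible

# Two neighborhoods with the *same* true incident rate,
# but historical records that over-count the first one.
true_rate = [0.1, 0.1]
recorded = [30, 10]

for day in range(200):
    # "Predict" the hotspot from past records and patrol there.
    patrolled = 0 if recorded[0] >= recorded[1] else 1
    # An incident only enters the data where someone is looking.
    if random.random() < true_rate[patrolled]:
        recorded[patrolled] += 1

# The initial skew never corrects itself: area 1 is never patrolled
# again, so every new record lands in area 0, "confirming" the model.
```

Even with identical underlying behavior, the model's own output keeps widening the recorded gap, which is the feedback dynamic the predictive-policing critique describes.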
