TikTok Lawsuit: Internal Docs Reveal App's Teen Impact (Oct 11, 2024)

Summary

Internal documents made public through a lawsuit against TikTok reveal the company's awareness of the app's potential negative psychological effects on teenagers. According to the suit, TikTok's algorithm is designed to maximize user engagement even at the expense of users' well-being, through practices such as rapid-fire, automatically playing videos that encourage compulsive use.

Full Transcript

1. Mark your confusion. 2. Show evidence of a close reading. 3. Write a 1+ page reflection.

TikTok Executives Know about App’s Effect on Teens, Lawsuit Documents Allege
Source: Bobby Allyn, NPR.org, October 11, 2024

For the first time, internal TikTok communications have been made public that show a company unconcerned with the harms the app poses for American teenagers, despite its own research validating many child safety concerns. The confidential material was part of a more than two-year investigation into TikTok by 14 attorneys general that led to state officials suing the company on Tuesday. The lawsuit alleges that TikTok was designed with the express intention of addicting young people to the app. The states argue the multi-billion-dollar company deceived the public about the risks.

TikTok users can become ‘addicted’ in 35 minutes

As TikTok’s 170 million U.S. users can attest, the platform’s hyper-personalized algorithm can be so engaging it becomes difficult to close the app. TikTok determined the precise amount of viewing it takes for someone to form a habit: 260 videos. After that, according to state investigators, a user “is likely to become addicted to the platform.” In a previously redacted portion of the suit, Kentucky authorities explain: “While this may seem substantial, TikTok videos can be as short as 8 seconds and are played for viewers in rapid-fire succession, automatically,” the investigators wrote. “Thus, in under 35 minutes, an average user is likely to become addicted to the platform.”

Another internal document found that the company was aware that its many features designed to keep young people on the app led to a constant and irresistible urge to keep opening it. TikTok’s own research states that “compulsive usage correlates with a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety,” according to the suit. In addition, the documents show that TikTok was aware that “compulsive usage also interferes with essential personal responsibilities like sufficient sleep, work/school responsibilities, and connecting with loved ones.”

TikTok: Time-limit tool aimed at ‘improving public trust,’ not limiting app use

The unredacted documents show that TikTok employees were aware that too much time spent by teens on social media can be harmful to their mental health; the consensus recommendation among academics is one hour or less of social media usage per day. The app lets parents place time limits on their kids’ usage that range from 40 minutes to two hours per day, and TikTok created a tool that set the default time prompt at 60 minutes per day. Internal documents show that TikTok measured the success of this tool by how it was “improving public trust in the TikTok platform via media coverage,” rather than by how much it reduced the time teens spent on the app. After tests, TikTok found the tool had little impact: teens went from spending around 108.5 minutes per day on the app to roughly 107 minutes with the tool, a drop of about 1.5 minutes. According to the attorney general’s complaint, TikTok did not revisit this issue. One document shows a TikTok project manager saying, “Our goal is not to reduce the time spent.” In a chat message echoing that sentiment, another employee said the goal is to “contribute to DAU [daily active users] and retention” of users.
TikTok has publicized its “break” videos, which are prompts to get users to stop endlessly scrolling and take a break. Internally, however, it appears the company didn’t think the videos amounted to much. One executive said that they are “useful in a good talking point” with policymakers, but “they’re not altogether effective.”

TikTok demoted people it deemed unattractive on its feed

The multi-state litigation against TikTok highlighted the company’s beauty filters, which users can overlay on videos to make themselves look thinner and younger or to have fuller lips and bigger eyes. One popular feature, known as the Bold Glamour filter, uses artificial intelligence to rework people’s faces to resemble models with high cheekbones and strong jawlines. TikTok is aware of the harm these beauty filters can cause young users, the documents show. Employees suggested internally that the company “provide users with educational resources about image disorders” and create a campaign “to raise awareness on issues with low self-esteem (caused by the excessive filter use and other issues).” They also suggested adding a banner or video to the filters that included “an awareness statement about filters and the importance of positive body image/mental health.”

This comes as the documents showcase another hidden facet of TikTok’s algorithm: the app prioritizes beautiful people. One internal report that analyzed TikTok’s main video feed found that “a high volume of … not attractive subjects” was filling everyone’s app. In response, Kentucky investigators found, TikTok retooled its algorithm to amplify users the company viewed as beautiful. “By changing the TikTok algorithm to show fewer ‘not attractive subjects’ in the For You feed, [TikTok] took active steps to promote a narrow beauty norm even though it could negatively impact their Young Users,” the Kentucky authorities wrote.

TikTok exec: algorithm could deprive kids of opportunities like ‘looking at someone in the eyes’

Publicly, TikTok has stated that one of its “most important commitments is supporting the safety and well-being of teens.” Yet internal documents paint a very different picture, citing statements from top company executives who appear well aware of the harmful effects of the app without taking significant steps to address them. One unnamed TikTok executive put it in stark terms: the reason kids watch TikTok is the power of the app’s algorithm, “but I think we need to be cognizant of what it might mean for other opportunities,” the executive said. “And when I say other opportunities, I literally mean sleep, and eating, and moving around the room, and looking at someone in the eyes.”

TikTok’s internal estimate: 95% of smartphone users under 17 use TikTok

TikTok views itself as being in an “arms race for attention,” according to a 2021 internal presentation. Teenagers have been key to the app’s early growth in the U.S., but another presentation shown to top company officials revealed that an estimated 95% of smartphone users under 17 use TikTok at least once a month. This led a company staffer to state that the app had “hit a ceiling among young users.” TikTok’s own research concluded that kids were the most susceptible to being sucked into the app’s infinitely flowing feed of videos. “As expected, across most engagement metrics, the younger the user, the better the performance,” according to a 2019 TikTok document.
In response to growing national concern that excessive social media use can increase the risk of depression, anxiety and body-image issues among kids, TikTok has introduced time-management tools. These include notifications informing teens about how long they are spending on the app, parental oversight features and the ability to make the app inaccessible for some downtime. At the same time, however, TikTok knew how unlikely it was that these tools would be effective, according to materials obtained by Kentucky investigators. “Minors do not have executive function to control their screen time, while young adults do,” read a TikTok internal document.

TikTok pushes users into filter bubbles like ‘painhub’ and ‘sadnotes’

TikTok is well aware of “filter bubbles.” Internal documents show the company has defined them as what happens when a user “encounters only information and opinions that conform to and reinforce their own beliefs, caused by algorithms that personalize an individual’s online experience.” The company knows the dangers of filter bubbles: during one internal safety presentation in 2020, employees warned the app “can serve potentially harmful content expeditiously.” TikTok conducted internal experiments with test accounts to see how quickly they descend into negative filter bubbles. “After following several ‘painhub’ and ‘sadnotes’ accounts, it took me 20 mins to drop into ‘negative’ filter bubble,” one employee wrote. “The intensive density of negative content makes me lower down mood and increase my sadness feelings though I am in a high spirit in my recent life.” In another document, TikTok’s research found that content promoting eating disorders, often called “thinspiration,” is associated with issues such as body dissatisfaction, disordered eating, low self-esteem and depression.

Despite these warnings, TikTok’s algorithm still puts users into filter bubbles. One internal document states that users are “placed into ‘filter bubbles’ after 30 minutes of use in one sitting.” The company wrote that having more human moderators to label content is possible, but “requires large human efforts.”

TikTok slow to remove users under 13, despite company policy

Kids under 13 cannot open a standard TikTok account, but there is a “TikTok for Younger Users” service that the company says includes strict content guardrails. This is a vulnerable group of users, since federal law dictates that social media sites like TikTok cannot collect data on children under 13 unless parents are notified about the personal information collected, and even then, social media apps must first obtain verifiable consent from a parent. In August, the Department of Justice sued TikTok for violating the federal law protecting the data of kids under 13, alleging that the app “knowingly and repeatedly violated kids’ privacy.” In the internal documents, however, company officials instructed TikTok moderators to use caution before removing accounts of users suspected to be under 13.

Possible Response Questions

1. What are your thoughts about the TikTok app and its effect on teens? Explain.
2. Did something in the article surprise you? Discuss.
3. Pick a word/line/passage from the article and respond to it.
4. Discuss a “move” made by the writer in this piece that you think is good/interesting. Explain.
