“There’s Always a Way to Get Around the Guidelines”: Nonsuicidal Self-Injury and Content Moderation on TikTok
Social Media + Society (IF 5.5). Pub Date: 2024-05-17. DOI: 10.1177/20563051241254371
Valerie Lookingbill, Kimanh Le

The stigmatized nature of nonsuicidal self-injury may render TikTok, a short-form, video-sharing social media platform, appealing to individuals who engage in this behavior. Because this community faces biased scrutiny rooted in the stigmatization of mental health, users who self-injure may turn to TikTok, which offers a space to discuss nonsuicidal self-injury, exchange social support, experience validation with little fear of stigmatization, and facilitate harm-reduction strategies. While TikTok’s Community Guidelines permit users to share personal experiences with mental health topics, TikTok explicitly bans content that shows, promotes, or shares plans for self-harm. As such, TikTok may moderate user-generated content, leading to exclusion and marginalization in this digital space. Through semi-structured interviews with 8 TikTok users and a content analysis of 150 TikTok videos, we explore how users with a history of nonsuicidal self-injury experience TikTok’s algorithm as they engage with content on nonsuicidal self-injury. Findings demonstrate that users understand how to work around TikTok’s algorithm through hashtags, signaling, and algospeak, maintaining visibility while evading algorithmic detection on the platform. Furthermore, findings emphasize that users actively engage in self-surveillance, self-censorship, and self-policing to create a safe online community of care. Content moderation, however, can ultimately hinder progress toward the destigmatization of nonsuicidal self-injury.

Updated: 2024-05-17