TikTok is rolling out a new system to penalize users who repeatedly break the video app’s rules.
As part of the policy, users who violate TikTok’s community guidelines will receive a warning when their content is removed, the company announced Thursday. Multiple strikes will result in a permanent ban, TikTok said.
The company is notifying its users of the changes as they roll out globally. Strikes will be issued to users whose comments, posts or live streams break the rules, or who violate specific policies such as those on bullying and harassment, the company explained.
The threshold for being kicked off the app will vary depending on the severity of the violation. In the most severe cases, users will be banned on the first strike. This currently applies to content that promotes or threatens violence; depicts or facilitates child sexual abuse material; or depicts real-world violence, torture, or non-consensual sexual acts such as rape or abuse.
The company will also crack down on those who violate its policies against promoting hateful ideologies, as well as those who share “low-harm spam,” a TikTok spokesperson told the Evening Standard.
The new enforcement regime comes as social media faces increased scrutiny. The UK’s Online Safety Bill, which would establish a duty of care requiring Internet platforms to protect children from harmful material, is currently before Parliament.
MPs on both sides of the political aisle have called for the bill to carry harsher punishments for tech bosses, including up to two years in jail if they are found to have broken the law.
TikTok itself has previously been criticized for showing users potentially dangerous content, including misinformation about Covid-19 and clips that could encourage eating disorders, self-harm and suicide.
The new rules are a departure from TikTok’s current enforcement system, which relies on a combination of temporary suspensions and educational resources. The company’s analysis has found that repeat offenders often follow a pattern of behavior: 90% of offenders consistently abuse the same feature, and more than 75% repeatedly violate the same policy.
TikTok is introducing the policy after hearing from creators that its current enforcement system is “confusing to navigate.”
“Overall, we are… bringing more transparency to violations accumulated by an account, and will warn a creator if their account is about to be permanently removed,” the TikTok spokesperson said.