Olufemi Adeyemi 

TikTok, the popular short-video platform owned by ByteDance, has released its Community Guidelines Enforcement Report for the fourth quarter of 2024, detailing the scale of its content moderation worldwide. Notably, the report shows that 2.4 million videos from Nigerian users were removed for violating the platform's content policies, placing Nigeria among the top 50 countries by policy violations.

Globally, TikTok removed 153 million videos during the period, underscoring the scale of the content moderation challenge facing social media platforms. The United States topped the list with 8.5 million removed videos. The top 50 markets, including Nigeria, collectively accounted for approximately 90% of all content removals, showing how concentrated violations are in a relatively small set of regions.
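For context, the country-level figures can be set against the global total with a quick back-of-the-envelope calculation, using only the numbers cited above:

```python
# Shares of TikTok's Q4 2024 global removals, computed from the
# figures cited in this article (all values as reported).
GLOBAL_REMOVALS = 153_000_000   # total videos removed worldwide

removals = {
    "United States": 8_500_000,  # highest single market
    "Nigeria": 2_400_000,        # figure highlighted in the report
}

for country, count in removals.items():
    share = count / GLOBAL_REMOVALS * 100
    print(f"{country}: {share:.1f}% of global removals")

# Approximate output:
# United States: 5.6% of global removals
# Nigeria: 1.6% of global removals
```

In other words, even the largest single market accounts for only about 5.6% of removals; it is the top 50 markets taken together that reach the roughly 90% figure.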

Policy Violations and Account Removals

The removed content spanned various categories of violations, including:

  • Integrity and Authenticity: Violations related to misinformation, manipulated media, and inauthentic behavior.
  • Privacy and Security: Issues concerning the unauthorized sharing of personal information and online safety.
  • Mental and Behavioral Health: Content that could negatively impact users' mental well-being, including self-harm and eating disorders.
  • Safety: Content that promotes dangerous activities, violence, or exploitation.
  • Civility: Hate speech, harassment, and other forms of disrespectful content.

In addition to video removals, TikTok took action against accounts violating its guidelines. A total of 211.5 million accounts were removed, with:

  • 185.3 million identified as fake accounts, representing the largest category.
  • 20.5 million suspected to be owned by users under the age of 13.
  • 5.6 million removed for other unspecified reasons.
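As a quick sanity check on these figures, using only the numbers reported above, the three categories sum to roughly the stated total, and each category's share is easy to compute:

```python
# Sanity check on the reported account-removal breakdown (in millions).
total_reported = 211.5
categories = {
    "fake accounts": 185.3,
    "suspected under-13 accounts": 20.5,
    "other reasons": 5.6,
}

subtotal = sum(categories.values())
print(f"sum of categories: {subtotal:.1f}M")  # 211.4M, close to the
                                              # reported 211.5M total;
                                              # the small gap is likely rounding

for name, count in categories.items():
    print(f"{name}: {count / total_reported:.1%} of removals")

# Approximate output:
# fake accounts: 87.6% of removals
# suspected under-13 accounts: 9.7% of removals
# other reasons: 2.6% of removals
```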

TikTok attributed some of the increase in account removals to an updated classification system for fake likes and followers, which it says gives a more accurate picture of its efforts to combat artificial engagement.
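TikTok has not published the details of its updated classifier, but the general idea of flagging artificial engagement can be illustrated with a deliberately simple, hypothetical heuristic. Every feature name and threshold below is invented for illustration and is not drawn from TikTok's actual system:

```python
from dataclasses import dataclass

@dataclass
class AccountStats:
    # Hypothetical engagement features; not TikTok's actual schema.
    followers: int
    following: int
    likes_given_per_day: float
    account_age_days: int

def looks_like_fake_engagement(acct: AccountStats) -> bool:
    """Toy heuristic: flag accounts whose behavior resembles
    bought likes/followers. Thresholds are illustrative only."""
    # Brand-new accounts that mass-like content are a classic bot pattern.
    if acct.account_age_days < 7 and acct.likes_given_per_day > 500:
        return True
    # Following thousands while having almost no followers suggests
    # a follow-for-follow engagement farm.
    if acct.following > 5_000 and acct.followers < 50:
        return True
    return False

# Example: a day-old account liking 1,000 videos a day gets flagged.
bot = AccountStats(followers=2, following=6_000,
                   likes_given_per_day=1_000, account_age_days=1)
print(looks_like_fake_engagement(bot))  # True
```

Production systems presumably combine many more signals and machine-learned models; the point of the sketch is only that reclassifying what counts as "fake engagement" directly changes how many accounts get removed, which is the effect TikTok describes.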

Challenges and Legal Scrutiny

Despite these moderation efforts, TikTok continues to face significant challenges. The platform's algorithm, designed to maximize user engagement, has drawn criticism for its potential to expose users, particularly children, to harmful content.

In a notable development, 13 U.S. states and the District of Columbia filed lawsuits against TikTok in October 2024. These lawsuits allege that the platform's design is intentionally addictive, exploiting children's vulnerabilities for profit. The states are seeking financial penalties and increased accountability, raising concerns about the platform's impact on mental health and the effectiveness of its content moderation.

The underlying worry is that software built to retain users, especially younger ones, for long stretches works against both mental well-being and effective content moderation. The legal battle intensifies the pressure on TikTok to demonstrate its commitment to user safety and responsible platform management.

Implications and Future Directions

The high volume of content removals in Nigeria and globally highlights the ongoing struggle to balance freedom of expression with the need to protect users from harmful content. As TikTok continues to grow in popularity, particularly among younger audiences, it faces increasing scrutiny from regulators, lawmakers, and advocacy groups.

The platform's future will likely depend on its ability to effectively address concerns about content moderation, user safety, and the potential for algorithmic manipulation. Stricter regulations, enhanced transparency, and ongoing collaboration with experts and stakeholders will be crucial in ensuring a safer and more responsible online environment.