
TikTok today shared its Q2 2024 Community Guidelines Enforcement Report, part of its effort to be more transparent about content moderation. The report outlines the videos removed and the accounts penalised for violating its Community Guidelines, with the stated aim of keeping the platform safe and positive for users.
TikTok Q2 2024 report key findings
The report highlights TikTok's content-moderation efforts in Pakistan, where it removed 30,709,744 videos. Of this material, 99.5% was removed proactively, before any user report, and 97% was taken down within 24 hours of posting.
Globally, TikTok removed 178,827,465 videos, 144,430,133 of them through automated detection. The figures suggest TikTok is working to stay ahead of harmful content.
Efficiency and accuracy
The proactive detection rate now stands at 98.2%, pointing to improved efficiency. The report also notes a 50% reduction in removed videos that were later restored on appeal.
The drop in restored videos reinforces the accuracy of TikTok's content-moderation systems. This technological advancement helps TikTok scale its moderation efforts while maintaining a secure environment.
TikTok says it continues to invest in moderation technology and transparency. Users can report harmful content via in-app reporting, TikTok's Support Centre, or email.
TikTok's moderation process
TikTok employs a multi-step moderation process:
- Automated detection: AI-powered technology identifies potential violations.
- Human review: Trained moderators review flagged content.
- Action taken: Violating content is removed or accounts are suspended.
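The three steps above can be sketched as a simple pipeline. This is an illustrative sketch only: TikTok's actual systems are proprietary, and every function name, rule, and threshold below is a hypothetical stand-in.

```python
# Hypothetical sketch of a multi-stage moderation pipeline.
# None of these names or thresholds come from TikTok's report.

def automated_detection(video: dict) -> float:
    """Stage 1: an automated classifier scores the video for potential violations."""
    # Toy scoring rule standing in for an AI model.
    return 0.95 if "spam" in video.get("tags", []) else 0.1

def human_review(video: dict) -> bool:
    """Stage 2: a trained moderator confirms or clears flagged content."""
    # Toy rule standing in for human judgement.
    return video.get("reported_by_users", 0) > 0 or "spam" in video.get("tags", [])

def moderate(video: dict, threshold: float = 0.9) -> str:
    """Run the pipeline and return the action taken (Stage 3)."""
    if automated_detection(video) < threshold:
        return "allow"            # not flagged by automation
    if human_review(video):
        return "remove"           # violation confirmed: content removed
    return "allow"                # automated flag overturned on human review

print(moderate({"tags": ["spam"], "reported_by_users": 3}))  # remove
print(moderate({"tags": ["dance"]}))                         # allow
```

The human-review stage between automated flagging and removal is what the report's "restored videos" metric measures: the fewer removals that are later reversed, the more accurate the earlier stages were.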