Between January and March 2025, more than 450,000 videos were removed from the TikTok platform in Kenya for violating the platform’s Community Guidelines.
This is according to its newly released Community Guidelines Enforcement Report for the first quarter of 2025.
Notably, 92.1 percent of these were removed before they were viewed, and 94.3 percent were removed within 24 hours of being posted.
Additionally, just over 43,000 accounts in Kenya were banned during the same period for violating Community Guidelines.
TikTok’s Community Guidelines outline the rules and expectations for users on the platform, promoting a safe and positive environment.
Violations can lead to content removal or account restrictions, potentially including permanent bans.
These guidelines cover a wide range of areas, from preventing violence and nudity to combating misinformation and protecting personal information.
By integrating advanced automated moderation technologies with the expertise of thousands of trust and safety professionals, TikTok said it enables faster and more consistent removal of content that violates its Community Guidelines.
The approach, it added, is vital in mitigating the damaging effects of misinformation, hate speech, and violative material on the platform.
With a proactive detection rate now at 99 percent globally, TikTok said it is more efficient than ever at addressing harmful content before users encounter it.
The social media platform said LIVE content enforcement remains its top priority.
According to the report, a worldwide total of 19 million LIVE rooms were stopped this quarter, a 50 percent increase from the previous quarter.
This increase, it said, demonstrates the effectiveness of TikTok’s prioritisation of moderation accuracy, as the number of appeals has remained steady despite the rise in automated moderation.
While TikTok LIVE enables creators and viewers to connect, create, and build communities together in real time, the platform has intensified its LIVE Monetisation Guidelines to clarify what content is or isn’t eligible for monetisation.
To ensure harmful content does not reach Kenyan children, in-app mental health support is now available for young people in Kenya.
This follows TikTok’s partnership with Childline Kenya to provide young people with direct access to local helplines in-app, offering expert support when they report content related to suicide, self-harm, hate, or harassment.
Childline Kenya will offer assistance including counseling, advice, free psychological support, and other essential services to ensure that the community can access support immediately.
Additionally, in June, TikTok announced a partnership with Mental360 to create locally relevant, evidence-based content aimed at raising awareness, reducing stigma, and promoting open conversations about mental health in Kenya.
As part of this initiative, TikTok also named Dr. Claire Kinuthia as one of its African Mental Health Ambassadors, who will help ensure users have access to trusted and reliable mental health resources on the platform.
This comes at a critical time in Kenya, where there is a growing need to bring mental health resources closer to those who need them the most, especially online.
To further strengthen safety on the platform, TikTok actively encourages its community to report any content, comments, or accounts that appear to violate the platform’s standards via the reporting tools in the TikTok Help Center.
By working collaboratively, TikTok said it is fostering a safe digital space where communities can thrive and individuals can share enriching experiences.