A campaign group says TikTok’s algorithm promotes pornography and sexualised videos to children. Researchers created fake child accounts, enabled all available safety settings, and still received explicit search suggestions, which led to clips of women simulating masturbation and, in some cases, pornographic sex acts. TikTok says it took action once notified and insists it prioritises safe and age-appropriate use.
Child accounts reveal explicit searches
In late July and early August, Global Witness researchers set up four TikTok profiles, posing as 13-year-olds with false dates of birth. The platform did not ask for any additional identity checks. Investigators activated “restricted mode”, which TikTok advertises as a safeguard against mature and sexualised themes. Despite that, search prompts in the “you may like” section suggested sexual terms. These led to videos of underwear flashing, breast exposure and masturbation. At the extreme, some videos contained explicit pornography hidden inside innocent-looking clips.
Global Witness calls findings shocking
Ava Lee from Global Witness called the results a “huge shock”. She warned that the platform not only fails to protect children but actively pushes them toward dangerous material. Global Witness usually investigates how large tech companies affect democracy, climate issues and human rights; the group first encountered TikTok’s pornography problem by accident during unrelated research in April.
TikTok defends its moderation
Researchers reported the issue earlier this year. TikTok said it removed the flagged material and introduced fixes, but when Global Witness tested again in late July, sexual videos reappeared. TikTok insists it has more than 50 safety features for young users and says nine out of ten violating clips are removed before they are ever viewed. After the latest findings, the company said it had improved its search tools and deleted more harmful content.
New safety codes demand stronger action
On 25 July, the Children’s Codes under the Online Safety Act took effect. Platforms must now use strict age verification and block children from seeing pornography. Algorithms must also filter content that encourages eating disorders, self-harm or suicide. Global Witness conducted its second round of research after these rules came into force. Ava Lee urged regulators to intervene, saying children’s online protection must become a priority.
Users complain about search results
During the study, researchers observed comments from regular users. Some questioned why their search recommendations turned sexual. One wrote: “can someone explain to me what is up with my search recs pls?” Another asked: “what’s wrong with this app?”