A human rights group claims TikTok recommends pornography and sexualised videos to children. Researchers created fake child accounts, activated safety settings, and still received sexually explicit search prompts. These led to clips of simulated masturbation and explicit pornographic sex. TikTok says it acted immediately once alerted and insists it prioritises safe, age-appropriate experiences for young users.
Fake profiles expose dangerous content
In July and August, Global Witness researchers set up four TikTok accounts, posing as 13-year-olds with false birth dates. The platform did not request additional verification. Investigators enabled TikTok’s “restricted mode”, a feature the company markets as a safeguard against sexual or mature material. Despite this, the accounts received sexualised search suggestions in the “you may like” section. These led to videos of women exposing breasts, flashing underwear, and simulating masturbation. In the most extreme cases, pornographic footage was embedded in ordinary-looking clips to evade moderation.
Global Witness warns of platform failure
Ava Lee from Global Witness described the findings as a “huge shock”. She said TikTok not only fails to protect children but actively recommends harmful material. Global Witness usually investigates how technology affects democracy, human rights, and climate change. The group first encountered TikTok’s explicit content during unrelated research in April.
TikTok defends safety measures
Global Witness reported its findings to TikTok earlier this year. TikTok said it removed the flagged content and introduced fixes. But when the group repeated the test in late July, sexual videos appeared again. TikTok says it offers more than 50 safety tools for teenagers and claims nine out of ten violating clips are removed before anyone views them. Following the report, the company said it upgraded search functions and removed additional harmful content.
Children’s Codes increase platform responsibility
On 25 July, the Children’s Codes under the Online Safety Act came into force. Platforms must enforce strict age verification and prevent children from accessing pornography. Algorithms must also block content linked to self-harm, suicide, or eating disorders. Global Witness conducted its second study after the codes took effect. Ava Lee urged regulators to step in, stressing that they must now enforce children’s online safety.
Users react to sexualised recommendations
During the investigation, researchers monitored TikTok users’ responses. Some expressed confusion over sexualised search prompts. One wrote: “can someone explain to me what is up with my search recs pls?” Another asked: “what’s wrong with this app?”
