Nestlé, Epic Games and other major brands said on Wednesday that they had stopped buying advertisements on YouTube after their ads appeared on children’s videos where pedophiles had infiltrated the comment sections.
The companies acted after a YouTube user posted a video this week to point out this behavior. For the most part, the videos targeted by pedophiles did not violate YouTube’s rules and were innocent enough — young girls doing gymnastics, playing Twister or stretching — but the videos became overrun with suggestive remarks directed at the children.
The commenters left time stamps for parts of the videos that can appear compromising when paused — like a girl’s backside or bare legs. They also posted remarks that praised the girls, asked whether they were wearing underwear, or simply carried a string of sexually suggestive emojis.
About two years ago, hundreds of companies pulled money from YouTube over concerns about ads showing up next to problematic content from terror or hate groups and videos that seemed to endanger or exploit children.
Over the last year, many major advertisers have returned to the site after they were reassured that YouTube had made progress in flagging and dealing with problematic content more quickly.
The video highlighting the comments, posted by the YouTube creator Matt Watson (also known as MattsWhatItIs) and viewed 1.75 million times since it went up on Sunday, accused YouTube of “facilitating the sexual exploitation” of children. Mr. Watson said YouTube’s recommendation system also guided predators to other similar videos of minors — many of which carry advertisements for major brands.
Chi Hea Cho, a spokeswoman for YouTube’s parent company, Google, said the company had deleted the accounts and channels of people leaving the disturbing comments, removed comments that violated its policies and reported illegal activity to the authorities.