YouTube Is Broadcasting Porn Ads For Adult Webcam Sites


YouTube has been showing video advertisements for adult webcam sites, and minors were able to see sexually explicit images and links. According to YouTube, the advertiser in question enjoyed a few hours of promotion before its ad was taken down for violating the platform's terms of service, which prohibit pornographic content.

If you were browsing trending videos from famous YouTubers such as PewDiePie, for example, you might have seen an ad for an adult webcam website. The video advertisement, which ran on popular YouTube channels and videos, featured two people engaging in sexual activity. A link labeled "Hot Girl 2018" in the bottom left of the video was accompanied by a clearly visible thumbnail of the explicit scene.

YouTubers and members of the public noticed these pornographic advertisements and reported them to the video platform. Viewers strongly objected to this kind of advertising being visible and accessible to minors. Here is one of their reactions on Twitter:

I don’t normally post this kind of thing, but my friend asked me to because he doesn’t have Twitter. He was watching a video from Markiplier when there was a thumbnail advertisement featuring a porno. @YouTube @TeamYouTube, how can this be permitted in your advertising, on content that children watch?

YouTube responded to concerned Twitter users by pointing to Google's "AdWords Help" page. Google AdWords' advertising policy, which also governs the ads shown on YouTube videos, states that "adult content" is prohibited. The page for reporting questionable ads notes that "some advertisements may appear on Google before the AdWords specialists are able to see them to moderate them".

The webcam site advertisement was eventually removed from YouTube "for violating YouTube's spam, deceptive practices and scams policy".

YouTube has been criticized before for serving inappropriate content, particularly to children. In December, the company announced that it was hiring 10,000 people to review and moderate inappropriate content on the video platform. It looks like they may have missed this one…