YouTube inadequately deals with sexual exploitation of children

YouTube’s attempt to stop predatory comments on videos of and created by children has proven ineffective and has only escalated the situation.

On Feb. 19, a video posted by creator Matt Watson reignited major controversy around YouTube by exposing millions of ignored predatory comments.

In the video, Watson described major faults in YouTube’s recommendation algorithm. As a logged-in viewer searches for and watches videos over time, YouTube identifies the content and channels the viewer seems most interested in and recommends similar videos. Watson said this feature has made it far easier to discover the videos circulating in a “soft-core pedophilia ring.”
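YouTube’s actual recommendation system is proprietary and far more complex, but the similarity-based approach Watson describes can be illustrated with a deliberately simplified sketch. Everything below, from the video names to the tags, is hypothetical; the point is only to show how recommending “more of what you watched” creates a feedback loop:

```python
# Hypothetical sketch of similarity-based recommendation -- NOT YouTube's
# actual algorithm. Videos, tags and scores here are invented for illustration.
from collections import Counter

# Each video in this toy catalog is described by a set of tags.
CATALOG = {
    "gymnastics_practice": {"kids", "gymnastics", "vlog"},
    "try_on_haul":         {"kids", "fashion", "vlog"},
    "cooking_show":        {"food", "adults", "howto"},
    "asmr_eating":         {"kids", "food", "asmr"},
}

def recommend(watch_history, top_n=2):
    """Rank unwatched videos by how many tags they share with the history."""
    seen_tags = Counter()
    for video in watch_history:
        seen_tags.update(CATALOG[video])
    scores = {
        video: sum(seen_tags[tag] for tag in tags)
        for video, tags in CATALOG.items()
        if video not in watch_history
    }
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

# A viewer who watches one video of a child is steered toward more of them.
print(recommend(["gymnastics_practice"]))  # ['try_on_haul', 'asmr_eating']
```

In this toy model, watching a single video tagged “kids” pushes every other “kids” video to the top of the recommendations, which is the kind of self-reinforcing loop Watson argues the real system produces at scale.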

Young creators, who are especially vulnerable to predatory comments and sexual exploitation, are being punished by YouTube’s counterproductive actions. Public Domain Photo from Wikimedia Commons

This is not the first time the algorithm has contributed to the spread of sexualized content involving children. In 2017, YouTube updated its policies to address a situation in which altered, sexualized content of children was being recommended to children. Watson’s video exposed the algorithm again, this time for recommending the videos to pedophiles.

Many of the affected videos show children eating food on camera, doing gymnastics, filming try-on clothing hauls or simply talking to the camera. These videos, which are not pornographic in nature, are littered with thousands of predatory comments that sexualize and exploit children’s normal behavior.

YouTube has attempted to solve the pedophile problem by demonetizing all videos of children, restricting comments or deleting the affected videos altogether. Young creators, who are especially vulnerable to predatory comments and sexual exploitation, are being punished by these counterproductive actions. As a result, affected creators are unable to interact with their subscribers because of the comment restrictions and video deletions.

Demonetization, as defined by CBC.CA, is the process by which “YouTube decides which videos can collect ad revenue, based on whether they are deemed advertiser-friendly.”

Content creation itself is an extensive process that, at minimum, involves planning, filming, editing and promoting a single video. That process can take days; demonetization can take minutes.

A statement from a YouTube spokesperson confirmed that several of the videos featured in Watson’s video have since been removed, The Verge reported. In response to Watson’s video and the flood of concerning comments, many companies, including Epic Games, Nestlé, Disney and AT&T, quickly pulled their advertisements from YouTube around Feb. 20.

Thirteen-year-old creator MacKenna Kelly, known on YouTube as “Life With MaK,” was directly affected by predatory comments and the ineffective measures taken by YouTube. Kelly has over 1.4 million subscribers and creates ASMR-style videos. Several of Kelly’s videos now have comment restrictions, and some have been deleted entirely. After YouTube deleted tens of millions of comments, predatory and not, Kelly posted a video responding to the situation. “It’s not just the videos that are part of the channel, it’s also the comments,” Kelly said. “It’s like a 50/50, because how else are your fans supposed to reach out to you?”

Several creators have also expressed their discontent with the way YouTube has handled the situation.

YouTube veteran Colleen Ballinger first posted a 30-minute video early on March 26, briefly addressing the issue but primarily showing her everyday life. The video, posted on her secondary vlog-style channel, was quickly demonetized without explanation. YouTube has long faced questions about its approach to demonetization, often removing advertisements from videos without giving creators and viewers a clear gauge of what is “acceptable.” This often leaves creators confused and hurt, as the review process has proven inconsistent.

Ballinger then received a message saying, “all ads have been taken off this video because the content within this was not ad-friendly.” The video fit within the parameters of YouTube’s guidelines, free of foul language and inappropriate content.

In response, Ballinger posted a video later that day, directly addressing the problem. “What, about me talking about my life with my son, as a new mom, projects I’m working on, and then talking about how YouTube did this. How is that not ad-friendly?” Ballinger said.

Ballinger expressed concerns about the lack of consequences for perpetrators, saying, “They can’t comment anymore because comments are disabled, which means there’s no way to find these pedophiles and report them. So, to me this is not a fix to the issue. To me, it’s helping the pedophiles and blaming the victim.”

Emily Stisser – Opinion Editor 

This will be Emily Stisser’s first year on ECHO staff, but she made several contributions while taking a journalism class her freshman year.

 

