YouTube’s censorship system is flawed

Amid a slew of recent controversies, YouTube needs to update its video classification system.

Update: Since the time of publication, YouTube has announced that Logan Paul will no longer be included in its top tier of preferred ad placements, although his videos will still be monetized, and his YouTube Red film has been cancelled.

As internet users in 2018, we can all produce content, share it with an audience and call ourselves “content creators” regardless of background, orientation or qualifications. Anyone with internet access can make videos for anyone else to watch, and YouTube is a platform where that opportunity comes to life.

While YouTube is a positive place where creators put out content across many genres, it is also a place where offensive videos can be shared and reach millions of viewers.

YouTube currently has a policy that prohibits videos containing offensive language, sexual content or “controversial subjects,” including tragedy or violence, from being monetized. An automated algorithm separates the offensive material from the non-offensive. However, like many automated systems, it is flawed.

YouTubers who make a living off the platform through advertisements and monetization have their inoffensive videos demonetized because of mistakes in the algorithm, while other YouTubers’ offensive videos remain monetized and available for millions of viewers to watch. The common theme of this flawed system is that more popular YouTubers with greater viewership can publish virtually anything that attracts views and money and still slip through the filters.

The standard YouTube has put in place is not enforced equally across the platform.

In March 2017, YouTube came under fire for censoring content from hundreds of LGBTQ creators and hiding their videos from the public under “Restricted Mode.”

While this is one of many issues centered on YouTube’s algorithm, the most recent case involves Logan Paul, a popular YouTuber with nearly 15.7 million subscribers.

Paul posted an offensive video containing images of a man who had committed suicide in the Japanese forest Aokigahara. Otherwise known as the Sea of Trees or the Suicide Forest, Aokigahara is notorious as a place where many have gone to end their lives. Hundreds of bodies are tragically found there every year, as reported by Scare Street Publishing.

In one of Paul’s most recent vlogs, the YouTuber and a group of his friends came across the body of a man who had committed suicide and proceeded to insensitively film and make light of the situation for their millions of viewers. The video was edited, posted and quickly became one of the biggest controversies to surround YouTube to date. How did the offensive nature of this video slip past YouTube’s censors?

After much backlash against both Paul and the site, YouTube released a statement on Twitter saying the company is “looking at further consequences” and will soon share the steps it is taking to “ensure that a video like this is never circulated again.”

YouTube must improve its algorithm and put an end to this error-prone system. A company as large and powerful as YouTube, which is owned by Google and has a potential net worth of nearly $40 billion, has the means to hire a team dedicated to filtering out sensitive material. The company has a duty to protect its users.

Will YouTube ignore the issue and continue with its flawed system, or will it take the proper measures to fix the problem? YouTube’s credibility as a company depends on its next move.


Samantha Moffett is a sophomore majoring in mass communications.