YouTube considers blocking the sharing of ‘questionable’ content

Surfacing trustworthy content is something all the major online services are working on these days, in part because misinformation and conspiracy theories have grown exponentially. Criticism has been directed at players such as YouTube, Facebook and Twitter. YouTube has now published a lengthy blog post about how it tries to prevent the spread of misinformation on its platform.

They explain that it may be appropriate to turn off the ability to share certain videos.

The short version of the blog post is that the top item on the list is detecting misinformation before it goes viral and spreads everywhere. The second point concerns how misinformation is shared from YouTube onto other platforms.

In addition, the company will step up its efforts around the world, which is mainly about language challenges. As long as the content is in English, it is easy for a US company to moderate it.

As examples of conspiracy theories, YouTube points, among other things, to those who wanted the “truth” about the September 11 attacks in New York, those who believed the Earth is flat, or that no one had actually landed on the Moon. In the past, the Google-owned company (part of Alphabet) has trained its algorithms on this older and more widespread content to detect new conspiracy theories, but it says it needs to react faster to spot new waves of misinformation and conspiracy theories so it can prevent them from spreading.


This is where it may also become appropriate to refuse to let content be shared. The company explains that it already downranks such content in the YouTube algorithm, so conspiracy theory videos are not usually suggested on your homepage. What is out of YouTube’s control, however, are other websites and forums where users share the content themselves.

It may therefore be appropriate to turn off the share button entirely beneath videos that are questionable enough that YouTube does not want to promote them, but at the same time within the rules enough that they are not removed from the service. Disabling the direct link to a video’s URL could also be a solution.

While this is presented as a possible way to prevent conspiracy theorists and others from spreading misinformation, YouTube notes that it sees problems with such a restriction. It explains that having an algorithm refrain from automatically selecting content on behalf of people who make no active choices themselves is one thing; turning off active sharing and viewing is something else.

In addition to acknowledging the infringement of users’ rights, the blog post also touches on the fact that context and discussion matter.

“We must be vigilant and strike a balance between limiting the spread of misinformation and preserving space for discussion of and learning about difficult and controversial topics.”

“We will continue to carefully explore different ways to limit the spread of harmful misinformation,” the blog post states; it can be read in its entirety here.

Hanisi Anenih

