YouTube is pushing back against claims that its platform is helping promote and spread misinformation about the 2020 US election. YouTube also says it takes action to stop the spread of videos containing false or misleading claims by not surfacing them in search results or through its recommendation engine.
These videos were unsubstantiated and spread misinformation challenging the validity of the election’s outcome. Yet they are still available on YouTube and are being shared widely on Facebook and other social media platforms.
This is not a glitch in YouTube’s policy enforcement. Instead, YouTube has said its policies are working as intended. Its light-touch approach differs from that of Twitter and Facebook, which have cracked down on misinformation about the election result and have more prominently labeled content they consider to be misinformation.
YouTube has found that the most popular videos about the election come from authoritative news organizations. On average, about 88 percent of the top-10 results in the US come from highly authoritative sources when people search for election-related content.
YouTube has come under fire in the run-up to and after Election Day for allowing videos from organizations like One America News Network that falsely say President Donald Trump won reelection and that mass voter fraud is responsible for his loss to President-elect Joe Biden.
YouTube leaves up misinformation claiming that Donald Trump won reelection.
Video is harder to analyze than text, said Ms. Kaplan, founder of Alethea Group, a company that helps fight election-related misinformation, and videos are not shared in the same way that Facebook posts and tweets are.
Before the election, YouTube said it would not permit misleading election information, but it limited that policy mainly to the procedures of voting — how to vote, who was eligible to vote or be a candidate, or any claims that could discourage voting. The policies did not extend to people expressing views on the outcome of a current election, meaning videos spreading misinformation about the vote’s result would be allowed.
However, YouTube appears to be struggling to contain the spread of videos uploaded to its platform across other social networks like Facebook and Twitter, where the videos often go viral before either company is able to slow the flow.