YouTube waited a month after the election to crack down on false claims of election fraud


YouTube, which is owned by Google, announced that it would begin removing this content on Wednesday, citing Tuesday’s U.S. election “safe harbor” deadline, the date after which states’ election results can no longer be effectively contested. YouTube said enough states have certified their election results to determine a president-elect. National media outlets have projected that Joe Biden will be the next president.

As an example of prohibited content, YouTube said it will take down videos claiming that a presidential candidate won the election because of widespread software glitches or vote-counting errors.

The company will start enforcing the policy on Wednesday and said it will “ramp up” its efforts in the coming weeks. Videos featuring news coverage and commentary will still be allowed to remain on the platform if they provide sufficient context. Videos posted before Wednesday that run afoul of the new rules will stay up, even though they now violate YouTube’s policies, but they will carry an information panel noting the certified election results.

Asked why YouTube did not implement these rules before or during the election, a YouTube spokesperson pointed to Tuesday’s “safe harbor” deadline as the reason.

During the election, YouTube arguably took the least aggressive steps among the major platforms to combat election misinformation. For example, a video claiming President Trump had won another four years in office, filled with baseless allegations that Democrats were “throwing out Republican ballots, harvesting fake ballots, and delaying the results to create confusion,” was allowed to remain on the platform. YouTube said at the time that the video did not violate its rules and therefore would not be removed.

YouTube’s election policies have banned content that misleads people about where and how to vote. The company also placed an information panel at the top of election-related search results and beneath election-related videos. The panel linked to both Google’s election results feature and the Cybersecurity and Infrastructure Security Agency’s “Rumor Control” page, which debunks misinformation about election integrity. YouTube has also sought to promote content from authoritative news sources in search results.

During and after the election, Twitter (TWTR) labeled tweets containing misinformation and restricted how they could be shared, including many from President Trump. Facebook (FB) initially only applied labels, but later took temporary steps to limit the reach of misleading posts and add friction to sharing them.

Since September, the company says it has taken down more than 8,000 channels and “thousands” of harmful and misleading videos about the election for violating its rules. More than 77% of those videos were removed before receiving 100 views.

Separately, Google also plans to lift its moratorium on election-related political ads on December 10, the company told advertisers on Wednesday. In an email to advertisers obtained by CNN, the technology giant said it would no longer designate the U.S. election a “sensitive event” and would restore its normal election advertising policies. The moratorium, announced before the election, was expected to last at least a week after Election Day but ended up lasting roughly a month.

Brian Fung of CNN Business contributed reporting.