As an example of prohibited content, YouTube has announced that it will take down videos claiming that a presidential candidate won the election because of widespread software glitches or vote-counting errors.
YouTube will begin enforcing the policy on Wednesday and said it will “step up” its efforts in the coming weeks. News coverage and commentary videos will still be allowed to remain on the platform if they provide sufficient context. Videos posted before Wednesday that violate the policy will stay up, even though they now conflict with YouTube’s rules; they will carry an information panel confirming the election results.
Asked why YouTube did not implement these guidelines before or during the election, a YouTube spokesperson cited Tuesday’s “safe harbor” deadline as the reason.
YouTube’s existing election policies already prohibited content that misleads people about where and how to vote. The company also placed an information panel atop election-related search results and under election-related videos. The panel linked to Google’s election results feature and to the Cybersecurity and Infrastructure Security Agency’s “Rumor Control” page, which addresses misinformation about election integrity. YouTube has also sought to promote content from authoritative news sources in search results.
Since September, the company says it has removed more than 8,000 channels and “thousands” of harmful and misleading videos about the election for violating its rules. More than 77% of those videos were taken down before they reached 100 views.
Separately, Google plans to lift its post-election moratorium on political ads on December 10, the company told advertisers on Wednesday. In a letter to advertisers obtained by CNN, the technology giant said it would no longer designate the U.S. election a “sensitive event” and would reinstate its normal election advertising policies. The moratorium, announced before the election, was expected to last at least a week after Election Day but ran for roughly a month.
Brian Fung of CNN Business contributed reporting.