Between July and December of 2020, TikTok removed thousands of videos for breaking its rules around misinformation about the 2020 presidential election and the coronavirus pandemic.
The company removed 347,225 videos for sharing election misinformation or manipulated media, according to the report. An additional 441,000 clips were removed from the app’s recommendations because the content was “unsubstantiated.”
At the same time, TikTok took down 51,505 videos for sharing misinformation about COVID-19. In its report, TikTok notes that 87 percent of these clips were removed within 24 hours of being posted, and that 71 percent had “zero views” at the time they were removed.
TikTok also used in-app notices to direct users to authoritative information, and the company claims these PSAs were viewed more than 73 billion times.
In its report, TikTok says it was well-prepared for the election, and that much of the misinformation was from domestic sources within the United States. “We prepared for 65 different scenarios, such as premature declarations of victory or disputed results, which helped us respond to emerging content appropriately and in a timely manner,” TikTok writes. “We also prepared for more domestic activity based on trends we’ve observed on how misleading content is created and spread online. Indeed, during the US 2020 elections, we found that a significant portion of misinformation was driven by domestic users –– real people.”
In total, the app took down more than 89 million videos that broke its rules, according to the report. Beyond election and coronavirus misinformation, the company says it takes action against any content that violates its community guidelines.