TikTok says it deleted more than 49 million videos that broke its rules between July and December 2019.
About a quarter of those videos were deleted for containing adult nudity or sexual activity, the company said in its latest transparency report.
The video-sharing app also revealed it had received about 500 requests for data from governments and police, and had complied with about 480 of them.
The US has suggested it is “looking at” whether to ban the Chinese-owned app.
On Monday, US Secretary of State Mike Pompeo suggested that downloading TikTok would put citizens’ “private information in the hands of the Chinese Communist Party”.
He added that the US government was considering whether to ban Chinese-owned apps: “We are taking this very seriously. We are certainly looking at it,” he said, in a Fox News interview.
The government in India has already banned the app, citing cyber-security concerns.
TikTok is owned by Chinese firm ByteDance. The app is not available in China, but ByteDance operates a similar app, called Douyin, which is available.
TikTok said it had not received any government or police data requests from China, or any requests from the Chinese government to delete content.
US authorities are examining whether TikTok complied with a 2019 agreement aimed at protecting the privacy of under-13s.
TikTok says it offers under-13s a limited app experience with additional safety and privacy features.
According to TikTok’s transparency report:
- 25.5% of the deleted videos contained adult nudity or sexual acts
- 24.8% broke its child-protection policies, such as implicating a child in a crime or containing harmful imitative behaviour
- 21.5% showed illegal activities or “regulated goods”
- 3% were removed for harassment or bullying
- Less than 1% were removed for hate speech or “inauthentic behaviour”
TikTok’s transparency report also revealed:
- The 49 million deleted videos represented less than 1% of videos uploaded between July and December 2019
- 98.2% of the deleted videos were spotted by machine learning or moderators before being reported by users
by James Clayton, North America technology reporter
TikTok was only released in 2017 – and because it’s so new we know much less about the platform than we do about Facebook, for example.
This report offers at least a little detail about the kind of content it takes down.
There has been lots of focus recently on hate and extremism on platforms such as TikTok, but fewer column inches about sexual content or the safety of minors.
Yet around half the videos taken down were in those two categories.
What we don’t know, of course, is how much harmful content has been missed by its moderators and machines.