Tech News: YouTube claims that it’s getting better at removing videos that violate its policies.
YouTube released data on Tuesday showing it is improving at detecting and removing videos that violate its rules against disinformation, hate speech and other banned content.
The video service, owned by Google, said that 0.16% to 0.18% of all video views on its platform in the fourth quarter of 2020 involved content that violated its rules — a metric it calls the violative view rate (VVR). That’s a 70% drop from the same period of 2017, the year the company began tracking the figure. In other words, between 16 and 18 out of every 10,000 views on the platform went to videos that violated the content guidelines.
YouTube claims that VVR is a better metric for tracking its performance than, for example, how long a questionable video stays on the platform. That may be true, but it comes with an important caveat: the statistic only counts videos that violate company policies. Plenty of videos that people find problematic still end up on YouTube without technically violating the company’s community guidelines.
And then there are the “borderline” cases, such as a “documentary” livestream of the aftermath of the recent mass shooting in Boulder that the company, after careful consideration, decided to leave up. YouTube doesn’t account for these in its stats, so we only get a partial picture of its enforcement efforts.