YouTube removed more than 8 million videos from its site during the October-December quarter of 2017, most of them spam or adult content, and together they represented only a fraction of the site's total views.
According to a YouTube statement, 6.7 million of those videos were first flagged for review by machines rather than humans, and 76 per cent of them were removed before they received a single view.
In early 2017, 8 per cent of videos flagged and removed for violent extremism were taken down with fewer than 10 views, said the company. “We introduced machine learning flagging in June 2017,” it said. “Now more than half of the videos we remove for violent extremism have fewer than 10 views.”
YouTube will now start releasing quarterly reports on how it enforces its Community Guidelines. “This regular update will help show the progress we’re making in removing violative content from our platform,” it said. “By the end of the year, we plan to refine our reporting systems and add additional data, including data on comments, speed of removal, and policy removal reasons.”
The company is also introducing a Reporting History dashboard that lets each YouTube user check the status of videos they have flagged for violating its Community Guidelines.
Machine flagging also allows YouTube to surface content for review at scale, helping remove millions of violative videos before they are ever viewed. “And our investment in machine learning to help speed up removals is paying off across high-risk, low-volume areas (like violent extremism) and in high-volume areas (like spam),” it added.