YouTube to expand teams reviewing extremist content

December 4, 2017

(Reuters) – Alphabet Inc’s (GOOGL.O) YouTube said on Monday it plans to add more people next year to review and remove violent or extremist content on the video platform.

A 3D-printed YouTube icon is seen in front of a displayed YouTube logo in this illustration taken October 25, 2017. REUTERS/Dado Ruvic/Illustration

YouTube is taking tougher action to protect its users from inappropriate content, with stricter policies and larger enforcement teams, YouTube Chief Executive Susan Wojcicki said in a blog post. bit.ly/2km1Dfi

“We are also taking aggressive action on comments, launching new comment moderation tools and in some cases shutting down comments altogether,” Wojcicki said.

The goal is to bring the total number of people across Google working to address content that might violate its policies to over 10,000 in 2018, she said.

YouTube last week updated its recommendation feature to spotlight videos users are likely to find the most gratifying, brushing aside concerns that such an approach can trap people in bubbles of misinformation and like-minded opinions. [nL1N1NY2M9]

YouTube has faced mounting criticism from advertisers, regulators and advocacy groups for failing to police content and to account for the way its services shape public opinion.

Reporting by Rishika Chatterjee; Editing by Gopakumar Warrier
