Facebook hiring 3,000 workers to catch and remove streaming violence

Facebook Inc will hire 3,000 more people over the next year to respond to reports of inappropriate material on the social media network and speed up the removal of videos showing murder, suicide and other violent acts, Chief Executive Mark Zuckerberg said on Wednesday.

The hiring spree is an acknowledgement by Facebook that, at least for now, it needs more than automated software to improve monitoring of posts. Facebook Live, a service that allows any user to broadcast live, has been marred since its launch last year by instances of people streaming violence.

WATCH: Facebook promises action on violent video problem
Zuckerberg, the company’s co-founder, said in a Facebook post that the workers will be in addition to the 4,500 people who already review posts that may violate its terms of service.

Last week, a father in Thailand broadcast himself killing his daughter on Facebook Live, police said. After more than a day, and 370,000 views, Facebook removed the video. Other videos, from places such as Chicago and Cleveland, have also shocked viewers with their violence.

Zuckerberg said: “We’re working to make these videos easier to report so we can take the right action sooner – whether that’s responding quickly when someone needs help or taking a post down.”

READ MORE: Facebook facing criticism after Thai man livestreams daughter’s murder

The 3,000 positions will be new, and the workers will monitor all Facebook content, not just live videos, the company said. It did not say where the jobs would be located.

Facebook is due to report quarterly revenue and earnings later on Wednesday after markets close in New York.

The world’s largest social network, with 1.9 billion monthly users, has been turning to artificial intelligence to try to automate the process of finding pornography, violence and other potentially offensive material. In March, the company said it planned to use such technology to help spot users with suicidal tendencies and get them assistance.

However, Facebook still relies largely on its users to report problematic material. It receives millions of reports from users each week, and like other large Silicon Valley companies, it relies on thousands of human monitors to review the reports.

“Despite industry claims to the contrary, I don’t know of any computational mechanism that can adequately, accurately, 100 percent do this work in lieu of humans. We’re just not there yet technologically,” said Sarah Roberts, a professor of information studies at UCLA who looks at content monitoring.

The workers who monitor material generally work on contract in places such as India and the Philippines, and they face difficult working conditions because of the hours they spend making quick decisions while sifting through traumatic material, Roberts said in an interview.

Note: Previously published on 3 May 2017, 3:33 p.m., as “Facebook hiring 3,000 workers to catch and remove streaming violence – National” on Global News Canada.

Copyright © 2017 Global News, a division of Corus Entertainment Inc. All rights reserved. Distributed by PressOcean Global Media (pressocean.com).
