Facebook has announced it will add another 3,000 people to its 4,500-strong review team that moderates content

NEW DELHI: Responding to the spate of suicides being live-streamed, social media giant Facebook has announced it will add another 3,000 people to its 4,500-strong review team that moderates content. The review team will also work in tandem with law enforcement agencies on this issue.

“Over the last few weeks, we’ve seen people hurting themselves and others on Facebook, either live or in video posted later. It’s heartbreaking, and I’ve been reflecting on how we can do better for our community,” Facebook co-founder and CEO Mark Zuckerberg wrote in a status update, adding, “Over the next year, we’ll be adding 3,000 people to our community operations team around the world, on top of the 4,500 we have today, to review the millions of reports we get every week, and improve the process for doing it quickly.”

“…we’ll keep working with local community groups and law enforcement who are in the best position to help someone if they need it, either because they’re about to harm themselves, or because they’re in danger from someone else,” Zuckerberg added, announcing the move.

Over the last year, several violent incidents and suicides have been streamed live on Facebook. In India last April, a young student went live on Facebook minutes before he jumped from the 19th floor of the Taj Lands End hotel in Mumbai. The same month brought similar news from the US and Thailand. A 49-year-old from Alabama went live on Facebook before shooting himself in the head, and a man in Bangkok filmed himself hanging his 11-month-old daughter and uploaded the video to Facebook; he was later found to have killed himself as well.

“It’s a positive development that Facebook is adding human power and tools for dealing with hate speech, child abuse and suicide attempts. It would be interesting to see how Facebook coordinates with the Indian police departments to get an emergency response to a potential suicide attempt or attempt to harm someone else,” says Rohini Lakshane, programme officer at the Centre for Internet and Society, though she warns that false reports could clog reviewers’ queues and police notifications.

On Facebook, a video, picture or any other piece of content reaches the review team after users report it for violating the platform’s “community guidelines”.

Chinmayi Arun, research director at the Centre for Communication Governance, National Law University, Delhi, says Facebook must be transparent about this process. “Facebook should also announce how it is keeping this process accountable. It is a public platform of great importance which has been guilty of over-censorship in the past. It should be responsive not just to government censorship requests but also to user requests to review and reconsider its blocking of legitimate content,” she says.

Bureau Report
