
Facebook content moderators call for better treatment


San Francisco: As Facebook chief Mark Zuckerberg prepares to be grilled by a Senate committee about the handling of politically charged posts, content moderators are insisting that properly valuing their work is key.

Two former content moderators contracted in the US to make judgment calls on posts, along with one still doing the job, took part in a conference call with reporters on Monday.

The former and current content moderators expressed concerns about posts intended to sow trouble or skew the outcome of the forthcoming election.

The worker still on the job spoke on condition of anonymity, since such positions involve non-disclosure agreements restricting what moderators can say about their work. “I certainly am not supposed to tell the truth about my work in public,” the Facebook content moderator said.

“The truth is this work is incredibly important but it’s done completely wrong and while the policy is constantly changed the situation seems to get worse.”

The current and former content moderators described stressful hours spent focused on torrents of hateful, disturbing posts, with little regard given to their feedback or their well-being.

They called for Facebook to make them and their colleagues full-time employees, complete with the benefits for which tech companies are renowned, instead of keeping them at arm’s length by outsourcing the work.

“Facebook could fix most of its problems if it would move away from outsourcing, value its moderators, and build them into its policy processes,” said former content moderator Allison Trebacz.

“Moderators are the heart of Facebook’s business – that’s how they should be treated.”

Zuckerberg has pushed back against concerns about hateful or violent posts by saying the social network has invested heavily in artificial intelligence and human reviewers to take down content violating its policies.

The bulk of that army of content moderators is contracted out, and their viewpoints — hard-won on the front lines of the battle — are typically ignored, according to those who took part in the press briefing.

“I became a Facebook content moderator because I believed I could help make Facebook safer for my community and other communities who use it,” said Viana Ferguson, who left the job last year.

“But again and again, when I tried to address content that dripped with racism, or was a clear threat, I got told to get in line, our job was to agree.”

Zuckerberg and Twitter chief executive Jack Dorsey are to testify Wednesday before a Senate committee exploring the potential to weaken legal protections given to online platforms when it comes to what users post there.

This story has been sourced from a third-party syndicated feed.
