Facebook is under new scrutiny for its moderation practices in Europe

Facebook doesn't do enough to protect content reviewers, according to a current moderator.

Stephen Lam / Reuters

Facebook is once again facing questions about its treatment of content moderators after a moderator told an Irish parliamentary committee that the company doesn’t do enough to protect the workers who sift through violent and disturbing content on the platform.

Isabella Plunkett, who currently works for Covalen, an Irish outsourcing company that hires content moderators as contract staff, told the committee that non-employee moderators aren’t given adequate access to mental health resources. For example, Covalen allows an hour and a half of “wellness time” each week, but the company-provided “wellness coaches” are not mental health professionals and are not equipped to help moderators process the traumatic content they often deal with. Plunkett told the committee that these wellness coaches sometimes suggested activities like painting or karaoke.

“The content is awful, it would affect anyone,” she said at a press conference following the hearing. “No one can be okay watching graphic violence seven to eight hours a day.” She said moderators should be afforded the same benefits and protections as actual Facebook employees, including paid sick time and the ability to work from home. Plunkett also raised concerns about Facebook’s reliance on non-disclosure agreements, which she said contributed to a “climate of fear” that makes moderators afraid to speak out or seek outside help.

In a statement, a Facebook spokesperson said the company is “committed to working with our partners to provide support” to people reviewing content. “Everyone who reviews content for Facebook goes through an in-depth training programme on our Community Standards and has access to psychological support to ensure their wellbeing,” the spokesperson said. “In Ireland, this includes 24/7 on-site support with trained practitioners, an on-call service, and access to private healthcare from the first day of employment. We are also employing technical solutions to limit their exposure to potentially graphic material as much as possible. This is an important issue, and we are committed to getting this right.”

This is far from the first time these issues have been raised. The workplace conditions of content moderators, who spend their days wading through the worst content on the platform, have long been an issue for Facebook, which depends on non-employee moderators around the world. The company last year agreed to a $52 million settlement with U.S.-based moderators who said their jobs resulted in PTSD and other mental health issues.

As part of the settlement, Facebook agreed to make several changes to the way it handles content that’s funneled to moderators for review. It introduced new tools that would allow them to view videos in black and white and with audio muted in an effort to make the often violent and graphic content less disturbing to watch. It also added features to make it easier to skip to the relevant parts of longer videos to reduce the amount of overall time spent watching the content. The company has also made significant investments in AI technology, with the hopes of one day automating more of its moderation work.

But Facebook may soon have to answer questions on whether these measures go far enough to protect content moderators. The committee is expected to ask representatives from Facebook, and its contracting companies, to appear at another hearing to face questions about their treatment of workers.
