As Facebook Agrees To Pay $52 Million In PTSD Payments To Moderators, Why Are Some Demanding More Human Moderators?

from the exposing-humans-to-cruelty-to-protect-humans-from-cruelty dept

There's been plenty of talk these days about content moderation and how different platforms should moderate their content, but much less attention has been paid to the people who do the actual moderating. Last year, we highlighted an amazing story by Casey Newton at The Verge detailing just what a horrible job it is for many people, who are constantly bombarded with the worst of the internet. Indeed, some of his reporting helped spur a labor dispute that just ended (reported, again, by Newton) with Facebook agreeing to pay out $52 million to content moderators who developed mental health issues on the job.

Each moderator will receive a minimum of $1,000 and will be eligible for additional compensation if they are diagnosed with post-traumatic stress disorder or related conditions. The settlement covers 11,250 moderators, and lawyers in the case believe that as many as half of them may be eligible for extra pay related to mental health issues associated with their time working for Facebook, including depression and addiction.

I've seen many people cheer on this decision, and I agree that it's a good thing that those moderators are getting some compensation for the horrors and anguish they've experienced. But I'm somewhat perplexed that some of those cheering on this settlement are the same people who have regularly demanded that Facebook do much more content moderation. It would seem that if they got their wish, it would mean subjecting many more people to these kinds of traumas and stresses -- and we should all be concerned about that.

Incredibly, as Newton's story points out, the content moderator who brought the lawsuit was hired after the 2016 election, when the very same crew of Facebook critics demanded that the company ramp up its content moderation to deal with their belief that misinformation on Facebook helped elect President Trump:

In September 2018, former Facebook moderator Selena Scola sued Facebook, alleging that she developed PTSD after being placed in a role that required her to regularly view photos and videos of rape, murder, and suicide. Scola developed symptoms of PTSD after nine months on the job. The complaint, which was ultimately joined by several other former Facebook moderators working in four states, alleged that Facebook had failed to provide them with a safe workplace.

Scola was part of a wave of moderators hired in the aftermath of the 2016 US presidential election, when Facebook was criticized for failing to remove harmful content from the platform.

And yet, now the same people who pushed for Facebook to do more moderation seem to be the ones most vocally cheering on this result and continuing to attack Facebook.

Again, there are many, many reasons to criticize Facebook. And there are lots of reasons to at least be glad that the company is being forced to pay something (if probably not nearly enough) to the moderators whose lives and mental health have been turned upside down by what they had to go through. But I'm trying to square that with the fact that it's the very same people who pushed Facebook to ramp up its moderation who are now happy with this result. It's because of those "activists" that Facebook was pressured into hiring these people and, often, putting them in harm's way.

Content moderation creates all sorts of downstream impacts, and many people seem to want to attack companies like Facebook both for not employing enough content moderators and for the mental harms that content moderation creates. And I'm not sure how anyone can square those two views, beyond just thinking that Facebook should do the impossible and moderate perfectly without involving any humans in the process (and, of course, I'll note that many of these same people laugh off the idea that AI can do this job, because they're right that it can't).

I don't know what the right answer here is -- because the reality is that there is no good answer. Facebook doing nothing is simply not an option, because then the platform is filled with garbage, spam, and nonsense. AI simply isn't qualified to do that good of a job. And human moderators have very real human consequences. It would be nice if people could discuss these issues with the humility and recognition that there are no good answers and every option has significant tradeoffs. But it seems that most just want to blame Facebook, no matter what the choices or the outcomes. And that's not very productive.



Filed Under: content moderation, content moderators, human moderators, humans, ptsd
Companies: facebook


Reader Comments



  1. Anonymous Coward, 13 May 2020 @ 11:01am

    "Something MUST be done! THIS is something, therefore, we must do it!"


  2. MDT, 13 May 2020 @ 11:05am

    Cognitive Dissonance

    This is the cause of your confusion: you are not taking Cognitive Dissonance into account. This is the same process as the people who wear masks while protesting COVID shutdowns. The human brain has a remarkable facility for holding two diametrically opposed opinions and somehow justifying both of them.


  3. Thad, 13 May 2020 @ 11:08am

    Re: Cognitive Dissonance

    Nitpick: what you're describing isn't cognitive dissonance, it's the absence of cognitive dissonance.

    Cognitive dissonance is the discomfort caused by trying to hold two conflicting positions.


  4. dr evil, 13 May 2020 @ 11:09am

    mental health

    guarantee that the ptsd moderators had ..um..disease before they started. after this they will be prepped to be police.


  5. That One Guy, 13 May 2020 @ 11:10am

    Anything is easy when you aren't doing it

    Anyone who thinks that Facebook should have more human moderators should be faced with a 'put up or shut up' test, where they are presented with a week's worth of the sort of content that the current moderators have to wade through and told they have to view every single piece of it, no matter how vile, before anything they say on the subject will be taken seriously.

    If you're going to insist that other people should be doing something then it's only fair that you get a taste of what that's like yourself, so that your argument is an informed one and you know exactly what it is you are insisting should be done.


  6. Thad, 13 May 2020 @ 11:11am

    I don't know what the right answer here is -- because the reality is that there is no good answer. Facebook doing nothing is simply not an option, because then the platform is filled with garbage, spam, and nonsense. AI simply isn't qualified to do that good of a job. And human moderators have very real human consequences. It would be nice if people could discuss these issues with the humility and recognition that there are no good answers and every option has significant tradeoffs.

    I think that's exactly the answer. I don't see the contradiction in saying that Facebook needs more moderators and needs to support them better.

    And I think the solution is that we shouldn't have networks the size of Facebook, but aside from being just plain unrealistic, that solution would come with its own set of tradeoffs.


  7. mcinsand, 13 May 2020 @ 11:27am

    Re:


  8. jilocasin, 13 May 2020 @ 11:55am

    Let people moderate for themselves

    Here's a crazy idea: why don't we let users moderate for themselves?

    Let the AI roughly classify things, let the community rate them, and give end users the ability to filter out what they do or don't want to see based on a large number of criteria (e.g., source, minimum community rating, Facebook advertiser category, etc.).
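
    A minimal sketch of what that kind of user-side filtering could look like (written in Python; every field name, label, and threshold here is a hypothetical illustration, not anything Facebook actually exposes):

        from dataclasses import dataclass, field

        @dataclass
        class FilterPrefs:
            # Per-user preferences; all fields are hypothetical illustrations.
            blocked_sources: set = field(default_factory=set)  # e.g. {"spam-site.example"}
            blocked_labels: set = field(default_factory=set)   # AI classifier tags to hide
            min_community_rating: float = 0.0                  # hide posts rated below this

        def visible(post: dict, prefs: FilterPrefs) -> bool:
            """Return True if a post passes this user's own filters."""
            if post["source"] in prefs.blocked_sources:
                return False
            if post["ai_labels"] & prefs.blocked_labels:  # set intersection: any hidden tag?
                return False
            return post["community_rating"] >= prefs.min_community_rating

        # Example: a user who hides anything the classifier tags "gore" or "spam",
        # plus anything the community rates below 2.5.
        prefs = FilterPrefs(blocked_labels={"gore", "spam"}, min_community_rating=2.5)
        post = {"source": "news.example", "ai_labels": {"politics"}, "community_rating": 4.1}
        print(visible(post, prefs))  # True -- this post would be shown

    The heavy lifting (classification, community rating) stays upstream; the decision about what to hide moves to each user.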

    I think a major part of the problem is that your average internet user has become an overly coddled snowflake with the mental discernment just south of warmed over road kill (and the truly scary part is that by definition half of the people out there are even worse).


  9. Kitsune106, 13 May 2020 @ 12:12pm

    Re: people stating that something should be done

    They should step up to help. If you feel more moderation is needed, volunteer as a moderator. Or set up ways to help those who have to do the job, instead of laying blame at the nearest bogeyman company.


  10. That One Guy, 13 May 2020 @ 12:19pm

    'Your social media account is on bigot-central?' 'Well, yeah...'

    Great idea. I'm sure people would love to wade through piles of garbage on a regular basis, from racist losers, sexist bigots, conspiracy theory idiots, and/or the latest spam, or to be on a platform that hosts all of that lest its removal offend someone.

    People already can moderate for themselves; it's called the block and mute functions. The purpose of having the platform engage in moderation efforts is to keep the regular users from having to deal with the vile rot and repulsive individuals who are giving the moderators serious mental issues.

    If you don't like the current moderation of a platform then you're free to go to a platform with moderation policies you do like or create your own, I hear the various chans are a good place for that.


  11. Bruce C., 13 May 2020 @ 12:27pm

    Re: Let people moderate for themselves

    Which would just make the PTSD a problem in the general public, as opposed to a subset of people.

    OTOH, as a backdoor method of creating demand for mental healthcare reform, the idea has merit.

    Here's the thing: there are tough, difficult jobs that have to be done by someone, and many of those jobs can result in PTSD. That includes cops, soldiers, EMTs, and others. The key is to ensure that people who take these jobs are adequately rewarded and provided the mental healthcare needed for their well-being.

    The fact that these moderators had to sue Facebook to get the "employee assistance" they needed is disgusting. The fact that human moderators are needed to allow the rest of us to keep our own sanity while browsing the internet is also disgusting, but no more disgusting than other social circumstances like crime and war creating the need for soldiers and police.


  12. Koby, 13 May 2020 @ 1:22pm

    Football Model

    I remember the NFL getting sued a few years ago by its players, who claimed that repeated blows sustained over their careers caused traumatic brain injuries. So the league agreed to a one-time settlement payment, and nowadays the players sign a contract acknowledging that playing can cause brain injury. I foresee Facebook copying this model: this is the one-time payout, and in the future, employment contracts will warn workers that they might get PTSD, with no payout.


  13. That Anonymous Coward, 13 May 2020 @ 1:30pm

    For the same reason the Karens & Chads are out with their guns threatening elected officials to reopen now now now...
    They don't care how it happens, they just want it done.
    It doesn't matter how many people get fed to the meat grinder of the worst humanity can create, as long as Karen never has to see another offensive meme.

    They don't want to think about how horrible the job is, they just want little people to do it for them.


  14. jilocasin, 13 May 2020 @ 1:31pm

    Re: 'Your social media account is on bigot-central?' 'Well, yeah

    Thank you for making my point for me. People don't want to be tasked with having to deal with the real world and instead expect a platform to do it for them.

    Also, I never said that the platform had to host it all; it's their platform, and they are perfectly free to remove whatever they want. What I did say is that users are too coddled to deal with reality. Today's users expect someone else to do the dirty work. Facebook (for example) is free to delete anything that offends its, or its collective community's, sensibilities. It's also free to remove anyone who posts content that it finds abhorrent.

    Today we have users who expect Facebook to do the impossible, namely protect them from "bad" content (whether that takes human moderators, AI, or pixie dust).

    What I am saying is that users should:

    1. grow a thicker skin and accept that you might be exposed to things you don't like or agree with
    2. vote down, block, report problematic content that runs afoul of Facebook's guidelines
    3. take charge of your experience by setting your feeds and viewing thresholds to tailor it to what you want to read/see.


  15. Anonymous Coward, 13 May 2020 @ 2:53pm

    Re: Re: 'Your social media account is on bigot-central?' 'Well,

    take charge of your experience by setting your feeds and viewing thresholds to tailor it to what you want to read/see.

    Often those making the most noise about moderation are selective in their feeds; they deliberately look for things to complain long and loud about.


  16. That One Guy, 13 May 2020 @ 3:03pm

    Re: Re: 'Your social media account is on bigot-central?' 'Well,

    Thank you for making my point for me. People don't want to be tasked with having to deal with the real world and instead expect a platform to do it for them.

    Why yes, I imagine most people are quite happy to leave it to the platform to deal with the sort of content that causes PTSD in the moderators, rather than having to wade through that to talk to friends and family on social media.

    Truly the most fragile of snowflakes there, not wanting to have to deal with hateful bigots talking about how disgusting the 'others' are, or to wade through pictures of rape and death on a regular basis. They should toughen up and face reality, even if that means facing horrors posted by disgusting losers who delight in their horror, rather than letting someone else do it for them.

    What I am saying is that users should:

    grow a thicker skin and accept that you might be exposed to things you don't like or agree with
    vote down, block, report problematic content that runs afoul of Facebook's guidelines
    take charge of your experience by setting your feeds and viewing thresholds to tailor it to what you want to read/see.

    While I'd agree that some people expect the impossible from social media, the rest of it already happens. Users can and do flag content that violates the rules, and mute and/or block accounts that they find objectionable. The purpose of platform-level moderation is to keep the users from having to deal with the really extreme stuff and to keep the platform from being overrun by bigots and other deplorable individuals.

    It's one thing for a user to block/report a racist douche-bag; that's likely just going to involve reading some text before making the block/report decision. It's quite another to block/report someone posting pics of a rotting/ripped-up corpse (human or otherwise), as that is the sort of thing that's going to stick in a person's mind a lot longer.


  17. Anonymous Coward, 13 May 2020 @ 3:20pm

    Re:

    And I think the solution is that we shouldn't have networks the size of Facebook, but aside from being just plain unrealistic

    Scale, and the amount of content allowed to be uploaded per minute, is the core issue. If the idea of not having social networks at the size and scale of Facebook is unrealistic, then we should fight to make it realistic.

    Letting platforms like Facebook, Twitter, and YouTube continue at their current pace while leaving it up to human moderators that get PTSD and ineffective algorithms to moderate a mere fraction of the abhorrent content uploaded is not a tenable situation for us, as a society, to be in. We need to start asking serious questions about why "You can upload as much as you want whenever you want, we'll serve up ads against that content, and then we'll use underpaid contract workers and ineffective algorithms to sort out just some of it" is treated as a valid and justifiable business model that a corporation is allowed to have.

    For example: YouTube has 400 hours of content uploaded a minute. Maybe it’s not responsible or moral for a company to allow for 400 hours of content to be uploaded a minute? Maybe it's not a good idea for a social network with 2.6 billion monthly active users like Facebook to exist? Maybe we should question the morality of allowing a massive platform like Twitter to exist when Dorsey has made it clear that Trump gets a free pass to sow chaos, spread lies, and foment his countless ignorant fascist followers while underpaid and overworked moderators attend to the attacks and harassment and vile bullshit that said followers spread in his name, but never get to drop the much-needed banhammer on 45 himself?

    that solution would come with its own set of tradeoffs.

    Whatever those tradeoffs are, I'm sure they can't be any worse than the situation we have right now where people and their mental health are treated as acceptable losses by CEOs.


  18. Anonymous Coward, 13 May 2020 @ 3:25pm

    Re: Football Model

    The NFL says a lot of things, like that it protects the players -- but its actions show something a bit different.


  19. Anonymous Coward, 13 May 2020 @ 3:41pm

    Re: Re:

    Scale and amount of content allowed to be uploaded per minute is the core issue.

    Lots of small companies would do an even worse job of moderating the same amount of content, because they could afford even less for moderation, as a higher percentage of their income would go to overhead.

    The scale problem is not the scale of companies, but rather the scale and diversity of the human race.


  20. Anonymous Coward, 13 May 2020 @ 4:23pm

    Re: Re: Re:

    A wide swath of small-to-medium-scale companies, each with relatively few people on its social network platform, would be dealing with a much smaller amount of content on their own turf compared to the deluge on massive platforms like YouTube, Facebook, and Twitter. Experimentation with business models could happen, maybe finding business models that de-emphasize the quantity of content uploaded and emphasize quality instead. The idea that this is some unsolvable issue because of the nature of the human race is absurd.


  21. Anonymous Coward, 13 May 2020 @ 4:35pm

    I assume your answer would be to let Facebook do whatever they want in order to maintain the "American Edge", right Mike?


  22. OGquaker, 13 May 2020 @ 5:00pm

    PTSD is the product of Facebook

    My twin sister cut off her Facebook account; "tired of seeing who's died"

    I've sat in court witnessing a murder trial for a week, drug court trials for weeks, watched a 14-year-old die from five bullets in front of this house (LAPD jammed me against a wall: "We were following the car that did it"), and I was a combat medic (91B20) with a 'Sharpshooter' badge in 1970. I've dragged a suicide out of the freeway lanes & disposed of my cousin's (Miss Orange County 1954) bloody bedding after she shot herself. I've stitched up dogs on this kitchen table and recovered a road-kill cat with CPR.

    I started to read "Harry Potter". Since a murder occurs in the first five pages, would anyone bother to read any farther? Not Me.

    But we naively paid to sit through "Django Unchained" (2012) on Christmas day (telegraphing the story of Christopher Jordan Dorner a month later) and "Saving Private Ryan" (1998), which telegraphed the NATO invasion of Yugoslavia in March 1999 and was played on TV prior to our Afghanistan & Iraq invasions. All the four-letter words: "This film is a critically acclaimed artwork that tells a gritty story of bloody battles and supreme heroism." - FCC Chairman Michael Powell.

    "The Manchurian Candidate, 1962" by Frankenheimer, or "Nine Hours to Rama, February 1963": José Ferrer is the lead in the film, his wife Rosemary Clooney witnessed Bobby Kennedy's death so he put her in a nuthouse & John Frankenheimer drove him to the event.

    PTSD is the product: subsuming our humanity by moving our 'Yellow Line'.


  23. Anonymous Coward, 13 May 2020 @ 5:06pm

    Re: Re: Cognitive Dissonance

    Thank you.

    And what MDT was describing is known as "sociopathic behavior".


  24. Anonymous Coward, 13 May 2020 @ 5:08pm

    Re: Re: Football Model

    Your point being... ?


  25. Anonymous Coward, 13 May 2020 @ 5:09pm

    Re:

    You're at the extreme wrong end of the spectrum, aren't you?


  26. Anonymous Coward, 13 May 2020 @ 5:15pm

    Re: PTSD is the product of Facebook

    Everyone has their own threshold for an experience to become traumatic. I haven't dealt with any dead people or pets, apart from burying several. I've watched those movies you listed and a great many more, many of which were far more violent. I've played first-person shooters for ages now. I've had cops point their guns at me, mistaking me for someone else. I've been in violent altercations with people wielding knives. I've read social media and shitholes like 4chan for entertainment. I do not suffer from PTSD.

    Now that we've shared our stories are we buds?


  27. That One Guy, 13 May 2020 @ 5:28pm

    Re: Re:

    Great, now, who do you trust to decide how big platforms can be and what people and speech are allowed on them?

    Because once you toss aside the idea of open platforms and the downsides that come from people using them, you may reduce that problem, since smaller platforms might be able to provide more thorough moderation. But you also add in the issue of gatekeeping, with either the platforms themselves or those giving them their marching orders (and if that's the government, that brings its own issues) deciding who does and does not get to speak using that platform -- which is basically one of the current issues magnified by a significant amount.


  28. Anonymous Coward, 13 May 2020 @ 5:40pm

    Re: Anything is easy when you aren't doing it

    I propose we call it the "Dairy Queen Rule"


  29. Anonymous Coward, 13 May 2020 @ 6:04pm

    Re: Re: Re: Re:

    The number of people required to carry out moderation increases when spread over many small companies, because of more problems in dealing with holidays, sick leave, etc. The moderation problem is caused by the volume of posts, not the size of the company doing the moderation, and the moderation effort required scales with the number of posts to be moderated, regardless of whether they come via one company or many.


  30. That One Guy, 13 May 2020 @ 8:06pm

    Re: Re: Anything is easy when you aren't doing it

    Afraid the reference has gone over my head; the only Dairy Queen story I'm familiar with involved a luxury car, goat blood, and a failed attempt to combine the two.


  31. Anonymous Coward, 13 May 2020 @ 8:34pm

    Re: Re: PTSD is the product of Facebook

    PTSD is for those who have not, or can't figure out how to sequester their natural humanity when assassination and external war is a political tool of the State. Rage Against The Machine is my tactic, Kemosabe.


  32. Anonymous Coward, 13 May 2020 @ 11:27pm

    Facebook is a gigantic platform that scales the moderation problem up to gigantic levels, very true, but the company also makes gigantic profits. So FB has huge resources that are not being used to address this problem. Reports I've read about working conditions and demands on paid moderators are horrific: too much stressful work without adequate support. What about reducing work quotas and hiring lots more moderators? What about training workers in ways to deal with toxic material? What about providing on-the-job counseling or other support for workers upset by what they see? What studies has FB commissioned to look for ways to make moderation less stressful? Can a system combine AI and human input to accomplish these goals?

    To buy your proposition that moderation is inherently harmful, I'd need to know that Facebook is doing everything that's reasonably called for to protect workers -- in the context of a company raking in billions of dollars in profits every quarter. Otherwise, Facebook is being allowed to externalize onto society and its workers a harm that is actually a cost of its business model. Saying it's impossible to moderate is just allowing this sort of externality. It's impossible to moderate and make the huge profit margin Facebook maintains; apply more resources to it (making FB's profits less huge), and maybe not.

    I mean, is this any worse a problem than workers handling toxic chemicals or nuclear waste? Or than medical professionals being exposed to disease or extreme stress? I could go on with examples. What's the difference between these situations and moderating toxic activity on Facebook?


  33. Anonymous Coward, 14 May 2020 @ 12:40am

    Re:

    Good points.


  34. Anonymous Coward, 14 May 2020 @ 1:30am

    FB's foreign political origin has never been legalized and no one I know of thinks their crap should be legalized.

    I hope the foreign corrupt enterprise enjoys their stay in hell after what they have done.


  35. Anonymous Coward, 14 May 2020 @ 2:15am

    Re:

    Facebook is a gigantic platform that scales the moderation problem up to gigantic levels,

    The size of the moderation problem is the number of people posting online. Whether those postings are via a single company or via thousands of companies, the size of the moderation problem cannot get smaller; indeed, if small companies are federated, the problem grows, because some companies will allow comments that others do not, resulting in more moderation decisions to be made, rather than fewer.

    The size of Facebook, Twitter, et al. comes from people's desire to communicate via common platforms. And network effects will ensure that they reassemble after any forced breakup, as the most popular platform will attract more and more users.


  36. Anonymous Coward, 14 May 2020 @ 4:19am

    Re: Re: Re: Football Model

    That facebook won't pay for PTSD anymore.


  37. Anonymous Coward, 14 May 2020 @ 7:52am

    Re: Re: Re: Football Model

    Words are cheap


  38. Anonymous Coward, 14 May 2020 @ 12:33pm

    Re: Re: Re: Anything is easy when you aren't doing it

    I don't get the reference either, but I am super interested in this goat blood + car thing.


  39. Anonymous Coward, 14 May 2020 @ 12:36pm

    Re:

    What the fuck are you on, and where are you that FB has a "foreign origin"?


  40. That One Guy, 14 May 2020 @ 12:54pm

    Series includes blood, humor, long enthusiastic walks at night

    Never let Alucard get bored, he gets... creative when he has nothing productive to do.


  41. Anonymous Coward, 14 May 2020 @ 1:42pm

    "The moderation problem" is created wholly from the unrealistic and contradictory demands of people who want someone to magically to censor everything they don't like (and only the things they don't like). Since these demands are unrealistic and contradictory, they are impossible to satisfy. And trying to do the impossible is a waste.

    There is an extent to which content moderation can be useful, but that threshold was passed long before these doomed attempts to make anonymous office drones the arbiters of truth and justice on the internet...or the parts of it that are controlled by corporate platforms at least.

    Whether or not ignoring impossible demands is a good answer, it nonetheless is the best answer.


  42. nasch, 14 May 2020 @ 2:37pm

    Re: Let people moderate for themselves

    People not seeing things they don't want to see is only part of the problem. The other part is that some of this stuff is really harmful to society if people are allowed to spread it as they wish, even to others who actually want to see it. So if you leave the moderation entirely up to end users, you still have pockets of people intentionally wallowing in Nazi or terrorist propaganda or rape or child pornography, and Facebook et al. don't want to be a part of that for various reasons.


  43. Anonymous Coward, 14 May 2020 @ 2:53pm

    Seriously? PTSD compensation from facebook for ... what? ... content moderation? Somebody has lost their G'damned mind. For what? Dodging gunfire and artillery? Good grief. PTSD from facebook interaction. This is proof enough that facebook should be shut down by the U.S. Government.


  44. TFG, 14 May 2020 @ 2:54pm

    Re:

    I take it you've worked as a Facebook Moderator and have personal experience to draw from in your dismissal of the claims?


  45. nasch, 14 May 2020 @ 3:04pm

    Re:

    Seriously? PTSD compensation from facebook for ... what? ... content moderation?

    "alleging that she developed PTSD after being placed in a role that required her to regularly view photos and images of rape, murder, and suicide. Scola developed symptoms of PTSD after nine months on the job."

    Are you really having trouble seeing how viewing images like that day in and day out for months on end could affect a person's mental health?

    This is proof enough that facebook should be shut down by the U.S. Government.

    You trust the US government to decide which businesses get to keep running? You won't complain when they shut down the places you like to use then, right?


  46. Anonymous Coward, 14 May 2020 @ 3:38pm

    Re: Re:

    I trust the government more than I trust facebook, who continuously discriminates against users on its own platform.


  47. That One Guy, 14 May 2020 @ 5:45pm

    Re:

    You not understanding what the moderation actually involves is somehow evidence that it should be shut down? Okay then.


  48. That One Guy, 14 May 2020 @ 5:47pm

    Re: Re: Re:

    Ignoring for a second that your complaint is shot down by the second half of it ('their own platform'), out of curiosity, who exactly are they 'discriminating' against again?


  49. TFG, 14 May 2020 @ 7:26pm

    Re: Re: Re:

    So you've worked as a Facebook moderator and have personal experience with which to dismiss the possibility that the job could result in PTSD?


  50. Anonymous Coward, 16 May 2020 @ 3:16am

    Re: 'Your social media account is on bigot-central?' 'Well, yeah

    Conspiracy theory idiots are the worst

    They keep thinking they don't support real terrorism and real conspiracies by denying other people access to justice

    garbage of the earth


