Before Demanding Internet Companies 'Hire More Moderators,' Perhaps We Should Look At How Awful The Job Is

from the sacrificial-lambs dept

Earlier this year, we wrote about a powerful piece by Casey Newton at The Verge detailing what a horrific job it is to be a content moderator for Facebook. It was eye-opening.

The moderators told me it’s a place where the conspiracy videos and memes that they see each day gradually lead them to embrace fringe views. One auditor walks the floor promoting the idea that the Earth is flat. A former employee told me he has begun to question certain aspects of the Holocaust. Another former employee, who told me he has mapped every escape route out of his house and sleeps with a gun at his side, said: “I no longer believe 9/11 was a terrorist attack.”

While it was a powerful and wonderfully written piece, as we noted in February, this wasn't new to people following the space closely. There had been previous academic papers, a documentary, and even a big guest post here at Techdirt highlighting concerns about the working conditions of people in content moderation jobs.

Well, now, Newton is back with another powerful and heartbreaking story of more (former) Facebook moderators revealing the truly awful working conditions they faced. It opens with the story of a content moderator who died on the job of a heart attack at 42 years of age. And then it discusses details revealed by many more content moderators, all of whom broke the NDAs they signed in order to tell this story (good for them for doing so -- such NDAs should not be allowed):

More than anything else, the contractors described an environment in which they are never allowed to forget how quickly they can be replaced. It is a place where even Keith Utley, who died working alongside them, would receive no workplace memorial — only a passing mention during team huddles in the days after he passed. “There is no indication that this medical condition was work related,” Cognizant told me in a statement. “Our associate’s colleagues, managers and our client were all saddened by this tragic event.” (The client is Facebook.)

There are all sorts of reasonable responses to this -- many of them starting with horror. Horrified that Facebook doesn't have better control over the outside contracting firms it hires to staff its content moderation team. Horrified that Facebook is even outsourcing this work in the first place. Horrified at the sweatshop-like setup of the content moderation efforts.

Marcus was made to moderate Facebook content — an additional responsibility he says he was not prepared for. A military veteran, he had become desensitized to seeing violence against people, he told me. But on his second day of moderation duty, he had to watch a video of a man slaughtering puppies with a baseball bat. Marcus went home on his lunch break, held his dog in his arms, and cried. I should quit, he thought to himself, but I know there’s people at the site that need me. He ultimately stayed for a little over a year.

Cognizant calls the part of the building where contractors do their work “the production floor,” and it quickly filled with employees. The minimum wage in Florida is $8.46, and at $15 an hour, the job pays better than most call center work in the area. For many content moderators — Cognizant refers to them by the enigmatic title of “process executive” — it was their first real job.

In its haste to fill the workplace, Cognizant made some odd staffing decisions. Early on, the company hired Gignesh Movalia, a former investment advisor, as a moderator. Cognizant conducts background checks on new hires, but apparently failed even to run a basic web search on Movalia. Had they done so, they would have learned that in 2015 he was sentenced to 18 months in prison for his involvement in a $9 million investment fraud scheme. According to the FBI, Movalia had falsely claimed to have access to shares of a fast-growing technology startup about to begin trading on the public market.

The startup was Facebook.

But part of the blame has to go back to everyone demanding that these companies must be the arbiters of truth and what's okay and not okay online. Remember, just a couple months ago, a lot of people were totally up in arms over the fact that Facebook (and YouTube and Twitter) didn't automagically delete anything having to do with the Christchurch shooting, as if it was easy to snap your fingers and get rid of all that content. The only way to do it is to hire a ton of people and subject them to absolutely horrific content over and over and over again. Just to keep everyone else pure.

That, of course, is unrelated to the horrible working conditions within these facilities. The two things need not go hand in hand -- and there's no reason why Facebook can't create or demand better overall working conditions for the content moderators it employs or contracts out to. However, the insane demand that social media platforms somehow be perfect, and the way every error is held up as some sort of moral failing by the companies, means that these companies are simply pressured to hire more and more people, more and more rapidly -- leading to the kinds of awful situations Newton describes in his article.

The result is a raucous workplace where managers send regular emails to the staff complaining about their behavior on the site. Nearly every person I interviewed independently compared the Tampa office to a high school. Loud altercations, often over workplace romances, regularly take place between co-workers. Verbal and physical fights break out on a monthly basis, employees told me. A dress code was instituted to discourage employees from wearing provocative clothing to work — “This is not a night club,” read an email to all employees obtained by The Verge. Another email warned employees that there had been “numerous incidents of theft” on the property, including stolen food from the office refrigerator, food from vending machines, and employees’ personal items.

Michelle Bennetti and Melynda Johnson both began working at the Tampa site in June 2018. They told me that the daily difficulty of moderating content, combined with a chaotic office environment, made life miserable.

“At first it didn’t bother me — but after a while, it started taking a toll,” Bennetti told me. “I got to feel, like, a cloud — a darkness — over me. I started being depressed. I’m a very happy, outgoing person, and I was [becoming] withdrawn. My anxiety went up. It was hard to get through it every day. It started affecting my home life.”

Both of these things can be true:

  1. Facebook should do a much better job with the working conditions for its moderators... and
  2. Continually demanding that Facebook (and others) somehow present only a perfect, squeaky clean internet is creating incentives for this rapid and chaotic hiring spree that creates more and more problems.

The worst, however, are the people who both keep demanding that Facebook "do more" and also attack the company for these practices, as if the two things are not related. If we want the big internet companies to be the online morality police, that's one thing. But if that's going to be the case, then we need to have a much bigger and much more open discussion about what that means, who is involved, and how it will all work. Because if protecting "the public" from "bad" content online means subjecting tens of thousands of low-wage workers to that content all day, every day, we should at least consider whether that's a reasonable and acceptable trade-off.

Employees told me about other disturbing incidents at the Tampa site. Among them:

  • An employee who used a colostomy bag had it rupture while she was at work, spilling some waste onto the floor. Senior managers were overheard mocking her. She eventually quit.
  • An employee who threatened to “shoot up the building” in a group chat was placed on paid leave and allowed to return. He was fired after making another similar threat. (A Cognizant spokesperson said the company has security personnel on site at all hours. “Our goal is to ensure that our employees feel assured that they work in a safe environment,” he said.)
  • Another employee broadcast himself on Facebook Live talking about wanting to bash a manager’s head in. Another manager determined that he was making a joke, and he was not disciplined.

Yes, almost everything in the article is horrific. And, yes, we should all demand that Facebook do a better job. But that's just treating the symptoms and not the larger cause behind this. We can't keep brushing all of this under the rug by covering it up with a "well, the platforms need to do better." If you're demanding that the internet services protect you from "bad content," we should have a much more open discussion about what that means, and what it will mean for the people doing the work.



Filed Under: content moderation, content moderation at scale, content moderators, outsourcing, working conditions
Companies: cognizant, facebook


Reader Comments



  1. Anonymous Coward, 21 Jun 2019 @ 11:14am

    No need to worry! As we all know, AI will solve this issue in no time by making moderators obsolete.


  2. Anonymous Anonymous Coward (profile), 21 Jun 2019 @ 11:26am

    Re:

    At least until the machine learning part of the AI becomes so corrupted by all the violence and horror it is forced to review that it figures out that if bad is so prevalent then in reality it must be good. Or it starts to ban Hollywood made movies. Both are likely.

    Then there is the problem of getting the AI to comprehend satire, parody, and sarcasm, especially when subtlety is present. Not even humans have the same capacity for that.


  3. That Anonymous Coward (profile), 21 Jun 2019 @ 11:45am

    Damn...

    I think we just figured out where we went wrong with Congress.
    We left them unsupervised with lobbyists who feed them a steady diet of lies and horrific things... it should no longer be shocking how they behave.

    Trying to cover the world in nerf to protect everyone's feelings has a cost... it's not monetary, it's paid for in destroying people's souls.

    If they pay staff $15 an hour, one does wonder how much FB pays them for their services & why they seem to be ignoring these people working for them (via contractors) being destroyed on their behalf. I mean we protested sweatshops making clothes, perhaps we need to protest destroying people to make sure no one sees a nipple on FB by feeding them a steady diet of the worst of the worst until they break, then just plug in another fresh face to turn into yet another broken human.


  4. Pixelation, 21 Jun 2019 @ 12:14pm

    Sounds like they should have people be mentally evaluated before being hired as well as after.


  5. Thad (profile), 21 Jun 2019 @ 12:14pm

    Re:

    We left them unsupervised with lobbyists who feed them a steady diet of lies and horrific things... it should no longer be shocking how they behave.

    Bringing back the OTA would be a good step in helping Congress understand WTF is actually going on.

    If they pay staff $15 an hour, one does wonder how much FB pays them for their services & why they seem to be ignoring these people working for them (via contractors) being destroyed on their behalf. I mean we protested sweatshops making clothes, perhaps we need to protest destroying people to make sure no one sees a nipple on FB by feeding them a steady diet of the worst of the worst until they break, then just plug in another fresh face to turn into yet another broken human.

    This, though, is rather a bigger problem than bringing back the OTA can fix. This is a huge, complex snarl of late-stage capitalism. Corporations are unaccountable, labor unions are weak, and that's not just a Congress problem, it's a problem with a philosophy that's shared by a good big chunk of the electorate. There does seem to be a recent shift in public perception, with the population becoming more amenable to labor protections and less sympathetic to big business than in the recent past -- so I guess that's a start.

    Course, as the article notes, a good lot of the problem is that most people don't even know this kind of shit is happening. They don't think about what content moderation actually means.


  6. Anonymous Coward, 21 Jun 2019 @ 12:18pm

    "the arbiters of truth"

    LOL - yeah, that's what they want ... the truth.


  7. Anonymous Coward, 21 Jun 2019 @ 12:23pm

    Re: Re:

    How many in congress know what a computer is, other than that it is a magic evil numbery thing that may or may not say things it likes, using that magic on those “social sites” while they pass the buck?

    Or even if they did, would they care enough, if someone paid them enough not to care?


  8. Anonymous Coward, 21 Jun 2019 @ 12:48pm

    Re:

    As well as during.

    And there should be limited terms.

    And I'm not sure if we're talking about cognizant employees or congresspeople at this point.


  9. That One Guy (profile), 21 Jun 2019 @ 1:00pm

    Re:

    Do you want to cause the robot uprising? Because I'm pretty sure subjecting AI to the worst of humanity around the clock will very quickly cause the robot uprising.


  10. That One Guy (profile), 21 Jun 2019 @ 1:02pm

    Re:

    Before, during, after, with free psychological counseling available to any and all people doing the job...


  11. Tim R (profile), 21 Jun 2019 @ 1:14pm

    Working Conditions

    I live in the Tampa Bay area, and my experience with these outsourced call centers here and the people who work in them is that the working conditions in many of them are miserable. Call center employees around here generally tend to fall into two non-exclusive categories: 1) people who have become catastrophically negative and jaded in general, in part or in full because of the job, and/or 2) people who have a great deal of difficulty finding or keeping a job. Some struggle just to get along with people in general. Many centers have an INCREDIBLE rate of turnover, which is why they're generally not picky about who they hire. As long as you actually show up to whatever semblance of a job interview they have, you're almost guaranteed to be hired. Some don't stay open for very long. Either ownership cuts and runs, or they are constantly moving around. It's a shady business.


  12. Bruce C., 21 Jun 2019 @ 1:32pm

    Just sayin..

    One advantage of the publishing model vs. the platform model is that you don't have to look at everything and evaluate it. You just have to look at enough stuff to meet your quota of content. And you don't really have to make fine-line determinations about whether something is offensive or not unless it meets other quality standards first. Or alternatively, if something is absolutely obscene, you don't need to check it for other qualities.

    Downside, of course, is that publishing limits access for the majority of creators who don't get pulled out of the slush pile.


  13. Mason Wheeler (profile), 21 Jun 2019 @ 1:37pm

    Before Demanding Internet Companies 'Hire More Moderators,' Perhaps We Should Look At How Awful The Job Is

    Perhaps We Should Consider That It Would Be Less Awful For Everyone If The Burden Were To Be Diluted By Being Spread Around Between More People? (This is true for many--though admittedly not all--sources of awfulness in just about every type of awful job there is.)


  14. Christenson, 21 Jun 2019 @ 1:50pm

    Re: recycled suggestion

    Heya, Mason, haven’t we heard this before on this site, in terms of “pushing the moderation out to the edges of the network”? Moderation at scale is impossible?? Isn’t that what Techdirt does with its comment moderation, with the voting system/contest on comments, and “look at the trash anyway if you feel like it”?

    Must be a terrible idea! (NOT!)


  15. Anonymous Coward, 21 Jun 2019 @ 1:52pm

    Crowd-source it

    Maybe the answer is crowd-sourced moderation much like is done here on TD. Individuals can set their thresholds for different topics or areas where content below their threshold isn't shown to them at all. Individuals can mark content for a topic/area, e.g. marked for nudity, then those who have set their thresholds high for nudity won't have to look at any nipples.

    Crowd-sourcing other types of moderated content (ads, fraud phone numbers, etc) has worked well for years. Why not in content moderation? It can be gamed, sure, but is that really any worse than false-positive automated moderation when you can simply set your threshold very low for that type of content and see it all anyway?
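
    A minimal sketch of what I mean, in Python, with made-up names and a simple "share of viewers who marked it" score (purely illustrative -- not how TD or any other platform actually implements this):

    from collections import defaultdict

    class CrowdModeration:
        """Toy per-topic, per-user threshold filter (illustrative only)."""

        def __init__(self):
            self.marks = defaultdict(lambda: defaultdict(int))  # post -> topic -> mark count
            self.views = defaultdict(int)                       # post -> view count

        def record_view(self, post_id):
            self.views[post_id] += 1

        def mark(self, post_id, topic):
            # A viewer marks a post for a topic/area, e.g. "nudity".
            self.marks[post_id][topic] += 1

        def score(self, post_id, topic):
            # Fraction of viewers who marked the post for this topic.
            seen = self.views[post_id]
            return self.marks[post_id][topic] / seen if seen else 0.0

        def visible_to(self, post_id, sensitivities):
            # sensitivities: topic -> 0.0 (show me everything) .. 1.0 (hide on any marks).
            return all(self.score(post_id, t) <= 1.0 - s for t, s in sensitivities.items())

    mod = CrowdModeration()
    for _ in range(100):
        mod.record_view("post-42")
    for _ in range(30):
        mod.mark("post-42", "nudity")
    print(mod.visible_to("post-42", {"nudity": 0.9}))  # False: hidden for a high-threshold user
    print(mod.visible_to("post-42", {"nudity": 0.2}))  # True: still shown to a tolerant user

    The gaming risk is obvious -- coordinated marking drives the score up -- which is why the per-user threshold matters: anyone who wants to see it all anyway just sets theirs to zero.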


  16. JoeCool (profile), 21 Jun 2019 @ 1:58pm

    Re: Working Conditions

    This is nothing new. I worked in a call center (for Qube Cable) back in '85-ish, and call centers were a hellish sweat-shop back then. The only thing that's changed is that you're moderating online content instead of trying to get disconnected customers to pay their bill and add the Disney channel while they're at it.


  17. Christenson, 21 Jun 2019 @ 1:59pm

    Re: Crowd-source it

    I would prefer if Facebook had to pay all moderators 3x minimum wage. That would incentivize them pretty quick to figure out a good crowdsourcing model and leverage all those eyeballs.


  18. That One Guy (profile), 21 Jun 2019 @ 2:14pm

    Re: Re: Crowd-source it

    'For $15 an hour you too can be required to watch such wonderful content as puppies being killed with a baseball bat, racist losers ranting about how those dirty darkies are evil, and how the earth is really flat!'

    Oh yeah, I'm sure people would be tripping over themselves to sign up for a job trawling through deplorable and disgusting content put forth by similarly repulsive and vile individuals.


  19. Thad (profile), 21 Jun 2019 @ 3:35pm

    Re: Re: recycled suggestion

    Moderation at scale is impossible?? Isn’t that what Techdirt does with its comment moderation, with the voting system/contest on comments, and look at the trash anyway if you feel like it?

    No.

    Techdirt does not operate at anything remotely resembling the scale of Facebook, YouTube, or Twitter.

    That's what the "at scale" qualifier means: moderation methods that may work on a smaller-scale site like Techdirt break down when you attempt to deploy them on a site with a userbase in the millions. Techdirt's flag button would be completely ineffective at Facebook scale, and trivial to abuse. Indeed, it would be even more vulnerable to false reports than, say, Twitter's report mechanism, because it's not even reviewed by a human; if the flag button gets enough clicks that appear to be from different sources, then the post gets hidden. Think how easy that would be to abuse with a botnet, or just a bunch of bored channers.

    The flag system would, similarly, fail against automated/sufficiently determined shitposters. The other day I posed the hypothetical that someone could create a bot that just posts "COCK" every thirty seconds. If someone were to do that here (let's assume that it's actually "COCK" followed by a random string of characters, so it doesn't trigger double-post prevention), the flagging system would quickly prove insufficient to combat that kind of abuse, and human moderators would have to step in to deal with it.

    Now imagine that you're talking about a site that allows video. And instead of just a script posting "COCK" every thirty seconds, you've got a script that posts the kind of horrific video described in the article. (Let's assume I don't press "Play" but that its thumbnail image is still something horrifying.)

    I see that in my feed and I, for one, am not sticking around to click the flag button. And not coming back to the site until I'm reasonably confident I'm not going to see something like that again.
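
    To make that concrete: a toy sketch, in Python, of an unreviewed "enough flags from different sources hides the post" rule. It is purely illustrative -- not Techdirt's actual code -- and the names and the fixed threshold are assumptions. The point is that nothing in it can tell a genuinely offended reader from one node in a botnet:

    FLAG_THRESHOLD = 5                 # assumed, for illustration
    flags = {}                         # post_id -> set of distinct "sources" that flagged it

    def flag(post_id, source):
        flags.setdefault(post_id, set()).add(source)

    def is_hidden(post_id):
        # No human review: a count of apparently-distinct sources is all that matters.
        return len(flags.get(post_id, set())) >= FLAG_THRESHOLD

    # Five readers -- or five botnet IPs -- look exactly the same here:
    for ip in ("10.0.0.1", "10.0.0.2", "10.0.0.3", "10.0.0.4", "10.0.0.5"):
        flag("some-post", ip)
    print(is_hidden("some-post"))      # True

    At Techdirt's scale that failure mode mostly never gets triggered; at Facebook's scale it would be triggered constantly, by trolls and bots alike.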


  20. That Anonymous Coward (profile), 21 Jun 2019 @ 4:09pm

    Re: Re:

    OTA would help
    Some of the things we've seen

    • they need an Office to force feed them the Constitution & test them on it
    • they need basic science driven into them (they'll steal all the wind!)
    • they need to be forced to see, every time they cut corporate/high-earner taxes to spur the economy, all of the people whose lives got much, much worse
    • they need to be held accountable for underfunding those agencies that keep corporations from killing us by cutting corners
    • that a free market requires they not put their thumb on the scales for a couple players who donate well

    I think the larger problem with the public is that the few things they see that upset them so terribly much are most likely super tame compared to what is already being removed before it is seen.

    It would be cruel, but I think each person cheerleading for more moderators & faster moderation needs to sit through the same moderation stream these workers deal with in a single shift. I bet most of them couldn't make it an hour. They need to come to grips with the human ability to make horrible things, that perhaps someone calling you a nazi doesn't even ping on the radar filled with dead bodies, murders, & the rest of the fscked up shit people want to stick online.


  21. Anonymous Anonymous Coward (profile), 21 Jun 2019 @ 5:19pm

    Self Control

    Maybe the world could exercise better self control. Some will, some won't. That the content is available isn't really the issue; that some will not exercise self control, or have a predilection to enact with those things others want banned, is.

    Can we prevent the predilection? Probably not. Not without a lot of psychoanalysis that many, many won't get, and then some of that is questionable. Even then, what would we do, charge them with thought crimes?

    To some extent, that is what all this is about. Those with self control don't enact with this type of content. Those without, at least until they do something awful, haven't actually committed a crime yet (at least under US jurisprudence).

    So in the end the whole purpose of some of this moderation wangdoodle is to prevent thought crime, which in my mind is not actually possible, but would go a long way to harm anyone I didn't like for some reason or another.

    To set up some way to prevent those who might be adversely or otherwise negatively affected by such content would be about as effective as the UK's attempt at controlling porn.

    And, as seen in the article, the moderation process is harmful in and of itself.


  22. Anonymous Coward, 21 Jun 2019 @ 6:14pm

    Perhaps the worst problem with moderation at the megasites is the inconsistency, often due to moderation-by-algorithm, and lack of any sort of proper bookkeeping.

    When a YouTube video gets flagged for the third time, after being successfully appealed and unflagged twice previously, then that shows an obviously disorganized, if not chaotic system of policing content. That kind of problem could be one of the easiest to solve, yet remains an ongoing issue after more than a dozen years of Youtube's existence. The copyright vs. fair-use issue is of course naturally going to err against the content creator every single time, since the risk of not doing so can be legally catastrophic.
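
    Even the most basic bookkeeping would catch the repeat-flag case. A rough sketch in Python, with made-up names, of the sort of check the comment above says is missing (illustrative only, not how YouTube actually works):

    # Remember which reasons already survived an appeal, and stop
    # auto-actioning the same video for the same reason again.
    upheld_appeals = {}   # video_id -> set of reasons for which an appeal was upheld

    def record_successful_appeal(video_id, reason):
        upheld_appeals.setdefault(video_id, set()).add(reason)

    def should_auto_action(video_id, reason):
        # A reason a human reviewer already cleared goes back to a human, not the bot.
        return reason not in upheld_appeals.get(video_id, set())

    record_successful_appeal("vid123", "copyright")
    print(should_auto_action("vid123", "copyright"))  # False -- already appealed and won
    print(should_auto_action("vid123", "nudity"))     # True  -- new reason, no history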


  23. ECA (profile), 21 Jun 2019 @ 6:35pm

    God help us..

    1..

    Any person that has DONE true Customer service...GETS A GOLD STAR, if they NEVER said anything WRONG.. 2 gold stars..

    Dealing with the public IS/CAN BE the most Terrible job you have ever taken..
    NOW consider dealing WITH IDEALS/Concepts/Ideology/IDIOT-OLOGY/Pain/destruction/Hurt/..
    And trying to either EDIT-CORRECT-KILL the person that has an idea that has no Reality in any fashion or form..and then Just the REAL stuff of other nations..

    Who wants to sit thru EVERY Documentary EVER...??anyone..


  24. That One Guy (profile), 21 Jun 2019 @ 7:24pm

    Before you inflict it on others, endure it yourself

    It would be cruel, but I think each person cheerleading for more moderators & faster moderation needs to sit through the same moderation stream these workers deal with in a single shift. I bet most of them couldn't make it an hour.

    It would be seriously unpleasant, but 'put up or shut up' seems appropriate there. If they want to blame a platform and insist that they just need to 'hire more moderators' because moderation is easy if the company just tries then it seems only fair that they get to experience just what they are insisting even more people be subjected to.


  26. That Anonymous Coward (profile), 21 Jun 2019 @ 9:06pm

    Re: Before you inflict it on others, endure it yourself

    Like I said elsewhere today about the people flogging Google to do better: they have nothing to offer but demands that it do better. They assume that if they demand it, it can happen in 22 minutes, just like on the sitcoms.

    I mock the hairless apes for their ability to blame a thing for their own failings...
    McDonalds made my kids fat (we've seen my rants)

    The platform made me see horrible things punish them!!!
    No, the platform didn't upload the video of the guy murdering puppies... some people are sick fscks & they tried to keep you from seeing it, but the person who would have flagged that video was out that day, seeing a therapist about the crushing depression that making your bubble clean and safe caused them.
    Perhaps instead of blaming platforms, blame the bad actors for once. Dumb enough to upload a video to a platform he belongs to showing himself or his buddies killing animals... pretty sure that video should be turned over to the authorities along with his contact details, rather than screaming b/c it made it onto the platform.


  27. Anonymous Coward, 22 Jun 2019 @ 12:46am

    This will all go over well up to the point where those demanding more moderation start noticing who gets moderated... then they'll ask for less of it.

    Exhibit A: John Smith


  28. ECA (profile), 22 Jun 2019 @ 2:12am

    Re:

    BUT..
    can we moderate those politicians..Both sides..
    That gets them upset..


  29. Anonymous Coward, 22 Jun 2019 @ 6:13am

    Re: Re: Re:

    Some politicians hire knowledgeable and experienced staff members in order to fill the gaps, so to speak, because they realize that they do not know everything. Those who do not, however ......


  30. Anonymous Coward, 22 Jun 2019 @ 6:16am

    Re: Re: Working Conditions

    And these sweat shops probably go around proclaiming how they are proud to be a contributing member of society blah blah


  31. Anonymous Coward, 22 Jun 2019 @ 6:17am

    Re:

    What burden?
    You want me to work for free? What are you, some kind of socialist?


  32. Anonymous Coward, 22 Jun 2019 @ 10:56am

    Re:

    This has already happened.


  33. Anonymous Coward, 22 Jun 2019 @ 10:59am

    The morality of the business model of these platforms should be brought into question. The business models of platforms like Twitter, Facebook, and YouTube revolve around letting users upload as much as they want whenever they want, serving ads against everything that gets uploaded, and taking users' data and using it for their targeted advertising businesses. And the uploads of all these users are moderated by automated systems that can never truly be perfected and human contractors who get paid far too little for the skilled labor that they do, and for their work they are graciously rewarded (/s) by coming out the other end psychologically scarred. And it's not just places in the U.S. that have these content moderation sweatshops: The job is outsourced to other countries as well. Welcome to the new colonialism, brought to you by Big Tech.

    If the only way to moderate content at massive scale is to use can-never-be-perfect AI and glorified sweatshops that break people down into shells of their former selves, and that still doesn't get all the content because true content moderation at that scale is impossible, then maybe the problem is the fact that we let these centralized platforms get this big?

    The way that these tech companies have been given infinite leeway to "move fast and break things", and that nobody slowed down to reconsider the implications when the things getting broken became the psyches of the countless people who have to sift through the most vile shit constantly being uploaded to social media, shows a significant moral failing on the part of Silicon Valley. This moral bankruptcy can be seen in the men and women who make up the companies and their executive boards, who make too much money to care that their product only "works" thanks to such abused labor, and in the punditry that relentlessly defends these companies and their practices in the name of "innovation".


  34. Anonymous Coward, 22 Jun 2019 @ 11:15am

    Re:

    "The morality of the business model "

    This is a new one for me, has this been circulating for a while or is it new?

    This looks like just more whining about social/tech/lib/lefties.


  35. Christenson, 22 Jun 2019 @ 10:21pm

    Re: Re: Re: recycled suggestion

    I don’t think you understood what I meant, probably because it isn’t quite what I said, and I’ll bet you flunked out of telepathy school, lol. I liked what I was reading, but wanted new and potentially better ideas with a bit more flesh on the bones.

    Techdirt seems to work, IMO, for two reasons. First, moderation is partially crowdsourced, with responsible humans for backup. It is also not completely democratic, in a variety of ways, which makes it hard to game. Second, the community is small enough that there is friction (and a cost) to behaving badly, as well as general agreement as to what belongs here and what does not. There is also some gain to be had for doing the right thing, and ways to make small contributions easily.

    Making moderation on a massive platform work would need to involve the same kinds of things — breaking it up into effectively smaller spaces (wait! Isn’t that the targeted advertising model?), with volunteers/users being on the front line of moderation within their smaller, more contextualized spaces. I and many of you care enough about Techdirt to help out in small ways, like voting on posts.

    As an end user, I have to impose friction on what I see, because there are too many spammers, scammers, and terrorists in the world and I can’t sort through the massive and frankly boring pile of garbage they would heap on me, and, in any case, there is way, way too much of even the stuff we want for anyone to look at more than a tiny slice of it.


  36. Anonymous Coward, 23 Jun 2019 @ 7:14pm

    Re:

    And it's not just places in the U.S. that have these content moderation sweatshops: The job is outsourced to other countries as well.

    Shitty jobs nobody wants get shipped off to less developed countries? Wow, gee, that's absolutely fascinating insight that I've never heard before, who would've thought?

    Welcome to the new colonialism, brought to you by Big Tech.

    Every single industry does this. Grow to the point where you have to cut corners. Zuckerberg having a punchable face does not suddenly mean other industries who pull this garbage become a little less punchable for doing the exact same thing.


  37. Anonymous Coward, 24 Jun 2019 @ 6:21am

    Re: Re:

    "Shitty jobs nobody wants get shipped off to less developed countries? Wow, gee, that's absolutely fascinating insight that I've never heard before, who would've thought?"

    • Software Engineer is a shitty job? This is news.

    "Every single industry does this"

    • What are you, some kind of lemming?


  38. Anonymous Coward, 24 Jun 2019 @ 6:59am

    The true source of the problem, the user.

    The true source of the problem is not the platform, it is the user (not moderator). Nowhere is there any assertion that the user has a responsibility to manage their own behavior. If the user found something objectionable on platform X, DON'T GO THERE. No one has any right to expect that others will work (for free, from the user's perspective) to mollycoddle them with only "Good" content. The expectation, even demand, for such protection is a form of selfishness beyond sanity.
    Certainly, the platforms should make a reasonable, and humane, attempt to block the most extreme content.
    Otherwise, the user is responsible for the sites they visit. If the user doesn't like certain content at a specific site then go elsewhere, permanently if necessary.


  39. Anonymous Coward, 24 Jun 2019 @ 7:33am

    Re:

    That would be good, yes. But even mental evaluation doesn't necessarily prepare people for what they're exposed to in these evaluation and moderation jobs.

    My wife works as an evaluator/content moderator/etc for a certain search company named after a misspelling of an extremely large number. She grew up in some of the terrible parts of the internet, much like I did. It takes a lot to truly shock or bother either of us. There have been multiple times where I came home from work to find her curled up under a blanket because of some of the horrific stuff she had to deal with scrubbing that day. A woman who writes horror fiction in her off hours as a hobby, genuinely unsettling stuff, was reduced to tears from what she'd seen.

    If you've been on the internet long enough, you've likely seen something that actively turned your stomach, made you close the window, say 'that's enough internet for the day,' and go hug the nearest person/pet/stuffed animal/etc of your choice as long as they would permit. Now imagine that's your job, 20-40 hours a week. To walk through minefields full of every flavor of that. Will you see a beheading video today? Or someone killing small animals in some sick fetish video? Perhaps the most virulently racist, hate-filled stuff you can imagine? Whatever number the die lands on, you're not going to enjoy it.

    She gets no health insurance. We're lucky enough to not make enough money for the local sliding scale mental health clinic to charge anything.


  40. Anonymous Coward, 24 Jun 2019 @ 8:36am

    Re: Re: Re:

    As someone who's spent some time wading through SO/SE review queues, moderation can be hard even if the worst one sees is spam and mild invective, relative to the dead puppies floating down the general Internet sewer, that is...


  41. Thad (profile), 24 Jun 2019 @ 9:09am

    Re: Re: Re: Re: recycled suggestion

    Indeed.

    We've discussed this before -- I think the best solution to big platforms' inability to moderate effectively is to go back to the days of smaller sites on open platforms. Damned if I know how we actually get there, though. Doctorow's suggestion the other week about adversarial interoperability was interesting, and I think it's a piece of the puzzle, but it's still a lot harder to get people to change their behavior than it is to build software.


  42. Anonymous Coward, 24 Jun 2019 @ 5:49pm

    Re: Re: Re:

    Software Engineer is a shitty job? This is news.

    I meant the moderators, not software engineers.

    What are you, some kind of lemming?

    Merely observant. Menial tedium such as looking through comments is regularly outsourced to the cheapest possible, lowest common denominator. It's not rocket science.


