Content Moderation Case Study: Xhamster, The 22nd Biggest Site On The Internet, Moderates Content Using Unpaid Volunteers (2020)

from the one-way-to-do-it dept

Summary: Formed in 2007 and operated out of Limassol, Cyprus, xHamster has worked its way up to become the 20th most-visited site on the internet. The site boasts 10 million members and hundreds of millions of daily visitors despite being blocked by a number of governments around the world.

Being in the pornography business poses unique moderation challenges. Not only must moderators deal with a flood of both amateur and professional submissions, they must also take care to prevent the uploading of illegal content. This goes further than policing uploads for unauthorized distribution of copyrighted material. Moderators must also make decisions -- with facts not in their possession -- about the ages of performers in amateur videos, to protect the site from prosecution for the distribution of child pornography.

Given the stakes, users would expect a well-staffed moderation team trained in the difficult art of discerning performers' ages… or at least given the authority to block uploads until information about performers is obtained from uploaders.

Unfortunately, this does not appear to be the case. An undercover investigation by Vice shows one of the biggest sites on the internet has chosen to lower its costs by relying on an all-volunteer moderation team.

One member of the discussion is “Holger”, a user created by VICE News to infiltrate the content moderation team and observe its inner workings. Holger finds himself in a team of over 100 unpaid, voluntary workers called “the Reviewers Club”, which means he has partial control over which photos stay online and which are taken down.

Moderators are guided by a 480-page manual that explains what images and videos are permitted. The "Reviewers Club" then works its way through thousands of content submissions every day, making judgment calls on uploads in hopes of preventing illegal or forbidden content from going live on the site.

Decisions to be made by xHamster:

  • Does relying on unpaid volunteers create unnecessary risks for the site?
  • Would paying moderators result in better moderation? Or would paid moderation result in only nominal gains that would not justify the extra expense?
  • As more revenge porn laws are created, does xHamster run the risk of violating more laws by turning over this job to volunteers who may personally find this content acceptable?

Questions and policy implications to consider:

  • Given the focus on child sexual abuse material by almost every government in the world, does the reliance on an all-volunteer moderation team give the impression xHamster doesn't care enough about preventing further abuse or distribution of illicit content?
  • Does asking content consumers to make judgment calls on uploads create new risks, like an uptick in uploads of borderline content that appeals to members of the volunteer staff?
  • Can the site justify the continued use of volunteer moderators given its assumed profitability and heavy internet traffic?

Resolution: Despite the site's popularity, xHamster has not moved to paid moderation performed by staff who are not also site users -- users whose personal preferences may result in unsound moderation decisions. The investigation performed by Vice shows some moderators are also content contributors, which raises further concerns about moderation decisions on borderline uploads.

While xHamster informs users that all uploaded content requires the "written consent" of all performers, there's no evidence on hand that shows the site actually collects this information before approving uploads.

Further skewing moderation efforts is the site's highly unofficial "reward" program, which grants "badges" to reviewers who review more content. The site's guidelines only forbid the worst forms of content, including "blood, violence, rape" and "crying" (if it's determined the crying is "real"). Underage content is similarly forbidden, but reviewers have admitted to Vice that policing underage content is "impossible."

Moderation decisions are backstopped by the site, which requires several "votes" from moderators before making a decision on uploaded content. The "democratic" process helps mitigate questionable decisions made by the volunteer staff, but it creates the possibility that illicit content may obtain enough votes to skirt the site's internal guidelines.
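
To make the mechanism concrete, here is a rough sketch of how a vote-threshold review queue of this kind could be modeled -- purely illustrative, with the thresholds, vote options, and data structures being assumptions rather than details reported by Vice or xHamster:

    from dataclasses import dataclass, field

    APPROVE_THRESHOLD = 3  # assumed number of "approve" votes needed to publish
    REJECT_THRESHOLD = 3   # assumed number of "reject" votes needed to remove

    @dataclass
    class Upload:
        upload_id: str
        approvals: set = field(default_factory=set)
        rejections: set = field(default_factory=set)
        status: str = "pending"  # pending -> published | removed

        def cast_vote(self, reviewer_id: str, approve: bool) -> str:
            """Record one reviewer's vote; finalize once a threshold is met."""
            if self.status != "pending":
                return self.status  # decision already locked in
            # One vote per reviewer; a later vote replaces an earlier one.
            self.approvals.discard(reviewer_id)
            self.rejections.discard(reviewer_id)
            (self.approvals if approve else self.rejections).add(reviewer_id)
            if len(self.approvals) >= APPROVE_THRESHOLD:
                self.status = "published"
            elif len(self.rejections) >= REJECT_THRESHOLD:
                self.status = "removed"
            return self.status

    # Example: nothing goes live until several volunteers agree.
    clip = Upload("video-123")
    clip.cast_vote("reviewer_a", approve=True)
    clip.cast_vote("reviewer_b", approve=True)
    print(clip.cast_vote("reviewer_c", approve=True))  # -> published

Even with a threshold like this, the failure mode described above remains: if enough volunteer reviewers share the same blind spot or personal taste, borderline content can still collect the votes it needs.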

Originally published on the Trust & Safety Foundation website.

Filed Under: adult content, content moderation, moderators, porn, volunteers
Companies: xhamster


Reader Comments

    Samuel Abram (profile), 22 Dec 2020 @ 5:07pm

    Pronunciation?

    Just out of curiosity, how would one pronounce "Xhamster"? "Eks-HAM-stir"? "ZAM-stir"? I'd like to know, as the Wikipedia page is not producing any clues…

    Ehud Gavron (profile), 22 Dec 2020 @ 5:38pm

    Moderation - again

    TL;DR - to "solve" this technically is beyond current means.

    Long answer:
    To answer Mr. Abram, it's "Eks-Hamster". Not that it really matters, as you can pronounce it any way you like so long as you spell it right when you sign in. Like "Kubernetes". Seriously the debate can go all night long. Is it toe-may-toe or tuh-muh-toe?

    As to the topic at hand, in a previous business I worked with an adult content creation business. They had hired a call-center's worth of people to review content and apply "tags". For example, without getting NSFW, tags like "grandma" and "bondage" would be added to the videos they described so that future searches would be able to find them. In other words, if one tagged some videos as "grandma", someone searching for "grandma" would find them.

    The toll on my friend who worked there was huge. She worked normal hours (9-5 US or 0900-1700 EU) and she had breaks but the whole time she was there it was watching pornography, clicking on "tags" and then moving on to the next segment. Some were, as she described the experience, disturbing.

    Moderation is an art. The moderator has to apply subjective judgment to evaluate whether a particular item fits or doesn't fit. A 480-page manual only hurts (but may shield Xhamster from liability down the road... but who is to say with today's wolves in Congress).

    To have effective moderation there ought to be an OBJECTIVE standard, which is difficult because what's fine in California is not fine in South Carolina is not fine in Iraq and not fine in China. However, if such a standard could be defined, agreed upon (think international treaty) and codified, then an AI/Neural Net/cloud/crypto/blockchain/VC's-give-me-cash could be set up to do it.

    As always I appreciate the Copia Institute op-eds. Questions and policy implications to consider are difficult to sum up, but starting with CSAM is simply a shift to "What about the children?" It's a consideration, to be sure, but THE VERY FIRST ONE? Children are not the primary users of the Internet or Xhamster.

    Volunteer moderators and volunteer staff are just a money shift. It doesn't address any of the issues in moderation... it's just a question of how cheap your labor can get. If you get great free moderators and volunteers, good for you. If you can't, and you pay, and you get much better ones, good for you. The underlying issues (outlined above) don't change at all based on how much you pay the people who have to watch the content and make subjective decisions.

    My 2¢ worth.

    Ehud Gavron
    Tucson Arizona US

      Stephen T. Stone (profile), 22 Dec 2020 @ 6:12pm

      Questions and policy implications to consider are difficult to sum up, but starting with CSAM is simply a shift to "What about the children?" It's a consideration, to be sure, but THE VERY FIRST ONE? Children are not the primary users of the Internet or Xhamster.

      You missed the point of the question. I’ll repeat it here for easier context:

      Given the focus on child sexual abuse material by almost every government in the world, does the reliance on an all-volunteer moderation team give the impression xHamster doesn't care enough about preventing further abuse or distribution of illicit content?

      The question isn’t about “what about the children” or children as “xHamster users” or whatever you think it is. It’s about whether the reliance on a volunteer mod team makes xHamster look like it doesn’t give a shit about CSAM. And it is a fair question. If the owners of xHamster truly gave a shit, they’d hire a professional staff to moderate content as well as weed out and report CSAM. Maybe the xHamster owners don’t want to pay for the therapy that said staff would obviously need after spending hours upon hours of looking at porn (including fetish porn, both tame and “extreme”) as well as any CSAM they may come across. Maybe they don’t want to pay for any extra staff, period. But whatever the case, the fact that xHamster moderation relies on volunteers with seemingly no obligations to the site itself is disconcerting — at best.

      This rings especially true after the recent PornHub purge. That site got rid of millions of videos because Mastercard and Visa started refusing to do business with it. That refusal was prompted by reporting from the New York Times that PornHub had a sizeable amount of CSAM on it (among other illegal/unlawful content). xHamster could end up on that same chopping block if the owners refuse to get their shit together and do more about any potential CSAM problem on that site. Asking volunteers to do the job of moderating the site does xHamster no favors in that regard.

        Ehud Gavron (profile), 22 Dec 2020 @ 6:23pm

        Re:

        "Professional staff to moderate content"???

        You've missed the whole point.

        Try reading the original article. Then read anything Mike or Tim have written about moderation. Once you have that concept, read what I wrote.

        THEN when you can FIX MODERATION for WEBSITES IN THE WORLD, speak up.

        E

          Stephen T. Stone (profile), 22 Dec 2020 @ 6:54pm

          You've missed the whole point.

          No, I haven’t. xHamster employs a small number of unpaid volunteers — people who have no legal, moral, or ethical obligation to work in the best interests of xHamster — to moderate all the content on that site. That could bite xHamster on its metaphorical ass, since credit card companies already have a bug up their own metaphorical asses about porn video sites with lax moderation. Any site that doesn’t appear to take moderation seriously — like, say, a site that uses unpaid volunteers instead of paid employees to handle moderation — could end up in the same position as PornHub.

          And that doesn’t even get into the myriad issues with using unpaid volunteers with no obligations toward a given porn video site to moderate content. For example: What would happen to xHamster if a pedophile lands a mod position and uses it not to delete CSAM, but to download it?

          xHamster has a serious issue to deal with. It doesn’t seem to take that issue seriously. Neither, apparently, do you.

            Ehud Gavron (profile), 22 Dec 2020 @ 9:56pm

            Re:

            You've still missed the entire point of content-based moderation. It doesn't matter if Xhamster "employs unpaid volunteers" (a nonsense expression), or if they take something "seriously" or not, or whether I do. I don't work for Xhamster, so my commenting on it doesn't affect anything in their business.

            What's important, and it's why I've exhorted reading other people's writings... is that content-based moderation is somewhere between HARD and IMPOSSIBLE.

            I know it's difficult to get but here's an analogy that may help:

            • Republicans say FaceBook censors their comments way too much
            • Democrats say FaceBook doesn't censor Republicans enough

            A computer algorithm, or an individual human, or 10,000 of them, cannot moderate in a way that satisfies both.

            Now switch from "simple text" to a video clip someone has to watch, has to know the background of, may need a database to compare it against, etc., and the "problem" that even FaceBook can't solve becomes exponentially more difficult.

            We can't fix by "throwing more bodies at it" anything we can't fix to begin with.

            E

              cattress (profile), 23 Dec 2020 @ 12:01am

              Re: Re:

              I think that there may be a few holes in their model, like not getting the written certification that all performers are consenting adults as they claim to do, but otherwise they seem to have found a working solution for their site.
              There is no such thing as an unbiased, neutral, professional moderator of art, especially art that can be lovingly crafted at home by hand, so to speak. It's impossible. And porn is a form of art. Those who earn badges that allow them to moderate content are connoisseurs of sorts; people who enjoy the art and have developed vast knowledge and often a trusted sense of taste. While some pervert may temporarily get a moderator position, with a democratic structure that requires input from multiple people, they would quickly be discovered and outed. Because the moderators are volunteers, there are no performance metrics to burn people out or cause psychological fatigue or disturbance. They don't have to depend on reviewing content to pay the rent, and they can do as much or as little as they like. Moderators are incentivized by their love of the art and continued curation of content to share with other art lovers. If they are reckless about screening out what they suspect is CSAM, they risk seeing their "museum" shut down, their reputation damaged, and even personal liability.
              Consider Backpage, and the networks of both consumer reviewers and sex workers that had developed. While not all abuse could be prevented using the platform, it did bring some victims out of the shadows where they might never have been found and ultimately saved. People who only wished to participate in or facilitate successful and satisfactory transactions between consenting adults did not want to lose their platform for doing so by turning a blind eye to abuse and victimization of kids or non-consenting adults, which is why they worked with law enforcement.
              It might not appear to some governments that a voluntary program is indicative of a company that truly cares about preventing csam, but policing themselves vigorously is one of the best ways to avoid being arbitrarily (or rightfully) shut down or policed by a government authority.
              We need to stop complaining about the imperfections of moderation and continue to develop flexible solutions that can be tailored to meet varied demands.

                Ehud Gavron (profile), 23 Dec 2020 @ 12:21am

                BackPage

                ...consider Backpage...

                Backpage did nothing against the law. In fact, pre-VP-elect Kamala Harris testified (under oath) to Congress that she couldn't do anything about them. That's a complicated way of saying "Make laws, because they're not violating any."

                Then she moved to arrest two BP execs [quickly released, charges dismissed, etc.] to cement her "success" as a "tough" prosecutor. I do like her better than the alternatives, but I don't like the hypocrisy.

                BP didn't create the sex trade. It allowed people not to have to walk the streets and risk their lives. The elimination of Craigslist and Backpage and other avenues doesn't REMOVE sex work from the street; it just makes it much less safe.

                Which is worse?

                1. Sex workers are beat up on the street, give up money to pimps, and sometimes die
                2. Websites make money

                Take your time. You're in the comfort of your own home. Sex workers are out on the street.

                E

            stine, 22 Dec 2020 @ 11:58pm

            Re:

            " people who have no legal, moral, or ethical obligation to work in the best interests of xHamster"

            You're incorrect. They do in fact have an obligation to work in the best interest of xHamster. If xHamster closes down they, and the world, lose access to that library of pornography.

            Do you know anyone who solved CAPTCHAs as a hobby?

              PaulT (profile), 23 Dec 2020 @ 12:06am

              Re: Re:

              "They do in fact have an obligation to work in the best interest of xHamster. If xHamster closes down they, and the world, lose access to that library of pornography."

              True, but how does that fit into "legal, moral, or ethical obligation"? They might have selfish reasons or believe that the site provides an important enough service that it should remain available, but those don't fit into the original definitions.

              In fact, you can argue that your statement just confirms they have no specific obligation to xHamster at all, and would drop the site at a moment's notice if they believe the requirements were being met elsewhere.

                cattress (profile), 23 Dec 2020 @ 11:27pm

                Re: Re: Re:

                Fair enough, but while an employment contract may shore up a legal obligation, does it dictate a moral and ethical obligation? I don't want to wax philosophic, but...
                I don't think there is such a thing as a moral obligation to a company. Perhaps a moral sense of duty, but not if that duty requires violating a relatively higher held moral. Morals are our personal belief systems of right and wrong, and the relative rightness or wrongness. Our obligation is to ourselves, our god, and other people. And I think moral obligation ultimately trumps legal and ethical obligation in most situations.
                Ethical obligations are usually based on the consensus of professionals about how one is to conduct oneself and perform one's trade, especially with regard to the impact on others. Ethical obligations are developed by regulators and professional organizations, and usually those bodies have more power to hold someone to account than an employer does. If a lawyer does not represent their client zealously, the client could fire the lawyer, but ultimately the Bar would determine if the lawyer was unethical.
                In fact, I think a legal obligation, like an employment contract could mitigate one's moral or ethical obligations, along the lines of "just following orders".
                I think it comes back to a love of the 'art': a desire to protect the art from contamination, or a sort of guilt by association, with violent, abusive, harmful content; and a consensus among moderators that content with actual violence or abuse, or produced with anyone unable or unwilling to give meaningful consent, does not constitute art. And while I'm describing these obligations as towards the art, Xhamster is the virtual museum or library where the art is housed, or collected.
                Many, if not most, libraries and museums in the non-digital world are run and maintained almost entirely by volunteers and non-profit organizations; or, people that love them. (Yes, in some cases they do hire and pay people handsomely to curate their museums, but that is because of more tangible skills and specific knowledge of things like authentication, purchase, and care of priceless works of art or scientific discovery that just doesn't apply to pornographic videos.) Certainly the moderators could lose interest, or get tired of screening out what does not constitute art, or even venture out to create their own "museums", but how is that any different from a paid employee?
                Since Xhamster runs everything through the moderators before releasing it on their site, losing their voluntary moderators would mean they have to modify their business model or close shop. And regardless of whether the moderators are paid or voluntary, Xhamster has to be able to trust their judgement. Perhaps a badge system, where the role of moderator is earned by those who have demonstrated good judgement and developed a reputation, is less risky than hiring someone based on a resume, an interview, and the limited information that can be gathered from references.
                I don't think this model necessarily works for any and all sites hosting pornography, but it shouldn't be discounted just because there is no employer/employee relationship.

        Anonymous Coward, 23 Dec 2020 @ 6:50pm

        Re:

        I didn't get the notion from the New York Times article that PornHub had a significant amount of child pornography on it.

        They said it was probable there was a significant amount of child pornography, and offered some anecdotes suggesting that, at some point in recent times, some people had difficulty getting their content removed from the site.

        Kristof himself admitted that, even with problematic keywords, the vast majority of the content was lawful, so it's entirely possible it was simulated or otherwise, although I personally do not use PornHub, and wouldn't want to stumble onto anything problematic by looking into it further. Several other people have pointed this out.

        Is a hundred reports in a year really that shocking? Surely, if it had a sizeable amount, the number would be higher? It is true it may be difficult to tell someone who is 17 apart from someone who is 18, but with so many abuse hotlines and investigators looking into it, should it really be that low?

        And the so called "trauma" someone would get from viewing child pornography is grossly overstated. Most of it is probably not going to be of a little toddler, but someone who is nearly of age. Would you find that traumatizing?

    ysth (profile), 22 Dec 2020 @ 7:23pm

    I usually enjoy these case studies, but this one seemed very biased. All of the decisions to be made and questions to consider seemed worded to only allow one direction of answer.

    I had never heard of xHamster, and have no idea what their business model is, if any; maybe more background on that would have helped me understand where the article author is coming from.

    TL;DR for the article: bad website, bad bad website

      PaulT (profile), 22 Dec 2020 @ 11:25pm

      Re:

      "I usually enjoy these case studies, but this one seemed very biased."

      Yet, you later announce you haven't the first clue about the subject of the article. Weird.

      "All of the decisions to be made and questions to consider seemed worded to only allow one direction of answer."

      OK, so this is where you give constructive criticism, such as suggesting which other options the site should have, and what is missing from the article?

      "I had never heard of xHamster, and have no idea what their business model is, if any; maybe more background on that would have helped me understand where the article author is coming from."

      Or, you could type the word into Google and spend 2 minutes reading up on it yourself, instead of whining that you haven't heard of the 22nd largest site on the internet? I can understand not having heard of a particular porn peddler if you're not personally into porn, but I can't understand someone that will immediately complain they haven't been spoon fed basic information when they can open a tab and educate themselves more quickly than it took them to complain about it.

        crade (profile), 23 Dec 2020 @ 6:36am

        Re: Re:

        Hopefully a little bit more constructive... I often find the case studies do this but this one especially so; the various sections all seem to be basically the same thing... They are rehashes of one unanswered question: Are unpaid volunteers bad, or not as good as paid workers?

        The study says xhamster is 22nd "largest" site, but doesn't explain what that might mean in terms of their revenue. Does being a popular free porn site mean you are rolling in benjamins and should be able to throw money at any problem?

        What wasn't discussed at all, but I found interesting, is that despite xhamster being so large in terms of popularity, their reviewer group is so small. Would a few thousand less-involved volunteer reviewers do better than 100?

        Also, there seemed to be absolutely no attempt to discuss the answer to the question: how well or how badly are they doing at content moderation?

          Samuel Abram (profile), 23 Dec 2020 @ 7:07am

          Re: Re: Re:

          Those are far more valid, constructive, and non-trolling points than just saying a post on an opinion blog is "biased" when nobody on TechDirt said otherwise. I don't have any answers, but at least you're asking the right questions.

            ysth (profile), 23 Dec 2020 @ 5:15pm

            Re: Re: Re: Re:

            I certainly wasn't intending to troll. And I wasn't trying to argue for or against the article's position. All I was trying to say was that, from my reading of it, this case study lacked the nuance, backing of points with data, and honest questioning I've seen in others.

          PaulT (profile), 23 Dec 2020 @ 7:11am

          Re: Re: Re:

          " I often find the case studies do this but this one especially so, the various sections all seem to be basically the same thing.. They are rehashes of one unanswered question: Are unpaid volunteers bad, or not as good as paid workers."

          I don't find that to be the case, but that is certainly more constructive.

          "The study says xhamster is 22nd "largest" site, but doesn't explain what that might mean in terms of their revenue"

          I'm fairly sure that the "largest" just means in terms of traffic - obviously, different sites monetise in different ways and I'm not sure how much of that data is publicly available for this kind of site.

          My point was that if you're faced with something that's claimed to be in the top 50 websites on the internet, you don't recognise the name and you don't think to have a look before commenting, you probably aren't armed with enough data to criticise the article about that subject.

          "What wasn't discussed at all, but I found interesting is that despite xhamster being so large in terms of popularity that their reviewers group is so small. Would a few thousand less involved volenteers reviewers do better than 100?"

          It's probably a very tricky question with no clear answer. My guess would be that if you have 100 very engaged porn consumers who are willing to volunteer hours every day to moderate content, you're going to get better results than if you have a few thousand casual watchers who might just have it on while doing something else. But, lacking actual studies to confirm one way or another it's hard to say, so the question instead becomes that of volunteer vs employed moderators, which is what the article is addressing as you mentioned.

          "Also, there seemed to be absolutely no attempt to discuss the answer to the question.. how well or how badly they are doing at content moderation"

          Well, the linked article seems to suggest that nobody really knows. As, apparently, their competitors didn't until they got targeted by an article telling them how badly they were doing.

            crade (profile), 23 Dec 2020 @ 9:01am

            Re: Re: Re: Re:

            "I don't find that to be the case, but that is certainly more constructive."
            I meant the sections within each case study seem to be restatements of each other, or too similar to each other; it's only this one that is trying to answer that specific question.

    Anonymous Coward, 25 Dec 2020 @ 8:59pm

    The moderators aren't being paid minimum wage.

      Ehud Gavron (profile), 25 Dec 2020 @ 9:14pm

      Re: minimum wage

      The moderators aren't being paid minimum wage.

      In which state? All have differing minimum wages. That's just in the United States.

      E

    Ehud Gavron (profile), 25 Dec 2020 @ 11:44pm

    Minimum wages

    To add detail: in the US, minimum wage applies to employees, who typically do not work on a farm and are not independent contractors. The numbers (required minimum wages) for those "W-2 employees" vary from state to state, so when evaluating whether Xhamster pays their moderators below minimum wage one should consider, among other things:

    • are these moderators employees (as in "W-2 recipients")
    • do they work the required minimum number of hours per week
    • do they get benefits
    • where do they physically reside. It is the place of residence of an employee that determines the minimum wage, not the location of the business.

    See https://www.dol.gov/agencies/whd/mw-consolidated for the table of what each state has passed.

    Wikipedia has an article but I won't link to it because

    • it's not a primary source of information
    • it says "workers" not "employees" and the laws are pretty clear on that distinction

    As always, I'm not a lawyer, but when one starts to tell a business to pay ICs the same as FTEs one is showing a lack of understanding of what laws apply.

    None of this applies to farms. (see link above).

    Ehud
