Stop Blaming Algorithms For Misinformation And Threats To Democracy; The Real Problem Is Societal

from the fixing-the-wrong-thing dept

For quite some time now, we've pointed out that we should stop blaming technology for problems that are actually societal. Indeed, the deeper you look at nearly every "big tech problem," the more you tend to find it has to do with people, not technology. "Fixing" the technology isn't really going to fix anything when the technology isn't the real problem; if anything, many proposals to "fix" the tech industry seem likely to exacerbate the very problems we're discussing.

Of course, the "techlash" narrative is incredibly powerful, and the media has really run with it of late (as have politicians). So it's nice to see that Wired, at least, is starting to push back on the narrative. A new cover story by Gideon Lewis-Kraus makes it clear that "Bad Algorithms Didn't Break Democracy." It's a great article. It acknowledges the narrative, and even concedes that it's appealing at a surface level:

It’s easy to understand why this narrative is so appealing. The big social media firms enjoy enormous power; their algorithms are inscrutable; they seem to lack a proper understanding of what undergirds the public sphere. Their responses to widespread, serious criticism can be grandiose and smarmy. “I understand the concerns that people have about how tech platforms have centralized power, but I actually believe the much bigger story is how much these platforms have decentralized power by putting it directly into people’s hands,” said Mark Zuckerberg, in an October speech at Georgetown University. “I’m here today because I believe we must continue to stand for free expression.”

If these corporations spoke openly about their own financial interest in contagious memes, they would at least seem honest; when they defend themselves in the language of free expression, they leave themselves open to the charge of bad faith.

But as the piece goes on to highlight, this narrative doesn't really hold up -- despite many attempts to support it with actual evidence, that evidence is completely lacking:

Over the past few years, the idea that Facebook, YouTube, and Twitter somehow created the conditions of our rancor—and, by extension, the proposal that new regulations or algorithmic reforms might restore some arcadian era of “evidential argument”—has not stood up well to scrutiny. Immediately after the 2016 election, the phenomenon of “fake news” spread by Macedonian teenagers and Russia’s Internet Research Agency became shorthand for social media’s wholesale perversion of democracy; a year later, researchers at Harvard University’s Berkman Klein Center concluded that the circulation of abjectly fake news “seems to have played a relatively small role in the overall scheme of things.” A recent study by academics in Canada, France, and the US indicates that online media use actually decreases support for right-wing populism in the US. Another study examined some 330,000 recent YouTube videos, many associated with the far right, and found little evidence for the strong “algorithmic radicalization” theory, which holds YouTube’s recommendation engine responsible for the delivery of increasingly extreme content.

The article has a lot more in it -- and you should read the whole thing -- but it's nice to see it recognizes that the real issue is people. If there's a lot of bad stuff on Facebook, it's because that's what its users want. You have to be incredibly paternalistic to assume that the best way to deal with that is to have Facebook deny users what they want.

In the end, as it becomes increasingly untenable to blame the power of a few suppliers for the unfortunate demands of their users, it falls to tech’s critics to take the fact of demand—that people’s desires are real—even more seriously than the companies themselves do. Those desires require a form of redress that goes well beyond “the algorithm.” To worry about whether a particular statement is true or not, as public fact-checkers and media-literacy projects do, is to miss the point. It makes about as much sense as asking whether somebody’s tattoo is true. A thorough demand-side account would allow that it might in fact be tribalism all the way down: that we have our desires and priorities, and they have theirs, and both camps will look for the supply that meets their respective demands.

Just because you accept that preferences are rooted in group identity, however, doesn’t mean you have to believe that all preferences are equal, morally or otherwise. It just means our burden has little to do with limiting or moderating the supply of political messages or convincing those with false beliefs to replace them with true ones. Rather, the challenge is to persuade the other team to change its demands—to convince them that they’d be better off with different aspirations. This is not a technological project but a political one.

Perhaps it's time for a backlash to the techlash. And, at the very least, it's time that instead of just blaming the technology, we all take a closer look at ourselves. If it's a political or societal problem, we're not going to fix it (at all) by blaming Facebook.



Filed Under: algorithms, blame, disinformation, humans, misinformation, recommendations, societal problems


Reader Comments



  1. This comment has been flagged by the community.
    identicon
    Anonymous Coward, 28 Jan 2020 @ 9:36am

    Look up "cyber gangbanging" for why Section 230 is killing people. Gangs now fight on the internet and then kill each other IRL.

    link to this | view in thread ]

  2. identicon
    Anonymous Coward, 28 Jan 2020 @ 9:43am

    If politicians were honest they would be responsible.
    That has never happened in the history of man lol

    link to this | view in thread ]

  3. icon
    Thad (profile), 28 Jan 2020 @ 9:52am

    If there's a lot of bad stuff on Facebook, it's because that's what its users want.

    But there's certainly a feedback loop at work, too. Stories gain prominence because a lot of people are clicking on them, but a lot of people are clicking on them because they're displayed prominently.

    link to this | view in thread ]

  4. identicon
    Comboman, 28 Jan 2020 @ 10:11am

    ... and guns don't kill people.

    While technically true, this argument is about as intellectually honest as the "Guns don't kill people; People kill people" argument that the gun lobby use against gun regulation. Every tool has the ability to be misused, but some are far more dangerous and prone to misuse than others. The answer for both tech and guns is more oversight and regulation.

    link to this | view in thread ]

  5. identicon
    Anonymous Coward, 28 Jan 2020 @ 10:14am

    Re:

    The whole premise that if it's on Facebook, people must want it, is questionable. (Ads would be an obvious counterexample.) Facebook's algorithms are designed to show people whatever will make them interact with FB as much as possible, not what will provide them useful knowledge or improve society. They're also flawed like many algorithms, in that they only work with data that's easy to collect. FB can easily know whether you clicked. They don't know whether it provided enlightenment, or what the long-term effect on society will be.

    FB et al. aren't the problem, but their algorithms likely play a part. And there's no fundamental reason why everyone's Facebook interactions need to be governed by the same algorithm, or even by an algorithm written by FB.
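
    As a rough sketch of what "optimize for interaction" tends to mean in practice, consider something like the following. The signal names and weights are hypothetical, not anything Facebook actually uses; the point is that every input is something trivially logged, while "did this enlighten anyone" never appears.

        # Hypothetical engagement-first feed ranking, for illustration only.
        # Every feature here is cheap to log; none of them measures long-term value.
        from dataclasses import dataclass

        @dataclass
        class Post:
            clicks: int           # how often people clicked it
            comments: int         # replies it provoked
            reshares: int         # times it was passed along
            dwell_seconds: float  # time spent looking at it

        def engagement_score(post: Post) -> float:
            # Made-up weights; the shape of the function is the point: anything
            # that keeps people interacting scores well, informative or not.
            return (1.0 * post.clicks
                    + 2.0 * post.comments
                    + 3.0 * post.reshares
                    + 0.1 * post.dwell_seconds)

        def rank_feed(posts: list[Post]) -> list[Post]:
            # Show the most "engaging" items first -- the only objective here.
            return sorted(posts, key=engagement_score, reverse=True)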

    link to this | view in thread ]

  6. identicon
    Wyatt Derp, 28 Jan 2020 @ 10:27am

    Re: ... and guns don't kill people.

    tech is not a gun

    The answer is not more of the same; if there is an answer, it will be logical and address the root cause of that which afflicts you.

    This wagging of fingers at the thing you dislike is juvenile and dishonest.

    link to this | view in thread ]

  7. identicon
    Anonymous Coward, 28 Jan 2020 @ 10:30am

    Re:

    Democracy is the theory that the common people know what they want, and deserve to get it good and hard. - H.L. Mencken
    We are a nation of idiots yearning to be morons - me

    link to this | view in thread ]

  8. identicon
    Anonymous Coward, 28 Jan 2020 @ 10:31am

    Re: Re:

    What algorithm would you write that would solve the problem of not knowing whether a bit provided enlightenment or would improve society? How would you implement that?

    Stating a problem doesn't automatically make that problem solvable with a computer. Some problems cannot be solved by a computer.

    link to this | view in thread ]

  9. identicon
    Anonymous Coward, 28 Jan 2020 @ 10:46am

    the real problem, in my opinion, is governments! they are so scared of the people finding out what the lying, cheating, self-interested, two-faced fuckers in them are up to and who they are doing the bidding of, who they are helping, instead of the people, simply to keep a certain few in total control of the planet, not just countries! nothing other than greed, control and the fear of change is responsible for the crap we're in now, with a world getting quicker at destroying itself and us along with it! what good is it going to do being the richest guy on the Planet, having billions of dollars, when there is NO PLANET? and what makes things worse is that not only do we have the means to hold the Planet together, for the good of all, but those governments and 'friends' have employed the world's security services to help them blame everyone else but them for all the world's troubles, and then continue to fuck things up for all! where the hell is the sense in that? unless aliens have handed over the tech to enable fleeing to another Planet (so we can fuck that one up as well!) those who will leave need to remember, having all the money there is won't clean the crappers out, grow the food or mend the sick. with none of us 'plebs' with them, they'll have to get their own hands dirty for change!!

    link to this | view in thread ]

  10. identicon
    Anonymous Coward, 28 Jan 2020 @ 10:46am

    Re: ... and guns don't kill people.

    While technically true, this argument is about as intellectually honest as the "Guns don't kill people; People kill people" argument that the gun lobby use against gun regulation.

    The fundamental idea of the argument is sound. The gun is just a tool. It doesn't choose what to shoot at. The person with the murderous intent is the actual problem. This article is making a similar point. The algorithm is the tool. The person with the malicious intent is the actual problem.

    That said, I don't agree with how the gun lobby chooses to use the argument, because it seems they only proceed in a half-measure. "Guns don't kill people, so guns should be unrestricted." Unfortunately, this ignores the "People kill people" side of the argument, for which (as possible examples) we could identify and correct cultural/societal issues that contribute to violent crime, try to provide a troubled person with the help they need so that they never resort to killing, and/or recognize that a particular person would simply be far too dangerous in possession of a gun, so they shouldn't be allowed to have one.

    link to this | view in thread ]

  11. identicon
    Anonymous Coward, 28 Jan 2020 @ 10:49am

    Re: ... and guns don't kill people.

    If your answer is just a vague "regulation" without any elaboration, you don't really have an answer; you have a gussied-up "Do something! This is something, so we need to do it!" politician's syllogism for knee-jerk policies.

    link to this | view in thread ]

  12. icon
    Koby (profile), 28 Jan 2020 @ 10:58am

    Blame

    With regard to the 2016 election, not blaming Facebook for allowing a few memes published by Russian trolls means that Democrats would blame themselves for their poor candidates and ideas. In the future, not blaming Facebook for engaging in censorship will mean that Republicans must blame themselves for their poor candidates and ideas. These two things seem so unlikely that I just don't foresee either happening. It's just too easy to "blame Facebook".

    link to this | view in thread ]

  13. identicon
    Anonymous Coward, 28 Jan 2020 @ 10:58am

    Technology amplifies human actions, including actions that cause harm. The fallibility of human nature, and our capacity for evil has proven to be an intractable problem that we've been dealing with for centuries -- and human nature hasn't really changed that much, for all that effort. But it's certainly rhetorically handy to insist that we fundamentally improve human behavior before making any other attempt to reduce harm. But apparently harm reduction is completely off the table until we figure out how to make people not be greedy, selfish, and dishonest.

    Ultimately, that's what baffles me about Masnick's constant drumbeat of do-nothingism: his utter and complete dismissal of any attempt to reduce the harm that results from the technological amplification of bad actions by bad actors. Why promote this view? Why insist on digging in to preserve a harmful status quo? Just what is your game, Masnick? Simple motivated reasoning? An attempt to salvage a techno-optimist philosophy left in tatters by the world's steady decline into a (technologically-enabled) dystopian future?

    link to this | view in thread ]

  14. identicon
    Anonymous Coward, 28 Jan 2020 @ 11:03am

    Re: Re: Re:

    People could choose one or more algorithms from an open market of algorithms, to see if they deliver what they actually want, rather than being chained to one interest's marketing algorithm of choice.
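
    Mechanically, that could look something like the sketch below: the platform hands over the candidate posts, and the reader picks the ranker. The interface and the example rankers are invented for illustration, not any platform's real API, and timestamps are assumed to be plain numbers.

        # Hypothetical pluggable-ranking interface (the "open market of algorithms").
        # The platform supplies candidate posts; the user chooses the ranker.
        from typing import Callable, Iterable

        Post = dict  # stand-in for whatever a platform's real post object is
        Ranker = Callable[[Iterable[Post]], list]

        def chronological(posts: Iterable[Post]) -> list:
            # Newest first, nothing else considered.
            return sorted(posts, key=lambda p: p["timestamp"], reverse=True)

        def friends_first(posts: Iterable[Post]) -> list:
            # Friends ahead of everyone else, newest first within each group.
            return sorted(posts, key=lambda p: (not p["from_friend"], -p["timestamp"]))

        # A registry anyone could publish into -- the "market" part.
        RANKERS: dict = {
            "chronological": chronological,
            "friends_first": friends_first,
        }

        def build_feed(posts: Iterable[Post], choice: str) -> list:
            # The user's chosen algorithm, not the platform's, decides the order.
            return RANKERS[choice](posts)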

    link to this | view in thread ]

  15. identicon
    Anonymous Coward, 28 Jan 2020 @ 11:05am

    Re: ... and guns don't kill people.

    What regulation? More pointless idiotic regulation, or something based on outcome evidence which can actually help? Still waiting on a suggestion for what that might actually be...

    link to this | view in thread ]

  16. identicon
    Anonymous Coward, 28 Jan 2020 @ 11:07am

    Re:

    What actual thing would you do? What, pray tell, does your do-somethingism suggest?

    link to this | view in thread ]

  17. identicon
    Anonymous Coward, 28 Jan 2020 @ 11:12am

    Re: Re: Re: Re:

    Are those algorithms run on the servers, or is the fire hose of a data stream fed to every user? Note, both solutions run into scaling problems.

    link to this | view in thread ]

  18. icon
    crade (profile), 28 Jan 2020 @ 11:36am

    Re: Re: Re: Re:

    Which is exactly what happened. People chose and continue to choose. That's how the current market leaders became and remain such. As soon as someone else does it better, Google adapts or becomes Yahoo or Lycos and Facebook adapts or becomes GeoCities or MySpace.

    link to this | view in thread ]

  19. icon
    Stephen T. Stone (profile), 28 Jan 2020 @ 11:37am

    his utter and complete dismissal of any attempt to reduce the harm that results from the technological amplification of bad actions by bad actors

    If you can think of a way to reduce said harm that doesn’t…

    1. cause more harm than it prevents

    2. infringe upon civil rights

    3. attack a symptom instead of the disease

    …well, champ, now would be the time to offer your thoughts.

    link to this | view in thread ]

  20. identicon
    Anonymous Coward, 28 Jan 2020 @ 11:57am

    Re: Re: Re:

    Stating a problem doesn't automatically make that problem solvable with a computer. Some problems cannot be solved by a computer.

    That's the point. Determining what people want (or more abstractly what they should want) is not solvable by computer. So we shouldn't be claiming that Facebook shows people what they want, as if that is a solved problem.

    As for the more general "what would you do" question, it's an open research question I'm not qualified to write an algorithm for. But right now, only Facebook employees even have the chance. Letting university researchers provide new algorithms for people to try on their Facebook data could produce valuable research. Allowing interface tweaks, like feedback more nuanced than "like" (e.g., ranking education and humor separately), could help.
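
    To make the interface-tweak idea concrete, a sketch of multi-dimensional feedback might look like the code below. The axis names and weights are made up; the point is that the reader, not the platform, decides which dimensions count.

        # Hypothetical multi-axis feedback instead of a single "like" button.
        from dataclasses import dataclass, field

        @dataclass
        class Feedback:
            educational: int = 0  # "I learned something" votes
            funny: int = 0        # "this made me laugh" votes
            outrage: int = 0      # "this just made me angry" votes

        @dataclass
        class Post:
            text: str
            feedback: Feedback = field(default_factory=Feedback)

        def personal_score(post: Post, weights: dict) -> float:
            # The reader sets the weights -- e.g. down-weighting outrage even
            # though outrage might drive the most raw engagement.
            f = post.feedback
            return (weights.get("educational", 0.0) * f.educational
                    + weights.get("funny", 0.0) * f.funny
                    + weights.get("outrage", 0.0) * f.outrage)

        def my_feed(posts: list, weights: dict) -> list:
            # Highest personal score first.
            return sorted(posts, key=lambda p: personal_score(p, weights), reverse=True)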

    link to this | view in thread ]

  21. identicon
    Anonymous Coward, 28 Jan 2020 @ 12:00pm

    Re: Re: Re: Re: Re:

    Which is exactly what happened. People chose and continue to choose. That's how the current market leaders became and remain such.

    That's "choice" in the same way as choosing an ISP or a politician. A thin veneer of competition is not the same as meaningful control. This case is actually worse—unlike ISPs, I can't even switch platforms unless the people I communicate with also switch.

    link to this | view in thread ]

  22. identicon
    Anonymous Coward, 28 Jan 2020 @ 12:00pm

    Re: Re:

    Well, at least you concede the basic point that the current status quo is untenable. It's a start at least!

    link to this | view in thread ]

  23. icon
    That Anonymous Coward (profile), 28 Jan 2020 @ 12:01pm

    An entire generation raised with the ideals of nothing is your fault, someone else should fix it, & they should pay for your idiocy.

    Well that man wouldn't have mailed bombs to congresspeople if FB had limited his exposure to all the hateful rhetoric!!
    Then they launch into screaming fits when someone suggests their "bleach enemas cure autism" group gets flagged.

    No one is coming to save you, stop expecting it.
    No one is responsible for you, except you.

    Yes its hard to learn to filter all the information coming at you, but refusing to try & demanding tech magically fix it for you leads to more stupid people.

    People scream at Twitter b/c they let the nazis, racists, etc etc roam free... they mass report them... then are SHOCKED just shocked when they end up mass reported.

    It is not a game you can "win". You don't have to crush your enemies underfoot and hear the lamentations of their women.
    But human nature is "me me me me me" (disbelieve? watch people trying to merge into traffic & the battle of wills over someone feeling that someone getting ahead of them means they lost... so they will cause larger traffic snarls to "win". They end up making themselves even later/slower but inside they think they won b/c they stayed ahead of the other guy)

    It is easier to blame tech than to admit some people suck & move on.
    It's a war & winning is all that matters... even if winning means creating more tools that can be used against us in ways we never considered & then whine about.

    Stop abdicating responsibility outside of yourself for yourself.

    link to this | view in thread ]

  24. icon
    crade (profile), 28 Jan 2020 @ 12:05pm

    Re: Re: ... and guns don't kill people.

    This particular article is more along the lines of
    "mirrors don't cause acne"
    than
    "guns don't kill people"

    It's not trying to say people are using social media to cause the problem, it's saying the problem is there regardless of social media

    link to this | view in thread ]

  25. identicon
    Anonymous Coward, 28 Jan 2020 @ 12:20pm

    Re:

    I should probably have mentioned that I generally agree in principle with the idea that over-reacting can go badly. My chief problem is with Masnick's obstinacy on this specific subject.

    In terms of what specific actions I'd take (assuming you're asking in good faith, and this isn't some attempt to draw me into some round-and-round of deflection and nitpicking) -- well, that's really a better question for someone with access to a thinktank, but off the top of my head, it might be worth revisiting Facebook's 2011 consent decree and the many privacy violations they've committed since then. They've done a lot of wrong (as Masnick has repeatedly stated) but they have never faced anything approaching a serious consequence for any of it. Tiny fines (relative to their revenues anyway) can be easily brushed off; you've heard of nuisance lawsuits? Well, Facebook is big enough to think in terms of "nuisance regulations". But we've got to start somewhere -- Masnick seems not to even want to start.

    What's so galling about this is that Techdirt has historically been so good about holding ISPs, Telcos, and copyright monopolists to account, but social media gets a huge shrug for some reason. I hate to see a valuable source of news and insight periodically interrupted by these plaintive defenses of massive companies that already have armies of lawyers and PR flacks working on that job.

    link to this | view in thread ]

  26. icon
    ECA (profile), 28 Jan 2020 @ 12:34pm

    FB

    FB is interesting, and there are options in the program to NOT see certain adverts. Even YT has them.
    Throw out a few adverts to see the response, and if it's passed around, then see if others like it.

    The biggest thing about the net comes with the idea that it's interactive. And if there is an address to send comments, DO IT.
    Those sites that lock down when you go to them and demand you register and pay money are idiots. And sites that don't listen to "customers" or give them access to email or phone are even more idiotic.

    link to this | view in thread ]

  27. icon
    ECA (profile), 28 Jan 2020 @ 12:37pm

    Our news worthy...

    are just finger-pointing 2-year-olds.
    They know nothing of the world; we give them candy (money) to shut them up, and they keep bothering other people...
    Time for a long nap.

    link to this | view in thread ]

  28. icon
    urza9814 (profile), 28 Jan 2020 @ 12:41pm

    The obvious question

    I'm a bit curious about the methodology here and what precisely they're looking at.

    Facebook's algorithm is obviously tuned to provide whatever will keep you on Facebook. That's profitable for them. In a sense, that's giving you what you want. But what you want in order to keep browsing Facebook is not necessarily the same as what you want in life in general. Someone with a strong enough compulsion to try to correct idiots will stay on Facebook forever if you keep feeding them posts from idiots, but that probably isn't actually how they want to spend the rest of their life. If you start from an assumption that what people want is exactly what they click and spend time viewing, then you're already measuring it wrong.

    So, are the studies mentioned measuring what content people actually desire to consume, or are they measuring what content will keep people tethered to their current activity? I don't think these are the same thing, and if they're measuring the same (incorrect) value that social media optimizes for, then obviously their research would indicate that social media isn't the issue.

    People can have conflicting desires. People like to be lazy; people also like the sense of fulfillment that comes from being productive. People like to eat double bacon cheeseburgers but also want to be fit and healthy. These things don't have to create desires out of nothing in order to be harmful; they can do plenty of damage simply by amplifying the parts of yourself that you'd rather suppress.

    That's not to say I'm in favor of banning social media or anything, although that IS why I haven't really touched it myself in 2-3 years. In my ideal world, we'd all be using diaspora* or something, and could experiment a lot more in terms of what truly makes a good social networking platform. But if we're going to stick with these monopolistic walled gardens, there does need to be some regulation. They're looking very much like a drug to me right now, so maybe we ought to regulate them as one. Not Schedule I -- nothing should be regulated like Schedule I -- but maybe more like Advil.

    link to this | view in thread ]

  29. icon
    Stephen T. Stone (profile), 28 Jan 2020 @ 12:45pm

    This is a good answer. 👍

    link to this | view in thread ]

  30. icon
    urza9814 (profile), 28 Jan 2020 @ 12:55pm

    Re: Re: Re: ... and guns don't kill people.

    ...are you implying that people wouldn't be killing each other if nobody had guns?

    The gun analogy seems perfect to me. People kill each other. Some of them use guns, some of them don't. Nobody buys a gun and goes "Well, I wasn't planning on killing anyone before I bought this, but I guess I'll have to now!"

    People believe awful things. Some of them get/post those awful things from/to social media, some of them don't. Nobody gets on social media and goes "Well, I wasn't a white supremacist before, but now that I've got Facebook I sure will be!"

    link to this | view in thread ]

  31. This comment has been flagged by the community.
    icon
    Zof (profile), 28 Jan 2020 @ 1:06pm

    Those poor multinational corporations

    Who will speak for the wealthy folks that keep getting caught screwing us over? Who will defend the indefensible?

    This site. That’s who. Keep up the same work!

    link to this | view in thread ]

  32. identicon
    Anonymous Coward, 28 Jan 2020 @ 1:18pm

    Re: Blame

    With regard to the 2016 election, not blaming Facebook for allowing a few memes published by Russian trolls means that Democrats would blame themselves for their poor candidates and ideas.

    Exactly! They took perhaps the most toxic candidate of all time -- someone so abjectly awful that she ended up losing to freaking Donald Trump of all people! -- allowed her to flat-out steal a primary that Bernie Sanders was the clear winner of before the DNC put its thumb on the scale, and then were somehow surprised when she lost. And they've spent the last 3 and a half years blaming anything and everything else, frantically trying to avoid the truth: they lost because they ran a horrible candidate who never should have been there in the first place.

    link to this | view in thread ]

  33. icon
    crade (profile), 28 Jan 2020 @ 1:28pm

    Re: Re: Re: Re: ... and guns don't kill people.

    No. From what I can tell from the summary in this post, the article seems to be contending that people are NOT being radicalized through social media. The article seems to be claiming that not only is social media not "to blame," it isn't much of a factor at all: it isn't really having much effect on whether people are radicalized or not, so it's not even a tool that people are using for radicalization (the way a gun is a tool used to kill people). Instead it's just a tool that reveals how people are.

    "abjectly fake news “seems to have played a relatively small role in the overall scheme of things"

    "A recent study by academics in Canada, France, and the US indicates that online media use actually decreases support for right-wing populism in the US"

    link to this | view in thread ]

  34. icon
    crade (profile), 28 Jan 2020 @ 1:36pm

    Re: Re: Re: Re: Re: ... and guns don't kill people.

    Not my personal view...
    I find the claim that online media use decreases support for right-wing populism in the U.S. kind of hard to buy myself. I'd guess if you checked the math it's something more along the lines of a correlation than cause and effect.

    link to this | view in thread ]

  35. identicon
    Anonymous Coward, 28 Jan 2020 @ 1:52pm

    Re: Those poor multinational corporations

    Are you confusing TD with GOP?

    link to this | view in thread ]

  36. icon
    Thad (profile), 28 Jan 2020 @ 2:00pm

    Re: The obvious question

    That's a fair point. Facebook and the other big social networking sites are designed to encourage compulsive behavior.

    I'd click "Insightful", but that's exactly the kind of compulsive behavior we're talking about.

    link to this | view in thread ]

  37. identicon
    Bobvious, 28 Jan 2020 @ 2:17pm

    Re: Re: DEEPSTATE confirmed!

    Face Book Interactions

    link to this | view in thread ]

  38. icon
    Stephen T. Stone (profile), 28 Jan 2020 @ 2:40pm

    You can’t blame only Hillary and the DNC for the failures of 2016. Doing so implies that things like the Republicans running interference for Trump with the investigations into Clinton, the Russian mis- and disinformation campaigns, and the James Comey “October surprise” had absolutely no influence in the results of that election. To believe such implications is irresponsible at best and willfully ignorant at worst.

    I’m no fan of Hillary Clinton or the DNC. They ran a bad campaign against a candidate that should have been easy to defeat. And yes, those failings were a big factor in Clinton’s loss — but they were not the only factor.

    link to this | view in thread ]

  39. icon
    Stephen T. Stone (profile), 28 Jan 2020 @ 2:41pm

    Who will speak for the wealthy folks that keep getting caught screwing us over? Who will defend the indefensible?

    Republicans.

    link to this | view in thread ]

  40. identicon
    Anonymous Coward, 28 Jan 2020 @ 2:49pm

    The technology is the problem if you understand how some of it works. However, it isn't computers or Facebook's pages.

    link to this | view in thread ]

  41. identicon
    bob, 28 Jan 2020 @ 3:04pm

    Re: Re:

    Doing something just for the sake of doing something is a waste of resources, especially when your limited resources have to be spread between multiple issues. So doing nothing might be a better choice. Doing nothing allows you to instead concentrate on other, bigger issues with your limited resources.

    I like your idea of actually having meaningful punishment for companies that abuse their power/position. But from what I've seen there isn't currently anyone in a regulatory position to hand out those meaningful punishments.

    link to this | view in thread ]

  42. identicon
    Anonymous Coward, 28 Jan 2020 @ 3:40pm

    Re: Those poor talking point slingers

    What feels worse, bro? When people actually pay attention to you and take apart your stupid argument in one word, or when it isn't even worth typing out that much?

    link to this | view in thread ]

  43. identicon
    Rocky, 28 Jan 2020 @ 3:40pm

    Re:

    The root of the problem is NOT people clicking and believing stupid stuff. The root IS societal, since we as a society don't seem to be able to educate people in factual and critical thinking.

    As long as a large part of the population doesn't get the "mental" tools to dismiss bullshit, this problem will persist regardless of any algorithms deployed.

    link to this | view in thread ]

  44. This comment has been flagged by the community.
    icon
    Zof (profile), 28 Jan 2020 @ 3:44pm

    Re:

    The biggest factor in Hillary Clinton losing the most winnable election in history, I think, is all the lies: when half the DNC convention walked out, CNN got caught lying for her and tried to pretend it wasn't happening, despite it being all over social media with video and picture evidence. Oh, and she would just be President right now if she had removed her head from her huge ass and made Sanders her VP pick. No question.

    link to this | view in thread ]

  45. This comment has been flagged by the community.
    icon
    Zof (profile), 28 Jan 2020 @ 3:46pm

    One wonders when tech companies are ever responsible for actions

    This site sure has a way to make sure they aren't.

    link to this | view in thread ]

  46. identicon
    Anonymous Coward, 28 Jan 2020 @ 4:09pm

    Re: Re:

    "most winnable election in history"

    The data supporting this allegation would be very interesting, or perhaps it was just in jest. I hope they do not repeat the same mistakes.

    link to this | view in thread ]

  47. identicon
    Chuck Sod, 28 Jan 2020 @ 4:13pm

    Re: Get Off My Lawn

    "An entire generation raised with the ideals of nothing is your fault"

    Every older generation seems to accuse the younger generation of these types of offenses.

    link to this | view in thread ]

  48. identicon
    Anonymous Coward, 28 Jan 2020 @ 4:15pm

    Re: One wonders when tech companies are ever responsible for act

    Examples?

    link to this | view in thread ]

  49. icon
    Stephen T. Stone (profile), 28 Jan 2020 @ 4:43pm

    How is Twitter responsible for the actions of a third party?

    link to this | view in thread ]

  50. identicon
    bob, 28 Jan 2020 @ 5:28pm

    Re:

    Ok boomer. ;P

    link to this | view in thread ]

  51. identicon
    Anonymous Coward, 28 Jan 2020 @ 8:00pm

    Re:

    Because they won’t stop liking things that zof doesn’t.

    link to this | view in thread ]

  52. icon
    laminar flow (profile), 28 Jan 2020 @ 9:59pm

    If people weren't so credulous and gullible there wouldn't be a problem. Politicians are attacking tech because they don't want to acknowledge this reality. The reason is obvious: politicians, their enablers in the media, and the government in general need such an unquestioning public to swallow the endless deluge of bullshit they produce with as little resistance as possible.

    link to this | view in thread ]

  53. icon
    techflaws (profile), 28 Jan 2020 @ 10:15pm

    Re: Re:

    So Hillary's truth-to-lie ratio was higher than the Orange Cheeto's? Really?

    link to this | view in thread ]

  54. icon
    techflaws (profile), 28 Jan 2020 @ 10:16pm

    Re: Re: Re:

    d'uh. lower.

    link to this | view in thread ]

  55. icon
    techflaws (profile), 28 Jan 2020 @ 10:18pm

    Re: One wonders when tech companies are ever responsible for act

    Wow! I wasn't aware of the power this blog actually wields (while at the same time allowing clueless asshats like you to spout their nonsense). Thanks for pointing that out.

    link to this | view in thread ]

  56. icon
    Mike Masnick (profile), 28 Jan 2020 @ 11:25pm

    Re:

    Technology amplifies human actions, including actions that cause harm.

    Agreed.

    The fallibility of human nature, and our capacity for evil has proven to be an intractable problem that we've been dealing with for centuries -- and human nature hasn't really changed that much, for all that effort. But it's certainly rhetorically handy to insist that we fundamentally improve human behavior before making any other attempt to reduce harm.

    I did not say that we should improve human nature before we reduce harm. I questioned how much "harm" is actually caused by tech.

    But apparently harm reduction is completely off the table until we figure out how to make people not be greedy, selfish, and dishonest.

    That is not what I said, nor implied. I said that the problem is that we blame technology for human problems -- and if you focus "harm reduction" on technology, it will inevitably fail, as it is not targeting the actual problem.

    Ultimately, that's what baffles me about Masnick's constant drumbeat of do-nothingism

    I have never, ever suggested "do nothingism"

    his utter and complete dismissal of any attempt to reduce the harm that results from the technological amplification of bad actions by bad actors

    I have not "dismissed" such attempts. I have pointed out why they will often cause more damage than good. Present me with a plan that ACTUALLY reduces harm, and I'm eager to hear it out. I've talked, for example, about why I think the protocols approach would reduce such harm. I also, similarly, appreciate Cory Doctorow's idea of adversarial interoperability.

    My complaint is that most "harm reduction" seems to actually be designed in a manner to simply lock up the market for Facebook and Google.

    Why insist on digging in to preserve a harmful status quo?

    I have never done this.

    link to this | view in thread ]

  57. icon
    Mike Masnick (profile), 28 Jan 2020 @ 11:29pm

    Re: Re:

    In terms of what specific actions I'd take (assuming you're asking in good faith, and this isn't some attempt to draw me into some round-and-round of deflection and nitpicking) -- well, that's really a better question for someone with access to a thinktank, but off the top of my head, it might be worth revisiting Facebook's 2011 consent decree and the many privacy violations they've committed since then.

    The 2019 consent decree literally did exactly that. I wrote about it and did a two part podcast series literally walking through the new consent decree (which is an update to the earlier consent decree directly calling out the privacy violations since the original one). So, uh... done.

    They've done a lot of wrong (as Masnick has repeatedly stated) but they have never faced anything approaching a serious consequence for any of it. Tiny fines (relative to their revenues anyway) can be easily brushed off; you've heard of nuisance lawsuits? Well, Facebook is big enough to think in terms of "nuisance regulations". But we've got to start somewhere -- Masnick seems not to even want to start.

    $5 billion is not a tiny fine. It is more than all other FTC fines of that nature combined. Separately, the new consent decree has many, many more parameters that are very much in the vein of "serious consequences." We went over this line by line in the podcast.

    What's so galling about this is that Techdirt has historically been so good about holding ISPs, Telcos, and copyright monopolists to account, but social media gets a huge shrug for some reason.

    Because there needs to be actual evidence -- and so far I haven't seen evidence that makes sense of how any of these approaches actually reduces harm, rather than locking in the giants.

    link to this | view in thread ]

  58. icon
    That Anonymous Coward (profile), 29 Jan 2020 @ 12:02am

    Re: Re: Get Off My Lawn

    Except I have the receipts....

    It's not MY fault I drove recklessly.

    https://www.techdirt.com/articles/20180613/16333340033/section-230-cant-save-snapchat-lawsuit-involving-speed-filter.shtml

    Apple has a patent, they should pay us.

    https://www.techdirt.com/articles/20161229/10472736367/victims-car-crash-sue-apple-not-preventing-distracted-driver-hitting-their-vehicle.shtml

    Happy Meals made my kids fat, not me for refusing to parent them and say no.

    https://www.latimes.com/archives/la-xpm-2010-nov-02-la-fi-happy-meals-20101103-story.html

    You have the deep pockets!!! PAY ME for what my crazy Ex did!!

    https://www.techdirt.com/articles/20190118/01175141419/herrick-v-grindr-section-230-case-thats-not-what-youve-heard.shtml

    It's not MY fault I slept with the 13 yr old!!!

    https://www.techdirt.com/articles/20150316/17500630332/no-you-cant-sue-grindr-because-it-hooked-you-up-with-13-year-old-sex.shtml

    It's your fault, pay us!!!!!!!!!

    https://www.techdirt.com/articles/20191209/20241743537/losing-streak-continues-litigants-suing-social-media-companies-over-violence-committed-terrorists.shtml

    Any Backpage lawsuit. We were trafficked!!!!!
    No no one from the company did it, but they have money pay us!!

    Lets sue the hotels!! They didn't pimp us out, but our former pimp doesn't have lots of cash. We never asked for help, but they should have known we needed it.

    https://www.nhpr.org/post/victim-sex-trafficking-files-federal-suit-against-hotel-chains#stream/0

    link to this | view in thread ]

  59. identicon
    Anonymous Coward, 29 Jan 2020 @ 1:40am

    Re: Re:

    Also, the problem is that politicians, and those who consider public spaces to be dens of iniquity, are able to see the conversations that were previously hidden from their view.

    link to this | view in thread ]

  60. identicon
    Anonymous Coward, 29 Jan 2020 @ 5:29am

    Re: Those poor multinational corporations

    Who will speak for the wealthy folks that keep getting caught screwing us over?

    The same shit-flinging morons who vote for Donald Trump.

    link to this | view in thread ]

  61. identicon
    Anonymous Coward, 29 Jan 2020 @ 5:30am

    Re: One wonders when tech companies are ever responsible for act

    One wonders when tech companies are ever responsible for actions

    When gun manufacturers are responsible for theirs.

    link to this | view in thread ]

  62. identicon
    Anonymous Coward, 29 Jan 2020 @ 5:35am

    Re:

    Do tell.

    link to this | view in thread ]

  63. identicon
    Anonymous Coward, 29 Jan 2020 @ 5:40am

    Re: Re: One wonders when tech companies are ever responsible for

    "When gun manufacturers are responsible for theirs"

    Not sure what offenses you are referring to.

    1) promoting lax gun laws?
    2) encouraging the proliferation of weapons?
    3) suggesting unsafe weapon use?
    4) ???

    link to this | view in thread ]

  64. icon
    Wendy Cockcroft (profile), 29 Jan 2020 @ 6:55am

    Re: Section 230 puts the blame on posters, not platforms

    Section 230 is not responsible for that, the gangs and their members are. Litmus test: who uploads and posts the internet fighting items -- the platform or the gang members?

    If you're blaming the platform for not taking the posts down quickly enough you're basically asking for every internet user to have their comments checked before their posts go live. That means yours. Is that what you want?

    link to this | view in thread ]

  65. icon
    Thad (profile), 29 Jan 2020 @ 7:42am

    Re: Re:

    And the root of the problem with auto safety is bad drivers, but that doesn't mean there's nothing auto manufacturers can do to mitigate the issue.

    link to this | view in thread ]

  66. identicon
    Chuck Sod, 29 Jan 2020 @ 9:54am

    Re: Re: Re: Get Off My Lawn

    Where's the link to the story about how these things, or their equivalent, never happened before?

    link to this | view in thread ]

  67. identicon
    Christenson, 29 Jan 2020 @ 10:00am

    Forbidden Planet Movie

    All this reminds me of the 1950s(?) movie Forbidden Planet...

    ... in which the Krell mind machine gave its owners unlimited power to do anything...without moral consequences...and the Krell were destroyed overnight.

    The internet has the same problem... it amplifies lots of communication and lots of biases, both good and bad, and we haven't yet figured out how to deal with that, as the net transitions from a small elite to the entire world. On top of that, we have a generation (at least in the US) growing up poorer than its baby boomer parents, so this is a recipe for trouble.

    Sometimes it's hard to tell if the cops are getting worse or our information is getting better, for one simple example...

    link to this | view in thread ]

  68. identicon
    Anonymous Coward, 29 Jan 2020 @ 10:26am

    Way to go, Techdirt. Censoring my comments. I hope you feel great about it. My voice represents the majority of Americans who are so fed up with these politicians destroying democracy in our country, and now you take the ugly far-left view of those hellbent on destroying that opposing view. Hurrah.

    link to this | view in thread ]

  69. identicon
    Anonymous Coward, 29 Jan 2020 @ 11:05am

    Re:

    Look up "cyber gangbanging" for why Section 230 is killing people.

    They did that before the internet. Why is the internet and Section 230 suddenly responsible for it? As Wendy pointed out, it's not the platforms posting that stuff online, it's the gangs. If the gangs stop, the problem goes away.

    Gangs now fight on the internet and then kill each other IRL.

    Uh, what? Fighting online has no direct effect on real life. The only thing that would get someone killed in real life is if someone steps away from their computer and goes out and kills someone. That's their choice, the internet has nothing to do with it.

    link to this | view in thread ]

  70. identicon
    Anonymous Coward, 29 Jan 2020 @ 1:57pm

    Re: Blame

    Facebook makes it so easy to blame Facebook. It's hard not to want to just sit back and watch.

    link to this | view in thread ]

  71. identicon
    Anonymous Coward, 29 Jan 2020 @ 3:23pm

    coming pandemic?

    link to this | view in thread ]

  72. identicon
    Anonymous Coward, 29 Jan 2020 @ 5:57pm

    Re: Re: Section 230 puts the blame on posters, not platforms

    John Smith can't get his mailing list up anymore. Good, because the last thing we need is Weinstein fans like him in the gene pool.

    link to this | view in thread ]

  73. identicon
    Rocky, 29 Jan 2020 @ 6:04pm

    Re: Re: Re:

    False equivalence, Thad.

    To get a driving license you actually have to take a test which weeds out the people who are totally unqualified to use a vehicle. And if you do something particularly stupid and cause an accident you can lose your license or even your freedom.

    The worst that can happen on the internet is that someone moderates your rant or kicks you off their platform, all the while you are consuming junk science and fake news.

    link to this | view in thread ]

  74. icon
    Thad (profile), 30 Jan 2020 @ 10:04am

    Re: Re: Re: Re:

    False equivalence, Thad.

    It's not equivalence, it's an analogy.

    To get a driving license you actually have to take a test which weeds out the people who are totally unqualified to use a vehicle. And if you do something particularly stupid and cause an accident you can lose your license or even your freedom.

    All of which is true but completely irrelevant to the point of the analogy. Which is that just because something is some kind of intermediary step and not the root cause of a problem doesn't mean that there's no value in anticipating and mitigating problems that occur at that intermediary step.

    Do you have any criticisms of that statement, which was the entire purpose of the analogy? Or are you just going to nitpick ways in which the two things are dissimilar but which are completely irrelevant to the point of comparison? Because if all you can address is the latter, that suggests you can't think of any criticisms of the former.

    The worst that can happen on the internet is that someone moderates your rant or kicks you off their platform, all the while you are consuming junk science and fake news.

    Wait, are you arguing that platforms should moderate, or are you arguing that they shouldn't bother because the problem is human nature, not social media?

    link to this | view in thread ]

  75. identicon
    Rocky, 30 Jan 2020 @ 3:16pm

    Re: Re: Re: Re: Re:

    Regardless, you totally missed my original point, and you actually agreed with it when you used the word "mitigate", i.e. if you can only mitigate a problem, it will still persist.

    link to this | view in thread ]

  76. icon
    Scary Devil Monastery (profile), 4 Feb 2020 @ 6:18am

    Re: Re: Re: Re: Re:

    "Which is that just because something is some kind of intermediary step and not the root cause of a problem doesn't mean that there's no value in anticipating and mitigating problems that occur at that intermediary step."

    There is, when the two media in question cannot be held up as analogues of one another.

    Example: car manufacturers can build ABS brakes, air bags, seat belts, parking cameras, collision detectors, and GPS into their cars.

    Let me know when a social platform has the ability to tell you when you're getting too close to another poster, about to collide with another, and can mitigate or prevent the damage from the resulting impact without a massive amount of 3rd-party interference.

    OTOH a social platform can decide you won't be posting on their platform any more. Let me know when Ford can sell you a car, then tell you you aren't going to drive it any longer because they think you're a twit.

    In the one case you're operating a device you bought and own with a transparent set of safeguards included.
    In the other case you're walking into another party's living room and using a bullhorn to trash talk assorted 3rd parties.

    So let's be clear about this: comparing car OEMs with online social platforms is not a good analogy.

    "Wait, are you arguing that platforms should moderate, or are you arguing that they shouldn't bother because the problem is human nature, not social media?"

    Why would this be an either/or from your view?

    Yes, it's a human problem, not the problem of social media.
    Yes, platforms should probably moderate.

    If anything a more proper analogy is that of a bar where people sometimes brawl in the corners and the bouncers are kept busy trying to keep the guy who keeps taking off his pants and shitting on the dance floor out.

    Depending on the type of bar there are varying degrees of corners, brawls, bouncers, and dance floor shitters.

    link to this | view in thread ]

  77. icon
    Scary Devil Monastery (profile), 4 Feb 2020 @ 6:25am

    Re: Re: Blame

    "They took perhaps the most toxic candidate of all time -- someone so abjectly awful that she ended up losing to freaking Donald Trump of all people!"

    Not that bad. Hillary wasn't worse than most other presidential candidates in the last twenty years...but certainly not much better either.

    The primary problem was that there are plenty of young, dedicated, skilled female politicians in the Democratic Party they could have pushed, but instead they went for the one most guaranteed never to challenge party policy.

    She was pushed, in other words, because the Dems had nothing except the vague idea that since a black president worked, they might as well go for a repeat performance with a woman.

    Bernie would have won them the White House for two terms without issue, but the man's incorruptible and stubborn, which means to the Democratic Party he's the choice which must never be allowed to win.

    link to this | view in thread ]

  78. icon
    Scary Devil Monastery (profile), 4 Feb 2020 @ 6:31am

    Re:

    "But apparently harm reduction is completely off the table until we figure out how to make people not be greedy, selfish, and dishonest."

    As usual, Baghdad Bob, all you've got here is a wordwall wrapped around the conceptual turd that free speech is bad and people shouldn't be trusted with it.

    No, harm reduction is very much ON the table, but your idea of having the platforms be responsible for everything 3rd parties might say isn't it.

    The analogy to your oft-touted suggestions would be to have people selectively banned from speaking unless they could find someone to verify that what they intended to say wasn't offensive to a vested interest.

    You really think you will EVER try to post that garbage around here and not have saner minds correcting you immediately, Bobmail? Or are you wearing the Jhon Smith jacket today?

    link to this | view in thread ]

  79. identicon
    Edward Bernays, 5 Feb 2020 @ 11:13am

    what its users "want"

    "If we understand the mechanisms and motives of the group mind, it is now possible to control and regiment the masses according to our will without their knowing it In almost every act of our daily lives, whether in the sphere of politics or business, in our social conduct or our ethical thinking, we are dominated by the relatively small number of persons who understand the mental processes and social patterns of the masses. It is they who pull the wires which control the public mind."
    -Edward Bernays

    "there's a lot of bad stuff on Facebook, it's because that's what its users want"

    -Mike Masnick

    link to this | view in thread ]

  80. identicon
    Anonymous Coward, 7 Feb 2020 @ 6:23am

    Re:

    Your tin-foil is showing. I submit to you Hanlon's Razor.

    link to this | view in thread ]

