Everything That's Wrong With Social Media And Big Internet Companies: Part 1

from the and-there's-more-to-come dept

Some of today's anxiety about social-media platforms is driven by the concern that Russian operatives somehow used Facebook and Twitter to affect our electoral process. Some of it's due to a general perception that big American social-media companies, amorally or immorally driven by the profit motive, are eroding our privacy and selling our data to other companies or turning it over to the government—or both. Some of it's due to the perception that Facebook, Twitter, Instagram, and other platforms are bad for us—that maybe even Google's or Microsoft's search engines are bad for us—and that they make us worse people or debase public discourse. Taken together, it's more than enough fodder for politicians or would-be pundits to stir up generalized anxiety about big tech.

But regardless of where this moral panic came from, the current wave of anxiety about internet intermediaries and social-media platforms has its own momentum now. So we can expect many more calls for regulation of these internet tools and platforms in the coming months and years. Which is why it's a good idea to itemize the criticisms we've already seen, or are likely to see, in current and future public-policy debates about regulating the internet. We need to chart the kinds of arguments for new internet regulation that are going to confront us, so I've been compiling a list of them. It's a work in progress, but here are three major claims that are driving recent expressions of concern about social media and internet companies generally.

(1) Social media are bad for you because they use algorithms to target you, based on the data they collect about you.

It's well-understood now that Facebook and other platforms gather data about what interests you in order to shape what kinds of advertising you see and what kind of news stories you see in your news feed (if you're using a service that provides one). Some part of the anxiety here is driven by the idea (more or less correct) that an internet company is gathering data about your likes, dislikes, interests, and usage patterns, which means it knows more about you in some ways than perhaps your friends (on social media and in what we now quaintly call "real life") know about you. Possibly more worrying than that, the companies are using algorithms—computerized procedures aimed at analyzing and interpreting data—to decide what ads and topics to show you.

It's worth noting, however, that commercial interests have been gathering data about you since long before the advent of the internet. In the 1980s and before in the United States, if you joined one book club or ordered one winter coat from Lands' End, you almost certainly ended up on mailing lists and received other offers and many, many mail-order catalogs. Your transactional information was marketed, packaged, and sold to other vendors (as was your payment and credit history). If false information was shared about you, you perhaps had some options ranging from writing remove-me-from-your-list letters to legal remedies under the federal Fair Credit Reporting Act. But the process was typically cumbersome, slow, and less-than-completely satisfactory (and still is when it comes to credit-bureau records). One advantage with some internet platforms is that (a) they give you options to quit seeing ads you don't like (and often to say just why you don't like them), and (b) the internet companies, anxious about regulation, don't exactly want to piss you off. (In that sense, they may be more responsive than TiVo could be.)

Of course it's fair—and, I think, prudent—to note that the combination of algorithms and "big data" may have real consequences for democracy and for freedom of speech. Yale's Jack Balkin has recently written an excellent law-review article that targets these issues. At the same time, it seems possible for internet platforms to anonymize data they collect in ways that pre-internet commercial enterprises never could.

(2) Social media are bad for you because they allow you to create a filter bubble where you see only (or mostly) opinions you agree with. (2)(a) Social media are bad for you because they foment heated arguments between you and those you disagree with.

To some extent, these two arguments run against each other—if you only hang out online with people who think like you, it seems unlikely that you'll have quite so many fierce arguments, right? (But maybe the arguments between people who share most opinions and backgrounds are fiercer?) In any case, it seems clear that both "filter bubbles" and "flames" can occur. But when they do, statistical research suggests, it's primarily because of user choice, not algorithms. In fact, as a study in Public Opinion Quarterly reported last year, the algorithmically driven social-media platforms may be both increasing polarization and increasing users' exposures to opposing views. The authors summarize their conclusions this way:

"We find that social networks and search engines are associated with an increase in the mean ideological distance between individuals. However, somewhat counterintuitively, these same channels also are associated with an increase in an individual's exposure to material from his or her less preferred side of the political spectrum."

In contrast, the case that "filter bubbles" are a particular, polarizing problem relies to a large degree not on statistics but on anecdotal evidence. That is, the people who don't like arguing or who can't bear too different a set of political opinions tend to curate their social-media feeds accordingly, while people who don't mind arguments (or even love them) have no difficulty encountering heterodox viewpoints on Facebook or Twitter. (At various times I've fallen into one or the other category on the internet, even before the invention of social media or the rise of Google's search engine.)

The argument about “filter bubbles”—people self-segregating and self-isolating into like-minded online groups—is an argument that predates modern social media and the dominance of modern search engines. Law professor Cass Sunstein advanced it in his 2001 book, Republic.com and hosted a website forum to promote that book. I remember this well because I showed up in the forum to express my disagreement with his conclusions—hoping that my showing up as a dissenter would itself raise questions about Sunstein's version of the “filter bubble” hypothesis. I didn't imagine I'd change Sunstein's mind, though, so I was unsurprised to see the professor has revised and refined his hypothesis, first in Republic.com 2.0 in 2007 and now in #Republic: Divided Democracy in the Age of Social Media, published just this year.

(3) Social media are bad for you because they are profit-centered, mostly (including the social media that don't generate profits).

"If you're not paying for the product, you're the product." That's a maxim with real memetic resonance, I have to admit. This argument is related to argument number 1 above, except that instead of focusing on one's privacy concerns, it's aimed at the even-more-disturbing idea that we're being commodified and sold by the companies who give us free services. This necessarily includes Google and Facebook, which provide users with free access but which gather data that is used primarily to target ads. Both of those companies are profitable. Twitter, which also serves ads to its users, isn't yet profitable, but of course aspires to be.

As a former employee of the Wikimedia Foundation—which is dedicated to providing Wikipedia and other informational resources to everyone in the world, for free—I don't quite know what to make of this. Certainly the accounts of the early days of Google or of Facebook suggest that advertising as a mission typically arose after the founders realized that their new internet services needed to make money. But once any new company starts making money by the yacht-load, it's easy to dismiss the whole enterprise as essentially mercenary.

(In Europe, which is much more ambivalent toward commercial enterprises than the United States is, it's far more common to encounter this dismissiveness. This helps explain some of Europe's greater willingness to regulate the online world. The fact that so many successful internet companies are American also helps explain that impulse.)

But Wikipedia has steadfastly resisted even the temptation to sell ads—even though it could have become an internet commercial success just as IMDB.com has—because the Wikipedia volunteers and the Wikimedia Foundation see value in providing something useful and fun to everyone regardless of whether one gets rich doing so. So do the creators of free and open-source software. If creating free products and services doesn't always mean you're out to sell other people into data slavery, shouldn't we at least consider the possibility that social-media companies may really mean it when they declare their intentions to do well by doing good? (“Do Well By Doing Good” is a maxim commonly attributed to Benjamin Franklin—who of course sold advertising, and even wrote advertising copy, for his Pennsylvania Gazette.) I think it's a good idea to follow Mike Masnick's advice to stop repeating this “you're the product” slogan—unless you're ready to condemn all traditional journals that subsidize giving their content to you through advertising.

So those are the current top three on the Social Media-Are-Bad-For-You Greatest Hits chart. But this is a crowded field. These claims are only the tip of the iceberg when it comes to trendy criticisms of social-media platforms, search engines, and unregulated mischievous speech on the internet, and we can expect many other competing criticisms of Facebook, Twitter, Google, etc. to surface in the weeks and months to come. I'm already working on Part 2.

Update: It took some time, but Part 2 and Part 3 are now available.

Mike Godwin (@sfmnemonic) is a Distinguished Senior Fellow at the R Street Institute.


Thank you for reading this Techdirt post. With so many things competing for everyone’s attention these days, we really appreciate you giving us your time. We work hard every day to put quality content out there for our community.

Techdirt is one of the few remaining truly independent media outlets. We do not have a giant corporation behind us, and we rely heavily on our community to support us, in an age when advertisers are increasingly uninterested in sponsoring small, independent sites — especially a site like ours that is unwilling to pull punches in its reporting and analysis.

While other websites have resorted to paywalls, registration requirements, and increasingly annoying/intrusive advertising, we have always kept Techdirt open and available to anyone. But in order to continue doing so, we need your support. We offer a variety of ways for our readers to support us, from direct donations to special subscriptions and cool merchandise — and every little bit helps. Thank you.

–The Techdirt Team

Filed Under: filter bubble, internet, power, privacy, social media
Companies: facebook, google, twitter


Reader Comments



  • Machin Shin, 29 Nov 2017 @ 12:11pm

    With a title like that I am expecting this is going to be a very big series.


  • Anonymous Coward, 29 Nov 2017 @ 12:15pm

    Not a bad rundown. I look forward to seeing the others in part 2. For me, I think the only anti-social-media argument I've ever made was the "you're the product" one, because it's pretty obvious there's no good way for Twitter or Facebook to make money otherwise. They have this pile of data they can exploit. And even if they partition it from advertisers through indirect categories or anonymize it (which has been found not to work once sufficient amounts of data are collected and correlated with known non-anonymous data), they're still leaking your identity to the world. So they can have all the good intentions in the world, but that won't matter when the FBI or the NSA can get hold of that data to do their own dragnet nonsense (the whole Disrupt J20 trial is showing where this is going). Ultimately, social media has to devolve back to truly decentralized and distributed venues to ensure people have better control of their data. It's not a guarantee of privacy, but I think it's a starting point.


    • Anonymous Coward, 29 Nov 2017 @ 5:34pm

      Re:

      Ultimately, social media has to devolve back to a truly decentralized and distributed venues to ensure people have better control of their data.

      It's vaguely related to point 3 but a good point on its own. The current web (including social media) is a bunch of fiefdoms, with somewhat arbitrary rulership: whoever did a thing first and avoided running it into the ground. So there's no longer an online encyclopedia I can anonymously contribute to, because Wikipedia took over the space and then banned anonymous editing. I can contribute here, but only in the limited topic area this site deals with. And for any activity you can name, there's some web forum, and they're all different forums with totally different policies—which you end up having to agree to if you want to discuss that topic.

      It's why Zuckerberg controls the social interactions of billions of people: good marketing a decade ago, "good enough" maintainership, and all the similar sites fucked up somehow. The current client/server model drives people toward setups like this. Not that it's easy to fix, but there's potentially a huge benefit if we succeed.


  • This comment has been flagged by the community.
    Anonymous Coward, 29 Nov 2017 @ 12:28pm

    BALONEY! "Russian operatives..."

    "... somehow used Facebook and Twitter to affect our electoral process."

    0.004% FOUR THOUSANDTHS OF ONE PERCENT.

    Just delightfully insane that you even bring "the Russians" in. It's the utterly conventional "elite" line of excuse / diversion, right out of NYT.

    99.996% of the "anxiety about social-media platforms" is because they SPY on everyone full-time, whether signed-in or not, whether a user at all.

    "has its own momentum now" -- Yes, soon as people KNOW what these giant multi-national corporations are up to: the relentless invasive "corporate" surveillance that's actually more intense than the "gov't" (not that there's ANY difference these days) -- and in any case, according to Snowden, Google and every one of the biggies give NSA "direct access".


    • This comment has been flagged by the community.
      Anonymous Coward, 29 Nov 2017 @ 12:30pm

      Re: BALONEY! "Russian operatives..."

      Twice "Held For Moderation"; shortening the subject line didn't work, but "Resend" got it through.

      BTW, again: since same text got through on "Resend", all that folderol about "filters" is just, er, baloney.


    • This comment has been flagged by the community.
      Anonymous Coward, 29 Nov 2017 @ 12:33pm

      Re: BALONEY! "Russian operatives..."

      Having skimmed the rest, it's the old "bring up and deny" method, plus a little "you're ALREADY spied on, what's the prob with a little more?"


      • Anonymous Coward, 29 Nov 2017 @ 1:07pm

        Re: Re: BALONEY! "Russian operatives..."

        Still mad the ruskies are paid for what you give away eh?


        • JEDIDIAH, 29 Nov 2017 @ 1:57pm

          Re: BALONEY! "Russian operatives..."

          Actual Russians find this crap terribly hilarious. That goes for the idea they altered the outcome of the election and the idea they get paid to troll Facebook.


          • Wendy Cockcroft, 1 Dec 2017 @ 2:50am

            Re: Re: BALONEY! "Russian operatives..."

            I'm sure they do, Jedidiah, but they either do or don't have troll and fake news factories. Assume they do; how are they influencing people? By pretending to share the views of the people they're trolling.

            One was caught out because he put a monetary sum as 500£ instead of £500. Why? That's how Russians write it.

            The way they operate is to amplify confirmation bias in order to create an aura of common sense around particular issues. This then deepens the divide between the nominal left and right so they see each other as enemies. I've seen liberals described as terrorists, damn it.

            They amplify and exaggerate the effects of the shock jock types like Coulter and Limbaugh, the result of which is that such views are now mainstream and we've got Nazis marching in Charlottesville chanting "The Russians are our friends."

            Of course the "actual Russians" are laughing about it, comrade. They're thinking, "Your turn now, you decadent capitalists! You've been doing this to us since the Revolution."


  • Anonymous Coward, 29 Nov 2017 @ 12:33pm

    Filter bubbles are nothing new; see churches, newspapers, political activist organizations, political parties, etc. It is just that the Internet is reducing the power of the older establishments, and allowing non-local and non-national groups to form. The older established groups naturally attack whatever is reducing their power and influence.


    • Roger Strong (profile), 29 Nov 2017 @ 1:17pm

      Re:

      Most people in a church or activist bubble still got their news from relatively unfiltered newspapers or TV news. Now they can get it filtered for their bubble.

      When someone claimed that (presidential candidate) was caught red-handed being a Satanist commie, it was unsupported hearsay. Now they can post news stories from multiple sites "reporting" that Hillary controls ISIS. "They wouldn't be allowed to report it if it wasn't true!" Their FaceBook friends don't need any more memory capacity, intelligence or language skills than a chicken trained to peck a "Like" button, to pass the claims along to their own friends.


      • JEDIDIAH, 29 Nov 2017 @ 2:00pm

        It's all ad driven nonsense.

        These people can post each other's nonsense at a much greater speed and with a much larger audience than they could before. It's like the worst part of pre-internet culture on steroids.

        Plus "normal journalism" has fallen into the same kind of nonsense. They probably think they need to in order to "keep up". You can even sometimes see a story summary contradict its own headline even without digging into the story itself.

        Even if you do pay attention to conventional journalism, it is generally quite deranged too.


        • Coward Anonymous (profile), 5 Jan 2020 @ 12:13pm

          Re: It's all ad driven nonsense.

          "...You can even sometimes see a story summary contradict its own headline even without digging into the story itself."

          The headline/body contradiction is a very old phenomenon: setting the headline is a job for a higher pay grade, with the aim of drawing attention and viewers. Which means the headline writer doesn't need to read the article.


      • Anonymous Coward, 29 Nov 2017 @ 2:02pm

        Re: Re:

        Just because the Internet has made it easy to share conspiracy theories and stories confirming a viewpoint does not mean they were not as widely shared before it existed. It's just that the sharing was not as visible to outsiders.


        • Roger Strong (profile), 29 Nov 2017 @ 2:40pm

          Re: Re: Re:

          It's just that the sharing was not as visible to outsiders.

          Pre-internet conspiracy theories were sometimes political, but they spread too slowly to affect a current election cycle. It was the UFO, Loch Ness monster, and sasquatch type theories that were the most popular. Largely because they could be monetized, slowly, with speeches and books.

          The internet meant that you could monetize conspiracy theories much faster. WorldNetDaily and InfoWars made a fortune monetizing inbreds with ads, online store sales, and gold scams, using claims about 9/11, the North American Union and the Amero.

          Social media kicked it into overdrive. Now you can reach enough people to make a difference in the polls. Because unlike a website or blog, the people who read your message can spread it to a large number of friends with a single click.

          You can do it fast enough to stay well ahead of the debunking. Unlike blogs, people are checking their Facebook feed many times a day. A couple years ago (and probably still?) there was an endless stream of anti global warming claims being spread on Reddit and Facebook: Scientists committing fraud. Computer models proven wrong. Predictions proven wrong. These would all get definitively debunked within days, but by then the next such claim was making the rounds. It's the same with political smears.

          The elements were in place and in use before social media, but social media made a big difference.


          • Anonymous Coward, 29 Nov 2017 @ 3:16pm

            Re: Re: Re: Re:

            >but by then the next such claim was making the rounds. It's the same with political smears.

            So, this is the Internet bringing the people's ability to create false stories up to a level that was previously available only to politicians via the newspapers.


  • Anonymous Coward, 29 Nov 2017 @ 12:34pm

    "Everything That's Wrong With Social Media And Big Internet Companies:"

    The people that use them.

    that is all!


  • Anonymous Coward, 29 Nov 2017 @ 12:35pm

    There is a concern around these companies selling advertising because of how extremely closed that industry is and what fuels it.

    In the late nineties a lot of today's ad-company giants were seen as malware providers and scourges; today they are hailed as visionary businesspeople and use even more invasive techniques to gather data than they did in the past. For criminals, nothing has changed: they are still able to sell information and in that way keep a business going. The combination of a very closed information-for-advertising industry living off information about people and a strong organized criminal culture is a red flag.

    Don't get me wrong: The problem is not the big internet companies on their own. It is the closed nature of commercial information-for-ads on the net that fuels a lot of speculation.


  • David, 29 Nov 2017 @ 12:36pm

    I'm not finding this opening article very convincing. It seems very waffle-y and hand-wavy about its points, to a degree that this site usually heavily criticizes other public speakers for. It feels like it didn't go through a proper critique and editing process.

    The individual points (bold 1, 2, and 3) are arguments that I can see coming up, and are relevant to the premise of the article. However, the explanations for them, and the excusing of them as not being "real" arguments, are so poorly supported and detailed that I'd expect this to be run on some click-bait news site, not TechDirt.


    • Anonymous Coward, 29 Nov 2017 @ 1:44pm

      Re:

      It does sound pretty dismissive: the "well, aside from violating your privacy and selling that data without consent, how ELSE will those places get money??" excuse smells so rotten I'm amazed it made a full article, too.


  • This comment has been flagged by the community.
    MyNameHere (profile), 29 Nov 2017 @ 12:47pm

    The premise of the article is good, but the results given seem a little speculative and narrow.

    1) Data collection in the past (pre-internet) happened only at your points of contact, when you bought something, for example. Now companies like Amazon track your every search and know exactly the things you "like". Visit a webpage with an Amazon ad on it and they will know the things that interest you. The internet creates a ton more contact points. Social media takes it one step further, with people literally pouring out all of themselves to feed the machine. They know where you go, when you go, etc.

    2) We tend to self-sort on social media, following and tolerating people we like and agree with, and generally ignoring those we don't. We tend to live in echo chambers that reinforce the things we believe in, because that is what we hear.

    3) Using Wikipedia as an example seems like a long way to get to a weak point. Facebook is immensely profitable because they deliver very filtered, very controllable eyeballs to advertisers. They are so specific that they are getting in trouble for it because some of the options can be used in very racist ways (like not showing rental ads to black people). Facebook's massive profitability comes from turning your personal information into a commodity. Like it or not, you are the product being sold.

    The article basically is a big case of "nothing to see here, just a major plane crash, keep moving, nothing to see here". It seems like a setup for future articles that are going to be based on these three questionable statements.


    • Anonymous Coward, 29 Nov 2017 @ 2:11pm

      Re: Too Speculative and Narrow

      I for one thought I'd outline my position on this issue so we can all have some more clarity. I certainly agree with MyNameHere's diagnosis of this article, though at the same time I worry that regulation may prove to be an ineffective tool. Instead I think it's on technologists to:

      • Understand the perils of recommendation algorithms and to try to make them more imperfect.
      • Find ways, using peer-to-peer technologies, to not centralize this highly valuable information in the first place.
      • Be more careful about the incentive schemes they develop (I've seen this issue in both centralized and peer-to-peer systems).

      The catch is that for these approaches to address the issue, people actually have to use the peer-to-peer technologies. And while it isn't the full solution, regulation (essentially a form of anti-trust laws for computers) may be able to make it easier to compete:

      • Require all file formats, network protocols, and RESTful APIs to be publicly and thoroughly documented for others to use, even for competing services/software. [We're already most of the way there]
      • If a service/software's main value is in communicating in some form between user accounts or devices, they must support communicating with accounts on any (loosely or directly) competing service.
      • While we're at it, maybe ban 3rd party tracking. Heck from my experience consuming ads, that'll probably only make them more profitable.

      A lawyer might be needed to tighten up this language if it actually goes anywhere (which it probably won't, being a blog comment), but like you, Godwin, I would be wary of any further website regulation.


      • MyNameHere (profile), 30 Nov 2017 @ 12:13am

        Re: Re: Too Speculative and Narrow

        I think the main point is "follow the money".

        Facebook and Google both make their money in the same manner. They take your personal information, every scrap of information they know about you, every site that they have seen you visit, and they use that to decide which ads you will see. Selling that demographic and interest information is their business.

        In both cases, the real solution is separation. Facebook's ad system should not be able to see any of your personal information (except what can be gleaned from IP address and browser info) to determine what ads to show you. The data that they have related to your facebook account should not be allowed to be used by their ads department.

        Google is in the same boat. Their ad system depends heavily and almost entirely on feeding back your searches and your site visits into a profile they can profit from.

        3rd party trackers on a website are always a bad thing, especially when there is no way to opt out or block them.


        • Anonymous Coward, 3 Dec 2017 @ 9:33pm

          Re: Re: Re: Too Speculative and Narrow

          So does the RIAA and Malibu Media. Where did you complain about those?


  • Anonymous Coward, 29 Nov 2017 @ 12:49pm

    One particular point

    There are a lot of serious errors in this piece, but let me focus on just one:

    "At the same time, it seems possible for internet platforms to anonymize data they collect in ways that pre-internet commercial enterprises never could."

    1. Why should they? The data's worth more as is.

    2. But let's move past that and let's assume, for the purpose of argument, that they decide to try to anonymize the data. Anonymization is hard. Have you noticed? How do you know they'll succeed? For that matter, how will *they* know if they succeed?

    3. Let's note in passing that many people who tried to anonymize data and declared success were subsequently discovered to have failed miserably.

    4. But let's move past that too, and let's make the ridiculously optimistic assumption that they not only decided to destroy much of the value in their data, but they actually invested a pile of money in anonymizing it and, incredibly, they succeeded.

    What happened to the raw data? You know, the really valuable stuff that they elected to throw overboard?

    Or to put it another way: what stops an enterprising employee from making a copy of it (a copy is a copy is a copy is an original), strolling out the door with it, and selling it to the highest bidder? C'mon, anybody who's worked in any sizable operation knows that the internal controls are laughably inadequate and that this sort of thing happens all day long.

    There's more, but let me make my point. Once this data exists, it's a target. A huge, valuable, easily moved, readily sold target. Therefore there will be buyers. Therefore there will be sellers. There's simply no avoiding that EXCEPT by not causing the data to exist in the first place.

    And let me note that none of this is speculation. I've been following marketplaces (darknet and otherwise) for many years. Data sets that clearly originated INSIDE social media companies show up all the time.
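    The re-identification point above can be made concrete with a small sketch. Everything in it is hypothetical (made-up names, ZIP codes, and purchases); it just illustrates the classic linkage attack, where an "anonymized" dataset with names stripped is re-joined to an auxiliary dataset on quasi-identifiers:

```python
# Sketch of a linkage (re-identification) attack. All data is invented.

# "Anonymized" release: names removed, but quasi-identifiers kept.
anonymized = [
    {"zip": "90210", "birth_year": 1971, "purchase": "winter coat"},
    {"zip": "10001", "birth_year": 1985, "purchase": "book club"},
]

# Auxiliary data an attacker can buy or scrape (voter rolls, data brokers).
auxiliary = [
    {"name": "Alice Example", "zip": "90210", "birth_year": 1971},
    {"name": "Bob Example", "zip": "10001", "birth_year": 1985},
]

def reidentify(anon_rows, aux_rows):
    """Join on quasi-identifiers; a unique match re-attaches a name."""
    hits = []
    for row in anon_rows:
        matches = [p for p in aux_rows
                   if (p["zip"], p["birth_year"]) == (row["zip"], row["birth_year"])]
        if len(matches) == 1:  # unique combo => record re-identified
            hits.append((matches[0]["name"], row["purchase"]))
    return hits

print(reidentify(anonymized, auxiliary))
```

    With enough columns, almost every row becomes a unique quasi-identifier combination, which is why "we anonymized it" so often turns out to mean only "we removed the names."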


    • Anonymous Coward, 29 Nov 2017 @ 2:39pm

      Re: One particular point

      By extension from what you've said, they could easily farm out the sale of the raw data to third parties. This would give them plausible deniability at the cost of a slight reduction in profits.


  • Roger Strong (profile), 29 Nov 2017 @ 1:02pm

    "At the same time, it seems possible for internet platforms to anonymize data they collect in ways that pre-internet commercial enterprises never could."

    It's also possible that Steve Bannon will endorse Elizabeth Warren for President in 2018. But that has nothing to do with what's likely.

    ProPublica: What Facebook Knows About You

    It's not just what people post on Facebook, with gaps filled in by their Facebook 'Friends' contact lists:

    When they browse the Web, Facebook collects information about pages they visit that contain Facebook sharing buttons. When they use Instagram or WhatsApp on their phone, which are both owned by Facebook, they contribute more data to Facebook’s dossier.

    And in case that wasn’t enough, Facebook also buys data about its users’ mortgages, car ownership and shopping habits from some of the biggest commercial data brokers.

    This lets them get rather detailed:

    Indeed, we found Facebook offers advertisers more than 1,300 categories for ad targeting — everything from people whose property size is less than .26 acres to households with exactly seven credit cards.

    Pre-internet commercial enterprises never had dossiers like this, and never for so many people.
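
The dossier assembly described above can be sketched as a data merge. This is a hypothetical illustration, not Facebook's actual pipeline; every field name is invented, and the two targeting segments simply mirror the ProPublica examples quoted in this comment.

```python
# Hypothetical sketch of dossier assembly: merging on-platform activity,
# off-site browsing signals, and purchased data-broker attributes into one
# ad-targeting profile. All field and segment names are invented.

def build_profile(on_platform, web_tracking, broker_data):
    """Merge data sources; later sources fill gaps without overwriting."""
    profile = {}
    for source in (on_platform, web_tracking, broker_data):
        for key, value in source.items():
            profile.setdefault(key, value)
    return profile

def matching_segments(profile, segments):
    """Return the targeting categories whose predicate the profile satisfies."""
    return [name for name, pred in segments.items() if pred(profile)]

profile = build_profile(
    {"likes": ["gardening"], "age": 47},
    {"visited": ["mortgage-rates.example"]},
    {"property_acres": 0.2, "credit_cards": 7},
)

# Two invented categories mirroring the article's "1,300 categories" examples.
segments = {
    "property_under_quarter_acre": lambda p: p.get("property_acres", 99) < 0.26,
    "exactly_seven_credit_cards": lambda p: p.get("credit_cards") == 7,
}
print(matching_segments(profile, segments))
```

The point of the sketch is that no single source needs to be complete: each one fills gaps in the others, which is how the dossier ends up more detailed than anything a pre-internet enterprise could build.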


    • Anonymous Coward, 29 Nov 2017 @ 1:50pm

      Re:

      The interconnectedness between machines and humans never seemed so scary before social media was invented.


      • Anonymous Coward, 29 Nov 2017 @ 2:00pm

        Re: Re:

        Seemed? Wait until you can't get a job because of something you said in your past that you can't erase, or until people can simply vote for your turn in the electric chair just because they don't like you.

        Think Black Mirror: Nosedive
        and The Orville: Majority Rule


    • Anonymous Coward, 29 Nov 2017 @ 2:40pm

      Re:

      Note that this includes people who don't have Facebook accounts, and that there's no way for those people to opt out.


  • Anonymous Coward, 29 Nov 2017 @ 1:27pm

    On point 2

    I'm not an expert, but my understanding of the academic theory is a bit different here.

    First, the problem ultimately lies in a quirk of human psychology called "confirmation bias", which, yes, massively predates the internet. So we will naturally (unless we're careful) self-sort into groups. And the issue with these recommendation "algorithms" is that they exaggerate this mental flaw.

    Second, flame wars, due to another psychological quirk, naturally flow out of filter bubbles. That is, groups of like-minded individuals will start arguing against what they think their opposition is saying rather than what is actually being said. Eventually both sides of the argument are more about being against a fuzzy picture of their opposition than about anything meaningful.

    Political parties are a great example of this dynamic; after the last election I'm not really sure what the difference between them and their base is. [Trump, though, and some of his more fervent supporters do appear different, so we're clear.]

    Given that, it makes sense to me that an individual living largely but not entirely inside a filter bubble will lash out when confronted with competing arguments. As such, 2.a) and 2.b) go hand in hand rather than contradicting each other.

    ---

    Conclusion: Yes, it's an old problem, but that doesn't mean we shouldn't be concerned about the "algorithms". And maybe some of these same dynamics are at play in this comment or this article, but I don't think so. Godwin appears to be addressing the right points, so the only question is about the quality of the argument.
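
The amplification dynamic this comment describes, where a recommender feeds on its own engagement signals and exaggerates a mild initial bias, can be sketched as a toy feedback loop. Everything below is a hypothetical model, not any platform's actual algorithm; the topic names and numbers are invented.

```python
# Toy feedback loop: the recommender allocates impressions in proportion to
# each topic's accumulated clicks, and each topic earns new clicks in
# proportion to (impressions shown) x (the user's mild preference for it).
# A small preference edge compounds into near-total dominance of the feed.

clicks = {"politics_a": 1.0, "politics_b": 1.0, "sports": 1.0}
preference = {"politics_a": 0.5, "politics_b": 0.3, "sports": 0.2}

def step(clicks, preference, impressions=100.0):
    """One round: show topics proportionally to past clicks, collect clicks."""
    total = sum(clicks.values())
    return {
        t: c + impressions * (c / total) * preference[t]
        for t, c in clicks.items()
    }

for _ in range(200):
    clicks = step(clicks, preference)

total = sum(clicks.values())
share = {t: round(c / total, 3) for t, c in clicks.items()}
print(share)  # the mildly preferred topic comes to dominate the feed
```

The user's preferences never change during the run; only the recommender's allocation does, which is the "exaggerating the mental flaw" point made above.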


    • Anonymous Coward, 29 Nov 2017 @ 1:57pm

      Re: On point 2

      Sounds like you are more of an expert than you think.

      You may not think that political parties should be abolished, but you already show an understanding of one of the foundational principles for why political parties should be abolished.

      The people bitching about nationalism are the very thing they bitch about, just for a different faction.

      Survival is linked to the idea that survival is greatest when people group with others who are like-minded. This really just results in kings and elitists, but it is also something people like.

      I liken this to 1 Samuel Chapter 8: the people want a king, even if that king will use their children for war, farming, and basket making.

      And as George Washington said, the alternate domination of one faction over the other will result in a despotism on the ruins of public liberty.

      George prophesied the coming war between Ds and Rs. Really, you can put any party in their spots, but it is happening everywhere in the world. Look at Brexit and the other nations looking to do the same. How about Catalonia wanting independence? Or Crimea? Or Islam with everyone, including Islam itself?


      • Anonymous Coward, 29 Nov 2017 @ 11:17pm

        Re: Re: On point 2

        Oh, I just pieced together what I said from what I've read online. I'm really just a freedom-loving software developer with an interest in the unique design constraints of peer-to-peer.

        Which helps explain why I've been putting "algorithm" in quotes: it's a much broader term than the way it's being used in this debate.


  • DannyB (profile), 29 Nov 2017 @ 1:59pm

    Other things wrong with Social Media

    1. It is socially awkward in a way that real life is not. Some people I would rather avoid. There's nothing wrong with that.
    2. It becomes, like YouTube, a "look at me!", "look what I did!" venue where people get their self-worth from trying to please other people.
    3. It is a black hole for time.


  • Anonymous Coward, 29 Nov 2017 @ 3:52pm

    Re: "because they use algorithms to target you"

    That doesn't even BEGIN to cover what algorithms are being used for in ad-tracking-based marketing.

    It isn't just targeting. Sophisticated psychological techniques are being translated into algorithms. We aren't simply talking about quantifying exposure, but also quantifying the depth of penetration into people's conscious and subconscious minds.

    IOW, industrial-scale radicalization of the public. While this radicalization is, by and large, driven to manufacture consumer purchase intent, it is having severe side effects.

    Because the techniques from one type of radicalization are similar to other types of radicalization, it is becoming more difficult to distinguish between normal communications and fundamentalist communications. You suggest that people are "choosing" to isolate their world view. But there is some consideration to be made for the fact that the engineering that ISIS (for example) uses to craft its marketing message is the same engineering now used by Bayer (as an example) to sell aspirin.

    Extremist idealism is less distinguishable from regular content because they now use the same techniques. And while it may not be the intention of the content developer, it is the de facto professional discipline of the modern advertiser. If content and advertising are seamless, then so are fundamentalist dogma and rational argument. Picking out usable information gets harder.

    And I'm sure there are a lot of folks who are going to call bullshit on this. And it is true that there is very little data to support what I'm saying, because it is a very new phenomenon. But what I'm seeing in everyday advertising is similar in form and function to the techniques that MK Ultra used when it took a 16-year-old kid and turned him into the Unabomber.

    It isn't the volume. It is the technique. And the surveillance and profiles are what enable the technique. Which is why it is imperative that the 4th Amendment be restored, particularly in digital communications.

    My expectation is that the feds are aware of this to some degree, somewhere deep in a DARPA lab. And some dipshit advisor to POTUS read him a report, which is why the FCC is hell-bent on killing the internet in general. Of course it will just divide the trend by telecom service area, which will make the problem even worse, because there will be higher concentrations of the same kinds of psychological battery.

    People are being harmed. The 60% increase in diagnoses of teen clinical depression over the past decade is how you know that.


  • Jeffrey Nonken (profile), 29 Nov 2017 @ 4:35pm

    https://www.youtube.com/watch?v=PlDVSibb1OE (1990)

    Because this hasn't been happening forever.


    • Anonymous Coward, 29 Nov 2017 @ 8:44pm

      Re:

      "Because this hasn't been happening forever."

      No, it hasn't. The difference is between statistical analysis and real-time analysis. Accuracy in real-time analysis is way higher. Note also that MRIs are now being used to create human response models for specific personality types.

      They aren't just looking for the best exposure anymore. They are choosing their victims based on susceptibility to particular types of indoctrination techniques, and then applying fully automated algorithms targeting just those people. That is completely new in commercial applications, though these coercion approaches have been used by intelligence agencies and the military for decades.

      It is also more relentless than it has ever been before. No, this is not like the '90s. Not in scale, not in technique, and certainly not in the scale of harm being sustained by the general public.


  • Kent England, 30 Nov 2017 @ 8:09am

    Looks like an excellent series

    Congratulations on giving Godwin a platform to speak at length. I especially love the references for more reading on particular topics.

    Techdirt readers would do well to google Mike Godwin's impressive history and cred to see how valuable this series will be.


  • Anonymous Coward, 30 Nov 2017 @ 9:40am

    Glad to see Techdirt is now no longer even pretending to be a site willing to criticize Facebook/Google. I sure hope you guys got paid well to tell us all why we shouldn't be concerned about these companies.


  • Anonymous Coward, 30 Nov 2017 @ 2:42pm

    "(1) Social media are bad for you because they use algorithms to target you, based on the data they collect about you."

    I've got uMatrix blocking TechDirt from loading JavaScript from...

    a248.e.akamai.net
    s3.amazonaws.com
    www.google.com
    platform.twitter.com
    www.google-analytics.com
    www.googletagservices.com
    load.instictiveads.com

    Practice what you preach. More than half of those are tracking readers of this very article. Hypocrite.
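
What a blocker like uMatrix does with a list like the one above can be sketched as suffix-matching script hosts against a blocklist. The code below is a hypothetical simplification, not uMatrix's actual rule engine; the blocked hostnames are taken from this comment, and the first-party host is invented.

```python
# Minimal sketch of blocklist-based script blocking: a host is refused if it,
# or any parent domain of it, appears on the blocklist. This is a simplified
# stand-in for what a tool like uMatrix does, not its real rule engine.

BLOCKLIST = {
    "www.google-analytics.com",
    "www.googletagservices.com",
    "platform.twitter.com",
}

def is_blocked(host, blocklist=BLOCKLIST):
    """Check the host and every parent-domain suffix against the blocklist."""
    parts = host.split(".")
    return any(".".join(parts[i:]) in blocklist for i in range(len(parts)))

page_scripts = ["www.google-analytics.com", "cdn.example-first-party.com"]
print([h for h in page_scripts if not is_blocked(h)])
```

Checking every suffix means a listed domain also blocks all of its subdomains, which is why blocklists can stay short.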


    • Anonymous Coward, 30 Nov 2017 @ 3:55pm

      Re:

      You must have skimmed the article rather than read it. Techdirt is dismissing the highlighted argument.

      I'm not convinced by those counterarguments.


