A Conversation With EU Parliament Member Marietje Schaake About Digital Platforms And Regulation, Part I

from the the-view-from-the-eu dept

We are cross-posting the following interview with European Parliament Member from the Netherlands Marietje Schaake -- whom we've discussed on the site many times, and who has even contributed here as well -- conducted by Danish journalist, Cato Institute Senior Fellow, and author of The Tyranny of Silence Flemming Rose. It's an interesting look at how she views the question of regulating internet platforms. Since this is a relatively long interview, we have broken it up into two parts, with the second part running tomorrow. Update: Part II is now available.

Marietje Schaake is a leading and influential voice in Europe on digital platforms and the digital economy. She is the founder of the European Parliament Intergroup on the Digital Agenda for Europe and has been a Member of the European Parliament since 2009, representing the Dutch party D66, which is part of the Alliance of Liberals and Democrats for Europe (ALDE) political group. Schaake is the ALDE group's spokesperson in the European Parliament on transatlantic trade and digital trade, and she is Vice-President of the European Parliament's US Delegation. She has for some time advocated for more regulation of, and accountability for, digital platforms.

Recently, I sat down with Marietje Schaake in a café in the European Parliament in Brussels to talk about what's on the agenda in Europe when it comes to digital platforms and possible regulation.

FR: Digital platforms like Facebook, Twitter and Google have had a consistent message for European lawmakers: Regulation will stifle innovation. You have said that this is a losing strategy in Brussels. What do you mean by that?

MS: I think it's safe to say that American big tech companies across the board have pushed back against regulation, and this approach is in line with the quasi-libertarian culture and outlook that we know well from Silicon Valley. It has benefited these companies that they have been free from regulation. They have been free not only from new regulation but have also had explicit exemptions from liability in both European and American law (Section 230 in the US and the Intermediary Liability Exemption in the E-commerce Directive in the EU). At the same time, they have benefited from regulations like net neutrality and other safeguards in the law. We have been discussing many new initiatives here in the European Parliament, including measures against copyright violations, terrorist content, hate speech, child pornography and other problems. The digital platforms' reaction to most of these initiatives has been, at best, an offer to regulate themselves. They in effect say, "We as a company will fix it, and please don't stifle innovation." This has been the consistent counter-argument against regulation. Another counter-argument has been that if Europe starts regulating digital platforms, then China will do the same.

FR: You don't buy that argument?

MS: Well, China does what it wants anyway. I think we have made a big mistake in the democratic world. The EU, the US and other liberal democracies have been so slow to create a rules-based system for the internet and for digital platforms. Since World War II, we in the West have developed rules on trade, on human rights, on war and peace, and on the rule of law itself; not because we love rules in and of themselves, but because they have created a framework that protects our way of life. Rules mean fairness and a level playing field with regard to the things I just mentioned. But there has been a push-back against regulation and rules when it comes to digital platforms, due to this libertarian spirit and the argument about stifling innovation, this "move fast and break things" attitude that we know so well from Silicon Valley.

This is problematic for two reasons. First, we now see a global competition between authoritarian regimes, with a closed internet and no rule of law, and democracies, with an open internet and the rule of law. We have stood by and watched as China, the leading authoritarian regime, has offered the world its model of a sovereign, fragmented internet. This alternative model stifles innovation, and if people are concerned about stifling innovation, they should take much more interest in fostering an internet governance model that beats the Chinese alternative. Second, under the current law of the jungle on the internet, liberal democracy and people's democratic rights are suffering, because there is no accountability for the algorithms of digital platforms. At this point, profit is much more important than the public good.

FR: But you said that emphasizing innovation is a losing strategy here in Brussels.

MS: I feel there is a big turning point happening as we speak. It is not only happening here in Brussels; even Americans are now advocating regulation.

FR: Why?

MS: They have seen the 2016 election in the US, they have seen conspiracy after conspiracy rising to the top ranks of searches, and it's just not sustainable.

FR: What kind of regulation are you calling for and what regulation will there be political support for here in Brussels?

MS: I believe that the e-commerce directive with the liability exemptions in the EU and Section 230 with similar exemptions in the US will come under pressure. It will be a huge game changer.

FR: A game changer in what way?

MS: I think there will be forms of liability for content. You can already see more active regulation in the German law and in the agreements between the European Commission and the companies to take down content (the codes of conduct on hate speech and disinformation). These companies cannot credibly say that they are not editing content. They are offering to edit content in order not to be regulated, so they are involved in taking down content. And their business model involves promoting or demoting content, so the whole idea that they would not be able to edit is not credible and factually incorrect. So regulation is coming, and I think it will cause an earthquake in the digital economy. You can already see the issues being raised in the public debate: more forceful competition requirements, whether emerging data sets should also be scrutinized in different ways, and net neutrality. We have had an important discussion about the right to privacy and data protection here in Europe. Of course, in Europe we have a right to privacy. The United States does not recognize such a right, but I think they will start to think more about it as a basic principle as well.

FR: Why?

MS: Because of the backlash they have seen.

FR: Do you have scandals like Cambridge Analytica in mind?

MS: Yes, but not only that one. Americans are as concerned about the protection of children as Europeans are, if not more. I think we might see a backlash against smart toys. Think about dolls that listen to your baby, capture its entire learning process, its voice, its first words, and then use that data for AI to activate toys. I am not sure American parents are willing to accept this. The same goes for facial recognition. It's a new kind of technology that is becoming more sophisticated. Should it be banned? I have seen proposals to that end coming from California, of all places.

FR: Liability may involve a lot of things. What kind of liability is on the political menu of the European Union? Filtering technology or other tools?

MS: Filtering is on the menu, but I would like to see it off the menu, because automatic filtering is a real risk to freedom of expression, and it's not feasible for SMEs (small and medium-sized enterprises), so it only helps the big companies. We need to look at the accountability of algorithms. If we know how they are built, and what their flaws or unintended consequences could be, then we will be able to set deadlines for companies to solve these problems. I think we will look much more at compliance deadlines than at prescribed methods. We already have principles in our laws like non-discrimination, fair competition, freedom of expression and access to information. They are not disputed, but some of these platforms are in fact discriminating. It has been documented that Amazon, one of the biggest tech companies and a front runner in AI, had a gender bias in favor of men in its AI hiring algorithm. I think future efforts will be directed toward the question of designing technology and fostering accountability for its outcomes.
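
(A brief aside on what "accountability of algorithms" could look like in practice. The sketch below is purely illustrative -- it is not anything Schaake proposes, and the model and numbers are hypothetical, not Amazon's actual system. It shows the kind of selection-rate audit an outside reviewer might run against a hiring model's recommendations, using the "four-fifths" rule of thumb from US employment law.)

```python
# Illustrative sketch only: a minimal disparate-impact ("four-fifths") check
# of the kind an algorithm-accountability rule might require an auditor to run.
# The data below is a hypothetical audit sample, not Amazon's actual system.
from collections import defaultdict

def selection_rates(decisions):
    """decisions: iterable of (group, recommended) pairs; returns rate per group."""
    totals, recommended = defaultdict(int), defaultdict(int)
    for group, was_recommended in decisions:
        totals[group] += 1
        recommended[group] += was_recommended
    return {g: recommended[g] / totals[g] for g in totals}

def four_fifths_check(rates):
    """Flag any group whose selection rate is below 80% of the best group's rate."""
    best = max(rates.values())
    return {g: r / best >= 0.8 for g, r in rates.items()}

# Hypothetical sample of the model's hiring recommendations, broken down by gender.
sample = ([("men", True)] * 40 + [("men", False)] * 60 +
          [("women", True)] * 15 + [("women", False)] * 85)

rates = selection_rates(sample)
print(rates)                     # {'men': 0.4, 'women': 0.15}
print(four_fifths_check(rates))  # {'men': True, 'women': False} -- 0.15/0.4 < 0.8
```

(The arithmetic is trivial; the point is that such a check needs the model's decisions broken down by group -- exactly the kind of transparency an accountability regime would have to mandate.)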

FR: Do you think the governments in the US and Europe are converging on these issues?

MS: Yes. Liberal democracies need to protect themselves. Democracy has been in decline for the 13th year in a row (according to Freedom House). It's a nightmare, and it's something that we cannot think lightly about. Democracy is the best system in spite of all its flaws; it guarantees the freedoms of our people. It can also be improved by holding the use of power accountable through checks and balances and other means.

FR: Shouldn't we be careful not to throw out the baby with the bath water? We are only in the early stages of developing these technologies and businesses. Aren't you concerned that too much regulation will have unintended consequences?

MS: I don't think there is a risk of too much regulation. There is a risk of poorly drafted regulation. We can already see some very grave consequences, and I don't want to wait until there are more. Instead, let's double down on principles that should apply in the digital world as they do in the physical world. It doesn't matter if we are talking about a truck company, a gas company or a tech company. I don't think any technology or AI should be allowed to disrupt fundamental principles, and we should begin to address this. I believe such regulation would be in the companies' interest too, because the trust of their customers is at stake. I don't think regulation is a goal in and of itself, but everything around us is regulated: the battery in your recording device, the coffee we just drank, the light bulbs here, the sprinkler system, the router on the ceiling, the plastic plants behind you (so that if a child happens to eat them, they will not kill the child as fast as they might without regulation), the glass in the doors over there (so that if it breaks, it does so in a less harmful way), and so on and so forth. There are all kinds of ideas behind regulation, and regulation is not an injustice to technology. If done well, regulation works as a safeguard of our rights and freedoms. And if it is bad, we have a system to change it.

The status quo is unacceptable. We have already had manipulation of our democracies. We just learned that Facebook paid teenagers $20 to get access to their most private information. I think that's criminal, and there should be accountability for that. We have data breach after data breach, we have conspiracy theories still rising to the top of searches on YouTube in spite of all its promises to do better. We have Facebook selling data without consent, we have absolutely incomprehensible terms of use and consent agreements, we have a lack of oversight over who is paying for which messages and over how the algorithms are pushing certain things up and other things down. It's not only about politics. Look at public health issues like anti-vaccination hoaxes. People hear online that vaccinations are dangerous, do not vaccinate their children, and that leads to new outbreaks of measles. My mother and sister are medical doctors, cancer specialists, and they have patients who have gone online and studied what they should do to treat their cancer, and they get suggestions without any medical or scientific proof. People will not get the treatment that could save their lives. This touches upon many more issues than politics and democracy.

FR: So you see here a conflict between Big Tech and democracy and freedom?

MS: Between Big Tech with certain business models and democracy, yes.

FR: Do you see any changes in the attitudes and behaviour of the tech companies?

MS: Yes, it is changing, but it's too little, too late. I think there is more apologizing, and there is still the terminology, "Oh we still have to learn everything, we are trying." But the question is, is that good enough?

FR: It's not good enough for you?

MS: It's not convincing. You can make billions and billions tweaking your algorithm every day to sell ever more ads, yet you claim that you are unable to determine when conspiracies or anti-vaccination messages rise to the top of your search results. At one point I looked into search results on the Eurozone. I received 8 out of 10 results from one source, an English tabloid with a negative view of the Euro. How come?

FR: Yes, how come? Why would that be in the interest of the tech companies?

MS: I don't think it's in their interest to change it, but it's in the interest of democracy. Their goal is to keep you online as long as possible, basically to get you hooked. If you are trying to sell television, you want people to watch a lot of television. I am not surprised by this. It was to be expected. However, it becomes a problem when hundreds of millions of people use only a handful of these platforms for their information. It's remarkably easy to influence people for commercial or political purposes, whether it's about anti-vaccination or politics. I understand from experts that the reward mechanism of the algorithm means that sensation sells more, and once you click on the first sensational message, it pulls you in a certain direction where things become more and more sensational, and one sensation after another is automatically presented to you.

I say to the platforms, you are automatically suggesting more of the same. They say, no, no, no, we just changed our algorithm. What does that mean to me? Am I supposed to blindly believe them? Or do I have a way of finding out? At this point I have no way of finding out, and even AI and machine-learning coders tell me that they don't know what the algorithms will churn out at the end of the day. One aspect of AI is that the people who write the code don't know exactly what's going to come out. I think the safeguards are too vague, and it's clear that the impact is already quite significant.

I don't pretend to know everything about how these systems work. We need to know more, because they impact so many people, and there is no precedent for any service or product that so many people use for such essential activities -- accessing information about politics, public health and other things -- with no oversight. We need oversight to make sure that there are no excesses, and that there is fairness, non-discrimination and free expression.
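
(A second aside, on the reward mechanism Schaake describes a few answers above: the drift toward ever more sensational content does not require anyone to hand-pick it. The toy model below is purely hypothetical -- it is not any real platform's recommender -- but ranking by predicted engagement, plus the assumption that people click on content slightly more sensational than what they are already used to, is enough to produce the ratchet she describes.)

```python
# Hypothetical toy model of the engagement "ratchet" described above;
# not any real platform's recommender, just the dynamic in miniature.
import random

random.seed(1)

items = [i / 19 for i in range(20)]  # candidate items, "sensationalism" from 0.0 to 1.0
taste = 0.2                          # the user's current appetite for sensational content

def predicted_engagement(item, taste):
    # Assumption: people engage most with content slightly *more* sensational
    # than what they are already used to.
    return 1.0 - abs(item - min(taste + 0.15, 1.0))

for step in range(12):
    # The ranker simply shows whatever it predicts will be engaged with the most.
    shown = max(items, key=lambda it: predicted_engagement(it, taste))
    if random.random() < predicted_engagement(shown, taste):  # did the user click?
        taste += 0.5 * (shown - taste)  # each click drags the user's taste upward
    print(f"step {step:2d}: shown={shown:.2f}  taste={taste:.2f}")

# Both the items shown and the user's taste ratchet toward 1.0: nobody hand-picked
# the sensational content; the drift falls out of optimizing for engagement.
```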

You can read Part II of this interview now.


Filed Under: eu, eu parliament, flemming rose, free speech, internet regulation, marietje schaake, platform liability, privacy
Companies: facebook, google


Reader Comments



  1. Mason Wheeler (profile), 19 Feb 2019 @ 11:39am

    Digital platforms like Facebook, Twitter and Google have had a consistent message for European lawmakers: Regulation will stifle innovation.

    It's important to realize that this is not always a bad thing.

    Mixed in with all of the amazing societal gains we've gotten over the last 25 years or so, we've also ended up with sleazy companies like Facebook, Palantir and Cambridge Analytica coming up with all sorts of new and innovative ways to spy on people. Stifle that, please!

    We've seen Uber develop innovative new methods of price-gouging and evading the law and those who enforce it. Stifle that, please!

    All sorts of hardware manufacturers from Apple to John Deere have developed innovative new ways to lock down computers and steal our property rights. Stifle that, please!

    Not all innovation is good. Some of it deserves to be "stifled," if not outright "smothered in the cradle." The trick is figuring out policies that are selective enough to not also stifle the good, beneficial stuff.

  2. Anonymous Coward, 19 Feb 2019 @ 12:31pm

    TLDR; We want to create a bunch of legislation to make criminals of people we don't like, because we can't win in the free market.

  3. hij (profile), 19 Feb 2019 @ 12:32pm

    Confounding personal privacy with copyright

    The person doing the interview, Mr. Rose, allowed MEP Schaake to conflate the issues of copyright and personal privacy. They are different issues, and the kinds of regulations required are different. With respect to copyright, companies should be given the opportunity to police their content without being penalized for doing so. With respect to personal privacy, companies should be penalized for attempting to collect, maintain, and sell the information they gather. Copyright is an issue that can happen to them due to other people's actions. Violating the privacy of others is something the companies actively participate in. Allowing the MEP to conflate the issues is an unfortunate way to confuse two very different issues.

  4. Igualmente69 (profile), 19 Feb 2019 @ 12:37pm

    "All sorts of hardware manufacturers from Apple to John Deere have developed innovative new ways to lock down computers and steal our property rights. Stifle that, please!" You trust the same government that designed the laws that allowed those companies to screws us, to stop the companies from screwing us?

  5. Anonymous Coward, 19 Feb 2019 @ 12:44pm

    Re:

    It would be fantastic if they created surgically targeted laws to address the issues you raise, but laws have unintended consequences and seem rarely to solve the problems their creation intended.

    I am interested to hear the process to decide what innovation deserves to be stifled, a licensing process with a review board maybe?

  6. Bruce C., 19 Feb 2019 @ 1:14pm

    Readers of the Dune series...

    may recognize the first seeds of the Orange Catholic Bible in the recognition/emphasis of the dangers and risks of AI. "Thou shalt not make a machine in the likeness of a human mind."

  7. Thad (profile), 19 Feb 2019 @ 1:24pm

    Re:

    I definitely believe you when you say "DR".

  8. Thad (profile), 19 Feb 2019 @ 1:26pm

    Re:

    That's...some pretty impressive circular reasoning, of the "why should we even have laws?" school.

  9. Peter (profile), 19 Feb 2019 @ 2:05pm

    Those politicians need to spend time talking to real people

    Not just to each other and to lobbyists.

    Us ordinary people will not suddenly vote for Trump just because some idiot labels Hillary Clinton a "crook" on Twitter. We might vote for him if those in charge keep droning on about climate change and saving the frogs while we are worried about losing jobs or getting mugged in the street.

    Leave the internet alone and start working on real problems if you are concerned that we might vote for a populist, for BREXIT or for Putin!

  10. Cuthbert Swankhurst IIV, 19 Feb 2019 @ 2:20pm

    Re: Re:

    I am interested to hear the process to decide what innovation deserves to be stifled, a licensing process with a review board maybe?

    HIGH PROFIT MARGIN is the reliable measure. From banking (#) to Microsoft to Google to Hollywood, it's the goal that they all chase and the easily measurable obvious one by which to gauge how predatory they are on the rest of us, as almost anyone outside elitist ranks will tell you.

    (# In the old days, bankers had the 3-3-3 rule: pay 3% interest to depositors, make 3% profit on loans, be on the golf course by 3 PM. That's the way a reasonable society should work. Those who seek merely monetary gains are mere sharks, and we don't have to tolerate them.)

    Broad societal problems aren't so complex as you and masnicks make them out to be: JUST REDUCE THE INFLUENCE OF MONEY EVERYWHERE.

  11. Thad (profile), 19 Feb 2019 @ 2:35pm

    Re: Those politicians need to spend time talking to real people

    Us ordinary people will not suddenly vote for Trump just because some idiot labels Hillary Clinton a "crook" on Twitter. We might vote for him if those in charge keep droning on about climate change and saving the frogs while we are worried about losing jobs or getting mugged in the street.

    If you voted for Trump because you thought our decreasing crime and unemployment rates were a bigger threat than climate change, then maybe you shouldn't be so sure that disinformation campaigns don't work on you.

  12. Rocky, 19 Feb 2019 @ 4:32pm

    Re: Re: Re:

    Broad societal problems aren't so complex as you and masnicks make them out to be: JUST REDUCE THE INFLUENCE OF MONEY EVERYWHERE.

    That's really brilliant!!

    I think we might just have to nominate you for the Nobel Prize in Economic Sciences, too bad there isn't one for the social sciences though...

    /s

    Anyone that says they have a simple solution to a complex problem is either a fool or a charlatan trying to rip you off in some way.

  13. Anonymous Coward, 19 Feb 2019 @ 5:28pm

    European man invades Digital Poland for democracy.

  14. Anonymous Coward, 19 Feb 2019 @ 6:13pm

    Re: Re: Re:

    You mean the money that Hollywood and the RIAA hold?

    Your argument would actually hold water if not for the fact that you regularly deepthroat the richest people on the planet.

  15. Anonymous Coward, 20 Feb 2019 @ 12:15am

    Re:

    You mean, trust the same government that made laws at the behest of rent-seeking companies to stop said rent-seeking companies from screwing us with the laws they bought?

    In case you missed it: A lot of the really bad laws that "the government" made were actually made by companies. It's not "companies vs. government" it's "companies bought the government and use it to rip off people".

  16. Anonymous Coward, 20 Feb 2019 @ 12:09pm

    Re: Re: Re: Re:

    Sure, why not give this fake scientist the Fake Nobel Prize for their fake solution?

  17. Anonymous Coward, 20 Feb 2019 @ 1:17pm

    Learn to code

    Seriously, he thinks detecting propaganda is as easy as writing ad code? The latter can be done with blind metrics. You could write ad-targeting code while literally not knowing the language and it could still work: just split profiles between whatever words, say "Karfkar" and "Shoopi," plus age estimates, and you have something serviceable when combined with arbitrary categories.

  18. Rekrul, 20 Feb 2019 @ 5:33pm

    Instead, let's double down on principles that should apply in the digital world as they do in the physical world. It doesn't matter if we are talking about a truck company, a gas company or a tech company.

    So if someone rents a truck and uses it to intentionally run over someone, the company should be held liable because they didn't pre-screen the person to determine if his use of the truck was going to be illegal? After all, that's what they want for the digital world.
