Why The EU Needs To Get Audits For Tech Companies Right

from the it's-too-easy-to-get-it-wrong dept

Discussions over the right – or at least a good – way to regulate big tech companies are heating up in the European Union (EU). Several legislative proposals are set to be negotiated, with public and behind-the-scenes lobbying in full swing already. As with any regulation, a key question is how to hold corporate decisionmakers accountable for their actions and how to create transparency. Some of the ways this has typically been done in other industries include legally mandated corporate compliance regimes, rules for financial or supply chain transparency and mandatory risk assessments and audits. These ideas are coming to the tech sector now, too, especially with the draft “Digital Services Act” (DSA). The DSA suggests new due diligence rules for platforms such as Facebook and YouTube, for example, regarding what processes are in place for content moderation and how they deal with potential infringements on users’ fundamental rights. Audits are introduced to check whether companies comply with the DSA’s due diligence rules.

If done right, audits can be a valuable mechanism helping independent researchers, oversight bodies and the public hold tech companies accountable. If done poorly, audits will be mere check-the-box exercises with little value that might even hurt people and entrench platform power. That is why it will be crucial for the EU to get the audit provisions in the DSA right. Four major issues, drawn from past experiences with audits and the general risks associated with them, need to be taken into account.

First, there is the risk of a weak auditor, or one with only limited powers. Facebook’s and Google’s “privacy audits” in the US are a case in point. Each company was subjected to legally mandated scrutiny by the US Federal Trade Commission over data protection issues. Yet what were at times billed as “privacy audits” turned out to be mere assessments, later criticized as almost meaningless because of their vague language and the limited powers they gave the regulator.

Second, auditors can, conversely, be too powerful. If their mandate is ill-fitting or too broad, the auditing company or governmental agency that oversees global corporate giants like Facebook and Google could hold considerable sway over what billions of people access, read and watch on the web. This could be abused for financial or political interests. Authoritarian leaders in particular might try to tip the scales in their favor by controlling big tech companies, a prospect that has some lawmakers in the EU worried as well.

Third, the auditing process itself can lack clear guidance and oversight. Without quality control, what is meant to be a safety measure and an incentive for corporate compliance can turn into a check-the-box exercise. Unfortunately, there are grave examples of this danger: “social audits”, which aim to certify suitable workplace conditions, especially in the clothing industry, have come under intense scrutiny after audited companies’ factories burned or crumbled, killing hundreds of workers. A for-profit auditing system with few checks is partly to blame. In the financial industry, bad and sometimes illegal business practices could not be stopped despite auditing regimes being in place, as the Wirecard case in Germany illustrates. Similarly, the international “Dieselgate” scandal showed the limits of oversight of car manufacturers.

Relatedly, fourth, audits need to have consequences if they reveal corporate malfeasance. An audit that shows how a company failed to follow the rules cannot result merely in recommendations or a blow to the company’s reputation. Fines and, more importantly, changes in business practices and compliance processes are necessary.

In all four areas, the DSA needs improvement. To address the first two points on the strength or weakness of the auditor, it is crucial that the auditor’s tasks and powers are clearly delineated. For tech companies offering people news and information spaces, a top priority should be that auditors check corporate processes, not individual pieces of content. This means that the auditor should, for instance, monitor whether companies have suitable notice-and-action mechanisms, meaningful reporting standards for their online advertising practices and recommender systems, and consumer protection measures in place. Determining the legality of content should be left neither to corporations nor to governmental regulators, but to independent courts. This would ensure that platforms are held accountable without establishing an all-powerful auditor. The DSA draft goes in this direction, but the tasks of the auditor need to be spelled out in greater detail.

In practical terms, it is not yet clear who could and should do the auditing. Looking towards established audits in other industries can be helpful, but copying existing methods risks perpetuating their flaws (as with the social audits) and failing to account for the peculiarities of tech companies. It is presumptuous to assume that big accounting firms can simply take on auditing tech companies. Auditing a company like Facebook, TikTok or Snapchat is not the same as auditing a bank or an insurer. Auditors need different skills and specific technical knowledge in this field, which many existing auditing outfits might not have yet. However, it is also ill-advised to rely blindly on young companies now claiming to audit tech companies or even “algorithms”, as there is no common definition of what such “algorithmic auditing” entails. For example, the Ada Lovelace Institute, a UK-based NGO, has identified four different ways to assess algorithmic systems, and those can, in turn, contain different approaches. An industry has sprung up offering to audit algorithmic systems for biases and legal compliance, but there are no standards for such audits or auditors. To ensure high-quality auditors and a system of checks and balances, the EU should define what audits are supposed to achieve and what is expected of auditors. A vetting process to ensure auditors’ financial independence from the platforms they audit could be discussed, as well as guidelines for oversight and quality control. Otherwise, audits risk being a fig leaf for tech companies or, worse, a cover-up for systemic failures, as with some “social audits”.

Lastly, the DSA’s remedies for failed audits and non-compliance need to be beefed up. An independent oversight entity should be empowered to stop abusive business practices and sanction companies. Promisingly, this idea on enforcement, along with some potential improvements to the auditing regime, has been put forth by the European Parliament. With the DSA, the EU has the chance to build an auditing regime for digital platforms from scratch. It should strive to make that regime as structurally sound as possible to limit terrible outcomes like those described above. Such outcomes are not far-fetched: some platforms’ business practices have already been linked to genocide, election interference and invasions of privacy, to name just a few risks.

Establishing clear rules for the content of audits, standards for the auditors themselves and consequences for tech companies would be a true EU innovation. It would ensure a watch-the-watchers approach for auditors and thus alleviate legitimate concerns that governmental or private auditors (especially if paid by the platforms) could undermine democratic oversight. Taken together, this would go a long way in improving accountability for tech companies.

Julian Jaursch is a project director working on platform regulation topics at Stiftung Neue Verantwortung (SNV), a Berlin-based not-for-profit, non-partisan tech policy think tank.



Filed Under: audits, content moderation, digital services act, dsa, eu, tech policy


Reader Comments



    Anonymous Coward, 20 Aug 2021 @ 3:39am

    As with any regulation, a key question is how to hold corporate decision makers accountable for their actions and how to create transparency. [...] These ideas are coming to the tech sector now, too, especially with the draft “Digital Services Act” (DSA). The DSA suggests new due diligence rules for platforms such as Facebook and YouTube, for example, regarding what processes are in place for content moderation and how they deal with potential infringements on users’ fundamental rights. Audits are introduced to check whether companies comply with the DSA’s due diligence rules.

    Having only managed to get about 3 paragraphs in: It looks a whole lot like this article is predicated on property owners' rights over their own property evaporating when other people use it (or maybe, since all the examples are non-EU, it's predicated on non-EU entities having no rights, or similar).

    Honestly I think that is really gross.

    I guess we could talk about why property owners (or whatever grouping criteria it is they are being biased against) have no right to be arbitrary and capricious regarding what speech they allow on their own property. But only after the EU passes a law mandating that all people (or I guess it could be limited to just organizations) amplify my voice.


      PaulT (profile), 20 Aug 2021 @ 5:31am

      Re:

      "this article is predicated on property owners rights over their own property evaporating when other people use it"

      a.k.a. reality.

      You don't lose all rights when you invite someone to use your property, but your rights are restricted - for example, your right to not allow black people on your property evaporates when you open it to the public, your right to cook expired meat disappears when you're cooking for the public, you have to stop storing boxes in places that block the back door if codes require you to have a fire escape, and so on. When you invite the public on to your property, you have to account for the public's rights as a result, and sometimes their rights override yours. You still retain many rights of your own, of course, but when there's a conflict the trade-off for the benefit of having a public accommodation means that you don't keep them all.

      "I guess we could talk why property owners (or what ever grouping criteria it is they are being biased against) have no right to be arbitrary and capricious regarding what speech they allow on their own property"

      They have every right to do that, of course, but they are not shielded from either criticism or real world consequences if they choose to do so. Sometimes this benefits the company and everyone who uses it - such as when popular platforms decide they don't want to put up with Nazis - sometimes it harms the property owner (doing something that results in an effective boycott in protest, for example), but it's always their choice. People just ask that they understand that the repercussions might not be what they had imagined.


        Anonymous Coward, 20 Aug 2021 @ 5:48am

        Re: Re:

        They have every right to do that, of course, but they are not shielded from either criticism or real world consequences if they choose to do so.

        I never said they should be shielded from criticism. Only that it seems pretty reprehensible to be trying to make laws to control if/how people can dictate restrictions on speech on their own private property.

        For example: I think drinking is bad. If one of my friends took up drinking, I would definitely talk to them about it. However, I also find cramming my (or anyone else's) views of morality down other people's throats with legislation to be extremely reprehensible. I'd never support prohibition. Though if someone came onto my private property to drink and thought I wouldn't kick them out, well, I definitely would.

        NOTE: by "morality", here, I mean views of right and wrong that don't directly translate to harm to others. For example: it makes sense that murdering people (on or off private property) would be legislated against (I hope the argument is obvious from here).

        (as another side note: things like poisoning people, aka feeding expired food to people, were never a right anyone had, on or off their property.)


          Anonymous Coward, 23 Aug 2021 @ 11:38am

          Re: Re: Re:

          It's interesting that you would use drinking as an example, as not only are the laws concerning restrictions on public venues serving alcohol extensive, but some of those extend into homes and other areas which are not public accommodations.


          PaulT (profile), 26 Aug 2021 @ 5:41am

          Re: Re: Re:

          "Only that it seems pretty reprehensible to be trying to make laws to control if/how people can dictate restrictions on their own private property of speech."

          Nobody's advocating doing that. However, if you're opening up that property to the public, different rules apply.

          "(as another side note: things like poisoning people, aka feeding expired foods to people, was never a right anyone had. on or off their property.)"

          "Expired" doesn't mean "rotten". Expiration dates where I live at least are very conservative and you can often have meat that's still good to serve several days afterwards. Or, I can freeze the fresh food I bought and reuse it months later with no problems, even though the packaging states it's well beyond the expiry. It's my choice if I want to take the risk, but generally speaking I buy food on a semi-regular basis that's reduced for clearance because it's about to expire, then. cook it a day or 2 later with no problem.

          But, I have that choice at home. Businesses are restricted when they serve to the public. For very good reasons.



