from the it's-too-easy-to-get-it-wrong dept
Discussions over the right – or at least a good – way to regulate big tech companies are heating up in the European Union (EU). Several legislative proposals are set to be negotiated, with public and behind-the-scenes lobbying already in full swing. As with any regulation, a key question is how to hold corporate decision-makers accountable for their actions and how to create transparency. In other industries, this has typically been done through legally mandated corporate compliance regimes, rules for financial or supply chain transparency, and mandatory risk assessments and audits. These ideas are now coming to the tech sector, too, especially with the draft “Digital Services Act” (DSA). The DSA proposes new due diligence rules for platforms such as Facebook and YouTube, covering, for example, what processes they have in place for content moderation and how they deal with potential infringements of users’ fundamental rights. Audits are introduced to check whether companies comply with the DSA’s due diligence rules.
If done right, audits can be a valuable mechanism helping independent researchers, oversight bodies and the public hold tech companies accountable. If done poorly, audits will be mere check-the-box exercises with little value that might even hurt people and entrench platform power. That is why it will be crucial for the EU to get the audit provisions in the DSA right. Four major issues, drawn from past experiences with audits and the general risks associated with them, need to be taken into account.
First, there is the risk of a weak auditor or an auditor with only limited powers. Facebook’s and Google’s “privacy audits” in the US serve as an example. Each company was subjected to legally mandated scrutiny by the US Federal Trade Commission over data protection issues. Yet what were at times billed as “privacy audits” turned out to be mere assessments, later criticized as almost meaningless due to vague language and the regulator’s limited powers.
Second, auditors can, conversely, be too powerful. If their mandate is ill-fitting or too broad, the auditing company or governmental agency overseeing global corporate giants like Facebook and Google might have considerable sway over what billions of people access, read and watch on the web. This could be abused for financial or political gain. Authoritarian leaders, in particular, might try to tip the scales in their favor by controlling big tech companies, which has some lawmakers in the EU worried as well.
Third, the auditing process itself can lack clear guidance and oversight. Without quality control, what is meant to be a safety measure and an incentive for corporate compliance can turn into a check-the-box exercise. Unfortunately, there are grave examples of this danger: “social audits,” meant to certify adequate workplace conditions, especially in the clothing industry, have come under intense scrutiny after audited companies’ factories burned or collapsed, killing hundreds of workers. A for-profit auditing system with few checks is partly to blame. In the financial industry, bad and sometimes illegal business practices could not be stopped despite auditing regimes being in place, as the Wirecard case in Germany illustrates. Similarly, the international “Dieselgate” scandal showed the limits of oversight of car manufacturers.
Fourth, and relatedly, audits need to have consequences if they reveal corporate malfeasance. An audit showing that a company failed to follow the rules cannot result merely in recommendations or a blow to the company’s reputation. Fines and, more importantly, changes in business practices and compliance processes are necessary.
In all four areas, the DSA needs improvement. To address the first two points on the strength or weakness of the auditor, it is crucial that the auditor’s tasks and powers be clearly delineated. For tech companies offering people news and information spaces, a top priority should be that auditors check corporate processes, not individual pieces of content. This means the auditor should, for instance, monitor whether companies have in place suitable notice-and-action mechanisms, meaningful reporting standards for their online advertising practices and recommender systems, and consumer protection measures. Determining the legality of content should be left neither to corporations nor to governmental regulators, but to independent courts. This would ensure that platforms are held accountable without establishing an all-powerful auditor. The DSA draft moves in this direction, but the tasks of the auditor need to be spelled out in greater detail.
In practical terms, it is not yet clear who could and should do the auditing. Looking to established audits in other industries can be helpful, but copying existing methods risks perpetuating their flaws (as with the social audits) and ignoring the peculiarities of tech companies. It is presumptuous to assume that big accounting firms can simply take on auditing tech companies: auditing a company like Facebook, TikTok or Snapchat is not the same as auditing a bank or an insurer. Auditors need different skills and specific technical knowledge in this field, which many existing auditing outfits might not yet have. However, it is also ill-advised to rely blindly on young companies now claiming to audit tech companies or even “algorithms,” as there is no common definition of what such “algorithmic auditing” entails. The Ada Lovelace Institute, a UK-based NGO, has identified four different ways to assess algorithmic systems, and each can, in turn, contain different approaches. An industry has sprung up offering to audit algorithmic systems for biases and legal compliance, but there are no standards for such audits or auditors. To ensure high-quality auditors and a system of checks and balances, the EU should define what audits are supposed to achieve and what is expected of auditors. A vetting process regarding the financial independence of platforms and auditors could be discussed, as well as guidelines for oversight and quality control. Otherwise, audits risk being a fig leaf for tech companies or, worse, a cover-up for systemic failures, as with some “social audits.”
Lastly, the DSA’s remedies for failed audits and non-compliance need to be beefed up. An independent oversight entity should be empowered to stop abusive business practices and sanction companies. Promisingly, this idea on enforcement, as well as some potential improvements to the auditing regime, has been put forth by the European Parliament. With the DSA, the EU has the chance to build an auditing regime for digital platforms from scratch. It should strive to make that regime as structurally sound as possible to limit terrible outcomes like those described above. Such outcomes are not far-fetched: some platforms’ business practices have already been linked to genocide, election interference and invasions of privacy, to name just a few risks.
Establishing clear rules for the content of audits, standards for the auditors themselves and consequences for tech companies would be a true EU innovation. It would ensure a watch-the-watchers approach for auditors and thus alleviate legitimate concerns that governmental or private auditors (especially those paid by the platforms) could undermine democratic oversight. Taken together, this would go a long way toward improving accountability for tech companies.
Julian Jaursch is a project director working on platform regulation topics at Stiftung Neue Verantwortung (SNV), a Berlin-based not-for-profit, non-partisan tech policy think tank.
Filed Under: audits, content moderation, digital services act, dsa, eu, tech policy