Auditing Big Tech: Tackling Disinformation and the EU Digital Services Act

Combating disinformation with reliable transparency

Partner: Omidyar Network

Disinformation on social media platforms has become a major concern during elections and the COVID-19 pandemic. Under increasing governmental and societal pressure, social media platforms attempt to limit the spread of disinformation. However, their exact procedures for identifying and regulating disinformation on their platforms, and whether those procedures succeed, are currently impossible to verify. To counter this lack of transparency and give governments and regulators an independent way to assess these efforts, we propose creating ‘independent auditing intermediaries.’ This kind of auditing mechanism would ensure the accuracy of ‘transparency data’ provided by large online platform providers about the content on their services.

This report was drafted during 2020, before the European Digital Services Act (DSA) was published. Now that the DSA has been released, in December 2020, we think it is important to respond to the proposed regulation and clarify how our report relates to it.

First, we should say that the DSA is a ground-breaking piece of legislation that we believe will contribute considerably to combating disinformation and strengthening the European accountability regime for online platforms. As such, the DSA was developed very much in the spirit of this report. We are therefore happy that the DSA was published in this form: it is a crucial first step and demonstrates the political viability of many of our policy proposals. Ensuring better accountability for online platforms is no longer a theoretical academic idea but now exists as a very concrete European policy proposal.

However, there are still many aspects of the DSA that can be improved further. As numerous policy analysts have noted, there are considerable challenges regarding the independence and impartiality of the regulatory bodies within the DSA, as well as challenges in the ways in which the European Commission and EU member states jointly govern platforms through it.

Our proposal for accountability mechanisms goes beyond what is currently proposed within the DSA in several crucial ways:

  1. Regular auditing by default: Article 28 of the DSA proposes an independent auditing mechanism to which very large online platforms must submit at least once a year. These audits are both limited in scope and infrequent. Our proposal argues that *all* transparency data provided by online platforms should be audited before it is published or given to third parties, so that it can be considered ‘verified data.’ To our mind, there is no point discussing public statements made by companies unless they are based on verified (i.e. independently audited) data.
  2. Research access to audited data: Article 31 of the DSA provides access to online platform data for “verified researchers”. However, the data provided is not audited. Given the well-documented problems with the transparency data online platforms provide, we believe that all such data should be audited before it can be trusted by outside researchers, regulators or the general public.
  3. Public sector independent auditing mechanism: Article 28 of the DSA does not specify whether a public or private sector organization undertakes the independent audit. We believe that a separate, public sector body should be created with the sole purpose of auditing confidential information provided by platforms. Such an institution could be created within the context of the DSA, but it should be a distinct legal entity to guarantee its independence. This institution would be responsible for collecting data, verifying it, and making the resulting verified data available only to authorities endowed with the legal competence to use it, to a legally specified extent and for a legally specified purpose. The collection and verification of the data on the one hand (the responsibility of the independent auditing intermediaries) and its use for regulatory purposes on the other (the obligation of the regulatory authorities) would be distinct processes. This separation would further enhance the independence of the institutions involved and guarantee the security of the data in question.

We hope this clarifies the relationship between our proposals and the EU Digital Services Act. As the DSA is highly likely to evolve considerably during the parliamentary process over the coming months and years, it is important that the actors negotiating it can use this input to make the DSA stronger and more protective of fundamental rights rather than weaker.

Thanks for reading this blog post. We look forward to your comments, suggestions and ideas! 🙂

Keywords: Online Platforms, Big Tech, Disinformation, Auditing Intermediaries, Europe, Platform Regulation, European Digital Services Act (DSA)

Year: 2021

The full report is available below.