Sustainable digitalisation takes into account not only ecological aspects – such as the CO2 emissions that go hand in hand with global data flows and are generated in the production of digital tools – but also the social dimension, i.e. fair working conditions and just and democratic coexistence.
The latter point refers to the large digital corporations such as Amazon, Google, Facebook and Co.: How democratic is our society today in the face of the supremacy of these internet giants? The big platform operators have long since become decisive players on the internet. They not only provide information infrastructures, but also moderate discourse, curate content and block accounts based on rules they set themselves.
Vérane Meyer, head of the Digital Governance Unit at the Böll Foundation, and Torben Klausa, who researches and writes on issues including platform regulation, ask in their short study “Regulating Digital Communication Platforms” (in German) how private companies influence public debate and what possibilities there are for democratic scrutiny. In this interview, we talk to them about the democratic challenges of digital platforms, current regulatory projects and solutions for more democratic discourse on the internet.
How do digital platforms influence public debate? What are the aspects that are the most cause for concern?
Torben: “Digital platforms determine what we perceive to be public debate in the first place. It already starts with what content we come across on Facebook and the like: the algorithms of the platforms decide which videos, pictures and posts are presented to us. If they don’t show something, then it basically doesn’t happen in the online world. And the selection of content is not based on what serves a constructive public debate.”
Vérane: “As the German elections approach in September, there is concern around possible manipulation happening in an election campaign that is mainly being conducted online: from a lack of transparency in online political advertising to the increasing spread of disinformation and systematic hate campaigns, as well as problematic content in messaging apps or platforms that are not yet even covered by the current regulation. And the example of Donald Trump shows that by now even the platforms are aware of the risk that digital content poses for the analogue world.”
In this case, isn’t it a good thing that Trump was deprived of that channel of communication? Or should private platforms not be allowed to decide things like that at all?
Vérane: “That’s the big argument at the moment. Some say that platforms simply have to adhere strictly to their own house rules in their decisions – and after all, every user has agreed to them at some point. Others think that the decision about such interventions in our democratic discourse should not be left up to private companies.”
Torben: “Platforms are already bound by German law, even if moves to enforce that law are sometimes thwarted: content that violates German law – swastikas and insults, for example – must be removed by Facebook and Co. in Germany when they find out about it. Much more problematic, however, is content that is described in English as ‘awful but lawful’ – that is, dangerous and undesirable, but still legal. How do we deal with this? We believe that as a society we have to take this decision into our own hands and cannot just leave it to companies.”
Vérane: “Politicians are also already addressing the issue. In the draft of the Digital Services Act (DSA), the planned platform regulation of the EU, a kind of “Trump article” is currently being discussed: Online platforms would no longer be able to simply block the accounts of users who are of public interest.”
What do you think should be the tasks of the platforms?
Vérane: “Platforms are playing an increasingly important role in the digital public sphere and should therefore also fulfil certain duties of care – not only in dealing with illegal content. It is also about guaranteeing the free formation of opinion and ensuring transparency in advertising and recommendation algorithms, for example. The Trump example shows that the debate about the power and influence of commercial platform operators is more important than ever. In concrete terms, platforms could now ensure that risks in the digital election campaign are minimised – for example, by pushing verified content, identifying disinformation and ranking it low, and acting more transparently overall.”
Torben: “The great thing about digital platforms is that people use them for a thousand different things and they are constantly evolving. But when it comes to public discourse, we should not only recognise their creative potential – but also their responsibility. TV stations, for example, are much more regulated in Germany than platforms are – to ensure diversity of opinion.”
How can we assess if platforms are negatively impacting democratic discourse? And what are important requirements to ensure democratic discourse on the internet?
Vérane: “In order to have real control, we first have to really understand how platforms work and what effects they have. For this reason, more and more regulators and researchers are demanding real insight into their processes – in a word: transparency. It is also incredibly difficult to switch from one platform to another without losing your entire network, messages, photos and so on. Mandatory interoperability could help break up the existing platform oligopoly and create more democratic structures online.”
Torben: “When we talk about platforms, we usually think of Facebook, Twitter and YouTube. But a large part of dangerous disinformation now takes place on messenger services like Telegram and WhatsApp. We should think about how to cover the right services with our regulation – and how to enforce it. There are many more ideas to this effect. We have summarised a whole series in our paper.”
What are current regulatory projects at national and European level?
Vérane: “At the European level, the Digital Services Act is currently the biggest, almost revolutionary regulatory project; in Germany, the NetzDG has been in place for a few years, and the State Media Treaty since the end of 2020. Overall, we see a trend towards pushing platform regulation at the European level – which makes sense given the international nature of the topic.”
Torben: “At the same time, Germany is definitely a pioneer in this area of regulation with the NetzDG and the State Media Treaty. Many aspects of the draft DSA are very similar to the German rules. Meanwhile, the authorities here are already gaining practical experience with their implementation and the legislator is steering accordingly. For example, the NetzDG is only four years old, but has already undergone two rounds of amendments. This does not mean that Germany is a shining example – some would even argue the opposite. But the times when digital platform giants were considered unregulable are over.”
Nevertheless, we seem to be lagging behind the self-imposed rules of the digital platforms. How can we be one step ahead in the future?
Vérane: “By not relying on the good will of the big platform providers to minimise the risks. Instead, we need clear rules for more transparency and against manipulation on the internet. Making different communication services compatible with each other can also help give users choice, allow them to avoid such manipulation, and ultimately strengthen media diversity.”
Torben: “We need to understand that the quality of discourse does not only depend on fighting dangerous content. The responsibility of digital platforms is not only about blocking and deleting – but also about the criteria they use to present other content. And last but not least, it is also up to us users to develop a certain digital serenity and scepticism: The sense of calm needed not to click on every piece of clickbait that is displayed to us; scepticism to distinguish between fact and fake. Because both of them are out there.”
This is a translation of an original article that first appeared on RESET’s German-language site.