Here is a summary of my first academic paper on the topic of my PhD thesis:
The paper examines the liability of online platforms for illegal and harmful content in the European Union. Intermediary liability is governed by the Directive on electronic commerce (ECD) of 2000, which establishes the legal framework for online services in the Internal Market. However, technological advances and their impact on the way people communicate have put the need to revise the platform liability regime on the agenda. The paper considers the interaction of the Directive on Copyright in the Digital Single Market (EU) 2019/790 and the proposed Terrorist Content Regulation (TERREG) with the ECD, along with the prospects for a more comprehensive revision of the intermediary liability regime through the announced Digital Services Act, part of the new European Commission's “A Europe fit for the digital age” package of measures.
The paper is organized as follows:
Part 1 (Introduction): Users, needs and usage
– where the findings of recent sociological surveys and studies (International Telecommunication Union, Eurostat, Eurobarometer) are examined, covering current consumer habits and behaviour as well as concerns about both the proliferation of illegal content and the suppression of freedom of expression online.
Part 2: The online platforms' immunity
– where the relevant provisions of the Directive on electronic commerce are examined to explain the current intermediary liability regime and its underlying purpose. Some problems regarding the rationale of this late-twentieth-century piece of legislation and its implementation are presented.
Part 3: Democracy's immunity
– where the demonstrated vulnerability of Western democracies to manipulation and misuse of the tools that make the global free flow of information possible is analysed, together with the political response to this phenomenon.
Part 4: The European Commission's balancing efforts
– where the measures taken by the executive body of the European Union to tackle problems such as incitement to hatred and terrorism, the distribution of child sexual abuse material, and intellectual property rights infringement are presented
Part 5: Step by step towards online content regulation
– where recent policy-making efforts and adopted and proposed legislation, which require online platforms to take down specific types of content upon notice and/or proactively, are examined
Part 6: How effective are the self-regulatory instruments
– where the European Union's code of conduct on hate speech and code of practice against disinformation are presented as soft-law alternatives to regulatory measures, along with an assessment of their effectiveness and sufficiency
Part 7: Digital Services Act
– where the plans for a comprehensive regulatory framework and the ambitions of the incoming European Commission President, Ursula von der Leyen, concerning the completion of the Digital Single Market are presented and put in context
Part 8: Case law
– where the recent ruling of the Court of Justice of the European Union in case C‑18/18 Glawischnig-Piesczek v Facebook, concerning the fight against online defamation, is summarized; the decision, which allows online platforms to be required to remove defamatory content proactively, is seen as a green light for the use of upload filters
Part 9: Conclusions
– where the paper's central argument is restated and its relevance and significance are argued
Keywords: online platforms, intermediary liability, online content regulation, upload filters, freedom of expression, illegal content, disinformation, content moderation, European Union, Digital Single Market