
Bringing Back the Judiciary In Cyberspace - Legal Liability of Internet Service Providers and the Protection of Freedom of Expression Online

Intervention by Félix Tréguer at the e-Commerce Workshop of the Single Market Forum, in Krakow, October 3rd, 2011


I'd like to thank the Commission for the invitation to take part in this debate, and all the panelists.

I do policy analysis for La Quadrature du Net, a France-based advocacy group promoting fundamental freedoms on the Internet.

We've just heard from Mr. Carr a point of view on how we should deal with illegal online content, and I think crucial points are missing from the discussion. In particular: how do we come to know that a given piece of online content is illegal? Who, in a democracy abiding by the rule of law, is competent to declare a given piece of content illegal? The answer should be: a judge. An actual judge from an independent and impartial tribunal.

I'd like to show how far we have strayed from this principle, and what can be done to bring some balance back into notice-and-takedown procedures.

WHERE WE COME FROM: A directive drafted before the Internet revolution

The e-Commerce directive was drafted in the late nineties, at a time when most policy-makers still had only a vague idea of how ground-breaking a technology the Internet was, and how crucial it would become for democracy.

Since then, the Internet has turned millions of passive consumers of media into active producers and distributors of content.

Now, the directive introduced principles that turned out to be crucial for these new capacities, namely liability exemptions. But their importance was to some extent overlooked at the time. As a result, the directive remains quite general, which paved the way for implementations that are in many regards detrimental to freedom of speech.

THREATS: The rising extra-judicial regulation of the Internet

A few examples:

One particular area where the directive was vague is how an intermediary can be held liable upon notification by a third party. In order to better protect freedom of expression, several Member States, or their judges, created the category of "manifestly" or "obviously" illegal content, which can be removed upon notification without a court order. Initially, "manifestly illegal content" was limited to serious criminal offenses, such as child abuse images, as well as hate speech. In the past 5 to 7 years, however, it has been extended to other types of content. In France, for instance, copyright-infringing, privacy-infringing or defamatory content can now be deemed manifestly illegal. This, of course, creates a lot of legal uncertainty for ISPs, who have an obvious interest in taking down the content to avoid litigation.

Many court decisions have also led to a so-called notice-and-stay-down regime, where ISPs are asked to prevent the re-posting or indexing of content that has already been notified. This stay-down regime leads to the monitoring of online communications, and to automatic, and therefore arbitrary, filtering and takedowns.

This dangerous trend pushing the judiciary away from the Internet is also encouraged by some EU and national public authorities. They take the view that law enforcement or designated private organizations should be able to replace a judge in ordering takedowns of online material. Yet, quite shockingly, these proposals seek neither to investigate these alleged crimes nor to prosecute the persons who might be responsible through traditional legal procedures!

SOLUTIONS: How to make the e-Commerce directive respectful of freedom of expression?

This raises hard but crucial questions about what kind of information society we want to build: one in which old and fundamental democratic and legal principles are upheld, or one where we ask the police, private actors and machines to judge cases and enforce laws?

If the former, and this will be my conclusion, we should consider two points in particular:

First, affirm a presumption of legality for all online content. We must reassert the role of the judiciary in ordering measures that interfere with freedom of expression on the Internet. In the case of very serious criminal offenses, such as child abuse images, the administrative authority should be the only non-judicial party competent to order the removal of such content, subject, of course, to a subsequent judicial ruling.

Second, make notice-and-takedown procedures more user-centric. This means giving users the possibility of a counter-notification, as in the US or Japan. The ISP must ask users whether they accept having their material taken down, and let them take full legal responsibility for the content they have published. If a user files a counter-notification, the ISP should inform the third party who sent the initial takedown request and propose that the case be referred to a court.

I thank you for your attention and look forward to the discussion.