Facebook may be ordered, de facto, to monitor the activity of its users in order to prevent them from reposting identical or equivalent unlawful content. This is what the European court ruled today in a judgment referred by an Austrian court (Judgment in Case C-18/18 Eva Glawischnig-Piesczek v Facebook Ireland Limited) concerning the request of a politician to remove various comments and allegations published by a Facebook user that were harmful to her reputation. Since such comments and allegations were identical or equivalent in content, the question was whether Facebook, after removing them a first time, should keep an eye on similar illicit behavior in the future.
The European court answered in the affirmative. It may sound like a common-sense decision in the offline world, but in the case of a social network such an obligation would imply an automated system able to intercept illicit content and evaluate whether it is identical and/or equivalent to content already declared unlawful. Basically, an AI filter or something very sophisticated would be needed. This is no small matter for social networks, because a kind of censorship activity would be delegated to a machine.
Today’s European decision may create an important shift from the consolidated interpretation of the liability regime of hosting providers under the current European framework, that is to say the European Electronic Commerce Directive (Directive 2000/31). The current system provides for the so-called “notice and take down” mechanism (article 14), whereby a hosting provider such as Facebook is not liable for stored information if it has no knowledge of its illegal nature, or if it acts expeditiously to remove or disable access to that information as soon as it becomes aware of it. With the new decision, the mechanism may turn into a kind of “notice and stay down”, whereby Facebook would be liable for illicit content never notified to it, provided that such content is identical or equivalent to previously notified content. Facebook would have to be de facto aware of such future posts, something which sounds a bit peculiar, unless Facebook is asked to constantly check what users are doing on the platform.
This is a huge development in the liability regime of hosting providers, because it may imply a de facto monitoring obligation, in contradiction with art. 15 of the same directive, which instead affirms that no monitoring obligation may be imposed on hosting providers. Indeed, the court recalls that “the directive prohibits any requirement for the host provider to monitor generally information which it stores or to seek actively facts or circumstances indicating illegal activity”. However, the current judgment seems to undermine that prohibition in practice, by stating that Facebook should:
– remove illicit content posted on its social network, the content of which is identical to the content of information which was previously declared to be unlawful, or to block access to that information, irrespective of who requested the storage of that information;
– do the same when the new/future content is merely “equivalent” to content previously declared unlawful. The “equivalence” assessment could be carried out via automated search tools and technologies, since human intervention does not seem feasible at scale.
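For illustration only, here is a minimal sketch of what an automated “identical or equivalent” check could look like, using simple string similarity. The sample phrases, the normalization step, and the similarity threshold are all invented assumptions; a real platform system would be far more sophisticated (and far more contested):

```python
# Hypothetical sketch, NOT Facebook's actual system: flag new posts that are
# identical or "equivalent" to content previously declared unlawful.
from difflib import SequenceMatcher

# Invented example of previously notified unlawful content.
UNLAWFUL = ["politician X is a corrupt traitor"]

def normalize(text: str) -> str:
    # Lowercase and collapse whitespace before comparing.
    return " ".join(text.lower().split())

def is_identical(post: str) -> bool:
    # "Identical" = exact match after trivial normalization.
    return normalize(post) in {normalize(u) for u in UNLAWFUL}

def is_equivalent(post: str, threshold: float = 0.8) -> bool:
    # Crude similarity ratio as a stand-in for a real "equivalence" test.
    return any(
        SequenceMatcher(None, normalize(post), normalize(u)).ratio() >= threshold
        for u in UNLAWFUL
    )

print(is_identical("Politician X is a corrupt  traitor"))        # identical
print(is_equivalent("politician X is a corrupt and vile traitor"))  # equivalent
print(is_equivalent("have a nice day"))                          # unrelated
```

Even this toy version shows the problem the court leaves open: the threshold that separates “equivalent” from “different” is an arbitrary engineering choice, yet it determines whose speech gets blocked.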
This is a paramount decision which may influence the activity of the incoming European Commission, which was already considering revising the liability regime of online intermediaries and platforms via the announced Digital Services Act. Others may even invoke this case in relation to the potential filters which could be imposed when transposing art. 17 of the recent Copyright Directive in the matter of video-sharing.
In fact, one would have expected more caution from the European court in rendering a decision which may impact the freedom of speech of Internet users, whose content could now be intercepted and blocked by machines. However, today’s decision seems to require a prior assessment by a national court, which should lay down all the necessary guarantees and limits to protect the freedom of citizens to share information and opinions on social networks. Of course, a bit more guidance from the EU Court for the national judges would have been appreciated, and this gap may prompt new preliminary rulings sooner or later.
This is why today’s decision should not be regarded as a leading case allowing generic filtering on platforms. The European court recognized that a fair balancing of interests is required: the filtering obligation can be imposed only by a judge, who should consider, inter alia, the proportionality of the measure, taking into account the offensive effect of the illicit content, the right of users to express their opinions, the capability of the concerned platform to install monitoring software, etc.
Interestingly, the European court does not preclude that such a “stay down” order could be applicable on a worldwide basis, that is to say on all national versions of Facebook.