Copyright and the Internet

Art. 13 of the new Copyright Directive and the censorship machine, for dummies


There is much debate about the Copyright reform, and I see the need to offer some clarification based on the actual draft of the legal provision to be voted on by the European Parliament next week (July 5th), and not on chats and tweets.

Does art. 13 of the new Copyright Directive impose filters?

Yes, but without using the word “filters”, instead calling them “appropriate and proportionate measures leading to the non-availability on those services of works or other subject matter infringing copyright or related-rights”. In technical terms, this can only mean software that detects and blocks content. It cannot be done manually (otherwise it would take a week to upload anything) and it cannot work ex post (otherwise the current notice & take down system would be sufficient). Thus, we are talking about automatic, preventive filters applied before publication. Even MEP Cavada, one of the rapporteurs, admitted it enthusiastically in one of his outstanding tweets.
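To make the mechanics concrete, here is a deliberately minimal Python sketch of what such an automatic, pre-publication filter boils down to. It is an illustration only: real systems (YouTube's Content ID and the like) use perceptual audio/video fingerprinting rather than exact hashes, and every name and data value below is hypothetical.

```python
# Toy sketch of an automatic, pre-publication upload filter (hypothetical).
# Real deployments use perceptual fingerprinting, not exact hashes.
import hashlib

# Fingerprints registered by rights-holders (hypothetical data).
rights_holder_fingerprints: set[str] = set()

def fingerprint(upload: bytes) -> str:
    """Stand-in for a perceptual fingerprint: here simply a SHA-256 hash."""
    return hashlib.sha256(upload).hexdigest()

def pre_publication_check(upload: bytes) -> bool:
    """Decide automatically, before anyone sees the upload and with no human
    review, whether it may be published (True) or must be blocked (False)."""
    return fingerprint(upload) not in rights_holder_fingerprints

# A rights-holder registers a work; a user then tries to re-upload it.
protected_work = b"...some protected video bytes..."
rights_holder_fingerprints.add(fingerprint(protected_work))
print(pre_publication_check(protected_work))          # False: blocked
print(pre_publication_check(b"original user video"))  # True: published
```

The point of the sketch is the timing: the decision is taken by software before publication, which is exactly what distinguishes these “measures” from ex post notice & take down.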

Well, filters are prohibited by EU jurisprudence (the Sabam and Netlog cases), aren’t they?

Yes, and this is the reason why the European Parliament plays with words and avoids calling them “filters”. In addition, art. 13 tries to make such filters appear to be the result of a private negotiation, not an imposition by law. The fact is, according to the provision drafted by the European Parliament, the “appropriate measures” (aka filters) should be “taken” by the video-sharing platforms “in cooperation with stakeholders” in case they do not reach a licensing agreement (aka: paying money) with regard to content uploaded by end users.

Why is it so important to make such filters appear to be a private matter, rather than an imposition by law?

Because when a filter is applied voluntarily by a platform, it is not prohibited by the jurisprudence of the European Court, which instead concerns filters imposed by law. YouTube has developed this kind of voluntary filtering mechanism, called Content ID.

Ok, but in the end, does art. 13 concern mandatory filters or privately negotiated ones?

It is about mandatory filters imposed by law, because video-sharing platforms are obliged to adopt them in order to avoid liability for uploaded content in the absence of a licensing agreement with rights-holders. However, the tortuous and byzantine wording used by the European Parliament creates intentional confusion and misunderstanding, so that some may believe that such filters are a private decision.

Well, could video-sharing platforms simply agree on licensing agreements to avoid the obligation to adopt filters?

Yes, they could. However, it would be an unbalanced negotiation: when only one party (i.e. the platform) risks legal consequences (i.e. the obligation to adopt appropriate measures, aka filters) in case of failure to reach an agreement, the negotiation cannot be balanced. The other party (the rights-holders) will be in a position to charge more than a fair price.

Ok, then the choice for sharing platforms would be between unfair prices, on one side, and adopting filters, on the other. What will happen in the end?

With the exception of Google/YouTube (which can withstand any legal dispute and has already developed its own filtering technology, namely Content ID), the other (small) players will probably find it more convenient to adopt a simple filtering technology provided by the rights-holders themselves, blocking everything the latter wish. Small platforms do not have the money to develop a proprietary filtering technology like YouTube’s Content ID, or to fight legal disputes against stakeholders.

Ok, this is how the censorship machine works. But I have read that the new Copyright Directive provides for an exception for freedom of speech, parodies, memes, etc.

Yes, but it will not work, because it is mentioned, in a restrictive way, only in a recital of the Directive (no. 21a), not in an article, and it is envisaged only in favor of users, not platforms (so reads recital 21d). This means that platforms’ filters will still be obliged to block and remove, automatically, everything that rights-holders consider their property, even if it is just an extract, fragment, elaboration or meme of a proprietary work. It will then be up to the user to make a claim, after which there will be a kind of redress proceeding. But it is unlikely that an ordinary individual will start this procedure against a rights-holder, while the platform will simply be pissed off, because managing such proceedings is not its job and it is not equipped for that.
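A small extension of the earlier sketch shows why the exception cannot operate at the filter level: matching works segment by segment, so even a short excerpt reused in a meme or parody triggers a block, and nothing in the code can weigh context, quotation or satire. Again, all data and names are hypothetical.

```python
# Continuation of the toy filter above: segment-level matching (hypothetical).
import hashlib

SEG = 16  # segment size in bytes, arbitrary for this illustration

def reference_segments(work: bytes) -> set[str]:
    """Fingerprint fixed-size segments of a work registered by a rights-holder."""
    return {hashlib.sha256(work[i:i + SEG]).hexdigest()
            for i in range(0, len(work) - SEG + 1, SEG)}

def contains_protected_material(upload: bytes, reference: set[str]) -> bool:
    """Slide a window over the upload; any segment hit counts as infringement."""
    return any(hashlib.sha256(upload[i:i + SEG]).hexdigest() in reference
               for i in range(len(upload) - SEG + 1))

original_work = b"ORIGINAL-FOOTAGE-" * 64          # the registered work
reference = reference_segments(original_work)

# A meme reusing only a short excerpt, surrounded by the user's own commentary.
meme = b"USER COMMENTARY... " + original_work[:48] + b" ...MORE COMMENTARY"

# The filter sees only the overlap: the parody/quotation exception never
# enters the decision, so the meme is blocked before publication.
print(contains_protected_material(meme, reference))  # True
```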

Ok, let’s sum up: the first step is that everything is blocked automatically. Then the user can appeal, wait for a justification from the rights-holder, and somebody decides whether or not to reinstate the content. If it is still not reinstated, the user can appeal further.

Yes, and this is the reason why so many Internet pioneers, scientists and the Italian data protection supervisor, Antonello Soro, are against it. The UN Special Rapporteur puts it this way: “I am concerned that the restriction of user-generated content before its publication subjects users to restrictions on freedom of expression without prior judicial review of the legality, necessity and proportionality of such restrictions. Exacerbating these concerns is the reality that content filtering technologies are not equipped to perform context-sensitive interpretations of the valid scope of limitations and exceptions to copyright, such as fair comment or reporting, teaching, criticism, satire and parody”.

One last question: do you work for Google?

No, I never got a penny from them.
