Europeans Deserve to Have Their Governments Test—Not Trust—Filters
Thanks to the adoption of a disastrous new Copyright Directive, the European Union is about to require its member states to pass laws requiring online service providers to ensure the unavailability of copyright-protected works. This will likely result in the use of copyright filters that automatically assess user-submitted audio, text, video and still images for potential infringement. The Directive does include certain safeguards to prevent the restriction of fundamental free expression rights, but national governments will need some way to evaluate whether the steps tech companies take to comply meet those standards. That evaluation must be both objective and balanced to protect the rights of users and copyright holders alike.
Quick background for those who missed this development: Last March, the European Parliament narrowly approved the new set of copyright rules, squeaking it through by a mere five votes (afterwards, ten MEPs admitted they’d been confused by the process and had pressed the wrong button).
By far the most controversial measure in the new rules was a mandate requiring online services to use preventive measures to block their users from posting text, photos, videos, or audio that have been claimed as copyrighted works by anyone in the world. In most cases, the only conceivable “preventive measure” that satisfies this requirement is an upload filter. Such a filter would likely fall afoul of the ban on “general monitoring” anchored in the 2000 E-Commerce Directive (which is currently under reform) and mirrored in Article 17 of the Copyright Directive.
There are grave problems with this mandate, most notably that it provides no penalties for fraudulently or negligently claiming ownership of a copyrighted work. Absent these kinds of deterrents, the Directive paves the way for the kinds of economic warfare, extortion and censorship against creators that these filters are routinely used for today.
But the problems with filters are not limited to abuse: Even when working as intended, filters pose a serious challenge for both artistic expression and the everyday discourse of Internet users, who use online services for a laundry list of everyday activities that are totally disconnected from the entertainment industry, such as dating, taking care of their health, staying in touch with their families, doing their jobs, getting an education, and participating in civic and political life.
The EU recognized the risk to free expression and other fundamental freedoms posed by a system of remorseless, blunt-edged automatic copyright filters, and they added language to the final draft of the Directive to balance the rights of creators with the rights of the public. Article 17(9) requires online service providers to create “effective and expeditious” complaint and redress mechanisms for users who have had their material removed or their access disabled.
Far more important than these after-the-fact remedies, though, are the provisions in Article 17(7), which requires that “Member States shall ensure that users…are able to rely” on limitations and exceptions to copyright, notably “quotation, criticism, review” and “use for the purpose of caricature, parody or pastiche.” These free expression protections have special status and will inform the high industry standards of professional diligence required for obtaining licenses and establishing preventive measures (Art 17(4)).
This is a seismic development in European copyright law. European states have historically operated tangled legal frameworks for copyright limitations and exceptions that diverged from country to country. The 2001 Information Society Directive didn’t improve the situation: Rather than establishing a set of region-wide limitations and exceptions, the EU offered member states a menu of copyright exceptions and allowed each country to pick some, none, or all of these exceptions for their own laws.
With the passage of the new Copyright Directive, member states are now obliged to establish two broad categories of copyright exceptions: one for “quotation, criticism, review” and one for “caricature, parody or pastiche.” To comply with the Directive, member states must protect those who make parodies or excerpt works for the purpose of review or criticism. Equally importantly, a parody that’s legal in, say, France, must also be legal in Germany and Greece and Spain.
Under Article 17(7), users should be “able to rely” on these exceptions. The preventive measures of the Directive—including copyright filters—should not stop users from posting material that doesn’t infringe copyright, including works that are legal because they make use of these mandatory parody/criticism exceptions. For avoidance of doubt, Article 17(9) confirms that filters “shall in no way affect legitimate uses, such as uses under exceptions or limitations provided for in Union law” and Recital 70 calls on member states to ensure that their filter laws do not interfere with exceptions and limitations, “in particular those that guarantee the freedom of expression of users”.
As EU member states move to “transpose” the Directive by turning it into national laws, they will need to evaluate claims from tech companies who have developed their own internal filters (such as YouTube’s Content ID filter) or who are hoping to sell filters to online services that will help them comply with the Directive’s two requirements:
1. To block copyright infringement; and
2. To not block user-submitted materials that do not infringe copyright, including materials that take advantage of the mandatory exceptions in 17(7), as well as additional exceptions that each member state’s laws have encoded under the Information Society Directive (for example, Dutch copyright law permits copying without permission for “scientific treatises,” but does not include copying for “the demonstration or repair of equipment,” which is permitted in Portugal and elsewhere).
Evaluating the performance of these filters will present a major technical challenge, but it’s not an unprecedented one.
Law and regulation are no stranger to technical performance standards. Regulators routinely create standardized test suites to evaluate manufacturers’ compliance with regulation, and these test suites are maintained and updated based on changes to rules and in response to industry conduct. (In)famously, EU regulators maintained a test suite for evaluating compliance with emissions standards for diesel vehicles, then had to undertake a top-to-bottom overhaul of these standards in the wake of widespread cheating by auto manufacturers.
Test suites are the standard way to evaluate and benchmark technical systems, and they provide assurances to consumers that the systems they rely on will perform as advertised. Reviewers maintain standard suites for testing the performance of code libraries, computers and subcomponents (such as mass-storage devices and video cards), and protocols and products, such as 3D graphics rendering programs.
We believe that the EU’s guidance to member states on Article 17 implementations should include a recommendation to create and maintain test suites if member states decide to establish copyright filters. These suites should evaluate the filters’ ability both to correctly identify infringing materials and to correctly pass non-infringing uses. The filters could also be tested for their ability to correctly identify works that may be freely shared, such as works in the public domain and works that are licensed under permissive regimes such as the Creative Commons licenses.
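To make this concrete, here is a minimal sketch of what such a test suite might look like. Everything in it is hypothetical: the `filter_decision` interface, the case descriptions, and the file names are all illustrative stand-ins, not any real filter’s API. The point is only that a regulator could express both of the Directive’s requirements—block infringement, pass lawful uses—as machine-checkable cases.

```python
# A minimal, hypothetical sketch of a copyright-filter test suite.
# It assumes a candidate filter exposed as a function
# filter_decision(upload) -> "block" or "allow". All names and
# test cases here are illustrative, not a real filter's interface.

from dataclasses import dataclass


@dataclass
class TestCase:
    description: str   # what the case exercises
    upload: str        # stand-in for the submitted media
    expected: str      # "block" or "allow"


# Illustrative cases mirroring the Directive's two requirements:
# infringing uploads must be blocked, while quotation, parody and
# public-domain material must pass through (Art 17(7)).
SUITE = [
    TestCase("verbatim copy of a claimed work", "full_track.mp3", "block"),
    TestCase("short excerpt used in a critical review", "review_clip.mp3", "allow"),
    TestCase("parody reworking the claimed work", "parody.mp3", "allow"),
    TestCase("public-domain recording", "pd_recording.mp3", "allow"),
]


def run_suite(filter_decision):
    """Return a dict mapping each case description to pass/fail."""
    return {
        case.description: filter_decision(case.upload) == case.expected
        for case in SUITE
    }


# Example: a naive filter that blocks everything fails every
# "allow" case -- exactly the over-blocking Article 17(7) forbids.
naive = lambda upload: "block"
scores = run_suite(naive)
```

A real suite would of course feed actual media files to a real filter and would need far richer cases (different languages, media types, and national exceptions), but the pass/fail structure would be the same.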
EFF previously sketched out a suite to evaluate filters’ ability to comply with US fair use. Though fair use and EU exceptions and limitations are very different concepts, this test suite does reveal some of the challenges of complying with Article 17’s requirement that EU residents be “able to rely” upon the parody and criticism exceptions it defines.
Notably, these exceptions require that the filter make determinations about the character of a work under consideration: to be able to distinguish excerpting a work to critique it (a protected use) versus excerpting a work to celebrate it (a potentially prohibited use).
For example, a creator might sample a musician’s recording in order to criticize the musician’s stance on the song’s subject matter (one of the seminal music sampling cases turned on this very question). This new sound file should pass through a filter, even if it detects a match with the original recording, after the filter determines that the creator of the new file intended to criticize the original artist, and that they sampled only those parts of the original recording that were necessary to make the critical point.
However, if another artist sampled the original recording for a composition that celebrated the original artist’s musical talent, the filter should detect and block this use, as enthusiastic tribute is not among the limitations and exceptions permitted under the Infosoc Directive, nor those mandated by the Copyright Directive.
This is clearly a difficult programming challenge. Computers are very bad at divining intent and even worse at making subjective determinations about whether the intent was successfully conveyed in a finished work.
However, filters should not be approved for use unless they can meet this challenge. In the decades since the Acuff-Rose sampling decision came down in 1994, musicians around the world have treated its contours as a best practice in their own sampling. A large corpus of music has since emerged that fits this pattern. The musicians who created (and will create) music that hews to the standard—whose contours are markedly similar to those mandated in the criticism/parody language of Article 17—would have their fundamental expression rights as well as their rights to profit from their creative labors compromised if they had to queue up to argue their case through a human review process every time they attempted to upload their work.
Existing case-law among EU member states makes it clear that these kinds of subjective determinations are key to evaluating whether a work is entitled to make use of a limitation or exception in copyright law. For example, the landmark Germania 3 case demands that courts consider “a balancing of relevant interests” when determining whether a quotation is permissible.
Parody cases require even more subjective determination, with Dutch case law holding that a work can only qualify as a parody if it “evokes an existing work, while being noticeably different, and constitutes an expression of humor or mockery.” (Deckmyn v. Vandersteen (C-201/13, 2014)).
Article 17 was passed amidst an unprecedented controversy over the consequences for the fundamental right to free expression once electronic discourse was subjected to judgments handed down by automated systems. The changes made in the run-up to the final vote were intended to ensure a high level of protection for the fundamental rights of European Internet users.
The final Article 17 text offers two different assurances to European Internet users: first, the right to a mechanism for “effective and expeditious” complaint and redress, and second, Article 17(7) and (4)’s assurance that Europeans are “able to rely” on their right to undertake “quotation, criticism, review” and “use for the purpose of caricature, parody or pastiche” … “in accordance with high industry standards of professional diligence”.
The Copyright Directive passed amid unprecedented controversy, and its final drafters promised that Article 17 had been redesigned to protect the innocent as well as to punish the guilty, this being the foundational premise of all fair systems of law. National governments have a duty to ensure that it’s no harder to publish legal material than it is to remove illegal material. Streamlining the copyright enforcement system to allow anyone to block the publication of anything, forever, without evidence or oversight presents an obvious risk for those whose own work might be blocked through malice or carelessness, and it is not enough to send those people to argue their case before a tech company’s copyright tribunal. If Europeans are to be able to “rely upon” copyright limitations and exceptions, then they should be assured that their work will be no harder to publish than anyone else’s.
Published February 06, 2020 at 09:08AM
Read more on eff.org