Decoding the U.N. Cybercrime Treaty
Negotiations for a proposed U.N. Cybercrime Treaty commenced in 2017 but began to take shape in 2022—and there’s a lot at stake. The draft treaty has the potential to rewrite criminal laws around the world, possibly adding over 30 criminal offenses and expansive new police powers for both domestic and international criminal investigations.
Given that existing cybercrime laws are, as stated by the U.N. General Assembly, “in some instances misused to target human rights defenders” and “endanger their safety in a manner contrary to international law,” these widened parameters amplify the potential implications for billions of people—particularly stifling free speech, increasing government surveillance, and expanding state investigative techniques.
Restrictions on Free Speech
Rather than focusing on core cybercrimes like network intrusion and computing system interference, the draft treaty’s emphasis on content-related crimes is likely to result in overly broad and easily abused laws that stifle the free expression and association rights of people around the world.
For example, the draft U.N. Cybercrime Treaty includes provisions that could make it a crime to humiliate a person or group, or to insult a religion, using a computer. This potentially makes it a crime to send or post legitimate content protected under international law.
Governments routinely abuse cybercrime laws to criminalize speech, claiming to combat disinformation, “religious, ethnic or sectarian hatred,” “terrorism,” “the distribution of false information,” and many other harms. But in practice, these laws are used to stifle criticism, suppress protests and dissent, and clamp down on free expression and association. This is despite the right to free expression—including the right to insult and offend—being protected under the Universal Declaration of Human Rights (UDHR) and Article 19 of the International Covenant on Civil and Political Rights (ICCPR), to which the U.N. Member States negotiating the new treaty are parties.
Governments may only limit these rights in very narrow circumstances. But the draft U.N. Cybercrime Treaty ignores these permissible limitations, which may lead to the criminalization of legitimate uses of technology that promote access to information and freedom of speech. The U.N. General Assembly has also made it clear that States should refrain from imposing restrictions on discussions about government policies and political debate; participation in election campaigns; peaceful demonstrations; expressing opinions and dissent; and being associated with particular religions or beliefs, including by persons belonging to minorities or vulnerable groups.
Threats to Privacy and the Right to a Fair Trial
Checks and balances on government use of surveillance laws are essential to prevent abuses of power and violations of human rights like freedom of expression and association. We saw how the COVID-19 pandemic incentivized authorities to institute intrusive forms of surveillance without appropriate checks and balances, such as using surveillance technology to track individuals in public and monitoring private communications—all without legal authorization or oversight. And these laws disproportionately restrict the rights of those already marginalized and targeted in society, with personal data on religious beliefs, political affiliations, and other sensitive information collected en masse without guardrails against abuses.
Transparency is vital to ensuring that human rights are respected in communication surveillance, and people should be able to learn if their data was handed over to government authorities. However, the draft treaty allows authorities to impose gag orders even when disclosure would not pose a demonstrable threat to ongoing investigations.
The circumstances under which police are permitted to access personal data during criminal investigations should always be subject to robust human rights safeguards and overseen by an impartial and independent oversight mechanism, to ensure that individuals’ human rights are not at risk and to prevent police abuse of power.
However, the draft U.N. Cybercrime Treaty introduces vague provisions that will compel states to pass laws authorizing the use of overly broad spying powers without these safeguards—placing people at an increased risk of harm, and curtailing civil liberties and defendants’ fair trial rights. Even worse, during draft treaty negotiations, countries including India, Russia, China, Iran, Syria, and Tonga proposed amendments to remove Article 5, a general clause that emphasizes respect for human rights and references international human rights obligations. Rubbing salt into the wound, Egypt, Singapore, Malaysia, Pakistan, Oman, Iran, and Russia requested the deletion of even the most modest limitations on government spying powers, Article 42, on conditions and safeguards.
Some other countries wanted to keep Article 42, but proposed removing references to “principles of proportionality, necessity, and legality” and the “protection of privacy and personal data.” These countries argued that privacy protection was already covered by the general reference to human rights in Article 5. Other countries, including the United States, argued that personal data protection is not recognized as a right at the U.N. level. They also noted that the principles of proportionality, necessity, and legality may not have the same meaning or may not be present in different domestic legal systems.
Broadened Surveillance Powers
On top of government attempts to keep human rights safeguards out of the draft treaty, negotiators have proposed a variety of broad, vague provisions that expand surveillance powers across borders as well as within each country. EFF is calling for the exclusion of provisions that compel governments to adopt domestic laws authorizing very intrusive surveillance powers. Such powers must be narrowly and clearly defined and subject to strong human rights safeguards; the current treaty language does neither. We’ve proposed to exclude provisions addressing the interception of content, real-time collection of data, admission of digital evidence, “spontaneous information,” and “special investigative techniques.”
Some of the most wide-ranging surveillance powers mentioned above—real-time collection of traffic data, interception of content, and admission of digital evidence—have been controversial enough among negotiators that they are currently sidelined in “informal consultations.” This is likely due to a lack of consensus on safeguards, and perhaps also to concerns about the rule of law, democracy, and the lack of an impartial and independent judiciary in many prospective treaty signers. We hope these articles stay out of the main treaty unless and until meaningful and comprehensive human rights safeguards are applied to them, and there is an effective compliance mechanism monitoring states’ human rights obligations.
But many other powers are still part of the draft treaty. The general expansion of surveillance powers in the draft includes squishy language that law enforcement could use to argue that it authorizes hacking into our devices without further public debate. That language should be clarified to remove ambiguities about which powers are intended.
The draft treaty also oddly refers to allowing authorities to use “special investigative techniques,” again without ever defining what those are. As written, the language could allow any type of surveillance technology—from malware to IMSI catchers, machine learning prediction, and other mass surveillance tools—as well as any tool or technique that may exist in the future. The use of new surveillance technologies must always be subject to public debate; we must not give law enforcement a permanent blank check to spy on people with methods that haven’t even been invented yet.
Governments Need More Than a Whim to Share Personal Information With Other States
We also call for the withdrawal of the “spontaneous information” provision, which poses a significant risk to individual rights. It provides that governments may voluntarily share the products of their electronic surveillance with other governments, whenever domestic law permits it. While similar voluntary disclosure among governments already happens, adding this option to the draft treaty would expand and normalize such voluntary sharing, even with countries that have poor human rights track records.
Moreover, even as we fight to ensure that the proposed treaty imposes strong human rights safeguards, the “spontaneous information” provision bypasses whatever safeguards exist in the rest of the draft treaty whenever law enforcement authorities feel they’d like another country to have access to certain evidence.
Most of the safeguards in the draft treaty, in the context of cross-border cooperation, are based on the scenario of formal requests for information, which must be assessed according to legal criteria—including human rights standards. We should aim for clear legal rules for sharing information, rather than simply having law enforcement decide to do so unilaterally. This is especially concerning when information obtained from country A can be given voluntarily to country B without adequate human rights safeguards, for example, to identify and prosecute journalists, human rights defenders, pro-democracy activists, and others.
Security is Hard Enough: Open-Ended Technical Assistance Mandates May Harm Security
Another surveillance provision in the current draft requires governments to adopt laws empowering authorities to order anyone familiar with a computer system’s functions and security features to cooperate, including by providing the information authorities need to access users’ private information in the system.
This closely resembles controversial efforts to force tech companies and software developers to assist in circumventing the security measures they have created. The “necessary information to enable” access to secured computers and data could be argued to include assistance in breaking encryption or other security measures. It could also be interpreted to include government demands for vulnerability disclosures (to be made confidentially to government authorities) or even for disclosure of private keys or issuance of false digital certificates.
The measure doesn’t seem to go as far as explicitly requiring tech developers to create backdoors in their security systems, but it should precisely define the limits of technical assistance and make clear that it is not authorizing the creation of backdoors or the weakening of encryption or other security measures.
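To make the stakes concrete, here is a minimal, hypothetical sketch (not drawn from the treaty text or any specific proposal) of why compelled disclosure of a private key undermines a security measure: anyone who obtains the key can read every message ever encrypted to it. The example uses Python’s third-party cryptography package, and the scenario and names are illustrative assumptions only.

```python
# Hypothetical illustration only: how a disclosed private key defeats encryption.
# Requires the third-party "cryptography" package (pip install cryptography).
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# A service generates a keypair; users encrypt messages to its public key.
service_private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
service_public_key = service_private_key.public_key()

oaep = padding.OAEP(
    mgf=padding.MGF1(algorithm=hashes.SHA256()),
    algorithm=hashes.SHA256(),
    label=None,
)

ciphertext = service_public_key.encrypt(b"a user's private message", oaep)

# If the provider were compelled to hand over service_private_key, whoever
# receives it could decrypt this ciphertext -- and any other message encrypted
# to the same key -- without the user ever knowing.
recovered = service_private_key.decrypt(ciphertext, oaep)
assert recovered == b"a user's private message"
```

The same concern applies to the other secrets mentioned above, such as the signing keys behind digital certificates: once disclosed, they can be used to impersonate the services that rely on them.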
Next Steps
Cybercrime is not a new phenomenon, and we have already witnessed far too many examples of anti-cybercrime laws being used to prosecute individuals, chill human rights, and bring spurious and disproportionate charges against LGBTQ communities, journalists, activists, and whistleblowers.
While we don’t think the U.N. Cybercrime Treaty is necessary, we’ve been closely scrutinizing the process and providing constructive analysis. We’ve made clear that human rights must be baked into the proposed treaty so that it doesn’t become a tool to stifle freedom of expression, infringe on privacy and data protection, or endanger vulnerable people and communities.
Join us as we fight to protect free speech and privacy for all.