COVID-19: Data Appropriation as a Feminist Issue (Part 1)

When US President Donald Trump announced that “we’re waging a war against the invisible enemy”, it became clear that the COVID-19 global pandemic is no longer considered just a health or socio-economic crisis. It is a matter of national security.

Echoed by key political leaders in both the Western and Eastern world (including French President Emmanuel Macron and Chinese President Xi Jinping), this securitization of the issue through a war-like narrative has had one immediate consequence: executive power has been entrusted to leaders to take extraordinary measures in favor of the public good, including the (temporary) suspension of certain human rights and freedoms declared necessary to contain the imminent threat.

However, under international law, when an état d’exception (state of exception) is put in place, governments must observe standards of legality, necessity, and proportionality to ensure that an emergency situation is addressed with the minimum possible restrictions within the rule of law.

The global reaction to the coronavirus outbreak, by both democratic and authoritarian governments, is raising concern. In times of social distancing, as more interactions move into digital space, we are seeing unjustified infringements of the right to privacy and the right of access to information and participation in public affairs.

Both privacy and transparency are diminishing at a rapid rate as governments race to address the swiftly moving crisis, and data is becoming an asset whose value is exponentially increasing.

More data in the hands of the public sector translates into a higher governmental capacity to control the population, which in turn grants even more power. For the private sector, it signifies more profit. And while making our private data public and the public data private has severe implications for all of us, the effects are inevitably felt far more strongly by vulnerable parts of the population, such as women and marginalized ethnic or religious groups. Make no mistake: this is an intersectional crisis. 

Below is an overview of concerning data practices to date, why those are problematic, and how you can protect yourself and defend your rights.

Less Privacy

Our daily life is surrounded by pieces of software and hardware that collect, use, share, and store our data. Biometrics, location, medical records, financial transactions, conversations: it seems almost impossible to leave any aspect of our lives untouched by ICT. Although calls for more solid data protection have been growing since the digital age began, the need stems from the preexisting fundamental right to privacy recognized in various bodies of international, supranational, and domestic law, including international human rights law and constitutional law.

Above all, not being subjected to arbitrary or unlawful interference with privacy is a universal human right enshrined in Article 17(1) of the International Covenant on Civil and Political Rights (ICCPR), and is a core principle of democratic societies. Translating this right into the digital age, the European Union has laid out several standards in its General Data Protection Regulation (GDPR). Whenever personal data is collected, the person in question must be informed of the specific purpose of its use and consent to that use. The data must only be used for that specific purpose, it must be kept updated and accurate, and it must be deleted when the person in question requests it or when it is no longer needed for the purpose it was intended to serve.
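
To make these principles concrete, here is a minimal, purely illustrative Python sketch of purpose limitation, consent, and the right to erasure enforced in code. The class and field names are invented for this article and are not drawn from any real system or library.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

# Hypothetical illustration of GDPR-style handling: every record carries the
# purpose it was collected for and the consent given, and can be erased on request.

@dataclass
class PersonalDataRecord:
    subject_id: str
    purpose: str                # specific, declared purpose (purpose limitation)
    consented: bool             # informed consent for that purpose
    collected_at: datetime
    data: Optional[dict] = None  # the personal data itself

    def use_for(self, requested_purpose: str) -> dict:
        """Only release the data for the purpose it was collected for."""
        if not self.consented:
            raise PermissionError("No consent was given for this data.")
        if requested_purpose != self.purpose:
            raise PermissionError("Data may not be reused for a different purpose.")
        return self.data or {}

    def erase(self) -> None:
        """Honour a deletion request (right to erasure)."""
        self.data = None


record = PersonalDataRecord(
    subject_id="user-42",
    purpose="contact tracing",
    consented=True,
    collected_at=datetime.utcnow(),
    data={"location": "48.85,2.35"},
)
record.use_for("contact tracing")   # allowed: same declared purpose
record.erase()                      # deleted when no longer needed or on request
# record.use_for("ad targeting")    # would raise PermissionError
```

Real-world compliance involves far more (legal bases, retention schedules, audits), but the core idea is the same: personal data is tied to a declared purpose and can be erased on request.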

It is problematic, then, to see governments and private companies, directly and indirectly, attempting to increase their capacity to identify and track people without observing proper data protection standards. At the same time, too little effort is going into alerting and educating oblivious employees (now forced to work from home) about how their personal information is being used and exposed.

1) The resurgence of indiscriminate surveillance

The race to “flatten the curve” and halt the spread of the virus has seen facial recognition manufacturers on the rise and government officials competing to collect biometric data (see here, here, and here). While some countries are updating already-in-place infrastructure (China, Poland), others are forging new partnerships with private companies and launching trials to test new technologies, be they for checking quarantine compliance, identifying symptomatic people, or enforcing “zero-contact” retail payment methods (Malaysia, Russia, United States).

Likewise, geolocation tracking through smartphones is gaining ground in the Czech Republic, China, Israel, Italy, Singapore, South Korea, Taiwan, and the United Kingdom (see here and here). On a scale from mild to full-Orwellian State, the most concerning practices include Israel accessing and using non-aggregated data (identifying specific individuals) until the High Court of Justice put a stop to it on 19 March, and its resort to historical data on confirmed COVID-19 carriers (secretly collected before the virus emerged).

Those flawed, bigoted spy machines

Facial recognition and geolocation tracking pose severe dangers for three reasons. Firstly, these technologies are far from flawless. Who says the same person will carry their own device and leave it on at all times? Additionally, studies show that some facial recognition systems have an accuracy ratio of just 20%… except when it comes to white males. That’s right: these systems have demonstrated bias along both gender and racial lines, and are therefore more prone to making mistakes when identifying women and dark-skinned people.

Secondly, are they really essential to curb the current pandemic? It is hard to tell. While certain media outlets cite the ‘intrusive measures’ taken by South Korea as a key reason for its success against the virus, other organizations claim that there are no conclusive reports on the actual impact of these surveillance techniques on the spread of COVID-19.

Thirdly, even if they were to work, we have no guarantee that data protection standards are being followed and that our information won’t be used for different purposes in the future. The case of Israel alone shows the severity of having no solid transparency and accountability procedures in place. From unlawfully targeting dissidents to curtailing freedom of speech and movement, everything becomes possible when anonymity is jeopardized by States.

Surveillance Self-Defense

In times of a state of emergency, a healthy grasp of surveillance self-defense never goes amiss. But there are currently two caveats to consider. First, in some countries, law enforcement is making sure people comply with new restrictions (those who ‘rebel’ can get a fine or even an unexpected visit from the police). Second, tech giants like Facebook and Google, as well as companies purposefully built to collect data (such as Clearview AI), are allegedly considering supporting government overreach during this crisis.

Consequently, how well we can protect ourselves from intrusive scrutiny on all fronts is uncertain. All we can do is inform and refrain. Inform yourself about the surveillance policies being implemented in your country and the privacy policies of the apps and programs you are using. Refrain, when possible, from downloading or using apps and devices that resort to facial recognition and location tracking. Check your apps and opt out of tracking and sharing your information (yes, Google Maps as well!).

2) Passing laws to limit end-to-end encryption

On 5 March 2020, Members of the United States Congress introduced S.3398, the EARN IT Act (Eliminating Abusive and Rampant Neglect of Interactive Technologies Act). While the bill does not address the COVID-19 pandemic, it has surely profited from the lack of public scrutiny, with attention now mostly focused on the health crisis.

Protecting Encryption? You must EARN IT

Allegedly intended to prevent the sexual exploitation of children online, the bill effectively removes the protections of Section 230 (47 U.S.C. § 230) of the Communications Decency Act. Section 230, “the most important law protecting internet speech”, makes users/speakers responsible and liable for their online speech instead of the Internet companies hosting it. 

Under the EARN IT Act, Internet companies are by default liable for the content posted on their platforms, and can only enjoy the protections of Section 230 if a law enforcement commission considers them to be following a list of “best practices”. Among these best practices, says the Electronic Frontier Foundation, is offering states a backdoor (access) to encrypted messages. Accordingly, online service providers like Apple, Facebook, and Google are at a crossroads, forced to choose between censoring speech and giving governments access to user data.

United States Citizens: Ask your Representatives to Reject the Bill

In the digital world, encryption is paramount to preserving secure communications and protecting us against mass surveillance. In democratic countries like the United States, Members of Congress are representatives of the people, bound by the obligations imposed by the US Constitution. The EARN IT Act is allegedly in breach of the First and Fourth Amendments. If you are an American citizen, contact your Congressional representatives and demand that they reject the bill.
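
For readers who want to see what is technically at stake, here is a minimal sketch of end-to-end encryption, assuming the third-party PyNaCl library; the key names and message are invented for illustration. The point it makes is simple: only the two keyholders can read the message, which is precisely what a mandated backdoor would undermine.

```python
# End-to-end encryption in miniature with PyNaCl (pip install pynacl).
from nacl.public import PrivateKey, Box

alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
alice_box = Box(alice_key, bob_key.public_key)
ciphertext = alice_box.encrypt(b"meet at the usual place")

# Only Bob, holding his private key, can decrypt the message.
bob_box = Box(bob_key, alice_key.public_key)
print(bob_box.decrypt(ciphertext))  # b'meet at the usual place'

# Anyone without one of the two private keys, including the service provider
# relaying the ciphertext, sees only scrambled bytes. A backdoor would require
# an extra key or a weakened scheme, breaking this guarantee for everyone.
```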

3) Elimination of cash and introduction of Central Bank Digital Currencies

It didn’t take long before the World Health Organization (WHO) alerted the global public that banknotes, coins, and the exteriors of ATMs could be potential COVID-19 carriers. While contactless payment technology is nothing new, the political, financial, and tech elite seized the opportunity to go further and advocate for eliminating cash and introducing digital currencies altogether.

With ‘dirty’ banknotes being quarantined, washed, and even burnt, the concept of Central Bank Digital Currencies (CBDC) is winning new converts. Some States are making bold moves (such as the People’s Bank of China reportedly completing the development of its Digital Currency Electronic Payment system and drafting legislation around it), while others remain timid (the American Congress has yet to approve legislation on a Digital Dollar, and the Federal Reserve is cautious about revealing concrete CBDC plans).

In the middle, there is a whole array of experimenters working solo and teaming up. These include the European Central Bank (whose CBDC working paper released in January 2020 is now gaining traction), the Bank of France publicly calling for digital currency proposals, and the Bank of England actively researching the issue. Six Central Banks have also joined forces to assess potential use-cases of CBDCs in their jurisdictions.

Surveillance Exclusionary Economy

While it remains to be seen if any of these attempts will actually materialize, getting rid of cash and adopting CBDCs comes with serious privacy risks.  

In today’s economy, cash is crucial to our individual freedom for several reasons: anonymity, ‘permissionless’ transactions, and inclusivity. We can keep our transactions and purchases anonymous, unregulated, and uncensored. We don’t need to present paperwork (like birth certificates or IDs) to a financial institution, nor do we depend on a device to use it. It is available even to the “unbanked population” (usually lower-income and/or undocumented people who are unable to open a bank account or get a credit card).

And even if some financial institutions speak of CBDCs operating on digital ledgers, by definition, digital currencies backed by Central Banks will not have all the properties of blockchain-based cryptocurrencies like Bitcoin or Ethereum, and that is a problem. Ironically enough, this point was raised by Harro Boven, a policy adviser at the Dutch Central Bank. As guardians of national monetary policy and currency-price stability, Central Banks may release more efficient and ‘sanitized’ mediums of exchange, but those will be neither decentralized nor permissionless.

Where would that leave us? With more personal data and recorded transactions handed over to institutions. We would see an economy of less anonymity, with more surveillance and exclusion hitting some harder than others. Think about the sobering example given by Alex Tapscott, Co-Founder of the Blockchain Research Institute: what will happen to the “identities of women who want to buy contraceptives in communities that condemn birth control or sex outside marriage?”

Time to learn about crypto, stable coins, and DeFi

To defend economic freedom, education is our most powerful tool, and it’s right before our eyes. Luckily for us, for the first time in the History of Money, since the 2008 publication of the paper Bitcoin: A Peer-to-Peer Electronic Cash System, we have the opportunity to redefine the way in which we want to conduct financial transactions. Tap into the concept of blockchain technology; cryptocurrencies like Bitcoin, Ethereum, and the anonymity-preserving Monero; the so-called Stable Coins; and the idea of DeFi (decentralized finance). Not sure where to start? Here’s a witty conversation to get going!
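
As a first taste of why “decentralized” and “tamper-evident” matter, here is a toy Python sketch using only the standard library. It is not the code of any real cryptocurrency; it only illustrates the hash-chaining idea behind blockchains: each block commits to the previous one, so anyone holding a copy of the chain can detect tampering with past records.

```python
# Toy hash chain: each block's hash covers its transactions and the previous hash.
import hashlib
import json


def make_block(prev_hash: str, transactions: list) -> dict:
    body = {"prev_hash": prev_hash, "transactions": transactions}
    block_hash = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return {**body, "hash": block_hash}


def verify(chain: list) -> bool:
    """Recompute every hash and check each block points at its predecessor."""
    for i, block in enumerate(chain):
        body = {"prev_hash": block["prev_hash"], "transactions": block["transactions"]}
        if hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() != block["hash"]:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True


genesis = make_block("0" * 64, [{"from": "alice", "to": "bob", "amount": 5}])
second = make_block(genesis["hash"], [{"from": "bob", "to": "carol", "amount": 2}])
chain = [genesis, second]

print(verify(chain))                        # True
chain[0]["transactions"][0]["amount"] = 500  # try to rewrite history
print(verify(chain))                        # False: tampering is detectable
```

Real cryptocurrencies add consensus, signatures, and peer-to-peer replication on top of this, which is what makes them permissionless; a centrally issued CBDC, by contrast, keeps a single institution in control of the ledger.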

4) Working from home

Governments all around the world are either encouraging or mandating people to work from home, and companies and organizations are suddenly being forced to adapt quickly to new human resources and management practices outside of the traditional office environment. Digital platforms like Slack and Zoom are gaining popularity, but not enough awareness is being raised about the resulting threats to workers’ digital privacy, or about how these tools might become fertile breeding grounds for Big Brother.

New office, new threats

Catalystas compiled a list of the most popular apps being used right now by digital newcomers. The Electronic Frontier Foundation has done a great job of highlighting how some of these might be collecting personal data, including through data retention or attendee attention-tracking (potentially in contravention of the EU GDPR standards mentioned above), and how to use them carefully.

Cyber hygiene and honest apps

Our team has also issued recommendations to maintain proper cyber hygiene and keep your data, software, and hardware protected. If you have the ability to choose lesser-known and more privacy-conscious apps, check Vivaldi’s advice for some inspiration. 

[Go to Part 2 of this article to learn how, during the pandemic, governments are not abiding by transparency standards in law and policy-making, and why this crisis is not only unprecedented but intersectional]

Sofía Cossar

Principal & Operations Manager
Paris, France

Sofía is a professional with more than six years of experience in research, project management, fundraising, training & capacity building, public speaking, and communications strategy design in the fields of international law, democracy and governance, and new technologies. She holds a Master’s in Law applied to International Security from Vrije Universiteit Amsterdam, the Netherlands, and a Bachelor of Science in International Relations from Universidad Católica de Córdoba, Argentina.

A doctoral student in legal theory and legal tech, she is part of the BlockchainGov project, an interdisciplinary effort funded by the European Research Council and comprising members of Harvard University and MIT.

She started out working in The Hague in the field of law-making, public international law, and human rights with major international non-governmental organizations and international organizations, as well as members of parliament from democratic countries all over the world.

More recently, she has collaborated with startups, social enterprises, civil society organizations, and academic networks in the Global North working on blockchain-based applications to identity, voting, representation, and currency. Sofia has also advised NGOs promoting sexual and reproductive health and rights in the Global South.