When US President Donald Trump announced that “we’re waging a war against the invisible enemy”, it became clear that the COVID-19 pandemic is no longer considered just a health or socio-economic crisis. It is a matter of national security.
Securitizing the issue through a war-like narrative, a framing echoed by key political leaders in both the Western and Eastern worlds (including French President Emmanuel Macron and Chinese President Xi Jinping), has had one immediate consequence: executive power has been entrusted to leaders to take extraordinary measures in favor of the public good, including the (temporary) suspension of certain human rights and freedoms deemed necessary to contain the imminent threat.
However, under international law, when a state of exception (état d’exception) is put in place, governments must observe standards of legality, necessity, and proportionality to ensure that the emergency is addressed with the minimum possible restrictions within the rule of law.
The global reaction to the coronavirus outbreak, by both democratic and authoritarian governments, is raising concern. In times of social distancing, as more of our interactions move into digital space, we are seeing unjustified infringements of the right to privacy and of the right of access to information and participation in public affairs.
More data in the hands of the public sector translates into a higher governmental capacity to control the population, which in turn grants even more power. For the private sector, it signifies more profit. And while making our private data public and the public data private has severe implications for all of us, the effects are inevitably felt far more strongly by vulnerable parts of the population, such as women and marginalized ethnic or religious groups. Make no mistake: this is an intersectional crisis.
Below is an overview of concerning data practices to date, why those are problematic, and how you can protect yourself and defend your rights.
Our daily life is surrounded by pieces of software and hardware that collect, use, share, and store our data. Biometrics, location, medical records, financial transactions, conversations: it seems almost impossible to leave any aspect of our lives untouched by ICT. Although the call for more solid data protection has been gaining momentum since the digital age began, it stems from the preexisting fundamental right to privacy recognized in various bodies of international, supranational, and domestic law, including international human rights law and constitutional law.
Above all, not being subjected to arbitrary or unlawful interference with privacy is a universal human right enshrined in Article 17(1) of the International Covenant on Civil and Political Rights (ICCPR), and a core principle of democratic societies. Translating this right into the digital present, the European Union has laid out several standards in its General Data Protection Regulation (GDPR). Whenever personal data is collected, the person in question must be informed of the specific purpose of its use and consent to that use. The data must only be used for that specific purpose, it must be kept up to date and accurate, and it must be deleted when the person in question requests it or when it is no longer needed for the purpose it was intended to serve.
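To make these principles more concrete, here is a minimal, hypothetical sketch in Python of how consent, purpose limitation, and storage limitation might be modeled in code. The class and field names (PersonalDataRecord, retention_days, and so on) are illustrative assumptions, not drawn from the GDPR text or any real system.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical illustration of GDPR-style principles: consent,
# purpose limitation, and storage limitation. Names are illustrative only.

@dataclass
class PersonalDataRecord:
    subject_id: str          # the person the data is about
    data: dict               # the personal data itself
    purpose: str             # the specific purpose consented to
    consent_given: bool      # explicit, informed consent
    collected_at: datetime   # when the data was collected
    retention_days: int      # how long the purpose justifies keeping it

    def may_process(self, requested_purpose: str) -> bool:
        """Processing is allowed only with consent and for the original purpose."""
        return self.consent_given and requested_purpose == self.purpose

    def must_delete(self, now: datetime, erasure_requested: bool) -> bool:
        """Delete on request, or once the retention period has lapsed."""
        expired = now > self.collected_at + timedelta(days=self.retention_days)
        return erasure_requested or expired
```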
It is problematic, then, to see governments and private companies, directly and indirectly, attempting to increase their capacity to identify and track people without observing proper data protection standards. At the same time, too little effort is going into alerting and educating oblivious employees (now forced to work from home) about how their personal information is being used and exposed.
The race to “flatten the curve” and put a halt to the spread of the virus has seen facial recognition manufacturers on the rise and government officials competing to collect biometric data (see here, here, and here). While some countries are updating already-in-place infrastructure (China, Poland) others are forging new partnerships with private companies and launching trials to test new technologies, be they for checking quarantine compliance, identifying symptomatic people, or enforcing “zero-contact” retail payment methods (Malaysia, Russia, United States).
Likewise, geolocation tracking through smartphones is gaining ground in the Czech Republic, China, Israel, Italy, Singapore, South Korea, Taiwan, and the United Kingdom (see here and here). On a scale from mild to full-Orwellian State, the most concerning practices include Israel’s access to and use of non-aggregated data (identifying specific individuals), until the High Court of Justice put a stop to it on 19 March, and its reliance on historical data on confirmed COVID-19 carriers (secretly collected before the emergence of the virus).
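The gap between aggregated and non-aggregated location data is easier to grasp with a small illustration. The Python sketch below uses made-up data and field names: the same raw location pings can either reconstruct one identifiable person’s movements or be reduced to anonymous per-area counts.

```python
from collections import Counter

# Illustrative only: made-up location "pings" tied to individual devices,
# versus an aggregated view that only counts devices per area.
raw_pings = [
    {"device_id": "A-123", "area": "district-1", "time": "2020-03-19T09:00"},
    {"device_id": "A-123", "area": "district-2", "time": "2020-03-19T12:00"},
    {"device_id": "B-456", "area": "district-1", "time": "2020-03-19T09:30"},
]

# Non-aggregated data: the full movement trail of one identifiable device.
trail = [p for p in raw_pings if p["device_id"] == "A-123"]

# Aggregated data: how many distinct devices were seen in each area,
# with no way to tell who went where.
devices_per_area = Counter(
    area for _, area in {(p["device_id"], p["area"]) for p in raw_pings}
)

print(trail)              # reconstructs one person's day
print(devices_per_area)   # Counter({'district-1': 2, 'district-2': 1})
```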
Facial recognition and geolocation tracking pose severe dangers for three reasons. Firstly, these technologies are far from flawless. Who says a person will always carry their own device and leave it on at all times? Additionally, studies show that some facial recognition systems have an accuracy rate of just 20%… except when it comes to white males. That’s right: these systems have demonstrated both gender and racial bias, and are therefore more prone to making mistakes when identifying women and dark-skinned people.
Secondly, are they really essential to curb the current pandemic? It is hard to tell. While certain media outlets cite the ‘intrusive measures’ taken by South Korea as a key reason for their success against the virus, other organizations claim that there are no conclusive reports on the actual impact of these surveillance techniques on the spread of COVID-19.
Thirdly, even if they were to work, we have no guarantee that data protection standards are being followed and that our information won’t be used in the future for different purposes. The case of Israel alone shows the severity of having no solid transparency and accountability procedures in place. From unlawfully targeting dissidents to curbing freedom of speech and movement, everything becomes possible when States jeopardize anonymity.
In a state of emergency, a healthy grasp of surveillance self-defense is never a bad idea. But there are currently two caveats to consider. First, in some countries, law enforcement is making sure people comply with the new restrictions (those who ‘rebel’ can face a fine or even an unexpected visit from the police). Second, tech giants like Facebook and Google, as well as companies purposefully built to collect data (such as Clearview AI), are allegedly considering supporting government overreach during this crisis.
Consequently, it is uncertain how much we can safely protect ourselves from intrusive scrutiny on all fronts. All we can do is inform and refrain. Inform yourself about the surveillance policies being implemented in your country and the privacy policies of the apps and programs you are using. Refrain, when possible, from downloading or using apps and devices that rely on facial recognition and location tracking. Check your apps and opt out of tracking and sharing your information (yes, Google Maps as well!).
On 5 March 2020, Members of the United States Congress introduced S.3398, the EARN IT Act (Eliminating Abusive and Rampant Neglect of Interactive Technologies Act). While the bill does not address the COVID-19 pandemic, it has certainly benefited from the lack of public scrutiny, with attention now mostly focused on the health crisis.
Allegedly intended to prevent the sexual exploitation of children online, the bill effectively removes the protections of Section 230 (47 U.S.C. § 230) of the Communications Decency Act. Section 230, “the most important law protecting internet speech”, makes users/speakers responsible and liable for their online speech instead of the Internet companies hosting it.
Under the EARN IT Act, Internet companies are by default liable for the content posted on their platforms, and can only enjoy the protections of Section 230 if a law enforcement commission considers them to be following a list of “best practices”. Among these best practices, says the Electronic Frontier Foundation, is offering States backdoor access to encrypted messages. Accordingly, online service providers like Apple, Facebook, and Google are at a crossroads, forced to choose between censoring speech and giving governments access to user data.
In the digital world, encryption is paramount to preserving secure communications and protecting us against mass surveillance. In democratic countries like the United States, Members of Congress are representatives of the people, bound by the obligations imposed by the US Constitution. The EARN IT Act is arguably in breach of the First and Fourth Amendments. If you are an American citizen, contact your Congressional representatives and demand that they reject the bill.
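To illustrate what is at stake with encryption backdoors, the sketch below uses the Fernet symmetric-encryption primitive from Python’s cryptography library. It is a conceptual sketch, not a description of how any particular messaging service implements end-to-end encryption: the point is simply that whoever holds the key can read the message, and a mandated backdoor means someone else always holds a working key.

```python
# pip install cryptography
from cryptography.fernet import Fernet, InvalidToken

# Conceptual sketch: symmetric encryption with a single secret key.
key = Fernet.generate_key()          # known only to the communicating parties
ciphertext = Fernet(key).encrypt(b"meet at the usual place at noon")

# Anyone intercepting the message sees only opaque bytes.
print(ciphertext)

# Only a holder of the key can decrypt it...
print(Fernet(key).decrypt(ciphertext))

# ...and a wrong key simply fails. A legally mandated "backdoor" would mean
# that some third party also holds a working key, for every conversation.
try:
    Fernet(Fernet.generate_key()).decrypt(ciphertext)
except InvalidToken:
    print("cannot decrypt without the key")
```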
It didn’t take long for the World Health Organization (WHO) to alert the global public that banknotes, coins, and the exteriors of ATMs could be potential COVID-19 carriers. While contactless payment technology is nothing new, the political, financial, and tech elite seized the opportunity to go a step further and advocate for the elimination of cash and the introduction of digital currencies altogether.
With ‘dirty’ banknotes being quarantined, washed, and even burnt, the concept of Central Bank Digital Currencies (CBDC) is winning new converts. Some States are making bold moves (such as the People’s Bank of China reportedly completing the development of its Digital Currency Electronic Payment system and drafting legislation around it), others more timid ones (the United States Congress has yet to approve legislation on a Digital Dollar, and the Federal Reserve remains cautious about revealing concrete CBDC plans).
In between, there is a whole array of experimenters working solo and teaming up. These include the European Central Bank (whose CBDC working paper released in January 2020 is now gaining traction), the Bank of France publicly calling for digital currency proposals, and the Bank of England actively researching the issue. Six Central Banks have also joined forces to assess potential use cases for CBDCs in their jurisdictions.
In today’s economy, cash is crucial to our individual freedom for several reasons: anonymity, ‘permissionless’ transactions, and inclusivity. We can keep our transactions and purchases anonymous, unregulated, and uncensored. We don’t need to present paperwork (like birth certificates or IDs) to a financial institution or depend on a device in order to use it. And it is available even to the “unbanked” population (usually lower-income and/or undocumented people who are unable to open a bank account or get a credit card).
And even if some financial institutions are talking about CBDCs operating on digital ledgers, by definition, digital currencies backed by Central Banks will not have all the properties of blockchain-based cryptocurrencies like Bitcoin or Ethereum, and that is a problem. Ironically enough, this point was raised by Harro Boven, a policy adviser at the Dutch Central Bank. As the guardians of national monetary policy and currency-price stability, Central Banks may release more efficient and ‘sanitized’ mediums of exchange, but those will by no means be decentralized or permissionless.
Where would that leave us? With more personal data and recorded transactions handed over to institutions. We would see an economy of less anonymity, with more surveillance and exclusion hitting some harder than others. Think about the sobering example given by Alex Tapscott, Co-Founder of the Blockchain Research Institute: what will happen to the “identities of women who want to buy contraceptives in communities that condemn birth control or sex outside marriage?”
To defend economic freedom, education is our most powerful tool, and it’s right before our eyes. Luckily for us, for the first time in the history of money, following the 2008 publication of the paper Bitcoin: A Peer-to-Peer Electronic Cash System, we have the opportunity to redefine the way we conduct financial transactions. Tap into the concept of blockchain technology; cryptocurrencies like Bitcoin, Ethereum, and the anonymity-preserving Monero; the so-called stablecoins; and the idea of DeFi (decentralized finance). Not sure where to start? Here’s a witty conversation to get going!
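If “blockchain” still feels abstract, the toy Python sketch below may help build intuition. It is illustrative only, and not how Bitcoin actually works: it simply chains records together with hashes so that tampering with any earlier entry invalidates everything that follows.

```python
import hashlib
import json

# Toy illustration of a hash-linked chain of records. Real blockchains add
# consensus, proof-of-work or proof-of-stake, and peer-to-peer networking.

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, data: str) -> None:
    previous = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": previous})

def is_valid(chain: list) -> bool:
    # Every block must reference the hash of the block before it.
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain: list = []
add_block(chain, "Alice pays Bob 1 unit")
add_block(chain, "Bob pays Carol 1 unit")
print(is_valid(chain))                           # True

chain[0]["data"] = "Alice pays Bob 100 units"    # tampering...
print(is_valid(chain))                           # False: the chain detects it
```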
Governments around the world are either encouraging or mandating people to work from home, and companies and organizations are suddenly being forced to quickly adapt to new human resources and management practices outside of the traditional office environment. Digital platforms like Slack and Zoom are gaining popularity, but not enough awareness is being raised about the resulting threats to workers’ digital privacy, and about how these tools might become fertile breeding grounds for Big Brother.
Catalystas compiled a list of the most popular apps being used right now by digital newcomers. The Electronic Frontier Foundation has done a great job of highlighting how some of these might be collecting personal data, including through data retention or attendee attention-tracking (potentially in contravention of the EU GDPR standards mentioned above), and how to use them carefully.
Our team has also issued recommendations to maintain proper cyber hygiene and keep your data, software, and hardware protected. If you have the ability to choose lesser-known and more privacy-conscious apps, check Vivaldi’s advice for some inspiration.
[Go to Part 2 of this article to learn how, during the pandemic, governments are not abiding by transparency standards in law and policy-making, and why this crisis is not only unprecedented but intersectional]