This article was originally published on medium.com on April 10, 2020 by Roxana Nasoi.
“Nothing generates more engagement than lies, fear, and outrage” — Sacha Baron Cohen
Today, in “the era of COVID-19”, fear of an invisible threat, outrage at a failing healthcare system of global proportions, misinformation, and biased data interpretations — hinting here at the “terror in numbers”, as mentioned by Darrell Huff in “How to Lie with Statistics” — have all created the perfect context for governments to implement mass surveillance programs, hand in hand with tech companies. This piece is about the PII (personally identifiable information) collected through these state programs, the underlying danger of surveillance used for hidden agendas, and why cryptography is our ally in protecting our privacy.
This piece was originally published on the Tagion Forum.
The Governments’ Response to the Coronavirus Pandemic
It is not the first time I’m voicing my worries (often shared by other privacy advocates) about how governments are responding to this pandemic. What we are seeing is a global tendency towards mass surveillance programs — in both authoritarian (China, Iran) and democratic countries (Israel, South Korea, India, Norway, the US, the UK, South Africa, etc.). Governments are teaming up with technology providers, with little to no regard for sensitive personally identifiable information (medical records, identity, location).
China has taken mass surveillance to an all-time high, using AI, facial recognition, tracking devices, a database of tracked data, identity authentication in public areas, and 24/7 monitoring of its population. This is the only “transparency” the world is granted whenever someone discusses “flattening the curve” in China.
Next, we have democratic countries where surveillance was implemented overnight, mostly relying on GPS data (the US), cellphone tracking and tracing (South Africa), topped by army and police patrols (EU), total quarantines (Philippines), lockdowns (the UK & rest of the world), ankle monitors (the US), forms in which you declare yourself fully accountable for leaving the house (Romania, India). Neighbors reporting on one another for “breaching curfew” (the US, Italy). Governments declaring that the current surveillance laws could expand to 2021 (Norway). Phone calls and police visits if your cellphone is turned off (South Korea). Travel restrictions (everywhere, basically) and new arrivals from abroad required to wear electronic bracelets (Hong Kong).
Meanwhile, the healthcare system has, in most cases, remained at the same efficiency level as before. As fragmented as before. As expensive as before. As unequal as before. As corrupt as before. The sudden country-by-country lockdowns have left millions unemployed, stuck at home, with little to no alternatives. This makes us all worry about the aftermath of COVID-19, where a virus becomes a symbol of inefficient leadership, economic collapse, political instability, and the absence of feasible solutions at a global scale. The truth is that this pandemic has shown how unprepared society is in its current economic, social, and political design.
Paying people to #stayathome through some form of universal basic income scheme may act as a band-aid, but it will not close the wound while funds aren’t assigned to continuous testing of the entire population to gather fresh, raw data that can be used in research.
Privacy in the Time of COVID-19
One of the most circulated questions in the privacy community these days is whether privacy will survive the Coronavirus, or whether we will see it bleeding and gradually dying. Nobody knows what the world will look like five years from now. However, the current hasty actions by governments, the lack of proper funding and support for privacy tech, and the rise of digital data-driven business models with complete disregard for privacy and encryption — none of these help us paint a positive scenario.
I raised concerns that privacy laws are not actual preventive measures, but rather apply punishment post-breach, during my TEDx Talk in February. How does the law act when the government itself is the one not following the rules? We are seeing massive lobbying towards postponing privacy laws such as the CCPA, and equally massive lobbying towards an end to encryption as we know it (the EARN IT Act in the US). Yet we are forgetting that…
Since 2013, almost 15 billion data records have been breached or stolen. Only 4% of those were “secure breaches”, where encryption was present and the data deemed “useless” to attackers.
Personal Note: While we need raw data to better identify potential solutions, our identities should be protected by default. Sensitive data — medical data, identity-related data, location — should be anonymized, randomized, and rerouted so that the possibility of re-identification drops below 1%. What is more, data should be stored in secure environments. Most databases fail to offer the required security measures against ill-intended actors looking to exploit security loopholes.
But privacy isn’t just about what is happening with these state surveillance programs popping up everywhere, it’s also about the current infrastructure. As work moves online (for those who can earn a living while working remotely), current remote work infrastructure solutions are faulty and cannot support the influx of new users. Video conferencing tools aren’t the only ones that fail to address encryption and privacy. We are looking at team management tools, email, cloud servers, even simple e-commerce platforms that are dealing with poor encryption by default.
The bigger they are, the harder they fall. Remember how Facebook justified the lack of encrypted data in its products by pointing to the massive amount of backend work, and the effort required from a senior team, to “simply apply” E2E encryption to an existing system with millions or billions of users? And argued that WhatsApp had it from the start, which made it easy? It will take years to see Facebook embrace encryption, and if the EARN IT Act passes, it will not even need to bother.
As Shoshana Zuboff states in her book “Surveillance Capitalism”, we are all “raw material”.
Cryptography — The Response to the “Raw Material” Problem
Cryptography — a field with heavy academic origins — started gaining commercial interest thanks to the cypherpunk movement of the ’90s and its implementation in today’s protocols (Bitcoin, privacy coins, DLT, privacy tools). Why am I referring to cryptography and not just encryption, its major subset? Because merely encrypting data is not enough to solve the privacy concerns we have with current COVID-19 solutions.
We need privacy addressed at a protocol and system level, where cryptography solves the issues of data confidentiality, data integrity (encryption and decryption provide key roles here), authentication, validation, and so on. Such solutions would aim at creating digital data and communication streams that can in effect “protect themselves”.
So far, there are some solutions and ideas in the works specifically for COVID-19. One, from MIT, is to protect the infected individual’s PII through redaction and hashing, while the healthy individual keeps their privacy intact by running all matching calculations on their own phone, offline. This could be done by sorting the location information of positive cases and hashing each piece of location and time data with a one-way function (meaning it is irreversible — it cannot be traced back to the actual identity or PII of the user), so that each user’s historical location and timestamp data is converted into a unique number. Other hashed location-and-timestamp pieces can then be compared, matched, and downloaded locally on the individual’s device.
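To make the idea concrete, here is a minimal Python sketch of the hashing-and-matching step. The coordinate rounding, the 15-minute time buckets, and the data layout are my own illustrative assumptions, not details from the MIT proposal:

```python
import hashlib

def hash_point(lat: float, lon: float, timestamp: int) -> str:
    """One-way hash of a coarsened location/time pair.

    Coordinates are rounded and the timestamp is bucketed into
    15-minute windows, so two nearby observations hash identically.
    (The granularity here is an illustrative assumption.)
    """
    bucket = timestamp // 900  # 900 seconds = 15-minute bucket
    payload = f"{round(lat, 3)}:{round(lon, 3)}:{bucket}"
    return hashlib.sha256(payload.encode()).hexdigest()

# Redacted, hashed trail published for a confirmed case.
infected_trail = {hash_point(55.676, 12.568, 1586520000)}

# A healthy user's phone hashes its own history offline and checks
# for intersections locally -- no raw PII ever leaves the device.
my_history = [(55.676, 12.568, 1586520300), (55.700, 12.500, 1586530000)]
matches = [p for p in my_history if hash_point(*p) in infected_trail]
```

One honest caveat: location/time tuples are low-entropy, so plain hashing alone can be brute-forced by an attacker who enumerates plausible inputs; real deployments would layer redaction, keyed hashing, or both on top of this.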
Personal Note: It is unclear whether sharding would be used here or how, and whether the hashes would be stored on centralized servers or there is an openness to using distributed ledger technology. Hashes could be stored on a Bitcoin sidechain, for example, or in a Tagion sub-DART*. There is still the concern of whether hashes related to COVID-19 data should be stored indefinitely on an immutable blockchain, or removed once they serve their purpose (weeks, months, or years after the recorded event occurred).
sub-DART* — a distributed database in the Tagion protocol with its own governance and rules, allowing for the removal of any data once “x” time has passed.
Another solution would be hashing servers that distribute authority and split the data. If this sounds hard to follow, imagine a company with a financial headquarters and an operational headquarters, and replace them with a hashing server and a storage server, controlled by different organizations but serving together within the same privacy protocol. The hashing server would handle encrypted location data and hold a secret key, which means the hashed data on the hashing server would not be accessible to the storage server.
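A rough sketch of that split of authority, in Python. The keyed hashing (HMAC) and the class names are my own assumptions about how such a scheme could work; the point is only that the storage server never sees the key:

```python
import hashlib
import hmac
import secrets

class HashingServer:
    """Holds the secret key; turns raw locations into keyed hashes."""
    def __init__(self):
        self._key = secrets.token_bytes(32)  # never shared with storage

    def pseudonymize(self, location: str) -> str:
        # Keyed hash: without the key, the storage server cannot
        # brute-force low-entropy locations back out of the tags.
        return hmac.new(self._key, location.encode(), hashlib.sha256).hexdigest()

class StorageServer:
    """Stores opaque tags only; operated by a different organization."""
    def __init__(self):
        self._records: set[str] = set()

    def store(self, tag: str) -> None:
        self._records.add(tag)

    def seen(self, tag: str) -> bool:
        return tag in self._records

hasher = HashingServer()
store = StorageServer()
store.store(hasher.pseudonymize("55.676,12.568"))
# Only someone going through the hashing server can query a location:
match = store.seen(hasher.pseudonymize("55.676,12.568"))
```

The design choice mirrors the two-headquarters analogy: compromising either server alone yields either a key with no data, or data with no key.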
Personal Note: while this could work for non-IoT devices, it would still be a nightmare for IoT, as it cannot guarantee full privacy — mainly because IoT requires complex end-to-end encryption for devices and their “many-to-one” and “one-to-many” data flows (device, cloud server, manufacturer servers). Teserakt E4 was working on a soon-to-be-open-source solution to integrate an implant into IoT manufacturer servers, but that is nowhere near even the testing phase. On top of that, people loathe 5G, so the infrastructure needed for IoT will take some time to reach your friendly county.
Another solution, from the University of Pennsylvania, the University of Toronto, and McGill University, is to use “mix nets”, similar to how the Tor network operates. Again, redacted, hashed location data of infections would be transmitted to healthcare authorities through a collection of at least three servers controlled by different entities. Each server would mix up the hashes before passing them on to the next. By the time the hashes reach the government authority’s server, the last server cannot associate specific hashes with specific users.
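The core trick can be sketched in a few lines of Python. This is a simplification of my own: real mix nets also re-encrypt each message at every hop, whereas here the servers only shuffle, which is enough to show why the last hop loses the link between arrival order and sender:

```python
import random

def mix_server(batch: list[str]) -> list[str]:
    """One hop: shuffle the batch, breaking arrival-order linkage."""
    out = batch[:]
    random.shuffle(out)
    return out

# Hashed, redacted reports entering the first server in a known order.
hashes = [f"report-hash-{i}" for i in range(5)]

batch = hashes
for hop in range(3):  # minimum of three independently run servers
    batch = mix_server(batch)

# The authority receives the same set of hashes, but the ordering that
# could tie a hash to the phone that submitted it is gone.
```

Note that shuffling only helps when batches are large; with a single report per batch, there is nothing to hide in, which is one reason a couple of servers and few users would not sustain the scheme.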
Personal Note: It is still unclear whether such a solution would suffer from the major drawbacks we see today with Tor, because clearly just a couple of servers won’t be able to sustain it. Eventually, once people return to a somewhat normal life and leave their houses, we may need ever more servers. Currently, the drawbacks of onion-like networks (Tor) are bandwidth performance, high latency, traffic analysis by ISPs, additional SSL and SSH certificates required if any confidential data is involved, and, obviously, Tor traffic being blocked by certain websites. If we end up asking people to run their own node in a mix net, can we be sure the “anonymized” mix nets for COVID-19 and other viruses won’t be “identified” as Tor networks? I have my doubts other networks could differentiate, but I could be wrong here. However, I would not rule out the mix net solution completely.
Still, efforts are being made across the globe, including in areas where privacy laws do exist — laws that render any tracing or tracking technology invasive unless privacy is implemented at the protocol level. The EU is working on an open-source initiative dubbed PEPP-PT (Pan-European Privacy-Preserving Proximity Tracing — we are not that creative when it comes to names; see the AI HLEG ethics group proposal from 2019).
This initiative looks at Bluetooth signal tracing, which appears to be a spin-off of the Bluetooth contact tracing solution proposed by COVID-Watch. Bluetooth, in both cases, plays the role of a proximity detector, without recording location data or tying the communication to the individual’s identity. If a contact event occurs, the event is recorded with a unique random number (URN), which the participating parties exchange. If infected, a user would request a unique confirmation code (UCC) from a healthcare officer and use it to unlock their contact event numbers (URNs) and upload them to a server. The server would then send a request to all the phones in the network, check for matches with other users, and alert them in the app.
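The URN/UCC flow above can be sketched roughly as follows. The server class, method names, and token sizes are my own illustrative assumptions, not the actual COVID-Watch or PEPP-PT specification:

```python
import secrets

def new_contact_event() -> str:
    """Unique random number (URN) generated when two phones come close.
    Both phones store the same URN; no identity or location is attached."""
    return secrets.token_hex(16)

class TracingServer:
    def __init__(self):
        self.reported: set[str] = set()   # URNs uploaded by confirmed cases
        self.valid_uccs: set[str] = set() # codes issued by healthcare staff

    def issue_ucc(self) -> str:
        """A healthcare officer hands a confirmation code to a positive case."""
        ucc = secrets.token_hex(8)
        self.valid_uccs.add(ucc)
        return ucc

    def upload(self, ucc: str, urns: list[str]) -> bool:
        if ucc not in self.valid_uccs:
            return False  # uploads require a valid, unused code
        self.valid_uccs.remove(ucc)
        self.reported.update(urns)
        return True

    def check(self, my_urns: list[str]) -> bool:
        """A contact's phone asks: do any of my stored URNs match a case?"""
        return any(u in self.reported for u in my_urns)

server = TracingServer()
urn = new_contact_event()      # exchanged during a proximity event
ucc = server.issue_ucc()       # given to the user who tested positive
server.upload(ucc, [urn])      # infected user uploads their URNs
exposed = server.check([urn])  # the other phone now finds a match
```

Gating uploads behind a single-use UCC is what keeps trolls from flooding the system with fake “infected” URNs, while the server still learns nothing but random numbers.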
Personal Note: KNOB (Key Negotiation of Bluetooth) was an exploit that allowed hackers in physical proximity to force your device to use weaker encryption when it connected. Considering that users need to keep their Bluetooth on and be in proximity to each other for 15 minutes for the COVID-19 app to work, this raises a serious security flag, IMHO. Add to this that connected Bluetooth devices broadcast their identity in a detectable way: if a nearby hacker sends an invalid public key to the Bluetooth device, the current session key can be determined, which means any transmitted data can be decrypted. This leaves me a bit concerned, I must confess, because in general I don’t trust manufacturers and there are no strict security guidelines in place for Bluetooth devices. I will not rule out this potential solution, though, just for the sake of diversity.
The bottom line is that we have some propositions as alternatives to the invasive privacy-tracking applications that governments are trying to push onto us in a simulation of communism 101.
Now is the time to act.
WE as consumers have a lot of power. IF we demand it, developers and cryptographers will build it. IF we use these tools once they are out there, maybe more funds will be poured into privacy tech.
IF we demand privacy tools that allow us to work together in keeping each other safe, without compromising our freedoms and our sensitive identifiable data, we can build an actual future.
Over, and out.
If you’d like to extend the discussions, you can reach me on:
Tagion Forum: https://forum.tagion.org/t/privacy-corner/
Tagion Telegram Chat: https://t.me/TagionChat
Recommended further reading, and sources that inspired this piece: