Elizabeth Renieris is a Senior Fellow at the AI Ethics Institute at the University of Oxford. She is also a data protection and privacy attorney. Her work focuses on the ethical and human rights implications of emerging technologies such as artificial intelligence and machine learning, digital identity and augmented reality technology.
Below, Elizabeth shares five key insights from her new book, Beyond Data: Restoring Human Rights at the Dawn of the Metaverse. Listen to the audio version read by Elizabeth herself in the Next Big Idea app.
1. We are obsessed with data control.
When we think about privacy today, we may reflect on our lack of it, especially in relation to the way we are tracked and monitored online through the digital breadcrumbs we leave behind. Companies like Facebook, Google, and Amazon, and concepts like surveillance capitalism and targeted advertising, may come to mind. We might think of the endless stream of high-profile data breaches exposing consumer data, health data, and financial data, or of scandals like Cambridge Analytica. After all, we are said to live in a “data-driven” world powered by “big data,” where “data is the new oil” and “data is power.” For some, this means privacy is dead; for others, it means they must “own their data” to take it back.
Our laws equally focus on the management of personal data, i.e., how personal data is collected, shared, stored, and otherwise processed. The law provides for little-read notices about how our data is used and requires perfunctory, often nonsensical consent for its processing. The law requires companies to keep our data secure and confidential from third parties, while imposing few restrictions on what they themselves do with it. Landmark regulations such as Europe’s General Data Protection Regulation (GDPR) and copycat laws around the world provide for “data subject rights”: the theoretical rights of individuals to access, rectify, erase, and transfer data from the party processing it. While the U.S. still awaits federal privacy legislation, such rights can prove difficult to enforce in practice. We seem to believe that being in control of our data will protect us from technology-related harm and abuse.
2. Privacy is much more than just control over your data.
Privacy is a broader and older concept rooted in constitutional and human rights law. In 1948, in the aftermath of war, the United Nations General Assembly adopted the Universal Declaration of Human Rights, which proclaimed the right to privacy as a fundamental human right. During World War II, censuses, national registration, and conscription for military service had become commonplace in many parts of the Western world. The trauma of two successive world wars, particularly the Nazis’ racially motivated atrocities, helped strengthen the international consensus on human rights and shaped the values that inform modern notions of privacy.
“Advances in computing and networking technology in the second half of the 20th century challenged these traditional notions of privacy.”
Like earlier national constitutions, international human rights law conceived of privacy as protecting the family, home, and correspondence from state interference. In other words, privacy was considered necessary to maintain a zone or realm around an individual’s inner self and private life: to protect an individual’s physical person, home, and family life; to establish boundaries that underlie the exercise and enjoyment of other fundamental rights and freedoms; to shield individuals from discrimination and harassment; and, ultimately, to safeguard the individual freedom and autonomy that a fully functioning democracy requires.
Advances in computing and networking technology in the late 20th century challenged traditional notions of privacy, which, before the advent of electronic and digital communications, still presupposed physical interference or intrusion. The notion of data protection introduced in response, though derived from the human right to privacy, was much narrower. This ushered in an approach to technology governance centered on data: the proliferation of data-focused principles, laws, regulations, and policies.
3. Modern data protection laws are based on an outdated worldview.
Early data protection laws emerged in the early 1970s, before personal computers were widespread. Back then, data about people was collected by known entities and stored in well-defined databases, both analog and digital, for clearly defined purposes. It was a world in which personal data flows could be mapped and the “online” and “offline” environments could be kept separate. These laws are based on the idea that, with sufficient notice and transparency, individuals can exercise meaningful control over how their data is accessed, used, shared, and processed for specific purposes. This paradigm remains the common approach codified in modern data protection and privacy laws, including the gold-standard GDPR.
“It is becoming increasingly impossible to separate the ‘online’ and ‘offline’ environments.”
But that world no longer exists. Instead, we live in an increasingly cyber-physical world where data constitutes the built environment. Data flows at unprecedented scale and speed through a vast web of Internet of Things (IoT) devices and sensors; AI and machine learning systems, including deep learning and neural networks; and virtual, augmented, and mixed reality systems. It is a world in which it is increasingly impossible to distinguish between “online” and “offline” environments. Data supply chains have become so complicated and complex that few companies can account for the data they collect, store, and process, or effectively map its flows. In this environment, the idea that individuals can exercise any meaningful control over their own data is pure fantasy, yet our laws continue to promote it.
4. Privacy has become the handmaiden of technology-related harm and abuse.
The gap between the world we live in today and the world data protection law assumes exposes us to a true loss of privacy, and makes deception, manipulation, discrimination, and harassment easier. It also allows businesses to recast privacy rights in their own image, as a technocratic practice of data confidentiality and security.
Until very recently, commercial actors relied on largely ignored terms of service, privacy policies, and asymmetric bargaining power to exploit user data. As these practices are challenged, companies are increasingly adopting “privacy-preserving” or “privacy-enhancing” technologies: a wide range of technical measures, tools, and approaches to mitigate data privacy and security risks, including the risk of exposing sensitive attributes contained in data sets. Examples include homomorphic encryption, differential privacy, on-device machine learning, and synthetic data generation.
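To make one of these techniques concrete, here is a minimal sketch of differential privacy, the idea of answering a statistical query after adding calibrated random noise, so that no single person's record measurably changes the result. This is an illustrative toy, not any company's implementation; the data set, function names, and the choice of a simple counting query are assumptions for the example.

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Inverse-CDF sample from a Laplace(0, scale) distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def dp_count(values, predicate, epsilon: float) -> float:
    """Differentially private count: the true count plus Laplace(1/epsilon) noise.

    A counting query has sensitivity 1 (adding or removing one person changes
    the count by at most 1), so noise scaled to 1/epsilon suffices. Smaller
    epsilon means more noise and stronger privacy.
    """
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)

# Hypothetical data set: ages of seven individuals.
ages = [23, 37, 41, 52, 29, 61, 44]
noisy = dp_count(ages, lambda a: a >= 40, epsilon=0.5)
```

The noisy answer is useful in aggregate (roughly how many people are 40 or older) while obscuring whether any particular individual is in the data, which is precisely the kind of mathematical privacy guarantee the techniques above trade on.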
“Data-centric legal frameworks are easy to circumvent.”
When privacy is reduced to the mere confidentiality and security of data, there are virtually no limits on what a company can do, so long as it keeps the data it processes confidential and secure. In practice, this distorted, mathematical, technocratic notion of privacy motivates dominant technology companies to pull more into their ecosystems, deepen vertical integration, and use privacy as a shield against competition. That leaves us more vulnerable to control, manipulation, and exploitation by entities wielding unprecedented power.
Data-centric legal frameworks can also be easily circumvented, for example when synthetic data is used for purposes that would not be permitted for the personal data from which it is derived. The more our laws focus on requiring businesses to protect the privacy and security of data, the more we forget to protect the privacy and security of people. As businesses continue to find ways to move beyond data, so too must our approach to governing digital tools and technologies.
5. We need a broader human rights-based approach.
As data becomes ubiquitous, we risk asking both too much and too little of data protection. On the one hand, data protection has become a kind of panacea for technology-related harm, pressed into service as a broad technology governance tool, albeit an ineffective one. It is asked to do more than it was ever designed for. At the same time, we ask too little of it: our data-centric approach to technology governance has enabled leading companies to reduce the once-expansive concept of privacy to technical efforts to ensure data security and confidentiality. In this way, the derivative rights to privacy and data protection have been cut off from the broader human rights framework and have lost much of their vitality and power.
There are over 30 fundamental human rights and freedoms that apply to the human experience, with or without digital technology. As the “real” and “virtual” worlds continue to blur, clean dichotomies like “online” and “offline” erode, and everything becomes infused with data, there are no such things as digital rights or data rights; there are only rights. As long as technology governance is organized around data or specific technologies, it will be governed and shaped by those who control both: powerful commercial interests. Only when technology governance starts from the human rights we hold by virtue of our humanity will it be framed by human interests. In fact, the human rights framework offers us the only truly human-centered, technology-neutral approach, and our best chance of moving beyond data.
Download the Next Big Idea app now to listen to the audio version read by author Elizabeth Renieris.