Browser cookies are not consent: The new path to privacy after EU data regulation fail

The endless cookie settings that pop up for every website feel a bit like a compliance prank by an internet hell-bent on not changing. It is very annoying. And it feels a little like revenge on regulators by the data markets, giving the General Data Protection Regulation (GDPR) a bad name and making it seem as if political bureaucrats have, once again, clumsily interfered with the otherwise smooth progress of innovation.

The truth, however, is that the vision of privacy put forward by the GDPR would usher in an era of innovation far more exciting than today's seedy tech. As it stands, though, it simply fails to do so. What is needed is an infrastructural approach with the right incentives. Let me explain.

The granular metadata that is collected behind the scenes

As many of us are now keenly aware, laptops, phones and every device prefixed with "smart" produce constant streams of data and metadata. So much so that the concept of a sovereign decision over your personal data hardly makes sense: if you click "no" to cookies on one site, an email will have quietly delivered a tracker anyway. Delete Facebook, and your mother will have tagged your face with your full name in an old birthday photo, and so on.
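To make the email-tracker example concrete, here is a minimal sketch in Python of how a tracking pixel commonly works. The sender, recipient and tracker URL are hypothetical, and real trackers differ in detail:

```python
from email.message import EmailMessage

# A 1x1 "tracking pixel": an invisible image whose URL is unique to the
# recipient. When the mail client fetches it, the sender's server learns
# the open time, IP address and client software, with no click or consent.
TRACKER_URL = "https://tracker.example.com/open?id=recipient-12345"  # hypothetical

msg = EmailMessage()
msg["Subject"] = "Our newsletter"
msg["From"] = "news@example.com"   # hypothetical sender
msg["To"] = "you@example.org"      # hypothetical recipient
msg.set_content("Plain-text fallback for clients that block images.")
msg.add_alternative(
    f"""
    <html><body>
      <p>Hello!</p>
      <img src="{TRACKER_URL}" width="1" height="1" alt="" />
    </body></html>
    """,
    subtype="html",
)

# Printing the message reveals the hidden <img> tag the recipient never sees.
print(msg)
```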

What is different today (and why a CCTV camera is, in fact, a terrible representation of surveillance) is that even if you choose to protect your privacy, and have the skills and know-how to do so, the overall environment of mass metadata collection will still harm you. It is not about your data, which will often be encrypted anyway; it is about how the collective metadata streams will nonetheless reveal things at a granular level and single you out as a target - a potential customer or a potential suspect, should your patterns of behavior stand out.
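To illustrate how little the content matters, here is a minimal sketch in Python, using made-up metadata records, of how timing and message-size patterns alone can make one user stand out from a crowd even when every payload is encrypted; the records and the outlier rule are hypothetical:

```python
from collections import defaultdict
from statistics import mean, stdev

# Hypothetical metadata records: (user, hour_of_day, message_size_bytes).
# The message contents could be fully encrypted; only metadata is used here.
records = [
    ("alice", 9, 1200), ("alice", 10, 1150), ("alice", 14, 1300),
    ("bob",   9, 1180), ("bob",   11, 1220), ("bob",   15, 1250),
    ("carol", 3, 9800), ("carol", 4, 9500), ("carol", 3, 9900),
]

# Build a behavioral profile per user: average activity hour and message size.
profiles = defaultdict(list)
for user, hour, size in records:
    profiles[user].append((hour, size))

stats = {u: (mean(h for h, _ in rs), mean(s for _, s in rs))
         for u, rs in profiles.items()}

# Flag anyone whose profile deviates sharply from the population average.
hours = [h for h, _ in stats.values()]
sizes = [s for _, s in stats.values()]
for user, (h, s) in stats.items():
    if abs(h - mean(hours)) > stdev(hours) or abs(s - mean(sizes)) > stdev(sizes):
        print(f"{user} stands out: active around {h:.0f}h, ~{s:.0f} bytes/msg")
```

Run as written, only "carol" is flagged: her late-night hours and large messages stand out, even though nothing about the content of her traffic was ever inspected.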

Related: Data privacy concerns are increasing, and blockchain is the solution

Yet, despite how it may seem, everyone wants privacy - governments, corporations and especially military and national security agencies. But they want privacy for themselves, not for others. And this lands them in a small conundrum: How can national security agencies, on the one hand, keep foreign agencies from spying on their populations while simultaneously building backdoors so that they can snoop themselves?

Governments and corporations have no incentive to provide privacy

To put it in language eminently familiar to this readership: The demand is there, but there is a problem with incentives, to put it mildly. As an illustration of the scale of the incentive problem right now, an EY report values the market for United Kingdom health data alone at $11 billion.

Such reports, while highly speculative about the actual value of the data, nonetheless produce an irresistible fear of missing out, or FOMO, leading to a self-fulfilling prophecy as everyone rushes toward the promised profits. This means that while everyone, from individuals to governments and big technology corporations, might want to ensure privacy, they simply do not have strong enough incentives to do so. The FOMO and the temptation to sneak in a backdoor, to make secure systems just a little less secure, is simply too strong. Governments want to know what their populations (and others) are talking about, companies want to know what their customers are thinking, employers want to know what their employees are doing, and parents and teachers want to know what children are up to.

There is a useful concept from the early history of science and technology studies that can somewhat help clear up this mess: affordance theory. The theory looks at the use of an object in terms of its environment, its system and the things it offers to people - the kinds of actions that become possible, desirable, comfortable and interesting to do as a result of the object or the system. Our current environment, to put it mildly, offers the irresistible temptation of surveillance to everyone from pet owners and parents to governments.

Related: The data economy is a dystopian nightmare

In an excellent book, software engineer Ellen Ullman describes programming some networking software for an office. She vividly describes the horror when, after the system had been installed, the boss excitedly realizes that it can also be used to track the keystrokes of his secretary, a person who had worked for him for over a decade. Before, there had been trust and a good working relationship. The novel powers inadvertently turned the boss, through this new software, into a creep, scrutinizing the most detailed daily work rhythms of the people around him - the frequency of clicks and the pauses between keystrokes. This mindless monitoring, albeit by algorithms rather than humans, generally passes for innovation these days.

Privacy as a material and infrastructural fact

So where does this leave us? It shows that we cannot simply patch up personal privacy within this surveillance environment. Your devices, the habits of your friends and the activities of your family will be linked and will identify you regardless, and the metadata will leak either way. Instead, privacy has to be secured by default. And we know this will not happen through the goodwill of governments or technology companies alone, because they simply do not have the incentive to do so.

The GDPR, with its immediate consequences, has fallen short. Privacy should not be merely a right that we desperately try to click into existence with every website visit, or that most of us can only dream of exercising through expensive court cases. No, it has to be a material and infrastructural fact. This infrastructure has to be decentralized and global so that it does not serve the interests of any particular nation or business. Moreover, it has to have the right incentives, rewarding those who run and maintain the infrastructure so that protecting privacy is made profitable and attractive, while harming it is made unfeasible.

In closing, I want to point to a hugely underrated aspect of privacy, namely its positive potential for innovation. Privacy tends to be understood as a protective measure. But if privacy were instead simply a fact, data-driven innovation would suddenly become far more meaningful to people. It would allow for much broader engagement with shaping the future of all things data-driven, including machine learning and artificial intelligence. But more on that next time.

The views, thoughts and opinions expressed here are those of the author alone and do not necessarily reflect or represent the views and opinions of Cointelegraph.

Jaya Klara Brekke is the chief strategy officer at Nym, a global decentralized privacy project. She is a researcher at the Weizenbaum Institute, holds a PhD from Durham University's Department of Geography on the politics of blockchain protocols, and is an occasional expert adviser to the European Commission on distributed ledger technology. She speaks, writes and conducts research on privacy, power and the political economies of decentralized systems.