“Imagine you had a master key for everything. A password that opens the door of your house, your room, your diary, your telephone, your computer, your car, your checking account, your medical history. Would you go around making copies of that key and giving them to strangers?
Probably not. So why do you hand over your personal data to practically anyone who asks for it?” This excerpt from Privacy is Power (Bantam Press), the first book by the Mexican-Spanish philosopher Carissa Véliz, drops us into a subject that haunts her: surveillance capitalism, which in her view rests on an unacceptable intrusion into our privacy.
Véliz has dedicated her thesis and her research career to the intersection of privacy and technology, which she currently combines with teaching at the University of Oxford.
The data economy has played out right under our noses over the past decade, she argues, and we were slow to realize the seriousness of its consequences. Therefore, Véliz concludes, the only possible answer is to end the model: the use of data must be regulated at once, and trading in it must be prohibited.
Her recipe is a depth charge for Big Tech.
If some people think it is too radical to say that we have to end the data economy, it is because we are speaking from a status quo that is absolutely absurd and ridiculous.
The extreme and radical thing is to accept as valid a business model based on systematically and massively violating rights. That is what is crazy.
In recent weeks we have learned that Washington will take Google to court over its dominant position and that Brussels is acting against Amazon for using third-party sellers’ data to compete against them. Does that seem like a first step in that direction?
We will see. It could be an incredible opportunity or just for show. But it definitely shows that the time when technology will be regulated is near.
If we review history, we will see that we have been able to regulate every other industry: railways, banking, energy, automobiles, food… In its day, Rockefeller’s power was brutal. The most feasible strategy may be to make sure the big tech companies are no longer monopolies, and then regulate the details of privacy.
You say in the book that Google and Facebook don’t sell our data, but rather the power to influence our lives.
Yes, and what right do they have to do that? Autonomy has been a fundamental principle of Western societies, particularly European ones, for centuries. It is the foundation of our ethics. And it is absurd that there is a business system built on the idea of manipulating that autonomy, of undermining it.
How would you convince those who say they have nothing to hide that they are wrong?
I wrote the book with them in mind. One way is to say to anyone who thinks like this: if you have nothing to fear, give me your email password. So far no one has given it to me, because they don’t trust me, which is reasonable. But do you trust Google, which reads all Gmail emails? Or Zuckerberg? We all have vulnerabilities.
Let’s assume you are healthy: are you sure about that? Because an algorithm on your phone that analyzes how you scroll through your contacts may say that you have Parkinson’s or the onset of depression.
And I may know it before a doctor tells you. And perhaps that information will be relevant to whether or not you are hired in the future. Are you sure you have nothing to hide? On the other hand, by protecting yourself you protect others, especially if you are a parent. Once you share something, you no longer control it.
One of the most striking ideas in her book is precisely that privacy is something collective, not individual.
Technology companies are very interested in having people believe that privacy is something personal, that some people simply decide to share their data just as some people prefer chocolate to vanilla.
In fact, there is a collective aspect of privacy that our predecessors understood better, which is why they included it among human rights after World War II. Privacy is collective in at least two ways.
On the one hand, our data usually contains data about other people. If I take a genetic test, I am revealing the DNA of my present and future relatives. If I share data about my location, I am also giving away the location of those who live or work with me. If I give information about my psychology, I am giving information about people I don’t even know who share my profile.
On the other hand, we all suffer the effects of the lack of privacy. It is like pollution: you can be very careful with the management of your data, but if those around you are not, you will suffer the consequences.
There have been serious cases of cybercriminals being caught because their friends shared photos of them on Facebook.
The Facebook and Cambridge Analytica scandal was paradigmatic: only 270,000 people gave their data, with “consent” in quotation marks, and from them the data of 87 million people was obtained and used to build a tool that predicted how individuals with similar psychological characteristics would vote.
It is so difficult to keep data secure, and so easy to misuse it, that it is very naive to think that all that information will always be used for good.
You say that democracy itself is threatened by the lack of privacy.
On the one hand, it is influencing how people vote. In the 2016 presidential campaign, Trump’s strategists identified 3.5 million Black citizens who could be persuaded not to vote.
Merely trying to stop someone from voting should be completely illegal. Facebook and Twitter have tried to implement certain control policies, but these remain unilateral: no one tells them what to do. It is clear that, in the digital age, democracies that seemed very well established can no longer guarantee 100% that elections are safe, legitimate and fair.
On the other hand, democracy is based on the idea that all people are equal, that we have the same rights, that we all have a vote.
But if society does not treat us as equals, if it does not show us the same opportunities, if it charges us differently for the same service, if we are treated according to the value of our data, then the social fabric erodes at a moment when there is already a great deal of distrust and polarization of discourse.
Do data brokers (companies that collect and sell data) threaten democracy?
It’s crazy that they exist. From the standpoint of morality, politics and justice, it is absolute nonsense. There are documented cases of abusers obtaining data on their ex-partners by buying it from data brokers, and of a group of cybercriminals who bought the credit card numbers of thousands of people.
The latest scandal in England is that the data bars and pubs collect on who passes through their establishments, which was only supposed to be used to fight the coronavirus, is being sold. That is one more symptom that we are settled into an unethical business model, with no limits, that we have allowed to proliferate as it pleases, without consequences.