Our behavior is not only constantly monitored and studied by increasingly sophisticated digital means. What happens inside our heads, warns Susie Alegre, is no longer safe from Big Tech’s data collectors either. “And so an elementary freedom is at stake: the freedom to think what you want.”
Alegre is a British human rights lawyer with a keen eye for what technology and artificial intelligence mean for privacy and freedom of thought. When her daughter was about seven, she asked why she could not have Alexa in her room. Her friends had the voice-activated digital assistant from Amazon that you can ask all sorts of questions and give commands to: look up information, play music or order something.
Alegre would not hear of it, she recounts in her book Freedom to Think: The Long Struggle to Liberate Our Minds. She explained to her daughter: “Alexa steals your dreams and sells them.”
“I am not opposed to technological progress,” Alegre stresses in a video call. “I really do not want to go back to the twentieth century.” But she believes legal boundaries must be drawn now that technological capabilities threaten fundamental rights.
“More and more technology is being developed to access our thoughts and emotions. That is dangerous, because it allows us to be manipulated. Some things you should be able to keep to yourself. But that is getting harder and harder.
“In the world of Big Brother in George Orwell’s 1984, the protagonist says: nothing is your own except the few cubic centimeters inside your skull. But even those few cubic centimeters are now for sale to the highest bidder. It seems we have treated 1984 not as a warning but as a model.”
Voice-controlled devices
As a lawyer, Alegre analyzes the problem from the perspective of the law, and that is also where she looks for the solution. She urges legislation to curb the hold that technology companies have over our lives and minds. Her premise, shared by other technology critics, is that our online behavior reveals more about us than we realize.
“We reveal this not only through the content of what we share on the internet, but also through metadata: where we are, how fast we move, when and how regularly we spend time on social media, what we ask of voice-activated devices and when. All that data can be combined and analyzed to build a picture of who we are, what we think and what we want.
“‘Facebook knows you better than your family,’ newspapers and news sites wrote a few years ago. Scientific research has shown that your personality can be deduced reasonably well from your likes on Facebook: whether your character is open, conscientious, neurotic, agreeable or extraverted.” A liking for meditation, for the work of the painter Salvador Dalí or for TED Talks, for example, would indicate an open character.
“Many of the ads you are shown on the internet are specifically tailored to you, or rather: to the person Facebook and Google take you to be. You might say: so what? But it can mean, for example, that you never see certain job openings, rental homes or potential partners, because according to the algorithm you are not the right type for them.”
That may be annoying, but does it limit my thinking?
“Yes, because the big technology companies study us in order to manipulate us. That is their revenue model. From your behavior on the internet they can deduce your vulnerabilities and your emotional state. That is interesting for a company that wants to sell us something.
“This person is the anxious type, they conclude from their data, someone who is still scrolling endlessly through Facebook at eleven at night, for example. That is a good moment to offer a product that speaks to that anxiety and exploits the vulnerability, by showing ads for gambling, say.
“Political campaigns use that kind of personalized advertising too: ‘microtargeting’. That became clear in the scandal around Cambridge Analytica (the British company that, before the 2016 US presidential election, used the data of millions of Facebook users to target ads for Donald Trump at people who appeared susceptible to them, ed.).
“You do not suddenly go from left-wing to far-right. But it can encourage you to vote, or put you off voting. And even when it does not work, it is still an attempt to manipulate our thinking.
“The thought that you are not free in your own mind, not free to think what you want, is so awful that we can hardly bear to dwell on it. But it is necessary.”
Freedom of thought is protected in international law, including in the Universal Declaration of Human Rights. It is an absolute right, says Alegre, one that may not be violated under any circumstances, just as the prohibitions on torture and slavery are absolute.
“The right to privacy may be infringed in certain cases, for example when national security is at stake. Freedom of thought may not; it is fundamental to human dignity, to what it means to be human. And once you lose the freedom to think, it is very hard to get back.
“As individuals we need mental privacy: an inner space where you can discover who you are, what you want to be, what you think about things, without having to share it with anyone. Where your ideas can develop. A safe place where you can also entertain the horrible or dangerous thoughts that cross your mind, and consider what to do with them.
“If people can see what we think, or even merely claim they can, it can be used against us. In criminal law, or under a dictatorship, the consequences for your freedom can be enormous. Does this person behave like someone who might have criminal plans or dissident views?”
Weren’t advertising and propaganda always meant to get into our heads, long before the internet?
“To a certain extent, yes. But the big question now is: where does the dividing line run between influencing and manipulating, between persuasion and brainwashing? With ads, the line lies in digital surveillance and personalization. That kind of influence goes far beyond a billboard along the road.
“As far as I am concerned, all forms of personalized advertising should be banned. Online ads would still be possible, but what you see would no longer be related to what a website or technology company knows about your personal characteristics or behavior. The context, the topic of the article you are reading or the video you are watching, would determine what kind of ad is displayed, to you and to every other visitor.
“President Biden has called on Congress to ban personalized advertising aimed at children. I would say: why not ban it for everyone right away? I do not think it will still be allowed in five to ten years.”
But should we let governments decide what we can and cannot see or read? Doesn’t that come at the expense of our freedom?
“It is a tricky issue. But you have to realize: the regulation I am arguing for is not about content, but about what happens to our data.
“Take mental health websites where people fill out questionnaires about how they are feeling and whether they are depressed; that is, about what is going on in their heads. A study in the UK, France and Germany showed that in many cases the answers were sold to third parties. The key to people’s minds was being auctioned off! Isn’t it obvious that that should not be allowed?
“Or take apps that help predict your menstrual cycle. Not only do people enter all kinds of information into them; that can also be combined with other data from your phone. The result is a mountain of data on the fertility, sexual activity and emotional state of countless women. Some of these apps use the information to tell you: you are 29 now, the perfect age to have a baby. That seems helpful, but it can also be an uncomfortable form of social control.
“Or take dating apps. People often share a great deal about themselves on them, from their sexual preferences to their health and political views. Users agree that others can see it. But are they aware of how that data is stored, analyzed, shared, sold and used to ‘train’ algorithms?
“I found it shocking that American researchers, using artificial intelligence and images from dating apps, claimed that a person’s political preference or sexual orientation can be determined from a picture of their face. You join such an app because you want to meet someone. But your photo can be used to develop a program that endangers others around the world, because a face can apparently betray your political or sexual identity.”
What can you do to protect your mental privacy?
“You can install fewer apps and put your phone aside more often; I myself no longer look at mine after 7pm. There are also apps that counter the addictive pull of your phone, such as the Freedom app (with which you impose eight hours of digital rest on yourself, ed.). But you can never escape completely. If you are rarely online and leave no traces because you always pay cash, you are thereby also known as a particular type.
“The real change must come from regulation through laws. In the tech world there is a lot of talk about ethics, but far too little about human rights and legislation. Because ethics is voluntary: you can choose to follow it or not, and you cannot be taken to court over your ethical conduct alone. That is why we must lay down in law what may be done with our data, what is allowed only with express permission, and what is not allowed under any circumstances.”
How realistic is that? A great deal of money is made by collecting data and selling personalized ads.
“Apple CEO Tim Cook spoke at Stanford University in 2019 about the danger of losing ‘the freedom to be human’ in a world without digital privacy. He meant freedom of thought.
“‘Had we had the distraction and manipulation of digital media 30 years ago, there would have been no innovation,’ he argued. ‘Then Silicon Valley would not have existed.’ It gave me hope to see him say that.”