‘Government Can’t Protect Us From Big Tech’

How do you stand up for your labor rights when an algorithm is in charge?

Take Uber drivers: they sign up for a shift by scanning their faces with facial recognition software, are rated through an online evaluation system, and can be fired without further consultation if their score is too low. A group of British drivers were tired of going through all this for less than the minimum wage, with no sick pay or paid vacation. They saw only one option: to go to court.

With success. Uber must treat its drivers as employees, including minimum wage and paid leave, the British Supreme Court ruled. If they were really freelancers, the court reasoned, they would also be able to negotiate their terms of employment, fares and the distribution of rides. Similar rulings followed in the Netherlands and Italy, among others, and the EU is developing rules for how all digital platforms should treat their workers.

We use apps for everything from groceries to a taxi ride, but in this so-called ‘platform economy’ – the sector of online platforms that match supply and demand – labor rights had no place for a long time. Until now, says Nani Jansen Reventlow, human rights lawyer and founder of the Digital Freedom Fund (DFF). The DFF – the first organization of its kind in Europe – supports lawsuits related to ‘digital rights’. In three years, the DFF has funded more than fifty cases, and many more received free legal advice. It also supported the Uber drivers in England and the Netherlands.

“Digital rights are really just human rights, but in a digital context,” says Jansen Reventlow. Think of the employment rights of Uber drivers, the right to education for children who do not have a laptop, and the right to be flagged as a fraud suspect only for good reason, not on the basis of an algorithm from the tax authorities.

“I would love to say I have a good story about why I stand up against injustice, but I do not,” laughs Jansen Reventlow. “My father is Malian, my mother Dutch; when you are always ‘different’, you become preoccupied with injustice. But for the most part, I went to law school because I did not know what I wanted to be.” Once in class, everything fell into place. “I was the terribly annoying student who always did the homework and argued about everything.”

Years later, she specialized in digital rights. “Many people think that the dangers of technology are limited to ‘privacy’ and ‘data protection’. But it is much bigger than that: all of our human rights are imperceptibly affected.”

Nani Jansen Reventlow (1978) grew up in Amsterdam and studied law at the UvA. She spent four years at a law firm and five years at the human rights organization Media Defence in London. In 2017, she founded the Digital Freedom Fund. She handed over the organization at the end of 2021 and is now setting up Systemic Justice, an organization that will litigate (digital) human rights cases across Europe, focusing on anti-racism and social and economic equality.

Jansen Reventlow teaches at Oxford University in the UK and lives in Berlin. In 2021, she won the Felipe Rodriguez Award, which the privacy organization Bits of Freedom awards to advocates for online rights. She has also won awards from Harvard, Oxford and Columbia University, among others.

It sounds like a warning. Do you see technology as a threat to human rights?

“What worries me most is how technology companies present themselves as the ‘great saviors’, and how governments simply accept that because they cannot develop the technology themselves. Take the corona pandemic: Covid apps were developed galore, making surveillance more and more normal. But we have never really looked critically at what happens to our data. Technology companies do not like to be open, and politicians do not push hard enough for that. If the government builds a bridge, don’t we also want to know exactly where our money is going and whether the bridge is safe?

Information can fall into the wrong hands, so ask yourself: who do I have nothing to hide from?

Many people think they have nothing to hide anyway. But that is very naive. Technology companies know more than you think: on the basis of likes and status updates, they can predict which party someone will vote for, and even who is at risk of developing diabetes or depression. I always use an extreme example: during World War II, there was a register ready that listed everyone’s religion. Information can fall into the wrong hands, so ask yourself: who do I have nothing to hide from?”

Why does Big Tech get away with this so easily?

“It reinforces itself: as long as companies are not transparent, the abuses remain invisible, and so we are not aware of the dangers. Besides, much of the media and many politicians simply understand too little about technology.

There is a lot of attention for privacy, data protection and freedom of expression online, but you hear little about how technology puts pressure on socio-economic rights. That is because the whole field – from Silicon Valley to developers, from lawyers and NGOs to media and politics – is incredibly homogeneous: white, male, cisgender, middle class, without disabilities. They have little to fear from automation; it is precisely the under-represented groups that are affected.”


Photo: Mishael Philip

So the rights of marginalized groups in particular are at stake?

“Exactly. Due to automation, an application for a grant can be rejected for no apparent reason. Or take education: since the pandemic, many universities have used proctoring software to check students taking exams at home for fraud. This also happens in the Netherlands. U.S. research shows that this software has racist elements: students of color are not properly recognized, which means they are flagged as fraudulent more quickly.

And in Europe there are more and more cameras on the streets in the name of ‘security’. But safe for whom? For someone who is regularly ethnically profiled and experiences law enforcement as a threat, constant surveillance actually means less safety.”

In 2020, the Dutch Consumers’ Association and the Data Privacy Foundation started a claim against Facebook, which over 180,000 people joined. They demand that Facebook financially compensate its millions of Dutch users for having violated their privacy for years. The case is still ongoing.

You are fighting these inequalities through the courts. How does that work?

“For example, the DFF supported the case of Liberty, a British human rights organization fighting police use of facial recognition software. They brought the case in 2018 and won on appeal: according to the judges, South Wales Police could not guarantee that its software did not discriminate, and it violated the privacy of people whose faces were scanned. We also supported the successful SyRI case [as a result, the Dutch government’s fraud detection system had to be scrapped; according to the court, it violated the right to privacy, ed.].

The Uber case started with a couple of drivers; now new EU legislation is on its way

The common thread in everything I do is ‘strategic litigation’. The goal is not so much to prove that one client is right, but to force companies or governments to make major changes, such as changes in the law. The Uber case started with a handful of drivers organizing, and now the EU is developing legislation for the entire platform economy.”

You mentioned that the field of digital rights is very white and masculine. What is it like for you to move in that world?

“I hesitated for a long time about whether to be honest about this, because it concerns people I work with. But it is important to talk about it. I may have built an organization that looks good from the outside, but it was not always easy.

The team and the board constantly questioned my expertise and my plans for the organization, while I am sure that if I had been a white man, no one would have doubted my proposals for a second.

It is a constant consideration: should I stand up for myself now, or just get on with my work? It is not healthy to keep running into microaggressions – though really they are just aggressions – but I am quickly dismissed as an ‘angry black woman’. So I have to react calmly enough not to alienate my own colleagues even further. I have often sighed about how much I could get done if I did not lose so much energy on all those side battles.”

You put that into practice: in 2018, DFF started the project ‘Decolonising Digital Rights’. What exactly does it entail?

“We want to facilitate cooperation between lawyers and activists from marginalized groups. So we bring digital rights organizations from all over Europe into contact with, among others, LGBTI+ and women’s organizations, anti-racism activists, sex worker groups and undocumented migrants. They can learn from us about the dangers of technology and how to fight them. We, in turn, learn about problems they face that we might overlook, and about what it takes to tackle our own biases.

My new organization, Systemic Justice, therefore works differently from how lawyers usually do: we first enter into dialogue with local organizations across Europe – from trade unions to Roma organizations – and then bring cases together on the issues they consider important.”

How can ‘ordinary people’ stand up for their online rights?

“Citizen coalitions are bringing mass claims against big tech companies, for example for violating their privacy. In the US this happens a lot; in Europe it is quite new.

But the change must come mainly from governments. They should always ask the question: do we really need technology for this? The media are not always critical on this point either, so my main advice is: keep an eye on local organizations, such as Bits of Freedom in the Netherlands. Not only do they provide great tips on how to protect your own privacy, they are also the first to warn of dangers online.”


