US needs a federal data protection law right now: Marietje Schaake
Published on Jan 29, 2021
Big Tech unplugged Donald Trump after a violent pro-Trump mob, incited by the now twice-impeached former president, rampaged through the US Capitol on 6 January, killing five people. Big Tech’s actions suggest a red line was finally crossed. Quite the opposite, actually: millions of red lines had been crossed by then, and the platforms all looked the other way, until Trump was a lame duck and Senate power had shifted. Even if we leave aside the political calculations for the moment and look at the Trump-shaped silence on social media, questions are circling overhead. Why did social platforms play with fire for so long? If this is a moment that will change the Internet, how will that process proceed from here? And for whom will it change?

Nikhila Natarajan, Fellow, Digital Content, speaks with Marietje Schaake, international policy director at Stanford University’s Cyber Policy Center and international policy fellow at Stanford’s Institute for Human-Centered Artificial Intelligence. Below is a transcript of the interview, lightly edited.
Nikhila Natarajan: The reactions to 6 January were quick, and mostly predictable. For years, social-media tycoons have been hiding behind abstractions like “we want to connect the world,” statements that form the basis for their uninhibited data collection. It’s about time they accounted for what they broke and what needs to be done to fix it.

Marietje Schaake: For a lot of people watching what happened on 6 January in Washington DC, it suddenly became visible not only how much impact the spread of hate speech, polarisation, and calls for violence can have in parallel universes online or the dark corners of the Internet, but also that these theories really spill over, and then boil over, into the streets. They also saw that, unfortunately, significant numbers of people are susceptible to conspiracy theories and lies, and then take matters into their own hands. Then we witnessed social media companies, probably terrified of the spotlight on them, taking the swift action we had not seen them take for years. Civil rights groups, minority groups, and women have all been very concerned about the growing vitriol they saw online and what it might ultimately lead to, but social media companies kept saying, “We don’t want to be the arbiters of truth, we need to protect freedom of expression.” Yet, there they were, Twitter first, then Facebook, then reluctantly YouTube, banning Trump from their platforms and showing that they were capable of taking a position after basically serving as his platform for four years. So the big question now is, what will this lead to? I hope it will not be a distraction from the need for democratic governments to set frameworks because I frankly think we cannot expect the same companies that have created the problem to solve it for us. We really need more independent oversight and clear rules under which not only free speech is protected but also public safety, public interest, and public health.

I hope it will not be a distraction from the need for democratic governments to set frameworks because I frankly think we cannot expect the same companies that have created the problem to solve it for us.

Natarajan: You offer the idea of middleware as a structural solution. Are you still working on that line of thought? If user data is not flowing, what is flowing and what is being identified?

Schaake: It’s a work in progress together with my colleague Frank Fukuyama and others, where we started by looking at how to mitigate excessive platform power through antitrust mechanisms. But then we came to the conclusion that specific remedies are needed to address the specific harms to democracy, and one idea is to give the user, you and I, more agency and choice over what content to see. The idea of middleware is essentially a layer in the middle, between yourself as the Internet user and the big tech platforms or search engines like Google. Essentially, trusted parties or organisations can create middleware so that if you go shopping you will only find products from a certain area, or maybe only organic products; or so that, when you go online, you only get content from certain kinds of newspapers, or civil-society links come up on top. Basically, it curates your search results and the presentation of content in a way that better suits your own choices, not the ad-driven, algorithmically amplified propositions, links, and micro-targeting that the platforms come up with today.

Natarajan: Staying on that thought, algorithmic engineering is the only branch of engineering where there are no limits and no liability for injecting it into society, right? How does middleware solve the problem of data voodoo dolls?

Schaake: We are still working on this proposed model from a technological point of view, from a policy point of view, from an enforcement point of view, and from a transparency and accountability perspective. So, we think it might be part of the solution, but I personally believe that there is no substitute for democratic law-making, checks and balances, and independent oversight, so that the rule of law comes out on top and not any commercially driven model, whether it’s big tech platforms, middleware providers, or anyone else. I think democracy is too precious to be left to corporations that, with their profit objectives in mind, basically experiment on us, as you said.

I personally believe that there is no substitute for democratic lawmaking, checks and balances, independent oversight.
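To make the middleware concept concrete, here is a minimal sketch in Python of how such a layer could sit between a platform’s raw feed and the reader. It is purely illustrative and assumes nothing about the actual design of the Stanford proposal; every name in it (ContentItem, UserPreferences, middleware_rerank, the scoring weights) is hypothetical.

```python
# Hypothetical sketch of "middleware": a user-chosen layer that sits between
# a platform's raw feed and the reader, filtering and re-ranking content by
# the user's own preferences rather than the platform's ad-driven ranking.
# All names and weights here are illustrative, not a real API.

from dataclasses import dataclass, field

@dataclass
class ContentItem:
    source: str            # e.g., a newspaper domain or account name
    text: str
    platform_score: float  # the platform's own ad-driven ranking score

@dataclass
class UserPreferences:
    trusted_sources: set = field(default_factory=set)  # sources the user opts into
    blocked_sources: set = field(default_factory=set)  # sources the user blocks
    boost_civil_society: bool = False                  # prefer civil-society links

def middleware_rerank(feed, prefs):
    """Filter and re-rank a platform feed according to the user's own choices."""
    visible = [item for item in feed if item.source not in prefs.blocked_sources]

    def user_score(item):
        score = item.platform_score
        if item.source in prefs.trusted_sources:
            score += 10.0  # user-chosen sources outrank the platform default
        if prefs.boost_civil_society and item.source.endswith(".org"):
            score += 5.0   # crude stand-in for "civil-society links on top"
        return score

    return sorted(visible, key=user_score, reverse=True)

# The same raw feed yields a different ordering (and filtering) per user,
# depending on the middleware policy each user has chosen.
feed = [
    ContentItem("viral-ads.example", "Outrage bait", platform_score=9.0),
    ContentItem("humanrights.org", "Civil-society report", platform_score=2.0),
    ContentItem("newspaper.example", "Investigative piece", platform_score=4.0),
]
prefs = UserPreferences(trusted_sources={"newspaper.example"},
                        blocked_sources={"viral-ads.example"},
                        boost_civil_society=True)
for item in middleware_rerank(feed, prefs):
    print(item.source, "->", item.text)
```

The point of the sketch is the design choice Schaake describes: the ranking policy is chosen by the user, or by a trusted third party acting for the user, rather than by the platform, so the same feed can be ordered differently for every reader.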

Natarajan: It’s become popular at both ends of the political spectrum to dump on Section 230, a provision of the 1996 Communications Decency Act that pretty much created the modern Internet, and which provides online platforms immunity from civil liability for third-party content and for the removal of content in certain circumstances. I get that calls have been growing to moderate content and so on, but it is complex, it is costly, and it is inherently political. What is your take on this?

Schaake: I think there is consensus in the United States to lift the exemption from liability when it comes to the role of platforms. And you could argue that the platforms have lifted their own exemptions already, because they initially said that they’re just neutral pass-throughs of information but then, for example, they increasingly started to intervene, redirecting searches about COVID-19 so that people would learn from trustworthy sources and not some kind of conspiracy-ridden shop. So we’ve seen growing amounts of intervention, but again from a commercial incentive, not a democratic one. What I hope is that any review of Section 230 will not be a black-and-white discussion about either all liability or no liability; that’s a very unhelpful lens to look through, because then people polarise and it’s very easy for the tech companies to say that it will end the Internet as we know it. It’s all about the details, and it will have to go hand-in-hand with stronger antitrust rules, better data protection provisions, more transparency into how algorithms work, and more accountability, not only liability. It will need more comprehensive processes for holding these tech companies responsible, from the individual level up to the board. So, I believe that over the next, let’s say, decade, we will see a set of different policy initiatives that will hopefully piece together a new puzzle of democratic governance for our digital world.

Natarajan: We know this stuff happens in fits and starts, but what’s the low-hanging fruit? What do you think happens first now?

Schaake: At least in the United States and the European Union, there is a lot of movement in the antitrust space, and I think that will continue. But what I hope will also happen is that there will be more horizontal provisions, which will push for more transparency and access to information. One of the real problems is that whether you’re an academic researcher, an average Internet user, a journalist, or a parliamentarian anywhere in the world, you have a very difficult time discerning and understanding what happens under the hood of these tech companies, even as their power grows and grows. So, having the right kind of access to the right kind of information for the right actors is an integral part of the rule of law. We have to have transparency. And this is a new challenge when the algorithms and machine learning processes constantly change. This is different from, let’s say, looking at the glass that I have here and assessing whether it’s been safely made: we can look at it, we can assess it. An algorithm is much harder; it’s fluid, it’s moving, it’s changing, and you would have a different experience online than I do, or your neighbour, or your family members. The individualised nature, on the basis of micro-targeting for example, makes oversight a new challenge, because we cannot capture a moment’s observation, like a screenshot of the experience, when everybody has a different outcome.
So we need new avenues towards transparency and oversight; I think that is a path that has to be deepened as a condition for then looking at whether, for example, there has been discrimination, or disinformation about health, or whatnot.

We have to have transparency. And this is a new challenge when the algorithms and machine learning processes constantly change.

Natarajan: Based on your lived experience, what is it from the EU playbook that the US can pretty much immediately take and apply here?

Schaake: A federal data protection law. It’s quite remarkable that the United States doesn’t have one. The EU has worked towards what is called the General Data Protection Regulation. A similar federally applicable data protection regulation would help give Americans better rights protections and be clearer about where the limits are in terms of data collection and use. So this is something the US can take right now from the European experience.

Natarajan: Generally, do you sense that there is momentum for this?

Schaake: Yes. The observation is that there’s a lot of change happening very quickly, that we’re really reaching a tipping point, particularly when it comes to social media and search companies. For me, the question is broader: if we look throughout the digital world, there are so many areas where corporations are disproportionately powerful. I would like to think about it more systematically than just what we can see because we use search and social media; that’s probably the next chapter of this big endeavour of making sure that democracy trumps advertising companies.

If we look throughout the digital world, there are so many areas where corporations are disproportionately powerful.

Natarajan: You went to the Facebook headquarters with a friend or a colleague, and you were waiting to talk about work-related stuff, and you were given this lecture on the Lean In method. Tell us what happened, and tell us how you view Facebook’s role in what happened on 6 January.

Schaake: Well, I have to add another anecdote, because the colleague with whom I went to Facebook to talk about intermediary liability, basically Section 230, has today been appointed as the Prime Minister of Estonia. So, the two of us went there to talk about freedom of expression online and intermediary liability exemptions, and we were actually lectured about Lean In, which was not what we came for. What do I think about the role of Facebook specifically? It is an enormous platform with a lot of resources and a lot of power, and it has not chosen to use those resources and that power to firmly stand against hate, not only in the United States, and I think this is really important, but globally. If people think that what happened in the United States on 6 January was bad, they should look at what happened in Myanmar around elections and violence, and at other places where violence sits shallowly below the surface, where there is a history of hate speech and ethnic violence; unfortunately, Facebook has become a platform to mobilise those kinds of groups, and it took an eruption of that kind of aggression in the United States for people to wake up. Now the best we can do is to steer this momentum in the right, more democratic direction.
The views expressed above belong to the author(s).

Contributor

Nikhila Natarajan

Nikhila Natarajan is Senior Programme Manager for Media and Digital Content with ORF America. Her work focuses on the future of jobs, current research in ...
