Expert Speak Digital Frontiers
Published on Oct 22, 2021
The recent testimony by Frances Haugen placed Facebook under scrutiny for its harmful algorithms, raising the question: should governments regulate Big Tech after all?
That First Childhood Cigarette: Big Tobacco metaphor drags US Congress to address a bigger dilemma

Big Tech’s Big Tobacco moment is here, with the familiar whiff of an epic battle from another time. Defectors, civil society, and lawmakers are condemning the world’s largest social networking platform for its version of “that first childhood cigarette”—persuasion algorithms that are driving teens off the cliff. As with cigarettes, protecting children from Big Tech’s harms is an issue driving bipartisan consensus in the thick of a political gridlock. The momentum is there, but so is the history. There is also an asymmetry, unique to the digital era, between the speed at which tech products evolve and the pace of policymaking.

On the 4th of January, 1954, America’s big tobacco companies released “A Frank Statement to Cigarette Smokers” in more than 400 newspapers in the United States (US). They declared: “We accept an interest in people's health as a basic responsibility, paramount to every other consideration in our business…We believe the products we make are not injurious to health…We always have and always will cooperate closely with those whose task it is to safeguard the public health.”

Forty years on, in 1994, seven tobacco company executives testifying under oath before the US Congress declared they didn’t believe nicotine was addictive. It took another 15 years before US President Barack Obama signed the Family Smoking Prevention and Tobacco Control Act in June 2009, giving the Food and Drug Administration authority to regulate tobacco products.

In 2021, the echoes from 1954 are unmistakable as Facebook CEO, Mark Zuckerberg, reacted to a whistleblower’s bombshell testimony on how the company’s business model is hurting young people, especially teenage girls. “I’m sure many of you have found the recent coverage hard to read because it just doesn’t reflect the company we know,” Zuckerberg wrote in a blogpost. “We care deeply about issues like safety, well-being, and mental health. It’s difficult to see coverage that misrepresents our work and our motives. At the most basic level, I think most of us just don’t recognise the false picture of the company that is being painted.”

“Congressional action is needed,” Facebook whistleblower Frances Haugen told lawmakers in three hours of damning testimony on October 5, 2021. “(Facebook) won’t solve this crisis without your help,” she said. Haugen went on to detail a string of fixes that stop well short of breaking up the companies and yet rein in the mechanisms by which they weaponise code. Haugen went public with a trove of internal company research that she took before leaving her job earlier this year.

Haugen urged urgency. “I believe we still have time to act. But we must act now. I'm asking you, our elected representatives, to act.” On a day of charged debate, senators promised swift action. “And the days of Facebook evading oversight are over, because I think the American public is aroused about the importance of ... (social media) preying on their own children,” declared Sen. Richard Blumenthal, who heads the Senate subcommittee on Consumer Protection, Product Safety, and Data Security. Amy Klobuchar of Minnesota, a 2020 presidential hopeful who chairs the Senate Commerce Subcommittee on Competition Policy, Antitrust, and Consumer Rights, said it’s time to update children's privacy laws and offer more transparency in the use of algorithms. Klobuchar thinks Haugen may be the “catalyst” who will tip Congress over into “action.”

Haugen’s recommendations offer an inside-out view of what can be fixed at the level of code. “Right now, the only people in the world who are trained to ... understand what's happening inside of Facebook, are people who grew up inside of Facebook or Pinterest or another social media company,” she said during a hearing before a Senate panel. Haugen proposed a regulatory overlay. Platforms should be responsible for the speech they amplify through algorithms, she declared. This would require changes to Section 230 of the Communications Decency Act, the law that made today’s internet companies possible. Haugen suggests moving away from an engagement-based business model to chronological ordering of posts, wherein users view content in the order it was posted, which, she hopes, would prevent people from going astray. But Facebook’s Vice President for Global Affairs, Nick Clegg, who did the rounds of the Sunday shows after Haugen’s testimony, disagrees. Clegg frames Facebook’s algorithms as “giant spam filters” that show less of the bad stuff, not more. Haugen encouraged raising the age limit for users of Facebook's platforms from 13 to 16 or 18. Facebook, in the aftermath of Haugen’s testimony, has promised to tinker at the edges, such as “nudging” teens to take breaks from repetitive content.

Matt Stoller, author of “Goliath: The 100-Year War Between Monopoly Power and Democracy”, thinks a regulatory overlay is a “terrible idea” even if the code-level fixes are sensible. He explains why in a newsletter unpacking Haugen’s testimony. “We have never allowed one man to set rules for communication networks that structure the information ecosystem of billions of people. But that is the situation we’re in,” writes Stoller. “We have to radically decentralise this power. But a regulatory overlay in some ways would worsen the problem, because it would explicitly fuse political control with market power over speech, and it would legitimise the dominant monopoly position of Facebook.”

In Stoller’s view, regulation is legitimacy for the business model and band-aid for the harms. “Get it at the source,” he urges. “Monopoly and market structure are upstream from the data, misinformation, censorship, and privacy problems.”

Former Chair of the Federal Trade Commission, William Kovacic, compares regulating technology giants to a bicycle trying to keep up with a Formula 1 car. He makes the case that policymakers must be more thoughtful about “social return” and about developing a more reliable process for the “diagnosis”. “Is there some way of knowing that one starting point is likely to generate a better result than another starting point?” Kovacic told this author on the sidelines of a CyFy 2021 panel on the antitrust debate in the US. “If you don’t know where you’re going, running faster doesn’t necessarily make you better off.”

Kovacic points to the Competition and Markets Authority in the United Kingdom as a model for investment in research capacity that can keep up with technology’s dynamism. “They have a superb research programme, they’ve made a major investment in studying the relevant sectors to understand them. They have built a superior team of data analysis specialists and computer scientists to actually understand what they're looking at”, says Kovacic. “I'm not entirely confident about just having lawyers and economists tell me what's going on.”

Kovacic sounds a warning: “How do you ensure that in seeking to solve a problem, you are not solving yesterday's problem right now and you're forever playing catch up?”

That conversation, between techies, lawmakers, and the public, is urgent and just getting started in the US. A Faustian bargain is what nobody wants, yet it remains a distinct and dark possibility.

The views expressed above belong to the author(s).

Contributor

Nikhila Natarajan

Nikhila Natarajan is Senior Programme Manager for Media and Digital Content with ORF America. Her work focuses on the future of jobs and current research in ...
