The phenomena of fake news, information warfare, and influence operations have captured public imagination in recent years. From Russian interference in the 2016 US elections to, closer to home, lynchings in India fuelled by misinformation spread over WhatsApp, technology firms and governments are scrambling to find lasting fixes to this challenge. The challenge itself is not new and has manifested several times in history, albeit in different forms. However, the speed at which false information reaches disparate audiences, and the scale of the behaviour it influences, have grown manifold with the rise of social media platforms. Open societies face the threat of manipulation more acutely as their populations of first-generation internet users expand. Given that influence operations have far-reaching effects, especially on democratic and electoral processes, democracies are looking to institute safeguards against such threats while retaining their commitment to a free society.
Panellists agreed that there are nuances in how an influence operation plays out, including the actors involved and their intended outcomes. Katie Harbath noted that content on Facebook is often polarising or divisive rather than fake or untrue; the former cannot be conflated with the latter. The purpose of spreading such messages lies not in the content itself but in influencing users’ behaviour and stirring up their emotions at a crucial point in time. She clarified that the role of technology firms is not to take down such content but to ensure that it is not amplified, thereby removing the economic incentive behind it. Technology firms therefore need to devise scalable ways of addressing this challenge, even as they resist the temptation to err “on the side of caution” by taking down all “suspicious” content.
In the realm of international multilateral and multistakeholder institutions, deterring influence operations has become an increasing priority. Given the overlapping nature of information warfare and cyber operations in particular, reimagining traditional international law frameworks to mitigate, prevent and respond to attacks has assumed great importance. The threshold of “armed attack” under the existing international law framework addresses only kinetic operations and does not adequately cover influence operations and information warfare.
Increasingly, attacks resemble cyber operations even when the real goal is to influence public perception, not unlike information warfare. Alex Klimburg noted that traditional western powers consider cyber warfare to be technical in nature, with a cyberattack on critical information infrastructure (CII) seen as the worst-case scenario. China and Russia, on the other hand, view information and psychological warfare as the more pressing threat that necessitates norm-creation.
Latha Reddy outlined the various international initiatives focused on scripting rules and norms to ensure stability in cyberspace. The newly created UN High-Level Panel, the Open-Ended Working Group at the UN and the new UNGGE have set out to build an international consensus on scripting norms, establishing confidence-building measures and building capacity across stakeholders. The private sector is also playing a growing role in forging this consensus, with Microsoft’s Tech Accord and Digital Peace Campaign outlining the responsibility of private firms and states in protecting the integrity of cyberspace.
Panellists underscored the urgency of consolidating existing platforms into a single multistakeholder institution, similar to the International Committee of the Red Cross (ICRC) or the International Atomic Energy Agency (IAEA), that would take the lead on attribution, adjudication and capacity building.
The views expressed above belong to the author(s).