Not a truth engine
Indeed, what is true if the truth is often represented by page-one results of a search engine? What if algorithmic, advertising-driven search engine results control public discourse, which people trust as the gospel of truth? The challenge search engines pose to human cognition is profound, as the average person conducts three to four searches a day, or more. Google, for instance, processes over 8.5 billion searches daily, and 97 percent of India's 600 million devices run on Android. Studies show that the search engine has over a 93.34 percent market share and receives 3.3 trillion-plus searches annually.
So, what could the impact be when a “dominant digital advertising giant” known as a “search engine” enjoys a monopoly in brokering and filtering people's “free access” to information? What do you do, faced with the consequent deteriorating quality of search engine results pages? Results that are riddled with search engine optimisation (SEO) arbitrageurs, advertising, and affiliates; results that push users to the highest bidders; results that drive ephemeral experiences, subtly controlling your thinking and behaviour through biases and the Search Engine Manipulation Effect, without any paper trail. What do you do if “too many choices” on your screen are offered to trick your cognition and shift your brain towards shallow learning and reduced contemplation? More so as search results can interfere with human curation, driven by businesses, interest groups, and governments around the world.
Deep mind distraction
What if search engine autocomplete turns out to be a distraction right from the first character you type, leading to “choice overload”, causing cognitive impairment, nudging attention towards select people, concepts, or ideologies, and resulting in skewed decision-making? Letting algorithms finish our thoughts prompts false associations and untruths, and is detrimental to our belief systems.
As the Sapir-Whorf hypothesis states, our worlds are shaped by words, conditioning our emotions, our behaviour, our culture, our democracy, and our future. Ninety percent of people are likely to click on the first set of results, which limits one's thought process and constrains access to fair discourse and autonomy, as paid news appears on top, driven by the revenue model of search engines: data-driven, targeted advertising by surveillance platforms, designed to feed cognitive biases.
Significantly, the ‘cost of not knowing’ can be infinite, stifling our thought processes, stunting our grasp of reality, and affecting rational decision-making. Biased search results stifle human cognition as we surf shallow information that may be full of distractions and that holds our “choice architecture” to ransom, guided by behavioural economics.
Disrupting democratic decisions
Human cognition is duped once you blindly trust “digital advertising platforms” or “information intermediaries” as “search engines”, or as tools for making unbiased information accessible. For instance, research shows that internet search result rankings heavily influence users' opinions of candidates and impact electoral outcomes, as seen in elections in the United States (US) and the United Kingdom (UK), and in the Brexit referendum. This is because citizens scanning search results click on links in predictable patterns, favouring results at the top of the list and on the first page.
If the dominant search engine chooses to manipulate rankings in favour of a candidate, political party, or dominant power, opponents would have little to counter those manipulations. For instance, an autocomplete blacklist might be used to protect or discredit political candidates. Autocomplete suggests possible search terms based on what a user starts typing, and may unfairly impact the integrity of electoral processes. Just like freedom of speech, the right to freedom of thought is paramount in human rights law. Behavioural psychography during the electoral process can potentially undermine democracy, regardless of who pays for it or which way you vote.
Defending national security
So far, global regulatory efforts have failed to clearly define how search engines should behave in a social context. The fast-evolving tech sector, complex issues, inadequate industry expertise, low budgets, poor resources, archaic laws, lengthy processes, and a partisan approach to legislation prevent regulatory agencies from enforcing antitrust laws effectively. Besides, monopolistic tech firms pay enormous sums to maintain search dominance through “default exclusivity”, systematically denying rivals data. With search rankings personalised for each user, customised rankings might be used to disproportionately influence voters in the weeks before elections. Rankings that appear unbiased on regulators' screens might be biased on the screens of select individuals, making it tough for regulators to detect manipulation and putting individual thinking at risk, as there is no paper trail.
A huge concentration of power to control discourse threatens national security. States cannot allow a for-profit entity to be an arbiter of permissible speech, promoting, sharing, or manipulating information choices. People must know that search engines are not an objective source of information. It is time to call for strong legislation guided by our constitutional principles. India must act now and carefully study the emerging best practices of the European Union and the United States. There is no harm in learning from and adopting global regulations for the public good.
It is time to thwart the abuse of dominance by market-leading firms in digital ad tech, which stifles ideas, innovation, competition, free expression, and fair opportunities in the marketplace, harming consumers, the economy, and democracy. The January 2023 US Justice Department lawsuit against Google raises serious questions about the company's role as an authoritarian intermediary. Significantly, eight states joined the Justice Department in the lawsuit, including Google's home state, California.
An Indian approach to reclaim cognition
India must act fast to mitigate the high risks and ensure that search engines live up to their responsibilities and are transparent and accountable in protecting the data veracity, privacy, and safety of millions of vulnerable citizens and businesses. India must develop a digital constitutional rulebook akin to the European Union's General Data Protection Regulation (GDPR), Digital Services Act (DSA), Digital Markets Act (DMA), and Artificial Intelligence Act, backed by strong enforcement, including fines of up to 10 percent of global turnover for non-compliance.
As India aspires to become a globally competitive US$40-trillion economy by 2047, the nation must develop an “indigenous” search engine that fosters fair discourse and innovation for all. The largest democracy in the world needs unbiased online search results to thrive in a data-driven world. Big tech giants cannot be allowed to abuse their strong market position to exclude competitors unfairly. It is a serious breach of trust to deny consumers sufficient, transparent, and easy choices.
As the pace of tech adoption outstrips the existing rules and laws to regulate it, we need a robust, agile regulatory framework to facilitate secure, ethical, responsible, and trustworthy tech services. To safeguard people's interests, Indian laws must keep pace with the progress of high-risk artificial intelligence (AI). Conventional quality assurance must give way to emerging rules like the EU AI Act, which promotes AI that is safer, human-centric, and respectful of existing laws on fundamental rights. The nation must boost investments in critical computing and innovation to compete globally, ensuring legal certainty and effective governance and enforcement to drive ethical technologies that work for people, society, and the world.
India's latest Supreme Court antitrust order is a landmark decision, but the scope of the investigation could be widened further. For this to happen, the entire global innovation ecosystem must join hands to protect citizens from the gatekeepers of information. States, civil society, and citizens will have to reclaim control of technology conversations for the better. Setting up a neutral monitoring system that tracks search results can mitigate the risks. Failure to disseminate the right information to the masses will mean lost opportunities for human perspectives and progress. Let us reimagine our quest for intelligence.
The views expressed above belong to the author(s).