Author : Nisha Holla

Expert Speak | Raisina Debates
Published on Feb 19, 2026

India is assembling a sovereign AI stack—across energy, chips, compute, models, and applications—to secure strategic optionality, reduce external dependencies, and sustain global competitiveness

Operationalising India’s Sovereign AI Stack: From Intent to Capability

Artificial intelligence (AI) has evolved from a general-purpose technology into strategic infrastructure, comparable to energy grids and telecommunications networks. Recent export controls on advanced chips and rare earths, restrictions on access to frontier AI models, and the increasing concentration of compute capacity among a few global firms have strengthened the case for the rapid development of sovereign capabilities.

A sovereign AI stack refers to a nation’s ability to build, operate, and govern indigenous AI systems primarily on domestic infrastructure, under its own regulatory authority, and aligned with national priorities. It comprises five interconnected layers: energy infrastructure, chips, data centres, models, and applications. Reliable baseload power determines whether compute can run at high utilisation. Semiconductor access shapes the cost and feasibility of training and inference. Data-centre capacity governs who can store sensitive datasets and deploy models competitively. Models translate compute and data into usable intelligence within national contexts. Applications embed AI into economic and governance workflows. Sovereignty becomes critical at every layer, as dependence in one layer can cascade constraints across the stack.

Figure: Visualisation of a Sovereign AI Stack

Source: Author’s own

The United States and China have embedded AI sovereignty into their AI ecosystems. The US anchors AI leadership through the tight coupling of frontier chip design, hyperscale cloud infrastructure, defence-linked procurement, and capital markets, even as compute and model access remain concentrated among a small set of firms. China, by contrast, treats AI as a state-coordinated industrial asset, aligning energy provisioning, domestic chip substitution, cloud capacity, data access, and deployment mandates to limit exposure to external choke points. Both approaches recognise that AI sovereignty is not established through flagship models alone, but through sustained national control over end-to-end infrastructure. India’s sovereign AI stack will follow a different path and must be built around two foundational principles: maintaining strategic optionality and creating conditions for Indian innovation to remain globally competitive. India’s position across each layer of the sovereign AI stack can be considered with reference to these foundational principles.

Energy Infrastructure: The Foundation of AI Competitiveness

The AI stack depends heavily on the energy layer. Large-scale training and inference workloads require continuous baseload power with minimal tolerance for outages. The world’s largest AI clusters are increasingly located in regions that can guarantee long-term, low-variance supply, with capacity for low-cost generation, transmission, and storage. Energy policy is, effectively, becoming industrial AI policy.

India’s position is mixed but promising. Installed power generation capacity has expanded rapidly over the last decade, surpassing 500 GW as of December 2025, with renewables such as wind and solar contributing 190 GW, or nearly 40 percent. However, a grid marked by intermittent renewable supply, limited long-duration storage, and regional transmission constraints cannot yet guarantee the continuous baseload power that AI workloads demand. Recognising this, the proposed SHANTI Bill marks a strategic shift: it aims to expand India’s nuclear power base to provide 24×7 baseload supply for data centres and AI infrastructure. With proper execution, the SHANTI framework can enhance predictability, deepen baseload access, and secure long-term supply contracts.

Chips: Talent Depth Steers Indigenous Optionality

Semiconductors lie at the heart of the AI conversation, as they determine who controls the economics, scalability, and reliability of compute. The types of chips available and the certainty of their supply directly shape training and inference costs, energy efficiency, and the feasibility of deploying AI at a population scale. Export controls have demonstrated how quickly access to advanced chips and manufacturing equipment can be weaponised, highlighting significant power imbalances in global supply chains.

India is entering this arena with strengths in chip design and electronic systems engineering; however, it has historically lagged in fabrication and frontier R&D. The India Semiconductor Mission seeks to address these challenges through the launch of large-scale projects across fabrication, assembly, testing, and packaging of semiconductors. Focusing on software–hardware co-design leverages India’s strengths in system integration and design talent. As performance gains increasingly derive from advanced packaging, chiplet integration, and testing rather than node shrinkage, system-level optimisation becomes decisive. Strengthening domestic development of chip design, GPUs, and related software—including compilers, drivers, and design automation tools—reinforces these advantages. While competing with the most advanced semiconductor ecosystems may not be feasible in the immediate future, these initiatives represent tangible progress toward greater self-sufficiency.

Demand will play a critical role. Public procurement and offtake commitments in strategic sectors—including defence electronics, telecommunications, and energy infrastructure—will largely determine whether semiconductor investments translate into lasting, commercially durable domestic capability.

Data Centres: Compute as Strategic Infrastructure

Control over data centre capacity determines who can train models, where sensitive datasets reside, and whether domestic firms have access to competitively priced compute. Globally, AI compute is becoming concentrated in a small number of regions that share key characteristics: abundant energy, deep capital pools, control over intellectual property (IP), and regulatory coordination for hyperscale cloud providers. This concentration is widening a computing divide, in which countries without domestic capabilities must rely on foreign infrastructure for training, inference, and storage, often under IP controls and restricted-use regimes they cannot easily circumvent.

India’s data-centre expansion is therefore strategically significant for its AI-enabled future. Driven by cloud adoption, localisation requirements, and digital public infrastructure (DPI), domestic cloud capacity has expanded rapidly over the past decade. The proposed National Data Centre Policy signals that data centres are to be treated as strategic infrastructure rather than conventional IT assets. Long-horizon incentives, including tax exemptions extending up to two decades, are designed to anchor investment domestically. Early indications suggest momentum is building, with global firms such as Microsoft, Amazon, and Google announcing over US$ 67 billion in investments to localise AI infrastructure and cloud capacity. Indian corporations and public sector undertakings (PSUs) will need to expand their participation to match this capacity build-out.

The IndiaAI Mission reinforces this shift by directly intervening in compute provisioning. Its commitment to establishing a shared national AI compute platform, built around thousands of GPUs and made accessible through competitive, subsidised mechanisms, aims to prevent excessive concentration while lowering barriers for startups, researchers, and public institutions. By strengthening domestic computing capacity anchored in Indian energy, data, and governance frameworks, India is gradually enhancing its resilience to external shocks and policy shifts beyond its control.

From Flagship Models to Layered Model Architectures

Models are the most visible layer of the AI stack, where compute, data, and algorithms combine to produce deployable intelligence. They are also the layer most often conflated with sovereignty, as model performance is easier to benchmark than the upstream infrastructure that enables it. Replicating frontier general-purpose models at enormous cost, however, is neither feasible nor necessary for India’s near-term strategic objectives.

India’s advantage lies in contextual relevance and cost efficiency, driven by open-source models that can be rapidly reconfigured for domestic use cases. Indic languages, speech-first interfaces, and domain-specific reasoning aligned with Indian administrative and social realities represent defensible niches. The IndiaAI Mission recognises this by supporting the development of indigenous foundational models by Indian startups, using India-focused datasets such as those developed by AI4Bharat. The strategy is clear: models optimised for Indian contexts; trained on compliant data; inspired by open-source innovation; deployable on domestic compute; and offered at token costs substantially lower than those of global providers.

India requires families of models: open-weight research models to sustain academic and startup innovation; secure, government-grade models for sensitive applications and defence; and enterprise-tuned, vertical-specialist models for regulated sectors such as finance, health, and energy. This layered approach reduces concentration risk from global vendors and ensures that national capability is not locked into a limited set of architectures or providers.

Applications: Translating Sovereignty into Deployment

Applications are the layer where India can move fastest. The country’s DPI, spanning identity, payments, and interoperable public-service rails, creates a uniquely scalable environment for AI deployment.

AI-enabled fraud detection in financial systems, decision-support tools in welfare delivery, predictive analytics in agriculture, and triage systems in healthcare are areas in which domestic applications, running on domestic compute, can be embedded within mission-critical workflows. In several of these domains, early deployments have already demonstrated efficiency gains and improvements in service delivery.

When public systems rely on domestic models and computing, national capability compounds rather than being diluted through licensing fees, opaque model governance, data leakage, or cloud dependence.

The risk, however, lies in over-regulation and gatekeeping. If public data and digital utilities become captive to a narrowly defined set of vendors and consultants, sovereignty weakens. Broad-based participation by startups, research institutions, and system integrators is therefore essential.

From Strategic Intent to Sovereign Capacity

A sovereign AI stack can constitute an enduring competitive advantage for the economy. Across India’s stack—spanning energy, chips, compute, models, and applications—the policy scaffolding is now visible and largely coherent. Execution must therefore become the organising principle. Schemes such as the INR 1 lakh crore Research, Development and Innovation corpus must be deployed rapidly, with procurement-linked funding that accelerates private investment across hardware, systems, software, and applied AI.

The geopolitical context has demonstrated that AI sovereignty will accrue to countries that can continue innovating even as trade flows tighten. What remains uncertain is whether India’s strategic intent can translate into a durable national capability before global technological fragmentation hardens into a permanent competitive hierarchy. India has entered the race too late to anchor its strategy solely in frontier AI; its comparative advantage lies instead in assembling a resilient, end-to-end stack that supports emerging national priorities at a population scale.


Nisha Holla is a Visiting Fellow at the Observer Research Foundation.

The views expressed above belong to the author(s).
