Expert Speak Digital Frontiers
Published on May 31, 2025

India must factor in demand, sectoral priorities, innovation, and access to optimise computing approaches to build its national AI innovation capabilities. 

Optimising Computing Approaches for India’s National AI Capabilities


Artificial Intelligence (AI) compute demand has grown exponentially for over a decade. Since 2011, the amount of compute used to train AI models has grown by a factor of 350 million. Since 2010, as deep learning gained traction, the compute requirements of AI models began doubling every 5.7 months. In 2015, large-scale machine learning (ML) models emerged with 10- to 100-fold higher training compute requirements. Unlike the other two building blocks of AI innovation—data and models—compute is a scarce resource.
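To illustrate how quickly a fixed doubling period compounds, the following sketch computes the growth factor implied by the 5.7-month figure cited above. The function and the decade-long window are illustrative assumptions, not figures from the article's sources.

```python
# Illustrative sketch: compound growth implied by a fixed doubling period
# for AI training compute.
def growth_factor(months_elapsed: float, doubling_months: float) -> float:
    """Multiplicative growth after `months_elapsed`, assuming compute
    doubles every `doubling_months` months."""
    return 2 ** (months_elapsed / doubling_months)

# One doubling period doubles compute.
print(growth_factor(5.7, 5.7))  # 2.0

# Over a single decade (120 months), a 5.7-month doubling period
# compounds to roughly a two-million-fold increase in compute.
print(round(growth_factor(120, 5.7)))
```

The point of the arithmetic is that even modest-sounding doubling periods translate into multi-million-fold growth within a decade, which is why compute, unlike data and models, behaves as a scarce resource.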


The rise in computing demand has coincided with growing demand for data centres. According to a mid-range scenario by McKinsey, global data centre capacity could triple by 2030, with about 70 percent of that demand driven by AI. However, data centres require a combination of scarce resources, such as real estate and power (AI is projected to drive a 165 percent increase in data centre power demand by 2030), alongside high-end strategic and technological inputs such as advanced semiconductors. Demand is also fuelled by cloud service providers catering to AI workloads. As AI technologies and computing demand continue to evolve against the backdrop of an unequal global distribution of resources, national computing strategies must span a diversity of approaches, including centralised data centre infrastructure as well as distributed computing capabilities.

As India stands on the cusp of AI-driven transformation, it becomes imperative to articulate the components of a forward-looking compute strategy. India’s compute landscape is marked by robust data centre growth (a Compound Annual Growth Rate, or CAGR, of 24 percent since 2019); public provisioning of Graphics Processing Units (GPUs) through the IndiaAI Mission’s compute portal, which aims to offer access to GPUs at reduced rates; and proposals for a decentralised network of micro data centres that optimise space, energy, and cost. In addition, there is supercomputing availability under the National Supercomputing Mission for academic research and development, along with innovations such as the Ziroh Lab and the Indian Institute of Technology (IIT)-Madras’ Kompact AI, which enable AI models to run on Central Processing Units (CPUs) instead of GPUs. These developments highlight the need to align this momentum with the evolving nature of compute demand—from training to inference—and to recognise the emerging importance of edge computing (running AI models on devices and near data sources) in critical sectors. Policy direction in this domain must be shaped by technological, economic, and geopolitical realities.

Building National Innovation Capabilities – Comparing Approaches

National innovation capabilities in AI depend on a country’s compute infrastructure as one of the foundational capabilities. Developing these core capabilities is vital for economic development and security. Given the strategic importance of AI as an arbiter of global dominance, and the unequal distribution of computing resources, compute has emerged as the fulcrum of geopolitical posturing—with export controls, retaliatory measures, and national AI innovation policies determining a country’s AI innovation trajectory. 


As countries seek to bolster domestic innovation capacities, large-scale infrastructural investments have become paramount. The concept of ‘AI factories’ is gaining traction as a model for optimised computing infrastructure capable of handling AI workloads and life-cycle management. Specialised data centres for AI are becoming a central infrastructural trend, attracting substantial private investment. Industry leaders such as NVIDIA are collaborating with firms including Foxconn, Dell, and ASUS to build dedicated facilities for generative AI (Gen AI) workloads. NVIDIA is also investing in AI supercomputers in the United States (US) that will power the AI factories of the future. The US hosts several of the largest technology firms in the AI value chain: developers such as OpenAI; cloud service providers like Amazon Web Services, Microsoft, and Google; leading chip designers and manufacturers such as NVIDIA, AMD, and Intel; and semiconductor equipment producers such as KLA, LAM, and Applied Materials. This environment supports a private sector-driven approach to AI infrastructure. Private investment in the US is buoyed by the country’s current policy direction of developing strategic technological capabilities at home, aiming to strengthen its AI infrastructure and tech manufacturing.

China’s version of AI factories, known as ‘intelligent computing centres’, has expanded at an accelerated pace in recent years, driven by a state-led approach with support from private players including Alibaba, Tencent, and Baidu. China has also developed underwater computing centres, such as the one in Hainan, to boost AI development and its national AI computing network. The Hainan cluster supports high-end AI applications, reportedly including DeepSeek. Underwater centres are more energy efficient and help overcome real estate constraints, while potentially delivering higher computational efficiency than land-based facilities.

However, China’s aggressive buildout resulted in below-standard facilities and oversupply. Many data centres were designed primarily for pretraining workloads—suited to training models rather than running them for inference. With the advent and adoption of DeepSeek, industry demand has shifted towards inference, which requires a different hardware profile.

Reasoning models, which are becoming increasingly important, perform successive logical deductions in response to user queries, making low latency (the time taken for data to travel from one point in the network to another) and high speed crucial. This shift has created a mismatch: because inference-oriented data centres need to be co-located with major tech hubs, facilities in central, western, and rural China, built where electricity and land costs are lower, now remain underutilised.

AI factories are also a core component of the European Union’s (EU) AI strategy, with the EU adopting an institutional approach to ensure access to computing capacity through research and technology hubs. The aim is to enable start-ups, Small and Medium Enterprises (SMEs), and researchers to develop AI systems and solutions. These efforts aim to stimulate strategic sectors such as health, space, finance, climate, and manufacturing. The initiative operates under the aegis of the European High Performance Computing Joint Undertaking (EuroHPC). 

The Indian Imperative – Balancing Scale, Efficiency, and Access

The US, China, and the EU exemplify contrasting approaches to building compute infrastructure: the US through a market-led model, China through state-directed infrastructure expansion, and the EU through a public-institutional framework. Under the IndiaAI Mission, India made around 14,000 GPUs available in the first round and, according to recent reports, is set to add 14,000 more to its earlier acquisition of around 18,700. However, concerns have emerged regarding the efficacy of government provisioning, including the risk of market distortion and potential bureaucratic hurdles that may limit access to this computing capacity. These developments are unfolding amid a shift in demand from training to inference and towards processing data at the edge, in addition to outstanding proposals for decentralised computing infrastructure.

This underscores the need for a computing strategy and policy direction attuned to:

(a) The evolving nature of compute demand 

(b) Key sectors and industries with high AI penetration

(c) The innovation aspirations of India’s start-up ecosystem

(d) Low barriers to access.

While overall demand is shifting towards inference, India is also developing its own indigenous Large Language Models (LLMs), which will require significant training capacity. Balancing training and inference compute capabilities is critical. Moreover, understanding the sectoral potential for AI integration will help anticipate future demand. Industries such as telecommunications, manufacturing, automotive, and healthcare show strong potential for edge computing, where data needs to be processed close to the point of use for greater speed and efficiency.

Currently, most start-ups veer toward application-focused innovation to yield quicker returns on investment. However, as deep tech gains importance, understanding the evolving nature of compute demand will help unlock broader innovation potential. Additionally, lowering access barriers is essential—not only to expand innovation to the last mile, but also to drive productivity benefits for SMEs through AI adoption. 

This highlights the importance of optimising computing approaches that balance scale, efficiency, and access. India must align its strategy with demand dynamics, sectoral priorities, innovation enablement, and inclusive access. Achieving this balance will require harnessing market dynamics, promoting private investment, and offering targeted policy support for decentralised compute infrastructure, alongside facilitating access to centralised compute resources through government channels. 


Anulekha Nandi is a Fellow at the Centre for Security, Strategy and Technology, Observer Research Foundation.

Anusha Guru is a Research Intern at the Observer Research Foundation.

The views expressed above belong to the author(s).
