Author: Kumkum Mohata

Expert Speak Raisina Debates
Published on Mar 03, 2026

India’s AI opportunity lies not in frontier scale but in frugal, compute-efficient, multilingual systems that maximise productivity and inclusion while enabling Global South leadership under real resource constraints

From Scale to Efficiency: India’s Frugal AI Opportunity

Artificial Intelligence (AI) is scaling rapidly, but its underlying economics remain unsettled. The global AI market is projected to exceed US$ 17 billion by 2027, with investment rising sharply each year. Yet, increasing model capability has not translated proportionately into economic returns. High infrastructure costs, growing energy consumption, and reliance on proprietary compute continue to erode margins and constrain diffusion. The result is a widening gap between AI capability and real-world adoption.

The recently concluded India AI Impact Summit reflected these concerns, emphasising the need to build and deploy AI systems under real resource constraints. This aligns with the direction outlined in the Economic Survey, which underscores the need for cost-conscious, context-appropriate innovation. In this regard, frugal AI presents a pragmatic way forward.

The Limits of Frontier Scale in India

The global AI landscape is marked by structural asymmetries in capabilities and resources. In India’s case, these manifest as distinct demand-side and supply-side constraints. On the demand side, adoption presents a mixed picture. Despite ranking third globally in AI competitiveness, nearly 45 percent of Indian firms remain in the early stages of AI maturity. Smaller firms and micro-enterprises face high infrastructure costs and limited digital literacy. At the consumer level, digital access has expanded through smartphones, but capability remains uneven. Urban users increasingly engage with generative AI tools for learning and work, while many rural and semi-urban users are confined to passive consumption. Gaps in skills, school-level digital infrastructure, and local-language interfaces mean that demand depends not only on access but also on usable, context-appropriate AI solutions.


On the supply side, India faces constraints in high-performance computing capabilities and access to specialised hardware. The Economic Survey notes that global firms are projected to incur extremely high capital outlays for computing infrastructure, with uncertain returns. These pressures extend beyond compute to the underlying infrastructure. AI systems are highly energy-intensive, and scaling data centre capacity requires reliable power, cooling systems, and land. At the same time, access to risk capital for deep-tech and frontier AI remains limited relative to global benchmarks, constraining domestic firms’ ability to compete at scale.

At the global frontier, firms such as OpenAI have reportedly incurred extremely high operating costs for systems such as ChatGPT, with uncertain profitability timelines. By contrast, more compute-efficient challengers report comparable performance at lower cost. Taken together, these trends suggest that while the frontier drives technological progress, it is not the sole model of success. For India, the more pertinent question is how to build deployable, multilingual, and affordable systems that function in real-world settings. This is where frugal AI emerges as a strategic approach.

Operationalising Frugal AI

India’s AI pathway can be understood as a constrained optimisation problem, with the objective of maximising productivity under limits of compute, capital and infrastructure. Frugal AI aligns with this objective by focusing on computational sufficiency—the minimum resource footprint required to achieve task-specific performance. Empirical evidence suggests that adding more compute improves performance, but gains taper off as scale increases. At the same time, frontier training costs have grown rapidly, estimated at a rate of 2.4 times per year since 2016. This implies that beyond a certain point, further scaling becomes economically inefficient, with steeply rising costs outweighing marginal gains.


Figure 1 depicts this conceptual ‘bliss point’: benefits increase with adoption but gradually plateau, while costs rise at a faster rate. The highest net payoff occurs where the vertical gap between the benefit and cost curves is widest. This represents the Frugal AI optimum.

Figure 1: Trade-offs in AI Scaling: A Frugal AI Optimum


Source: Author’s Illustration

(Stylised representation of the trade-off between performance gains and rising compute costs, with an interior optimum from diminishing returns)
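The stylised trade-off in Figure 1 can be sketched numerically. The curves below are purely illustrative assumptions, not empirical estimates: a concave (logarithmic) benefit curve to capture tapering gains, and a convex (exponential) cost curve to echo the rapid growth in frontier training costs. The interior maximum of the net payoff is the frugal AI optimum.

```python
import numpy as np

# Stylised, illustrative curves -- not calibrated to real data.
compute = np.linspace(0.1, 10, 1000)   # arbitrary units of compute/scale
benefit = 10 * np.log1p(compute)       # concave: gains taper off with scale
cost = 0.5 * np.exp(0.6 * compute)     # convex: costs accelerate with scale
net = benefit - cost                   # net payoff between the two curves

# The interior maximum of net payoff is the 'frugal AI optimum':
# scaling beyond it costs more than it returns.
optimum = compute[np.argmax(net)]
print(f"Frugal AI optimum at ~{optimum:.2f} compute units")
```

With these assumed curves the maximum falls well inside the range, illustrating why further scaling beyond a point becomes economically inefficient even as raw capability keeps rising.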

At this equilibrium, productivity gains derive from targeted deployment. In agriculture, AI-enabled advisory and market-linkage systems have improved price discovery and logistics efficiency without requiring frontier-scale models. In health, low-cost AI diagnostics for early cancer screening demonstrate how narrow, context-optimised systems generate measurable outcome gains under resource constraints. These applications illustrate how computational efficiency translates into tangible improvements in service delivery and income stability.

Beyond efficiency gains, frugal AI enables broader participation in innovation. India’s digital public infrastructure, built on interoperable and open protocols, demonstrates how modular systems can scale inclusively without frontier-scale capital intensity. Emerging domestic players such as Sarvam AI further reflect this shift. Rather than competing on scale, Sarvam has developed models tailored to Indian languages and real-world use cases, outperforming global systems such as ChatGPT and Gemini on select benchmarks in document intelligence and speech processing. Initiatives such as Bhashini and open-language ecosystems such as AI4Bharat lower entry barriers for developers while reducing the energy and compute intensity associated with large-scale models.

Designing Policy Foundations for Frugal AI

Frugal innovation is already embedded in India’s development trajectory, shaped by resource constraints, a large and cost-efficient technical workforce, and a proven ability to build population-scale digital systems. Therefore, India’s AI advantage lies not in competing at the frontier of model size but in democratising deployment. Policy should advance this shift sequentially. The India AI Impact Summit 2026 signals this direction, emphasising democratised access to AI resources and the scaling of real-world applications. First, a critical starting point is the emerging compute commons. Shared GPU access can reduce entry barriers, but allocation design is crucial. Without transparency, subsidised compute risks being captured by well-capitalised private actors. Access should be tied to use cases, reporting requirements, and measurable outcomes.


Second, India should scale sector-specific federated data exchanges, building on models such as the BODH health platform showcased at the summit, and expand such architectures across agriculture, mobility, and education. These exchanges demonstrate how innovation can be enabled without centralising sensitive data, foregrounding interoperability alongside safeguards. Third, financing must pivot towards deployment-linked incentives to address the persistent gap between pilot innovation and population-scale adoption. Fourth, India’s entrepreneurial ecosystem must be leveraged as a core driver of frugal AI innovation. With one of the world’s largest startup bases and deep experience in building cost-efficient, scalable digital solutions, Indian firms are well-positioned to translate AI capabilities into context-specific applications. Finally, India requires a practical evaluation framework. Success should be measured by cost-efficiency, language quality, and user adoption.

AI for the Global South

Building on these policy enablers, India can position frugal AI as a Global South export strategy, shifting from technological catch-up to first-mover advantage in low-cost, scalable AI systems. India is well placed to lead—not by exporting a single model but by demonstrating a replicable deployment logic: shared compute, interoperable data, multilingual systems, and outcome-linked scaling. In this regard, frugal AI is not merely a domestic strategy; it is an exportable governance and deployment template for the Global South, with the potential to generate economic returns through scalable, cost-efficient AI solutions.


Kumkum Mohata is a Research Assistant with the Centre for New Economic Diplomacy at the Observer Research Foundation.

The views expressed above belong to the author(s).
