By Amit Kapoor and Mohammad Saad

AI is widely used not only because of its remarkable capabilities, but also because it has remained relatively affordable. That may change, however. Rapid expansion and adoption are driving up usage costs for US-based AI models. A hike in AI fees offers an early glimpse into long-held fears about upstream dependency and reflects a broader trend in which external shocks create domestic challenges. With AI increasingly viewed as national infrastructure, this development is a reminder that India should gradually build greater control over its compute stack, even if the risks are not immediate.

While basic AI access is still affordable, prices for advanced models are rising. American AI software prices have in fact risen by 20% to 37%, according to data released in December by Tropic, a provider of procurement software solutions. A more recent example comes from Microsoft-owned GitHub, which announced that it would switch its flat-rate plans to more expensive usage-based pricing. Similarly, Anthropic has floated the idea of dropping Claude Code from its Pro plan while modifying usage limits for Pro and Max users.

Importantly, the surge in prices is closely tied to the intensity of AI adoption. AI does not read language the way humans do; it operates on tokens, each of which is essentially a chunk of text. All inputs are first tokenized and then sent to GPUs for computation. Modern AI systems need large compute capacity, and compute requirements scale with the number of tokens and the complexity of the task. If usage is intensive or complex, companies ultimately pay a higher electricity and water bill.
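To see what tokenization means in the simplest possible terms, the sketch below splits text on whitespace. This is a deliberate oversimplification for illustration: production models use learned subword vocabularies (such as byte-pair encoding), so real token counts differ from word counts.

```python
# A deliberately naive illustration of tokenization. Real models use learned
# subword vocabularies (e.g. byte-pair encoding), not whitespace splitting.
def naive_tokenize(text):
    """Split text into 'tokens' on whitespace (illustration only)."""
    return text.split()

sentence = "AI does not read language the way humans do"
tokens = naive_tokenize(sentence)

# Each chunk becomes one unit of work for the model's GPUs.
print(len(tokens), tokens[:3])  # → 9 ['AI', 'does', 'not']
```

In a real system, a single word may split into several subword tokens, which is why long documents translate into surprisingly large token counts.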

Cognizant of these dynamics, AI companies have rapidly expanded their compute access, which has led to a considerable drop in token prices (for instance, Grok 4.1 models charge only $0.20 per 1 million input tokens and $0.50 per 1 million output tokens). However, given rapid AI adoption, the overall volume of tokens is rising exponentially. This means that despite falling token prices, the overall bill for AI usage is rising. Consider a user who uploads a 1,000-page book and asks an AI to summarize it. The model must first tokenize the document to make it processable by GPUs. Even if a single token is cheap, the sheer size of the input means the total computation requirement, and therefore the cost, becomes significant. The cost rises further if the model is advanced. For example, Anthropic’s flagship Claude Opus 4.6 costs $5.00 per million input tokens and $25.00 per million output tokens.
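The book example can be made concrete with back-of-the-envelope arithmetic. The prices are the ones quoted above; the tokens-per-page figure and summary length are illustrative assumptions, not published numbers:

```python
# Rough cost estimate for summarizing a 1,000-page book.
PAGES = 1000
TOKENS_PER_PAGE = 500     # assumption for illustration; varies with text density
SUMMARY_TOKENS = 2_000    # assumed length of the generated summary

input_tokens = PAGES * TOKENS_PER_PAGE  # 500,000 tokens

def cost_usd(input_toks, output_toks, in_price, out_price):
    """Prices are quoted in USD per 1 million tokens."""
    return input_toks / 1e6 * in_price + output_toks / 1e6 * out_price

# Per-million-token prices cited in the article.
grok_cost = cost_usd(input_tokens, SUMMARY_TOKENS, 0.20, 0.50)   # budget model
opus_cost = cost_usd(input_tokens, SUMMARY_TOKENS, 5.00, 25.00)  # frontier model

print(f"Grok-class model: ${grok_cost:.2f}")   # → $0.10
print(f"Opus-class model: ${opus_cost:.2f}")   # → $2.55
```

A single request is cheap either way, but multiplied across millions of users making such requests daily, the aggregate compute bill grows quickly, and the frontier model costs roughly 25 times more per request.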

A rather unexpected development comes from China, where DeepSeek’s flagship V4 Flash operates at roughly $0.14 per million input tokens and $0.28 per million output tokens, much less than its American counterparts. China has achieved this efficiency through a mix of factors, including sparse architectures such as Mixture of Experts, in which only a fraction of the model activates per query, sharply lowering compute costs per inference.
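The Mixture-of-Experts idea can be sketched in a few lines. This toy version is purely illustrative (the expert counts, random scores and routing function are assumptions, not DeepSeek's actual architecture): a router scores a pool of experts per token and only the top few run, so compute per inference scales with the active count rather than the total model size.

```python
import random

# Toy Mixture-of-Experts routing -- illustrative only, not any real model.
NUM_EXPERTS = 64   # total experts available in the layer
TOP_K = 4          # experts actually activated per token

def route(scores, k=TOP_K):
    """Return indices of the k highest-scoring experts for one token."""
    ranked = sorted(range(len(scores)), key=scores.__getitem__, reverse=True)
    return ranked[:k]

random.seed(0)
scores = [random.random() for _ in range(NUM_EXPERTS)]  # stand-in for router logits
active = route(scores)

# Only TOP_K / NUM_EXPERTS of the layer's parameters do work for this token.
fraction = TOP_K / NUM_EXPERTS
print(f"Active experts: {sorted(active)}; compute fraction: {fraction:.1%}")
```

With these illustrative numbers, only about 6% of the layer's experts fire per token, which is the basic mechanism by which sparse models cut inference cost relative to dense ones of the same total size.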

For India, however, this is only partially reassuring, as US-based models dominate the market, with around 100 million users of ChatGPT alone. While the risks from rising subscription costs are not immediate, India will eventually need to build greater sovereign technological infrastructure, as a significant share of Indian AI workloads is still processed on foreign-owned infrastructure. As demand rises rapidly, AI companies charge more not only for prioritising user requests but also for handling increasingly complex tasks. Upstream shocks, combined with factors such as currency differentials, leave dependent countries more vulnerable to such cost increases. Avoiding long-term structural dependence is therefore critical; otherwise, India could remain persistently exposed to upstream disruptions, of which subscription fee hikes may be only one example.

If AI usage fees become critically high in the medium to long term and domestic compute does not expand, the available solutions would simply replace one dependence with another. For instance, it is sometimes argued that Indian businesses are not committed to costly frontier development and may simply switch to open-source models like DeepSeek instead of paid proprietary ones. However, AI is not merely another piece of software; it has begun to resemble critical infrastructure. If a country becomes deeply dependent on open-source models like DeepSeek, trade-offs involving security concerns, data localization questions and geopolitical trust persist, not to mention that open-source models lag behind frontier systems on many complex tasks.

Importantly, even if India is not building frontier AI systems, it still requires substantial compute access to train custom domain-specific models. While compute infrastructure is expanding in the country, a significant share of it remains foreign-owned. India’s data centre sector attracted nearly $13–15 billion in investment between 2020 and 2024, of which roughly 80% came from foreign entities, meaning that a large share of the returns from this infrastructure will flow abroad.

Given these dynamics, India sits at a critical juncture. Staying with American models means absorbing cost shocks and geopolitical exposure, while turning to Chinese alternatives trades one form of dependency for another. Building domestic capabilities would take years but remains the ideal way forward, and domestic conglomerates like Reliance are already aggressively entering the space. For now, the country may do best by making the most of available compute: fine-tuning open-access base models on India-specific domain data while investing in compute-efficient research, akin to what China is doing.

Even as domestic compute expands over the long term, India must remain mindful that data centre growth alone does not guarantee cheaper compute. Affordable AI depends on a combination of factors, including access to low-cost energy, GPUs and innovation in cost-effective compute utilization. This means India will require a well-rounded policy framework capable of ensuring affordable power, sufficient water availability, semiconductor access and support for deep-tech startups, while balancing climate goals. Ultimately, the country needs an approach that treats the challenge as one demanding multiple policy levers, calibrated and working in close synchronization.

(Amit Kapoor is chair and Mohammad Saad is a researcher at the Institute for Competitiveness. X: @kautiliya)

The article was published with Economic Times on May 12, 2026.
