Krutrim is pivoting from developing its own generative AI models to offering full-stack AI cloud services [1].

This strategic shift reflects the growing economic difficulty of sustaining large-scale model development in India. By moving into infrastructure, the company aims to capture immediate market demand for compute and hosting services rather than betting on the long-term payoff of proprietary AI [2].

Krutrim, which is backed by Ola and described as India's first GenAI unicorn, will now focus on providing compute, hosting, and enterprise AI solutions [1]. The company is transitioning its business model to prioritize commercial viability in the near term [3].

Several factors drove the decision to move away from model creation. The company has faced a slowdown in AI model development and significant economic challenges associated with building large models within India [2]. This pivot follows a period of internal restructuring characterized by recent layoffs and a pause on chip design [3].

By offering a full-stack cloud approach, Krutrim seeks to provide the underlying hardware and software environment necessary for other businesses to run AI applications [1]. This allows the startup to monetize the physical and digital infrastructure of AI—the servers and hosting environments—rather than competing solely on the intelligence of the models themselves [2].

The move signals a broader trend among AI startups to pivot toward the "picks and shovels" of the industry. Instead of risking capital on the unpredictable success of a specific model, Krutrim is betting on the steady demand for the compute power that fuels all generative AI [3].

This pivot suggests that the cost and complexity of training frontier AI models may be prohibitive for startups, even those with unicorn valuations. By shifting to cloud infrastructure, Krutrim is moving from a high-risk R&D play to a utility-based business model, reflecting a global trend where the most stable profits in the AI boom are currently found in the hardware and hosting layers.