100 thousand tenge a month: the true price of an AI subscription

Today, paid access to powerful neural networks is often perceived almost as an unlimited service. A user signs up, opens the chat, and quickly becomes dependent on AI-generated text, analysis, code, and images. But the industry itself lives differently: OpenAI sells ChatGPT Pro for two hundred dollars a month, which amounts to nearly 100 thousand tenge.
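The headline figure is easy to verify as back-of-the-envelope arithmetic. The exchange rate below is an assumption for illustration (roughly 500 tenge per US dollar); the actual rate fluctuates daily.

```python
# Rough check of the headline figure. The exchange rate is a
# hypothetical round number, not an official quote.
USD_TO_KZT = 500  # assumed tenge per dollar

monthly_usd = 200  # ChatGPT Pro subscription price from the article
monthly_kzt = monthly_usd * USD_TO_KZT

print(f"{monthly_kzt:,} tenge per month")      # 100,000 tenge per month
print(f"{monthly_kzt * 12:,} tenge per year")  # 1,200,000 tenge per year
```

At that assumed rate, a year of the subscription costs more than a million tenge, which is what makes the pricing question a household one rather than a corporate one.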

At the same time, even such a tariff does not mean the market has finally learned to turn a profit without relying on external funding. According to Reuters, OpenAI expected up to five billion dollars in losses for the past year, and a significant portion of that risk was tied specifically to spending on computing power. In other words, the client pays, but the platform may still be subsidizing their comfort out of its own pocket or with investors' money.

Who pays for the generosity of neural networks?

The claim that AI companies are "subsidizing requests" sounds almost like an abstraction, though its meaning is very practical. The user sees a sleek, branded design and an intuitive interface, while behind the scenes the company pays for servers, accelerators, electricity, data storage, model fine-tuning, and the constant expansion of infrastructure.

Reuters, citing The Information, reported that OpenAI expects to burn through colossal sums before reaching sustainable profitability, while the industry simultaneously invests hundreds of billions into new data centers. When companies have to build a digital power plant first just to sell a "magic chat" later, the cheap end-user tariff almost inevitably becomes partially artificial.

Where the true price of a request comes from

Generative AI has one peculiarity that marketing is reluctant to discuss. It isn't just an app. It is a heavy computing machine, and every long answer, complex reasoning, image or agentic scenario requires real power. The International Energy Agency estimates current electricity consumption by data centers at approximately 415 terawatt-hours and expects nearly twofold growth by the end of the decade, naming AI as the primary driver of this surge.
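The cited IEA trajectory implies a steep compounding rate. A small sketch, assuming "the end of the decade" means roughly a five-year horizon (the horizon is an assumption; the 415 TWh figure and the "nearly twofold" growth are from the article):

```python
# Implied annual growth rate of data-center electricity use,
# given ~415 TWh today and a near doubling by decade's end.
current_twh = 415
projected_twh = current_twh * 2   # "nearly twofold growth"
years = 5                         # assumed horizon

annual_growth = (projected_twh / current_twh) ** (1 / years) - 1
print(f"{projected_twh} TWh by decade's end, ~{annual_growth:.1%} per year")
```

Doubling over five years works out to roughly 15% compound growth per year, which is why the agency names AI as the primary driver of the surge.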

The smarter the model and the more generous its limits, the more expensive its "politeness" becomes. OpenAI explicitly states that the API is billed separately from the subscription, with costs depending on the models and tools used. Anthropic follows the same logic. In professional use, the price of AI has long ceased to look like a harmless monthly fee: it depends on volume, complexity, and how deeply a business has integrated the model into its daily processes.
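The shift from a flat fee to usage-based billing is easiest to see with token arithmetic. A minimal sketch, in which the per-token rates are hypothetical placeholders, not real OpenAI or Anthropic prices:

```python
# How metered API billing scales with volume. Rates below are
# illustrative assumptions, not actual vendor prices.
INPUT_RATE_PER_1K = 0.01   # USD per 1,000 input tokens (assumed)
OUTPUT_RATE_PER_1K = 0.03  # USD per 1,000 output tokens (assumed)

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Cost in USD of one request at the assumed rates."""
    return (input_tokens / 1000 * INPUT_RATE_PER_1K
            + output_tokens / 1000 * OUTPUT_RATE_PER_1K)

# One chat turn vs. a month of automated document processing:
single = request_cost(2_000, 800)
monthly = single * 10_000  # 10,000 such requests per month
print(f"one request:     ${single:.3f}")   # one request:     $0.044
print(f"10,000 requests: ${monthly:,.0f}") # 10,000 requests: $440
```

A few cents per request looks harmless; multiplied by an integrated business workflow, the same arithmetic produces a bill that no flat subscription resembles.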

Why the market is keeping prices below the "pain point"

The answer is unpleasantly simple. Currently, there is a struggle not just for money, but for habit. As long as a person perceives a neural network as a convenient extension of their workspace, they remain within the ecosystem, migrating their tasks, documents, writing style, and personal daily logic there. In this phase, it is more profitable for the market to earn less today than to lose the user tomorrow. This is an old digital strategy: offer a lot at first, then test how much the audience is willing to pay once they can no longer imagine life without the service.

What this means for Kazakhstan

For the Kazakhstani user, this is a story not only about global economics but also about future household expenses. As long as neural networks help write texts, compile documents, translate, process images, and save hours of manual labor, they seem profitable almost by default. However, if the period of hidden subsidies begins to shrink, many specialists and small teams will suddenly discover that they have built their processes on a tool whose real price was masked as an "affordable service."

This is where the main question arises, generative AI has already become a mass-market product in terms of perception, but by its nature, it remains expensive infrastructure. Such contradictions do not survive for long without a final bill. When the era of generous habit-forming ends, the market will begin to price more honestly and then the conversation will no longer be about the wonders of neural networks, but about who exactly is picking up the tab for the feast.