OpenAI’s Cost Problem Exposed

Leaked internal documents offer an unusually clear view into how much OpenAI is paying Microsoft to power its models, and how sharply those costs have climbed. The figures point to a business generating enormous revenue but spending even more on the compute required to run its systems.
Revenue sharing runs both ways. In 2024, Microsoft reportedly received $493.8 million from OpenAI under a 20 percent revenue-share agreement, a figure that rose to $865.8 million in the first three quarters of 2025. Because Microsoft also pays OpenAI a share of Bing and Azure OpenAI revenue, the leaked amounts appear to reflect only Microsoft's net share, which means the gross payments, and the revenue they imply, are likely higher.
Implied revenue is high. Based on the 20 percent share, OpenAI's revenue was at least $2.5 billion in 2024 and roughly $4.33 billion through the first nine months of 2025, though outside reporting places 2024 revenue closer to $4 billion. OpenAI CEO Sam Altman recently said annualized revenue will surpass $20 billion this year.
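As a quick back-of-envelope check, the implied figures follow directly from dividing Microsoft's reported take by the 20 percent share. The sketch below reproduces that arithmetic; the variable names are illustrative, and the inputs are the leaked payment figures quoted above.

```python
# Back-of-envelope: if Microsoft's payments equal 20% of OpenAI's revenue,
# dividing by 0.20 gives the revenue floor those payments imply.
REVENUE_SHARE = 0.20

microsoft_take_usd = {
    "2024 (full year)": 493.8e6,
    "2025 (first three quarters)": 865.8e6,
}

for period, payment in microsoft_take_usd.items():
    implied_revenue = payment / REVENUE_SHARE
    print(f"{period}: implied revenue ≈ ${implied_revenue / 1e9:.2f}B")

# 2024 (full year): implied revenue ≈ $2.47B
# 2025 (first three quarters): implied revenue ≈ $4.33B
```

If the leaked amounts are indeed Microsoft's net share, these results are a floor rather than an estimate of actual revenue, which is consistent with outside reporting putting 2024 revenue closer to $4 billion.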
Inference is the real expense. Estimates from Ed Zitron, who reported the leaked documents, suggest OpenAI spent roughly $3.8 billion on inference in 2024 and about $8.65 billion in the first three quarters of 2025. Training costs are largely covered by Microsoft's cloud credits, but inference, the compute behind every user request, is paid mostly in cash.
Compute partners are expanding. While Azure remains the primary provider, OpenAI has also signed deals with CoreWeave, Oracle, AWS, and Google Cloud as model usage continues to scale.
The gap is widening. Taken together, the numbers imply OpenAI may be spending more on inference than it earns in revenue, raising broader questions about how sustainable today’s AI economics truly are.
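To make the gap concrete, the sketch below compares Zitron's estimated inference spend for the first three quarters of 2025 against the revenue implied by the Microsoft payments. It is a rough comparison against the implied floor, not against OpenAI's full reported revenue, and it uses only the figures already cited above.

```python
# Rough gap check for the first three quarters of 2025 (USD).
implied_revenue_9m_2025 = 865.8e6 / 0.20   # ≈ $4.33B implied by the 20% share
inference_spend_9m_2025 = 8.65e9           # Zitron's inference estimate

gap = inference_spend_9m_2025 - implied_revenue_9m_2025
print(f"Inference spend exceeds implied revenue by ≈ ${gap / 1e9:.2f}B")
# Inference spend exceeds implied revenue by ≈ $4.32B
```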
The leaked data is incomplete, but it reinforces the tension at the center of the AI boom: soaring demand, massive revenue, and an even larger cost base that could shape the next phase of investment across the industry.