A company valued at hundreds of billions of dollars wants to borrow billions to build a house.
Lenders say: No.
The reason is straightforward: your business model hasn’t been validated, and analysts predict you might burn through all your cash by mid-2027. What will you use to repay?
This isn’t a financing failure of a startup. It’s the real story of OpenAI in 2025.
According to an exclusive report by The Information, OpenAI sent executives across the U.S. to scout locations, planning to build its own data centers and seeking to raise billions of dollars to start construction. The result: rejection from lenders. Tom’s Hardware, citing analysts, suggests OpenAI might run out of cash as early as mid-2027.
A year ago, Sam Altman stood beside the White House podium and announced the Stargate project: $500 billion over four years, building the world’s largest AI data center network with SoftBank and Oracle. Trump called it “the biggest AI infrastructure project in history.”
A year later, this joint venture has neither assembled a team nor developed any data centers. The three partners haven’t even agreed on responsibilities. OpenAI itself cannot build what it wants.
So, OpenAI started doing the math.
The $500 billion dream shattered over “who will manage it.”
The Information reconstructs a story of a year-long collapse behind the spotlight.
Weeks after the White House briefing, Stargate fell into paralysis. No one took the lead, no coordination mechanism. OpenAI, Oracle, and SoftBank repeatedly argued over who would build, who would manage, and how to split the costs.
OpenAI’s initial obsession was to build its own data center. The logic was sound: leasing compute power long-term is too expensive; only building it themselves can control their destiny.
But lenders saw it differently.
A company that burned through $2.5 billion in cash in six months and expects to burn $8.5 billion annually, asking to borrow billions to build data centers? Lenders look at your cash flow, not your pitch deck. And OpenAI estimates it won’t turn cash-flow positive until 2029 at the earliest.
It’s like someone with no income asking a bank for a mortgage on a villa—the bank’s first question is: what will you use to pay it back? And OpenAI can’t answer.
The self-build route is blocked. OpenAI was forced back to negotiations, continuing talks with Stargate’s partners.
But negotiations are tough. SoftBank has several large data center projects in Texas. OpenAI wants to use one of them as its first facility. SoftBank refuses, wanting to retain control. In September and October, OpenAI teams flew to Japan multiple times to negotiate face-to-face with Masayoshi Son.
The final outcome: OpenAI signs a long-term lease, controls the design; SoftBank’s SB Energy will develop and hold the facility.
In other words, OpenAI went from wanting to be the landlord to becoming a tenant.
$800 billion evaporated
If internal chaos at Stargate was a hidden wound, this number is a public self-correction.
According to CNBC, OpenAI lowered its total compute expenditure target through 2030 from about $1.4 trillion to around $600 billion, with a clearer timeline and revenue forecast. By 2030, revenue is expected to exceed $280 billion, split evenly between consumer and enterprise segments.
From $1.4 trillion down to $600 billion—a 57% reduction.
The official explanation: “To better align spending with revenue growth.”
The real meaning: investors are no longer convinced.
That previous figure was more like a wish list; $600 billion is at least a number that can be modeled. But even so, to reach over $280 billion in revenue by 2030, OpenAI would need a compound annual growth rate of over 50% for five consecutive years. Who can guarantee that?
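As a sanity check on that claim, the implied growth rate can be computed directly from the two figures quoted above ($13.1 billion in 2025 revenue, $280 billion targeted for 2030). This is a back-of-envelope sketch; the exact rate depends on how the compounding window is counted:

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate needed to go from `start` to `end` in `years` years."""
    return (end / start) ** (1 / years) - 1

# Figures quoted in the article: $13.1B (2025) -> $280B+ (2030), five years.
required = cagr(13.1, 280.0, 5)
print(f"Required CAGR: {required:.1%}")  # well over 50% per year (roughly 84%)
```

If anything, "over 50%" understates the challenge: sustaining that pace for five straight years would be nearly unprecedented at this revenue scale.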
In 2025, OpenAI’s revenue was $13.1 billion, with $8 billion burned. Profitability is still far off. The company expects to turn positive cash flow only by 2029. Before then, cumulative losses could reach $115 billion.
This is the wake-up call.
It’s not that Altman doesn’t want to spend $1.4 trillion. It’s that reality tells him: you can’t afford it.
The books can’t support the dream
Why is OpenAI forced to shift from a dreamer to a bean counter? Not because of strategic mistakes, but because three harsh facts hit simultaneously.
First, money is flowing out much faster than it is coming in.
In the first half of 2025, OpenAI’s revenue was $4.3 billion, burning $2.5 billion in cash. Full-year revenue was $13.1 billion, burning $8 billion. According to investor documents cited by Fortune, the company expects losses to grow annually, with operational losses possibly reaching $74 billion by 2028, only turning positive around 2029 or 2030. Total losses could reach $115 billion.
OpenAI is currently spending far faster than it earns. Mathematically, these lines will eventually cross—either in 2029 or never.
Second, can compute efficiency offset scale expansion? OpenAI’s “compute profit margin” (revenue minus model operating costs) improved from 52% in October 2024 to 70% in October 2025, thanks to algorithm optimization and better hardware utilization. But every time the company ships larger models or more compute-intensive features (like video generation), those efficiency gains get eaten up.
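Those two margin figures imply a steep drop in serving cost per dollar of revenue. A quick back-of-envelope calculation using only the percentages quoted above:

```python
# "Compute profit margin" = (revenue - model operating cost) / revenue,
# so compute cost per dollar of revenue = 1 - margin.
margin_2024 = 0.52  # October 2024, as quoted above
margin_2025 = 0.70  # October 2025, as quoted above

cost_2024 = 1 - margin_2024  # $0.48 of compute cost per revenue dollar
cost_2025 = 1 - margin_2025  # $0.30 of compute cost per revenue dollar

reduction = 1 - cost_2025 / cost_2024
print(f"Serving cost per revenue dollar fell {reduction:.1%}")  # 37.5%
```

A 37.5% cut in unit serving cost in one year is real progress; the article's point is that new, heavier workloads keep absorbing gains of exactly this size.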
Third, the paid conversion rate has plateaued.
ChatGPT’s weekly active users surpassed 900 million. But according to Incremys data, the paid conversion rate is only about 5%, with over 95% of users on the free tier. OpenAI has begun testing ads in the free version. This signals that the subscription model has reached a ceiling—users’ attention is being monetized.
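Taken at face value, those two figures put the paying base in the tens of millions. This is a rough estimate only, since the 5% conversion rate may not be measured against weekly actives:

```python
weekly_active_users = 900_000_000  # figure quoted above
paid_conversion = 0.05             # ~5%, per Incremys data quoted above

paying_users = int(weekly_active_users * paid_conversion)
print(f"Implied paying users: ~{paying_users:,}")  # ~45,000,000
```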
Meanwhile, competitors are stealing users with less spending. According to Similarweb, ChatGPT’s global traffic share dropped from 87% to about 65% in a year. Google Gemini, integrated by default into Android and embedded in Workspace, surged from 5% to 21%, not because of a stronger model, but due to distribution dominance. Anthropic’s Claude, with only 2% traffic share, achieves the highest user engagement (average 34.7 minutes daily), targeting high-end enterprise clients, and spends far less than OpenAI.
“ChatGPT created this category, but when substitutes appear, users naturally disperse,” says Tom Grant, VP of research at Apptopia.
And competitors are doing the same with less money. DeepSeek uses open-source models and ultra-low costs to shake up the market. Google leverages distribution. Anthropic focuses on high-value customers. If AI models tend toward feature convergence, the market will ultimately be decided not by who has the strongest model, but by who has the deepest ecosystem and lowest costs.
OpenAI is trying to win three battles simultaneously: model competition, infrastructure race, and commercialization. But historically, no company has succeeded in all three at once.
Altman’s Plan B
The dream is shattered, but Altman hasn’t stopped.
He did something recommended by every business textbook but rarely undertaken by dreamers: he abandoned obsession and chose to survive pragmatically.
The dream of building its own data centers is gone. Instead, the strategy is to sign numerous partnerships outside the Stargate framework—signing a $30 billion annual compute purchase agreement with Oracle, deepening cooperation with CoreWeave, and even supplementing with AWS and Google Cloud. Chip supply is also diversified, adding AMD and startup Cerebras alongside Nvidia.
OpenAI’s CFO Sarah Friar publicly stated at Davos that the company is intentionally protecting its balance sheet through partnerships.
A year ago, this was unimaginable. Back then, Altman talked about trillion-dollar infrastructure commitments, 10 GW of compute capacity, and a mission to change human destiny with artificial general intelligence. Now, his CFO talks about “protecting the balance sheet.”
But OpenAI’s fundraising remains staggering, with the latest round possibly exceeding $100 billion. According to Bloomberg, OpenAI is close to closing the first tranche of a new funding round, with the overall valuation possibly surpassing $850 billion. Participating investors include Amazon (expected to invest $50 billion), SoftBank ($30 billion), Nvidia ($20 billion), and Microsoft.
Note the nature of these investors: chip suppliers, cloud providers, and strategic investors requiring OpenAI to use their services. This isn’t venture capital betting on a dream; it’s supply chain locking in a major customer.
Investing in OpenAI now is less like buying a lottery ticket and more like signing a supply contract—completely different.
Gravity
Let’s shift our focus back to Stargate.
A year ago, on the White House stage, Sam Altman announced the $500 billion “Stargate” plan.
A year later, the joint venture in that plan has become a mess. OpenAI bypassed its own joint framework and signed a separate deal with Oracle. The compute target fell short—only 7.5 GW out of 10 GW. Spending projections were cut from $1.4 trillion to $600 billion.
This isn’t a story of failure. OpenAI hasn’t fallen; it continues to raise money, grow, and has over 900 million users.
But it is a story of waking up from a dream.
From “building the world’s largest data center empire” to “first stay alive, then fight using others’ money and infrastructure.” From aspiring landlord to becoming a tenant. From a dreamer to a bean counter.
When Elon Musk commented coldly on X: “Hardware is hard,” he pointed to a harsh reality all AI companies will face: the race for compute power has reached a stage where the real barrier isn’t training the strongest model anymore, but physically deploying gigawatt-scale infrastructure without destroying oneself.
Altman has chosen not to burn out. It may be the least glamorous decision he’s made, but also the most correct.
As for the $500 billion Stargate dream, it hasn’t died, but it is no longer what it was a year ago. It has shifted from a narrative about changing human destiny to a balance sheet that must be checked line by line.
When Dreams Wake Up: When the Dream Makers at OpenAI Start Settling the Accounts

By Ada