Is AI a bubble?

December 8, 2025

In this article, I examine several indicators that make the current AI boom resemble a classic bubble. At the same time, being fundamentally optimistic, I explore the counterarguments - why it may not be a bubble after all (and yes, I lean towards the optimistic view). From a short-term economic perspective, many signs do point to overvaluation and hype, making it look like a bubble. But viewed on a civilizational timescale, this moment looks less like a bubble and more like a genuine inflection point in human history.

Why is it a bubble?

The current AI wave strikingly resembles the dot-com bubble of the late 1990s, when investors poured capital into unproven internet ventures on the strength of rampant hype and FOMO. Once it became clear that profitability was elusive, those lofty valuations collapsed. Something similar (though not identical) is happening in this AI wave.

  • In 2024 alone, AI-related startups attracted over $100 billion in global venture funding - an 80% year-over-year increase from $55.6 billion in 2023, and nearly one-third of all VC money. This influx has fueled a wave of massive seed and early-stage rounds for companies that often lack viable products or clear market fit.
  • In 2025, U.S. AI startups secured 49 rounds exceeding $100 million each. Globally, seed funding for AI hit $1.8 billion in 2024, yet many recipients remain at the ideation stage, echoing dot-com-era over-optimism.
  • There is a surge in debt-fueled infrastructure bets. Frontier labs like OpenAI, xAI, Google, and Meta are racing to build gigawatt-scale data centers, with collective investments projected to top $300 billion in 2025 capital expenditures.
  • OpenAI's Stargate project, announced at the White House in January 2025, commits up to $500 billion through 2029 for 20 U.S. sites, starting with a 1.2 GW campus in Abilene, Texas. The energy consumption of this single data center is equivalent to powering roughly 1 million homes.
  • xAI's Colossus 2 aims for 1 GW by early 2026, spanning 1 million square feet and requiring 550,000 Nvidia GPUs.
  • Meta's 5 GW Hyperion facility and Google's $40 billion Texas buildout (three sites totaling 2 GW) further strain resources.
  • These projects, often financed through high-interest debt and equity infusions, amplify leverage risks, with total AI infrastructure debt issuance nearing $50 billion in 2025 amid rising interest rates.

On the surface, all these huge investments look like signs of a solid future. At first glance, everything seems to make sense - people are using AI chat tools heavily, and companies are racing to meet that demand. But here is the interesting part - the unit economics.

Unit economics

There is a stark mismatch between revenue generation and operational costs, rendering many AI operations fundamentally unprofitable.

  • OpenAI, for example, projected $3.7 billion in 2024 revenue against roughly $9 billion in total expenses ($2.8 billion for inference, $3 billion for training, and the rest in salaries and other costs), implying a net loss of about $5 billion.
  • Anthropic mirrored this, losing $5.3 billion on $1 billion annualized revenue, with compute costs alone consuming over 100% of inflows.
  • Pricing exacerbates the issue: OpenAI's ChatGPT Plus charges $20/month for "unlimited" access, yet inference costs per heavy user often exceed $50/month, while API rates barely cover the $0.01–$0.10 in underlying GPU expenses for models like GPT-4o.
  • Training a single frontier model like GPT-4 now costs $79–$100 million, scaling to $1–$10 billion for successors, per Anthropic estimates.
  • These imbalances are worsened by external bottlenecks: Nvidia generated 88% of its Q3 2025 revenue from AI data center chips, which has sparked shortages, and its pivot to low-power LPDDR memory could double server prices by late 2026.
  • Energy demands are serious: a 1 GW data center consumes roughly $1.3 billion in electricity annually - comparable to a mid-sized city's usage (see the back-of-envelope sketch after this list). Stargate alone requires 4.5 GW, prompting on-site gas turbines and delaying coal plant retirements.
  • Oracle’s $300 billion, five-year deal with OpenAI has sparked growing skepticism. Amid rising “AI bubble” fears, Oracle shares fell 4% after their initial surge, wiping roughly $34 billion off Larry Ellison’s net worth. In this overheated climate, any slowdown in adoption or in cost improvements could trigger a sharp correction - reinforcing the classic bubble characteristics of today’s AI boom.
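As a sanity check on the energy figure above, here is a minimal back-of-envelope sketch in Python. The ~$0.15/kWh industrial electricity rate is my own assumption (the article only gives the $1.3 billion total), and the calculation assumes the campus draws full power year-round.

```python
# Back-of-envelope electricity cost for a data center running at full load.
# The $0.15/kWh rate is an assumed industrial price, not a figure from the article.

HOURS_PER_YEAR = 8_760

def annual_electricity_cost(capacity_gw: float, usd_per_kwh: float = 0.15) -> float:
    """Annual electricity spend for `capacity_gw` of continuous draw."""
    kwh_per_year = capacity_gw * 1_000_000 * HOURS_PER_YEAR  # GW -> kW, then kWh/year
    return kwh_per_year * usd_per_kwh

print(f"1 GW campus:     ${annual_electricity_cost(1.0) / 1e9:.2f}B / year")  # ~$1.31B
print(f"Stargate 4.5 GW: ${annual_electricity_cost(4.5) / 1e9:.2f}B / year")  # ~$5.91B
```

At roughly $1.3 billion per gigawatt-year of power alone, the buildouts listed earlier translate into multi-billion-dollar recurring bills before a single dollar of revenue arrives.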

Why is this AI wave worth it, even if it’s a bubble?

History shows that the most transformative technologies often appear economically irrational, highly speculative, and balance-sheet-negative in their infancy. Yet they ultimately reshape civilization. We can compare this AI wave to the Manhattan Project, except that AI is going to have a far greater impact on civilization.

  • The U.S. alone spent ~$2 billion on the bomb itself (≈ $35 billion in 2025 dollars), then invested an additional $400+ billion (inflation-adjusted) through the 1950s–1980s on civilian and military nuclear programs. All of this was financed largely through deficit spending and public debt.
  • Global nuclear R&D and reactor construction eventually exceeded $2 trillion in today’s dollars.
  • For decades, light-water reactors operated at negative or break-even economics in many markets, required massive subsidies, triggered fierce public opposition, and carried catastrophic tail risks. But it was all worth it.
  • Nuclear power now supplies ~10% of global electricity (2,500 TWh/year), enabled naval propulsion and medical isotopes, and laid the scientific foundation for fusion research.

The AI wave is following the same pattern, only at planetary scale and far greater speed:

  • Cumulative global investment in AI (public + private) already exceeds $1.2 trillion since 2020, with another $1–2 trillion committed through 2030 for compute clusters, energy infrastructure, and frontier models.
  • Current inference and training costs are astronomical. OpenAI alone is on track to spend $15–20 billion in 2025 while generating < $10 billion in revenue. These are classic “learning curve” investments: compute cost per FLOP has fallen ~40–50% per year for the past decade. At that rate, today’s $10–100 million training runs could cost under $100,000 by the mid-2030s (see the compounding sketch after this list).
  • Early nuclear reactors cost $10,000–20,000/kW to build; modern ones are ~$6,000/kW and South Korean/AP1000 designs are approaching $2,500/kW. We are seeing the same steep experience curve in AI data centers and custom silicon (Google TPUv5, Amazon Trainium2, etc.).
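A minimal sketch of the compounding math behind that training-cost claim, assuming the cost-per-FLOP decline simply continues at a constant rate for another decade; note that the sub-$100,000 outcome only falls out at the optimistic ~50%/year end of the range.

```python
# Project a frontier training-run cost forward at a fixed annual decline
# in cost per FLOP. A constant decline rate is an assumption, not a given.

def future_cost(cost_today: float, annual_decline: float, years: int) -> float:
    """Cost of an equivalent training run after `years` of compounding decline."""
    return cost_today * (1 - annual_decline) ** years

for rate in (0.40, 0.50):
    cost_mid_2030s = future_cost(100_000_000, annual_decline=rate, years=10)
    print(f"$100M run at {rate:.0%}/yr decline -> ~${cost_mid_2030s:,.0f}")
# 40%/yr decline -> ~$605,000
# 50%/yr decline -> ~$98,000
```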

We can’t even comprehend the impact AI is going to have on civilization. It’s an inflection point. Everything is leading to a brighter, more abundant future. Here are some of the impacts that make me think all the ‘bubble risk’ will be worth it at the end of the day:

  • Drug discovery: The AlphaFold family of models has predicted structures for ~99% of human proteins. Follow-on models have already reduced typical small-molecule discovery timelines from 4–6 years to months and cut costs by 70–90%. McKinsey estimates $300–600 billion in annual value from AI-accelerated pharma alone by 2030.
  • Materials & energy: AI-driven discovery of new superconductors, catalysts, and battery chemistries is accelerating. Efforts like the Materials Project and AlphaFold-style discovery models have already identified thousands of viable new compounds; several are entering pilot production in 2025–2026.
  • Physics & space: Automated theorem-proving and simulation are closing in on century-old problems (e.g., portions of the Yang–Mills mass-gap problem). NASA and SpaceX now use AI for trajectory optimization, in-space manufacturing design, and autonomous mining architectures. This could cut Mars mission costs by an estimated 50–80% and enable asteroid-retrieval economics that were previously impossible.
  • Robotics & abundance: Boston Dynamics, Figure, Tesla Optimus, and 1X Technologies are scaling humanoid platforms on track for unit costs below $20,000–30,000 by 2030 at scale. With energy and compute becoming super-abundant (fusion pilot plants targeting the 2030s, gigawatt-scale solar + storage), physical labor constraints effectively disappear.
  • Economic models from Ark Invest and others project global GDP could grow 5–15× by 2050 under aggressive AI adoption, turning today’s trillion-dollar losses into rounding errors.

Today’s “bubble” is the price of compressing a century of scientific and industrial progress into a single decade. Just as nuclear power eventually delivered orders of magnitude more energy than was invested in it, AI is on the cusp of delivering orders of magnitude more intelligence, discovery, and productive capacity. The short-term financial losses are real, but the long-term payoff matters more. We are just accelerating the inevitable - science, medicine, energy, space colonization, and abundance for humanity. The future is bright.

An interesting threat - the “brain-like breakthrough” scenario

There is one thing that could actually burst the bubble irreversibly: a breakthrough in AI infrastructure that works more like the human brain, needing far less hardware and far less power. What if the ultimate AI substrate is not silicon at all, but biological tissue grown in a test tube? Many great technology breakthroughs had a “substitution shock” moment: a radically more efficient alternative that obsoleted the entire capital stack. Railroads → trucks + highways. Vacuum tubes → transistors. Film photography → digital sensors.

For AI, the equivalent existential risk to today’s $3-4 trillion invested ecosystem (Nvidia GPUs, H100/B200/Blackwell clusters, hyperscale CUDA data centers, high-power-density liquid-cooled racks) would be the discovery of a neuromorphic or biologically inspired architecture that achieves frontier-level performance at 10-100× lower energy and 100-1000× lower capex per FLOP.

Here are some numbers that make the threat credible:

  • Human brain: ~20 watts for ≈10¹⁶ analog synaptic operations/sec → roughly 5 × 10¹⁴ OPS/watt.
  • Today’s best GPU (Nvidia H200): ≈700 watts for 4 × 10¹⁵ digital FP8 ops/sec → ≈6 × 10¹² OPS/watt.
  • The brain is already ~100× more efficient on an ops/watt basis, and the gap widens to 3–5 orders of magnitude if you measure effective intelligence per watt (a quick calculation follows this list).
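A quick calculation of that gap, using only the numbers in the bullets above. Analog synaptic events and digital FP8 operations are not strictly comparable, so treat this as an order-of-magnitude comparison rather than a benchmark:

```python
# Ops-per-watt comparison built from the figures cited above.

brain_ops_per_sec = 1e16   # ~10^16 analog synaptic operations per second
brain_watts = 20

h200_ops_per_sec = 4e15    # ~4 x 10^15 digital FP8 ops per second
h200_watts = 700

brain_eff = brain_ops_per_sec / brain_watts   # ~5 x 10^14 OPS/watt
h200_eff = h200_ops_per_sec / h200_watts      # ~5.7 x 10^12 OPS/watt

print(f"Brain: {brain_eff:.1e} OPS/watt")
print(f"H200:  {h200_eff:.1e} OPS/watt")
print(f"Gap:   ~{brain_eff / h200_eff:.0f}x in the brain's favor")
```

That roughly 90× raw-efficiency gap, before even counting the capex difference, is what makes a neuromorphic or biological substrate such a credible substitution threat to the current GPU capital stack.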

If a research group (or a skunkworks team at Apple, Tesla, Meta, or a DARPA/ARPA-E program) demonstrates a neuromorphic chip that runs o1-class reasoning at ≤10 watts and can be manufactured on a $2 billion 3 nm node instead of a $300 billion GPU mega-cluster, the consequences would be catastrophic for the current paradigm. The effects:

  • Immediate stranding of >$1 trillion in GPU/ASIC inventory and in-flight data-center builds.
  • Nvidia market cap could drop 80–95% within quarters (similar to Cisco in 2001).
  • Hyperscalers left with hundreds of gigawatts of stranded power contracts and white-elephant facilities.
  • Every AI startup valued on “compute moat” becomes worthless overnight.

But there is a lock-in effect here: Once $2–3 trillion has already been sunk into the current CUDA-transformer-dense-GPU paradigm, the political and financial inertia against abandoning it becomes enormous. Entire national strategies (U.S. CHIPS Act, EU AI Act funding, China’s GPU independence push) are now tied to this specific hardware-software stack. A pivot would require admitting that the biggest investment wave in history was directed at a technological dead end.

There is so much invested in all this, and so much dependence on the current architecture, that most incumbents will fight any new breakthrough tooth and nail. They will do everything - lobbying, patent thickets, talent poaching, or simply refusing to fund the new architecture. And remember: the current AI boom already shows many classic bubble signals - insane valuations, terrible unit economics, and trillions committed to GPU megaclusters.

If someone does crack 100–1000× efficiency, and all the big players decide to go with it, then nothing - no amount of sunk cost, no government subsidy, no scaling law - will save the current trillion-dollar bet from collapsing like the dot-com fiber-optic glut in 2001–2002. A new breakthrough in compute is the one asymmetric risk that could turn today’s “worth it even if it’s a bubble” argument into “we just repeated the same mistake as the 19th-century railway mania, only 100× larger.”

Anyway, the future is bright.