Nvidia Invests in Thinking Machines Lab

What happened: Mira Murati’s Thinking Machines Lab signed a multi-year partnership with Nvidia that includes a strategic Nvidia investment and a commitment to deploy at least a gigawatt of Nvidia Vera Rubin systems starting in 2027.

Why it matters: “A gigawatt” is a signal about scale: access to large, predictable compute is becoming a competitive moat for frontier labs, and Nvidia is increasingly shaping the ecosystem not just as a supplier, but as a capital partner making long-horizon capacity bets.

Wider context: The deal lands amid a broader compute crunch where model developers compete for GPUs, power, and datacenter buildouts — and where the biggest players are locking up multi-year infrastructure commitments to reduce training uncertainty.

Background: TechCrunch reports Thinking Machines has raised more than $2B since its 2025 founding, is valued at over $12B, and shipped an early API product called Tinker last year, while also losing several high-profile co-founders back to other labs.

Singularity Soup Take: Nvidia’s smartest move isn’t just selling the picks and shovels — it’s deciding who gets early, abundant shovels. But the industry’s bottleneck is starting to look less like chips and more like electricity, permitting, and disciplined spending.

Key Takeaways:

  • Compute at industrial scale: The partnership includes Thinking Machines deploying at least one gigawatt of Nvidia Vera Rubin systems beginning in 2027, framing compute access as a long-term infrastructure plan rather than an on-demand cloud purchase.
  • Nvidia as investor, not just vendor: TechCrunch says Nvidia is making a strategic investment in the lab, a pattern that can align roadmaps and supply, but also tightens the web of dependencies between frontier labs and their primary hardware provider.
  • Frontier-lab turbulence: The report notes multiple co-founder exits over time — a reminder that talent mobility is high and organizational stability is itself a differentiator when companies are attempting multi-year bets on expensive training and productization.

Relevant Resources

The Future of AI: What's Coming Next? — A practical overview of what drives AI progress (models, data, compute, and deployment), and why infrastructure constraints increasingly shape what “breakthroughs” are actually possible.