
Eric Schmidt Reveals Big Tech’s AI Spending Surge: ‘Companies Need $20B, $50B, $100B — Very, Very Hard’

Published August 16, 2024 2:44 PM
James Morales

Key Takeaways

  • Ex-Google CEO Eric Schmidt has commented on the ongoing Big Tech AI spending spree.
  • He said large companies are looking to invest $300 billion in AI infrastructure.
  • A large portion of that money will go directly to Nvidia, he anticipates.

As Big Tech firms look to secure a slice of what is expected to become a trillion-dollar industry by the end of the decade, they are spending enormous sums to kit out data centers with the necessary hardware.

During a recent speech at Stanford University, former Google CEO Eric Schmidt said that business leaders he had spoken to were eyeing tens if not hundreds of billions of dollars of investment in the coming years to ensure they have the necessary infrastructure in place to ride the AI boom.

Eric Schmidt Anticipates $300 Billion AI Infrastructure Spending

During his speech, Schmidt said building the new generation of AI data centers could set Big Tech firms back as much as $300 billion.

“I’m talking to the big companies, and the big companies are telling me they need $20 billion, $50 billion, $100 billion — very, very hard,” he expanded.

And who does he think will be the largest beneficiary of the AI investment bonanza? Nvidia, of course.

Nvidia’s AI Chip Dominance 

While Nvidia isn’t the only chipmaker in the GPU game, its processors are significantly more powerful than rivals’, making them invaluable for AI training.

Google, OpenAI, Microsoft and Meta are all currently reliant on Nvidia chips to fuel their AI development. And although concerted efforts are being made to break the firm’s stranglehold on the AI training market, there haven’t been any significant breakthroughs.

One factor that will make it difficult for competitors to close the gap is the widespread popularity of Nvidia’s CUDA programming platform, Schmidt observed.

With Big Tech firms racing ahead to secure the latest Nvidia chips and build ever-more powerful GPU superclusters, it’s hard to argue with Schmidt’s logic.

Looking further ahead, however, there are those in the industry who believe that TPUs (Tensor Processing Units) rather than GPUs are the future.

Post-GPU AI Development

Since 2015, Google has used TPUs internally for AI workloads. And they have been available to third-party developers via the cloud since 2018.

But until now, CUDA’s popularity and Google’s reluctance to meaningfully open up the TPU ecosystem have ensured that GPU-based training remains dominant.

That being said, if Google can A) bring down the cost of TPU production, B) start selling the chips to other data center operators, or C) encourage more TPU-based AI development by open-sourcing the technology, it could provoke a shift away from the GPU-centric model and potentially erode Nvidia’s market share.

Besides Google, other players are also eyeing the potential of TPUs.

OpenAI CEO Sam Altman has reportedly solicited investment for a new TPU venture that could rival Google and Nvidia. However, the ambitious project could need between $5 trillion and $7 trillion to achieve its goals.

The eye-watering sums and massive technological shift required to destabilize Nvidia’s current dominance suggest the company has a secure role in the AI space for years to come. What happens after that is anyone’s guess.
