
Bitcoin Mining Negativity Spreading to AI as Nvidia Chips Consume Huge Energy

Published
By Omar Elorfaly

Key Takeaways

  • Statistics show Nvidia GPUs consume enormous amounts of power to run AI workloads.
  • Demand for Nvidia chips jumped after the emergence of ChatGPT.
  • Nvidia is currently under investigation by the European Union for alleged anti-competitive practices.

The hype created by digital asset trading has driven demand for the powerful graphics cards made by the likes of Nvidia, though that demand rises and falls with major events such as the FTX collapse and the decline of the NFT market.

Now, there is rising demand for computing power from a new application: artificial intelligence, or AI. And it all started with ChatGPT.

The Rise Of AI

Artificial intelligence has been a fixture of sci-fi stories, from the Terminator’s Skynet to Star Wars’ R2-D2. The technology was usually depicted in one of two ways: as a quirky robotic sidekick, or as an evil, unstoppable force that humanity created and came to regret.

Lately, the technology has been synonymous with the emergence of any smart tech product. If Google announces nearly anything, including the launch of their new smartphones, company executives on stage are likely to repeat the term “AI” more than once before exiting stage left.

But, above all, one application of artificial intelligence has taken headlines and office chats by storm: ChatGPT, OpenAI’s text-based chatbot built on Generative Pre-trained Transformers (GPT).

The chat module became incredibly popular, attracting 100 million users in just two months, surpassing platforms like Instagram, which took over two years to achieve a similar user base.

More importantly, ChatGPT ushered in a new business sector: artificial intelligence products. Most software companies now try to build products that automate some part of a process, if only to earn the “AI” stamp.

But AI Models Come With a Major Flaw

To train AI models on extensive data, developers rely on transformers, a neural-network architecture that performs enormous numbers of calculations in parallel to rapidly generate outputs for a single query.

Developers need substantial computing power, typically provided by Graphics Processing Units (GPUs), to drive transformers and meet ambitious timelines.
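To give a sense of scale, here is a back-of-envelope count of the multiply-accumulate operations in a single attention pass of a transformer. All the sizes below are assumed, illustrative values, not the dimensions of any real model:

```python
# Rough count of multiply-add operations in one transformer attention pass,
# to illustrate why these workloads demand GPU-scale parallelism.
# All sizes are assumed, illustrative values.

seq_len = 2048   # tokens processed at once
d_head = 128     # dimensionality of each attention head
n_heads = 32     # attention heads per layer
n_layers = 48    # transformer layers

# Each head compares every token with every other token (seq_len^2 pairs),
# and each comparison is a d_head-long dot product.
ops_per_head = seq_len * seq_len * d_head
ops_total = ops_per_head * n_heads * n_layers

print(f"{ops_total:,} multiply-adds per forward pass (attention only)")
```

Even with these modest assumed sizes, that is hundreds of billions of operations for a single pass, which is exactly the kind of massively parallel arithmetic GPUs are built for.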

While plenty of OEMs (Original Equipment Manufacturers) such as Intel, AMD, and ASUS offer a variety of GPU products, Nvidia has a near monopoly on the market, a position now under investigation by the European Union. The company’s latest RTX lineup offers staggering performance for AI developers, gamers, and crypto miners alike.

AI Consumes Too Much Power

However, such GPUs are, understandably, rather power-hungry. Running transformers to train these AI models pushes GPUs to draw electrical power at extraordinary rates.

As a result, critics are now pointing fingers at AI for causing a significant rise in power consumption amid discussions regarding climate change.

Discussions on AI’s energy consumption draw parallels to past Bitcoin mining debates. Supporters claim Bitcoin mining controversies have subsided. But critics warn that AI data centers will escalate energy consumption.

Critics even claim that a single Nvidia GPU, running around the clock, can consume more power than the average American home.
