
Elon Musk Reveals Grok 2 and Grok 3 Timelines, Supercomputer Plan

By Giuseppe Ciccomascolo

Key Takeaways

  • Elon Musk’s xAI will launch new versions of Grok in the coming months.
  • xAI plans a significant investment in Nvidia chips to boost its AI offering.
  • Musk also plans to develop a “Supercomputer”.

Elon Musk’s xAI is set to release Grok 2 in August and Grok 3 by the end of the year, with the latter trained on 100,000 Nvidia H100 GPUs. However, despite version 1.5’s strong performance, Grok remains less popular than ChatGPT and Gemini because it lacks a free version and carries high subscription costs.

While attention may turn to Grok’s new releases for a while, traders and the artificial intelligence (AI) community will also be watching Musk’s recently revealed plan to build a “Supercomputer,” which will likewise rely on Nvidia’s chips.

When Grok 2 and Grok 3 Will Be Available

An X user shared an interview with Cohere CEO Aidan Gomez, in which Gomez remarked that many AI models are trained on OpenAI’s outputs, creating a “human centipede effect” of similar outputs. Elon Musk responded to the tweet, agreeing and stating, “Sadly quite true. It takes a lot of work to purge LLMs from the Internet training data. Grok 2, which comes out in August, will be a giant improvement in this regard.”

Musk also mentioned that “Grok 3, expected at the end of the year after training on 100k H100s, should be really something special.”

The latest version of Grok, version 1.5, was released in March with enhanced reasoning capabilities and a context length of 128,000 tokens. While Grok 1.5 didn’t match GPT-4 on the MMLU, MATH, and GSM8K benchmarks, it wasn’t far behind and even outperformed GPT-4 on the HumanEval benchmark.

So far, it’s clear that Grok is less popular than ChatGPT, Gemini, or Monica. This is largely because X doesn’t offer a free version of Grok. Given the high costs of running LLMs and Musk’s goal of boosting X’s revenue, a free version seems unlikely in the near future.

Accessing Grok requires an X account and a subscription to the X Premium+ plan, which costs $16 per month or $168 per year. X Premium+ is the most expensive subscription on X: it removes all ads from the For You and Following feeds, gives users a hub to earn money from posts and subscriptions, and boosts Premium+ users’ replies in X’s rankings.

New Grok Versions Need Tens of Thousands of Nvidia H100 GPUs

Elon Musk is generating excitement about Grok 3, the next version of xAI’s chatbot. In a post on X, he hinted that Grok 3, trained on 100,000 Nvidia H100 GPUs, would be “something special.”

Nvidia’s H100 GPUs, essential for AI development, are in high demand, each costing between $30,000 and $40,000. Buying 100,000 of them for Grok 3 could thus cost between $3 billion and $4 billion, though xAI might instead rent the GPUs from cloud providers such as Oracle, with which it was reportedly negotiating a $10 billion deal for cloud servers.
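As a rough illustration of where that range comes from, here is a minimal back-of-envelope sketch, assuming a simple purchase-cost model (GPU count times reported unit price) and ignoring rental pricing, networking, power, and other cluster costs:

```python
# Back-of-envelope estimate of the hardware cost of training Grok 3,
# using only the figures reported above: 100,000 H100s at $30,000-$40,000 each.
# This is an illustrative sketch, not xAI's actual budget.

GPU_COUNT = 100_000
PRICE_LOW, PRICE_HIGH = 30_000, 40_000  # USD per H100, reported range

low_estimate = GPU_COUNT * PRICE_LOW    # $3.0 billion
high_estimate = GPU_COUNT * PRICE_HIGH  # $4.0 billion

print(f"Estimated GPU cost: ${low_estimate / 1e9:.1f}B to ${high_estimate / 1e9:.1f}B")
```

Run as written, this prints an estimate of $3.0B to $4.0B, matching the figure cited above.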

Previously, Musk diverted a $500 million shipment of H100s from Tesla to X, highlighting his significant investment in AI. Grok 2 required around 20,000 H100s for training, a steep increase from earlier versions.

So far, xAI has released Grok-1 and Grok-1.5. While 100,000 GPUs is a significant number, it is modest next to Meta’s plans: Mark Zuckerberg intends to acquire about 350,000 Nvidia H100 GPUs by the end of 2024 and reach around 600,000 GPUs in total, an estimated $18 billion investment in AI.

Musk’s Supercomputer Plan

In May 2024, Elon Musk’s xAI announced plans to build a supercomputer facility in Memphis, marking the largest investment in the city’s history, according to Ted Townsend, president of the Greater Memphis Chamber. The project, which has been in early planning since March, aims to expand xAI’s capacity for AI development.

While details such as total cost and job creation aren’t public yet, discussions about potential tax breaks or incentives for xAI are ongoing. Townsend expressed gratitude for Musk’s interest in Memphis, although Musk has not commented on the plans.

xAI, which is racing to stay competitive with OpenAI, Google, and Meta, recently secured $6 billion in venture capital, valuing the company at over $24 billion. The new “gigafactory of compute” may be completed by fall 2025.

According to The Information, when completed, the interconnected array of Nvidia’s flagship H100 graphics processing units will be at least four times larger than the largest existing GPU clusters.
