
Meta and Samsung Develop GPU Chips as AI Frenzy Fuels Demand

Last Updated April 11, 2024 3:59 PM
Giuseppe Ciccomascolo

Key Takeaways

  • Meta has unveiled a new generation of artificial intelligence (AI) chips, the MTIA.
  • This announcement raises questions about Meta’s recent shift to Samsung for AI chip development.
  • Is the Meta-Samsung partnership safe?

Meta Platforms revealed details on Wednesday, April 10, 2024, about the next generation of the company’s proprietary artificial intelligence (AI) accelerator chip. The Facebook and Instagram owner said the upcoming iteration of its custom AI chips will deliver greater power and significantly speed up the training of its ranking models.

But Meta’s announcement regarding its AI chip has sparked concerns within both the market and the tech community, especially after its recent transition to Samsung for new AI chips.

Meta’s New AI Chips

Earlier this year, Reuters reported Meta’s intention to introduce a new iteration of its custom data center chip to accommodate the increasing computational demands of AI applications across Facebook, Instagram, and WhatsApp. Internally dubbed “Artemis,” this chip aims to diminish Meta’s dependence on Nvidia’s AI chips and lower overall energy costs.

The Meta Training and Inference Accelerator (MTIA) is specifically tailored to optimize Meta’s ranking and recommendation models. These chips make training more efficient and simplify inference, the stage at which the trained models are actually run to serve results.

In a recent blog post, Meta emphasized that MTIA constitutes a crucial component of its long-term strategy to establish AI-centric infrastructure for its services. The company aims to develop chips that seamlessly integrate with its existing technological framework while also accommodating future advancements in GPUs.

MTIA v1 was initially unveiled in May 2023 with a focus on deployment in data centers. The forthcoming generation of MTIA chips is expected to continue targeting data center environments. Despite initial projections placing the release of MTIA v1 in 2025, Meta has announced that both MTIA chips are currently in production.

What About Its Partnership With Samsung?

Just a month ago, Meta made headlines by announcing its departure from TSMC in favor of Samsung Foundry for AI chip development. This strategic move marked a significant shift in both the semiconductor manufacturing and AI technology landscapes. Meta is heavily investing in custom chip development: its dedicated R&D teams lead the charge, and the decision to partner with Samsung Foundry underscores its commitment to this endeavor.

According to Koreatimes, Meta’s decision to switch chip manufacturing partners stems from concerns regarding the “uncertainty and volatility” surrounding TSMC, especially amid geopolitical tensions and supply chain disruptions.

Mark Zuckerberg praised Samsung Foundry as the “world’s largest chip maker,” highlighting its crucial role in Meta’s future endeavors in this domain. Meta’s transition to a new foundry partner is a notable win for the Korean conglomerate, which has been proactive in attracting new clients and fostering trust in the market.

However, this latest development in AI chip manufacturing does not imply the end of Meta’s partnership with Samsung, and there’s no indication of any breakup in their collaboration.

AI Frenzy Fuels Demand

While Meta shows no signs of abandoning its partnership with Samsung, the MTIA chip currently focuses on training ranking and recommendation algorithms. However, Meta intends to broaden the chip’s capabilities to cover training generative AI models, such as its Llama language models.

According to Meta, the new MTIA chip aims to balance computing power, memory bandwidth, and memory capacity. It features 256MB of on-chip memory and runs at 1.3GHz, up from the v1 chip’s 128MB and 800MHz. Initial tests conducted by Meta showed the new chip performing three times better than the first-generation version across the four models evaluated.
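
To put those figures side by side, here is a minimal Python sketch that computes the generation-over-generation ratios from the specifications quoted above; the threefold figure is Meta’s own reported benchmark result across four models, not something derived from these specs.

```python
# Generation-over-generation comparison of the MTIA specifications quoted above.
mtia_v1 = {"on_chip_memory_mb": 128, "clock_mhz": 800}
mtia_v2 = {"on_chip_memory_mb": 256, "clock_mhz": 1300}  # 1.3 GHz

memory_gain = mtia_v2["on_chip_memory_mb"] / mtia_v1["on_chip_memory_mb"]
clock_gain = mtia_v2["clock_mhz"] / mtia_v1["clock_mhz"]

print(f"On-chip memory: {memory_gain:.1f}x larger")  # 2.0x
print(f"Clock speed:    {clock_gain:.2f}x faster")   # ~1.6x
# The ~3x model performance figure comes from Meta's own tests on four models,
# not from these raw specifications.
```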

In response to the increasing demand for computing power driven by AI applications, several other tech giants have ventured into developing their own chips. Google introduced its new TPU chips in 2017, while Microsoft announced its Maia 100 chips. Amazon has also unveiled its Trainium 2 chip, capable of training foundation models four times faster than its predecessor.

Expanding the scope to the global AI chips market, its valuation stood at $13.1 billion in 2024, and it is projected to reach $92.5 billion by 2031, with a robust CAGR of 38% during the forecast period. All of the mentioned companies will play a pivotal role in this growth.
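
For readers curious how such projections are framed, the short Python sketch below applies the standard compound annual growth rate (CAGR) formula to the cited figures. The report’s exact base year and forecast window are assumptions here, so the implied rate is only indicative and may differ from the quoted 38%.

```python
# Illustrative CAGR calculation based on the market figures cited above.
# Assumes a 2024 base year and a 2031 end year; the report's precise
# forecast window may differ, so this implied rate is indicative only.
start_value = 13.1   # global AI chips market in 2024, $ billion
end_value = 92.5     # projected market size in 2031, $ billion
years = 2031 - 2024

implied_cagr = (end_value / start_value) ** (1 / years) - 1
print(f"Implied CAGR over {years} years: {implied_cagr:.1%}")  # roughly 32% under these assumptions
```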
