
Ant Group Turns to Alibaba and Huawei Chips as Viable Nvidia Alternatives for AI Training

By Kurt Robson
Edited by Samantha Dunn

Key Takeaways

  • Ant Group has found domestic alternatives to Nvidia’s leading semiconductors for AI development.
  • The Chinese digital payments platform has joined other China-based firms looking to replicate the success of DeepSeek.
  • While many are looking to cut the costs of AI development, some executives have predicted much more money will be needed.

Ant Group, the Jack Ma-backed Chinese financial giant, has used domestic semiconductors from Alibaba and Huawei to cut AI development costs by 20%.

The breakthrough marks a significant step forward for the Chinese firm, which is looking to continue the momentum of China’s domestic AI wins following DeepSeek’s success.

Ant Group’s Domestic AI Breakthrough

Ant Group is using the newly acquired chips to develop techniques for training AI models, such as the Mixture of Experts (MoE) approach, Fortune reported, citing people familiar with the matter.

MoE divides AI work among smaller “expert” models instead of one big AI model trying to do everything; each smaller model is designed to handle a specific type of input or task.

The domestic chips performed on par with Nvidia’s H800 chips, the semiconductors the company has typically relied on.

Companies like Nvidia have created modified versions of their advanced AI chips specifically for the Chinese market. Nvidia’s A800 and H800 chips are designed with reduced capabilities to comply with U.S. export controls.

Although Ant Group continues to use Nvidia chips for AI development, it is increasingly shifting its focus toward Chinese alternatives, a source told Fortune.

AI Alternatives

The MoE technique has risen in popularity, gaining recognition in China through its use in DeepSeek.

The method works by assigning specific tasks to smaller, specialized models rather than relying on a single large model. Only the most relevant “experts” are activated for each task, making the system more efficient and reducing computing demands.
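The routing idea described above can be illustrated with a minimal NumPy sketch. This is not Ant Group's or DeepSeek's implementation; the expert layers, gating weights, and dimensions here are all hypothetical, chosen only to show how top-k gating activates a subset of experts per input.

```python
import numpy as np

rng = np.random.default_rng(0)

def moe_forward(x, experts, gate_w, top_k=2):
    """Score all experts with a gating layer, keep only the top_k,
    and combine their outputs weighted by softmax of their scores."""
    scores = x @ gate_w                       # one gate score per expert
    top = np.argsort(scores)[-top_k:]         # indices of the best-scoring experts
    weights = np.exp(scores[top])
    weights /= weights.sum()                  # softmax over the selected experts only
    # Only the selected experts run; the rest are skipped, saving compute.
    return sum(w * experts[i](x) for w, i in zip(weights, top))

dim, n_experts = 8, 4
# Hypothetical "experts": small independent linear layers.
expert_mats = [rng.standard_normal((dim, dim)) for _ in range(n_experts)]
experts = [lambda v, M=M: v @ M for M in expert_mats]
gate_w = rng.standard_normal((dim, n_experts))

x = rng.standard_normal(dim)
y = moe_forward(x, experts, gate_w, top_k=2)
print(y.shape)  # output has the same dimensionality as the input
```

With `top_k=2` of 4 experts, only half of the expert parameters are touched for this input, which is the source of the efficiency gain the article describes.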

However, powering MoE still requires high-performance chips, a technology that companies like Nvidia typically provide.

This has constrained AI progress in China, as U.S. restrictions limit exports of America’s most powerful computing hardware.

Is a Cheaper AI Future Possible?

Following in the footsteps of DeepSeek, Ant Group has made it its mission to scale high-performing AI models “without premium GPUs,” Fortune reported.

Despite DeepSeek rattling global investors with its AI model, which it reportedly developed for a fraction of the price, many executives have remained steadfast in stating that the industry will require substantially more funding.

In February, Robin Li, CEO of Chinese technology giant Baidu, said more money was needed to create competitive AI models.

“The investment in cloud infrastructure is still very much required. In order to come up with models that are smarter than everyone else, you have to use more compute,” Li stated.

Similarly, Nvidia CEO Jensen Huang said on Friday that the world would need 100 times more computing power than it has now to see the next surge of AI development.

Kurt Robson is a London-based reporter at CCN with a diverse background across several prominent news outlets. Having transitioned into the world of technology journalism several years ago, Kurt has developed a keen fascination with all things AI. Kurt’s reporting blends a passion for innovation with a commitment to delivering insightful, accurate and engaging stories on the cutting edge of technology.