
Edge AI Chips Offer an Alternative to Nvidia’s $70K GPU

By James Morales
Key Takeaways
  • Running AI applications locally requires specific hardware.
  • So-called “edge” AI chips excel at on-device inference.
  • These AI chips are much smaller and cheaper than the heavyweight Nvidia GPUs that are central to cloud AI.

Artificial intelligence chips are often associated with power-hungry, high-capacity GPUs installed in giant supercomputers and commercial data centers.

However, for applications that require local data processing but can’t justify the $70,000 price tag for Nvidia’s latest Blackwell chips, a range of smaller, edge AI accelerators offer an alternative model for AI deployment. 

Edge vs. Cloud AI

The concept of edge computing began to gain traction around 2014, when the term came into use to distinguish on-device processes from cloud-based ones.

Edge AI, therefore, refers to any machine learning application that runs locally on a specific consumer or industrial device.

In the consumer domain, edge AI reflects shifting priorities in electronics design, with personal computer and smartphone manufacturers embracing new hardware capable of performing more operations per second.

Meanwhile, edge AI chips for industrial use cases cater to the specific needs of sectors such as manufacturing and automotive. 

Who’s Who in Edge AI

As the AI boom has unfolded, major global chip makers have played to their existing strengths. 

For example, Intel has focused on midweight AI accelerators that still need a fixed power supply, while smaller chips from Apple, MediaTek, and Qualcomm can run efficiently on battery power.

Besides the established giants of the semiconductor industry, more niche players also have an important role. 

Companies like Hailo and Untether AI have made inroads by focusing on specific applications such as running computer vision models, a key technology used in smart cameras, industrial robots, and self-driving vehicles.

Similarly, Groq’s Language Processing Unit (LPU) is tailor-made to run language models, positioning the startup as a potential disruptor in the smartphone market as manufacturers look for ways to power mobile AI assistants at the edge. 

Optimizing On-Device AI

While chip makers are increasingly focused on building hardware for AI workloads, developers are equally busy creating smaller AI models that can be deployed on edge devices.

Thanks to advances in small model capabilities, applications that once required heavyweight hardware accessed via the cloud can now run locally. 
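One common technique behind this shrinking is quantization: storing a model's weights as low-precision integers instead of 32-bit floats, cutting memory and compute demands so the model fits on edge hardware. The sketch below is a minimal, hypothetical illustration of symmetric int8 quantization in plain Python, not any vendor's actual toolchain.

```python
# Hypothetical sketch of symmetric int8 quantization, a technique widely
# used to shrink AI models for edge deployment. Illustrative only.

def quantize_int8(weights):
    """Map float weights to int8-range integers plus a shared scale factor."""
    # Scale so the largest-magnitude weight maps to +/-127 (int8 range).
    scale = max(abs(w) for w in weights) / 127 or 1.0
    return [round(w / scale) for w in weights], scale

def dequantize(q, scale):
    """Recover approximate float weights from the quantized integers."""
    return [v * scale for v in q]

# Toy example: four weights shrink from 32-bit floats to 8-bit integers.
weights = [0.82, -1.27, 0.05, 0.33]
q, scale = quantize_int8(weights)
approx = dequantize(q, scale)
print(q)       # integers in the int8 range [-128, 127]
print(approx)  # values close to the original floats
```

Each weight now needs one byte instead of four, at the cost of a small rounding error; real toolchains apply the same idea per layer or per channel.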

When Meta unveiled its latest small language models in September, the company highlighted that they had been “enabled on day one for Qualcomm and MediaTek hardware” and were optimized for the Arm processors used in almost every mobile device today.


James Morales

Although his background is in crypto and FinTech news, these days, James likes to roam across CCN’s editorial breadth, focusing mostly on digital technology. Having always been fascinated by the latest innovations, he uses his platform as a journalist to explore how new technologies work, why they matter and how they might shape our future.