
On-device Generative AI expected to drive heterogeneous AI chipset shipments to over 1.8 billion by 2030

Generative Artificial Intelligence (AI) workloads have moved beyond the bounds of cloud environments and can now run on-device, supported by heterogeneous AI chipsets. Combined with an abstraction layer that efficiently distributes AI workloads across processing architectures, and with compressed Large Language Models (LLMs) of under 15 billion parameters, these chipsets enable enterprises and consumers to run generative AI inferencing locally. Consequently, ABI Research, a global technology intelligence firm, estimates worldwide shipments of heterogeneous AI chipsets will exceed 1.8 billion by 2030, as laptops, smartphones, and other form factors increasingly ship with on-device AI capabilities.

“Cloud deployment will act as a bottleneck for generative AI to scale due to concerns about data privacy, latency, and networking costs. Solving these challenges requires moving AI inferencing closer to the end-user – this is where on-device AI has a clear value proposition, as it eliminates these risks and can more effectively scale productivity-enhancing AI applications,” says Paul Schell, Industry Analyst at ABI Research. “What’s new is the generative AI workloads running on heterogeneous chipsets, which distribute workloads at the hardware level between CPU, GPU, and NPU. Qualcomm, MediaTek, and Google were the first movers in this space, as all three are producing chipsets running LLMs on-device. Intel and AMD lead in the PC space.”
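The hardware-level distribution Schell describes can be pictured as a thin dispatch layer that routes each operation to the best-suited processing unit. The sketch below is purely illustrative: the class names, backend labels, and routing heuristics are hypothetical and do not reflect any vendor's actual scheduler or API.

```python
# Illustrative sketch of a heterogeneous-compute abstraction layer:
# route each inference workload to the CPU, GPU, or NPU.
# All names and heuristics here are hypothetical, not a vendor API.
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    op_type: str      # e.g. "matmul", "conv", "control"
    quantized: bool   # low-precision ops are a good fit for an NPU

def dispatch(w: Workload) -> str:
    """Pick a processing unit for a workload (toy heuristic)."""
    if w.quantized and w.op_type in ("matmul", "conv"):
        return "NPU"  # low-precision tensor ops: power-efficient NPU
    if w.op_type in ("matmul", "conv"):
        return "GPU"  # full-precision tensor ops: highly parallel GPU
    return "CPU"      # control flow and pre/post-processing: CPU

plan = [dispatch(w) for w in [
    Workload("token_embed", "matmul", quantized=True),
    Workload("attention", "matmul", quantized=False),
    Workload("sampling", "control", quantized=False),
]]
print(plan)  # ['NPU', 'GPU', 'CPU']
```

In a real chipset this routing happens below the application, in the vendor's runtime, so the same compressed LLM can exploit whichever compute units a given device offers.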

Hardware alone will not be enough. Building a solid on-device AI value proposition requires strong partnerships between hardware and software players to create unified offerings. These collaborations will nurture the development of productivity-focused applications deployed on-device. ABI Research expects this to spur demand and shorten replacement cycles for end devices like smartphones and PCs, accelerating shipment numbers between 2025 and 2028 as the software ecosystem matures and breathing new life into markets that have been stagnating. Automotive and edge server markets will also be affected, but to a lesser extent.

Productivity-enhancing AI applications running on-device, powered by heterogeneous AI chipsets, will drive significant market growth in personal and work devices. This is reflected in the increasing penetration of heterogeneous AI chipsets, which will eventually encompass most systems toward the end of the decade. “Chip vendors and OEMs should look to expand the productivity AI application ecosystem to tempt more customers and mature the offering. This will create opportunities analogous to the growth previously spurred by the expansion of Android and web-based applications in their respective markets, and it will require reaching a critical mass of applications that appeal to a broad range of customers in consumer and enterprise markets. Success in creating popular and useful applications could make or break the transition to on-device AI.”

These findings are from ABI Research’s Opportunities for Heterogenous Computing: General & Generative AI at the Edge application analysis report. This report is part of the company’s AI & Machine Learning research service, which includes research, data, and ABI Insights. Based on extensive primary interviews, Application Analysis reports present an in-depth analysis of key market trends and factors for a specific technology.

