On Monday, October 27, 2025, Qualcomm announced two new AI accelerator chips, the AI200 and AI250, aimed at data‑center inference workloads. The company said the AI200 will go on sale in 2026 and the AI250 in 2027, with a third chip and accompanying server slated for 2028.
Both chips will be available in rack‑scale systems capable of filling a full liquid‑cooled server rack. Qualcomm highlighted that its AI cards support 768 GB of memory, more than the capacities currently offered by Nvidia and AMD. The chips target inference (running already‑trained models) rather than training, the workload Qualcomm expects to dominate large‑scale AI deployments.
The announcement positions Qualcomm directly against Nvidia and AMD, the dominant players in the AI data‑center GPU market. Leveraging its mobile‑AI expertise, Qualcomm claims its chips offer lower power consumption and cost of ownership, potentially giving it a competitive edge in the high‑growth AI infrastructure space.
This move expands Qualcomm’s portfolio beyond mobile and aligns with its broader transformation strategy toward connected computing. By entering the AI data‑center market, Qualcomm opens new revenue streams and strengthens its position in a rapidly expanding industry, marking a significant milestone in its strategic evolution.