Qualcomm stock shot up by 23% on Monday after the company said it's launching new AI accelerator chips to take on Nvidia and AMD in the most expensive chip war to date. The announcement, made on October 27, was the company's loudest statement yet that it's entering the data center arms race.

The two new chips (AI200, set for release in 2026, and AI250, coming in 2027) won't be in smartphones. They'll be powering entire liquid-cooled racks inside massive AI server farms. According to CNBC, these new chips are a major leap away from Qualcomm's usual comfort zone of mobile and wireless chips. The accelerators can fill a full rack like Nvidia's and AMD's current systems, which let 72 chips operate as one.
Durga Malladi, the company's general manager for data center and edge, told reporters last week: "We first wanted to prove ourselves in other domains, and once we built our strength over there, it was pretty easy for us to go up a notch into the data center level."

These racks are built for inference, not training. That means Qualcomm isn't trying to build chips that help train models like OpenAI's GPTs, which were trained on Nvidia GPUs. Instead, the focus is on running those models faster and cheaper once they're trained. That's where most real-world workloads actually run (a rough sketch of the difference appears below).

And there's money here… real money. McKinsey estimates the world will spend $6.7 trillion on data centers by 2030, and most of that will go to AI chips. Nvidia controls more than 90% of that market today and is sitting on a market cap of over $4.5 trillion. But customers are getting restless. OpenAI recently said it's buying chips from AMD and might even buy a piece of the company.
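To make that inference-versus-training split concrete, here is a minimal sketch on a toy PyTorch model. It is an illustration only, not Qualcomm's software stack (which the company hasn't detailed): training runs forward and backward passes and updates the model's weights, while inference is just a forward pass through weights that are already frozen.

```python
# Minimal sketch of training vs. inference on a toy PyTorch model.
# Illustration only: this is generic PyTorch, not Qualcomm's AI200/AI250 stack.
import torch
import torch.nn as nn

model = nn.Linear(8, 2)               # toy "model": 8 features in, 2 classes out
data = torch.randn(32, 8)             # a batch of 32 random samples
labels = torch.randint(0, 2, (32,))   # random class labels

# Training: forward pass, loss, backward pass, weight updates.
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
loss_fn = nn.CrossEntropyLoss()
for _ in range(5):                    # a few gradient steps
    optimizer.zero_grad()
    loss = loss_fn(model(data), labels)
    loss.backward()                   # compute gradients
    optimizer.step()                  # adjust weights

# Inference: forward pass only, gradients off, weights left as-is.
model.eval()
with torch.no_grad():
    predictions = model(data).argmax(dim=1)
print(predictions[:5])
```

Chips built for inference only need the second half of that loop, at massive scale, which is why Qualcomm can skip competing with Nvidia on training hardware.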
Google, Amazon, and Microsoft are all designing their own AI accelerators. Everyone wants an option that doesn't involve waiting in line behind a dozen other AI labs just to get a GPU shipment from Nvidia.

Power draw, flexibility, and memory make Qualcomm stand out. Malladi said the racks draw around 160 kilowatts, which matches the power usage of Nvidia's racks, but Qualcomm claims its systems are cheaper to run, especially for cloud service providers. The company will also sell parts separately, giving clients the freedom to build custom racks. "What we have tried to do is make sure that our customers are in a position to either take all of it or say, 'I'm going to mix and match,'" Malladi said. Even Nvidia and AMD could end up buying parts of Qualcomm's data center lineup, including its central processing units (CPUs), which Malladi said will be available as standalone products. Full pricing for chips, cards, and racks hasn't been announced, and Qualcomm didn't confirm how many NPUs can fit in a rack.

Earlier this year, Qualcomm signed a deal with Saudi Arabia's Humain, which plans to install Qualcomm inferencing chips across data centers using up to 200 megawatts of power. That deal made Humain one of the first major customers for the rack-scale systems. The company also said its AI cards handle 768 gigabytes of memory, which is more than what Nvidia or AMD currently offer. Qualcomm also claimed better efficiency in power and cost of ownership, though it didn't provide exact figures.
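The 160-kilowatt rack figure and the 200-megawatt Humain deployment invite a quick back-of-envelope check. The sketch below assumes a flat electricity price and round-the-clock utilization; those are illustrative assumptions, not Qualcomm disclosures, and cooling and facility overhead are ignored.

```python
# Back-of-envelope energy math for one 160 kW rack, using the figures above.
# The electricity price and 24/7 utilization are assumptions for illustration,
# not numbers Qualcomm disclosed; cooling and facility overhead are ignored.
RACK_POWER_KW = 160                    # reported per-rack draw
HOURS_PER_YEAR = 24 * 365
PRICE_PER_KWH = 0.08                   # assumed industrial rate, in dollars

annual_kwh = RACK_POWER_KW * HOURS_PER_YEAR     # 1,401,600 kWh
annual_cost = annual_kwh * PRICE_PER_KWH        # roughly $112,000

# Rough scale of the Humain deployment: 200 MW of capacity / 160 kW per rack.
implied_racks = 200 * 1000 / RACK_POWER_KW      # about 1,250 racks

print(f"Energy per rack per year: {annual_kwh:,.0f} kWh")
print(f"Electricity cost per rack per year: ${annual_cost:,.0f}")
print(f"Implied racks for 200 MW: {implied_racks:,.0f}")
```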