Google has spent over a decade quietly building the tech that’s now helping it punch way above its weight in the AI race. While Nvidia sits at the top of the food chain, selling its GPUs to just about every major tech company on earth, Google has been busy designing its own weapons: chips it never intended to sell, but instead uses to power everything behind its AI business. Now, with demand for compute at an all-time high, that in-house silicon is finally showing its value. On Thursday, Google confirmed that Ironwood, the latest generation of its Tensor Processing Unit (TPU), will be available in the coming weeks. It’s the company’s seventh-gen chip, built to handle everything from training huge models to running real-time AI agents and chatbots.
Ironwood, Google says, is over four times faster than the last generation. One major customer, Anthropic, is planning to deploy up to 1 million of them to power its Claude models.

Google builds, others catch up

Even though Google continues to stockpile Nvidia GPUs, it’s not just sitting around relying on other people’s chips. TPUs have been in development for over ten years and first became available to cloud customers back in 2018. Once used only for internal workloads, they’re now a central part of Google’s public Cloud AI offering. Stacy Rasgon from Bernstein said, “Of the ASIC players, Google’s the only one that’s really deployed this stuff in huge volumes.” At the moment, Google doesn’t sell TPUs as physical hardware.
Instead, customers rent access through Google Cloud, which has become one of the company’s biggest revenue drivers. Last quarter, Alphabet reported $15.15 billion in cloud revenue, up 34% from the previous year. Sundar Pichai, the company’s CEO, told investors: “We are seeing substantial demand for our AI infrastructure products, including TPU-based and GPU-based solutions.”

Deals, satellites, and space-bound chips

With the pressure mounting across tech to get access to compute, Google is locking in monster deals. Last month, the company expanded its partnership with Anthropic in a deal reportedly worth tens of billions of dollars. The agreement will give Anthropic access to more than a gigawatt of AI compute capacity by 2026. Google has invested $3 billion in Anthropic so far. Even though Amazon remains the company’s main cloud partner, Google is now providing the main infrastructure for future Claude models.
Anthropic’s Chief Product Officer, Mike Krieger, said: “There is such demand for our models that I think the only way we would have been able to serve as much as we’ve been able to this year is this multi-chip strategy.” That strategy includes TPUs, Trainium, and Nvidia GPUs, and it’s built for performance, cost, and availability. Krieger added that his team had done early prep work to make sure Claude could run smoothly across all major chip types. “I’ve seen that investment pay off now that we’re able to come online with these massive data centers and meet customers where they are,” he said.

Months before the Anthropic deal, Google signed a six-year, $10 billion+ cloud contract with Meta. The company also got a piece of OpenAI’s business as it diversifies away from Microsoft. OpenAI confirmed to Reuters that it’s using Google Cloud but isn’t deploying Google’s TPUs at scale.

Earlier this week, Google revealed a new project called Suncatcher, aimed at launching solar-powered satellites equipped with TPUs. The goal is to build a system that harnesses solar energy in space to power compute-intensive AI workloads. The company said it plans to launch two prototypes by early 2027, calling the experiment a way to minimize pressure on Earth’s resources while preparing for large-scale computation in space.

Money flows, demand rises, Nvidia watches

Anat Ashkenazi, Alphabet’s CFO, said the Google owner’s momentum is coming from massive enterprise demand for its full AI stack, including both TPUs and GPUs. The company now reports that it signed more billion-dollar cloud contracts in the first nine months of 2025 than it did in the previous two years combined.
Meanwhile, Amazon’s cloud unit grew 20% last quarter. AWS CEO Matt Garman said, “Every Trainium 2 chip we land in our data centers today is getting sold and used.” He added that Trainium 3 would bring even more gains in performance and power efficiency.

Still, Google is going all in. Alphabet raised its capital expenditure forecast for 2025 to $93 billion, up from $85 billion, with even more spending lined up for 2026. Its stock has jumped 38% in Q3 and another 17% in Q4, its strongest stretch in two years.

Analysts at Mizuho pointed to the cost and performance advantage of TPUs, noting that although they were originally for internal use, Google is now getting real traction with outside customers. Morgan Stanley also said in a June report that TPU familiarity among developers could be a major boost to Google Cloud adoption. And in a September report, other analysts wrote, “We continue to believe that Google’s TPUs remain the best alternative to Nvidia, with the gap between the two closing significantly over the past 9-12 months.” They also mentioned increasing positive sentiment among developers and suggested Google could even start selling TPU systems directly to AI labs.