Google to offer Ironwood TPU for public use; Anthropic among first major clients
ETtech November 07, 2025 04:00 AM
Synopsis

Ironwood, the seventh-generation tensor processing unit (TPU) launched by the search giant in April this year for testing and deployment, is built by linking up to 9,216 chips in one pod. It removes data bottlenecks to let customers run and scale the largest, most data-intensive models.

Tech giant Google will release Ironwood, its specialised chip designed to run artificial intelligence (AI) models, for public use in the coming weeks.

According to a CNBC report on Thursday, Google will also announce upgrades to its cloud services, making them "cheaper, faster, and more flexible." The report added that AI startup Anthropic plans to use up to one million of the new TPUs to run its Claude model.

Ironwood is the seventh-generation Tensor Processing Unit (TPU) launched by the search giant in April this year for testing and deployment.


Google’s in-house Ironwood TPU is built to support both training and real-time AI workloads, including chatbots and AI agents. By linking up to 9,216 chips in one pod, it removes data bottlenecks and lets customers run and scale the largest, most data-intensive models.
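
From a developer's perspective, a pod of this size shows up to frameworks such as JAX as a collection of addressable devices that a single program can shard work across. The snippet below is a minimal, hedged sketch of that pattern rather than Ironwood-specific code: it runs on any JAX backend, and the array sizes, mesh layout and forward function are illustrative assumptions, not details from the article.

```python
# A minimal sketch (not Ironwood-specific): enumerate the chips visible to this
# process and shard one computation across all of them, so the group behaves
# like a single large accelerator. Sizes and mesh layout are illustrative.
import numpy as np
import jax
import jax.numpy as jnp
from jax.sharding import Mesh, NamedSharding, PartitionSpec as P

devices = jax.devices()                      # every chip visible to this process
print(f"{len(devices)} {devices[0].platform} device(s) available")

# Arrange the devices in a 1-D mesh and split the batch dimension across it.
mesh = Mesh(np.array(devices), axis_names=("data",))
x = jnp.ones((len(devices) * 128, 256))
x = jax.device_put(x, NamedSharding(mesh, P("data", None)))

@jax.jit
def forward(x):
    # Toy dense layer; on a real pod the compiler overlaps this compute with
    # traffic over the inter-chip links, so the sharded batch scales out.
    w = jnp.ones((256, 512))
    return jnp.dot(x, w)

print(forward(x).shape)                      # (num_devices * 128, 512)
```

The same sharding code scales from a handful of chips to a full pod, because the device list discovered at runtime, not the program itself, determines how many shards the work is split into.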

Google competes with the likes of Microsoft, Amazon, and Meta to build next-generation AI infrastructure. While most major AI models still run on Nvidia GPUs, Google’s custom TPU chips offer potential advantages in cost, performance, and efficiency, the company said earlier in a blog post.

Technical insights

  • A full Ironwood pod is a cluster of up to 9,216 liquid-cooled chips working together as a single unit.
  • The chips are linked through inter-chip interconnect (ICI) networking, with a full-scale configuration drawing roughly 10 megawatts of power.
  • For customers, Ironwood is available in two scalable configurations: a 256-chip cluster or a full 9,216-chip pod.

Key features of Ironwood


  • Ironwood is built to handle the heavy computation and communication demands of "thinking models" such as large language models (LLMs), mixture-of-experts (MoE) architectures and advanced reasoning systems.
  • It lets AI workloads run more cost-effectively; Google claims Ironwood is nearly 30x more power-efficient than its first Cloud TPU from 2018.
  • It offers 192 GB of memory per chip, six times that of Trillium, the sixth-generation TPU Google announced last year, making it easier to process larger models and data sets (a quick check of the arithmetic follows this list).
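
As a quick back-of-envelope check of that memory comparison, note that the per-chip figure for Trillium below is implied by the stated ratio rather than quoted in the article:

```python
# Sanity check of the stated memory comparison; the Trillium number is derived
# from the 6x ratio in the article, not quoted directly.
ironwood_memory_gb = 192                 # per-chip memory stated for Ironwood
ratio_vs_trillium = 6                    # "six times" the previous generation
trillium_memory_gb = ironwood_memory_gb / ratio_vs_trillium
print(f"Implied Trillium per-chip memory: {trillium_memory_gb:.0f} GB")  # 32 GB
```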

Google parent Alphabet reported quarterly revenue above $100 billion for the first time on October 30, led by strong growth in its core search business and a rapidly expanding cloud division buoyed by AI.

The company's ambitious approach to offering AI "is delivering strong momentum, and we're shipping at speed," CEO Sundar Pichai said.