
Ironwood, Google's seventh-generation Tensor Processing Unit (TPU), was unveiled last week at Google Cloud Next 25. Google describes it as its most powerful and advanced chip for artificial intelligence (AI) workloads. The chipset was designed specifically for AI inference, the computation an AI model performs to process a query and produce a response.

Google stated that it will provide developers with access to the Ironwood TPUs through the Google Cloud platform. The company announced its seventh-generation AI accelerator, Ironwood, in a blog post, saying the chip will help firms transition from reactive AI systems to proactive ones.

Ironwood targets agentic AI systems, mixture-of-experts (MoE) models, and dense large language models (LLMs). TPUs are chipsets designed specifically for AI and machine learning (ML) workloads; they deliver very fast parallel processing and notably high power efficiency, particularly for deep learning. Google stated that each Ironwood chip offers a peak of 4,614 teraflops (TFLOPs) of compute, significantly faster than its predecessor, Trillium, which was introduced in May 2024.

Ironwood Chipsets: Roll Out and Accessibility

Google intends to offer the Ironwood chipsets in clusters to maximize processing capacity for complex AI workloads. Up to 9,216 liquid-cooled chips can be packed into a single pod, connected by a network of Inter-Chip Interconnect (ICI) links. On Google Cloud, Ironwood will be available in two configurations: 256 chips and 9,216 chips. In the larger configuration, a cluster can deliver up to 42.5 Exaflops of compute.
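As a back-of-the-envelope check, the per-chip and cluster figures quoted above are consistent with each other. The sketch below uses only numbers from the article; the constant names are ours:

```python
# Sanity check: does 9,216 chips at 4,614 TFLOPs each match the
# 42.5 Exaflops Google quotes for the largest Ironwood configuration?
PER_CHIP_TFLOPS = 4_614   # peak compute per Ironwood chip (from the article)
POD_CHIPS = 9_216         # chips in the largest configuration

pod_tflops = PER_CHIP_TFLOPS * POD_CHIPS
pod_exaflops = pod_tflops / 1_000_000   # 1 Exaflop = 1,000,000 TFLOPs

print(f"{pod_exaflops:.1f} Exaflops")   # prints "42.5 Exaflops"
```

The product comes out to roughly 42.52 Exaflops, matching Google's headline figure.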

The chipset is a significant addition to the Google Cloud AI Hypercomputer architecture. Google says a full Ironwood pod delivers more than 24 times the compute of El Capitan, the world's most powerful supercomputer, which offers 1.7 Exaflops per pod. Memory capacity has also grown: each chip carries 192GB, six times what Trillium had, and memory bandwidth has increased to 7.2 Tbps.
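The comparisons in this paragraph can also be checked from the quoted figures alone. In the sketch below, Trillium's memory capacity is derived from the "six times" claim rather than stated in the article:

```python
# Checking the El Capitan comparison and the memory multiplier,
# using only figures quoted in the article.
IRONWOOD_POD_EXAFLOPS = 42.5
EL_CAPITAN_EXAFLOPS = 1.7

ratio = IRONWOOD_POD_EXAFLOPS / EL_CAPITAN_EXAFLOPS
print(ratio)  # 25.0, consistent with "more than 24 times"

IRONWOOD_HBM_GB = 192
# The article says Ironwood has 6x Trillium's memory; this implies
# Trillium carried about 32GB per chip (derived, not stated in the article).
trillium_hbm_gb = IRONWOOD_HBM_GB / 6
print(trillium_hbm_gb)  # 32.0
```

Both numbers line up with the claims as reported.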

Notably, Google Cloud developers do not yet have access to Ironwood. As it did with the previous generation, the tech giant is likely to move its internal systems, including its Gemini models, to the new Ironwood TPUs before opening access to developers.
