Content provided by TechDaily.ai. All podcast content including episodes, graphics, and podcast descriptions are uploaded and provided directly by TechDaily.ai or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://ppacc.player.fm/legal.

Inside Google’s Ironwood: AI Inference, Performance & Data Protection

9:33
 
Manage episode 477433095 series 3642779

In this episode of The Deep Dive, we unpack Google’s seventh-generation TPU, Ironwood, and what it means for the future of AI infrastructure. Announced at Google Cloud Next, Ironwood is built specifically for AI inference at scale, boasting 4,614 TFLOPS of peak compute and 192 GB of high-bandwidth memory (HBM) per chip, alongside a major jump in memory bandwidth.
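To put the per-chip figure in context, a quick back-of-the-envelope calculation shows how those TFLOPS add up at pod scale. The pod size below (9,216 chips) is an assumption drawn from Google's Ironwood announcement, not from this episode summary, so treat it as illustrative:

```python
# Rough scale math for an Ironwood deployment.
# PER_CHIP_TFLOPS comes from the episode; CHIPS_PER_POD is an assumed
# full-pod configuration cited in Google's announcement.

PER_CHIP_TFLOPS = 4_614   # peak TFLOPS per Ironwood chip
CHIPS_PER_POD = 9_216     # assumed maximum pod configuration

# 1 exaFLOPS = 1,000,000 TFLOPS
pod_exaflops = PER_CHIP_TFLOPS * CHIPS_PER_POD / 1_000_000
print(f"Full pod: {pod_exaflops:.1f} exaFLOPS")
```

At these assumed numbers, a full pod lands in the tens of exaFLOPS, which is why the episode frames Ironwood as an inference-at-scale play rather than a single-chip story.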

We explore:

  • Why inference optimization matters more than ever
  • How Ironwood compares to Nvidia, AWS, and Microsoft’s chips
  • The rise of SparseCore computing for real-world, embedding-heavy applications
  • Power efficiency, liquid cooling, and scalable AI clusters
  • What this means for data protection, governance, and infrastructure planning

This episode is essential for IT leaders, cloud architects, and AI practitioners navigating the explosion of AI workloads and the growing complexity of data management.


224 episodes

