Sid Sheth is the CEO and co-founder of d-Matrix, the AI chip company making inference efficient and scalable for datacenters. Backed by Microsoft and with $160M raised, Sid shares why rethinking infrastructure is critical to AI’s future and how a decade in semiconductors prepared him for this moment.
In this conversation, we discuss:
- Why Sid believes AI inference is the biggest computing opportunity of our lifetime and how it will drive the next productivity boom
- The real reason smaller, more efficient models are unlocking the era of inference and what that means for AI adoption at scale
- Why cost, time, and energy are the core constraints of inference, and how d-Matrix is building for performance without compromise
- How the rise of reasoning models and agentic AI shifts demand from generic tasks to abstract problem-solving
- The workforce challenge no one talks about: why talent shortages, not tech limitations, may slow down the AI revolution
- How Sid’s background in semiconductors prepared him to recognize the platform shift toward AI and take the leap into building d-Matrix
Resources: