In this episode, Stewart Alsop III sits down with Stewart Alsop II to explore a wide sweep of themes—from getting an ESP32 and Arduino IDE up and running, to the future of physical AI, real-time computing, Starlink’s mesh network ambitions, and how edge devices like Apple’s upcoming M-series gear could shift the balance between local and cloud intelligence. Along the way, the two compare today’s robotics hype with real constraints in autonomy, talk through the economics and power dynamics of OpenAI, Anthropic, Amazon, and Google, and reflect on how startups still occasionally crack through big-tech dominance.
Check out this GPT we trained on the conversation
Timestamps
00:00 Stewart Alsop opens with Arduino, ESP32 setup, vibe-coding, and the excitement of making physical things.
05:00 Discussion shifts to robots, autonomy limits, real-world complexity, and why physical AI lags behind software.
10:00 They unpack BIOS, firmware, embedded systems, and how hardware and software blur together.
15:00 Talk moves to cars as computers, Rivian’s design, and rising vehicle autonomy with onboard intelligence.
20:00 Stewart demos Codex, highlighting slow API inference and questions about real-time computing.
25:00 They contrast true inference with derivation, discuss creativity, and voice doubts about AGI.
30:00 Conversation turns to Microsoft, Google, OpenAI integration, and why apps fail at real personal utility.
35:00 Exploration of on-device LLMs, Apple’s strategy, M-series chips, and edge computing.
40:00 Broader architecture: distributed vs centralized systems, device power vs cloud power.
45:00 Discussion of big tech dominance, coordination costs, and how startups like Tesla or Anduril break through.
50:00 OpenAI unit economics, tokens, APIs, and comparisons with Amazon, Uber, and WeWork.
55:00 Closing with mesh networks, Starlink’s satellite routing, low-Earth-orbit scaling, and space debris concerns.
Key Insights
- Hardware as a path to understanding reality: Stewart Alsop describes using Arduino, ESP32 boards, and a Raspberry Pi as a way to gain “intimacy with reality,” arguing that building physical systems teaches constraints and feedback loops that pure software often hides. His process—installing toolchains, debugging libraries, and interacting with sensors—highlights how hardware forces real-world learning that complements AI-driven coding assistance.
- Physical AI lags far behind software AI: The conversation emphasizes the gap between LLM-based software agents and embodied robotics. Despite flashy demos, most robots remain remote-controlled, brittle, or gimmicky. The real world’s variability—stairs, dirt roads, weather—makes autonomy extremely difficult, pushing truly capable physical AI far into the future.
- Everything is becoming a computer, including cars: They outline how EVs like Rivian and Tesla represent a shift where the computer is the primary design element and the vehicle is built around it. With autonomy features, sensor fusion, and operating systems more akin to smartphones, cars are evolving into mobile computation platforms with wheels.
- Real-time computing and the “Evernet” are the next frontier: Stewart Alsop II argues that the future hinges on synchronous, always-available, high-bandwidth connectivity. Starlink serves as a preview of a world where real-time, global, low-latency networking becomes the norm, enabling continuous context awareness and distributed intelligence across devices.
- Inference today is really derivation, not true reasoning: They distinguish between LLM “inference”—predicting tokens from prior data—and human inference, which creates new, orthogonal ideas. This raises doubts about AGI timelines, suggesting that creativity and genuine reasoning remain uniquely human for now.
- Edge computing will rival cloud-based AI: Apple’s focus on on-device LLMs, fueled by increasingly powerful M-series and A-series chips, points to a hybrid future. Local models will handle personal context and privacy, while cloud models tackle heavier tasks. This could rebalance power away from centralized AI infrastructure.
- Big tech dominance persists, but disruption remains possible: Although companies like Apple, Google, Amazon, and Meta have deep structural advantages—from chips to cloud to data—examples like Tesla, SpaceX, and Anduril show that startups can still break through. The key remains exceptional execution, timing, and identifying architectural gaps in the incumbents’ strategies.