Anders Sandberg joins me to discuss superintelligence and its profound implications for human psychology, markets, and governance. We talk about physical bottlenecks, tensions between the technosphere and the biosphere, and the long-term cultural and physical forces shaping civilization. We conclude with Sandberg explaining the difficulties of designing reliable AI systems amidst rapid change and coordination risks.

Learn more about Anders's work here: https://mimircenter.org/anders-sandberg

Timestamps:

00:00:00 Preview and intro
00:04:20 2030 superintelligence scenario
00:11:55 Status, post-scarcity, and reshaping human psychology
00:16:00 Physical limits: energy, datacenter, and waste-heat bottlenecks
00:23:48 Technosphere vs biosphere
00:28:42 Culture and physics as long-run drivers of civilization
00:40:38 How superintelligence could upend markets and governments
00:50:01 State inertia: why governments lag behind companies
00:59:06 Value lock-in, censorship, and model alignment
01:08:32 Emergent AI ecosystems and coordination-failure risks
01:19:34 Predictability vs reliability: designing safe systems
01:30:32 Crossing the reliability threshold
01:38:25 Personal reflections on accelerating change