
Don't hold back guys, tell us how you really feel.

If Anyone Builds It, Everyone Dies: Why Superhuman AI Would Kill Us All

By: Eliezer Yudkowsky and Nate Soares | Published: 2025 | 272 pages

Briefly, what is this book about?

This book makes the AI doomer case at its most extreme. It asserts that if we build artificial superintelligence (ASI), that ASI will certainly kill all of humanity.

Their argument in brief: the ASI will have goals. These goals are very unlikely to be aligned with humanity's goals, which will bring humanity and the ASI into conflict over resources. Since the ASI will surpass us in every respect, it will have no reason to negotiate with us, and its superhuman abilities will leave us unable to stop it. Taken together, this leaves the ASI with no reason to keep us around and many reasons to eliminate us—thus the "Everyone Dies" part of the title.

What's the author's angle?

Yudkowsky is the ultimate AI doomer. No one is more vocally worried about misaligned ASI than he. Soares is Robin to Yudkowsky's Batman.

Who should read this book?

For those familiar with the argument, I don't think the book covers much in the way of new territory.

For those unfamiliar with the argument, I might recommend Superintelligence by Nick Bostrom instead. It makes the same points without being quite so tendentious.

Specific thoughts: The parable of the alchemists and the unfairness of life
