
Machine learning using neural networks has led to a remarkable leap forward in artificial intelligence, and the technological and social ramifications have been discussed at great length. To understand the origin and nature of this progress, it is useful to dig at least a little bit into the mathematical and algorithmic structures underlying these techniques. Anil Ananthaswamy takes up this challenge in his book Why Machines Learn: The Elegant Math Behind Modern AI. In this conversation we give a brief overview of some of the basic ideas, including the curse of dimensionality, backpropagation, transformer architectures, and more.

Blog post with transcript: https://www.preposterousuniverse.com/podcast/2025/11/24/336-anil-ananthaswamy-on-the-mathematics-of-neural-nets-and-ai/

Support Mindscape on Patreon.

Anil Ananthaswamy received a master's degree in electrical engineering from the University of Washington, Seattle. He is currently a freelance science writer and feature editor for PNAS Front Matter. He was formerly the deputy news editor for New Scientist, a Knight Science Journalism Fellow at MIT, and journalist-in-residence at the Simons Institute for the Theory of Computing, University of California, Berkeley. He organizes an annual science journalism workshop at the National Centre for Biological Sciences in Bengaluru, India.

See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.

