Today, Shen Li (mrshenli) joins me to talk about distributed computation in PyTorch. What is distributed? What kinds of things go into making distributed work in PyTorch? What's up with all of the optimizations people want to do here?
Further reading.
- PyTorch distributed overview https://pytorch.org/tutorials/beginner/dist_overview.html
- Distributed data parallel https://pytorch.org/docs/stable/notes/ddp.html
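For listeners who want a concrete starting point before the episode, here is a minimal sketch of how DistributedDataParallel (the subject of the second link) is commonly used. It is not taken from the episode: the gloo backend, the localhost rendezvous address, the toy linear model, and the two-process world size are all illustrative assumptions; real training jobs typically launch with torchrun and use NCCL on GPUs.

import os
import torch
import torch.distributed as dist
import torch.multiprocessing as mp
from torch.nn.parallel import DistributedDataParallel as DDP


def worker(rank, world_size):
    # Every process joins the same process group; DDP uses it to
    # all-reduce gradients across ranks.
    os.environ["MASTER_ADDR"] = "127.0.0.1"   # assumed single-machine setup
    os.environ["MASTER_PORT"] = "29500"       # arbitrary free port
    dist.init_process_group("gloo", rank=rank, world_size=world_size)

    model = torch.nn.Linear(10, 1)            # toy model for illustration
    ddp_model = DDP(model)                    # gradients are synchronized during backward()

    optimizer = torch.optim.SGD(ddp_model.parameters(), lr=0.01)
    loss_fn = torch.nn.MSELoss()

    for _ in range(5):
        optimizer.zero_grad()
        inputs = torch.randn(20, 10)          # each rank would see its own shard of data
        targets = torch.randn(20, 1)
        loss = loss_fn(ddp_model(inputs), targets)
        loss.backward()                       # gradient all-reduce overlaps with backward
        optimizer.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    world_size = 2
    mp.spawn(worker, args=(world_size,), nprocs=world_size)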