AI in the cloud dominates, but what can you run locally? Carl and Richard speak with Joe Finney about his work setting up local machine learning models. Joe discusses the non-LLM side of machine learning, including the vast array of models available at sites like Hugging Face. These models can help with image recognition, OCR, classification, and much more. Local LLMs are also a possibility, but the hardware requirements become more significant, so a balance must be found between cost, security, and productivity!
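
As a rough illustration of what running one of these smaller, non-LLM models locally can look like, here is a minimal sketch using the Hugging Face transformers pipeline for image classification. The model name and image path are placeholder examples, not anything specific mentioned in the episode.

```python
# Minimal local-inference sketch with Hugging Face transformers.
# Assumes: pip install transformers torch pillow
# The model name and image path below are illustrative examples.
from transformers import pipeline

# Downloads the model weights once, then runs entirely on local hardware.
classifier = pipeline("image-classification", model="google/vit-base-patch16-224")

# Classify a local image file and print the top predictions.
for prediction in classifier("photo.jpg", top_k=3):
    print(f"{prediction['label']}: {prediction['score']:.3f}")
```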