
This story was originally published on HackerNoon at: https://hackernoon.com/crossentropy-logloss-and-perplexity-different-facets-of-likelihood.
We explore the link between three popular loss functions: cross-entropy, log-loss, and perplexity.
Check more stories related to machine-learning at: https://hackernoon.com/c/machine-learning. You can also check exclusive content about #ai, #machine-learning, #statistics, #crossentropy-explained, #what-is-logloss, #facets-of-likelihood, #software-libraries, #hackernoon-top-story, and more.
This story was written by: @artemborin. Learn more about this writer by checking @artemborin's about page, and for more stories, please visit hackernoon.com.
Machine learning is centered on creating models that predict accurately. Evaluation metrics offer a way to gauge a model's performance, allowing us to refine or even switch algorithms based on the outcomes.
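The link between the three metrics can be sketched in a few lines (a minimal illustration, not code from the article itself): for hard class labels, cross-entropy and log-loss coincide as the average negative log-likelihood of the true classes, and perplexity is simply the exponential of that quantity.

```python
import math

def log_loss(y_true, probs):
    """Average negative log-likelihood of the true labels.

    y_true: list of true class indices.
    probs:  list of predicted probability distributions (one per sample).
    For hard labels this is identical to the cross-entropy between the
    empirical (one-hot) distribution and the model's predictions.
    """
    return -sum(math.log(p[c]) for c, p in zip(y_true, probs)) / len(y_true)

def perplexity(avg_nll):
    """Perplexity is the exponential of the average negative log-likelihood."""
    return math.exp(avg_nll)

# Toy example (hypothetical data): 3 samples, 2 classes.
y = [0, 1, 1]
p = [[0.9, 0.1], [0.2, 0.8], [0.3, 0.7]]
ce = log_loss(y, p)          # cross-entropy == log-loss here
ppl = perplexity(ce)         # perplexity = exp(cross-entropy)
print(f"cross-entropy/log-loss: {ce:.4f}, perplexity: {ppl:.4f}")
```

A perfect model (probability 1 on every true class) has zero log-loss and a perplexity of exactly 1, which is why perplexity is often read as the model's "effective number of choices."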
