
In the final episode of our AI & identity series, Dr. Stacey Denise speaks with Douglas Moore Jr., a data scientist and responsible AI advocate, about how machines are trained to see beauty, emotion, and culture. From bias in datasets to the ethics of design, this is a conversation for anyone who wants tech to feel more human.

Topics We Explore:

  • How AI models learn bias — and how it feels when they get you wrong
  • The challenge of training machines to "see" beauty, culture, and emotion
  • Why neuroaesthetic design matters for mental and emotional wellness
  • What explainability and fairness really look like in responsible AI
  • The non-negotiables of ethical, inclusive, and emotionally intelligent tech

Connect with Douglas Moore Jr. on IG

Subscription, Links & Email List

Socials
