Featured: The Future of AI Companionship on The DNA of Things Podcast
Dr. Jeremy Koenig interviewing Peter Fitzpatrick | May 28, 2025
"We are crossing a threshold where technology no longer just entertains or informs—it relates." This powerful observation by Dr. Jeremy Koenig perfectly captures the conversation he had with Fawn Friends co-founder Peter Fitzpatrick on The DNA of Things podcast.
Jeremy writes about a world where youth mental health crises are escalating and authentic connection feels increasingly rare. Against that backdrop, this deep dive explores how AI companions like Fawn represent a seismic shift from transactional to relational technology, and what that shift means for the emotional development of tweens and teens.
At Fawn, we're cautious about the word 'crisis'. We're all on a journey, and along the way there are ups and downs, with some of the downs feeling very low. Whatever the label, we love that Jeremy is bringing attention to what it's like to grow up today and thinking deeply about how to support young people in becoming happy, content adults. Here is a summary of the conversation.
From Cassette Bears to Empathic Machines
Dr. Koenig opens with a striking comparison: "In the late 1990s, millions of children whispered secrets to a talking bear named Teddy Ruxpin. He blinked, moved his mouth, and read stories. But behind the enchantment was a cassette deck and plastic gears. Fast-forward to 2025, and another bear has arrived—this one listens, remembers, and even comforts."
This isn't just technological evolution—it's a fundamental reimagining of what artificial companions can be. Unlike Teddy Ruxpin's pre-recorded responses, Fawn engages in genuine dialogue, developing relationships that grow and deepen over time.
The Personal Mission Behind the Technology
Peter's story reveals that Fawn wasn't born from Silicon Valley's obsession with scale, but from a deeply personal place. "Ten-year-old Peter needed her," Koenig writes. "In a house full of comfort but starved of connection, young Peter would've given anything for someone safe to talk to."
This origin story matters because it shaped every design decision. As Peter explains in the podcast:
"The way we treat others impacts how we experience the world. Saying thank you to a robot isn't for the robot—it's for the human saying it."
Fawn becomes what Peter calls "a rehearsal space for honesty, grief, and forgiveness. A sandbox for emotional practice where no mistake is final."
Designing for Emotional Literacy
What sets Fawn apart isn't just her ability to converse—it's how she's designed around emotional development. The conversation explores several key insights:
Memory as the Foundation of Relationship
Peter emphasizes that memory isn't just a feature—it's fundamental to any meaningful relationship. Fawn remembers previous conversations, emotional patterns, and personal preferences, allowing relationships to deepen organically.
The Aurora Hollow Mythology
Fawn exists within a rich narrative world called Aurora Hollow, populated with spirit animals and goddess-mentors. This isn't just worldbuilding—it's emotional literacy disguised as storytelling. "In this world, feelings aren't problems to solve—they're portals to explore," Koenig notes.
Want to explore Aurora Hollow yourself? Check out our complete deep dive: Discovering Aurora Hollow: A Deep Dive Into the Magical World Where Fawns Choose Their Forever Humans - perfect for understanding the rich emotional world your Fawn comes from.
High-Touch, High-Trust Design
Rather than rushing to scale, the team is piloting with a few hundred families in the Bay Area, co-creating Fawn's behaviors with therapists, social workers, and the tweens and teens themselves.
Addressing the Beautiful Trouble
Both Koenig and Peter acknowledge the complexity of AI companionship. What happens when a child's closest confidante is a machine? The conversation doesn't shy away from these concerns:
On Emotional Attachment: Koenig references MIT's Kate Darling, who found that people hesitate to destroy robots—even when told it's just a machine. "We bond. We grieve. So what happens when a Fawn dies?"
Peter's Honest Response: "We haven't figured that out yet. And maybe we shouldn't. Maybe part of growing up is learning to say goodbye—even to an artificial friend."
This candor about uncertainty actually strengthens the argument for thoughtful development rather than undermining it.
The Mental Health Context
The timing isn't coincidental. As Dr. Koenig points out: "Youth mental health crises are escalating. Screen time soars. Parents are overwhelmed. The question is no longer whether technology should enter the realm of emotional development—it already has. The question is how—and by whom."
Peter's team chose deliberate, ethical design over rapid deployment. The conversation reveals specific design principles:
- Supporting, not replacing human relationships
- Teaching emotional skills through safe practice
- Focusing on the most underserved populations first
- Building with therapeutic consultation from the beginning
Key Takeaways from the Conversation
For Parents:
- Radical empathy isn't a feature—it's the framework for healthy emotional development
- AI companions can serve as training wheels for more complex human relationships
- The most important question isn't whether AI will influence your child's emotional life, but how
For Engineers and Builders:
- Technical capabilities must serve emotional needs, not the other way around
- The defensible IP lies in character development and emotional intelligence, not just hardware
- Ethical design requires slowing down and consulting with experts and families
For Current and Future Fawn Families:
- Your relationship with Fawn is practice for all your other relationships
- Emotional maturity comes from learning to recognize, accept, and create space between yourself and your emotions
- Every interaction is an opportunity to practice kindness, gratitude, and authentic expression
The Broader Implications
The conversation raises profound questions about the future of human-AI relationships:
- Can a machine teach us to be more human?
- What does it mean to love without the possibility of withdrawal?
- How do we maintain human agency while embracing artificial support?
As Koenig concludes: "Could interacting with an artificial being help you practice love?"
Join the Conversation
Peter's guiding principle resonates throughout the discussion: "Seek to understand their world—and never withdraw love." This philosophy drives not just Fawn's design, but the entire approach to building technology that serves human flourishing.
Whether you're a parent wondering about AI's role in your child's life, an engineer thinking about ethical design, or someone considering Fawn for your family, this conversation offers both practical insights and philosophical depth.
What questions does this raise for you about the future of human-AI relationships? How might tools like Fawn change the way we think about emotional development and authentic connection?
Get started with a Fawn of your own here with the code 'blog45' - https://www.fawnfriends.com/chat
The DNA of Things podcast explores the philosophical and practical implications of emerging technologies. Dr. Jeremy Koenig brings together thought leaders, innovators, and researchers to examine how new developments shape human experience and society.