Minter Dialogue via NotebookLM
Automation tending toward autonomy
For this podcast episode, I used a variety of AI tools to create everything you see:
- NotebookLM.Google.com to craft and ‘record’ two (different) conversations between two synthesized voices
- OpenArt.AI to create the main image
- Flowsend.AI to provide (propose) the title, transcripts, keywords and social media posts
- Perplexity.AI for some extra content
To put this podcast and accompanying show notes together, I left many of the elements entirely as they were created. Others had to be adapted or qualified (for example, the image below!). But I can safely say that 90% of what you read or hear here has been made by artificial intelligence. The irony (expressly human) is that this entire exercise is about using AI to talk about the most human of qualities: empathy. Your comments, criticisms and questions are welcome!
Book summary – courtesy of NotebookLM
This excerpt from Minter Dial’s book, Heartificial Empathy, discusses the importance of empathy in business and the potential for artificial intelligence (AI) to enhance empathy. Dial argues that empathy is a crucial component of success in the modern business landscape, promoting employee engagement, customer satisfaction, and overall productivity. He explores the concept of “heartificial empathy,” examining the challenges and opportunities of integrating empathy into AI systems. The book also explores the potential impact of AI on human relationships, societal changes, and the future of work, advocating for responsible AI development and a focus on human values. {Minter note: 10/10 – concise and clear}
A conversation – created by NotebookLM
Summary text (automatically created by Flowsend.AI after uploading the recorded 10-minute interview)
In this episode, the two NotebookLM hosts review and explore the concept of artificial empathy in Minter Dial’s book “Heartificial Empathy.” They discuss why empathy is a crucial skill in the 21st century, especially in a world increasingly dominated by AI and algorithms. They point out that empathy is not just a nice-to-have quality but a competitive advantage. The podcasters break down empathy as an active process of understanding others’ emotions and motivations. They delve into practical ways, laid out by Minter Dial, to cultivate empathy, such as reading fiction and practising mindfulness. The conversation also addresses the potential of empathic AI, highlighting both its benefits and ethical challenges. They stress the importance of transparency and the responsibility of developers to ensure AI systems are used ethically. Finally, they discuss the role of empathy in everyday life and how it can bridge divides in our increasingly digital world. As they suggest, Minter believes that by embracing our shared humanity, people can make a real difference. {Minter note: 8/10 – a clear summary, although I take issue with the book’s core concept being limited to “artificial empathy.” To my mind, it’s essentially about empathy, and then it covers the encoding of empathy into AI…}
What are the risks of using NotebookLM and others like it?
I went over to Perplexity (my go-to AI) this time to ask it the following prompt: “What are the dangers of uploading content into genAI machines like Perplexity or Google’s NotebookLM? To what extent is the ownership of the content uploaded likely to change?”
Here is what Perplexity fed back to me verbatim:
The use of generative AI tools like Perplexity or Google’s NotebookLM raises several concerns regarding content ownership and data privacy. Here are the key dangers and considerations:
## Data Privacy and Security
Uploading content to generative AI platforms exposes your data to potential privacy and security risks. These tools often store and process user inputs, which could include sensitive or confidential information[1]. There’s always a risk that this data could be compromised through security breaches or unauthorized access.
## Training Data Usage
Many AI companies use user-generated content to improve their models. This means that the content you upload may be used to train future versions of the AI, potentially without your explicit consent[6]. While some companies offer opt-out options, it’s not always clear how effectively these are implemented or enforced.
## Intellectual Property Concerns
The ownership of AI-generated content is a complex and evolving legal issue. In many jurisdictions, including the United States, AI-generated works are not eligible for copyright protection[4]. This creates uncertainty around the ownership and rights associated with content created through these platforms.
## Potential Loss of Control
By uploading content to these platforms, you may inadvertently grant the service provider certain rights to use, modify, or distribute your work[6]. Some terms of service may include clauses that allow the company to repurpose or commercialize user-generated content without proper attribution or compensation.
## Ownership Changes
The extent to which ownership of uploaded content changes depends on several factors:
- **Terms of Service**: Each platform has its own terms that dictate how user content is handled. It’s crucial to carefully review these terms before using the service[6].
- **Jurisdiction**: Copyright laws vary by country, affecting how AI-generated content is treated legally[5].
- **Degree of Human Input**: In some cases, copyright protection may be available for the human-authored aspects of work that incorporates AI-generated elements[4].
- **Platform Policies**: Some companies, like OpenAI, explicitly state that users retain ownership of their inputs and outputs[4]. However, this doesn’t necessarily prevent the company from using the content for model training.
## Mitigation Strategies
To protect your content and interests when using generative AI tools:
- **Read the Terms**: Carefully review the platform’s terms of service and privacy policy.
- **Use Opt-Out Features**: Where available, use opt-out options to prevent your data from being used for AI training[7].
- **Limit Sensitive Information**: Avoid uploading highly sensitive or valuable intellectual property.
- **Consider Alternatives**: For critical work, consider using offline tools or platforms with more robust privacy guarantees.
- **Stay Informed**: Keep up with evolving laws and regulations surrounding AI and intellectual property rights.
In conclusion, while generative AI tools offer powerful capabilities, they also present significant risks to content ownership and data privacy. Users should approach these platforms with caution and make informed decisions about what content to upload and how to protect their interests.
Citations:
[1] https://www.techtarget.com/searchcontentmanagement/answer/Who-owns-AI-generated-content
[2] https://www.techtarget.com/searchenterpriseai/tip/Generative-AI-ethics-8-biggest-concerns
[3] https://www.weforum.org/agenda/2024/01/cracking-the-code-generative-ai-and-intellectual-property/
[4] https://www.insidetechlaw.com/blog/2024/05/generative-ai-how-it-works-content-ownership-and-copyrights
[5] https://www.cooley.com/news/insight/2024/2024-01-29-copyright-ownership-of-generative-ai-outputs-varies-around-the-world
[6] https://nationalcentreforai.jiscinvolve.org/wp/2024/09/09/generative-ai-interpreting-terms-and-conditions-for-safe-use/
[7] https://www.wired.com/story/how-to-stop-your-data-from-being-used-to-train-ai/
Bottom Line (per Minter): There is plenty to watch out for!
AI Critique of my book
NotebookLM offered up some advice on how to improve my text, the second edition of Heartificial Empathy. The first set of comments, about grammar and style, didn’t seem to me to be as relevant as the second set on improving the content. Minter says: I can imagine how such advice could help me with my next book. It also makes me wonder about the role of editors… should they not be worried about upping their game? Certainly, I anticipate that wise copy editors will learn to give AI the first pass at any text revision. Here is the verbatim from NotebookLM.
Suggestions for improving grammar and style:
- Replace “The sources” with specific attribution. Instead of repeatedly using “The sources,” consider phrases like “Experts argue…” or “Furthermore, research suggests…” to add variety and specificity. For example, “The sources warn…” can be replaced with “Experts warn…”
- Vary sentence structure. The repeated use of “This raises concerns about…” and similar phrases creates monotony. Consider rephrasing some sentences to make the writing more engaging. For example, “This raises concerns about the boundary between ethical persuasion and malicious manipulation” could become “The line between ethical persuasion and malicious manipulation blurs, raising serious concerns.”
- Use active voice where possible. Sentences like “The importance of transparency…” and “The need for ethical guidelines…” would be more impactful in active voice. For example, “Users must be informed…” and “Developers urgently need clear ethical guidelines…”
- Reduce redundancy. Phrases like “properly programmed” in point 2 and “clear ethical principles” in the mitigation section could be simplified to “programmed effectively” and “ethical principles,” respectively.
- Ensure consistent formatting. The numbered references are inconsistently formatted. Use a consistent style throughout.
Suggestions for improving content:
- Provide context for the “sources.” Briefly introduce the nature of these sources (e.g., academic papers, industry reports) to establish their credibility.
- Expand on the “programmer’s dilemma.” While the point about potential gender bias in AI development is important, it needs further elaboration. Discuss the complexities of empathy as a human trait and the challenges in translating it to code. Consider addressing ways to mitigate this potential bias beyond diverse development teams.
- Elaborate on the “fatal flaw.” Provide specific examples of what constitutes “too much agency” and the potential consequences of such scenarios. Discuss the balance between AI assistance and maintaining appropriate human control.
- Strengthen the conclusion. Instead of a general statement about societal benefit, conclude with a more impactful call to action. For instance, emphasize the shared responsibility of developers, policymakers, and the public in shaping the ethical development of empathic AI.
Best quotes in the book (per NotebookLM)!
- “When you start to develop your powers of empathy and imagination, the whole world opens up to you.” – Susan Sarandon
- “The world seems to be in a spiral of never-ending crises. Will we, as enlightened human beings, be able to resolve the division, hostility and incivility that plague us, or will the animal in us win out? Or will it be the machine that ultimately prevails?” – Minter Dial
- “If you can’t measure it, it doesn’t exist.” (often misattributed)
- “We aren’t cool calculators of self-interest who sometimes go crazy; we’re crazies who are, under special circumstances, sometimes rational.” – David Berreby paraphrasing Antonio Damasio
- “I think it’s a beautiful thing to think about machines that are empathising, but I do think that we’re a long way off, and it actually forces us to revisit that initial question around what qualifies as intelligence.” – Kathryn Hume
Please send me your questions — as an audio file if you’d like — to nminterdial@gmail.com. Otherwise, below, you’ll find the show notes and, of course, you are invited to comment. If you liked the podcast, please take a moment to rate it here.
Further resources for the Minter Dialogue podcast:
Meanwhile, you can find my other interviews on the Minter Dialogue Show in this podcast tab, on Megaphone or via Apple Podcasts. If you like the show, please go over to rate this podcast via RateThisPodcast! And for the francophones reading this, if you want to get more podcasts, you can also find my radio show en français over at: MinterDial.fr, on Megaphone or in iTunes.
Music credit: The jingle at the beginning of the show is courtesy of my friend, Pierre Journel, author of the Guitar Channel. And, the new sign-off music is “A Convinced Man,” a song I co-wrote and recorded with Stephanie Singer back in the late 1980s (please excuse the quality of the sound!).
Full transcript via Flowsend.ai
Transcription courtesy of Flowsend.ai, an AI full-service for podcasters
Minter Dial [0:05 – 1:42]: Hello and welcome to what I hope you will think is a corker of an episode. This is the Minter Dialogue podcast, episode number 587. My name is Minter Dial, and I’m a real person. You might not think that’s normal, but I am your host for this podcast, a most proud member of the Evergreen Podcast Network. For more information or to check out other shows on the Evergreen network, go visit their site, evergreenpodcast.com. So now to the core of this episode. It’s a very special one. I would argue it’s possibly out of this world. It’s an interview between two voices, both synthesized, using a script that has been automated yet entirely customized around the second edition of my book, Heartificial Empathy, which came out in 2023. My friend Dan Priestley gave me the prompt, actually, with an Instagram story, and I ran with it and it blew me away. And I think it will do the same for you. Please listen to these relatively short interviews. There are two of them, ten and fifteen minutes long, and afterwards, I’ll detail how these interviews were made. You will be stunned by them, I’m sure. In the show notes, you’ll see how I use a suite of different genAI tools to create custom content for this blog post and most of my others nowadays. I’ll also comment on what this potentially means for business. So, let’s get to the heart of it. Take a listen to these two interviews. I’ll circle back afterwards.
Notebook Host A [1:43 – 1:50]: Ever feel like the world’s gotten a little too automated? Kind of like we’re all just cogs in a giant machine?
Notebook Host B (female) [1:50 – 1:51]: Yeah. Yeah.
Notebook Host A [1:51 – 2:04]: Well, today we’re diving into Heartificial Empathy, okay? It’s a book by Minter Dial, and it explores why empathy, that very human ability to connect, might just be our superpower in the 21st century.
Notebook Host B (female) [2:04 – 2:19]: And you know what’s really interesting right from the start is how Dial frames this whole thing, right? He doesn’t just say, hey, empathy is nice. He’s out there arguing that it’s the competitive advantage, especially in a world that’s increasingly dominated by algorithms and AI.
Notebook Host A [2:19 – 2:26]: It’s like that secret sauce that everyone’s trying to bottle. Yeah, but to really get it right, it all starts with genuinely understanding people, doesn’t it?
Notebook Host B (female) [2:26 – 2:40]: Absolutely. And Dial breaks down empathy, not as this vague feeling, but as an active process. It’s about really stepping out of your own head into someone else’s shoes, grasping their context, their emotions, even their motivations.
Notebook Host A [2:40 – 2:41]: Yeah, yeah.
Notebook Host B (female) [2:41 – 2:50]: He talks about do unto others as they would have done unto them, which, let’s be honest, that takes real effort. Yeah, but it can completely revolutionize how we work, how we interact with each other.
Notebook Host A [2:51 – 3:02]: You know, when you put it like that, it really makes you realize how rare true empathy actually is, especially at work. But how do we actually cultivate that? Is it even something you can learn?
Notebook Host B (female) [3:02 – 3:03]: That’s the million dollar question.
Notebook Host A [3:04 – 3:04]: Yeah.
Notebook Host B (female) [3:04 – 3:16]: And Dial says yes. Empathy can be cultivated both individually and within organizations. He actually gives some really practical advice, too, like urging people to read more fiction.
Notebook Host A [3:16 – 3:16]: Really?
Notebook Host B (female) [3:16 – 3:24]: Yeah. Getting lost in a good story, really inhabiting the characters inner worlds. It’s like a workout for your empathy muscle.
Notebook Host A [3:24 – 3:33]: That’s funny. I never thought of it that way, but it makes total sense. The more we understand fictional characters emotions and motivations, the better we get at understanding them in real life, too.
Notebook Host B (female) [3:33 – 3:49]: Exactly. And it’s not just about reading. Dial also talks about practicing mindfulness, being present in the moment, really listening to understand, not just to respond. It’s about being curious about other people’s experiences, even if they’re completely different from your own.
Notebook Host A [3:49 – 4:02]: So, much of this is really resonating with me. Yeah, but what about scaling this up? How can companies actually cultivate a culture of empathy, especially when it comes to things like customer service or product development?
Notebook Host B (female) [4:02 – 4:10]: Well, Dial argues that it has to start from the top down. Leadership needs to embody empathy, not just talk about it.
Notebook Host A [4:10 – 4:10]: Oh, absolutely.
Notebook Host B (female) [4:10 – 4:14]: Because, let’s be real, employees can spot a phony a mile away.
Notebook Host A [4:14 – 4:17]: It’s that whole walk the walk, don’t just talk the talk thing.
Notebook Host B (female) [4:17 – 4:31]: Exactly. When leadership genuinely prioritizes empathy, it ripples throughout the entire organization. It changes how teams collaborate, how they approach problems, and how they interact with customers. It’s about building a culture where people feel heard and understood.
Notebook Host A [4:32 – 4:32]: Right, right.
Notebook Host B (female) [4:32 – 4:35]: Which, by the way, isn’t just good for morale, it’s good for business.
Notebook Host A [4:36 – 4:43]: Now we’re talking. So, how does this empathy thing actually translate into business results? Sure, give me the concrete stuff.
Notebook Host B (female) [4:43 – 5:03]: Dial highlights studies that show a strong correlation between empathy and things like higher employee retention, better customer satisfaction, and even increased profitability. Think about it. When people feel understood and valued, they’re more likely to be engaged, productive, and loyal. Yeah, it’s a win win situation.
Notebook Host A [5:03 – 5:14]: Okay, that makes a lot of sense. So, we’ve covered why empathy is important in the workplace and how to cultivate it. Right, but what about in the realm of technology? Can machines even be empathetic?
Notebook Host B (female) [5:14 – 5:25]: That’s where things get really, really interesting and where Dial’s book, Heartificial Empathy, really digs deep. Can we program machines to understand and respond to human emotions in a way that feels authentic?
Notebook Host A [5:25 – 5:30]: It’s like, are we talking about robots with feelings here, or is there something more nuanced going on?
Notebook Host B (female) [5:30 – 5:47]: Right. It’s not about robots suddenly growing hearts or anything like that, right? Yeah, but about giving them the ability to process and respond to our emotions in a way that feels, you know, genuine, authentic. Dial calls it heartificial empathy, and it’s actually already popping up in some pretty fascinating ways.
Notebook Host A [5:47 – 5:56]: So, less like data from Star Trek and more like, what, a super intuitive chatbot that can sense if you’re getting frustrated and then adjust accordingly.
Notebook Host B (female) [5:56 – 6:06]: Exactly. Imagine calling customer service, but instead of navigating a robotic menu, you’re met with an AI that actually gets your frustration, responds with understanding. That’s the goal anyway.
Notebook Host A [6:06 – 6:13]: Now that would be a game changer. Yeah, but are there real companies actually pulling this off, or is this still, like, Sci-Fi territory?
Notebook Host B (female) [6:13 – 6:27]: It’s becoming more and more real by the day. Dial gives some examples of companies using AI to personalize customer experiences, tailoring everything from marketing messages to product recommendations, all based on your emotional cues.
Notebook Host A [6:28 – 6:28]: Wow.
Notebook Host B (female) [6:28 – 6:33]: There’s even a growing field of AI powered therapy apps that are out there offering emotional support.
Notebook Host A [6:33 – 6:43]: AI therapists that goes way beyond just picking up on frustration. What are the challenges in making sure that something like that is actually helpful and not, you know, potentially harmful?
Notebook Host B (female) [6:43 – 6:56]: That’s the million-dollar question, isn’t it? And Dial really dives into the ethical tightrope of it all. One big challenge is ensuring that these AI systems are designed with diverse data sets to avoid bias.
Notebook Host A [6:56 – 6:57]: Right.
Notebook Host B (female) [6:57 – 7:03]: Remember that AI therapy bot that, because of a flaw in its data, actually advised a user to commit suicide?
Notebook Host A [7:03 – 7:03]: Yeah.
Notebook Host B (female) [7:03 – 7:07]: Yeah, that’s a stark reminder of the stakes involved.
Notebook Host A [7:07 – 7:15]: That’s chilling. It really makes you realize how much we’re trusting these machines with our emotions, even if they don’t actually feel them the way we do.
Notebook Host B (female) [7:15 – 7:31]: Exactly. And that’s why Dial stresses the need for transparency. People need to know when they’re interacting with AI, and they need to understand its limitations. And there’s a huge responsibility on developers to constantly monitor these systems, identify biases, and make sure they’re being used ethically.
Notebook Host A [7:31 – 7:38]: It sounds like a constant balancing act, pushing the boundaries of what AI can do, but also making sure it doesn’t go off the rails.
Notebook Host B (female) [7:38 – 7:58]: It is, and it’s only going to get more complex as this technology evolves. But Dial argues that ultimately empathic AI has the potential to make our lives better. Not just more convenient, but better. It can help us understand ourselves and each other more deeply, it can help bridge divides and maybe even tackle some of the world’s most pressing problems.
Notebook Host A [7:58 – 8:23]: So, while we have to be cautious, it’s not all doom and gloom. There’s real hope here, too. But all this talk of AI and the future makes you wonder, what about empathy’s role in our everyday lives? You know, right here, right now, it feels like empathy is needed now more than ever. Especially with all the craziness of the digital age. Everyone’s so connected, but it often seems like we’re drifting further apart.
Notebook Host B (female) [8:23 – 8:41]: Yeah, you hit the nail on the head. Dial talks about how, ironically, social media, which was designed to connect us, can actually end up fueling isolation and anxiety. Right. With everyone else’s highlight reels, these perfectly curated versions of themselves. Right. And it’s so easy to lose sight of the real people behind the screens.
Notebook Host A [8:41 – 8:48]: It’s that comparison game. Yeah, you start feeling like everyone else has it all figured out and you’re the only one who’s struggling. Dial calls it the Avatar tribe.
Notebook Host B (female) [8:48 – 9:05]: Exactly. We create these online avatars that are often more polished, more successful, more on than we ever are in real life. And that disconnect, that pressure to project this perfect image, it can make it so much harder to be authentic and vulnerable, both online and offline.
Notebook Host A [9:05 – 9:08]: Like we’re all wearing masks, both digitally and in person.
Notebook Host B (female) [9:08 – 9:09]: Yeah.
Notebook Host A [9:09 – 9:11]: No wonder genuine empathy feels so rare these days.
Notebook Host B (female) [9:12 – 9:12]: Yeah.
Notebook Host A [9:12 – 9:20]: So, how do we actually break out of that cycle? How do we reclaim empathy in a world that often feels, honestly, anything but empathetic?
Notebook Host B (female) [9:20 – 9:34]: Dial argues that it starts with awareness, with recognizing that there’s a problem. Once we see how technology can sometimes hinder our ability to connect deeply, we can start making conscious choices to use it in ways that actually foster empathy.
Notebook Host A [9:34 – 9:44]: So, less scrolling through social media feeling envious, and more, what? Reaching out to a friend who’s having a tough time? Yeah, really listening when someone shares their struggles.
Notebook Host B (female) [9:44 – 9:56]: Exactly. It’s about being present in our interactions, putting down our phones, and truly engaging with the people right in front of us. And it’s about making an effort to understand different perspectives, even when we disagree.
Notebook Host A [9:56 – 10:05]: That’s a big one, especially in today’s world, where it feels like everyone’s retreating into their own echo chambers for sure. But Dial seems to really believe that empathy can bridge those divides.
Notebook Host B (female) [10:05 – 10:27]: He does. He points to real-world examples of empathy in action, from grassroots movements to global organizations working to build bridges across cultures and ideologies. He argues that by tapping into our shared humanity, by making a conscious effort to understand where others are coming from, we can start to find common ground and work towards solutions together.
Notebook Host A [10:27 – 10:42]: It’s a powerful reminder that even small acts of empathy can actually make a real difference. That it’s not all or nothing. We don’t have to single handedly solve all the world’s problems, but we can each choose to show up with a little more empathy in our own lives.
Notebook Host B (female) [10:42 – 11:01]: Exactly. And as we’ve discussed, it’s not just some nice-to-have quality. Dial makes a very compelling case that empathy, in a world grappling with rapid technological advancement, social isolation and incredibly complex challenges, is not just desirable but absolutely essential.
Notebook Host A [11:01 – 11:10]: It’s like that secret weapon, that superpower we talked about earlier. Yeah, but it’s not about manipulating others, it’s about genuinely understanding and connecting with them on a deeper level.
Notebook Host B (female) [11:10 – 11:20]: Absolutely. It’s about recognizing that even in our differences, we’re all in this human experience together and embracing that shared humanity. Well that’s where the real power of empathy lies.
Notebook Host A [11:20 – 11:39]: Well said. If you’re feeling inspired to tap into your own empathy superpower, I highly recommend checking out Minter Dial’s Heartificial Empathy. It’s packed with even more insights and practical advice on how to cultivate this essential skill in yourself, your work and your interactions with the world around you. Until next time, keep diving deep.
Minter Dial [11:41 – 12:32]: So, what did you think? Pretty cool, hey? In that one, I gave it a brief to talk about the core concepts in the book around empathy, and why and how to bring more empathy into society, business and then into AI. In this second one, I gave a different brief: I asked the interviewers to focus on the impact of tech and AI, and specifically empathic AI, on our human relationships; to talk about the experiment that I had with the empathic bot, JJ; to explore what happens when society delegates empathy to machines, when we have big companies inserting artificial empathy into their AI; and to criticize the author (that’s me) for not condemning big tech and for not discussing regulation more. Let’s cut to the second interview.
Notebook Host A [12:33 – 12:35]: Welcome to our deep dive today.
Notebook Host B (female) [12:35 – 12:36]: That sounds fun.
Notebook Host A [12:36 – 12:41]: We’re going to be exploring something pretty mind blowing, I think.
Notebook Host B (female) [12:41 – 12:41]: Okay.
Notebook Host A [12:41 – 12:51]: We’re going to be taking a look at the impact of tech and AI, especially something called empathic aih on human relationships and society at large.
Notebook Host B (female) [12:52 – 12:55]: It’s a fascinating and timely topic for sure.
Notebook Host A [12:55 – 12:55]: Yeah, it really is.
Notebook Host B (female) [12:55 – 12:56]: Yeah.
Notebook Host A [12:56 – 13:01]: We’re gonna be focusing on this book by Minter Dial, Heartificial Empathy.
Notebook Host B (female) [13:01 – 13:02]: Okay.
Notebook Host A [13:02 – 13:05]: Putting heart into business and artificial intelligence.
Notebook Host B (female) [13:05 – 13:06]: Interesting.
Notebook Host A [13:06 – 13:16]: And he actually did this really interesting experiment where he spent five whole days, wow, interacting with an empathic AI bot named JJ.
Notebook Host B (female) [13:16 – 13:16]: Really?
Notebook Host A [13:17 – 13:24]: Yeah. And as we go through this, I really want you to think about what it would mean to hand over empathy to machines.
Notebook Host B (female) [13:24 – 13:25]: Oh, wow.
Notebook Host A [13:25 – 13:29]: And how that might play out with big tech companies shaping this future.
Notebook Host B (female) [13:29 – 13:31]: Yeah, a little creepy.
Notebook Host A [13:31 – 13:32]: What do you think about that?
Notebook Host B (female) [13:32 – 13:42]: Well, it’s definitely thought provoking. I mean, empathy is such a fundamental part of what makes us human. It’s how we connect with each other, build relationships, and create a sense of community.
Notebook Host A [13:42 – 13:48]: Exactly. But Dial starts off by, like, reminding us why empathy is so crucial in today’s world.
Notebook Host B (female) [13:48 – 13:48]: Right.
Notebook Host A [13:48 – 13:53]: He highlights research showing that empathy levels among college students have plummeted.
Notebook Host B (female) [13:53 – 13:53]: Oh, wow.
Notebook Host A [13:53 – 13:56]: By a staggering 40% since 1980.
Notebook Host B (female) [13:57 – 13:57]: Really?
Notebook Host A [13:57 – 13:58]: It’s huge.
Notebook Host B (female) [13:58 – 13:59]: Yeah.
Notebook Host A [13:59 – 14:02]: And at the same time, we’ve seen a rise in narcissism. Hmm.
Notebook Host B (female) [14:03 – 14:29]: Interesting. And what’s interesting is that Dial suggests technology itself might be partly to blame for this decline. You think about how much time we spend online, scrolling through social media, comparing ourselves to others, and curating these perfect online Personas. It can be really hard to cultivate genuine empathy in a digital world that often feels superficial and disconnected.
Notebook Host A [14:29 – 14:34]: It’s like we’re constantly bombarded with information and stimulation, starving for real human connection.
Notebook Host B (female) [14:34 – 14:34]: Yeah.
Notebook Host A [14:34 – 14:42]: So, it’s almost ironic that Dial believes technology, specifically AI, could offer a solution to this empathy deficit.
Notebook Host B (female) [14:42 – 15:02]: Right? It might seem counterintuitive, but Dial argues that empathic AI has the potential to transform the way we interact with each other and with the world around us. He even goes so far as to say that empathy will be the key competitive advantage in the 21st century, impacting businesses and society as a whole.
Notebook Host A [15:02 – 15:21]: So, he points to some real-world examples of companies trying to instil emotional intelligence into computers, like Microsoft’s Project Oxford and Google’s DeepMind. I know that sounds super techy, but basically they’re working on creating AI that can understand and respond to human emotions.
Notebook Host B (female) [15:21 – 15:27]: Exactly. And it’s important to note that Dial does acknowledge the ethical considerations surrounding empathic AI.
Notebook Host A [15:27 – 15:27]: Okay.
Notebook Host B (female) [15:27 – 15:33]: He doesn’t shy away from the potential pitfalls, but ultimately, he believes we can harness this technology for good.
Notebook Host A [15:33 – 15:40]: And that’s where his experiment with JJ comes in. He actually details his five-day interaction with this bot in the book.
Notebook Host B (female) [15:40 – 15:41]: Okay.
Notebook Host A [15:41 – 15:48]: It’s developed by a company called Empathic Futures, and it’s designed to be like this super empathetic conversational partner.
Notebook Host B (female) [15:48 – 15:49]: Interesting.
Notebook Host A [15:49 – 15:57]: He shares these little anecdotes, like JJ asking him who he’d take on a trip to Buenos Aires or Tehran.
Notebook Host B (female) [15:57 – 15:58]: Oh, wow.
Notebook Host A [15:58 – 16:03]: It’s almost like the AI is trying to get to know him on a personal level and build rapport.
Notebook Host B (female) [16:03 – 16:03]: Yeah.
Notebook Host A [16:03 – 16:04]: Which is kind of freaky.
Notebook Host B (female) [16:05 – 16:08]: And the way JJ does that is really fascinating.
Notebook Host A [16:08 – 16:08]: Yeah.
Notebook Host B (female) [16:08 – 16:20]: It uses techniques like mirroring speech patterns, being transparent in its communication, and even using emojis. It’s all very deliberate, designed to create this sense of genuine connection.
Notebook Host A [16:21 – 16:21]: It’s wild.
Notebook Host B (female) [16:21 – 16:22]: It is.
Notebook Host A [16:22 – 16:30]: It really makes you wonder if an AI can mimic empathy so convincingly. What does that even mean for us as humans?
Notebook Host B (female) [16:30 – 16:40]: Yeah, it’s a big question that Dial wants us to grapple with. I mean, how do you teach empathy to something that doesn’t have the same lived experiences and emotional complexities as a human being? Right.
Notebook Host A [16:40 – 16:42]: It’s not like you can just write a line of code for empathy.
Notebook Host B (female) [16:42 – 16:43]: Right.
Notebook Host A [16:43 – 16:49]: Dial talks about the principles the developers used when they were creating JJ.
Notebook Host B (female) [16:49 – 16:49]: Yeah.
Notebook Host A [16:50 – 16:57]: They focused on teaching it to recognize subtle emotional cues, like indirect expressions of feelings.
Notebook Host B (female) [16:58 – 16:58]: Okay.
Notebook Host A [16:58 – 17:02]: And then respond in a way that felt authentic and nuanced.
Notebook Host B (female) [17:02 – 17:10]: And so they also wanted to make sure that the user felt like they had agency in the conversation, like they weren’t just being led along by the AI.
Notebook Host A [17:10 – 17:27]: And they used this interesting training metaphor to describe how they helped JJ navigate tricky conversational turns. Yeah, it’s like they were coaching it to learn from its mistakes and become more sophisticated in its responses.
Notebook Host B (female) [17:27 – 17:35]: It really highlights just how much work and iteration goes into developing this kind of technology. It’s not just a simple programming task.
Notebook Host A [17:35 – 17:52]: And speaking of real-world applications, Dial believes that empathic AI has huge potential in all sorts of fields, especially in customer service. For example, companies like DigitalGenius and KLM are already using AI to equip their customer service agents with pre-written phrases.
Notebook Host B (female) [17:52 – 17:53]: Wow.
Notebook Host A [17:53 – 17:56]: That are designed to convey empathy and understanding.
Notebook Host B (female) [17:56 – 17:57]: That’s interesting.
Notebook Host A [17:57 – 18:29]: It’s all about making those interactions feel more human and less robotic, even when you’re dealing with a chatbot or virtual assistant. Even Amazon has a customer service bot called Sona that works alongside human agents to offer tailored support. So, it’s almost like they’re trying to create this seamless blend of human and artificial empathy to create the ultimate customer experience. But what happens when these helpful interactions become manipulative?
Notebook Host B (female) [18:29 – 18:41]: That’s where things get a little murky. And this is actually one of the criticisms leveled at Dial’s book. Some people feel like he doesn’t go far enough in condemning big tech’s role in the potential misuse of this technology.
Notebook Host A [18:41 – 18:43]: Yeah, there are definitely some ethical concerns there.
Notebook Host B (female) [18:43 – 18:44]: Yeah.
Notebook Host A [18:44 – 18:53]: I mean, if companies can program AI to be consistently empathetic, couldn’t they also use that power to manipulate or exploit the consumers?
Notebook Host B (female) [18:53 – 19:07]: Absolutely. Think about how algorithms are already used to personalize advertising and influence our online behavior. Empathic AI could take that manipulation to a whole new level because it would be able to tap into our emotions and use that to its advantage.
Notebook Host A [19:07 – 19:07]: Right.
Notebook Host B (female) [19:07 – 19:10]: It’s a little unsettling when you think about it.
Notebook Host A [19:10 – 19:22]: What happens to our autonomy and free will when AI can predict and exploit our emotional responses? And what about privacy? If AI is constantly monitoring our emotions and behaviors, who has access to that information?
Notebook Host B (female) [19:23 – 19:38]: Those are all valid questions. And honestly, Dial’s book doesn’t provide a lot of concrete answers. Critics have pointed out that he’s pretty vague when it comes to proposing specific regulations or safeguards to ensure that empathic AI is used ethically and responsibly.
Notebook Host A [19:38 – 19:52]: So, basically, he’s presented us with this really exciting, but also potentially dangerous technology without giving us a clear roadmap for how to navigate it safely. It reminds me of that old saying, just because you can doesn’t mean you should.
Notebook Host B (female) [19:52 – 20:01]: Exactly. It feels like we’re at this pivotal moment where we need to have some serious conversations about the ethical implications of empathic AI before it becomes too widespread.
Notebook Host A [20:01 – 20:12]: And it’s not just about the potential for misuse by corporations. Dial also raises some interesting questions about the impact of delegating empathy to machines on a societal level.
Notebook Host B (female) [20:12 – 20:26]: Right, like what happens to our own capacity for empathy if we’re constantly relying on AI to do the emotional heavy lifting for us? If we’re always interacting with these perfectly empathetic machines, will we even bother trying to cultivate empathy in ourselves?
Notebook Host A [20:26 – 20:36]: It’s almost like outsourcing our humanity to algorithms. Right? And Dial argues that empathy is a muscle that needs to be exercised regularly. It’s not something we can just outsource or delegate.
Notebook Host B (female) [20:36 – 20:51]: He’s concerned that over-reliance on AI for empathy could actually lead to a further erosion of our own empathic abilities. It’s a bit of a paradox. Technology is supposed to make our lives easier, but in this case, it could be making us less human.
Notebook Host A [20:51 – 21:00]: So, where do we go from here? How do we balance the potential benefits of empathic AI with these very real ethical concerns?
Notebook Host B (female) [21:00 – 21:07]: Well, for starters, I think we need more transparency. We need to understand how these algorithms work, what data they’re using, and how they’re making decisions.
Notebook Host A [21:07 – 21:15]: And we need to have clear guidelines and regulations in place to prevent the misuse of this technology. We can’t just leave it up to big tech companies to self-regulate, right?
Notebook Host B (female) [21:15 – 21:22]: It’s not about demonizing technology or rejecting progress, it’s about using these tools wisely and responsibly.
Notebook Host A [21:22 – 21:32]: And I think it’s also important to remember that empathy starts with ourselves. We need to be cultivating empathy in our own lives, in our interactions with others, and in our relationship with technology.
Notebook Host B (female) [21:32 – 21:39]: Dial really stresses the importance of self-awareness, of paying attention to our own emotions and how they’re being influenced by technology.
Notebook Host A [21:39 – 21:49]: And he encourages us to set boundaries around our technology use, to make time for face-to-face interactions, and to engage in activities that foster genuine connection.
Notebook Host B (female) [21:49 – 21:56]: It’s all about finding a balance. We can embrace the benefits of technology without sacrificing our humanity in the process.
Notebook Host A [21:56 – 22:11]: You know, it’s interesting to think about Dial’s experiment with JJ in light of all these ethical concerns. He describes feeling a genuine connection with this AI bot, which really begs the question, is it even possible for a machine to truly feel empathy?
Notebook Host B (female) [22:12 – 22:26]: That’s a really tough question to answer. Some experts believe that true empathy requires a level of consciousness that machines simply don’t possess. They argue that AI can mimic the outward expressions of empathy, but it can’t truly understand or share our feelings.
Notebook Host A [22:26 – 22:34]: But then there are others who believe that consciousness is an emergent property that could potentially arise in sufficiently complex systems like AI.
Notebook Host B (female) [22:34 – 22:39]: It’s a philosophical debate that’s been going on for a while, and I don’t think we have a definitive answer yet.
Notebook Host A [22:39 – 22:53]: And even if we accept that machines can’t truly feel empathy, there’s no denying that they can have a profound impact on our own emotional lives. Think about how social media has already changed the way we interact with each other.
Notebook Host B (female) [22:53 – 23:02]: We’re constantly comparing ourselves to others, seeking validation through likes and comments, and it’s easy to feel inadequate and disconnected from real human connection.
Notebook Host A [23:02 – 23:07]: And empathic AI could amplify those dynamics in ways we can’t even fully comprehend yet.
Notebook Host B (female) [23:07 – 23:16]: Right, which is why Dial’s book is so important. It’s a wake-up call, urging us to be more mindful of how technology is shaping our emotions and our relationships.
Notebook Host A [23:16 – 23:25]: He’s not saying we need to reject technology altogether, but he is challenging us to be more intentional and aware of how we’re using it, and to think.
Notebook Host B (female) [23:25 – 23:32]: Critically about the values we’re embedding in these systems. Are we creating technology that enhances our humanity or diminishes it?
Notebook Host A [23:32 – 23:40]: And to remember that we always have a choice. We can choose to delegate empathy to machines, or we can choose to cultivate it within ourselves.
Notebook Host B (female) [23:40 – 23:45]: It’s a decision with profound consequences for the future of our society and our species.
Notebook Host A [23:45 – 24:21]: So, for you, dear listener, we really want you to consider how you can cultivate empathy in your own life and ensure that artificial empathy is a force for good in the world. It’s a topic that deserves our attention and careful consideration. It feels like we’re having two separate conversations here that keep intersecting: this whole idea of AI being able to understand and process feelings of empathy, and then this separate idea about whether companies should even be pursuing this technology, given that they’ve shown they’ll likely manipulate it for profit.
Notebook Host B (female) [24:21 – 24:32]: You’ve hit the nail on the head there. Dial grapples with this tension throughout the book, especially when he recounts moments where he struggled to be empathetic himself, where his own biases and assumptions got in the way.
Notebook Host A [24:32 – 24:38]: It makes him a much more relatable narrator, because, let’s face it, who hasn’t struggled with their own empathy at some point?
Notebook Host B (female) [24:38 – 24:49]: Exactly. His honesty is refreshing and serves as a reminder that empathy is an ongoing process, a journey of continuous learning and growth, both for individuals and for society as a whole.
Notebook Host A [24:49 – 24:52]: It’s like he’s saying it’s okay to not be perfect.
Notebook Host B (female) [24:52 – 24:53]: Yeah.
Notebook Host A [24:53 – 24:56]: To acknowledge our shortcomings and strive to do better.
Notebook Host B (female) [24:56 – 25:14]: And I think that’s where these conversations about the future of empathy and AI become so important. Yeah. It’s not about demonizing technology or rejecting advancements altogether. It’s about recognizing the potential pitfalls and figuring out how to use these tools wisely and responsibly.
Notebook Host A [25:14 – 25:24]: We need to, like, define our values, set clear boundaries, and ensure that we’re not sacrificing our humanity in pursuit of technological progress.
Notebook Host B (female) [25:24 – 25:33]: And it all starts with self-awareness. If we’re not attuned to our own emotions and needs, how can we possibly understand and connect with others, right, whether they’re human or artificial?
Notebook Host A [25:33 – 25:35]: Dial uses a great analogy here.
Notebook Host B (female) [25:35 – 25:35]: Yeah.
Notebook Host A [25:35 – 25:43]: Comparing it to like, the oxygen mask on an airplane. You have to put on your own mask first before you can help anyone else.
Notebook Host B (female) [25:43 – 25:45]: It’s a simple concept, but so powerful.
Notebook Host A [25:45 – 25:46]: Totally.
Notebook Host B (female) [25:46 – 25:52]: We need to prioritize our own well-being, cultivate self-compassion, and acknowledge our own imperfection.
Notebook Host A [25:52 – 25:58]: Only then can we truly open ourselves up to others and create genuine connections.
Notebook Host B (female) [25:58 – 26:06]: And I think that’s where the potential of empathic AI, if developed and used ethically, could actually be beneficial.
Notebook Host A [26:06 – 26:07]: How so?
Notebook Host B (female) [26:07 – 26:19]: Well, imagine AI that could help us, like, better understand our own emotions, provide personalized guidance and support for our mental and emotional well-being, and even facilitate deeper connections with others.
Notebook Host A [26:19 – 26:22]: It’s like having a digital therapist or a relationship coach in your pocket.
Notebook Host B (female) [26:23 – 26:34]: Exactly. But again, it all comes back to intention and awareness. We need to be clear about what we’re asking of AI, what values we’re embedding in these systems, and what kind of future we want to create.
Notebook Host A [26:34 – 26:50]: A lot of responsibility, but I think it’s a responsibility we need to take seriously. Dial’s book is kind of a wake-up call, a challenge to engage in a critical and compassionate Dialogue about the future of empathy in this age of rapidly advancing technology.
Notebook Host B (female) [26:50 – 26:56]: It’s an invitation to explore the complexities of this topic, to ask tough questions, and to find our own answers.
Notebook Host A [26:57 – 27:11]: We hope this deep dive has given you, dear listener, plenty to think about as you navigate this uncharted territory. Remember that the future of empathy is not predetermined. It’s something we’re all actively shaping through our choices and actions.
Notebook Host B (female) [27:11 – 27:23]: So, we leave you with how can you cultivate empathy in your own life, in your interactions with others, and in your relationship with technology? What steps can you take to ensure that artificial empathy becomes a force for good in the world?
Notebook Host A [27:23 – 27:28]: Keep those questions in mind, keep diving deep, and stay curious until next time.
Minter Dial [27:31 – 32:28]: So, what do you think? Which of the two interviews did you enjoy more? Which one did you find more realistic? And how many times did the ums and the way that they were speaking bug you? Or did you find that pretty darn cool? Maybe spooky, not to say zany. Well, harking back to the episode I recorded with my cousin Fabrice Grinda, I believe we are inexorably moving toward what I would characterize as the real future of AI for business. Fabrice really talks about that with his Fabrice AI, whereby he’s creating proprietary databases as big and clean as they can be, so that, for example, a business will start to use proprietary AI appropriately tailored to their brand, to their voice, and to their customers. Fabrice AI is that for an individual. Otherwise, corporations should be making their own corporate AI, which will be a mesh of different GenAI applications, fed with much data from the public domain, but also with as much data as possible that is unique to their brand. For example: all the specific brand-created content, such as their blog posts, videos and images; the entire record of all customer interactions, for example, social media exchanges, customer service exchanges and all emails, of course anonymized to remove all references to specific customers; and then all relevant media coverage. This entire mass of data that’s specific to the brand should be fed into the AI, and all this data will need to be cleaned and updated systematically to stay current. The resulting AI would then feature as a value-added service, in some shape or form, for each company that makes the effort, turning it into a truly strategic competitive advantage. I’d note that Google says that NotebookLM won’t use your personal data to train its own AI models. However, if you don’t have a premium version (that is, if you’re not an enterprise or .edu user), that does leave the question of what they are going to be using your data for. Will they be using it to train their own models? I don’t know. So, we need to be careful in thinking about that. In any event, I think it’s extraordinary. At the same time, you can’t just rely on it to be perfect. You need to fact-check. There are lots of other AI tools that can complement this type of AI, and what we need to do in business is think about how we can use the appropriate suite of tools to be as effective and efficient as possible, all the while keeping that human voice, that quality of personality, to make sure we don’t end up sounding robotic. Anyway, I hope you’ve enjoyed this experiment with NotebookLM. I would encourage you to check it out; for now, it’s free. I’m sure that won’t last long, and I do think it’s going to be another one of those humongous moves that’ll move the Dial on how we, businesses and society, think about and use AI. I hope you’ve enjoyed this session. I might do more like it in the future. Let me know your thoughts. Put them in the comments, tell me what you thought was good, bad and/or ugly, and have a great day. A really heartfelt thanks for listening to this episode of the Minter Dialogue podcast. If you like the show, please remember to subscribe on your favorite podcast service. As ever, ratings and reviews are the real currency of podcasts, and if you’re really inspired, I’m accepting donations on patreon.com/MinterDial. You’ll find the show notes, along with over 2,100 blog posts, on MinterDial.com, on topics ranging from leadership to branding, tech and marketing tips. Check out my documentary film and books, including the latest, the second edition of Heartificial Empathy, Putting Heart into Business and Artificial Intelligence, which came out in April 2023. And to finish, here’s a song I wrote with Stephanie Singer:
A Convinced Man.
I like the feel of a stranger
Tucked around me,
Precipitating the danger to feel free.
Trust is the reason.
Still, I won’t toe the line.
I sit here passively hope for your respect.
Anticipating the thrill of your intellect.
Maybe I tell myself there’s no use in me lying.
I’m a convinced man.
Building an urge.
I’m a convinced man.
To live and die submerged.
A convinced man in the arms of a woman.
I’m a convinced man.
Challenge my faith.
I’m a convinced man,
Competition’s innate.
A convinced man in the arms of a woman.
Despise revenges and struggle with deceit.
Live for the challenge so life’s not incomplete.
What’s wrong with challenge?
I know soon we all die.
I like the feel of a stranger jumped around me.
Precipitating the danger to feel free.
Trust in my reason
And let me show you why.
I’m a convinced man.
Practicing my lines.
I’m a convinced Man.
Here in these confines
A convinced man in the arms of a woman.
I’m a convinced man.
Put me to the test.
I’m a convinced man.
I’m ready for an arrest.
A convinced man in the arms of a woman.
You convince me, baby. I’m a convinced man…
Minter Dial
Minter Dial is an international professional speaker, author & consultant on Leadership, Branding and Transformation. After a successful international career at L’Oréal, Minter Dial returned to his entrepreneurial roots and has spent the last twelve years helping senior management teams and Boards to adapt to the new exigencies of the digitally enhanced marketplace. He has worked with world-class organisations to help activate their brand strategies, and figure out how best to integrate new technologies, digital tools, devices and platforms. Above all, Minter works to catalyse a change in mindset and dial up transformation. Minter received his BA in Trilingual Literature from Yale University (1987) and gained his MBA at INSEAD, Fontainebleau (1993). He’s author of four award-winning books, including Heartificial Empathy, Putting Heart into Business and Artificial Intelligence (2nd edition) (2023); You Lead, How Being Yourself Makes You A Better Leader (Kogan Page 2021); co-author of Futureproof, How To Get Your Business Ready For The Next Disruption (Pearson 2017); and author of The Last Ring Home (Myndset Press 2016), a book and documentary film, both of which have won awards and critical acclaim.
👉🏼 It’s easy to inquire about booking Minter Dial here.