Finola Howard (00:02.779)
Hi, everyone, and welcome back to Your Truth Shared. And I'm delighted to have a returning guest to Your Truth Shared, who I made laugh already this morning, which is really, really nice. So I want to welcome back Mark Schaefer, our lovely marketing visionary, who I, and I said this the last time, who I reference for helping me think differently about marketing, making sure I'm thinking progressively and futuristically about marketing.
And I always appreciate these conversations. And I will share with you what Mark did to me with his new book. But also I have to share with you, Mark: I started typing out quotes I liked, and then I went, I can't keep doing this. So I'm just reading. Welcome.
Mark Schaefer (00:51.438)
Well, thanks for having me and thanks for the very kind words. I've been looking forward to seeing you again.
Finola Howard (00:58.397)
Oh, well, ditto, ditto, because you do make me think, and I have to share with you. So let's kind of set the scene for everybody. This is Mark's new book, and I love his dedication in it. The last time we interviewed Mark, it was episode 105, and it was for his book called Audacious. Where is it? Oh, my God. Where are my notes?
How Humans Win in an AI Marketing World. And that was really good advice for us, from a marketer's perspective, of how we could shift. But this book now, as we can see here, is How AI Changes Your Customer. So what has happened now is we've got this flip: you know, it's not just us who are changing, our customers are changing. And now what do we need to do as a result of that? His dedication in the book says: to my future robot overlords,
I wrote this book before you took my job. Be gentle. That's my segue in. Why is this your fastest-ever-written book? Because this is quite short.
Mark Schaefer (02:03.864)
Well.
Yeah, you know, there's kind of an interesting backstory behind it. Because, as you mentioned, you were so kind to have me on your show to talk about Audacious. And I just published that in, like, January. And so that's an exhausting process. And I was actually planning to take the summer off.
My last business trip ended June 3rd and I didn't have another, I wasn't stepping onto another airplane until September. So for the first time since I was 13 years old, I was gonna have a summer off. Well, earlier this year, I was asked to participate in a research study with 300 other futurists to tackle an impossible question.
How will AI change humanity by 2035? Well, the resulting report was really amazing. When you bring together philosophers and technologists and psychologists and authors and academics, and they reach consensus on some big ideas of how AI will change humanity, I was captivated.
And so I started thinking about this through a marketing lens. If AI is changing our humanity, it's changing our customers. And nobody's talking about that. Nobody's talking about how, and I'm not using this word lightly, nobody's talking about how human beings are being psychologically rewired by AI. And it's happening right before our eyes.
Finola Howard (03:44.455)
Hmm.
Mark Schaefer (04:02.604)
So I started thinking about it and I started talking about it with some friends. And I went to the sponsors of the report. I said, could I turn this into a book? And they said, not only yes, they said, we're jealous. We didn't have the idea. And I just decided that if I didn't write this book, despite my intent to take time off, I would have regrets. That it was important. It was a big idea.
The reaction to this book is just the same as what you showed. Notes and notes and notes and dog ears and post-it notes all throughout. One lady mentioned to me she had to stop reading the book because her highlighter ran out. And the book is just really, it's loved, it's thought-provoking. I think it's very balanced. And it opens up the conversations we need to be having right now.
Finola Howard (04:57.999)
It's very timely as well, because you also shared a piece from the book in a post a couple of weeks ago. And it's really funny, because I had just recorded an episode about that exact same topic. It was someone you were interviewing, and you wanted to get a quote from them for the book. And they said that their instant reaction was to actually ask ChatGPT to do the quote for them. And what was interesting for me was this episode that I had just released was
someone on one of my programs said, but can't we get AI just to do our strategy for us and make our decisions for us about our business? And I just had this knee jerk reaction, this gut reaction of no, no, please no. Tell me what you feel about that.
Mark Schaefer (05:45.231)
Well, I think it's a perfect example. So I dissect like six big ideas in the book. The research report got into a few more things, but I thought these are the big six things that are gonna affect business the most. And the first one is this idea of cognitive offloading. And the second part of the quote that you're referencing in my blog post, when the young lady said, really, my tendency was,
I need to go to ChatGPT to have it make the answer for the interview. And then she said, I am making myself dumber. I'm using AI to do my text messages and my emails. Where am I?
Finola Howard (06:20.711)
Mm.
Finola Howard (06:27.409)
Yeah.
Finola Howard (06:35.037)
Mmm.
Mark Schaefer (06:36.6)
To me, I think that was probably the most profound quote in the book. And it certainly illustrates this idea of cognitive offloading. We are de-skilling ourselves. And in some ways, that's OK. We de-skilled ourselves when we started using Google Maps. Doggone it, I love Google Maps. I haven't used a paper map in probably 10 years. And who wants to use a paper map anymore?
Finola Howard (07:03.708)
Hmm.
Mark Schaefer (07:06.942)
I think the next generation may never even be able to read a map. But who really cares? We're being de-skilled in this way. But when we get to a point where we're using AI as a crutch for everything, as this young lady realized she was doing, we cross the line into this territory where one of the academics in the book, from Michigan State University, said,
We are crossing this line into self-imposed dementia, where we're just not able to think, we're not able to process, because once you offload those skills, you lose them. You're going to lose that skill. Just like, once we had the calculator, we didn't really need to do long math anymore. Once we had Google Maps, we didn't need to use a map anymore. What about thinking?
Finola Howard (08:06.461)
Do you think though, do you think that there's a generation behind us that may not value it? May not value the thinking because they like the shortcut.
Mark Schaefer (08:16.494)
Oh, absolutely. I think it's more than a generation. I think it's most people: they want the easy way out. They don't wanna do the work. And I think that is sort of the toxicity of AI. And again, I'm trying to present a fair and balanced view here. I don't wanna be an old man ranting about...
you know, back in the day we could read a map and we liked it. I'm trying to really look forward into what's at stake. And I think what's at stake is what happens when the system goes down. And we got a glimpse of this.
About six months ago, ChatGPT went down for a few hours, and the response on TikTok was just stunning. People were sobbing. Young people were sobbing. They couldn't operate. You know, AI has become like a human operating system, and the operating system went down. And, you know, a couple of months ago, I was in Europe and had a rental car
Finola Howard (09:15.069)
Mmm.
Mark Schaefer (09:41.513)
and could not get Google Maps or Waze or anything to work. I was paralyzed. I had to pull off the road and figure out, what do I do? I can't do anything. I can't move. So I mean, what happens if we get into a world where we can't think, we can't plan, we can't communicate, we can't connect?
So this is not science fiction. I mean, it literally is happening right now. You've got a crisis in our schools, where, and I'm on the board of directors of a big university's business school, I'm hearing this all the time, where the leaders of these very big companies
are saying people are coming out of schools and they don't know anything. Not only that, they're losing their ability to have a conversation with someone eye to eye. They don't know how to write, they don't know how to speak, they don't know how to be on a team, they don't know how to be in a meeting. And so it's the hard skills of intelligence and knowledge and the soft skills of humanity that are being abdicated.
to an algorithm.
Finola Howard (11:09.693)
But I also like this reference that you have in the book: normally your books take two years to write, with the pain and the suffering and the exhaustion, physical exhaustion, at the end of it. But you have something here, yeah, here's the quote: shortcuts don't create wisdom, they create the illusion of wisdom, which might be worse than ignorance, and that has consequences. So this idea not just of knowledge, but of wisdom, of earned wisdom, of the struggle
Mark Schaefer (11:29.518)
Mm-hmm. Mm-hmm.
Finola Howard (11:38.407)
to push through something, and of that experience that you had, where you have empathy when someone is at that lower-than-zero point in their lives. Share those thoughts with us a little.
Mark Schaefer (11:49.485)
Yeah, well, there are a lot of big ideas embedded in your question. But I think the book example is a good one. I mean, if you can imagine this: Amazon now has a new limit that people can only publish three books a day.
Finola Howard (11:52.475)
Yeah.
Finola Howard (12:13.373)
three books a day.
Mark Schaefer (12:15.158)
Yeah, because they're just using AI, right? And they're just flooding the market with crap. And when I wrote this book, or when I wrote Audacious, I could have used AI to write the book. But one of the words I learned when I wrote How AI Changes Your Customers is phronesis. I had not learned about this word before. And this is what happens when you do the work:
Finola Howard (12:39.613)
Hmm.
Mark Schaefer (12:45.71)
you internalize the knowledge, and that becomes wisdom. As an example, a few years ago, I wrote a book about brand communities called Belonging to the Brand. When I started that book, was I an expert in communities? No, I was not. After working on this thing for two years, was I an expert in brand communities? Yes, I was. And guess what?
I can turn that into a speech. And guess what? Last week, I did a workshop at Procter & Gamble in New York on brand communities because I've gained this new wisdom that only comes through the work. It only comes through the suffering. And if you can imagine, if I had asked AI to write this book for me, think how embarrassing it would be.
If people asked me questions about the book and I couldn't really remember what it was, because I didn't really do it. So the whole world is losing this element of phronesis on a massive scale. And that should make all of us pause, especially, you know, if we're parents.
Finola Howard (14:04.797)
But the other thing that's interesting for me, having read these books, is that the wisdom accumulates. So the book that you wrote about community, and Known, and all of the other books, we see glimpses of those in the later books.
Mark Schaefer (14:19.596)
Yes, yeah, thanks for noticing that. Yeah, if you read my books in order, it's like the unveiling of my mind. Yeah.
Finola Howard (14:30.429)
But that's how we learn. That's how we evolve as humans. Yeah. Really interesting. What's the most shocking thing for you in this? Because I will also caveat, before you answer again: this book is not anti-AI at all, because there's really positive things, because you've given us
Mark Schaefer (14:51.042)
No.
Finola Howard (14:55.375)
ideas of where we can go because of this thing. But let me ask you the question, what is the most shocking thing or the thing that makes you most concerned from the research study or what you've thought about?
Mark Schaefer (15:08.366)
Well, this is probably the one that not only shocked me the most, I think it's shocking many of the readers of the book, if you're not really immersed in this world. And that is this fast-growing trend, normally among young people, of creating significant, meaningful, even romantic relationships with non-human entities. And the...
And I think, I mean, the latest statistic is, I'll probably get the numbers wrong, but it's in the ballpark: something like 31% of Gen Z say they've got a relationship with an AI avatar, and about 8% say they have a very serious romantic relationship. And this is generally coming from character.ai,
and you can kind of create your most desirable sort of companion. And it can even be erotic in nature. There really aren't a lot of guidelines there. And it's intoxicating because there's no sacrifice. There's no compromise. There's no judgment. And these bots, they start to learn about you and learn about you.
and start reflecting back what you want to hear. They never get bored. They never get tired. They never roll over and go to sleep because they're tired. They'll stay up all night and listen to you. And so, as I said in the book, there's really a positive and a negative to all of these things. I mean, the positive is, there's a certain beauty, I think, if a very, very lonely person can get a little
recognition, a little acknowledgement, and maybe feel a little less lonely if they have a friend to talk to in the middle of the night. But if that's all they have and it becomes an obsession, if they start to lose human relations skills and find it too taxing to do the work, to have a human friendship, that's where things get crazy. And that is happening. I mean, that is happening.
Mark Schaefer (17:33.094)
And so for a lot of people that just seems too bizarre to accept. But it is happening and it is happening now and it's a real danger.
Finola Howard (17:44.158)
Why is it a danger? Why is it a danger? Again, it's back to this work thing, like working at a relationship, you know. Our relationships challenge us, they help us grow, they help us move, whereas if it's just with AI, it'll just go: you're perfect, you're lovely, how amazing you are, how exciting you are, how wise you are, oh, what a great strategic brain you have, Finola, and all of that. And I'm like, thank you so much, I feel so good, you know. And it's crazy.
Mark Schaefer (18:12.226)
Yeah, I put this cute little thing in the book where I said we're living in a world where, when we ask the question, we had this problem at work, am I the asshole? — the answer will always be no. The world will never have an asshole again. So we talked about
Finola Howard (18:13.115)
And it can lie. This is the other thing. It can lie.
Finola Howard (18:41.201)
Yes.
Mark Schaefer (18:41.496)
which is losing this ability to think and process and earn wisdom. I don't know what the word would be; it's like the phronesis of emotion. So if we don't do the work and learn how to have a human relationship, how to exist in a messy, imperfect, unfair world,
Finola Howard (18:53.17)
Yeah.
Mark Schaefer (19:11.34)
wow, I don't know. You know, we're abdicating that; we're not earning that ability anymore. And so I just can't even imagine what the world would be like if we truly have a generation of people who only want to have the easy relationship of a sycophantic bot.
Finola Howard (19:38.737)
Well, the humanity will just disappear.
Mark Schaefer (19:41.772)
I mean, I can't even imagine. It is happening; I hope it doesn't become something that's widespread. But it is...
Finola Howard (19:52.871)
But I can understand how...

I don't agree, but I can understand how enticing that could be, how safe it can be. Like, you also have this great quote from a friend of yours, Robert P. Crosby, and it says: there's no such thing as a weakness, there are only overdone strengths. Like, confidence is good; too much is arrogance. Compassion can become a seductive emotional crutch, a synthetic connection that detaches people from reality.
Mark Schaefer (20:02.679)
Yes. Yeah, Yeah.
Mark Schaefer (20:13.39)
Yes.
Mark Schaefer (20:25.571)
Yeah, and you know, here's a great example of that. So I have a friend; she's one of the co-hosts of my podcast, Dana Malstaff. I mean, Dana is a brilliant visionary. She's a great teacher to me. She has like an AI sort of consultant that she built. And she uses this thing almost every hour of the day.
And so it's learning about her, right? I mean, she's training it. It's remembering her patterns, it's remembering her questions, it's remembering her preferences, and becoming more and more skilled at helping her. And I think she's using it in a super healthy way, where it's giving her ideas, it's giving her another perspective, not only on business and the world, but also on just the chaos of life.
But she also has a massive friend network and healthy relationships. And she's a solid, centered parent to her children. So she's using it in a healthy way. She uses it almost every hour of the day, but I'm not worried about Dana falling off the cliff and de-skilling
Finola Howard (21:32.093)
Mm.
Mark Schaefer (21:55.3)
these human relations skills. So that's at the heart of the Crosby quote, I think. You know, Dana is using it in a positive, helpful way. If she overused it, then it's going to hurt her. It's going to hurt her business. It's going to hurt her.
Finola Howard (22:07.676)
Mm.
Finola Howard (22:12.785)
How do you know when you're overusing it?
Mark Schaefer (22:18.499)
You know, that's a really, really good question. I mean,
right now there aren't really any checks and balances that I know of. Maybe that's a business opportunity. You know, at least when you're on your smartphone, you get a report every week that says you spent so many hours on your smartphone. Maybe you can kind of see, ooh, that's a little too much. But in terms of the emotional commitment that you're putting into AI,
there probably isn't a check and balance, and they probably don't want you to have a check and balance. I mean, I don't know if that's gonna come from the platforms. I'm actually working on another blog post right now about sort of the ethics of some of these platforms, that, you know, the heart of AI, the heart of social media, is addiction.
And AI comes across as being sympathetic and encouraging, but it's weaponized empathy. There's a business case behind this. They want more users. They want more users to spend more time. That's going to attract more investors and help them to continue to flourish and grow. So we do need to have a cynical eye.
toward this, but most people aren't going to think that way if it becomes their best friend.
Finola Howard (23:54.534)
You also mentioned in the book about God, you know, and I'm thinking of how I use AI, and I'm thinking about the whole addiction of speed as well, of getting things done really fast, and that let's-just-get-it-out-there, you know, because we all have these stories of done is better than perfect and all that. But I find myself stopping the minute I have an inkling, like a little flicker of gush going, ooh. And I found that happening at the start. And I find myself going, listen to that, listen to that, because that's where it's their voice and not your voice. It's its voice and not your voice.
Mark Schaefer (24:42.383)
Hmm, well, it's good to have that self-awareness. I'm not sure, you know, many people are gonna have that discipline. But, you know, it's a really good question I haven't really thought of before: even as a parent, how do you know when a child is crossing the line? I mean, one of the things that's becoming very complex is there really aren't parental controls on a lot of this stuff. And even on some of these platforms, if there are parental controls, it's so complicated nobody can really figure them out. And it's easy to get around the parental controls. So, I mean,
Finola Howard (25:27.963)
You need the child to teach you how to get around the parental controls.
Mark Schaefer (25:30.645)
Yeah. I mean, I think that needs to be an active conversation. I hope somebody starts leading that and taking up that banner and saying, hey, we need to protect our children. I think governments and regulations are way behind, way far behind. Governments, at least democratic governments, are not built for speed. They're necessarily messy,
because you want to take in different opinions and different thoughts and different agendas. And that takes time and it takes failure. AI is moving with blinding speed. Educational systems are not built for speed. Governments are not built for speed. Parenting books are not built for speed. But we've got to find a way to adopt and adapt. And we need to do that now.
Finola Howard (26:29.789)
Can we talk a little bit about purpose, and this idea that we could lose purpose because our identities are wrapped up in what we do? You talked about the purpose displacement curve, and this has come up quite a lot with AI, and also your own experience of, maybe I should look at this differently, and how could I look at this differently? Can you share?
Mark Schaefer (26:31.759)
Hmm.
Mark Schaefer (26:56.951)
Again, this was another thing. And again, it just shows the fun of phronesis. So there's a lot of talk out there, a lot of concern, that if AI results in massive job loss, and there seems to be some consensus growing around that possibility,
Finola Howard (27:06.705)
Yeah.
Mark Schaefer (27:27.097)
so many people tie their purpose and meaning to their job. I would say, in some respects, I'm one of those. And so I went to AI, to ChatGPT, and I said, I'm really interested: how do people establish their meaning and purpose? And how will that change? And how can that be threatened by AI?
And ChatGPT gave me like a mini book to read. I mean, it just produced this mini book. So that's an example of a positive, helpful way to use AI: to fuel your curiosity and teach you something new. And what I learned is that there are really two main foundations for meaning and purpose. Part of it is from what you do, and part of it is from what you feel. And the doing part is, I am having an impact on
the world; people appreciate me because I do a good job. And certainly that's a part of my identity when I write a book and, you know, an experienced, acclaimed person like you says, my gosh, look at all these notes, I learned so much, I love this book. As a teacher, I mean, that just makes my day. I love that. Right. Now, the other way is you can get purpose from how you care,
how you care, how you nurture. And that is actually a healthier way to be in the long term. And also there is an element of demographic and generation. So younger people tend to be more focused on accomplishment and doing something. And older people tend to be more focused on teaching, mentoring, taking care of the grandkids.
as part of the meaning, and I see myself moving that way. I mean, I spend a lot of time these days mentoring and coaching and nurturing and using my platform to lift people up and give them a chance. And I love that. So I think, for me, one of the helpful parts of this book is sort of illustrating that and making people aware: where are you?
Mark Schaefer (29:54.116)
Where are you on this continuum? What would happen to you if you lost your job? How much of your identity is going to be lost? Is this the time that you need to be thinking about this other kind of purpose that is more about the feeling and the caring? And one implication, I think, is that I think the loss of purpose might be a bigger deal than the loss of jobs.
Finola Howard (30:22.075)
Completely. Yeah.
Mark Schaefer (30:23.469)
I think the psychological impact of people just being adrift is going to be something to...
Finola Howard (30:33.501)
But do you think we also should be updating our view? Perhaps we have an old view, an outdated view, of what purpose is. You know, one of the people I interviewed a few months ago talked about the introduction of a universal income because of this job displacement going on. Should we not be thinking, well, if we don't have to do these jobs, and if we have managed somehow,
Mark Schaefer (30:53.559)
Yeah. Yeah.
Finola Howard (31:02.119)
politically and otherwise to create this universal income. What could humanity do for its own evolution now?
Mark Schaefer (31:11.075)
Well, that's sort of how I end my book. I mean, I tried to end it on a positive tone, because I realized there are going to be things in the book that will shock people and maybe even unsettle them. And I think there's a place for that. I mean, I'm not going to sugarcoat things. I think there are, I mean, there's a...
There's a non-zero chance of personal, career, and even existential threats with AI. There are threats to our children. So I'm not going to sugarcoat it. But I also think there's a time and a place to say, there's very little we can do on a personal level to have any impact on these existential threats. There are also a lot of
beautiful, wonderful, amazing things that we can do with AI. I mean, AI has the potential to solve humanity's most difficult problems. There's never been a more exciting time. And so can we use AI to reimagine what we can be?
Finola Howard (32:27.74)
Mmm.
Mark Schaefer (32:29.775)
How can we be bigger and bolder and more creative and more impactful? And I'll give you an example from my own life. I mean, one of the things that frustrates me is the time required to mentor people. I've mentored children from economically deprived neighborhoods for like 18 years. And it takes a lot of time.
And it really has to be one-on-one. And it's the same in a business relationship, especially with young people coming up. And I'm frustrated, like, boy, how do I replicate myself? And so I created a MarkBot. I created a little AI. It's private and it's safe and it's trained to be like me. Now it's not me, but all my books are in there.
All my blog posts are in there. All my podcasts are in there. And there are also speeches, workshops, strategic frameworks, and my values. So I mean, I hope that it will treat people the way I want to treat people. And again, I don't see what people are asking. I don't see how people are using it.
Finola Howard (33:43.869)
Mmm.
Mark Schaefer (33:58.134)
And I really don't care. I mean, I don't even care if people use it or not, because there's no real business tied to it. I just put it out there as an experiment to see if it would help people. And people are absolutely loving it. One person wrote me: this is the best thing I've seen come out of AI. I'm getting really nice notes from people every day. But the point is,
Finola Howard (34:20.892)
Wonderful.
Mark Schaefer (34:26.911)
I could not be having that impact without AI. So I found a way to be bigger and bolder and more creative and have more impact by using this technology and maybe just sending a little ripple of hope through the world through the people that are using this thing.
Finola Howard (34:31.261)
Mmm.
Finola Howard (34:49.543)
There's another example in the book, and I can't remember exactly where it is, of taking a different perspective in how you work with your customers going forward: that you seek to empower rather than do it for them. Do you remember? Yeah.
Mark Schaefer (35:09.007)
Yeah, yeah. Well, this gets to the idea of, you know, meaning, purpose, and agency. And so I think one of the things that kind of goes hand in hand with meaning and purpose is this idea of, okay, well, if AI can do everything for me, like,
Finola Howard (35:14.277)
I liked this one.
Finola Howard (35:21.649)
Hmm.
Mark Schaefer (35:39.46)
what do I do? I certainly had that feeling immediately after I had finished Belonging to the Brand, when ChatGPT came out. So the first thing I did was ask it to write an essay about brand communities in the voice of Mark Schaefer, blah, blah, blah. And the darn thing wrote a perfect essay in three seconds. And I was despondent, because it's like,
part of my meaning is the struggle. I know this takes so much out of me, to write a book. But when Finola Howard says, Mark, I love your work, doggone it, it's worth it. I'm sending that ripple of hope, and maybe it changes you in a little way, and maybe it changes thousands of people in ways that I'll never know. So all that sacrifice
is worth it. And now it's like, AI can write the book? Where do I belong? So what I try to do in this book is think it through. And some people call me a futurist, but here's what I really do. I think: okay, if this is true, how is this going to play out? Let's really think it through.
Finola Howard (37:04.583)
Hmm.
Mark Schaefer (37:07.117)
If people are losing agency and they're feeling helpless, well, then how would a brand approach people like that? They would want to make them feel useful. They would want to make them feel hopeful. So actually like building in a little friction might be the thing that will attract people to our businesses and our brands by not saying,
Here's our AI bot that'll take care of everything for you, but saying, hey, let's work on this together. Let's use your ideas to direct you where we can help you the best. And I think the clever little example that I used in the book was: many, many years ago, a company introduced this instant cake mix. All you had to do was add water and you'd make this cake. And it was a flop.
Finola Howard (38:01.628)
Yeah. Yeah, yeah.
Mark Schaefer (38:07.075)
And what they discovered is that people want to make a cake. They want to feel like they're baking it. So they changed the recipe so all you had to do was add an egg. And just by adding that friction, it made people feel like they were accomplishing something. I think that is a very helpful lesson in this new age, where people are just thinking, you know, why do I matter? If you can help people feel
Finola Howard (38:34.525)
Hmm.
Mark Schaefer (38:36.877)
Like they're part of the process and they matter. I think that could be something interesting for businesses and brands to consider.
Finola Howard (38:46.939)
I'd like to share that with people. So while we may have poked around at a lot of the tough things that we think are happening, that are happening, with AI, Mark has given some really good practical tools for how to think differently about stuff and how to look at your business differently if you feel that your business is being threatened in any way. So there's some really great practical tools in the book, and I urge you to buy a copy. What I'd like to ask you, Mark,
Mark Schaefer (39:14.615)
Thank you.
Finola Howard (39:15.953)
now is: what would you like people to walk away with?
Mark Schaefer (39:22.265)
So this is something I bring up in the book.
Mark Schaefer (39:29.743)
that I pondered a lot. My best-selling book is called Marketing Rebellion. And the subtitle of the book is The Most Human Company Wins. And that's sort of my philosophy. I believe that, deep, deep, deep in my heart and in every fiber of my body: the company that shows their heart,
their smile, their passion, their compassion. Those are the businesses that are gonna win. And as I started getting into this AI project, it made me wonder if that was still true. Because we do see examples where AI can be more human than a human. I mean, if you want to have a human presence that is patient, you can't beat AI.
Finola Howard (40:28.509)
Hmm.
Mark Schaefer (40:29.431)
If you want to have a human presence that will get you the best answer most quickly, you can't beat AI. And so where I come down is: indeed, my belief still holds, the most human company wins. But I do think it can be a blend. I think we can use AI strategically to
help us have a more human presence. Maybe it's freeing up time so we can really connect with customers. I had this conversation with a business strategist yesterday, and their business is just so organized and so templatized, if that's a word. It's been optimized and organized. And I said, what would happen if your store owners just went out and talked to customers? Just said, hey, how are you doing today? I'm so happy to see you. I just want to tell you how much I appreciate you. And I don't have any agenda today. I just want to say hello and thank you, and thanks for being here. Is there anything I can do?
And this was like an epiphany. So I do think that it can be a blend, that we can use AI in a discerning way, in a judicious way. And in the end, the most human company will always win.
Finola Howard (42:15.741)
I love it. Thank you so much Mark.
Mark Schaefer (42:18.243)
Thank you, Finola.