Episode 33

Published on:

5th Nov 2025

Season 0 Wrap-up

Wrapping up Season 0 – We're almost ready to begin!

Transcript

0:05

Forget trying to keep up with AI. It's moving too fast. It's time to think

0:10

differently about it. Welcome to The AI Readiness Project, hosted by Kyle Shannon

0:14

and Anne Murphy. They're here to help you build the mindset to thrive in an

0:19

AI-driven world and prepare for what's next.

0:26

Well, hey Kyle. I have a question.

0:34

Go on. I have a question for you and I bet our

0:38

um audience is curious too. In our new season, are we gonna have a

0:45

new theme song? We can have a new theme song unless you

0:49

have described our theme song as a bop. It is a bop.

0:55

I mean, we can't not We could have a We could have a new one that's also a bop.

1:01

We could Oh, do I have feedback?

1:04

You've got feedback and delay. You've got weird delay stuff.

1:08

Now you're good. What do I What should I do?

1:12

I don't know. Good. Okay,

1:13

you're good now. Your computer seems to do that thing where it freaks out a bit

1:17

and you you get delayed. Anyway, so so here we are. We are um

1:26

wrapping up season zero. Now, can I I want to want to give a confession to

1:31

you. Um I thought we were just podcast and

1:35

then at one point you informed me that we were in season zero and I have never

1:41

in my life started something counting at zero. And so I did not tell you in

1:46

season zero the entire season zero, but now I'm excited to start season one.

1:53

Now, now it really starts. Now is when it starts.

1:57

Explain yourself. What What is your thinking on season zero?

2:02

Well, I figured I figured if we if we call it season

2:10

zero, then no one will have set any expectations that it's like a real

2:15

thing. So season zero is an apology for

2:20

our first season. It's the apology tour. It's the apology tour. Well, that's

2:25

great. Okay, so this is good. And And how many do you know how many episodes

2:29

we had in season zero? It was a lot. It was more than 25, right?

2:33

It was a lot. Oh, yeah. I think Okay. I think it might be 30.

2:38

Okay. Yeah. So, that's a lot. Most podcasts

2:43

die before 10. I think the average is nine episodes. So, the fact that we

2:48

did 33 um I'm assuming you're kind of like me

2:52

with our neurospiciness that my recollection of any conversation we had

2:57

in those 30 things is essentially gone. So, I know that you've got a cheat sheet

3:01

there where you can remind us who we talked to.

3:03

Yes. But but before we dive into talking

3:06

about like specific guests and things like that,

3:10

I don't know, why don't you and I just check in like where are you right now?

3:13

like where are you with AI in life and AI readiness? And you just did the

3:20

Create conference, which is remarkable, and I just went to TED AI, which was

3:24

remarkably disappointing and and Vanessa um who was at both of those um

3:32

did a Tik Tok today talking about how one was you know all about technology

3:38

and one was about humans and you were the human

3:42

of that equation. How are you doing? Where are you?

3:47

um 30 30 episodes. I was I was kind of thinking about

3:53

things in terms of how I feel about having a podcast

4:01

30, you know, 30 episodes later. And like, do you

4:09

remember how when we started I was like, "Oh, that's interesting because I've got

4:12

the fancy mic. I've got the I've got a podcast coach. I've got 15 different

4:15

brand ideas, all this stuff. But the one thing I didn't have was a [ __ ]

4:19

podcast. I do remember that

4:22

I had the whole podcaster [ __ ] and be like, "Oh, really? You need that? Oh,

4:27

I'll be right back. I've got that. I've got all the things of a podcast, but I

4:32

don't have a podcast because I did not feel like

4:36

Yep. I did not feel like I was

4:43

worthy of having a podcast. I also recognize now that I had romanticized

4:49

Yeah. having a podcast in my brain quite a bit

4:53

and that there are podcasts and then there are podcasts, right? like like the

4:58

people who our friends at the Daily AI show doing a a a morning show every

5:05

single day. That's different, right? Like that that just is totally

5:11

remarkable. And then there's the people who have like editors and stuff. And

5:15

then there's people who like just kind of have like there's uh what's it

5:20

called? Our friends at Jelly Pod. Right now you can have a podcast that's all

5:24

AI, right?

5:27

We're somewhere in the middle, you know, like we're clearly not overproduced, but

5:32

we're also not just rolling it, right? And why will

5:40

No, this is this is where we we just talked about this. We don't diminish.

5:46

This is it's um it's like the Nike the Nike,

5:49

right? So, we're somewhere in the middle where

5:59

we don't overthink it and um we have the fabulous support of our team members and

6:06

friends and colleagues who help with some of the a lot of the

6:09

behind-the-scenes stuff. So, we get to do the fun part of having a podcast.

6:15

Um, by the way, Kyle, I think that there

6:19

might be a sound coming from your area. Might be.

6:25

Talk. It's a little bit.

6:29

Is it echoey? It might be. Hang on. I'll turn this

6:33

down. How's that? Is that better? You got to talk to see if it's

6:38

better. Yeah. Can you hear anything weird on my end?

6:42

No. You talk again. Hello. Hello. Hello.

6:47

All good. I think it's good. But it was weird. I

6:53

was like getting both of our voices were coming back to me.

6:56

Yeah. Should we start over?

7:03

No, we're live.

7:06

Just cut this part out. No. No. We're live. We're not

7:12

We're live right now. Yeah.

7:15

Kyle, I thought we were recording.

7:19

Are there people here? There's Well, there were there were two.

7:23

There's one now. All right. Hi, everybody. Okay. As I was

7:32

saying, we're not too overly produced. Um, but like

7:41

that's awesome. Yeah.

7:48

Yeah. Go ahead. Well, well, finish your thought because

7:50

I I have some thoughts. Well, well, um, for me it was a really, I guess what I'm

7:57

trying to say is that it was a really big deal to have a podcast and that I

8:03

wouldn't have done it if I I really still would be trying to get fancier,

8:07

you know, equipment and never having a podcast if you hadn't said, "Hey, we

8:12

should do a podcast." I And I'm so so glad that you did.

8:17

Oh, thank you. Well, I I'm so glad I did, too. I I think the world of you um

8:25

you know it's funny when when you were talking about you know sort of where we

8:29

live in the podcast world in a lot of ways how I think about this

8:35

is not as a podcast but but as a conversation

8:39

a conversation you know if you think about

8:44

the things that we've done together: GPT for Good, Festivus last year. We're

8:49

getting ready to do Festivus this year. We've got our communities which are

8:53

these independent communities, but we have a lot of overlap. We have a lot of

8:56

members that are that are both or we're in this one and are now in that one and

9:00

vice versa. Um,

9:04

and the thing that I'm most interested in in my life right now is having

9:10

conversations around this stuff. So, so for me, this podcast actually just feels

9:16

like an extension of what we're already doing

9:20

and and and what I think is important, right?

9:23

Because, you know, in the AI salon, we're kicking off this this idea of a

9:28

we're calling it the the AI salon mastermind practice, but the idea is we

9:33

want to help people design a daily practice around using AI.

9:38

And we kicked it off at our uh we had our our monthly salon presents meeting

9:44

last night and we kicked off this idea of a practice and um Liz Miller Gersfeld

9:50

who was kind of her idea and we've we've put put it into a framework and she

9:54

talked about how she does it and I talked about how I feel like I'm I'm an

9:58

accidental um I have an accidental AI practice and and what I what I realized

10:04

last night is you know I show up to my AI lives night after night which is kind

10:08

of like a practice. But what I realized it's more like a habit.

10:12

Yeah. There's a difference between a habit

10:15

where you just do something remotely and a practice where you do

10:19

something with intention. And and I had committed to just showing up but I

10:23

didn't but I'm not necessarily regularly present with what's my

10:28

intention, what am I trying to accomplish.

10:30

And I think I started the AI salon in a similar sort of vein where like it just

10:34

felt important for me to start it. And what you and I were talking about before

10:38

we went live was this idea of you said it to me. You

10:42

said, "I think we've created something with these communities that there's

10:47

something bigger going on here that we don't quite know what it is." And and I

10:52

think there might be something around this where both of our communities and I

10:56

think you and I as individuals have gotten to this place where it's time to

11:01

shift gears. that that we've done the work of of birthing these things that

11:06

we've created this podcast and Festivus and She Leads and your Create

11:10

and I'm doing a residency. Different

11:15

elements but what they all have in common is

11:18

people connecting right I think that that AI readiness project

11:24

2.0, you know, starts to look like, well, what's our

11:29

intentionality like what do we want to do as opposed to just starting these

11:35

things just just saying I have an instinct I should put energy over here

11:39

that we actually go now that we've created these things that have all this

11:43

energy and and and are literally changing people's lives

11:48

what do we what do we want that to be and what's our intentionality and so I

11:52

think for me season one and I think, you know,

11:55

version 2.0 of our communities and of of all these properties we've created

12:00

Yeah. starts to just feel like

12:06

like we've got more clear intentions about what we want um and and uh and

12:12

what we provide and and what maybe it's even what what purpose we serve and what

12:19

purpose this podcast serves and our community serve. I don't know. I don't

12:23

have the answer, but it it feels like we're in a like right now we're in this

12:27

state between what was and what's

12:33

beginning. You said that earlier. Absolutely. For some reason, November

12:37

and December seem like this liminal phase for me of just

12:42

trying

12:48

all the things and I did that. I did that and I don't

12:53

want to throw spaghetti at the wall anymore because I don't need to. That's

12:56

not my intention, right? like I have already done that part and now I want to

13:02

really shape and mold and nurture on the foundation that's already built

13:09

that is so incredibly special and unique and deserves I think deserves the

13:17

it deserves the slow slower thinking and slower decision making and slightly

13:24

slower action that we can make now because we've gone this far so quickly.

13:29

I mean, 10 months for She Leads AI. 10 months since Festivus last year.

13:35

Yeah, that is crazy.

13:37

Is that when you kicked off She Leads? That's when we launched our membership.

13:42

th,:

13:46

Well, you've done a hell of a job. It feels like it's two or three years old

13:50

to me. That's really cool.

13:54

That's really, really cool. Yeah. Yeah.

13:58

So now, do me a favor. Tell me, um,

14:03

We're going to do something new: anchoring the AI

14:07

salon with in-person meetings and we've actually secured a space here in Denver.

14:11

We're gonna AI salon's going to have an actual physical home. We might do one in

14:15

New York. What?

14:17

Yeah. I'm I'm super excited about it. Um

14:21

when are you gonna tell me this? Well, I haven't quite figured it out

14:24

yet. It's in place, but but I haven't quite figured out what I want to do with

14:29

it. So, once I figure it out, I'll let you know. But you just did create,

14:32

right? You just did this Create conference and you've got your virtual

14:35

version of it coming up because a lot of people couldn't come in because of

14:38

situations situations

14:43

but talk to me about your experience with with create and I I get I know that

14:49

there was a lot of bonding and a lot of really special stuff but just in the

14:52

context of AI readiness like what were the themes that what what was your

14:56

experience of to the extent that you could be present to it because you were

15:00

hosting it and probably like with your hair on fire. What was your experience

15:05

of create? What was what were some of the themes that came out of that that

15:09

have stuck with you now that you're kind of two weeks out from it?

15:15

One is no matter how often we face this thing that you and I try so hard not to

15:27

bring into our lives and not to bring into other people's lives. Everyone has

15:30

impostor syndrome, and so there's this universal

15:37

thing that we're tapping into of you know and because we were because we have

15:42

the culture that we have and it was a very safe space and everyone had made

15:47

you know quite an effort to get there people brought that really positive

15:54

energy to it but they were still walking through the door thinking that they

15:57

didn't quite belong, and that everyone else had it more

16:01

together than they did. And everyone else

16:03

And every one of them thought that, right?

16:05

Every one of them thought that. And I know I know because I asked um because

16:12

one of one of the women who was one of our one of our um kind of helped helped

16:18

us welcome everybody to Utah is a woman who Penny Atkins

16:25

who runs this $50 million responsible AI center at the University of Utah on

16:31

behalf of the whole entire state and she was remarking on how she was a little

16:36

bit nervous to come here today because she felt like everyone else really

16:40

understands this whole AI thing. Keep in mind she's also an academic scholar with

16:44

many, many refereed journal articles on the topic.

16:48

And she's saying, and she wasn't putting on airs or anything. She was

16:51

genuinely saying I was kind of nervous coming in here because I figured that

16:56

everybody else is so much better at this than me and they know more things than

17:00

me and all this stuff. And I said hold on. So, and I asked everybody, did

17:05

everybody else feel this way when they came in here, too, like Penny did?

17:08

Everyone raised their hand. Every single woman.

17:13

Yeah. And it's bananas because we are we our

17:17

group, our people, Kyle, you, me, and all of the people we get to hang around

17:20

with, we're top tier, you know? We're top tier in the AI

17:24

world. Well, we're top tier. And I think to

17:27

your point, if you ask everyone to a person, with maybe one or two exceptions

17:33

of, you know, some people who, you know, have healthy egos, I think to a person,

17:37

they would basically say, I don't know anything like

17:40

I don't know anything. You know, I kind of feel like I kind of

17:43

feel like at this point we've forgotten more than than we've learned, right?

17:50

Because things are moving so fast. But what I know has stuck with me,

17:59

what what I what I've absolutely lost is the specific skills of how to use this

18:04

tool or that tool or that tool. But what stuck with me is the kind of

18:12

nimbleness of I'll figure it out. Like, I'll figure it out along the way.

18:17

Kelly Camp, when she was talking about her daily practice, she

18:23

had talked about how she started her AI agency the week

18:29

ChatGPT started, and she said for the first year and a half of that she was

18:35

like anxious all the time because she was trying to keep up with everything

18:38

and and you know I was encouraging that I was trying to keep up with everything.

18:42

I was modeling like keep up with all these tools. We'll we'll you know we'll

18:47

learn all of them and everything they can do and you know and I was pretty

18:52

good at it for a while there and then it just it just passed me right. I'm not

18:57

worthy. Come on, get over it, said Kevin Clark. That's awesome. Um

19:02

Um I'm not worthy. Oh, is Kevin here? Hi

19:06

Kevin. Yeah, Kevin's here. That's awesome. So,

19:08

but but what Kelly was saying is that it took her it took her a while to to

19:13

transition, but she's in this place now where she's kind of got this calm,

19:18

which is she can now just talk to people about what's going on with them, and she

19:23

trusts that she knows enough about how to

19:28

how to adapt to to AI into the situation where she'll figure the AI part out. So

19:33

she's not she's not feeling that frenetic

19:36

need to keep up with the tools. She's now kind of settled into okay, I've got

19:41

a confidence those I'll I'll deal with that when I need to. But what I can be

19:47

right now is just listen to the person I'm interacting with. And I that to me

19:51

feels that feels evolutionary.

19:55

And and I think and I think in our community like the through line

20:00

through the people that talked last night about their practice, there was

20:03

kind of a calm to it that they're doing their thing,

20:06

they have an idea, they put it in practice, they've got systems to

20:11

organize their assets like it it wasn't this panic to use all the tools to feed

20:17

Maybe that's how we enter:

20:24

with a calm confidence to say we'll figure the tech stuff out.

20:29

Yes, we just need to figure maybe what it is

20:31

is we need to figure out what we want. And I think this is the intentionality

20:34

thing, right? What do you want for she leads? What do I want for the salon?

20:37

What do we want for ourselves personally?

20:39

Yeah. Figure that out and get clear on that

20:42

and then all the other stuff will follow.

20:47

You know, when we first started saying it's not about the tools, I was kind of

20:51

like, yeah, but we really know it's about the tools. Like,

20:57

you were just going along with me. Yeah. Like, oh, this is the conceit.

21:01

This is the conceit of the of, you know, of the show or of the month or of the

21:05

week or whatever. Um, so it took me a while to get there.

21:10

I think in part because, um, it's very

21:15

challenging for me to know all the tools. Like, I was doing the Kelly Camp thing

21:20

too. I think we you know a bunch of us were in that mode.

21:23

Yeah for sure. And now there have been enough occasions

21:27

to figure out what the tool might be that I know now you know a couple years

21:32

later that I'll figure out what the tool is.

21:36

Yeah. Well, and I also think I think the tools have gotten good enough like it it

21:40

really was in the early days it really was of these seven tools, one of them is

21:46

the best tool, right? And you kind of had to know which one was the one that

21:50

was the least janky right

21:51

now. I kind of feel like all of the tools are good to to whatever degree

21:56

like and so like we can we know that if we need something with analyzing text,

22:04

there's six different things we can go to. It doesn't really matter.

22:08

It doesn't really matter, especially if you don't know what the hell you're

22:11

actually trying what you want out of the project. If you don't know what you

22:15

want, you really don't even need to be like nerding out about the tools because

22:20

you're just going to be chasing, right, chasing shiny objects. It's that thing

22:25

of starting before you know what you want. And I haven't learned enough about the concept of the

22:32

AI practice. So, I'm catching up to where you guys are. But if you think

22:36

about how a regular practice works:

22:44

a training plan is metered out over time. It's not learn

22:50

everything, do everything, learn all the skills at once because you would burn

22:52

out and you would not be able to continue doing it. So this

22:57

your body would just reject it. Yeah, that's really good.

23:01

Exactly. Exactly. We were kind of binging. I think we were binging on AI

23:06

stuff for a little while. Well, we were we were binging on

23:09

learning because because Well, oh, so here's another thing. Cindy [ __ ] has

23:14

talked about this. Kelly Camp has talked about this.

23:19

Um, the things that you and I and everyone

23:25

in our community, I see Gareth here and Kevin Clark here.

23:31

The things that we think are just absolutely obvious that you can sit down

23:35

at ChatGPT and have it write a LinkedIn post for you. There are still many,

23:41

many, many, many people who don't know that that's possible.

23:47

So, so, so, so sometimes interacting, you know, with a with a new client or or

23:54

someone who's new to AI, we don't have to know [ __ ] like like the

23:59

stuff that we just take for granted is just mind-blowing to someone who

24:05

hasn't done anything, right? So, so I think there's

24:09

I think there's something that that part of our role is just being in a calm

24:14

place and being in a place where we can actually listen to whoever we're talking

24:17

to and just try to understand where they are, what they want to accomplish,

24:22

right? And just have the trust that we'll figure this other stuff out.

24:27

So, I just had a little epiphany that the new brave thing that I'm going to do

24:35

that I'm I'm committing to is that I

24:42

I want to speak to people 100% where they are.

24:47

Yeah. without

24:51

the narrative in the back of my mind that says, "Yeah, but the smart people

24:54

won't think you're smart if you don't say all the fancy words

24:58

or if you don't in reality stuff." Yeah.

25:04

Right. Well, guess what? Those

25:08

people, like our peers, are not my clients. We forget that. In my

25:14

case all of my clients, not a single one of them would benefit if I went and

25:19

learned automations. It would not bring them one single benefit to their lives.

25:24

If I went off in a corner and learned how to be the greatest AI video maker if

25:31

I had fun doing it or whatever, that's separate. But for me to seek out things

25:37

just for a some kind of like a chip or a notch in my belt or something versus how

25:43

can I actually be useful to people. That's what I want to do. That's what I

25:47

want to be. I I think that's huge. I I'm not going

25:50

to say the person's name because I I want to respect their privacy, but I I

25:54

know someone a good friend of mine who spent many many years

26:00

securing um a building and creating a building in

26:05

New York City, um, and getting an AI supercomputer built

26:14

um with the intent of helping nonprofits and foundations train their own models

26:19

on their own data. Oh wow.

26:22

Yeah. Really, really ambitious, really amazing, amazing thing. And I talked to

26:29

him about six months ago and and I said, "How's it going?" And he said, um, he

26:35

was a little bummed. He was a little depressed. And he said he said, "No

26:38

one's using my fancy supercomputer," basically because the foundations and

26:44

nonprofits they don't they don't know what training

26:48

a custom data set is. They're not even using ChatGPT. So they don't even

26:54

know how to write a grant with ChatGPT, and he's sitting over here on the far end of

26:58

the spectrum ready for them to like take their data that they've collected for

27:03

the past 50 years and turn it into remarkable things. They're not even

27:06

using ChatGPT. So I think to your point about being where people are, I think

27:12

it's really important because we sit in this weird unique position of

27:20

knowing that that that far that that far goalpost is there today. We can we can

27:26

run out there if you're ready for it, but we might just be ready for can we

27:30

just make an email easier to write, right?

27:34

That might be enough for three months. Right?

27:38

That might be enough for three months. Well, and here's here's how I can prove

27:43

it. This is what Kevin Clark said about AI

27:45

is a confidence delivery system. Confidence delivery system. Yes. So,

27:49

thank you for saying that, Kevin, because the Oh, God. I'm so I'm so

27:54

gratified that you said that because I was talking to somebody about this the

27:58

other day and I was like but you don't understand how important it is

28:00

particularly with women to be able to work through an idea

28:06

in safety and security, without retribution,

28:11

or without mansplaining what one

28:15

meant to say. Um, and it gives us, in many ways...

28:22

It's like if you could go eat Dots or

28:26

Junior Mints and get this kind of confidence from that? Cool. It

28:30

just happens to be talking to ChatGPT and working through a problem. Now you

28:33

have this new kind of confidence. That's not the thing that Sam Altman is out

28:38

here talking about, or Elon Musk, or any of them. They

28:43

don't give a [ __ ] about that. But my clients do.

28:46

Yep. the people who I work with on a daily basis care that they can practice

28:51

for a conversation with a donor or for an almost impossible conversation with

28:57

their boss. Like that's what matters to people. So I have really struggled

29:04

between trying to like keep up with my peers and also just speak to my clients,

29:09

my people where they are. And I'm going to go with the latter for a while now.

29:13

Yeah, it's good. Um, this thing that Gareth said: I had a friend say to me

29:18

today who's new to AI, the more I embrace AI, the more I'm thinking

29:22

thinking differently about how I leverage it.

29:25

Yes. The the thing that struck me when I when I went to TED AI two weeks ago, the

29:32

the the disappointment there was that what was being celebrated, what was

29:39

being talked about, what was put on stage

29:43

was that the only real thing of value right now

29:48

is better algorithms, more math, more science, more chips.

29:53

that that more compute will you know will be the solution.

30:01

Um the innovation that's going to happen

30:06

from someone not using a AI at all to transforming their business over the

30:11

course of a year working with you doing stuff that is trivial by today's AI

30:17

standards. the innovations in business are going to happen at a much more

30:21

granular level and probably you know if if if AI compute capacity is up here and

30:28

current businesses are here the jump from them to go from here to here is is

30:33

like mind-blowing to them. That might be a 2x increase in productivity, idea

30:39

generation effectiveness whatever it might be it might be years or decades

30:43

until they you know if ever get to Yep. What's possible? And so there's a lot of

30:50

attention and money being spent raising the ceiling when we're just sitting down

30:56

here on the bottom where again the things that we think, you know, are are

31:01

trivial are not trivial to most people coming into the space. And so I think

31:06

that I you said it. I I think that if if I

31:10

think AI readiness, it's when you interact with someone,

31:14

understand where they are and meet them where they are. I think that's

31:19

right. And I believe that those of us that's

31:25

why we're building AI advisors, not agents. Yes. Um I believe that that's

31:31

the kind of grounded, generous, and at the

31:38

end of the day much more lucrative for people with my skill set. That's the

31:43

better approach because I mean there is not a single person who appreciates the

31:48

smartest person in the room, right? We're always waiting for that person to

31:52

shut the [ __ ] up, right? It's true. It's true.

31:56

Right. Who likes that person? Do you want to have a beer with the

32:00

smartest person in the room? No. You want to you want to have a beer with the

32:04

person who's like nice and kind and funny like that person and and not

32:09

afraid to ask dumb questions. So, um, and being able to be present for when

32:16

those little things that we think are small because we've been at it for a

32:20

little while. When those like awakenings happen for them and how

32:25

existential that really is for them to be able to be there and like hear them

32:30

and be alongside on that part of the journey, I think is where

32:34

I don't know. I just think it feels more like a calling than

32:39

a job, I'd say. Yeah. No, I think that's good. Kevin Clark, who's

32:44

in the audience right now, um, we've been putting a

32:51

kind of product in the marketplace for businesses to be able to get up to speed

32:55

and build internal cohorts and things like that. And and he he shared a a

32:59

story of framing, a way he was talking about AI with someone that we've since

33:04

used to to actually change the name of of our offering. So our our offering is

33:08

now, um, it's called Wow AI, right? And the question that he asked was, you

33:13

know he was talking to someone in business and said hey have you ever had

33:16

a wow moment with AI you know at home and and they were like oh yeah you know

33:21

whatever I made a kids book or whatever it is and he goes have you ever had that

33:24

wow moment at work oh no like the answer is always there's no wow moment at work

33:30

But if we're just over here experimenting or, you know, showing our kids

33:35

something, we can have a wow moment over here, but that is much

33:40

harder to discover in the workplace. And and why is that? Well, maybe people

33:46

aren't allowed to use AI or maybe they're they're sitting off in the

33:49

corners just experimenting with it or um or maybe everyone's just too focused on

33:53

efficiency to really discover those wow moments in business, right? And so, how

33:57

do we discover those? And again, I think that's where thoughtful people that can

34:03

can understand where someone is and then say, "Hey, so you know what you said you

34:09

wanted was X and here's how we would do X, but you know, you know what else is

34:14

possible? Let me show you this thing over here that might be like one little

34:18

step over." And you show them that and they're like, "Wow." Right? Like like

34:22

can we provide wow moments? Maybe we become a bridge because because they

34:26

don't need to jump immediately to the most capable thing. like they need to

34:29

understand where they are now and how they kind of level up in in an

34:33

incremental way. Absolutely. Absolutely. And making the

34:38

space for those increments like I

34:43

one of the things that I took for granted

34:48

the fastest of all of the tools was AI meeting recorders. They, you

34:54

know they came on the scene. I tried every single one of them out. We chose

34:58

the one we liked and then we basically, you know, built the rest of our world

35:02

around recording our our conversations and then everything just started working

35:07

really well and I I thought that everybody had already had that

35:13

transition specifically with meeting recorders. So, but still all this time

35:18

later, really, people would be happy working with me if we talked

35:22

about just meeting recorders for a week. Yeah. like that really would make people

35:29

feel like they've accomplished something.

35:32

Coming up with those moments that make people feel that dopamine rush

35:40

at work within their work would be such a gift because it would mean that we now

35:45

have a work life that is worthy of wow moments. How sad is it?

35:52

How savage is it? Well, yeah, that that's a whole that's a whole other

35:55

thing. But whole story.

35:57

Yeah, that's a whole other thing. Well, I'll tell you I'll tell you one that

36:00

that happened today. So, as you probably know, Brandon within

36:06

the salon community. He's going to produce Festivus, and he's

36:10

the producer of my life. He built this custom GPT for people to find food if

36:15

they get cut off from SNAP benefits. And he talked about it on the live,

36:20

and he talked about how compassionate it was and things like that and and sort of

36:24

word got out that he had built this thing and and one of the people that we

36:27

know that you and I both know who's got, you know, big connections with, you

36:31

know, people in the world said, "Oh my god, this is amazing. We've got to get

36:35

word out. We've got to get articles written about this." And he got really

36:38

excited. And then I was on a call today and Daisy Thomas was on there and she

36:44

was really excited about Brandon's thing and she said she went into Brandon's

36:48

tool and was able to create a shopping list

36:56

for a nutrient-rich 21-day meal plan of food for $125

37:04

in his thing. And she said she was blown away that she was able to do

37:08

it and it did this really remarkable thing and it wasn't just food for a

37:12

hundred bucks. It was nutrient-rich, nutrient-dense food, right? That would

37:17

really sustain a family for 125 bucks. Um, and what struck me in that was

37:24

Brandon getting his custom GPT talked about is one thing, but like someone

37:28

like Daisy knowing that she could go into that and and get that kind of

37:34

result out of it. Most people are going to go in there and not

37:38

know what to do with a custom GPT. So, I feel like there's this whole string of

37:43

wow moments that can happen from even a simple little thing like: I took a

37:48

data source, put it into a custom GPT and made it compassionate.

37:52

Well, there's all sorts of ripples that can come out of that if if you've got

37:55

people interacting with it that have got AI readiness.

37:59

Yeah. Right. And so, so I think our roles

38:02

start to be to recognize: oh, Ann made this thing. Oh, you know what I

38:06

could do with that? I could do this and maybe I could show that to so and so. I

38:10

think making those connections starts to feel like the the new

38:17

I don't know some some new kind of intentionality. It's not just about

38:21

learning to build the thing. It's about now that things are built, like in

38:24

content evolution, we've got these digital advisors and we're putting

38:28

together a project that's like, okay, if you're overwhelmed with information,

38:31

here are the 17 steps that you need to do that in the middle of it is this

38:36

thing that we've built. But we're giving them all of that, sort of spoon-

38:39

feeding them all sorts of tactical little steps along the way so they don't

38:43

have to figure it out. They can just use it

38:45

to learn. What is it for? Are you learning something when

38:49

you're doing it in the system that we're putting

38:51

together? We call it the information overwhelm protocol,

38:58

where, because of AI, right, it

39:03

used to be if you were an executive people would give you a PowerPoint or

39:07

they give you a four-page report or whatever they put together well now

39:10

everyone's giving you 45-page reports that they wrote with ChatGPT, so

39:15

everyone is inundated with too much information

39:18

and so this is a protocol to say, okay, just gather all your stuff, throw it into

39:23

NotebookLM, put these six prompts in NotebookLM, and that's going to generate

39:28

this. Now take that, you know, and make your decision. And now, you know, so

39:32

it's literally, you know, take it into your digital advisor and have them back

39:36

and forth, and then from there make your decision. So, it's literally just

39:40

saying, "Here are how to use a series of tools

39:44

that feel familiar to what you already do, but really allow you to do it at a

39:49

much higher level because it's got this AI stuff infused into it."

39:56

You know, it's the thing is like we need that tool

40:01

to get people to use that tool. There's still something missing, right? You can

40:07

Or we need a human. We need a human with a little

40:13

savviness around AI and ideally enough business savviness like you with

40:18

fundraising where someone in the room can say well here's the challenge I've

40:22

got, where half of your brain clicks in to say, okay, here's what I do in the

40:26

real world and then your other half goes and here's the six AI tools I would use

40:30

to to accomplish that right both of those are really important

40:34

but the most important thing there is that you're actually listening to

40:38

and not off in some AI-addled fever dream.

40:45

Well, if you think about Brandon's, by the way, I have to get I I have to get

40:50

the GPT. I can't remember where I saw the link to the GPT. Is it in the salon?

40:55

Yeah. And it's it's it's a post snap something.

41:00

I think if you just go to the GPT store, but it's in the salon. Yeah.

41:03

Oh, the GPT store. Because here's what I wanted to say about that is and Brandon

41:07

being Brandon is so humble. His first post was like, well,

41:14

shucks, guys, I don't know very many people,

41:20

and I just you know I don't have a big audience but I made this thing.

41:24

Yeah. And you know, this is an important

41:28

time for people to be able to figure out what the heck and I was thinking about

41:32

that. But I was like, "This GPT is awesome and people

41:37

will go and find it." But like it wouldn't be as good without Brandon. It

41:41

wouldn't be as good without this nice guy who would literally never hurt

41:47

a fly who gives and gives and gives to this community and and and beyond that

41:53

he made this thing and he showed up on TikTok and said, "Well, shucks guys, I

41:57

don't know. I built this thing. I hope it can be

41:59

useful. If you know someone, tell them about it." And it was just

42:03

like, ah, yeah, it's amazing. And to your

42:06

point technically what he did was he took a

42:10

data set put it in a GPT and wrote a prompt right

42:14

but because he's Brandon he wrote that prompt in such a way that when someone

42:19

says, hey, you know, I'm struggling to find food in this city, it writes an

42:24

empathetic response. Yeah

42:27

because he prompted it to write an empathetic response: Hey, I'm sorry

42:30

you're going through that. That must be tough. You know, we've got the resources

42:33

here. We'll help you through it. Like that part of the experience has nothing

42:39

to do with the technology, has everything to do with who Brandon is.

42:42

Exactly. And that he was aware enough of how to

42:46

use the tools to be able to take that part of his value system and create

42:52

something that reflected that. That's beautiful,

42:55

right? So, so that's the thing. It's that's the

42:59

opposite of what TED AI was for you. TED AI did not have any of this. It just

43:06

celebrated that you could build a custom

43:11

GPT. It ignored the fact that you might build something with a custom GPT that

43:15

had heart. That had heart. Yeah.

43:18

Right. Like like they just said that the value

43:21

is only over here on the technical side. It's like no no it's it's over here as

43:26

well. And the wow moments, this thing that Kevin said: have you had

43:29

a wow moment at work. A wow moment. It's a very human moment.

43:34

Oh yeah. It might be a wow of the technology, but

43:36

it's a human moment. But the wow thing comes from you get AI to do something

43:41

that's relevant to you personally. That's what causes the wow. You're like,

43:44

"Wow, I know how to do that the other way. I

43:48

didn't know we could do it this way." Right. That's that's a very personal

43:52

thing. And that's what was missing at TED AI for me. Yeah. So I think

43:57

there are going to be a lot of people who are going to be doing fun

44:02

wow stuff at home right all weekend long and then Monday's going to roll around

44:07

and they open their laptop and they look at their freaking co-pilot and they look

44:11

at their calendar and they're like I don't want to do any of this [ __ ]

44:16

anymore. Yeah. Those are the people who

44:20

were talking about what does:

44:24

like? How do we support those people who, whether it's their choice or not,

44:29

are no longer in the type of like employment, maybe they're just done

44:34

being an employee forever, but they've always been an employee. The number of

44:38

things that I had to learn, oh my goodness.

44:41

And everybody else is going to be starting from square zero.

44:45

Well, so you and I have talked about this before. This is

44:51

part of my intentionality for:

44:58

You and I, because we're crazy, choose to be entrepreneurs,

45:03

right? Like you choose to go into the vast unknown of, I'm just going to

45:08

go book a bank of hotel rooms in Salt Lake City, Utah,

45:14

nine months out from an event that's never existed before. Right? You've got

45:20

Whatever happened in your childhood, I'm sorry about that. That

45:24

that made you like this. I'm the same way. I'm just like, hey, let's just

45:29

start a company. So, so there are people who are naturally entrepreneurial.

45:34

That's not most people. And even worse for some people is if they've been

45:40

in a position, you know, where they've been told what to do. You know,

45:44

they're in a position that's like, you know, we're going to give you

45:47

instructions of what to do. You're going to do that every day. You don't have to

45:50

think on your own. Like, there are many careers that are just, you know, just

45:53

show up for work, punch the clock, do the work. Those are the ones that are

45:57

going to be the quickest to be automated out. And so those people don't have the

46:02

innate natural skills to be an entrepreneur, but they may just be out

46:06

of work. And not only can they, you know, maybe not find a job, their whole

46:11

sector might be just automated away, right? So there's going to be a lot of

46:16

people forced into entrepreneurship or solopreneurship. And so

46:23

how do we provide that? You and I talked about this earlier. I think

46:26

for our communities I personally feel this. I know you do too. What is our

46:31

responsibility as leaders of these communities to provide resources,

46:36

infrastructure, training?

46:39

Yeah, like emotional support for people that have to go from, I never had to

46:45

think about what my point of view is on something or what's a problem I want to

46:50

solve that I want to start a business around. I've never had to think like

46:53

that. That's got nothing to do with tech. That is a

46:58

cognitive jujitsu move that a bunch of people are

47:03

going to be forced into. That's that's hard stuff. That's really hard

47:08

stuff. Um, look what Gareth said. Just don't AI

47:14

your way out. And Gareth is gonna talk,

47:21

he's gonna tell us. You know what's so perfect, Gareth? So, Gareth, I'm

47:26

gonna I'll DM you after this but we might have to move you to another

47:29

weekend but I want to tell you why this is for the audience as well not just

47:33

between me and Gareth. So, what's so cool is, if we do

47:41

our three interviews that we have set up for Sunday, Kyle, one would be a woman

47:46

who is Trudy Armand. Her new branding is: she is your income

47:56

resilience bestie. So her thing is every day, what have you done for your income

48:02

resilience today? Because none of us, right? If you

48:08

think you just have your main job and a side hustle,

48:11

maybe you need a third one, right? Because we're all running around like

48:14

this. But the the juxtaposition of Trudy and Gareth, it's just too perfect.

48:21

And her saying, "Here's how you Oh, all of a sudden you're no longer an employee

48:25

and you never had to worry about where the paper clips come from or

48:30

whatever you're using." Yeah. I just love it, you know.

48:35

Yeah. Um question. Did you want to talk about

48:37

some of the people that we've talked to over the course of this season?

48:42

Well, we only have eight minutes left. We have eight minutes. So, I'm going to

48:46

How about if I just say a few names? Beautiful.

48:51

Okay. Say the name, and if

48:54

there's anything you have there about either what we talked about or what

48:57

was a highlight from from what they said.

49:01

Okay. All right. So, hold on. Everybody just

49:06

calm down. All right. So, let's see. Well, we had Jennifer Huffagel. I missed

49:12

that one. Jennifer is the educator. And she is very strict about

49:18

people and data privacy and security, force of nature. Love her.

49:25

Force of nature and data privacy and security.

49:28

One of one of the things that came up regularly over the course of season

49:33

zero, the last 30 episodes of this podcast, was the concept

49:40

of professionalizing AI: playing with it is fun, learning it is

49:45

fun, but if you're going to do this in a business context, in fact, the call that

49:48

I took from my co-founder when I was talking to you earlier was around data

49:53

privacy and things like that. So, as you learn to do this,

49:59

understanding that you've potentially got liability risks, your customer

50:03

potentially has liability risks, and you should understand what those are, and

50:07

you should agree on what those are. Just professionalizing your AI practice,

50:12

that's something we're all going to have to deal with, especially if we're

50:14

getting paid for it. Back to the financial

50:18

resilience piece. Speaking of financial resilience, do

50:24

you remember our conversation with Sid Hargo? He's the beautiful

50:29

storyteller. Yeah, he's amazing.

50:31

Beautiful storyteller and like so many um people who we get to hang around

50:35

with, we're catching him right in the middle of like a pivot in his life and

50:40

it's been such a delight to watch him do that. And he learned. So he went to

50:45

Festivus last year with AI.

50:49

Yeah, he ended up being one of Kimberly Offford's students, learned how to make

50:53

AI videos, and that just translated into a whole new shift in his life.

50:58

Yeah. Yeah. We had Kimberly Offford on, you know, talking about, you know,

51:03

making beautiful films for Grammy award-winning musicians and we had Joy

51:08

Party on, our first guest.

51:11

She's another one, and

51:18

you know, she was a 30-year sleep technician

51:25

who discovered AI filmmaking, and the minute she discovered it she

51:30

realized: oh, all my life I've had these stories I've wanted to tell. So it's like

51:34

she's a person that has literally been waiting her whole

51:40

life for something to happen technologically that freed up.

51:45

Yeah. Her desire to express herself

51:49

in a particular way. Like, I find that remarkable.

51:54

Absolutely. And those stories are crazy.

51:58

Can you imagine like for us to be able to learn what was there that now can be

52:04

liberated? People can say whatever they want about all of the horrible

52:09

things about AI. I do understand all of that and I feel all of it, and

52:13

they're not wrong. And also the opportunity for us to actually know each

52:19

other in ways that we have never been able to before because our ability to

52:26

communicate had been suppressed and now is less so. The Joy Partys of

52:32

the world have been sleep technicians in a little room in the back

52:36

of a hospital. They need to be out. Joy's coming out.

52:41

Yeah. Do you remember Sam Swain and Kristen

52:46

Steel? Kristen was the one who was like

52:49

they were great. Right. So she was very anti and then

52:52

then you know we had good conversations about AI and now she's using AI. We

52:58

wanted to have some people who were kind of like converts a little bit.

53:03

Of course, Sundy. You know what I

53:06

appreciated about her is her willingness to be in the conversation and not just

53:11

be so closed off. Like, we're in a time, because of the politics of the

53:16

day where if you have an opposing opinion with

53:19

someone, it's like, well, then they're awful and you're great. It's like,

53:23

right. And so, and with this AI stuff, it can be very polarizing. So, to be in

53:29

a conversation with someone who said, "Here's what I don't like about it." Um,

53:34

and the fact that she was willing to be in that conversation and actually hear

53:40

some of the things it made possible that didn't involve all of the

53:44

stuff she didn't like, right? Like all the stuff that she was talking

53:47

about, we actually agreed. Yeah, that's crappy.

53:51

And there's this other way you can think about it. And she was willing to hear

53:55

that, and I thought that was quite evolved. It was very

54:00

refreshing. And the fact that their

54:03

business, their whole livelihood, is bringing people together, right?

54:06

Yeah, exactly. So, introducing AI. And let's talk about a couple more

54:12

people. Do we have to stop in exactly three minutes?

54:16

We don't have to. We're adults.

54:21

It's our thing. Okay, so

54:27

So she said, so Sundy said her real tipping point came with generative AI.

54:33

It changed the trajectory of my life. It was everything. It made problem solving,

54:37

problem presentation, decoding the different facets of my brain and

54:42

stitching it together in a way that was palatable.

54:47

What's Sundy's last name again? Williams.

54:50

Sundy Williams. Yeah, she's a special one.

54:53

Yeah, she was great. There was this idea of brain decoding and idea mining

55:03

and um

55:07

when you think about AI and computers, like one of the words I hear a lot is

55:10

accessibility and I think about, you know, people that are blind or people

55:13

that are, you know, deaf. I think there's a whole other layer of

55:17

accessibility where if you're neurodivergent or if you just

55:24

you maybe haven't had the capacity or the technical skills to be able

55:30

to articulate ideas really clearly, and now you can. I think AI as

55:36

an accessibility tool is not just for handicapped people, right? Or, you

55:42

know, whatever, I don't know what the politically correct term is.

55:47

People with disabilities. Yes, that's it, thank you. But I think

55:52

disabilities can be very subtle, right? And so I get really inspired by

55:57

people who discover things about themselves that they

56:00

have never been able to get around and now they can.

56:05

So, the people who came on our show who had a use case

56:13

that they were passionate about, kind of like with Brandon and his SNAP one. Is it

56:18

called Snap GPT? It's post Snap Advisor or something like

56:23

that. Okay. Snap Advisor. Okay. So you know

56:26

people coming to the table with a use case that's like from their heart like

56:30

we saw when we did GPT for good. So, you remember Miloo and

56:35

Amber Trevetti with they they have the um their company is making a

56:42

I want to say it's like a a life slashcareer choice

56:48

gamified platform so that kids can go through the like what color is your

56:53

parachute kind of phase of life. Oh yeah. Yeah.

56:56

You know and listen to what Mai said about what's your definition of AI

57:02

readiness. She said, "I take it from a change readiness perspective. We're all

57:07

on the adoption curve. Some early, some waiting to see. Readiness means knowing

57:12

where you are, then helping others along the curve. We missed the mark with the

57:16

internet and social media. Let's get this one right. AI readiness is about

57:20

adopting it yourself and bringing others with you."

57:23

Wow. Yeah, that's really good.

57:28

She and her besties left the company they were working for

57:32

and created their own. Yeah. I just got a LinkedIn invite

57:38

from some friends that are like, "We should start an agency." Like, yeah, you

57:42

know, people are going to start doing

57:45

interesting things. Um, by the way, Brandon's GPT is called Help After Snap.

57:51

Oh, help after snap. Okay. Help After Snap. Thank you.

57:56

I don't like it when I can't see the comments and you can. You're keeping

58:01

all the all the goodies to yourself here.

58:04

Yes, please. So, Vanessa. Oh, you know what? I should

58:10

I should have asked you who this came from. You will know. So, Vanessa said

58:12

that. This is Vanessa Chang, who we all are just in love with,

58:18

nothing but more of her in:

58:23

to the language of AI or nothing. We get to decide. We are the ones that are

58:28

human-centered. Businesses have power, but they're selling to us. Don't forget

58:32

your agency in this. Well, that's square down the pipe

58:38

of creating a daily practice with AI that we're working on in the

58:43

salon right now. Exactly.

58:50

As I'm experiencing this with people, and especially people that

58:54

are resisting AI, one of the common themes is they're

58:58

treating AI like a competitor or like something that they need to defend

59:02

themselves against as opposed to treating AI like an amplifier. So if you

59:07

if you put AI in front of you like it's this thing you have to battle then

59:11

you're constantly battling this thing that's smarter than you and that's got

59:15

that can't be pleasant. But if you say, I'm going to lead with my

59:19

ideas with my values with my agency and then I'm going to bring in AI to support

59:25

that and amplify that. Great. Right. Then

59:29

Exactly, that feels right to me. Well, and

59:33

the pieces and parts that I'm not reading are all there. I mean, this has

59:37

come up in almost every single one of these. So like Tam Win talked about

59:41

start with a problem you already have and try to apply AI to that context

59:45

You can have a mediocre prompt as long as you feed it the right

59:50

information. I think of it like training. Well, she's talking about the intern:

59:53

What do you need to know about your business to succeed? So it's not about

59:57

the tool. Not about the tool. Um, I think, by the

::

way, I think we had 31 because we have 31 people here. I'm skipping some

::

because we don't 31. 31. Oh,

::

uh, Mr. K. Oh, yeah. Mr.

::

So, Mr. K just got his dream job. Did you hear this? Yeah. He's like

::

the AI advisor for his whole school district or like the tech adviser for

::

his whole school. They put him in a position where he can actually effect

::

the change that he was struggling against. Um, so he's super excited about

::

that. Um, why don't we do this? Why don't we

::

wrap up and Okay.

::

Tell me what you're most excited about for

::

season one. What am I most excited about?

::

I'm excited about learning what our audience

::

wants, needs, and likes from us because, you know, we did our first chunk live

::

like this and then we did recordings and put it out on the podcast things and I

::

feel like I don't really know that much about who all is out there, how they're

::

finding us, you know, what their experience is, if

::

we're delivering what they need, if we're redundant, because you and I can go

::

a lot of different directions. So, I'm interested in knowing more about who the

::

audience is and and again meeting them where they are.

::

Yep. That's good. And for me, the thing I'm excited about, and

::

we'll talk about this as we design the framework for season one, is

::

the idea of intentionality. That maybe it's less about, are you ready

::

for AI, and more like, how are you making yourself ready for

::

what you want to do using AI? A subtle shift around: what

::

are people passionate about? What are their values? How are they putting that

::

forward? And then how are they using AI to amplify that? That for me is the

::

thing that I'm excited about. Yeah, same. I think that there's

::

something more about the intention behind it. Boy, look

::

at the things that people have made with intention, like with GPT for

::

good. The tools that we made because people had a passion. The thing

::

that Brandon made because he had a passion.

::

Yeah. Yeah.

::

Nice job, Kyle. I mean, for a season zero, it was

::

we've got nowhere to go but up. Even in binary.

::

Yeah, exactly. Um, and if I'm not mistaken, we kick off season one with a

::

with a special guest. Correct. Is Liz our first person?

::

Totally. Yes. Liz Miller Gersfeld, speaking of

::

intentionality. So, the woman who basically is architecting

::

the AI practice in the salon. She's my co-host in the salon, and she's

::

joining us next week live. She's amazing. She's just an amazingly

::

thoughtful and intelligent woman and just has a passion for

::

It's almost like her passion is figuring out life. Like

::

she hit a wall where she was not happy in her career. She was just

::

like this isn't doing it for me anymore. So I'm gonna kind of just like blow it

::

up. Yeah. and go into this place of

::

nothingness, but with some intentionality that I want

::

to figure some things out. And she's emerging out of the other side of that

::

as this powerful AI creative producing, you know, powerhouse that's got this

::

amazing career. So, yeah, I'm She's going to be amazing. I'm super excited

::

about that. Awesome. Me, too. Me, too. So, next time

::

we chat with everybody, it'll be season one officially.

::

It will. That's next week. Thank you, Ann.

::

Next week. All right. See y'all later.

::

Thanks, Kyle. Bye, everybody.



About the Podcast

AI Readiness Project
Forget trying to keep up with AI, it's moving too fast. It's time to think differently about it.
The AI Readiness Project is a weekly show co-hosted by Anne Murphy of She Leads AI and Kyle Shannon of The AI Salon, exploring how individuals and organizations are implementing AI in their business, community, and personal life.

Each episode offers a candid, behind-the-scenes look at how real people are experimenting with artificial intelligence—what’s actually working, what’s not, and what’s changing fast.

You’ll hear from nonprofit leaders, small business owners, educators, creatives, and technologists—people building AI into their day-to-day decisions, not just dreaming about the future.

If you're figuring out how to bring AI into your own work or team, this show gives you real examples, lessons learned, and thoughtful conversations that meet you where you are.

• Conversations grounded in practice, not just theory
• Lessons from people leading AI projects across sectors
• Honest talk about risks, routines, wins, and surprises

New episodes every week.

About your host


Anne Murphy