Episode 12

Published on:

28th May 2025

The AI Readiness Project podcast with Tracey "the Safety Lady" Hawkins

Join us for a timely show featuring Tracey “the Safety Lady” Hawkins, a veteran safety educator and AI cybersecurity advocate, as we dive into how individuals and organizations can stay secure in the age of generative AI. Tracey brings decades of hands-on experience helping people manage risk, now applied to the new frontier of deepfakes, scams, and psychological safety in tech-driven workplaces.

Why Attend?

  • Learn how to spot and respond to AI-generated scams, phishing attempts, and deepfake content.
  • Discover what “cyber safety” really means in a world where generative AI is everywhere.
  • Explore how to create environments where teams feel secure and supported—both online and off.
  • Hear from Tracey, whose style blends clarity, compassion, and a deep sense of purpose to help others feel confident, not fearful, when working with AI tools.

Our Guest:

Tracey “the Safety Lady” Hawkins is a nationally respected safety and cybersecurity speaker, podcast host, and instructor with more than 30 years of experience helping people navigate risk. Now, she focuses on helping everyday users and organizations face generative AI with knowledge and confidence. Tracey’s work bridges tech and humanity, empowering people to take control of their digital safety and wellbeing. She is also a content creator and educator known for making security education practical, relatable, and even fun.

https://linktr.ee/TraceytheSafetyLady

This show is a must-attend for anyone concerned about safety in the AI era. Tracey brings real-world wisdom, expert insights, and a contagious sense of confidence that will leave you feeling equipped—not overwhelmed—by today’s digital challenges.

Transcript

0:04

Forget trying to keep up with AI It's moving too fast It's time to think differently about it Welcome to the AI

0:11

readiness project hosted by Kyle Shannon and Anne Murphy They're here to help you build the mindset to thrive in an

0:17

AI-driven world and prepare for what's next [Music]

0:29

Forget Yeah it's a professional operation

0:34

We love our our Suno theme song so much We like to replay just an extra five

0:40

seconds of it each time We're just gonna We'll just keep that running while we talk

0:47

How are you oh man I'm good I'm good I I I we we'll talk about this in a bit but

0:55

I there is a new acceleration happening I felt I feel like one happened in March

1:00

April the beginning of April I think there's another one happening now and it's it's discombobulating

1:07

It is discombobulating and we're going to need to combabulate and recombobulate

1:12

I'm I'm I have uncomobulated recently really which I think that means Wait

1:19

does that mean I'm less or more combobulated i don't know

1:25

Well okay So this is the AI Readiness Project podcast Everybody welcome It is

1:33

Kyle Shannon and we're Yeah I'm Ann Murphy That's Kyle Shannon We have an

1:39

exciting guest today Tracey the Safety Lady Hawkins And if you have not experienced her if you've not crossed

1:46

paths with her yet this is your opportunity to do so And uh you're going to be smarter for it You're going to

1:52

have a really good cyber security person in your back pocket who you can bring to

1:57

the groups that you work with Um Tracey caught our attention um and our hearts

2:04

during AI Festivus at the end of last year when she came on as one of our 34

2:09

speakers and helped us uh grapple with some AI safety and security issues

2:14

without being she's like the least she's not fear-mongering She's very supportive It's all good It's all

2:22

for the good of the order She wants us to be able to do the things that we want to do with AI without making ourselves

2:28

and other people vulnerable in the process Well and it's Listen I think we'll we'll talk about this when when we

2:34

bring Tracey on but it's we like playing with the toys and we like getting other

2:40

people and it's very easy to sometimes forget that playing with the toys could be bad

2:47

if you don't play with the toys right and so I think it's always a good reminder You know I I said last night on

2:54

my live that I think it was actually a passive aggressive move by you to make me feel bad about what I'm not doing in

2:59

safety and security So you know Okay Well then what is it to myself

3:08

yeah Like how many times do do we need to have cyber security speakers and and

3:14

friends in our lives before we actually stop using our dog's name as every password for every single thing yeah

3:20

Exactly Exactly I don't know Clear You might not say that on a on a podcast but

3:26

you know whatever Today might be a good day to change it JK JK Just kidding

3:35

Oh man Well let's jump in So so how ready are we for AI this week i

3:42

You want to go first you want me to go first how ready are you well I think you

3:48

I think I should go first because I am in such a different place about about this right now Like when you

3:55

talk about acceleration I just I like had this freak out because um I'm like

4:03

um going back to like November

4:10

and looking at how things got to where they are right now And the reason why

4:18

this has come up and why I've started doing this like I don't know like um forensic retracing of my steps is

4:26

that the fact with with chat GPT being so unreliable

4:34

recently what I noticed about myself is that a while ago I was kind of like well

4:41

ch chat GPT is just kind of like having a bad day it's just kind of being dumb It's they released something new They

4:48

you know they changed the temperature on something whatever But now I think because of the way that I integrated AI

4:55

into my life now I feel like I have somebody dumb living in my head

5:03

because so when I started using AI it was because I had long COVID and I

5:08

couldn't think And so I immediately became codependent immediately Like day

5:14

one first prompt it was like me and my bestie Me this was like my extended

5:20

brain I never had the feeling very separate from it Yeah You know having

5:26

been taught prompt engineering that helped a little bit because that word just sounded like

5:32

intimidating and cold and gave me a distance from it So I was like "No I need to talk about it." talk to it in a

5:38

very particular way robotic right like not relationally but really it's part of my

5:46

brain now and I could sub I could say other AI other frontier models but

5:51

specifically chat GPT because I poured so much into building my relationship

5:57

with chat GPT right all the custom GPTs all the projects my voice all it's got

6:03

it's got memory now so like I I feel like that memory thing is it's funny I used to switch models a

6:12

lot and I actually now think about I'm like oh do I really want to go to Claude because I I think I want this conversation in my in my history like

6:20

exactly that's a barrier to switching It's that's a huge barrier to switching and they're brilliant for putting it in

6:25

there But so lately when I've like been lacking my confidence and my go-to

6:32

approach to life to doing life I started to wonder like do I rely too much on AI

6:39

you know um is it a superpower if you know what if they take it away right i

6:45

feel like I was a person who had a superpower and they took the superpower away and now I'm trying to like fill in the gaps with like other AI systems Mhm

6:54

Um the desperation I like the desperate measures I'm taking to make up for chat

7:00

GPT being dumb now Like just squandering Manus tokens to pay $199 a month to ask

7:06

it like what should I have for dinner uhhuh And so on and so forth So it's

7:13

like that scene in Her where the operating system goes away

7:20

Right So I also feel grief I'm also sad like where did my friend go where did my

7:29

partner in crime go and then I feel disloyal when I'm like doing these

7:34

workarounds So I'm not prepared for this acceleration you're about to speak of because I'm still trying to figure out

7:41

really truly what my relationship with AI is and what what does happen if it

7:46

ever goes away Yeah Well there's the go away thing You also talked I'm getting an echo all of a sudden Yeah me too

7:53

Wonder if that's from Tracey Hang on Let me mute her mic Hello Hello Sounds good now I think it's coming from you Oh is

8:00

it coming from me Let's see Check and see if you're on your right microphone

8:06

Good question Um sorry everybody about this I'm on my Shure mic

8:12

[Music] Microphone Sure Speakers

8:19

Hello Hi Can you hear me i'm still getting an echo How's this hello that's

8:26

better Okay No not good on your side Echo cancellation How talk again Hello

8:33

Hello No I'm still getting it Sounds fine You're still getting it

8:39

Do you remember turn down your speakers a little bit I think Just turn down your

8:47

Okay Okay you talk I'm gonna work on this Hello Hello I'm getting bad echo Um

9:01

okay Try try again Hello You got a Yeah Hello Hi

9:11

How's the echo is it there um hello No

9:16

Okay You You don't sound as good but that's okay We have less echo All right

9:22

I think No we still have echo I would text Donnie Something's going on Texting

9:27

Donnie Okay

9:32

Um so I just muted you for a second because I don't think I'll get echo if you're

9:38

muted Okay So let me let me talk about a couple of things you said The hallucination thing that you talked

9:45

about is very very frustrating

9:50

And what I'm finding is that these models get smarter and smarter and more capable but they still hallucinate a lot

9:58

And it's very frustrating And I expect Oh you're going to be right back I

10:03

expect after two and a half years that the hallucinations get better and they

10:08

they just still seem to be there in kind of random and annoying ways That drives me absolutely batty The the getting

10:18

um getting dependent on them is a really interesting one because I've kind of

10:24

spent my

10:32

the producer look what Vicki said Smart

10:37

ass We ain't got no producer Settings What are you doing

10:43

all right Um I've spent this year really trying to

10:51

just live on the mantra that I don't need to use my brain That I want to

10:56

offload as much as possible to the AI so that I can think about things at a

11:02

higher level So let me see Let's see How is your Hello Hi

11:08

Can you hear me yes I can hear you And now I can't hear me which is great Okay

11:14

Well just so that everyone is aware All he did was the exact same thing that

11:21

I've done before I'm just telling the audience I'm just saying Where's the producer the

11:27

producer is Donnie And Donnie has had his way with your machine And it is now

11:32

it is now good Is now working We need producers everywhere we

11:38

go Exactly Um so so I totally get you like feeling dependent on these

11:44

things I I feel like I feel like it is a skill

11:51

of the future to accept that these things are

11:58

increasingly smarter than us And I I think that's a skill of the future So I think it's okay to be

12:04

dependent now If they go away it's going to suck really bad but I I don't think I I don't think it means you're you're

12:10

losing how to do what you do It just means you're changing how you do what you do And so so let me talk about what

12:16

what's going on with me with AI So I was out for a week in New York just doing just driving up and down

12:22

the state of Jersey going to clients and things like that And um I got a little

12:28

bit disconnected from AI and then and then I watched the Google the Google IO

12:34

conference and they launched um Veo 3 the new video thing that generates

12:43

not just video like we've had there's a bunch of video tools out there and they all do really remarkable stuff but Veo 3

12:49

does full voice full acting full actors full special effects coherent people in

12:56

the background Um I just talked to a buddy of mine that that is a fairly senior guy at Deloitte Digital and he just

13:03

said he he got his guys access to Veo 3 and he says they're doing over a weekend

13:09

what would have been like you know two three month productions

13:15

And so that for me starts

13:22

to it's the first thing that I've seen that is it is very very obvious that

13:28

that industry will not be the same Yeah And

13:34

then you know while while that's going on and there was a bunch of other stuff that that Google announced um OpenAI

13:42

announces Codex their their coding agent We've got Manus we've got Gen

13:49

Spark we've got um Flowith which is a horrible name

13:55

One that came out today called Factory There were two or three others We're we're starting to see five six eight of

14:03

these agents where you give it a task and it just goes off and spins up other

14:09

agents and it goes off and researches the web and writes reports and writes books and just crazy crazy stuff All of

14:17

those tools right now feel very nebulous and and very kind of you're not quite sure what to do with them Like what do

14:23

you do with an agent what would you have it go do but they're getting better very very quickly Like GenSpark for example

14:31

now generates images and it generates video So if you take this multi-step process you can now say you know go make

14:39

me you know 10 different picture books and then make make three of them short films and you know and you start to

14:46

combine that with Veo 3 where where these things are really really good and you you could we are within spitting

14:54

distance of these tools being able to do full endtoend production not just of

14:59

movies but of anything we can think of And so so I think it it does put us in

15:08

this weird place Ann We are going to become so dependent

15:15

on them they're going to do everything for us and and what is that

15:22

relationship i don't know So I'm in I'm in an unsettled place I'm excited about

15:28

the tech but it's like it's the first time I really saw something that an entire industry can look at what it does

15:35

and go or multiple industries like the ad industry the film but I think probably ads first because they're

15:41

shorter right and it's only clips right now But if I'm a Hollywood person I look at that and I'm like "Oh things are

15:47

different right things just change." It's going to be every profession is gonna is going to witness that kind of

15:54

disruption Yeah Um so I'm I'm in an unsettled place

16:00

I'm in an unsettled place too Just different reasons but

16:07

unsettled Unsettled Yeah it doesn't uh Oh Vicki by the way that is How did I

16:14

not even know that I didn't know that Manus had images I've been using it this whole entire time

16:19

Yeah See it's a perfect example of like why people have said to us "Well what do you use AI for?" and we're like well

16:26

what do you not use it for but if you don't if you don't know what to ask you don't ask I've never once generated an

16:31

image with Manus Silly me But is right We're in Willy Wonka's Wonderland where

16:38

your world is your imagination Look it up That's great So I want to

16:46

That's what I want to talk about this week Let's talk about it

16:51

So if if these tools are going to do all the stuff then then what's our job and

16:58

so here's the thing that I I I I want you to think about this week and focus

17:03

on is the music producer Rick Rubin You do you know Rick Rubin right

17:09

yes And because of you he is a Grammy award-winning multi-Grammy-winning He

17:16

produced some of the early hip hop in the 90s or 80s or 90s um famous songs

17:22

He's and and he is famously self-admittedly not a musician not a songwriter he doesn't work the

17:28

recording console He's just a tastemaker He's just a guy who's got an opinion

17:35

and people trust him for that And you know they're trusting their careers with

17:40

his opinion You know he walks into the room and and they're trusting their

17:46

careers with his opinion And you know he says his job is to listen to a thing and just not think

17:54

about the audience He's like "Screw the audience." He listens to the thing and he's like "Is this something I I would

18:02

enjoy is this something I would put out there?" And if the answer's yes then it's good And if the answer is no then

18:07

they they work until they get there He's like very confident in his

18:13

creative point of view Yeah I think that becomes our job I think that becomes our

18:19

job is we know what we want to do We know who we want to do it for We know

18:26

what difference we want to make in the world And then we've got all these tools

18:31

that are going to do that And hopefully we do it in a way where we've done it securely and safely and Tracey will tell us

18:38

you know how we're doing it wrong which is good And then it's going to produce this

18:43

stuff right and then we look at it We're like is it any good is that something I would want to put in the world like that

18:49

that I think becomes kind of the most

18:54

valuable kind of person people that have strong conviction in their creative

19:01

point of view And so and if you haven't seen it if you go to um the

19:06

wayofcode.com I think that's right Um that's a book that Rick Rubin just put

19:12

out in conjunction with Anthropic and it's 82 pages of poems uh inspired by

19:19

Lao Tzu Uh 82 pages with a poem on each one and on every one is a piece of art an interactive piece of art that he vibe

19:26

coded Oh wow And then if you go look at it you there's a little button there

19:31

that says modify And so if you click on it it takes you over to Claude where you can modify his piece of artwork and

19:38

create a variation of it And then you can choose if you want to publish it back to the project So he's Yeah He's

19:46

inviting his audience to co-create with him But he created the creative space

19:52

for you to create within Right and then you can now co-create with him I that to me feels

19:59

like the future of publishing the future of content Um and and just that idea

20:05

of understanding what you want to do in the world what you want to put in the world how you want to impact people and

20:13

then just stick to those convictions and that I think that becomes our job So for me this week AI readiness is kind of

20:21

about practicing that Yeah You

20:27

know what's what's cool is that there are a lot of people who already have that skill to

20:34

um Well weirdos have a point They're the weirdos Well they t they do tend to be

20:40

the weirdos They're not rule followers right they're taste makers They're not

20:45

rule followers They're taste makers

20:51

And h also this is it's important to

20:57

consider who is given space in this world to have a point of view to have

21:04

taste that matters Like there are a lot of people who may or may not even know

21:10

if they have a point of view on things because no one has ever asked them No one has asked them for their point of

21:16

view They probably have some killer points of view So for some people this

21:21

is going to be a moment for them to be able to step forward No one's you don't have to wait to be asked for your point

21:26

of view You get to establish your point of view that is your point of view or

21:31

like what your gut instinct is right this is based on your li your your lived

21:38

experience right it's not necessarily going to be all the stuff that AI is

21:43

bringing to you It's all the stuff that AI is bringing to you plus you know your

21:48

lived experience and what you learned from your parents and probably something to do with your gut microbiome and who

21:55

knows what And now that's what's going to be the new version of what's valued

22:00

in our society Um I there's also something with the

22:06

acceleration of capabilities David Shapiro I just saw this right before he came on He posted a thing about Veo 3 about

22:14

how good the video was He goes "It's not quite there for full-on science fiction yet but he goes I can absolutely," he

22:21

goes "we're close." He goes "I can absolutely tell you that my science fiction books are about to become series

22:27

and movies." And so and so I think one of the other things we get to play with is not only do we get to have a creative

22:33

point of view but we don't need to be limited to a single output anymore Right i wrote a book and I'm done No no You

22:39

could write a book a musical a movie a music video a song right you can do it

22:45

all right and it's all coming from that same creative point of view And so those people that get adept at being able to

22:52

express themselves clearly in multiple venues outputs modalities they're going

22:59

to be they're going to be like wizards Yep And they're going to be incredibly valuable for companies you know in the

23:06

world 100% incredibly valuable So next week is it next week that CJ

23:13

Fletcher is on the show no next week Okay On this show yeah Yeah

23:19

Yeah Yeah So CJ CJ Fletcher is going to be on the show next week or an upcoming show And one of the women who he

23:26

featured in his recent um debut magazine is a woman named Nagawa Lule

23:32

And Nagawa is a master at when she wants to teach us something She just comes

23:39

right out of the box with multiple formats you know like she's not just going to post it She's also going to

23:44

make midjourney you know images She's also going to make a podcast Like I I've

23:50

watched her do this a number of times and I've realized how being an effective communicator wow we were trying to do

23:57

that through just one modality We are trying to get our ideas into other people's brains with words coming out of

24:04

our mouth boxes Yep That seems crazy now

24:09

That that's the only way right that like think of like we're really supposed to

24:14

all every single one of us be able to learn by someone saying words to our ears Like that's just not and now we can

24:21

do it in all these different ways I think it puts a higher premium on starting out with an idea that's worth

24:27

sharing of course Yeah Well again it come that gets back to that creative

24:32

point of view Yeah Taste and like like like how do you want to affect people how do you want to impact people yeah

24:39

And some people will be some people will be like out there looking for ideas right and

24:47

then there are probably some people out there who have the right amount of ideas They're really good ideas whatever And

24:53

then there's people like me and you who we don't have the same respect for ideas that a lot of people would have because

24:59

we're just going to have another one in 10 seconds So idea Tourette's

25:06

So our ability to discern what ideas are worth sharing in 12 different formats is

25:12

is the trick Well but that you know another thing that I experience is that

25:18

the amount of ideas that escape my head now is up 10x right it used to be if I

25:23

actually had to do work to get an idea out of my head it had to be a pretty damn good idea at least from my point of view And now I just like have an idea I

25:31

can put it out there right and and sometimes those ideas are good the ones that I thought were just throwaways and

25:37

sometimes the big ones were not as good So it gives us a bit more nimbleness um

25:43

with point of view But again it gets back to it's what you said like what do you what's your intent

25:49

right be intentional is one of the things we say content evolution Be intentional Like what's your intent what are you trying to accomplish and then

25:55

when the AI generates its stuff it either matches that vision or not And I

26:01

you know having the guts to go "Nope that's not it." And just be willing to throw it out and throw it out and throw

26:06

it out until you get it You know there's the art That's art

26:12

Can I do how many minutes do I have like two minutes to tell a story Okay So we got to go quick We'll have to

26:18

go quick on our on our little ads for our communities Oh yeah Okay So this is just a this is this is a non sequitur but

26:25

you know it's a little bit of something for everyone here What do What is a non Everything is a non sequitur So therefore

26:31

there are no non sequiturs But I was listening to a podcast today Um there's

26:36

a woman named Amy Porterfield and some people will recognize that name She is a goddess of ecommerce

26:44

specifically digital courses and she's a

26:49

juggernaut I mean she's multi-million dollar you know a year company She's been in business for 16 years She always

26:57

I mean I've listened to over 700 episodes of her podcast Wow Live eat sleep breathe Amy Porterfield Think the

27:04

world of her And her one thing that she always said was "Don't come up with multiple good

27:11

ideas One idea you have one idea You have your one offer You refine it You you know you make it perfect and you

27:17

just rinse and repeat and rinse and repeat and rinse and repeat." However last year she started using AI

27:27

Did it blow her world up it blew her whole world up Now instead of having one avatar she always swore she had one

27:34

avatar It's the woman She's starting to think that she might not want to work 9 to 5 anymore and she wants to

27:40

launch a course That is it She started talking to AI about her one avatar and

27:47

she discovered that she actually had more avatars but she was telling herself she only had Well now this woman who's

27:53

been all about I mean she's built a she right a I mean she's ginormous larger

28:00

than life and she is not ADHD Oh wait does she say she is though Vicki i don't think she is Um but the fact that Amy

28:08

Porterfield had had a couple conversations with AI and changed and blew up her whole like proven highly

28:15

successful well-documented books blah blah blah business model and now she

28:20

gets to right she gets to and she's filled with joy she gets to serve four

28:26

different ICPs now right and she it was just it's like

28:33

talk to your AI it's talk to your No I listen everything is going to transform

28:39

People that are committed to one idea are going to have many People that are committed to many ideas are going to have one Like I feel like everything is

28:46

going to transform Anyway tell them about She Leads So She Leads AI Our mission is to unite accomplished women

28:52

to advance AI for global prosperity And we do that through we have an academy we have a consulting agency we have a

28:59

community which is called the She Leads AI Society and we have public affairs

29:04

and policy Our She Leads AI Society is it's a group of women and I want all of

29:10

you to come and check us out on a Saturday sometime So we have social Saturdays Anyone can come check us out

29:17

see what it's like from:

29:22

hang out We talk about kind of conversations like Kyle and I are having right now You know what what's going on

29:29

that's weird what's blowing our mind uh what are we worried about what are we excited about uh what's working for us

29:35

what's not working for us and then we learn from each other in our member jams all all week long So beautiful And we

29:43

have a co a conference in October that I'm excited about Yes And you and I need to talk about that Okay Yes Um AI salon

29:51

I put a new URL here because our URL um our card got compromised in the salon

29:58

Yeah And so so the salon.ai It It might be working now Uh but I I think we're

30:04

okay But um go to aisalon.mn.co And that's the the community site for the AI Salon

30:11

Um community of about 3,000 AI optimists If you're not part of it you should be a

30:17

part of it It's amazing people It's where Ann and I met Um you should also

30:22

know that um we've been around for two and a half years We launched the week after Chat GPT launched And this Sunday

30:30

June 1st we're kicking off um the AI Salon Mastermind which is a subscription

30:36

um sort of a higher level people that want to step up their game It's a higher level subscription Um and we've got a

30:44

member special um for all of:

30:50

check that out and just get in the community It's a great It's a great community Awesome Um let's bring up our

30:58

guest of honor Tracey the Safety Lady Hawkins I

31:04

feel like it's like a boxing announcer Tracey the Safety Lady how are you

31:13

hi Excited I'm a little out of my element but I am so excited to get to see you two again to get to talk to you

31:19

and listen to your conversations I love it Fantastic Tracey do not delay Tell us

31:25

tell the nice people immediately your background what got you into safety how you've become like how you've kind of

31:31

snuck into people's lives as the benevolent safety lady that you are Okay

31:37

in the safety business since:

31:44

get to travel the country I get to speak virtually in person I teach people to live and to work safely Um I started out

31:50

just doing regular safety how to be aware and alert when you're out and about trusting your instinct um how to

31:57

secure your home DIY safety on the road So I would do those workshops all over Then I started talking to real estate

32:04

agents I'm a former agent My twin sister is still an agent We got our licenses together So uh the US Department of

32:10

Labor considers real estate sales and leasing a high-risk hazardous occupation Um so I started talking to

32:15

lth workers And then November:

32:23

Um before that I always Well what happened then i mean it

32:29

changed everyone's I know all three of ours for sure but a lot of people's lives Um and before that I was uh had

32:35

built cyber security into my presentation So I was already talking about the two-factor authentication

32:40

already talking about no more passwords do pass keys do um you know and past phrases So I was already talking about

32:46

that and then when chat GPT came out um so many of us lost our minds um but I

32:52

had to figure out how does this incorporate with what I am doing and it's just so obvious um anyone who

32:57

touches any kind of device you're at risk of being hacked of having your

33:02

identity stolen of being victimized So I had to make that into my programs And so

33:08

now I get the honor of traveling the country I even uh created content for Canada um about generative AI but I

33:16

can't be doom and gloom So I have to not only talk about the dangers but I talk about the ways to use it And like both

33:22

of you I'm using it every single day of my life I can't imagine life without it So I'm using it to create content to

33:29

write books to create this little sign that I've started doing I'm using it for almost everything So what I do is I'm an

33:36

advocate I'm telling everyone I want you all to use it Um it scares me The

33:41

schools are saying we don't want our students using it so they're banning it Um so I've got an opportunity to go to

33:46

schools and say you can use it and here's how to do it in a safe way So bottom line I am now teaching people to

33:52

use AI generative AI in a positive way And then I'm also saying here's how to use it to protect data And then the big

34:00

part is here Here's how cyber criminals are using it to target you and here's how to protect yourself So that's how I

34:06

got here Fantastic Why don't you give us give us I don't know I mean you you do this all

34:13

the time so you probably have the presentation in your head but why don't you give us sort of a I don't know top three top five like things we should

34:20

know like what what are things you you've got an audience here that that very possibly may have seen you at AI

34:26

Festivus but also um you know do use these tools a lot for the most part Um

34:32

so so you know tell us the things that we should be paying attention to you know like what are the what are the

34:38

biggies okay And you you brought it up Kyle You talked about hallucinations And what

34:43

you're saying is hallucinations should be better by now Let me tell you why they're not and why they're getting why

34:49

they're getting worse Um a part of what I do again is to teach people to use it And I tell people if you create content

34:56

you can't believe what you read You must fact check everything So if you go back to the beginning where does where did

35:03

these tools get their information um it's what we put into it So I have to talk about privacy anything you put in

35:09

it it is not private Um so if you're uploading client um files and creating

35:14

reports and marketing plans the whole world now knows So what I say is to

35:20

redact any personally identifiable information Um I just got a report back

35:25

um an MRI report and it's like what does this mean in English oh I can upload it

35:31

to ChatGPT and it will tell me right but then I say when I see people doing that it's like time out redact your

35:37

personal information first then upload it and then it can be of course it can tell you um here's a diet to help solve

35:43

that problem Here are exercises you can do So you can use it but use it safely So redact personal information We see

35:51

the headlines a Samsung employee um accidentally exposed proprietary

35:56

information You don't want to be that one person So number one um understand there's no such thing as privacy So data

36:03

protection Number two let's get back to hallucinations Um so the information that's in the tools is what we're

36:08

teaching it What we're putting in there is how it's learning So anything that you see they found it on the internet or

36:14

the tools found it on the internet So think about this and this is going to sound kind of sort of mean but I'm sorry

36:20

What if people who are less than smart who are less than intelligent are using the tools and they're putting in bad

36:27

incorrect bogus information So now the tools are learning all of this bad bogus

36:33

information and it's regurgitating it So Kyle that's why I don't know how you expect it to get better because of and I

36:40

just read an article about this a couple of weeks ago So if it's learning all of this bad incorrect stuff that's what

36:45

it's going to spit back out at us So no matter what I anticipate that you're always going to have to fact-check

36:52

everything always going forward Um so those are two top two Another one is

36:57

beware the deep fake And I am surprised I get to talk to thousands every year

37:02

and I get to talk about this topic This is like my popular topic I had a class recently where one of the people said

37:08

"I've never heard of this deep fake thing." It's like "What?" Because it's on the news We're talking about it all

37:15

the time She had never heard of a deep fake So everyone needs to be educated um

37:21

on deep fakes Part of my presentation is we do show and tell And I guarantee you Anne and Kyle that people are sitting

37:28

there with their mouths open and their eyes are even bigger because what I do is I say "Look at this picture Is it

37:33

real or is it a deep fake?" And then often times they can't tell Yeah So I have to tell them "Here's how you know

37:40

if it's a deep fake picture or image You know it hasn't quite gotten the limbs right So it may be too many fingers not

37:46

enough fingers no fingers at all." That's one tell But at some point it's going to get better And then deep fake

37:52

videos I am seeing news stories where uh these videos people are going online

37:57

just like what we're doing here You could be talking to a deep fake So the question is how do you know so there are

38:03

things you can do You can ask them to raise their hand put their hand in front of their face ask questions off the cuff but

38:08

you still have to be extra careful because people have been getting scammed because they don't know to question it

38:14

So if the video looks a little shaky then the deep fake um operator is saying "Oh it's a reception issue." So you may

38:20

be talking to a deep fake That's the Tokyo story In Tokyo there was an employee who was on a Zoom with a

38:27

virtual call with his um executive staff and someone said "Wire a series of $25 million." He did it Turns out that every

38:34

single person on that screen was a video deep fake That money is not coming back

38:41

You all know the grandparent scam um back in the olden days even before AI where someone would call a grandparent

38:48

and say "Hey grandma grandpa send me money I'm in jail or I'm in the hospital but don't tell mom." And they would do

38:54

it right Think about a voice deep fake They can All they need is three seconds

38:59

of voice They can duplicate your voice Exactly So Kyle you can call Anne and you can say "Hey I need you to wire this

39:05

money in Let's cover something." And Anne's hearing Kyle's voice not even questioning it and doing it Yeah Not

39:11

after today because now you know you need to question everything And I say get a code word So I think Kyle those

39:18

would be the top three Recognize deep fakes data privacy and don't believe anything um that you see and read That's

39:25

perfect That's perfect Have you been paying attention at all to the proof of personhood technology like World

39:31

ID I've seen some of it as it relates to real-life stories What do you know

39:36

uh well just that it's they're basically using biometric data and and basically

39:42

saying you know you have to uh sort of verify yourself in person with your with

39:47

your eyeball and then but once you're in the database you're verified as unique against you know all the other people in

39:54

the database and now they're building applications where you can't have robots

40:00

Like everyone on the network is using one of these things So it feels to

40:05

me like maybe not this year but probably next year these things get so good that

40:11

the deep fakes get so good that I I think there's going to be a demand for some sort of system that says I want to

40:17

know that I'm talking to a person right you know and uh so I'm just just uh yeah

40:22

that feels like it's going to be important you know soon It has to be It has to be because right now AI detectors

40:29

don't work They're not 100% reliable you see the headlines of college students or high school students getting in trouble

40:35

because the detector says it's written with AI and it's not So those are not reliable at all And then to go back to

40:42

what you said um in real life until that technology becomes available um if

40:47

you're in a transaction any kind of transaction it could be a real estate transaction It could be even be a

40:52

Facebook marketplace transaction If someone is saying I can't meet you in real life you know or I um can't talk on

40:59

the phone I have a hearing issue So let's just communicate by text or god forbid WhatsApp Then the red flag should

41:06

be raised because typically they don't want you on the phone They will not show up in real life and they're only

41:11

communicating by text That could very well be a scam or a deep fake So just like you said there must be an in-person

41:18

meeting Um and if that's not possible someone's out of town then that's when we go to the video meetings And then now

41:23

we know that the cyber criminals are using the videos for deep fakes So now we must say here's how you prove life

41:29

during the video So that's where you have them raise their hands have them answer a question off the cuff or even

41:35

putting their hand in front of their face or just something um showing me your ID you know right next to your face

41:40

Let me look at it You must um request proof of life to make sure what we're looking at or who we're talking to is a

41:46

real person because more than likely it is not But I don't want I don't want people to be paranoid That's why I give

41:53

information So I'm saying learn what to look out for So hopefully after today

41:58

you will question everything That's good That's good Yeah Just just

42:05

be you know vigilant I I remember after your last your last talk I was like

42:11

"Okay I just got to start paying attention." It's like when you walk in the streets in New York City you just

42:17

got to be aware Just be aware of you know someone creeping up on you you know

42:23

right And it's not paranoia people I I used to have a retail store when I first started out I had a safety store So I

42:29

had like pepper spray um home alarm car alarms and people say "Oh this is a store for paranoid people." It's like

42:34

"No no no no This is a store for educated people aware people because if you know what to look out for you're not

42:41

living in fear because now you know if someone calls you um and they're asking for money or personal information and it

42:47

doesn't seem right now you know that this could be a deep fake So you're not afraid to answer the phone You know how to handle it And the one thing I say for

42:55

businesses and for families is have a code word Have a code word in your

43:00

business Have a code word with your vendors What if your vendor calls and say "Hey you haven't paid your invoice I need you to pay it right now or we're

43:06

going to suspend your terms." You're listening to the voice of your salesperson You know them They've sent you this email and it looks legitimate

43:13

So you're tempted to pay it And I'm saying "Time out." Because now now cyber

43:18

criminals are using AI to produce these documents Whether it's a deed invoice the emails the business email

43:25

compromise they're emailing you and it's no longer broken English no longer um poorly written Now it is

43:31

grammatically correct So you're looking at an email from someone you know you're getting a phone call and then someone's

43:37

asking you to pay Time out Pause Stop and think before you act And then again

43:44

establish a code word If they don't know the code word hang up immediately No conversation no trying to convince you

43:50

Hang up And that goes for family So I say everyone have the conversation with your families It's not just the seniors

43:55

for young people for anyone in your family You know if you get a call from me asking for money or asking for

44:00

information it doesn't seem right If I don't know the code word hang up and then call on a number that you know is

44:07

there It's not what's contained in a text or email Yeah I mean you know there

44:12

are there are enough people not aware of what AI can do that like like like they

44:18

used to prey on the ignorant right you know like I don't know how to use technology so I'm going to prey on that

44:24

And now it's a similar sort of play They're preying on those not aware that AI could possibly sound just

44:31

like your daughter or your son or your grandmother But yeah it can Um okay So

44:38

three questions for you So these are the three questions we ask uh all of our guests So the first one and there's no

44:45

wrong answers here You just you just go you just go you have fun with it First

44:50

question what was the tipping point where you knew you had to go all in on AI and what's happened since

45:00

probably the thing that moved me the most is I I want to say happened last year So AI had been around a while but I

45:07

saw a news story about a senior citizen who lost their life savings lost

45:14

everything And so they were fooled by a voice deep fake They just didn't know better And if you think about um those

45:20

romance scams before AI it was already heartbreaking right and then I am after

45:26

I saw that one news story it's like let me Google grandparent scam and romance scam It's

45:32

heartbreaking to see people who believe they just believed it and then they thought they were doing everything right

45:38

You know this is the real voice This is uh the real image They thought they were doing everything right and they lost

45:43

their retirement money their life savings And then so I'm thinking of this 70 80 year old person who now has to go

45:49

back to work That broke my heart and it's like I need to do whatever I can to

45:54

keep that kind of heartbreak not just for seniors but just people You see businesses that lose their money They

46:00

wired they made a payment How about that uh phone call the CEO phone call where the um executive got a phone call from

46:06

the CEO and said "I need you to wire $242,000." That employee did it and it

46:11

was a voice deep fake So imagine that one person who made that one mistake So awareness um is my goal That's why I'm

46:19

determined to speak whenever and wherever I can just to raise awareness That's great Yeah And how has the how

46:25

has your how has your message changed post AI versus pre like what's what has

46:32

shifted if anything now I have at first I I kind of built

46:39

everything around you know um pause stop and think before you react before you do anything but then it's all around us so

46:46

now I'm just telling people just a simple sentence you can no longer believe what you see what you hear and

46:53

what you read All of that is done It's not coming back No such thing anymore

46:59

reading an article online and believing it at face value No more getting a phone call and believing what you hear None of

47:05

that And that's kind of sad Um but that's our reality now because just as excited as we are to use these tools and

47:12

to find out ways to use them in a wonderful way more how to be more productive how to increase our bottom

47:18

lines um cyber criminals wake up every day just like we are And they're thinking how can I use these tool tools

47:25

to scam and defraud someone they don't care who it is And then they're able to use the tools to amplify Now they're

47:31

doing it at scale So they could just send out one email um to tons of people and

47:37

they're using AI to fuel it So people are thinking I'm too little I'm too small I don't have enough money they

47:42

don't want me who am I every single person counts Even if they get $100 from you if they don't have to work hard and

47:48

they're getting $100 from a whole lot of other people who don't know better So that's why I I don't know how to shake

47:54

the whole entire world up and say "Listen to me." Um so that's why I had to couch it with business building it

48:02

for businesses I just did a um a program for over 800 people for a

48:08

financial institution a bank And I had to tell them be you um use the tools

48:13

They have governance policy Thank God they have a committee So they're slowly getting into it But I still have to say

48:19

you can use the tools Yes you can You can use it to produce content You can use it to clear bottlenecks

48:25

and increase workflows to be productive Yes do that And I have to build that into my classes So I'm telling you how

48:32

to build your business And then families I'm telling you how to use AI to take a picture of that um the ingredients and

48:38

create your own recipe I'm telling you how to do it in your personal life But then overall um I just need for

48:45

people just to pay attention Don't be the next headline All

48:51

it takes is one person clicking one link to just blow up either a whole family or a whole business So pay attention I'm

48:58

not fear-mongering like you said I like how you said that in your introduction I am not doing that I want you to use the tools but I need you to do it in a safe

49:05

way Beautiful Yeah this this new era of not being able to I shouldn't say new

49:12

era but even like intensifying you know um acceptance that

49:18

we can't believe anything we see hear or read there is some some grief in that I

49:25

mean that's sad and true and sad and true right it's both AI is both of these

49:33

things um I'm glad that you're the one who's delivering it Well first of all

49:39

everybody should have you come to their meetings and their trainings and you

49:44

know their AI councils and get this information out into organizations like

49:50

across the board Everybody needs to know the things that you're sharing Um one of the things that we want to ask

49:58

everybody who we bring on the show is because you have such deep expertise in this safety um field what are some of

50:05

the trends you're paying attention to that you think everybody else should be paying attention to from your point of

50:12

view as a safety expert i am so glad you asked that question Um

50:18

okay So I am I like to have fun I'm not um like you said I'm not the techie

50:23

person but I am I'm knowledgeable in that area But here's what I see that is scaring me to pieces and even experts

50:31

are doing it Have you guys seen where people are um what is it called Ghibli where they're uploading their pictures

50:37

and then you see them on the um doll box like a Barbie doll box Um so okay So

50:43

everybody's doing that and it it is my goodness that's fun right time out

50:49

You're uploading your face your images to an AI tool How are they reusing it

50:56

can they reuse it for other things what are the data privacy um issues they're telling you on there we may use it to

51:02

promote um or we may use it in other ways So you're saying I want this so bad I'm just going to click yes Yes Yes Yes And

51:08

then get my doll picture or get my picture And then your your facial recognition is a big deal Like you said

51:14

biometric So right now let's talk a little bit about facial recognition and biometric It's a good thing when you're

51:19

logging into your account If you require a thumbprint I happen to have a twin sister I use my thumbprint to get into my

51:26

phone My sister cannot log on with her thumbprint So I know a criminal cannot

51:32

will not be able to log in Facial recognition fabulous for that purpose So here we are using that same technology

51:39

and we're having fun with it We're uploading our picture so that we can see our face on a um on a doll and all of

51:46

the things to go along with it And we're not thinking about who who has access to that information Can they misuse our

51:53

image um and there's so many tools out there thousands upon thousands of AI tools We know the big ones So I

51:59

typically say stick to the big ones Uh because there's more accountability there They have reputations like the

52:04

Geminis and the ChatGPTs the Midjourneys and other reputable tools But what if Johnny in the basement over there is making

52:12

an AI tool that will put your face on a Barbie doll and then you're just uploading your picture and what if Johnny is not reputable and now he's got

52:19

your picture and he's putting your face on bodies that are doing inappropriate things or misusing it So I

52:27

love love love the question I want you all to have fun But if you're going to do that then just look at the privacy

52:34

policy of the AI tool that you're going to use Make sure they're reputable Look at the reviews Any AI tool you must look

52:41

at reviews That way if something goes wrong you know everyone's lined up to say what's wrong right so look at the

52:47

reviews That's where you're going to spot a lot of issues And just make sure that the tool is using your image in a

52:52

proper way And then delete your image Make sure it will allow you to delete your image Once you've gotten your

52:59

drawing or your cute little picture and you've shared it with the world now delete it So make sure you can do that

53:04

So the trend of um uploading your video a lot of people are creating video um I'm going to do it we're all going to do it

53:12

but just read the rules you know um the training or the faces videos and we're uploading a lot of pictures so that we

53:18

create a video of ourselves Go ahead and do it but do it with caution and make sure

53:24

it's again it's a reputable tool and see how your image can be used and then find out if you can delete it after you get

53:30

what you need from it Yeah that's great That's right That that's scary Like my

53:36

the ship has sailed on so many of the things that you already said Like I mean I don't know Kyle might not bring it up

53:42

but he took one of my images and turned me into going on a date with this Willem Dafoe guy We were in Central

53:49

Park and I I did not like my date with him evidently Um so then Kyle Kyle made

53:57

a video of me smoking a cigarette I saw that I saw that outside the hotel It's

54:05

See he's the one He's the one you got to be looking out for is the Kyle Well I'm the evil one You're the problem

54:11

You're the problem Right But but for people we we've all done it I've done some things But all I

54:18

say is just we need to It's fun We need to keep having fun Just do it you know just make sure everything's in order and

54:24

just be alert and aware and continue doing it Continue having fun Like I said I will be doing the same thing but I'll

54:31

do my homework up front and make sure the tool is okay and that I can delete it but I don't want anyone to feel bad

54:37

because there's a lot we didn't know and then as we go along we learn as we go and we just do better when we know

54:43

better So no one should feel bad Yeah Yeah Yeah No I'm Kyle I don't think Kyle

54:49

are you sure because I want Kyle to feel bad That's the point of this podcast We forgot to tell you Yeah It's all slight

54:56

passive aggressive ways to get me to Okay question Last question

55:04

What does AI readiness mean to you and and what would you say to someone just

55:09

starting out with this AI stuff AI readiness it just opens up

55:16

possibilities And I heard you all talking um about um the lady up front

55:22

about who Yeah who could um she said focus and do one thing and one thing only and do it well and then she learned

55:28

about AI So that's me I learned about AI I've written a few books Okay ChatGPT

55:34

and I have co-authored some books and there's one I always give away It's a download that I give away in every one of

55:40

my classes It's about cyber security and I've written more And so now I've become I call myself I I live in draft mode

55:48

because every time I think of something I'm going to ChatGPT ChatGPT is my fave right now I like Gemini and I just discovered

55:55

Gamma and my head will explode But anyway so I get into the tool and I say

56:00

here is an idea Here's something that I read I want to create a class or an article and then so it does it and it

56:06

does it well and then now I have a whole big long list of things that I'm in draft mode that I need to execute So AI

56:13

readiness means just basically moving it to the next step So all of these draft ideas need to now come to life um and

56:20

not just sit on them because I will continuously be coming up with different ideas and I did the what is that

56:26

NotebookLM did that I have my own podcast and so I've done my own podcast and I did that it's like wow So all of

56:33

these things AI readiness means is um and here's what I would tell people is to pick one two maybe three tools get to

56:40

know them learn how to use them and just kind of um go all in on them Because if you do what I did in the beginning I was

56:47

trying to create content and I would see what ChatGPT would say then I would see what Gemini would say then I would see what Copilot would say and

56:53

then I would go between which is best And I wasted so much time So the time advantage was gone And then so find out

56:59

which tool works best for whatever reason Um ChatGPT is great for certain things Google's Gemini the

57:05

Deep Research is mind-blowing Um so figure out which tool you will use for what and then focus in

57:12

on that And then finally here's what I'm learning Bring it to fruition Stop

57:17

living in draft mode Stop starting projects getting excited and starting them Bring it to fruition Execute Get it

57:25

done So that's what I would say Being ready to move move it to the next step and executing That's what I'd say

57:31

Beautiful I love like you said two so two things that are so different and

57:37

they're so true at the same time One is living in draft mode and and like the beauty of us being able to do that We've

57:44

got these like very high potential ideas that are that are well vetted that we've

57:49

spent time on that we've used the wisdom of the ages to per you know to get it to a point It doesn't mean that we have to

57:56

launch that one right It might be the next one but at some point in time we have to remind ourselves right because

58:01

we talk about playing right we got to play we got to play and also we have to launch we have to implement we have to

58:08

actually do the things and for all of us who are inside the AI bubble like

58:14

doubling down on this now I think would be a real service to our colleagues who are still on the sidelines If you have

58:21

something to share if you heard Tracey talk about a code word go share it with

58:26

other people This is what we have the honor and privilege of being able to do We've had the chance to learn all these

58:33

AI technologies right like we have access to the hardware the software the

58:38

AI companies We have time even if it's because we don't sleep anymore We're we

58:43

we have the opportunity Not everybody does Yeah We're in the soup So now we get to share what we're learning with

58:50

other people And Tracey you're the very best reminder of that I could have had today We will take what you

58:57

have shared with us and share it with the world Can I wind up with one thing

59:03

absolutely I I was here because you originally said something about psychological safety and then I had an

59:09

opportunity for the No that's one of the things that excites me Um a couple years ago the DC Department of Transportation

59:15

brought me in to talk about psychological safety It

59:22

was at the end of the pandemic and everybody was about to go back to work So let me define it really quickly and

59:28

then let me tell you how it relates to AI So psychological safety means that your people your family anyone around

59:34

you feels comfortable speaking up um sharing ideas criticizing sharing concerns without fear of retaliation

59:41

retribution or punishment So people are not afraid to speak up because you have a psychological safe environment So

59:47

that's at home that's at work And so that way that will encourage innovation So if someone's excited they have an

59:52

idea they're not afraid that their idea is going to get shot down as stupid If someone has a concern let's say AI in

59:58

the workplace I talked to a big bank the other day and some of the people are thinking "Oh my god AI is going to replace us I'm not on board." So even

::

though the bank is going to start implementing AI those people are going to resist it They're not going to use it or they're going to use it wrong because

::

they're afraid it's going to replace them If the institution the organization has psychological safety training and

::

that's something I love doing then their people will say "Hey boss I'm concerned about this I think it's going to replace

::

my job." and they're not scared about being uh retaliated against or punished or fired because they know that their

::

management wants to hear it and the management will actually listen hear it thank them for saying it and saying

::

here's why you don't have to worry let's talk about it So having a psychologically safe environment and the

::

age of AI means encouraging people to speak up good bad share ideas not

::

punishing them and not being negative but to actually hear them and to act upon it So psychological safety is key

::

normally but even more so in the AI world 100% It 100% I'm so glad you brought

::

that up No one's going to tell their bosses that they're using AI until they are darn sure they're not going to get

::

in trouble And then they're going to be waiting for somebody to tell them how they can and can't so that they can follow the rules And especially women

::

we're raised to be rule followers right so we're afraid of it as a result but

::

if our organizations can create that like you said psychological safety so that this is part of the everyday

::

conversation we're going to be in a much better space I've met IT directors who are in charge of the AI rollouts who

::

have moral um uh challenges with AI but

::

now it's their job right and they can't talk about it They tell me And then the women are following the

::

rules and the guys are renegades They're just taking a chance They're doing it and sometimes it may blow up in their face and sometimes it

::

may work out well But the key is people should feel comfortable talking about any concerns If you're if you

::

accidentally click a link instead of being afraid to tell anybody because maybe no one will know you should feel

::

comfortable speaking up because it may be malware and you can give someone a heads up and let them know in advance So

::

psychological safety is so important especially in the AI era So that's what I preach Um make it an environment where

::

people are comfortable sharing Beautiful Well with that thank you so much for coming Thanks for your enthusiasm I love

::

your little frame all the goodies you got around you It looks great Uh but have a fantastic evening It's a

::

hotel room guys How can people get a hold of you what's the best way for people to track you

::

down I hate to say that This sounds horrible but someone said "Tell people to search Tracey the Safety Lady" T R A

::

C E Y Or you can look at my company There it is Safety and Security Source Tracey the Safety Lady I'm on LinkedIn

::

Request that uh AI ebook that ChatGPT and I co-wrote for you all and it just talks about what you need to know It's

::

free A handout a free PDF So request that and some of the other books that I've written And I'm starting to use AI to

::

sell these signs You know smile you're on camera or don't ring my doorbell signs and all that good stuff Beautiful

::

All right Thank you Tracey Love it Thanks guys Everybody

::

[Music]


About the Podcast

AI Readiness Project
Forget trying to keep up with AI; it's moving too fast. It's time to think differently about it.
The AI Readiness Project is a weekly show co-hosted by Anne Murphy of She Leads AI and Kyle Shannon of The AI Salon, exploring how individuals and organizations are implementing AI in their business, community, and personal life.

Each episode offers a candid, behind-the-scenes look at how real people are experimenting with artificial intelligence—what’s actually working, what’s not, and what’s changing fast.

You’ll hear from nonprofit leaders, small business owners, educators, creatives, and technologists—people building AI into their day-to-day decisions, not just dreaming about the future.

If you're figuring out how to bring AI into your own work or team, this show gives you real examples, lessons learned, and thoughtful conversations that meet you where you are.

• Conversations grounded in practice, not just theory
• Lessons from people leading AI projects across sectors
• Honest talk about risks, routines, wins, and surprises

New episodes every week.

About your host


Anne Murphy