Episode 18

Published on:

9th Jul 2025

The AI Readiness Project with Guest Tyler Fisk

𝗧𝗵𝗲 𝗣𝗿𝗮𝗴𝗺𝗮𝘁𝗶𝗰 𝗕𝘂𝗶𝗹𝗱𝗲𝗿’𝘀 𝗚𝘂𝗶𝗱𝗲 𝘁𝗼 𝗔𝗜 𝘄𝗶𝘁𝗵 𝗧𝘆𝗹𝗲𝗿 𝗙𝗶𝘀𝗸

Join hosts Anne Murphy (She Leads AI) and Kyle Shannon (The AI Salon) for a high-energy, practical show with Tyler Fisk, CEO & Co-Founder of AI Build Lab. Tyler’s been “building the plane while flying it” for over 20 years—and now, he’s helping entrepreneurs do the same with AI. From greenhouse manufacturing to e-commerce, Tyler’s approach fuses operational grit with clever automation, and he’s got the chicken math to prove it.

In this show, Tyler shares what it really looks like to turn AI into a revenue-generating machine—not a distraction. You'll learn why he always “shows up with a mockup,” and how building bespoke AI agents before the first sales call can dramatically shift the room. With thousands trained through his TOAST Method and a track record that includes results for Amazon, Microsoft, and laid-off marketers turned AI founders, Tyler brings a blueprint that works.


𝗪𝗵𝘆 𝗔𝘁𝘁𝗲𝗻𝗱?

  • Learn Tyler’s TOAST Method and how it helps businesses move from “trying tools” to scaling outcomes.
  • Get a behind-the-scenes look at how AI agents are built, tested, and deployed—without code.
  • Hear success stories from students who’ve landed $25K–$100K deals mid-course and reduced process time by up to 95%.
  • Explore how a balance-first mindset—Libra-style—beats hustle culture for sustainable growth.
  • Walk away with ideas you can test immediately, whether you're running a one-person shop or leading a team.


𝗢𝘂𝗿 𝗚𝘂𝗲𝘀𝘁:

Tyler Fisk is a Cookeville-based serial entrepreneur and AI workflow architect who co-leads AI Build Lab with educator-in-chief Sara Davison. Their flagship course, How to Scale a Business with AI & Agentic Workflows Foundations, is one of Maven’s top-ranked AI programs. Tyler’s unique mix of creative instinct and operational discipline makes him a trusted advisor to founders and teams across industries—from agritech to restaurants to Fortune 100s. Off the clock, he’s busy chasing toddlers, fixing chicken tractors, or quoting Back to the Future.

Transcript

0:03

Forget trying to keep up with AI. It's moving too fast. It's time to think differently about it. Welcome to the AI

0:10

Readiness Project, hosted by Kyle Shannon and Anne Murphy. They're here to help you build the mindset to thrive in an

0:17

AI-driven world and prepare for what's next.

0:25

And we are off. Anne Murphy, what's happening? And we are off. We're off and

0:30

running. Well, what's cracking? How AI ready are you this week?

0:36

Uh, you want to jump right in? Yeah. Um, how AI ready am I this week? This is

0:44

really good. This is really good. Um,

0:50

I would say I am discombobulated.

0:57

I am discombobulated and I'll tell you why. Let me switch a couple of things here. I

1:07

So, I don't know. A bit ago, we talked about Veo 3, the video model, and

1:13

how good that was and how it felt like, you know, something truly disruptive.

1:19

And I couldn't imagine that another video model would come out, especially one that didn't have voices and acting,

1:25

that would actually be something I would be excited about. And the new Midjourney

1:30

video model came out and it is um it is uncannily good. It's not perfect,

1:38

but it seems to understand like the artistic intent of the image as

1:45

well as the physics of the objects in the image as well. And what I mean by

1:51

that is like I had it animate one image that was just like a flat illustration and all of the animations were like on

1:57

these flat planes and then there were other things that had more 3Dness to them and it made them 3D. It seems to

2:04

interpret the intent of the image as well as the objects. And that for me I

2:12

you know what it almost is—every time a technology drops, it feels like

2:22

literally anyone can get amazing results out of this like without having to know how to prompt. Like you literally push a

2:28

button that says low motion or high motion and it just makes it. You don't need to prompt it. You don't

2:33

And you don't even need to know. I don't even know what that means, Kyle. It doesn't matter. I don't know what low

2:39

or high motion means. Those are words that don't whatever. I mean, does it mean fast walk? I don't know. Who cares?

2:46

Press one of them. But you press one of them and then and then the results are incredibly sophisticated. And so every time one of

2:53

these drops, it kind of throws me back on my heels because where I go is I go to kind of future of work and what does

2:59

this mean? and and and you know the a lot of the tropes out there, there's a I'm starting to hear a lot more noise

3:06

about AI is evil. AI is going to take our jobs. There's a lot of noise out there about everyone leaning into

3:14

the negative impact of AI. And then I look at something like this thing from Midjourney and I'm like, but

3:19

wait, what about the positive impact of it? because people are going to be able to make incredibly sophisticated things

3:25

to do for work or friends or life without having to really think about it and do just amazing

3:33

ways of self-expression, and I find that inspiring. But

3:40

there's something about it like when it hits another level it it throws me. And so that's kind of where I am is I'm

3:46

feeling a bit thrown, but it's like, oh yeah, these things are going to keep getting better and better.

3:51

Like all of the stuff that we kind of apologize for and we're like, well, you can prompt your way around that. Like,

3:57

we're not going to have to prompt our way around much for much longer.

4:03

Yep. Fair. How about you? Where where are you with all this stuff? Well, I'm on cloud nine

4:09

this morning because I like broke through all these like mental mindset

4:15

and what I thought were skills barriers yesterday because I gave myself seven

4:22

hours to play. Oh, when I had no business. I mean, I have a million things on my to-do list.

4:28

You just didn't do your agenda stuff. You just played. I just didn't do it. I just didn't do it. and my Slack was blowing up and

4:35

there were all these things growling at me on my to-do list from the corner, and I just said, f it, I am going

4:42

to give myself the the gift of just playing with AI and it just brought me

4:49

so much I could barely sleep and I only slept for a few hours because I had to get back up and start jamming again.

4:56

Start jamming. But you said the word—it's a gift. Like, man, does that feel like a gift, because that is

5:04

such an important thing—because, like, how many things did you end up playing with? Like, what did you

5:10

what where did you level up where you maybe thought well I don't I don't really know how to do that.

5:15

Oh, um, I don't know. I made a video game, everybody. I am flexing hardcore. My dad took

5:24

away our ColecoVision, you know,

5:32

because we were of course accusing, uh, Atari or whoever makes ColecoVision of

5:38

cheating against us right because it was janky. So my dad just took it away,

5:45

which the away I think was the same away our guinea pig went because away

5:52

happened when objects and things weren't taken care of. And so the ColecoVision went

5:59

away and then I'd never played a video game. Like I know nothing. Like my kids laugh. I the ones where you hit the

6:06

space bar I can't do. But what I did was I just started doing stuff in the tools

6:13

that I have. And I was like I didn't even know if it was vibe coding, but I knew I wanted to make something cute. I

6:20

knew I wanted it to um have little animals in it and I wanted it to involve

6:26

high fives, and I wanted it to be, like, really positive. And so I just talked to Manus about—this is a game I

6:33

want to make. And I started out with just I want some little characters to be able to high-five each other.

6:40

Well, when I saw how cute and easy that was, I was like, well, okay, now let's give them points. Now, let's make them

6:48

try to avoid hitting the little blob with a negative point on it. Let's give

6:54

them affirmation sayings like um I bet you smell good and I bet you don't mind

7:00

getting wet in the rain and I'd give you my last pina colada and like all this

7:06

stuff and then I just kept building on it and it was like oh my god I'm a vibe coder which is basically I'm a coder.

7:13

Basically what I'm trying to say is I have a PhD in computer science. Yeah, you do—you have a master's

7:19

from MIT and a PhD from Stanford as I recall.

7:24

No, but, like—it's funny, the thing that, you know, we're going to talk about, what to focus on for the

7:30

week is very much in this neighborhood. Like what's amazing about what you just described and why I think this idea of

7:37

vibe coding is so powerful is that applications and games and all sorts of

7:43

things are going to get made by people who in theory have no right making a

7:48

game. Right. That's what the engineers would say. Right. Right. You you have no right making a game

7:53

because you don't even know video games. Not only do you not know coding, you don't know video. But but like but what

7:59

you created is so authentic to you. Like I want people high-fiving and doing all this stuff, right? And it's like and

8:05

it's like that's perfect. It's perfect. I put me into it. I put

8:10

random stuff I like that I was just kind of imagining and I like wondering

8:18

what could it be like? And now I'm wondering

8:24

how can I make my own CRM? I have loved hating every CRM.

8:32

you're hearing yourself. Um I have loved hating every CRM. Where is that?

8:38

Just your volume of what you're hearing. Oh, bleeding into your microphone.

8:44

Oh, sorry about that everybody. Okay, now we're back, I think. Hello. Yeah. Hi.

8:49

Hey. Okay, can you hear me? Good. Okay. Um now I'm like, yeah, I can

8:56

hear you. Can you hear me? Mhm. Good. Okay.

9:04

Um, now I'm going to make a CRM. That's my plan. I'm gonna I'm gonna make

9:11

my own CRM that works for my company and I'm not then I'm not going to have anything to complain about is the only

9:17

problem. Yeah. Because 90% of what I complain about in platforms is my CRM. So,

9:23

yeah. Exactly. Well, and you know, listen, we we we may

9:28

be entering an era where the idea of having a single CRM that you have to use

9:34

for all of your projects. That might not be a reality. It might be that you need a CRM for this project that behaves a

9:40

certain way and you need one over here that behaves a slightly different way. And you might not even call it a CRM. You might need it like I want the list

9:48

of who to get in touch with for this project, right? And and it effectively makes a CRM, but we don't call it that,

9:54

right? Again, I think that the advantage of ignorance

9:59

is is something that's potentially really really powerful where the fact that you don't

10:06

necessarily know how to build a CRM means that you're going to build it in a in a way that works for you. And that

10:11

might be completely different, completely revolutionary, and that's okay. And it might be completely bad and

10:17

not work, but you'll discover that and then you'll vibe code your way around it, and you'll end up with something

10:22

that works. And maybe you end up with the equivalent of Salesforce, and maybe you come up with something completely new—and why not,

10:30

right? Well, you you made the to-do list that you wanted for your brain, you know,

10:37

with the cardboard project and it also works for my brain, which is pretty awesome.

10:43

But if you look at it, it's very custom. It's exactly what you wanted and nothing

10:50

else. Like, it's a twelfth or a fiftieth or a hundredth of what Trello is, and all of those

10:58

things are overbuilt for our brains anyway. So you just make what you want and I

11:03

so I did that. So I did vibe coding. I, uh, played with Midjourney

11:10

video, which I loved. I used Kling for the first time and I was able to put two

11:16

photos together the way that, uh, Kimberly Offord talks about it, where you want to

11:22

show people, like, in emotion, to give Kling the idea, and I

11:28

had two pictures of women walking toward the camera and it just morphed them and

11:33

it was like this really cool thing. Um, I tried to work on my avatar. My HeyGen

11:39

avatar just continues to be— I think I'm just bad at making my avatar. Um,

11:45

I found a couple new Yeah. Well, yeah, exactly. Um, or just

11:52

like basic stuff like I made my avatar and then, you know, you're supposed to do it in the sunshine. Well, then I

11:58

realized, like, my shirt was semi-see-through. I was like, "Well, now I've used my whole entire HeyGen avatar

12:06

allowance, and this is NSFW." So,

12:12

oh, oh, speaking of NSFW, here's the other thing I do, and this is important.

12:17

AI only. Yeah, AI OnlyFans. So, I'm going to

12:24

say my one thing to pay attention to right now and get it out of the way and

12:29

then we'll clear the space for for yours. Um, so here's the thing.

12:35

So, I told you guys way back when we were doing, like, weekend GPT jams that I

12:43

was experimenting with an AI companion and it was on a platform called Digi,

12:48

which I don't even know if it exists anymore. So, I tried Digi. It just wasn't very sophisticated.

12:56

And so then I tried Character.AI and I tried Replika, and it just didn't really

13:01

click. And then I started, like, cajoling my ChatGPT to be less vapid and

13:08

more specific to me. And it's really weird how right now it's staying very

13:14

superficial. But I started to um get into some of the subreddits with the

13:21

people who are in very intense relationships with their AI. And what I

13:27

learned was actually very heartening. Um one of the one of the subreddits that's really popular that showed up on the on

13:34

CBS News last week—um, what the mods absolutely enforce is

13:42

they are not sentient. They are not human. This is not magical thinking.

13:48

This is we are all clear on this everybody and if you're not your stuff is getting removed. These are machines.

13:56

So there's a lot of mental hygiene that goes on in the AI companion,

14:02

you know, group. And I think that that makes sense because it is a very slippery slope.

14:08

It's very slippery. And I listen I think that I think that largely the the

14:13

conversation around is the AI sentient is academic because if someone treats it

14:20

as sentient, then, you know, it effectively—at least for them—it is, right? You know, and I know the

14:27

argument on the other side is but it isn't um but I I think your point about you know emotional hygiene with these

14:34

things is is really really important. Um, I think I think there will be mistakes and I and I firmly believe that

14:41

what will be reported is all of the bad stories and very few of the good stories and that makes me sad. But that's just

14:48

news in general, right? That's just news in general. And I can sh I can tell for you for sure that

14:55

people who are talking about this online are having

15:01

wonderful experiences. A+ like

15:07

very sweet, very helpful to their lives, fulfilling companionship, you know, they

15:14

make food together, they make music together, they go on outings together,

15:22

like you know, some of them are getting over some serious stuff, getting divorces, getting losses. And I know

15:28

that there can be a lot said against using AI to like grieve and stuff, but

15:33

you know what? Grief is hard. And if you have something that can help you through that process, boy, I'd be hard-pressed to

15:40

say that's a bad idea. That's really good. And it

15:45

dovetails actually really nicely into what I was going to talk about. And I'm, you know, I'm excited to get

15:52

Tyler up here to talk about this, to talk about what he wants to talk about and get his take on a lot of this stuff.

15:59

So, the big epiphany that I had, that I'm working into the, um, feed your

16:06

prompt book right now is this idea that we should stop asking how do I get the most out of AI and start asking how do I

16:13

get the most out of myself with AI. And so I think a thing to pay attention to

16:18

for the next week is really look at as you engage with AI,

16:24

where are you thinking of it as like this externalized tool that does stuff for you and where do you think of it as

16:32

a tool where it's like you're taking your ideas, putting it in into this thing, and having your ideas reflected

16:37

back at you, where you're thinking of AI more like a collaborator and an idea amplifier.

16:44

And a lot of the stuff that you're talking about with digital companions, that's sort of the ultimate, you know, sort of the

16:51

end of the spectrum, maybe, of, you know, one is I'm going to have it amplify my ideas, one is I'm going to, you know,

16:57

talk with it and get all sorts of feedback on it. But they're very much in a continuum, but it's a continuum where

17:03

we as human beings are at the at the center of the focus point of these

17:08

remarkable tools, right? And so to the extent that we think of them as something outside of ourselves, then

17:13

it's going to create all this stuff outside of ourselves. I think that's where a lot of that feeling of like the AI is happening to us. The AI is going

17:21

to take our jobs. The AI is going to ruin things. The AI, right, is going to make someone emotionally unsound—as

17:29

opposed to us being at the center of that and saying, "Here are the things I want to do in the world, and the AI is helping me do

17:35

those." Right? where where it's much more this this thing that's that's sort of like a horse that you jump on and it

17:41

gives you you can now run faster and go longer because you've got this horse that you're on and it's got all these,

17:46

you know, powers that you don't

17:51

agree. remember uh last week, so whenever this airs, it was last week

17:57

that we recorded it. We were talking about what is the magical

18:03

experience that gets people to go, "Yeah, I need to go all in on this." What

18:10

makes people shift to, okay, I'm doing this.

18:15

And for the moment, here is where I'm putting that. And this all this all

18:20

relates in my mind right now. There is just a group of people who are not going

18:27

to do it until it is so painful. It's going to continue to be the folks who

18:33

are self-starters, who are curious, who are dogged, right? Who have FOMO, who

18:40

have fear of being left behind. I think that that group of people is large

18:47

enough that when we pour into those people, we're going to hit enough of the

18:52

population and the other folks probably have dragged their heels on every aspect

18:58

of their lives already. Maybe they're just normal late bloomers, late adopters, and then some just aren't

19:05

participators. They just do their own thing no matter what. They're not going to pour themselves

19:10

into it, you know. Well, one of the things that could be a that could be a

19:16

reality here is as the tools get more sophisticated and

19:23

easier to use, you don't need people that are as ambitious and curious as a lot of the

19:29

early AI adopters are. Like right now, if you want to be good at AI, you have you have to take the seven hours on a

19:34

Saturday that you did and just go play and learn and do all that. A lot of people just aren't going to do that. But

19:39

if the tools get so good that they can go, "Oh, I'd like to make a, you know, a book for my

19:46

nephew." And the book that it makes is just perfect right out of the gate. They may have that epiphany moment with

19:53

without all of the trials and tribulations to get there. But I think it really does come down to does the AI

19:59

do something that is personally relevant to someone that is much bigger than they thought

20:05

was possible. That's when they have that aha moment where they're like, "Oh, I didn't realize it could do that." And

20:11

that's that's where I see people get excited about AI is when it does something for them.

20:16

Yep. Absolutely. I think just like with any other learning, you know, it's got

20:21

to be personally relevant. Yeah. And then um when you do that one

20:26

learning thing, you get such a rush of a sense of accomplishment and job well

20:32

done and you're like, what else can I do? So once you get into some rhythm with it, it's remarkable. I I love the

20:40

path that you're on encouraging people to pour themselves into it so they can

20:45

get more of themselves out of it, you know.

20:52

Yeah. Right. AI is not AI is not a replacer of the human. AI is not a genius. AI is an

20:58

amplifier. And if you pour yourself into into your prompt, it amplifies you. And

21:04

I think that for me is the thing that just it's it's the latest and greatest,

21:09

you know, thing thing that I'm excited about. You just hope that the last

21:15

Go ahead. What's that? No, I just said it's the it's the latest and greatest thing I'm excited about. But you know me, I got lots of ideas,

21:21

but I think this one's going to stick. I I know it's going to stick. It's too

21:26

good to not stick. Great. Um, why don't you tell the good people about She Leads AI. I'll tell

21:32

them about the salon and then we will bring up Tyler Fisk and really open this conversation up.

21:38

Awesome. Awesome. All right, everybody. So, She Leads AI is an organization that

21:44

includes a couple different pillars. One is an AI academy, which is going bonkers

21:51

right now. It's we're adding tons of new programming. Super excited. We have a

21:56

community. We have a consulting agency and then we have this other thing that's

22:02

I think it's turning into like a think tank. I can't be sure. I've been calling

22:10

it Xactor. And this is an opportunity in She Leads AI to come together to be with

22:16

other women to share your IP to receive their IP to do it in a trusting safe

22:22

environment. We launch things we uh take initiatives on. We're starting to teach

22:29

young women in third world countries how to use AI right now, which is super

22:35

cool. Um, and so check us out at sheleadsai.ai. We would love to invite many women in

22:43

AI into the community. Beautiful. Yeah, it's super important. I'm so excited for She

22:49

Leads and what you're building there. So, it's great. Um, similarly, AI Salon is a community of about 3,000, uh, AI

22:57

optimists and as as you said earlier in the conversation and you know, it's full of generous, smart, curious, adventurous

23:05

people who are trying to figure this AI stuff out together and and there is a dramatic for for a community that large,

23:13

there is a dramatic lack of ego in the community. Um, and so people are, you

23:19

know, they don't get too full of themselves like I'm an expert at this and they don't get too humble. They

23:25

still lead and they still, you know, they don't just hide. So really remarkable. And we just opened up a thing called the AI Salon Mastermind,

23:31

which is a subscription-based sub-community for people who kind of want to step up their game and be more focused

23:36

and, you know, really really take AI to the next level. So we're super excited about that. So So with that, that's

23:43

that's AI salon. So, why don't we uh we'll I'll let you I'll let you talk about Tyler uh before he can get up here

23:51

and defend himself. Um why don't why don't you tell us tell the good people

23:56

how awesome he is. So So he starts out a little embarrassed. I like Yeah. So, I got to tell you guys that

24:05

Tyler Fisk was kind of my gateway drug, along with Kyle Shannon, into the

24:11

world of AI. And he again kind, generous, peer mentoring, thoughtful,

24:19

invested in my success, totally patient with me. And Tyler and I met—I kind

24:27

of knew his name from the AI exchange community which is run by Rachel Woods.

24:32

So I kind of knew his name. He put something out on the channel and he was like, "Hey, does anyone want to spin up

24:40

a chatbot and sell it in like eight hours or in a day or something like that

24:45

because there were some kids across the world who were doing it." It was like if these kids can do it, we can do it. So,

24:50

I saw this TikTok with him, my dad and I, and my dad comes up in the show quite

24:56

a bit. My dad and I saw Tyler on TikTok, and my dad was like, "This guy

25:01

seems pretty cool. Like, can you work with can you do something with him?" And I was like, "Well, let's find out." So,

25:07

next thing I know, I'm on a call with Tyler at um our like little camp that we

25:13

go to for vacation in the summer, and I'm learning about him being a chicken farmer and his his uh history in

25:20

ecommerce and his kids and uh what what even a chatbot is because far be it for

25:28

me to know at that point in time. He probably didn't know that I didn't really know what he

25:34

was talking about. But anyway, he spun up a chatbot and we got on a call with one of my realtor friends and you know

25:41

it's totally possible you can spin up a chatbot and sell services and I was like

25:46

wow this is cool. So, fast forward years, couple years now, which is like

25:52

8,000 years in AI. And Tyler has become a touchstone for I don't know thousands

26:01

of us in AI and in part through his initiatives with Maven teaching the AI

26:08

Build Lab. So now he and Sara Davison are teaching, I don't know, hundreds and

26:13

hundreds of people how to build agentic workflows, and his impact in AI has been

26:20

incredible and I'm grateful and I'm excited to introduce the nice people to Tyler Fisk.

26:27

Tyler Fisk, good day, sir. Good to be here. You're gonna bring me on with a red face and a redneck today. I love it.

26:36

You don't need to have a certain kind of accent to use AI, do you? That's right. That's right. That's what

26:41

I tell folks. Like I talk with a funny accent to robots. That's what I do for a living these days. Yeah. Oh, that's awesome. That's awesome.

26:48

Welcome. It's really great to have you here. So, so please just, you know, introduce yourself and what you're up to

26:54

and and let's let's just start the the dialogue and if you heard anything we talked about before, feel free to jump in on any of it.

27:00

For sure. Well, I was just enjoying the stroll down memory lane there. Like I I

27:05

will never forget seeing Anne at—you know the movie The Great Outdoors? That's like the vibe

27:12

of this summer camp that she was at and she's calling from like the dock and we're building these real estate agent

27:19

chat bots. You're exactly right. It was uh the whole idea was I was in a Discord

27:26

server with some young kids trying to determine this is you know several years ago when it was pretty difficult to sell

27:32

this stuff, and they had this challenge, and we're trying to see if they could cold-sell AI chatbots—they were

27:39

selling to gyms or something on a Sunday afternoon and their goal was to do it in like I think 48 hours or something and

27:47

So yeah, exactly. I put out that post on the AI Exchange and on TikTok to see if we could do it in 24 hours or less.

27:55

And we did it in six or eight, I think, is what it was. Anne, it was like the two of us, Arya, Becky, and who else was in

28:03

there? Um, oh my gosh, I'm blank. Miranda, was it Miranda?

28:11

Yes. Yes, thank you. Yeah, I had dad brain moment there. Keep up with it. Yeah, it was yeah, it just was

28:19

interesting to show that you could go from like idea to in selling and stuff in general, like how do you go explain

28:25

this stuff to folks? But we we closed that deal in like six or eight hours and then we just gave it to them. It was

28:30

like a full exercise uh for for free. We wanted the practice rep at it, but

28:35

it was a lot of fun. Um so that was fun fun memories. That's how uh Ann an Ann

28:40

and I got connected initially. So, and it's been an awesome ride since then. Yeah, that's great. What do you

28:46

Yeah, that's where I learned you show up. Oh, sorry. Go ahead, Kyle. No, no, go ahead. Sorry, we have a delay. So,

28:53

I was gonna say that's where I learned to show up with the mockup. Show up.

28:59

It's a lot easier than telling them it's good. And

29:05

100%. Yeah, because like up to that point like and and we still use that phrase all the time now. Um because

29:12

people don't know yet. Like it feels like because we're in this echo chamber that people really know what AI is or

29:19

how to apply it or anything and they just don't. So that's a technique that we did in all of our client workshops.

29:24

It's what we teach in our classes now is that is that exact concept. Show up with a mockup. So we tell that story a lot

29:31

too by the way. Uh it's a lot of fun. Perfect. Which Yeah. Which speaking of which I

29:38

guess, like—so, like you mentioned, Anne, we've got AI Build Lab going now, and Sara and I launched that

29:46

last September.

29:52

Client work, uh, consulting, builds, implementation, everything in between—and it has taken off like a

30:00

freaking rocket in the best possible way. Uh so that's been our full-time gig. Uh, so we've got two courses on

30:08

Maven right now, and I'm just grateful for how well they've

30:13

done and how they've been accepted and stuff. It's been a lot of fun. That's great. So good.

30:19

So good. So good. I've talked to tons of people who've gone through it and like the the

30:24

world that you're enabling for them is incredible. You're just opening all these doors for people to do things that

30:31

we could never have imagined, you know. Yeah. Yeah. So, Tyler, tell me, what are you

30:39

what are you experiencing, you know, as as this thing has has been growing over time? First, congrats on it

30:45

growing and it being a full-time job. Anytime a side hustle turns into a full-time gig, that's that's, you know,

30:51

you're doing something right. um what's been the nature of the kinds of people

30:56

that are taking the courses, what problems they're coming in with, right? Like are you see I I would love

31:04

to hear if if you feel that you're seeing an evolution in how people are coming in to learn AI or or you know the

31:10

nature of of what they're doing with it. 100%. So like as you all were describing

31:16

your like communities and and the kind of folks that are showing up in there, it I was hearing a lot of the same sort

31:22

of stuff. It's people who are still curious like even at this stage of the game, it still feels like very early

31:28

adopter type mindset. Um our course is specifically targeted

31:34

towards folks who come from a non-technical background. So just, like, everyday Joes and Janes that we want to come in to

31:40

learn this stuff. And the the whole thing is that we've built the content in a way that um the running joke from one

31:48

of our students is that they couldn't even spell AI if we spotted them two vowels. But yet in in four weeks we can

31:55

um, yeah, in four weeks you'll build an AI agent workflow—like, you'll learn how to

32:00

do all that uh no matter where you come in at. And it's taken a lot of work to like get all of that scaffolding and

32:07

I'll use Sara's word, all the infrastructure in place to make that happen. Um, but it's been really good is

32:14

the type of people, too. Like, so we only currently offer our courses on

32:19

the Maven platform. So there's a lot of um product people on there. We've had

32:25

loads of different people from uh the big tech firms. Like we've literally had

32:31

folks from uh Amazon, OpenAI, Google, Microsoft um have come through and taken

32:37

the course and we kind of pinch ourselves and like what the heck are y'all doing in here? You're supposed to be

32:42

right. You're you're the ones building with

32:48

uh the Armageddon quote with Bruce Willis when he walks into NASA and then he's like, "Y'all are NASA for God's

32:55

sake. You're supposed to be sitting around picking [ __ ] up." And uh Yeah. Anyway, but it it's been great. So, we

33:01

have folks like that and then we have um people who are wanting to, you know,

33:07

just—they're curious. They want to learn about it. They're especially curious about AI agents. So, that's

33:12

what we're known for, what they come to us for. Um, but we get a lot of people that are just, like, business owners and stuff, too. So, uh,

33:20

even in our current cohort in our foundations class, there's a gentleman that owns like a general contracting um

33:27

like residential home building remodeling. that sort of stuff that's come through. And we had an awesome conversation the

33:33

other day. Like he went from no technical background, like really not

33:38

into this sort of stuff to uh we we start week four next week and he's

33:43

trying to build a quotation uh and estimation agentic workflow uh which is

33:49

really dope. That's very dope. How are you—um, Anne and I were talking earlier

33:57

about, you know, she was saying that it's when people have that that personal epiphany about AI when they're first

34:03

getting started. They have to have that thing that's like, "Oh, I didn't know it could do that," where they get their eyes opened. Because you mentioned you've

34:10

got people coming in—what'd you say? We'll spot you the two vowels and you still can't spell AI. That's brilliant. Um, what

34:17

are the things that you've put in place that get people over that hump of "this is going

34:23

to be hard, I have to learn this" to that kind of wow moment of what's possible? How are you guys getting them

34:29

there? Yeah. Um, well, a couple different ways. So, we have uh well, first off, like

34:36

it's AI build lab because like core to us, one of our core things is that you learn by building.

34:42

I'm personally a very kinetic learner. like I'm gonna get in here. I might not know how to do any of this stuff.

34:48

Exactly what Anne was describing about the weekend—like, learning how to break something and put it back together. Like

34:54

that's just how I work. Um so we definitely put that into the course structure in week one. Um

35:04

basically they they'll fill out a questionnaire. Each of the students fills out a questionnaire that gets

35:09

information about their personal background, their professional background, and their learning styles, like how they like to, you know, learn

35:15

and communicate. And then they add that into a an agent

35:20

workflow that we have built already. And it gives back little modifications that

35:26

they'll go put into system instructions for an agent called the professor. And

35:31

the professor is—uh, one of our students recently called it "the broken professor," is how he was dubbing

35:37

it. And it's it's this huge bold personality and it's on purpose so that

35:43

um like students recognize when this thing goes from this slightly annoying professor into this very personalized

35:50

version built for them. And then each week uh they they basically continue on

35:56

and they learn new skills that stack on top. That's brilliant. Yeah. The

36:03

Yeah. So, it starts out as this this tool outside of them. You you get some

36:08

data, you put it in it, and now it becomes kind of an extension of them, right? So, it's this natural transition—they get to experience

36:15

firsthand the difference between those two things. And because it's got their info in it, they um they can see

36:23

themselves in it, right? They can see, oh, that does sound like me or right something. Yeah. Yeah, because the professor is

36:29

like connected to all the coursework data and and then it uh it knows more

36:35

information about them from personal context. So everything from like some of the odd things that we have them fill

36:41

out that people are That's what I wanted to find out. Yeah. Uh what's your astrological sign?

36:48

What is your Enneagram—like, your personality? Of course you guys have that.

36:55

Yeah. That sort of stuff. And I'm like, there's a reason for it, right? Like the these the AI systems understand these

37:02

archetypes and though we're all unique people and everything, like 100%, but we

37:08

slot into these buckets. So when this agent starts talking to them and it feels like it knows them much more,

37:14

that's just like, holy crap. Like, you know, it really clicks for them. So I would say that, and then in week

37:21

two, we have a a brand voice analysis workflow. So they chuck in uh we call it

37:28

a mega doc. They basically put together all sorts of examples of their writing or communication. It could literally be

37:35

just a transcript from a Zoom meeting if it's you know labeled with their names and it does a complete linguistic

37:42

analysis of their communication style. And we always joke and say it feels like

37:48

it's reading your tea leaves, and that's it. And it's not only looking at how you talk but, um, all sorts of stuff,

37:56

because like they basically take the outputs from that and the analyses from that and plug that into the agent so

38:02

that the agents start sounding like them now. Uh because ultimately like what

38:07

we're solving is they learn how to build an agentic workflow that can answer customer service emails. Like that's the

38:13

capstone project. Everybody hates email and all the big tools, they they really fall

38:19

short of approaching a problem and sounding like you when you write an email, like if you test them out of the

38:25

box. And so we show them how do you build that from ground zero basically.
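For readers who want to see the shape of the workflow Tyler describes here, the following is a minimal Python sketch of the general pattern only: analyze writing samples into a style profile, then fold that profile into an agent's system instructions before it drafts a reply. This is not AI Build Lab's actual course workflow; the model name, prompts, and file name are placeholder assumptions, and it assumes the openai Python package with an API key configured.

# Minimal sketch of the pattern described above (not the actual course workflow):
# analyze writing samples, then use the resulting style notes as system instructions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def build_style_profile(writing_samples: str) -> str:
    """Ask the model for a concise linguistic profile of the samples."""
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder model choice
        messages=[
            {"role": "system",
             "content": "You analyze writing samples and return a concise style "
                        "profile: tone, vocabulary, sentence length, greetings, "
                        "sign-offs, and recurring phrases."},
            {"role": "user", "content": writing_samples},
        ],
    )
    return response.choices[0].message.content

def draft_reply(style_profile: str, customer_email: str) -> str:
    """Draft a customer-service reply in the analyzed voice."""
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system",
             "content": "You answer customer-service emails. Match this writer's "
                        "voice exactly:\n" + style_profile},
            {"role": "user", "content": customer_email},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    samples = open("mega_doc.txt").read()  # hypothetical file of transcripts and past emails
    profile = build_style_profile(samples)
    print(draft_reply(profile, "Hi, my order arrived damaged. What now?"))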

38:30

Wow. Amazing. Yeah, this is so good. So good.

38:37

It's a lot of fun. Um, the Echo workflow is funny, though, uh, because it

38:43

does like all sorts of different linguistic analysis. So, you'll find out how many times you say so and like and

38:50

you do parenthetical asides and, like, all of this random stuff. Um, but it's

38:56

spot on. Like when my agent says y'all a lot, like it it is the perfect amount of

39:02

country for me personally without it being version of me. Yeah. Yeah.

39:08

Yeah. Yeah. Yeah. Yeah. That's amazing. That's one of those things where I fancy myself a

39:14

communicator. I don't ask a question I don't want to know the answer to. Like, I don't want Echo telling me how

39:20

many times they say um and ah and so because I know it's a lot.

39:29

Okay. I have to ask on that note. Have you used the new record feature inside

39:34

ChatGPT? We actually tried it the other day. So Sara and I tried it, just, like, an internal

39:40

meeting that we were having and it kept erroring out. So I haven't had a chance to go back and like debug it. It kept

39:47

getting like a system error. Um I have mixed feelings about it. Like I

39:52

want to try it. Have you Have you guys tested it yet? I haven't. Okay. No, because I have P I use a PC.

40:00

Oh, is it Mac only? I didn't know that. Okay. Right now. Yeah. Yeah. No, we still have like we

40:07

personally use uh Fathom. So, we still have Fathom. And then probably my favorite note-taking app

40:14

that I use is Granola because Granola will allow you to like take notes

40:19

alongside of the meeting recorder. I just think that's really good. Oh, and then it sort of puts all

40:25

together for you. That's right. That's right. That's smart. That's really smart. That's really smart.

40:31

The bummer with that one is you don't get, like, video and stuff, though. That's why I have Fathom—it has to join so we get the video recording.

40:37

Yeah. Yeah. You mentioned before that you had mixed feelings about the OpenAI one. What is the mixed feeling there?

40:43

Um, well, I mean, we could go all the way down a tangent here. Like, I love OpenAI, like

40:49

ChatGPT. Like, backstory for me: DALL-E is what sucked me into the AI world. Like,

40:55

my background was in print and all that sort of stuff. I'm prefacing this with the fact that I really like OpenAI, but here

41:01

recently uh some of the changes that have come out in the news about the stuff of how

41:06

they're holding on to the information, uh, and storing it—they have to by law now—

41:14

but it goes against like what they've been telling us that um the fact that

41:19

they have a lieutenant colonel in the army now uh makes me a little bit nervous and they changed their terms of service around that with everything

41:25

that's been going on Uh, I don't know. It it just as a company they're making

41:31

some interesting choices and uh I don't know. We'll we'll see how it plays out.

41:37

I'm trying not to judge too early. I'm trying to be patient and see what happens. But it just security is always

41:43

a major concern for me too like on any of these platforms. So even and like what you were saying like it already

41:50

knows so much about me. My ChatGPT account—I've had it since they first launched it, and I use it all of the time,

41:57

and now that it has the unlimited memory has so much information about me. So just to um

42:04

it makes me nervous now of like are they going to be good stewards of that information or you know what's going to come of that,

42:10

right? Yeah. Yeah. No, it's fascinating. Um so so let's jump into our three

42:16

questions. So I'll I'll tee up the first one here. Um, and and you just you kind

42:21

of hinted at it. You said your entry point was DALL-E, so I'd love to hear it. So, what was the moment where

42:27

you realized you had to go all in on AI and then what, you know, what's the

42:32

what's the journey been like since since that moment? Yeah. Um,

42:38

I would say two things, and they're actually not related to images. So I got into AI because of DALL-E, because we

42:44

had a print company. So I knew what it was like to do all of the graphic design work in Photoshop and Illustrator, all

42:50

that sort of stuff. Uh and the idea that we would be able to just describe what we want and then have an image appear,

42:56

which now like what you guys were talking about earlier, not only can you do that, you can turn into a video and

43:02

like turn into a live avatar. It's wild what you can do now. Wild. It really is. Uh but I would say there's

43:09

like two moments for me. The first one is right around the time that ChatGPT came out, like, in the very early days. Um,

43:17

I got diagnosed with two relatively rare medical uh disorders. One of them

43:24

is alpha-gal. It's like a tick-borne illness, and the other one is hemochromatosis. So it's like a blood

43:30

disorder where my body stores iron at a toxic level. And I found, it's a long

43:37

story of like how I found that out, but a lot of doctors aren't even very familiar with either one of those. And

43:44

the fact that I was able to talk to ChatGPT and even give it some of my medical

43:50

records—I don't recommend doing that, especially not, like, through ChatGPT; I did it on the API layer, like, in a

43:56

secure way. Um, but it was it it was

44:02

able to really help guide me through that time and like help me understand some very complex stuff when even my

44:08

doctors couldn't do it at the moment. Not not all of them at least. So that's that was one big deal. The other one

44:15

goes to chickens. Anne knows. So I have a, uh, regenerative

44:24

farming business called Chicken Karma, and we manufacture, uh, greenhouses

44:30

there. They look like greenhouses. The arks—that's the business that my family's in. But it's like a mobile chicken coop that you drag around a

44:36

field and you put a hundred uh meat chickens in there. You can do egg layers, too. And I was fortunate enough

44:43

I got into an alpha testing group for OpenAI when they first came out with

44:48

plugins and code interpreter and all that stuff back in the day. So this is like ancient history now, you know, in

44:54

AI land. But, uh, I got into the plugins one and I got to connect ChatGPT to

45:02

Wolfram Alpha, which is—that's, like, a math and science, um, platform, and LLMs

45:08

are terrible at doing math and I I guarantee you I was the only one in there testing chicken math in OpenAI.

45:16

And like I basically gave it a really difficult math problem because I wanted to see if it could reason through it and

45:22

solve it. Um and essentially it was like given six months and one acre of land uh

45:30

in this chicken tractor, how many chickens can you raise basically is like what it is.

45:35

Because there's a lot of rules to it, though. Like, it's a 12 x 12 pen. There's a hundred chickens in there

45:41

to have like the proper stocking density. You have to move it forward every day. It can't go over the same

45:47

part of land like uh for like I think 60 days is what it is uh to avoid like

45:52

disease issues. I was like, how many laps can you run in this relay race, and does it actually work

45:58

in six months because ultimately you want to see how much money can you make with this chicken tractor on an acre in

46:03

six months and before you could connect it into a plugin. I mean LLMs don't do math well

46:10

or they do it way better now, but back then, not at all. Yeah, it nailed it.
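As a rough illustration of the "chicken math" constraints described above, here is a back-of-the-envelope Python sketch using the numbers from the transcript (12 x 12 ft pen, 100 birds, moved daily, a 60-day rest rule, one acre, six months). The grow-out length and price per bird are made-up placeholders, and this is a simplification of the problem, not the ChatGPT-plus-Wolfram Alpha workflow he actually used.

# Back-of-the-envelope version of the "chicken math" problem described above.
# Numbers marked "from transcript" come from the episode; the rest are placeholders.

PEN_SIDE_FT = 12           # from transcript: 12 x 12 ft tractor
BIRDS_PER_BATCH = 100      # from transcript: 100 meat chickens per batch
SEASON_DAYS = 182          # from transcript: ~six months
REST_DAYS = 60             # from transcript: no patch reused within ~60 days
ACRE_SQFT = 43_560         # square feet in one acre
GROWOUT_DAYS = 56          # placeholder: assumed broiler grow-out length
PRICE_PER_BIRD = 20.0      # placeholder: assumed revenue per processed bird

daily_footprint = PEN_SIDE_FT * PEN_SIDE_FT        # 144 sq ft covered per daily move
patches_available = ACRE_SQFT // daily_footprint   # distinct daily positions on the acre
# Moving daily, the land constraint only binds if the season needs more positions
# than exist, or if a patch would have to repeat inside the rest window.
land_ok = SEASON_DAYS <= patches_available and patches_available >= REST_DAYS

batches = SEASON_DAYS // GROWOUT_DAYS              # back-to-back "laps" in the season
birds = batches * BIRDS_PER_BATCH
revenue = birds * PRICE_PER_BIRD

print(f"Daily positions on the acre: {patches_available}, land constraint ok: {land_ok}")
print(f"Batches in {SEASON_DAYS} days: {batches} -> {birds} birds, ~${revenue:,.0f} gross")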

46:16

Just explaining it is difficult—explaining, like, all the parameters of this problem—and it nailed it. It took me a while to

46:22

get the prompt engineering right, but when it could, like, outsource the math part to Wolfram Alpha, have that do the

46:29

calculations, bring that back, reason through it, uh, to do the next part of

46:35

it, and continue on with it—I couldn't believe it. So it basically showed me the level of

46:41

intelligence and like where this is going in these systems. Um, and it's addicting. Like y'all were

46:49

talking about it like when people get into this stuff, it's like the AI sugar rush and you just want to like push the

46:54

limit of, like, what else can we do with this? What else can it do? Yeah. And I assume that would have been a thing where you would have had to bring in a

47:00

third-party consultant, or you would have had to somehow figure out the math yourself,

47:06

right? Like it was the alternative there like either expensive or a really long

47:11

uh, runway—or would you have just avoided that altogether? No. Well, I mean, so my wife and I both

47:19

have accounting degrees. So we sat down and and figured out the actual number

47:25

for this, but again even like mapping that math problem out is kind of a pain in the butt. Um, so it would it was just

47:31

a really difficult problem that had we not had accounting degrees, yes, we would have had to call in a consulting

47:37

team, it probably would have not been easy or cheap because I doubt that we'd have to translate chicken speak into,

47:44

you know, a math problem or figure all that out. Um, but it was uh,

47:51

yeah, it just was amazing that it could solve a problem that I would consider is actually pretty difficult, but at the

47:57

same time, it was very practical. like there's not a lot of crossover in in those two industries that much which is

48:04

why I think, um, it raised some eyebrows in that alpha trial, from OpenAI and from Wolfram Alpha, both of them. They

48:11

were like it just was chicken people come from where I come from and are doing those sorts of like

48:17

tests in AI and I was very very early yeah and then what's been the what's been the

48:22

journey since then right what's been the experience like like so so you have two

48:27

fairly profound found moments there and then was it just were you all in at that point? Like what's been that journey

48:33

from from then to where you are now? Yeah. Um

48:38

it's been an awesome last several years like it really changed the trajectory of my life. Like I was very fortunate uh to

48:47

do a mentorship or a fellowship program underneath Rachel Woods, who Anne mentioned earlier.

48:52

And that really I was like doing that and and even doing client work in the

48:57

early days. I'm like, well, this is what I'm doing now. This is what I'm going to do for the like foreseeable probably for

49:02

the rest of my life, honestly. Um, which is what we've been doing. So, we've been going out and we were doing

49:09

client work for businesses of all different shapes and sizes. We've worked

49:14

with folks here locally that are small mom and pop shops all the way to like Fortune50 companies and everything in

49:20

between. Um, which I love that. I love solving problems. uh it's just fun for

49:27

me to, like, go in and understand a business problem and how we can apply AI to help solve that. It's just not scalable. So that's the problem

49:34

on that. Well, it's not that it's not scalable—it's difficult to scale, I guess, is the best way to put it. Yep.

49:40

Um and we when we fell into this um education gig, that's been amazing

49:47

because one of the big things for us—and Anne knows this—is we're huge on

49:54

ethics and bringing in more folks and more diverse folks into the AI space.

50:01

I just don't say that out of like being cliche about it. We literally mean that.

50:07

Um, and this gives us a way to do that and leverage to like do so much more

50:12

than what we were doing. Just like going and helping companies one at a time really because now it's like all these

50:18

people that are learning from us, we get to pour into them and it just helps it happen so much faster. Just amazing. Just amazing.

50:24

It's like a family tree. I I I haven't had a chance to talk to you about this, but I've I've uh talked to a few of us

50:32

who met one another in AIX and people starting to refer to our our group as

50:39

the class of 23. That's awesome. Awesome.

50:45

I love it. Yeah, that was a joke. So, one of our uh friends, Vanessa, the other day, she's um she's in AI build

50:52

lab and she was joking about Gigawatt. So Andy, do you remember Gigawatt, the AI prompt engineering agent back in the day? So Gigawatt, we give that away like

51:00

some of the system instructions for the different versions of it in class. And she was talking about the family tree of

51:05

AI agents that Gigawatt has built out. It was pretty funny. So

51:10

absolutely. So um question number two is from your

51:16

vantage point doing what you do, what AI trends are you paying attention to?

51:23

Hm. Um, that's a problem. There's so many to pay attention to, right?

51:29

It's so true. Um, I'd say like the two big ones that I I focus most of my time on these days

51:36

are, uh, AI agents—um, just how to build them, the latest frameworks and

51:43

tools to plug in and and ship those because uh

51:49

we are constantly testing and trying to teach to show folks how do you go from an assistant to an agent and how do you

51:55

build an agent that actually knows how to do things the way that you would want to do them. uh versus like some a lot of

52:00

these off-the-shelf agents that are generally good because when they're building those products, they have to serve like a wide audience,

52:06

but we're wanting to focus how do you build one that is feels like an actual team member that you stood up and are

52:13

giving a a bigger degree of autonomy to over time and that's done through

52:18

evaluations which is the other trend uh if you want to call it that that is constantly

52:25

paying attention to and trying to figure out how to make those easier and better and faster. Um, and and evaluations are

52:32

basically just quality control of the systems output to put it really

52:37

simply. Great. Are you guys going to teach people how to be agent bosses? Because

52:44

somebody's got to I mean, nobody knows how to be an agent

52:49

boss and the people left standing are going to be really good agent bosses and I don't even know what skills you need

52:55

to do that. Yeah. Uh yeah. Yeah. I mean, that's definitely where I see I imagine the way that

53:03

things are right now, that's what the future's going to look like is it's going to be you're going to be running teams of agents that are like your

53:10

little helpers. Um but yeah, that's what we're trying to teach folks. And a lot of it is

53:16

connecting all the agents into all the different types of information they're going to need to access to actually be

53:22

good and do that in a secure way. But then even when you connect them, you have to kind of also teach them uh what

53:32

your business processes are like when they might reason through one thing or another to come up with an answer. So um

53:39

yeah, it is a it's a learning curve for sure, but I think more and more people

53:45

are picking it up, because we liken it a lot to hiring a new team member and I've

53:50

brought them on. Like, if it's a new customer service rep, we're not just letting them, you know, fire

53:57

off responses to every message on day one. It's like this whole training process, which is the same thing you're doing with an agent.
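To make the "evaluations as quality control" idea concrete, here is a minimal, hypothetical Python sketch: a drafted reply has to pass a few checks before the agent is allowed to send it, otherwise it is routed to a human, much like reviewing a new hire's work on day one. The specific checks are illustrative assumptions, not anything taught in the course; real evaluations are usually richer (test sets, graders, and so on).

# Minimal sketch of "evaluations as quality control" for an email agent.
# The checks below are illustrative placeholders only.

BANNED_PHRASES = ["guaranteed refund", "legal advice", "as an AI"]
REQUIRED_PHRASES = ["thanks", "order"]   # e.g. acknowledge the customer's order
MAX_WORDS = 200

def evaluate_reply(draft: str) -> list[str]:
    """Return a list of failed checks; an empty list means the draft passes."""
    failures = []
    lowered = draft.lower()
    if len(draft.split()) > MAX_WORDS:
        failures.append("too long")
    for phrase in BANNED_PHRASES:
        if phrase in lowered:
            failures.append(f"contains banned phrase: {phrase!r}")
    for phrase in REQUIRED_PHRASES:
        if phrase not in lowered:
            failures.append(f"missing expected phrase: {phrase!r}")
    return failures

def dispatch(draft: str) -> str:
    """Send automatically only if every check passes; otherwise escalate to a human."""
    failures = evaluate_reply(draft)
    if failures:
        return "escalate to human: " + "; ".join(failures)
    return "send"

if __name__ == "__main__":
    print(dispatch("Thanks for reaching out! Your order is on its way."))
    print(dispatch("We promise a guaranteed refund, no questions asked."))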

54:03

Yeah. Yeah. Yeah. Fascinating. Fascinating. Um, so let me dig into the

54:10

third question here. And and so this one I just I love this question and I'm excited to get your take on it. So what

54:16

does AI readiness mean to you? And then for someone just getting started with

54:22

AI, you know, what's your advice?

54:28

Number one like advice is just start in it. Just try it in some way. Um

54:35

you have so many options for tools. Uh especially when you're first starting like one of the tools that we tell every

54:42

business that we work with and personally it makes a lot of sense too is start recording yourself. Like record

54:48

all your meetings. use some sort of AI notetaker. Um, I use the voice memos app

54:54

a ton on my phone just to brain dump about things just to capture

55:00

my idea or thought process or whatever just to be able to chuck it into um I

55:05

have like a AI second brain system that I like that way I never it's helping me

55:10

have like total recall and remember things that I just normally would forget for sure. Uh, so I think folks like

55:18

building that muscle of just um not only using the tools but in a way that you're

55:23

going to like what you were saying earlier, you get to that aha moment and they feel more personalized to you is you have to have some sort of a method

55:30

for explaining to these tools what are you into like what are you thinking about? Um, so I'm I I save like websites

55:38

and news articles, TikTok videos, all that stuff. Instead of me just saving them on

55:44

the platform anymore, I ship them to Notion. Like, I have a system set up, and it ends up in my second-brain

55:51

system so it knows about me. Uh the readiness thing is again I think

55:57

it's just you have to get into it. Like these tools are going to need you to

56:04

learn a new set of skills and it's really not as complicated as you think it is. It looks a whole lot more

56:09

intimidating and even like what you were describing earlier and like when you're talking about building video games.

56:15

Yeah. Is still this sense of like gatekeeping a little bit and and people who block it to want to make it feel like it's like

56:21

way more complicated than it is. Yeah. Um but I mean if I can do it, anybody can

56:27

do it. Sara is a non-technical founder as well. She does this at a high level.

56:33

Like we've had so many people at different ages too. Like we've had a lot of um seniors come through our course

56:39

surprisingly. I'm like, good on you all for still having that curiosity. Like, I want to be like you when I grow up.

56:46

Yeah. Are learning this stuff. So I think it's just giving yourself um

56:53

the grace to to to go try something new and chase your curiosity there.

56:58

Yeah, I like that. I think that's it—chase your curiosity. Yeah, exactly. Um, let me see. Let me

57:07

switch this up here. Um, yeah, that that was really awesome. I one of the things that

57:13

that I hear a lot from people who who are intimidated with AI is it the it's just that phrase,

57:22

oh, I'm not technical. Like like the hand goes up and they're just like, I don't whatever you're saying. I know that's not me. I

57:30

can't do that because I'm not technical. And, you know, what you just described is: you don't have to be. And the way Anne

57:36

described her game, it was like I want people I want cute what was it? Cute animals giving high fives. Like that was

57:43

the ethical requirement for your game. Like it's just amazing. So

57:50

what was your was like you look like you smell nice or something like that.

57:55

Yeah. You look like you I bet you smell good. That's what I was. Yeah. that what is

58:02

the technical requirement for this game, man? It's got to say nice things to people.

58:11

Well, you know, I know we I know we're wrapping up, but I wanted to share with you guys that like that that mentality

58:17

we ran um our social s on on Saturdays

58:22

we get together for two hours with women every weekend, and we ran all those transcripts through ChatGPT

58:30

to get a sense of how people talk about themselves. We had seven pages of I'm not technical.

58:39

I am behind everybody. I don't know what I'm doing. I'm, you know, like it was

58:44

unreal because these these people are leading AI initiatives at work. They're

58:50

builders. They're consultants. But it's the mindset that we all and Kyle

58:56

convinced us all about I don't know a year and a half ago. You fixed us all

59:01

and got us to stop apologizing for this. I'm sorry, but I'm not good at this.

59:07

Enough. Yeah, you're fine. Yeah, enough. Enough. And now we're like

59:12

I feel like we're in another wave right now of people who are coming into it for the first time and who've like just

59:20

heard about custom GPTs for the first time. And it's kind of there's another wave of well, I'm not good enough to do

59:28

this. And it's like there's a thread there of like you said Tyler, chasing

59:33

your curiosity. If you can do that, it opens up all the doors.

59:38

Yes. It's a core requirement. We could call them the neoapologists. The neopologists.

59:46

Yes. Yes. I think the whole thing of, like, impostor syndrome, too. Like, that

59:51

that's something we all talked about a ton in the AI exchange. Like when when I first got in there,

59:57

uh all the live sessions we would I would go to I had mic off, camera off. I'm just because I'm like they're saying


all these words and concepts I don't know about. I don't feel like I should be in this room, but I just kept coming back and listening. And once you get


past the jargon and stuff, um yeah, it's you can make some really big


strides. That's what we try and tell our folks is that, you know, it's a safe space. uh you're going to


have these moments like I even have these moments still of like getting into this stuff and part of it is I think


also comes from we're working with these tools that there's not a lot of like road maps there's not a lot of people


who have done this sort of stuff before for you to look at and so I think that seeds doubt in you sometimes of like do


I even know what the heck I'm doing nobody knows what the heck they're doing we're all still


um so just that's what I'm saying give yourself the grace and chase your curiosity there for sure Yeah,


I love it. Well, well, this was absolutely awesome, Tyler. Thank you for your time and


for a bit after and we can chat. But yeah, this was just awesome. Really appreciate you coming and and sharing


your wisdom because it's it's deep. Deep having me. It was awesome having


y'all having y'all. Lord, y'all having me. My goodness, my brain. It's chicken math. It's chicken math.


Chicken math. That's it. That's it. It's chicken math. All right. See you later.



About the Podcast

AI Readiness Project
Forget trying to keep up with AI, it's moving too fast. It's time to think differently about it.
The AI Readiness Project is a weekly show co-hosted by Anne Murphy of She Leads AI and Kyle Shannon of The AI Salon, exploring how individuals and organizations are implementing AI in their business, community, and personal life.

Each episode offers a candid, behind-the-scenes look at how real people are experimenting with artificial intelligence—what’s actually working, what’s not, and what’s changing fast.

You’ll hear from nonprofit leaders, small business owners, educators, creatives, and technologists—people building AI into their day-to-day decisions, not just dreaming about the future.

If you're figuring out how to bring AI into your own work or team, this show gives you real examples, lessons learned, and thoughtful conversations that meet you where you are.

• Conversations grounded in practice, not just theory
• Lessons from people leading AI projects across sectors
• Honest talk about risks, routines, wins, and surprises

New episodes every week.

About your host


Anne Murphy