The AI Readiness Project Podcast with Guest Matt Karabinos: Where Does Education Need to Go in a World with AI?
🎧 About the Episode:
Matt talks with Kyle Shannon and Gwyn Chafetz (standing in for Anne Murphy) about the real-world challenges teachers face and how AI can actually help—without replacing the human touch that makes education meaningful. From his classroom in Pennsylvania to online workshops for teachers and school leaders, Matt brings a fresh, honest perspective on how educators can use AI to support every learner and bring more creativity into their workday.
🧭 Key Takeaways:
- What students actually need in an AI-influenced future
- Why teacher trust and curiosity matter more than tech tools
- How to keep things playful and inclusive when introducing AI in schools
- Small steps educators can take to get started with AI today
👤 Our Guest:
This week, we’re joined by Matt Karabinos—better known as Mr. K—a sixth-grade math and science teacher and AI in Education consultant. With a background in developmental psychology and seven years of classroom experience, Matt’s mission is clear: make AI useful, ethical, and exciting for educators.
📅 Don’t Miss It:
Join us every Wednesday at 3pm Pacific for The AI Readiness Project, hosted by Anne Murphy of She Leads AI and Kyle Shannon of The AI Salon.
Transcript
0:04
Forget trying to keep up with AI. It's moving too fast. It's time to think differently about it. Welcome to the AI
0:11
readiness project hosted by Kyle Shannon and Anne Murphy. They're here to help you build the mindset to thrive in an
0:18
AI-driven world and prepare for what's next. [Music]
0:26
Hey, hey, hey. Anne Murphy, you've done something with your hair. A little curlier these days, isn't it?
0:34
Hey everybody, welcome to the AI readiness project. Uh Anne Murphy was on vacation and so I reached out to Gwyn. I
0:42
said, "Hey, would you be willing to hop on here?" And you said yes. Heck yeah.
0:48
Great to see you. Why don't you introduce yourself? Tell tell folks a little bit about who you are.
0:54
Great. Hi everybody. I'm Gwyn Chafetz. I am a strategic marketing consultant and
0:59
I came to AI, oh, I guess it was the end of '23,
1:05
beginning of '24. I lost my last position and was searching around, saw something
1:10
on the net and was hooked right away. It's been uh quite a journey learning
1:15
about AI. Yeah. Yeah. Yeah. Yeah. Exactly. So so so you've seen some of these so you kind
1:21
of know how it goes. So, so the first thing that we do is we we talk about
1:29
how ready for AI are you this week? So, so catch me up like what what where's
1:35
your head these days? Like what are you how how are you dealing with AI lately?
1:40
Like what's what's been going on in your world? Gosh, you know, there's so much it's
1:46
just every week it's just hitting you so hard. And I know you say that all the time, but gosh. So, one of the projects
1:52
I'm working on, um, it has to do with natural disasters, and it's something
1:57
actually one of the people from the AI salon and I and, um, another person are are working on. And we're uh, we started
2:06
thinking, oh, it's going to be an app. It's going to be real simple. Then we realized, no, it's going to be multi-
2:11
agents involved. And then we realized we need an MVP. And then we realized, yeah, we're not going to start with an app. I
2:18
saw uh Brandon in the life hacks. He was doing um Lovable. He showed us how to
2:24
use Lovable and I was like, you did that website how quickly? And as much as I hear about that and
2:30
know that when you have to actually apply it, then you start going, "Oh, wait a minute. What a brilliant idea."
2:35
So, we're gonna switch. Um my my partner Bob Mitten uh was one of the people who
2:42
brought it up and said we should start with the website and an MVP there instead of the app. So we're making that
2:48
switch. And then I was listening this morning to something they were talking about websites and scraping the web and
2:55
um that that's getting more and more difficult and I'm thinking well we're going to need to scrape the web. Are we
3:01
going to have to rethink this? you know, with um with the new agent model that's
3:06
come out with ChatGPT. So, so much going on, right, Kyle? So, so so much going on. So much going
3:13
on. Um I I along with most of the other world right now. Um I'm waiting for uh
3:20
OpenAI's agent to actually show up in my ChatGPT.
3:26
Um, but you know, you know what's funny is I got a uh one of one of the irregulars from the from the AI learning
3:32
lab, uh, Cam Katkin, she she sent me a note on Discord and she goes, "I keep
3:37
having Kevin McCallister moments." She just said, "KM moments. I keep having KM moments." And that was the only thing
3:43
she said. And I said, "What what's going on?" Like, "What's giving you Kevin McCallister moments?" Which if you don't know what those are, it's
3:50
that moment, um, when you use AI and it just it just you know your jaw literally drops open,
3:57
right? You're just like um and she said she's having a series of those. So I wrote back to her and I said
4:02
what you know what's going on and it's just she's just using ChatGPT and I
4:07
think she flipped into o3 mode, right? The reasoning model. Yeah. And the o3 reasoning model, if you
4:13
don't know it, because it's got deep research in it and because it can use tools and it can it can, you know, write
4:19
its own code and things like that. She's basically trying to do some research and she's just watching her ChatGPT
4:27
do all these other things and and it's like going off and researching websites and then turning them into spreadsheets
4:33
and and just just all this stuff that she didn't know was in there. And so,
4:38
you know, wow. like even and she you know she's been an irregular for a year or two and
4:44
I think I think one of the things that I see consistently and even experienced it myself is
4:52
you leave a tool for a little bit and you come back to it and it's a completely different thing or even you
4:57
just are using ChatGPT the way you've always used it and then all of a sudden you've got to do something slightly different and you're like oh let me try
5:03
this other tool and you realize this is a fundamentally different thing that I've got in front of me and It's just
5:10
it's just mind-blowing, right? It's just the the the capability of these things is crazy.
5:15
It's really crazy. And you were mentioning, you know, in o3 how you can see the reasoning and it's it's really
5:20
fascinating to watch it say, "Well, now I'm going here." But the user
5:26
Yeah. The user wanted me to solve this and now I'm thinking, "Huh?" Yeah. Exactly.
5:32
Like I'm right here watching you. Yeah. It was funny. I saw there was a there was an interview with uh with Sam
5:38
Altman I don't know yesterday or the day before and and he was saying that you
5:44
know people you know treat these things kind of like humans right they they
5:50
interact with them like entities more than tools and he said you know his preference is that they would interact
5:55
with them like tools but when you when you have that kind of when it has an internal dialogue and you get to read it
6:01
and it's like wow the user did this and I think I should do this like it's responding very humanlike. So the fact
6:08
that it is technically a tool um doesn't mean that you know people aren't going to interact with it like it's some sort
6:15
of entity. Right. And I you know that's just you know new way of dealing with computers.
6:21
Right. 100%. In fact um I just had that moment
6:26
because I I use advanced voice from time to time. In fact, I was telling you that I I teach um seniors how to use AI. And
6:34
one of the things I do is I pull out the AI and I go, "Oh, we've got a problem with the computer. You know what? Not
6:39
quite sure how to solve it. Let me show you." And we talked to the AI and she looks at it and they they get blown
6:46
away. And I so I know of it and I I've used it as a tool. So the other day I
6:51
sat down and I said, "You know what? I want to know more about myself and um you know, all the people that I'm
6:56
touching with the messages. Is it getting out there? Is it not? And um what more can I do? And I started out on
7:03
the computer and then I pulled out the advanced voice and oh my gosh, it was like talking to your best friend. And
7:10
for some reason, that's when it flipped for me. I've been using it for the last six months. But that was the one that
7:16
said, "Wow, I feel really good having this conversation. I feel really prepared." She really it did feel very
7:24
human. Yeah. Yeah. And it and well, I think the it sounds like the key there was
7:32
when when you were using it like fix this computer thing, you're using it like a tool off to the side, but when you're using it for your own personal
7:40
development, all of a sudden it's like she's there as a support for you, right? Yes. 100%. 100%. That's th this is my
7:49
this is my big new epiphany of the past I don't know two or three months is is that the more you can make
7:57
what you put in the prompt about you the the stronger it becomes like the more
8:03
magical it becomes because it's it's basically just amplifying your ideas
8:08
your it's supporting you in doing your thing. Um, I would love for you to can
8:14
you talk about like who are the who are the seniors that you're teaching? Like
8:20
how did that come about? Um, like are they showing up with attitude? Do they
8:25
not want to talk to the robots? Are they like where you know like where where are your students?
8:32
Yeah. No, it's it's very funny. It's um there is a local synagogue that I approached and I said, "Hey, I noticed
8:39
it's an older crowd there. Predominantly older crowd." And I I approached them and I said, "Hey, you know, um I'd be
8:45
happy to do a tech talk for them because a lot of people as soon as they know that I
8:50
know computers, they're like, I don't know how to use my cell phone. There's a problem with this that, right?" So, I
8:56
thought, why don't I just do a freebie and and do a class and help them along because I really I'm very passionate
9:02
about seniors. Um, one of the things I mentioned is that I have a healthc care background. So, I've worked in senior
9:10
programs. I've worked with the underserved, a whole bunch of stuff. So, seniors have a special place in my heart.
9:15
And I moved to San Diego. My parents um are aging and so they're always hitting
9:21
me up with I don't understand and help me out. So, I I did the tech class and
9:26
in there I gave them a little taste of AI and then they loved it and asked that I come back and I said, "Why don't I do
9:32
one that's focused simply on AI?" And everybody was like, "Yeah, because
9:38
I'm I don't I'm a definitely don't agree with it, etc." Right. Right.
9:43
Yeah. Yeah. Don't agree with it in what way? Like what what what are the tropes that they heard? Yeah. Well, there security is definitely
9:50
top security. Yeah. They're afraid that um everybody already knows too much about them.
9:56
Yeah. And they're afraid that if they use AI, all their information is going to be put out there in the universe.
10:02
They're afraid of making mistakes. That's always top of the list. Yeah. Yeah. So, um and then just it's so
10:10
complicated. They think it's very complicated. You have to be a you have to be an MIT scientist to be able to use it, right?
10:17
100%. 100%. Yeah. In fact, there is someone there who um he makes AI robots. I'm going
10:25
to do him a disservice for everything he does, but you know, like they have uh robots at the border
10:30
that can interview people. Yeah. And he's involved with that and he teaches at the university and the whole
10:36
thing. So, I was a little intimidated to say I'm going to teach an AI class just the basics. Right. I'm literally, you know, doing
10:43
robots. Right. Right. Right. Right. Um, so I warm them up a little bit and say, "You do realize
10:51
you're already using it, you know, like Netflix, your Siri, um, you know, just
10:58
different things that you're using and your information is already out there. You know, as soon as you use your bank,
11:03
they're putting everything on the web. So Google, Google saw to that 20 years ago.
11:08
So your concerns are valid. It's just that it's that that ship has long since
11:13
sailed, right?" Yes. Yes. Yes. Yes. And then one of my
11:19
favorite things is to talk to them that um AI is like a library.
11:24
Go to the front desk and you're talking to a librarian, right? And general information, but really what you want is
11:31
something in fiction. So now you're being sent to another person in the fiction area. So now you're getting your
11:36
prompt is getting better, right? Yeah. Yeah. But in fiction, I really I want kids
11:41
books. Mhm. So the more descriptive, the more context, the more information you give,
11:47
the better your results are going to be. And that just kind of opens the door right from there. Huh. That's fascinating. That's
11:54
fascinating. Are they... what can you... Well, so have they had Kevin McCallister
12:00
moments? Like, have there been jaw-dropping moments for them? And is there anything in particular that, you know, surprises and delights them?
12:08
Truthfully, I live for the moment. I mean, that's really the reason
12:14
why I do any of this. I love that light bulb that goes off and then they're like, "It could help me with my travel.
12:21
It could help me with, you know, all these things." So, one of uh one of the things that's fun, a tool that I bring
12:28
in is NotebookLM. Oh, yeah. And I'll say, you know, right here in class, let's take a look at it. And then
12:34
you put the podcast on and they're blown away at how quickly, how personalized, the fact that you can write in to the
12:41
podcasters and yeah, you get a lot of aha moments from that. Or um you know,
12:47
street signs. I'll bring that in as another example. Can you tell me what's on the street sign? Are you allowed to
12:52
park there or not? Especially if you're in LA. And they see how I just quickly take a picture, ask it, and it tells me.
13:00
Yeah. Abs. Absolutely amazing. Yeah. Yeah. And are they like how long have you been doing it?
13:05
Uh about three months now. Oh wow. And are do you have do you have
13:11
uh students who are who are actively using it now? Like what what are they using it for? What are the use cases?
13:16
There's one who is using um Copilot and I I do have to I have to put a what
13:24
is it disclaimer out there. I I'm not into Copilot. Yeah. Exactly. Well, apparently
13:30
Microsoft salespeople are having a tough time because their clients are like, "Yeah, no, we don't want your Copilot
13:35
thing. We got ChatGPT. We'll keep paying for that." Oh gosh. Oh gosh. Oh gosh. No. But yeah.
13:42
No, people are starting to um use it more and more. Um I mean, I tell them, "Hey, you know, if you want directions,
13:48
but in the car, like I'll tell them, have a conversation with them in the car or get your jokes or like you do with
13:54
Siri." Yeah. Except you can talk about anything, right? Yeah. Yeah.
14:00
That's one of one of my favorite things to do. I don't know if you've done this. If if you haven't, it's it's a blast. Is
14:05
is get in the car, put it on Bluetooth, and like when you leave for work, start a brainstorming session and and then by
14:13
the time you get to work, like wrap it up and then you get to your desk and you've got this big write up of whatever you had the brainstorming session about.
14:19
It's I love that. I love that. It's so cool. I love that. Yeah. I I think you were the first one
14:26
to mention that that I picked it up on. Yeah. Yeah. Yeah. It's It's so much fun. Cool. Um Okay. So, let's talk about
14:36
a thing to pay attention to this week. So, a lot of times when I'm doing this, I'm I'm talking about, you know, news or
14:42
some new feature came out or some new model. Um and we're still waiting for uh ChatGPT agent for non-Pro users.
14:49
So, so it'll it'll come at some point. I'm on the edge of my desk waiting. I keep putting it in my groups. I'm like,
14:56
"Has anyone gotten it? Has anyone gotten it?" Yeah. No one I I saw one one person on Twitter today said said they got it, but
15:02
I I haven't seen it. Um I I I I am confident that it's going to be like a a
15:09
disappointing version of Genspark or Manus. Like if you've used Genspark or Manus, it's going to be like that. But I
15:16
do feel like having it in line in a chat is going to be really interesting. But that's not what I want to talk about today. Oh,
15:22
what I want to talk about today is a thing that I think might be worth paying attention to, and I'd love to get your
15:28
take on it. And we we've sort of talked about this a bit already, okay? Is personal point of view.
15:37
And here's what I mean by that. So, so when when you talked about you, you'd been using advanced voice for this thing
15:43
that was kind of off to the side, right? You know, help me figure out this computer problem. And then the minute you brought it to, hey, I'm working on
15:50
this thing and you started interacting with it, you had a completely different experience with it.
15:55
One of the things that I've talked a lot about, I think one of our one of our biggest roles in working with generative
16:02
AI is when it gives us something to say, well, is it good or not? Is it is it bad
16:09
or not? And what that ultimately forces you to
16:14
do is have a point of view. and and say, "Well, wait a minute." Like, if if the
16:20
thing it gives me back is something that I'm not happy with, well, then what would you be happy with? Right? Like it
16:26
immediately puts you in a place of if if I want this to get better now, I actually need to think about, oh, well,
16:32
what does good look like for me? And so, so I think a thing to pay attention to
16:37
this week is what's your relationship with having an opinion, right? With having a point of view. And what's that
16:44
based on? Is it just based on your worldview or is it based on hey I sat down to write a business paper and I
16:51
know business papers look a certain way and act a certain way and right and so is it context you know is it is it just
16:58
about the context of the conversation you're having is it just about your worldview is it some combination of those so I'd love your thoughts on you
17:06
know how you get AI to come to life with you and what's your relationship with point of view but for me I think point
17:12
of view is one of those things people are not necessarily thinking about all the time and when you
17:19
get into AI, I think it I think it makes us confront that a bit. So, I'm just curious what your take is on that. No, I
17:26
think that's a brilliant question and it's something I've been thinking about lately because
17:33
another thing I've been doing is some copywriting on the side and in doing that you know uh I do have ChatGPT
17:42
sometimes sometimes start a draft especially if I'm staring at a blank paper and we all go through that but
17:48
when it gives me that draft and it's a subject I may not know something about
17:53
it gets me thinking Um, not only do I have to make sure that I put the right context in, but do I really understand
18:01
it? And I think that's something that a lot of people are missing now with ChatGPT.
18:06
They get the answer if they just make it sound like it's them a little bit and then they put it out there, but they
18:11
don't think about, do I really understand what I'm looking at? Yeah. Do I have that point of view? do I um
18:22
you know take that logic even further and further which is why I'm really excited about our our guest who's coming
18:28
to understand how we're working with students because that's how we're going to, uh, be able to
18:34
evolve as a— Yeah, it's ultimately a critical thinking exercise. But so one of the things I'm
18:42
curious about is if you've worked in a corporation for 20 years you know there's some corporations they have a
18:48
culture where having an opinion is not necessarily the best thing. Yes.
18:53
Right. Yes. Does that bleed over into not having an opin like if Chat GPT gives you
18:59
something that you don't like? If your instinct is don't have an opinion,
19:05
is is that hard for you, right? like is is this going to is this a skill that's going to be more hard for some people
19:11
than others just because they've never really had to do that or or they might be in a work environment where doing
19:17
that is actually you know politically dangerous right I don't know
19:24
you know I think even more, just as interesting, adding to it, is um
19:29
the yes and part right is um our company is going to create the
19:37
environment for AI where there's not a lot of room for opinion because they're able to set up the
19:44
custom instructions or the way they set up their AI, maybe with parameters.
19:51
Yeah. So you have to work within this box versus right you know doing your own having your own ideas and
19:58
thoughts. Yeah. There's also you know maybe one of the downsides of models getting
20:03
stronger. So, so if if AI models get stronger and
20:09
get to the point where they don't hallucinate as much and they don't make as many mistakes, it's going to be
20:14
really easy for people to just go, "Oh, I'll just let the AI answer it." And I think it's already easy enough for people to do that, right? But but right
20:21
now, they hallucinate so much that that's kind of irresponsible. But when these things get better, it's going to
20:27
be easy for people to just go, "Oh, I'll just take that answer, right? Because I know it's probably right." But but then
20:32
there's no critical thinking at all there, right? So, I think I think there's there's a there's a bit of a danger in these things getting smart
20:39
enough that you don't have to engage with them that people just won't. And and I think that's a bad thing because I
20:45
think in the end for me when AI like becomes magical is when it's like I have
20:52
this vague idea of something in my head and I put it into a machine and what gets reflected back to me is like
20:59
really good and I'm like oh that's really good or it's clearly not what I
21:04
had in mind and I'm like oh I must have not communicated that well because that's not at all what I thought and
21:09
then and then I'm in some sort of thing but it's, that's a very engaging, like,
21:15
personally enlightening experience as opposed to just oh it made some stuff for me and I'm going to send it out to
21:21
the world, right? Which is this very disassociated thing. My mind is kind of splitting on on what
21:27
you're saying. It's very interesting because um another thing that I do is I work
21:34
with Girls With Impact, and it's sort of a mini-MBA program for underserved girls, and at the end they
21:41
end up with a a business pitch and in doing that the work that's coming in from them like
21:46
I could tell right away one of the girls she just went to ChatGPT and got her answers and put it out there. So I think
21:54
didn't didn't have it in her head and yeah, she may have had that initial thought of here's the project I want to
21:59
do, but it was very formulaic. Yeah. Yeah. Yeah. In the responses and I think what people
22:05
are soon going to discover is you can recognize that. So even if that is the way the world goes
22:13
and the critical thinking may not be there, people are going to catch that. At least the maybe the older generation
22:20
like at a certain point is definitely going to catch it. The question is whether or not we teach that soon enough
22:26
to the younger generation so that as they age they don't lose that. Does that make sense?
22:32
No, it makes it makes perfect sense. And I I I mean I think you're hitting on something that
22:39
when when everyone has the the
22:45
capacity to be able to execute good writing, good images, good music, good
22:51
coding, good right when everyone has that capacity, it doesn't mean that we're going to see
22:58
all this brilliant work. it what it means is like right now the the sort of
23:03
low lowest tier content um is just bad content and and what's
23:08
going to happen is AI is going to raise the level of that but that's still going to be the floor
23:14
and the floor is always going to look the same the floor is always going to look like some version of garbage that
23:19
you instantly recognize oh that's just the that's just noise right and and the
23:24
things that'll rise above that even in a world of kind of infinite content generation are things where someone
23:31
created something with intention and with meaning and with a point of view with taste, right?
23:37
Yeah. And I think that's going to be the stuff that rises above. I think your point's a really good one that that when when
23:42
these girls brought in the work, one was immediately apparent, that she wasn't
23:48
connected to that. No. Right. Like that that to me feels really important. I think Mr. K is going to
23:54
have some some things to say about that too, right? you know, dealing with students and you know, who thinks they
24:01
can get one over. I I think he deals with middle schoolers, too, so they're probably the worst at this.
24:07
Definitely. Definitely. And and you'd start seeing the same thing in everyone's work. So, you know, I mean,
24:14
so how can you get away? That's that's that bottom, right? You just start to see and you can you can see it just just flicking through um you
24:21
know, flicking through X or LinkedIn or whatever. You can just tell someone put in the minimum the minimum prompt and
24:28
they got back the uh like that sort of 90s illustration with like a cyber dude with a brain inside a an outline of a
24:36
profile with the brain glowing like that one. It's like okay you didn't really try on this one.
24:43
Exactly. Exactly. And actually, that's a really great illustration, if you will, of the point because you
24:49
can tell, you know, the the picture itself, they all look green.
24:55
Yeah. Yeah. Exactly. Exactly. It just looks the same. Yeah. It just all looks the same. Well,
25:01
it's funny. I did the LinkedIn post about Chain of Craft, um,
25:07
where I got I got yelled at by all these artists and and one of them pointed out that the crappy piece of artwork that I
25:14
that I put with the piece I think was was one of the reasons they pushed back
25:19
so much because it it was it was just a like I like I was proud about the article but I'm like oh it just needs an
25:25
image and so I just quickly threw an image in there not really connecting that people might look at that and go
25:31
Wait a minute. There's something incongruous here where you're talking about Chain of Craft and clearly you
25:37
didn't have much for that image, right? So So I got a bit busted there, but that was your That was your tell. That
25:43
was your That was Yeah, I gave him a tell. I gave him I gave him ammo. Like you did such a great job with the
25:49
Now can I trust the writing, because your picture... Exactly. Although Vicki says she can
25:56
relate to that. Although Vicki, Vicki at least went this far. She's got a mood board. Like, have you
26:02
noticed Vicki's posts always have that kind of cool whimsical illustration style? It's like it's become a brand for
26:08
her. So she can now knock out something quick that doesn't look like everything else out there. It looks like her thing.
26:14
So I you know what I I'm glad you pointed that out. I absolutely love her post
26:19
just for the pictures alone. Exactly. Exactly. No offense, Vicki. You've got great
26:25
posts, but those pictures, man. Oh well, I want to get Mr. K up here. So, so I want to talk about the AI
26:31
salon. So, if you if you are not a member of the AI salon, you you need to upend your life and become a member of
26:38
the AI salon. Uh you should also join the AI salon mastermind, which is a subscription area, smaller, more focused
26:44
area of the AI salon. Um but the salon is a community that that I started the
26:49
week after ChatGPT came out, with Leah Fon, a photographer out of Boston. And
26:54
it's really about people essentially doing what we're doing here, right? Trying to figure this stuff out, trying
27:01
to make sense of it, trying to figure out in a world where everyone can create everything, how do you how does something rise above the noise? Well,
27:08
like you're starting to recognize, oh, here's stuff that definitely doesn't rise above the noise. Right. Right.
27:14
And that kind of dialogue and that kind of exploration is what the salon's all about. Um, it's it's a it's a remarkable
27:21
uh community of people. Very generous, very creative, and uh and just nice people. Nice. It's it's a good group of
27:27
people. Um, so so yeah, join the salon. No, it really it really is. We um I'm
27:34
going to do a little plug there. We we have uh the business club within the AI salon and that's where business minds
27:40
are coming together and we're talking about what's going on with the job market. What is, you know, how are you seeing things affecting the, uh, the
27:47
job place so yeah what's yeah yeah what's required what's nice to have right
27:54
I think this I think this AI fluency AI readiness is is not a nice to have anymore I think it's
28:00
it's really starting to become a requirement it is it is I mean when you even just
28:06
the way you apply for a job now I just saw a post where someone was saying you had to demonstrate AI by the way you
28:14
used the interview questions that they sent you. There was something along those lines. So now they're starting to incorporate AI into the
28:22
interview beyond the bots and all that sort of stuff. Yeah. Yeah. Yeah. That's really smart. It gets back to Chain of Craft. Well,
28:28
why don't you tell the good people about um She Leads AI. I think you are qualified. You're a member of that. You
28:34
you know Anne Murphy. You've met her. Oh, I love Anne Murphy. Yes.
28:39
How can you not? She's got such a great personality. She's so kind and inviting and uh very very warm and incredibly
28:47
supportive and that's why I like the um She Leads AI as her group and
28:52
community and she's really built something special there for women where we're able to support one another and
28:58
really lift each other up. Um she's got the uh on Saturday mornings uh she she
29:05
has a chat for two hours on Saturday mornings, 10 o'clock West Coast time. You'll have to do the translation for
29:11
your time zones. Um, and it's really great because we we help each other out.
29:18
You know, it's you could be starting out and saying, "I don't know the first thing about AI and I'm scared." And like
29:24
you say all the time, Kyle, people will come and say, "Excuse me, I'm sorry. I'm sorry. I'm sorry." And you're like,
29:29
"Don't be sorry. We're all learning together." And that's really what that community does. It's women
29:35
celebrating women. They've got people who never did a podcast before, now doing a podcast, never did this before.
29:41
You know, Chef Kelly came out of that. Um Yeah. Yeah. Did her film strip. Yeah. So,
29:46
it's it's amazing. And it's just about success. Beautiful. Love it. And then the the
29:53
other thing that is now live is the AI readiness training program. So, this podcast came out of that. So, so at the
29:59
at the uh beginning of the year, we did this thing called AI Festivus. And Mr. K, who we're about to bring up, was a uh
30:06
was a speaker there and uh and and we turned that into um this this five-part
30:12
training series called the AI readiness training program. And we worked really hard uh Vicky Baptist put together the
30:18
training and we worked really hard to make sure that the training is evergreen, that it's not tied to any
30:23
technology. This really is about a mindset shift and and and being adaptable and curious and adventurous. A
30:29
lot of the stuff that we've organically discovered in these communities, we've now put into this training. So, uh, if
30:35
you're looking to shore up your AI readiness, go there and check that out.
30:40
And I'm gonna add one more thing to it is have fun, right? Yeah. Yeah. Exactly.
30:47
That's what what you put together and what the community does. It's all about let's have fun together and figure this
30:53
out. Yep. Exactly. Exactly. So, with that, let's uh let's bring up
30:58
our guest of honor, uh Matthew Karabinos. We'll talk about him a little bit; he's not here, so he can't defend
31:05
himself, but but he's he's absolutely awesome. He's an educator. Um and he has been a
31:12
member of the AI salon uh for for at least a year now, but for for a good long time, and has been a champion of AI
31:21
in central Pennsylvania. He's up in Harrisburg. I grew up in York, Pennsylvania. I know what central
31:27
Pennsylvania is like. Hey, he's going down to the mole today. Like, I know how they talk. I know how they think.
31:34
And uh he is fighting the good fight in the middle of some closed-minded action. So, I am so
31:42
excited. Mr. K, Matthew Karabinos, is it Karabinos? Did I say that right? You got it. You got it.
31:48
Beautiful. Good day to you, sir. Good day. Hello. Good day. Welcome. Welcome. Welcome.
31:54
Yes. Thanks for having me. Like I I love podcasts. So when you were like I
31:59
randomly jumped into your live on on TikTok Live and you were like, "Hey, let's do it this day." And I was like,
32:05
"Cool. Let's do it that day." Whatever. So it works for me. You're like, "I'm in. Let's go."
32:10
Yeah. I'm in. Yeah. No, thank you for having me. I uh I wanted to make a quick correction. Altoona.
32:17
Oh. Yeah. So still central PA. It's pretty much the
32:23
diagn. But yeah, big region. Spelled with O's and not a U.
32:30
Correct. Correct. Sounds like Toon Town. Yes. Toontown. Yes.
32:35
There you go. Yeah. Exactly. Exactly. Um, so, so why don't you, you know, catch us up, tell the
32:41
introduce yourself, tell the people who you are, where you work in Altoona. Like, clearly I knew that. Everybody knows
32:46
that it's Altoona. But but just, you know, give us a bit of your journey because like I I feel like
32:53
you're you're you're on one in the middle of that region.
32:58
Yes. Uh roller coaster is an understatement. So this is like a like
33:05
the beast roller coaster from like Six Flags. Um but no, so I I um I live in
33:12
central Pennsylvania. I live in Altoona. Um I do not teach in Altoona. I actually teach in a suburb of Altoona, which
33:20
Altoona is kind of a suburb itself. So I teach in a very very rural school
33:25
district uh outside of Altoona. The whole enrollment for the whole school is about
33:32
300 kids. Oh wow. Um so it's one of the smallest it's a middle school or middle and high
33:37
school combined. It's it's that is from K to 12. Oh wow. 300 or so kids. Yeah.
33:43
Wow. Uh there's about 40 about 40ish kids per grade level.
33:50
That's crazy. So very very small school district. They um they've been around for a long time.
33:57
So they're they've been, you know, dealing with you're too small, you need to merge for a long time, but they have
34:02
a very very strong uh tight-knit community out there. So they've been able to like make it work and stay on
34:09
their own, which is admirable, uh to say the least. So, I teach in a very rural school district. I am a very very uh
34:18
passionate person about uh AI as pretty
34:23
much anyone who's ever spoken to me knows about. Uh you're you're one of those one of those
34:29
annoying people at dinner, right? Oh, I'm 100% that annoying person at dinner because my brain never shuts off
34:36
when it comes to AI. So, when someone's saying something and I'm I'm literally like like trying to like bite my tongue
34:43
or just like keep eating so that I don't say the thing that's there, you know, like, you know, and my wife, you know,
34:50
she's like hitting me under the table. Don't do it. Don't bring it up. Don't talk about it, you know. Um, but no,
34:56
like I I'll be honest with you. I I jumped in really really fast to AI. Like
35:02
right away, like in November 2022. I had the same
reaction as a lot of people, which was, uh oh, this is how you get Skynet. I was like quoting Archer, you know, like
35:15
this. You want you want Skynet? This is how you get Skynet. Like you went right to right to the sci-fi
35:21
tropes, right? Immediately. Um because when I started seeing like how it was responding people
35:27
like even the early you know 3.5 or three I guess it was that that came out
35:33
first. Um the fact that it was able to like respond so well.
35:39
Yeah. Even then you're like that's creepy. Um that's a little bit creepy. So
35:45
I paid attention to it kind of in the background and kept seeing more and more stuff. my friends, um, you know, we're
35:53
all like elder millennials. They're like, "You need to get on TikTok." And I'm like, "No, I don't absolutely do not
35:59
need to get on TikTok." I fought that for years. Um, until finally
36:07
they kept sending me TikToks and then it would scroll me to something else and it'd be like a teacher sharing like,
36:12
"Oh, look what I did with AI." And I was like, "Oh, I heard about that or I already knew that or whatever."
36:19
And then literally the next spring I created my first quiz because I needed like a oh crap I need a grade in the
36:26
grade book assignment. So I was like I need a really fast quiz and I need it like right now. Give me a
36:33
quiz on prepositions like just that's literally all I said was like I need a 10 question quiz on prepositions for
36:39
sixth graders I think is what I said. And uh it generated real fast. I like
36:44
looked over the answers real quick to make sure that like it was legitimate and uh and all the answers weren't C. So
36:51
like I was making sure that too. But but um I I was like wow that did that
36:59
really fast. And my first question which goes along with my message of my shirt
37:04
here, be curious. Uh the bottom says not judgmental. That's a that's a quote from Ted Lasso for all the Ted Lasso fans out
37:12
there. Love, love that. Yes. But uh but um I was cur I was like I wonder
37:19
what else this could do. And then enter rabbit hole. Uh so here
37:25
we are. Yeah. Yeah. But I started to realize that the things can I just reframe what what was the
37:32
what was the timing of that? Like when when was that when you started going down the rabbit hole?
37:37
May 2023. So, a couple of
years now. Yeah. May 2023. So you got into it sort of toward the end of the school year. Had the summer
37:49
to get indoctrinated into the Skynet philosophy. A million%. Okay. And then and then so you started
37:55
the next year all gung-ho, right? Yeah. Yeah. Absolutely. And that's that
38:02
summer was like we my wife doesn't like to talk about that summer because that
38:08
was like the summer of AI like I literally got lost and every single day
38:16
every single day I was like did you know that it could do this? Did you know I
38:22
one day I said this on another podcast I was on and one day she literally said I
38:27
swear to God if you say something about AI right now I'm going to punch you in the throat. I I I'm going to punch you.
38:32
I do not want to hear anymore about it. And I said heard I need to like tone it
38:38
back in my that's how my brain works. I go when I'm on something I go. And you
38:44
know that's sometimes a good thing or a bad thing for a teacher but usually pretty good. I I go and I get into
38:51
things. And so yeah, I went heavy. I literally I would shudder like I don't even want to look
38:57
at my Google account to see how much I signed up for in that first summer.
39:02
Literally anything that had an account I signed up for immediately. Um and it was overloaded and I was like,
39:09
I want to learn how to do this and I want to learn how to do that. And you know, I joke with people, one of
39:14
the biggest edtech platforms out there is Magic School. And uh I joke with people that I signed up for Magic School when
39:20
they had four buttons. Uh now they have like 120 buttons and
39:26
they have chat bots and they have all this other stuff, you know. I signed up when they had four buttons and they were
39:31
incredibly simple, you know, but like it right there. I I saw
39:38
probably what a lot of other teachers saw too was that there's more to this
39:44
than just making quizzes, making worksheets, doing the stuff we've been
39:49
doing faster. There's more there. Yeah. Especially Yeah.
39:54
Yeah. Especially once I started with, you know, 3.5 and then four. Four really
40:00
was the was the tipping point for me. that changed how I view what I can use
40:06
AI for. Um because then you started to get into a much more advanced model and of course
40:11
now with 4o and o3 and deep research and all these I mean it's it's nuts. So
40:17
can can you that's where Yeah. Can you walk me through I I want to hear about I want to hear about because it it I as I recall
40:24
you had a good part of a year where you were just using it and then you were hot basically. Um, but I would I would love
40:31
to understand from you and maybe it was what you experienced
40:36
over the summer, but I'd really like to understand how you got to the other side of, oh, this is just something the kids
40:43
are going to use to cheat with versus this is something that could be a profound educational tool. like you
40:50
didn't seem to start in that other that other place or maybe you did but like can you just unpack that that particular
40:56
relationship because I hear so many teachers are like well they're cheating with it and it's evil right you know so
41:03
like how did how did you miss that particular trap so this comes from just my teaching
41:11
philosophy so like I was never so I'll just full disclosure and my kids love
41:17
hearing this they get so confused I I did teach reading for like two years
41:22
and when I my first thing I would tell kids when I teach reading is I hate reading and they're like what I was like
41:31
I don't like to read. Like for fun I don't like to read. Like I can't do it. Like I start reading I fall asleep. I
41:36
don't know what it is. I can't read. Like I I can read. I can do it well but like I just
41:42
it like I literally have to like force myself to read. And they're like but you're a reading teacher. And I was like, that's the
41:50
point I'm trying to make to you here is like I am good enough at reading that I
41:55
can still teach you reading and I know what I need to teach you and I know what I need. So
42:01
my whole philosophy I I never once like I mean I I heard it from other people
42:06
but I was like there's there's so much more that it can do like writing a paper
42:11
is like whatever. So, my first reaction when I heard about AI writing papers, AI cheating, stop writing papers. That was
42:17
my like I immediately went to don't write papers anymore. Um, have them build something, have them make
42:23
something. Um, have them present information. Presenting is way more valuable in my opinion than writing.
42:30
So, have them present something. Um, AI, you know, and that's what so that's
42:36
where that's why I didn't really fall down there is because I was already on the other side. I I was already at like
42:43
I don't I don't like writing essays. But yeah, I will say
42:48
the hard part for me and this is this has been my major message as of as of
42:54
late is the educational system that I am bound to forces me to still have to
43:01
teach essays. It forces me to still have to teach multiple choice questions, how to
43:08
answer them, how to like read and and slash the trash and all like I still
43:13
have to teach those things because I'm bound to the system that I am in. So my my message more recently is I would love
43:20
to just change the system to be more open uh not so rigid into
43:27
what we do. Um, that's a bigger ask. Well, it sounds
43:32
like what you're doing, um, is something that we were talking about a little bit earlier, which is you're starting that
43:37
critical thinking much sooner because you're getting them do the logic part, right? Which forces
43:45
you to read. Yeah. Yeah. And how how are the other teachers
43:51
responding to this as they see you doing this? So, I get mixed I get I get mixed
43:57
reviews in my uh school. Uh, again, I'm me, so I'm I'm very much a lot to a lot
44:03
of people. Um, I'm trying to work on that so that I'm not so like much. Um,
44:10
but hang out, you got to hang out with big bigger people, right? Yeah. Uh, why do you think I'm
44:16
here? Uh, why am I in this group? Right. You know, but um like so people are the
44:23
my colleague that was my closest colleague the last three years, she's like, I gotta get on that. I gotta try that. I got to try that. Just last year
44:30
finally, she's like, I want to show you what I did with AI and she held up her computer and she did something that
44:36
I didn't realize yet. And it was when the new image generator came out. She made a personalized sheet that was a
44:43
like decodable story for one of her kids. And this kid liked um wanted to be
44:50
a national park ranger and like liked bears. So she had an image of a national
44:56
park ranger like talking to a bear and then there was a story that was on the image that went along with it.
45:02
Wow. So that was she showed me the like how the new like image generation with way better text in chat. She was like oh he
45:10
loved this and he did the— and he was like way engaged, and I'm like, ah, there you
45:15
go. Like so it takes a little bit of time and I understand that. So yeah, you know,
45:21
I I was very patient with her. I just said, "Look at what I did. Look look at what I did here. Look at what I did
45:28
here." And I have another teacher who, you know, she would start talking to me about the
45:33
frustrations of what she was going through. And I would say, um, you know, and she goes, "Don't don't bring up the
45:39
AI stuff. Don't even don't I don't want to." Okay. I said, "Okay, heard heard.
45:45
Just okay. Um, never mind." you know, and then it's like, so I I get both ends of the spectrum and,
45:52
you know, so how I try to deal with that is I look at them where they're at
45:57
and that's how I handle any training that I've ever done with teachers is I start with what is I always say,
46:05
what's the one thing you hate the most about your job? Like what's the one thing you hate doing? Yeah. Use that as an unlock to make it
46:12
relevant to them. Exactly. And so that's that's an easy way in. Um, I also use a good one too
46:19
that usually shocks teachers, um, especially early on with AI, which was,
46:24
what's something that you think AI can't do? Like like
46:29
like so I always tell them, do you think that there's an assignment that you have that you don't think AI could do?
46:35
Yeah. And then I obviously I've gotten pretty good at prompting by this point.
46:41
Yeah. I will prompt the AI and it will do the assignment that they did. And I said,
46:46
"Now, I'm pretty advanced." I said, "I doubt there's a lot of students on the same level as me, but
46:53
I just wanted to show you that no matter what you think about your assignment, whether it can or can't be done with AI,
46:59
there's a way that AI will be a part of that. So, you need to start thinking about that as you build assignments in
47:06
the future." Absolutely fascinating. Can I sneak in one more quick question?
47:11
One more question, then we'll go to the questions. Jump in. I know I'm pressing the button here, but um parents, are you
47:18
getting resistance from parents? Yes. So, I had to go through a learning
47:24
phase just like everybody else is doing this. I when I first uh came in that
47:30
school year after that summer where I was obsessed with AI, I just went all in on myself and I started using it for
47:36
everything. I started signing up for programs and putting my kids in programs. I didn't have the knowledge at
47:45
that point to question these tools, to question
47:50
student data privacy, to question where does their data go when they're talking
47:55
to this chatbot. I put my trust in the tool, which is
48:01
probably what a lot of the teachers did. So, you know, I've never really shared that with anybody. So, big reveal. Um, I
48:06
screwed up. I screwed up big time. I I did that. I also did not like ask my
48:13
admin about it and I did like so I I went the totally the route that probably a lot of
48:18
teachers went which is nobody else around me is talking about this so I'm just going to test it out myself
48:24
and I learned right there like okay there's there's a better way to do this
48:30
now the second year I came in and I had some guidelines okay you're you're
48:35
allowed to do this but don't do this and so I followed those guidelines but even with those guidelines,
48:42
you know, parents called in and were like, I don't want my kid using AI. And I was like, well, your kid's not using
48:49
AI. Um, I'm using AI to give your kid first feedback on something and then so
48:56
they can improve it and then I can grade the final product. Or I'm using AI to
49:02
give it feedback on a formative test like an autograded thing, which isn't
49:07
even really AI at that point. It's just did you get it right or wrong? And that's all I need to know.
49:12
And so there's a big misunderstanding. Um this year I wanted to propose that I
49:19
have a community night before school starts and ask me anything.
49:25
Bring any parent, any student can come in because Kyle, what happened throughout the course of the year is
49:30
that my kids, first of all, they know that I have TikTok. Like these kids know how to maneuver TikTok better than
49:36
I do. They found me. They know that I like this topic. They know that I can talk
49:42
about this topic forever. It's part of the reason my name is Mr. K all day because it's what I can talk about all day. But so when they bring it up in
49:50
class, they know and they've used this to their advantage, which I need to fix. But, you know, they know I will go off
49:56
on a tangent anytime they ask me a question about something that I'm interested in. But I'm also I feel obligated to help
50:03
them learn. Yeah. because no one I know this for a fact no one else is teaching them how to
50:08
do it and they're they're gonna enter a world where this is everywhere right I mean
50:14
think about like how much has happened in the past three years and your kids are in middle school right now they've
50:19
got all of high school to get through then college if they go to it if it still exists at that point
50:24
Right. Seriously. And then and then they're out in the world. Um, yeah, okay, we got to ask you the three questions. Okay, so I'll go first, uh,
50:32
then Gwyn will go and then we'll wrap it up. So, so we ask all of our speakers the same three questions. There are no wrong
50:38
answers, but we may make fun of you. Fair. That's fair. Yeah. We're judgmental, not curious.
50:47
Fair. All right. Lay it on me. I'm ready. Okay. So, the first one you you kind of
50:52
talked about it, but maybe unpack a little bit what happened in the summer. So, it's what was the tipping point where you knew you had to go all in on
50:59
AI and what's happened since then? We've been talking a lot about what's happened since then, but I would love to understand what was your experience once
51:06
you realized, you know, you you you once you discovered it, uh what what what happened next?
51:13
Once I started seeing people on like TikTok and social media talking about the ways that they could like run businesses
51:19
or like do marketing and do things like that, I started to realize that like AI was a little more than like you know uh
51:27
helping me write something or helping me word something or brainstorming with me. Like it is like literally a tool that
51:33
could be used to like build and work with things, right? And with people. So
51:39
that is really that summer when I started to see how like businesses were using it and all the you know create a
51:46
digital product and put yourself out there with a digital product and then get a lead magnet, get this, get that,
51:52
start an email newsletter, start all this and I was like if people can use AI
51:57
to do all of that stuff and again this is the early AI, right? So like,
52:03
so like at that point when people are like it can use AI to do all this stuff, I'm like, oh my gosh, like my students
52:10
are going to leave school and like be able to build a business, like they could build a business before they left
52:16
school if they wanted to. Um, and you have a lot of stories around the country of kids doing that. They're building
52:22
businesses before they even graduate high school. Yeah. Um and then and then if they're good,
52:28
uh they'll keep them or they sell them to a larger company who loves their product, you know, and then they're millionaires at like 18. Um
52:35
but like that was really my tipping point was when I saw the the more
52:41
advanced ways that people were using it. And so that pushed me. It was like, okay, well, what are more advanced ways
52:48
that teachers can use it? M and so for me it turned into instead of like automating my creating quizzes,
52:56
creating tests, the big one at the beginning was creating rubrics because rubrics are a pain to to develop. Um
53:03
yeah, I'm nodding like I understand what a rubric is. That's okay. Uh that's okay. I I won't
53:08
that's for another time. But um I'll ask ChatGPT. I'll be good. There you go. Perfect. It'll make you one beautifully by the way. So, um, but
53:17
once I realized like, wait a minute, I can, so I like to like move around my
53:22
classroom, and I'm sure a lot of my students like to move around my classroom. Um, here's what I have to
53:28
teach. Here's the standard I have to teach. How can I incorporate movement into this lesson?
53:34
How can I get the kids up and out of their seat? Um, how can I gamify this? How can I turn this into a game that's not just
53:40
Jeopardy? Like, stop giving me Jeopardy. That's like boring. How can I turn this into a game where I get kids up and out
53:47
of out of their seats moving around? And then it it you know and then I started learning from other people. That's
53:53
that's summer two after I started TikTok. I revamped my LinkedIn and I
53:59
took off on LinkedIn like like way faster than I expected to on LinkedIn.
54:04
But again, it's because I was sharing ideas and learning ideas from people who were doing some truly like amazing
54:12
things with AI in the education space, right? And that like literally opened up my brain. So that's that was my tipping
54:18
point and that's where I'm at now is trying to rebuild how I teach
54:25
uh, based on the things that I like to do. And and here's the other cool thing. Every year I can spend the first week or
54:32
two reading my class. What do they like to do? Are they can they not sit in their seats? Do we need to get up and
54:37
moving? I can adjust my instruction on the fly every single year, every single week if
54:45
I need to. Like that's a superpower. That's a superpower.
54:50
That's awesome. And they're probably also changing year-over-year where they're finding themselves not sitting
54:55
still for for you know shorter and shorter periods. So this brings me to our next question which is really if you
55:01
could tell us about some trends that you're seeing in your area of expertise and um paying attention to why
55:09
why you think that is but in a 30 second spot. No that's okay. So the big one I'm
55:15
paying the big one I'm paying attention to is all the big companies uh Google, Microsoft, OpenAI, Anthropic, they're
55:22
all making plays in the AI and education space. So Google just had its big, uh,
55:28
announcement about Gemini being introduced into Google Classroom and Google edu workspace. That's going to
55:35
catch a lot of schools off guard uh if they don't realize that like oh hey I open up Google classroom all of a sudden
55:40
I have this new suite of AI tools there. And Google right now they're allowing
55:46
teachers to build custom gems with their Geminis. So again, a teacher who hasn't
55:51
even experimented at all with AI, now you're throwing, you could build your own AI. They don't even know where to
55:57
start with the original one. And it's like, so the idea in education is like I
56:02
like that these companies are trying to get involved. But I am very cautious as to how involved they get
56:08
because right now, what's happening, what I see, the trend that's happening, is these education companies are going to come in
56:14
and say and I just got this notification like literally as we were sitting here
56:19
the White House just released an AI and education like plan of what they're going to do right so these companies are
56:27
the ones who want to enact that plan and obviously there's a very big partnership
56:33
going on there. The education industry itself is a very lucrative market. So if all of these
56:39
people make plays in that market now you've taken the teaching of this AI
56:46
and the instructional knowledge that all the teachers have. You've taken that power away from the
56:51
teachers and put it in the hands of the tech companies. Right. Right. Right. Now the tech companies are going to be the ones saying this is how you should
56:57
use AI. Right. And you know, yeah, they've also opened uh a number of them now have
57:04
services components. So, they're going to say, "Hey, we'll come in and teach your district how to how to use these
57:10
tools as opposed to coming from people like you." In fact, if anyone from any of those companies sees this, I'm around. Just saying.
57:17
Yeah. Yeah. Seriously, bring in bring in people that are on the ground that understand critical thinking. So, yeah.
57:23
Great. Okay. Last question. This is an important one. AI readiness project. What does AI readiness mean to you? And
57:31
then what would you say to someone just getting started with AI? So AI readiness
57:37
to me is really I'm going to connect this to another like group that I'm a part of is called the human intelligence
57:43
movement. Um so if you haven't heard of the human intelligence movement, check them out on uh LinkedIn. They have a website
57:50
humanintelligencemovement.org. But their goal is to it's one that I'm I
57:56
very much am in agreement with. Um I've been with them very much since the beginning. Um despite and even though AI
58:04
is here um we can't lose sight of our human skills. Um you know and teachers like to always
58:11
reference like the four C's. I I throw in a fifth C which is curiosity because
58:16
I really think that nothing else matters if you're not curious, like if you don't want to learn anything
58:22
then you you don't, you're just going to sit there and not learn anything. 100%. Being ready, one of the big skills they
58:28
talk about is adaptability they they they say that adaptability is going to be one of the the most important human
58:35
skills that we can have going forward and that ties in with AI readiness so AI
58:41
readiness to me means that you can adapt when your future company or when
58:47
your future thing that project that you're working on now has AI integrated throughout it.
58:54
You're already ready for that. You have played with it, right? So, I know you we
58:59
talk a lot in AI salon about play play with it. Yep. Play with it and challenge it and
59:06
stress test it. And I think that's the way forward. adapt being able to adapt
59:11
to what um to anything that AI gets thrown into what we're doing in the
59:16
future. That's what AI readiness means to me. Yeah. No, I agree. I think it's I think it's one of the one of the the biggest
59:23
the biggest um I guess it's an attribute of of people moving forward is is is
59:29
adaptability. Absolutely. And it's not just, you know, what what AI is going to do, but it might be that AI completely
59:37
upends how you're how your industry does its business or how education is taught, right? Like
59:44
part of adapt might be yeah part of adaptability might be I build all this stuff and then six months later all that
59:51
stuff is irrelevant and I have to throw it all out and not get bitter about that. Right. It's there's a there's a
59:57
resilience that that is required with with adaptability because I think things are going to shift a lot over the next 5
10, 20 years. It's like I don't think it's slowing down. I think it's living
in that uh living outside the box, thinking outside the box on a constant basis instead of a unique basis.
That makes... yeah, beautiful, Mr. K. I, you know, we need
another hour here. I feel like we're just getting started, but you know I think it's good when the time flies,
right? This is awesome. Hey, I always tell people there's a reason why my name is Mr. K all day. I
could literally talk about AI all day. So that's awesome. That's awesome. Awesome.
All right. Well, great seeing you. Thank you so much for doing this, Gwyn. Such a pleasure to do this with you. Thank you
for hopping in. And my pleasure. Thank you for having me. This was fantastic. Awesome.
And education is a terrific topic. So, thank you. Take care, guys.
Thank you. [Music]