Amplified: The Chesapeake Public Schools Podcast

AI in Our Classrooms: Part 1

Chesapeake Public Schools Season 2 Episode 15

In our latest podcast episode, Dr. Jeffrey Faust, our Chief Technology Innovation Officer, explores how we are starting to incorporate artificial intelligence in our schools. He shares practical applications of AI, including how it helps streamline teacher workloads, while also touching on critical considerations such as data privacy and the ethical implications of AI in learning environments.

Listen today on your favorite podcast platforms, as we explore the evolving landscape of education technology and the importance of equipping students and teachers with the skills and best practices needed to thrive in a future shaped by AI.

Speaker 1:

Welcome to Amplified, the Chesapeake Public Schools podcast.

Speaker 2:

Chesapeake Public Schools is located in the Hampton Roads region of southeastern Virginia. We proudly serve over 40,000 students in 45 schools and three centers. Join us as we share the stories behind our story by celebrating the people and programs that make us one of the premier school districts in Virginia.

Speaker 1:

Hey listeners, this is Matt Graham and I am here with Chris.

Speaker 2:

Vail.

Speaker 1:

And we have yet another hot topic to discuss today, right, Chris?

Speaker 2:

Yeah, we're going to dive into artificial intelligence, AI, in the classroom.

Speaker 1:

In previous episodes, a lot of teachers and administrators have mentioned how much technology has changed over the course of their years in education, and right now AI is at the forefront.

Speaker 2:

Yep. Imagine, Matt, when you and I started teaching together at Great Bridge Middle. We were excited if we had a whiteboard in the classroom and an overhead projector. Now we're talking about AI.

Speaker 1:

I know, man. Times change. It's so much of a hot topic that, even while we're making this episode, our governor announced an artificial intelligence task force that will meet to discuss policy standards, IT standards, and education guidelines. And we just recently spoke with our Chief Technology Innovation Officer, Dr. Jeffrey Faust, on this topic and, big picture, what we're doing as a district to start to incorporate AI, its best practices, and so forth.

Speaker 2:

Hey, Dr. Faust not only brings great hair into our studio, but he's also our expert on AI for Chesapeake Public Schools.

Speaker 1:

Absolutely. Today we have with us our Chief Technology Innovation Officer, Dr. Jeffrey Faust, who has been with us since 2020, right? That's correct. So welcome to the podcast. We're happy you're here.

Speaker 3:

I'm glad to be here.

Speaker 1:

I've genuinely been looking forward to this. Well, today is all about AI, artificial intelligence, and who better to help drive this conversation than you?

Speaker 3:

Well, I can think of a lot of people, but they were probably busy winning Nobel Prizes and doing other things right now.

Speaker 1:

Correct. Earlier this month, the Nobel Prize in Chemistry, I believe, was awarded to three individuals for predicting protein structures. I think that's right. From what I was reading, it used to take years to navigate that kind of work, and they built an AI program to help get it done, obviously a lot faster.

Speaker 3:

It is a buzz. You know, what AI offers for medical sciences is something I think there's a lot of optimism around.

Speaker 1:

For those of you who don't know, you've been with us since 2020, correct? So tell us a little bit about yourself, your background, how you got here.

Speaker 3:

So, you know, I'm not a Virginia native, and I own that, but I have lived in Virginia longer than anywhere else. What happened for me is, out of college I graduated from IUP and was going to be a teacher; I come from a family with many teachers and many educators. What was going on in Pennsylvania at the time was contraction, economic contraction, and jobs were tough. And a principal of a school that I interviewed with said, hey, listen, I'm going to tell you what I wish somebody would have told me when I started my career, and that was: you're going to be a phenomenal teacher. I know you're a phenomenal teacher. There's no way I'm ever going to be able to hire you in my school district, because you're applying against a thousand other applicants, most of whom have been doing this exact job for years. Wow. So he was like, you want my advice? Go somewhere where they need teachers. And I said sure. I was pretty adventurous, young, full of lots of energy and optimism, and I found Virginia.

Speaker 3:

So I ended up in Culpeper. I was a teacher in Culpeper, Virginia, for several years. I moved to Fairfax, where I was a teacher and then became a tech support specialist, and then I left education. Okay, so, funny thing. During my time teaching and working as a tech specialist in Fairfax, I discovered technology. And when I say discovered, obviously technology already existed. But working in graduate school at UVA, Wahoo, there was a...

Speaker 1:

I'm sorry, we have some tech people in this room.

Speaker 3:

Hey, listen, I am a Hokie-Hoo, so I have a degree from Tech and I have a degree from UVA.

Speaker 1:

There you go.

Speaker 3:

So I proudly wear the maroon and orange, and also the orange and blue.

Speaker 3:

Okay, that's acceptable. Yeah, and the work I was doing at UVA required me to actually get into software development. I got into coding, and it was the first time I built something, a technology product, that wasn't just "I'm going to use a piece of technology to do what it does." It was "I'm going to build a piece of technology to do what I need it to do, because there's nothing else out there to do it." And that changed my whole worldview. I built some relationships in Charlottesville and ended up working for a technology company there, an LMS company specializing in medical CME, targeting doctors and medical professionals who need continuous training.

Speaker 3:

So it was still education, but it was the technology side of education, specifically adult learners, and I did that for the better part of six or seven years. Eventually somebody came to me and said, hey, there's this job opening in the city schools in Charlottesville, you need to apply. This was a friend of mine; our daughters swam together. And I was like, nah, you know, I don't think so. I think I got out of education, and I don't think that's where I need to be. And he's like, no, you know, we need you, we need your vision, we need you.

Speaker 3:

And I said, you know, okay, why not? So I threw my hat in the ring and, long story short, whatever it is now, 14 years later, I came back into education, but in a technology role, in technology leadership for Charlottesville for eight years and now here in Chesapeake for four-plus years, and along the way I think I've learned a few things. So yeah, it's a strange path, not what anybody would call the traditional or prescribed path to how you get into this seat and this position, but I think it's provided me with a perspective that makes my worldview, my view of education, my view of technology, fairly unique.

Speaker 1:

Yeah, that's great. Well, we're happy to have you, and today's episode is all about AI. At one of the superintendent's community engagement councils, I believe last year, you spoke about sort of the history of AI, how it's moving, what direction it's going, and how we're going to incorporate it into our schools and learning. So do you mind giving a brief background on some of the information that you shared at the community engagement council?

Speaker 3:

Sure. So I think the place we want to start is, I'm going to equate it to the cloud. If you remember, four or five years ago, everybody was saying cloud, cloud, cloud. So I think one of the first things we want to do is tear down some of the loaded aspects of the term AI, and I think we do need to think about it as just being technology, a way to augment human capabilities. Obviously AI has become the moniker that we're all familiar with and use when we're talking to friends and hearing the news stories and everything else. But the truth is, it's the next iteration of technology that's designed to support the work that we do. And so, to me, the first thing is sort of going, okay, well, let's relax, because we're not talking about I, Robot, we're not talking about mechanized warfare, the Terminator. Right.

Speaker 3:

But I do think for some people, AI evokes that. What I want to think about, though, is that it's not new. We've just come to accept this newer application of technological systems that provides opportunities we didn't have before, and we're lumping all of that into this group called AI. So we have to talk about what kind of AI we mean. There are neural networks, and there's generative AI, which is the one we're most often talking about and the one that I think has the most implications for us. And some people don't even use the word AI; some people are just calling everything ChatGPT. The idea here is that we have these tools that are enhancing our abilities. They're not necessarily new, but the widespread usage and adoption of them, and the rapidity with which they're advancing, is absolutely new. Two years ago, nobody was talking about ChatGPT. Now everybody's talking about ChatGPT, and that's just one of many, many publicly available tools. So to me the landscape right now is uncertain, but full of optimism.

Speaker 1:

It's a little bit like the Wild West with AI.

Speaker 3:

That's fair. One thing I want to give some credit and credence to is that I hope, as a society, we learned a little something from the social media trajectory. Social media came out, we all embraced it, we were putting it everywhere, we were spending all our time on it, and now here we are, 15 or 20 years later, going, whoa, maybe that wasn't all good, maybe some of this social media fanaticism actually has been destructive to society at large. What's great to see right now is that some of the conversations that we probably should have had around social media are in fact happening around AI: ethical usage, bias, exposure to things that are untrue, misinformation, disinformation campaigns and AI's role in that. And we see even Congress and politicians in Washington taking an interest and asking for people to come and speak about this already, which is great, because it took 10 years into the social media adoption before those conversations were being had.

Speaker 1:

You brought up a good point about how, kind of globally, we're addressing the ethical use of AI. What are we doing as a district to help navigate that?

Speaker 3:

Yeah, so our start has been, I think cautious is probably the right word to use. Last spring we formed a committee called the Disruptive Technology Committee, with teachers, administrators, and community members all represented. Where we started was, let's talk about what AI is and what it isn't, let's have conversation, and from that we spun out some small pilots: hey, let's check out this tool.

Speaker 3:

And so what we've been doing is having healthy conversation around it. Then one of the things that we were able to do this year for our teachers was to enable them to utilize the AI that's built into our productivity suite here in Chesapeake, so that they can begin to ask it for supplementary help and support. To me, everybody knows that our teachers are overwhelmed. Everybody knows that we keep asking more of them in spite of knowing that they're overwhelmed. And one of my favorite things around AI that we're really encouraging for our teachers: teachers spend, in some of the studies that I've seen, an enormous amount of time just searching the web, and let's all agree that search is broken, right? When I search, the first page is all sponsored. It's all sponsored, all paid, right? No doubt.

Speaker 3:

Exactly. That's not real search. That's me pretending to participate in marketing, unwittingly participating in marketing in some cases. But if, instead, what I really need is, say, three questions about a topic, appropriate for a third grader, in both English and Spanish, and I can get those three formative questions from an AI product and platform, then I don't have to search the web for other teachers who have written them. That's a huge thing, because a teacher can get those in 30 seconds.

Speaker 3:

So where we're starting is with cautious piloting, trying and trialing, and we have made Gemini available. Again, that's not because it's better or worse or whatever, but the fact is that we've embraced our relationship with Google. Google Workspace is our productivity suite, our students are on Chromebooks, and our teachers are on Chromebooks, so the Gemini tool being integrated with our platform and our ecosystem makes a ton of sense. Teachers being able to say, wow, I can save myself time, make myself more efficient, do some analysis, get some feedback, is a great place for us to start, and getting them comfortable with: hey, this is a personal assistant built into your device, built into our resources, that you can leverage to help you be more efficient in some of the work that we know is taking teachers hours and hours every week.

Speaker 1:

So obviously, productivity is one of those things we're excited about for our teachers. At the same time, what is being done to help train those teachers on its use so that they can implement it in the classroom, and then also just for themselves?

Speaker 3:

Yeah, so this year, during pre-conference, the weeks before the school year, we had a number of training sessions for our teachers. We've also tried to get representation across our technology innovation coaches, who are present in all of our schools, helping them to understand what's available, helping them to understand how to work with their teachers and to say to teachers, it's okay to ask Gemini to provide you with some questions. It's okay to ask Gemini to look at something that you've written or used before and ask it to reword it for you, or to take it down to a third grade reading level instead of an eighth grade reading level. Those are all very, very useful tools. So between our technology innovation coaches, the trainings that we've offered in the weeks leading up to school, both for administrators and for teachers, and the ongoing work of the Disruptive and Emerging Technologies Committee, that's how we're approaching it right now.

Speaker 3:

The other thing we're doing is keeping our eyes open to the partners that we work with. So one of the questions is, do you need a new product? Do you need to go out and buy one of these boutique AI products from a company that is selling something to schools, which is yet another product, yet another platform? Or do we want to look to our big partners and vendors, the Microsofts of the world, the AWSs, the Geminis? You know, for us, Instructure is a huge partner with our LMS. They're all pursuing and have AI strategies that we're watching very closely to see how those get integrated into tools that we're already using. One of the most recent articles I read suggested that 75% or more of the currently existing AI companies will be out of business in less than a year. This is the cycle.

Speaker 3:

The cycle is really fast. So you've got these AI companies, they stand up, and if we buy a product and that company is out of business in nine months, we've spent time investing in training and then that product's just gone, they're out of there, right? I think that's one of the things that we also need to avoid.

Speaker 1:

Okay, something that comes up: I'm a parent, you're a parent, and there are definitely going to be some challenges and probably some concerns. I know one of the biggest, as a parent using technology on a daily basis, is data privacy. What are we doing to help protect our students, our teachers, our staff, with data privacy, and how does that interplay with AI?

Speaker 3:

Yeah, so it interplays with AI the same way it would with any other technology system. Going back to what I mentioned, Gemini is convenient for us because it's integrated with our ecosystem. The other thing is that it's covered under the agreement that we have with Google, so one of the promises we have from Google is that they're not going to train their AI off of the data that we've entered.

Speaker 1:

Okay, that's good to know as an education customer.

Speaker 3:

We're very, very cognizant and aware of those concerns, and that's one of the reasons why we would not just go open up product X, the next greatest AI, if they wouldn't offer us the same guarantee: when you type in a question or an idea, or when you upload a document, if they're not going to promise us that they'll keep that as our data and not their data, that they'll let us use it but they won't use it themselves, then that wouldn't be a good partner for us. So that's one of the promises we have. The other one, student privacy, gets really, really interesting, and I think this is one of the areas where I see some overlap with the cell phone policy. Part of the problem with cell phones and apps is the current EULA, E-U-L-A, the End User License Agreement. It says you have to be 13, but there are no checks that say, are you actually 13?

Speaker 3:

And so one of our jobs is to read through the agreements and the privacy policies and practices of the company. We read through those and make a determination as to whether or not they're adhering to the Student Privacy Pledge, Project Unicorn, and some of the standards that are out there from IMS Global, now 1EdTech. It's front of mind, for sure. It's front of mind for us to think about what's happening to the data that we're putting in, and the reason we're careful about which tools we encourage or enable for staff is that we want to be assured of those kinds of things before we would roll anything out more widely.

Speaker 1:

Gotcha. In the classroom, it's very simple for someone to type something up through chat. Well, I don't know if ChatGPT is even open yet in classrooms.

Speaker 3:

So I mean, here's the thing: if we talk about our devices and our network and students, no. But at the same time, if a student goes home, they're on their own device, exactly, right?

Speaker 1:

And they write up an essay. That's right. How do we know if it's their work or the AI's work? How does that even come into the conversation with education?

Speaker 3:

This probably is one of the most challenging topics right now. Traditional mindset, traditional perspectives would say they're cheating. Okay, what if they're not cheating? What if they're just using the tools that they have in front of them? And I'm going to give you a good example of this. When I was in high school, graphing calculators were just coming out.

Speaker 1:

Right, wasn't this the TI-81 or 82? Whatever it was, it cost like an arm and a leg. That's right.

Speaker 3:

Back then I remember my family literally being really concerned about that requirement. I had a teacher in high school who required us to learn how to use a slide rule because calculators were cheating. I know that teacher was genuinely doing what he thought we needed and was best for us.

Speaker 3:

I can promise you that the last day of that unit, when we used a slide rule, was the last time I ever touched a slide rule. And, you know, I went on to earn a physics degree and never touched a slide rule again. So to me that's a wonderful comparison for what is cheating. In a day and age when we have augmented tools to support my abilities and make me more able to do these things, is that cheating, any more than using a wrench? Is that cheating any more than using refrigeration to keep food from spoiling, or turning on the lights in a room rather than having to stoke a fire and light a bunch of candles? So it's a tough conversation, because we bring to that conversation our own biases and our own perspectives.

Speaker 3:

And if I ask an English teacher, a language arts teacher, those teachers value reading and writing at a level that nobody else does, for good reason, and they want everybody to be as adept and passionate about reading and writing as they are. So to them it's cheating. They're not wrong. But a person who has struggled with, maybe even has, a learning disability around reading and writing is now able to do something they weren't able to do before, and none of us would call that cheating. So I do think this is the crux of the conversation, and there's not one right answer to it. But I do think whenever something new comes out that makes our lives easier, there's a perception that somehow we're cheating by using it, and that's a natural human response. But if what I'm actually able to do is write 10 times more in half the time, then maybe that's really good.

Speaker 3:

And I'm going to put one more caveat on this: taking credit for somebody else's work is wrong. Okay? So one of the things we have to normalize, I think, is that if you're using AI to augment, enhance, create, or add to the work that you're doing, we need to talk about what the proper treatment or citation for that is. It's not that high school composition classes shouldn't use AI, but what we should do is say, well, if you're using AI, here are best practices, including citing which AI you used, what your prompt was, and how much of that content you used. If a student goes home and generates 12 pages of writing and just turns it in without in any way reviewing it, changing it, adding their own perspectives to it, then I would argue, yeah, that's taking credit for somebody else's work. You didn't do that.

Speaker 3:

Right, right, absolutely. And so these are the conversations I think we need to have. And I think, with AI, the reality we face is that critical thinking and those, I'll call them uniquely human, traits are more important today than they've ever been and will be more important tomorrow than they are today. So, as education institutions, we need to go, okay, I want to make sure that I'm not focused on the mundane, those lower levels. If we talk about Bloom's taxonomy, you know, knowledge and understanding.

Speaker 3:

We need that analysis and synthesis, we need the high-level stuff, because humans do that really well and machines don't. And as much as AI is getting better, it still doesn't do those tasks really well. So we want to challenge our learners to embrace those more difficult levels and really get into that synthesis, analysis, and evaluation, the high-level thinking we often collectively refer to as critical thinking. It's more important now than ever, and it's going to continue to gain in importance.

Speaker 1:

I wonder if people were having these same sorts of conversations when that TI calculator you mentioned was first being introduced. I wonder if this sort of discussion took place.

Speaker 3:

I can promise you they were. There's a wonderful photograph, from I believe it was the New York Times, of teachers picketing out in front of their schools: no calculators in class. It literally created protests. But more interestingly, let's go back to the turn of the century, not this century but the previous one, the late 1800s and early 1900s. There were major, major concerns about introducing paper into schools. That was technology that was disruptive at the time. There's a moral panic aspect, but I do think it's important for us to work through that.

Speaker 1:

So let's talk about the students. Why is it important for our students to start leveraging and using AI appropriately in our schools?

Speaker 3:

Current students, current K-12 students, are going to be moving into a world where AI is augmenting professional pathways. What that looks like, I don't know; I'm not a futurist, right?

Speaker 1:

We don't know what sorts of jobs are going to be out there, right, but I can probably guarantee you they're going to incorporate AI. Absolutely.

Speaker 3:

Kids are definitely going into a world with AI, and that AI is going to augment human capabilities. But here's one of the statistics that I've seen recently that got me excited: augmentation of humans provides better efficiencies and profits to a company than replacement of humans. We saw that in some of the auto jobs, right? Machines were going to replace autoworkers, and then what we found out was that machines were very good at getting the job done, but they weren't very good at assuring the job was done right. So Detroit and other manufacturers around the world have gone, okay, well, yeah, we've got more robots than ever, and I would say AI-driven ones, but at the same time, there are humans who work with the robots to make sure things are in the right place, the joints are where they need to be, and all that kind of stuff.

Speaker 3:

So that augmentation, and the idea that kids are moving into a world where AI and skills with AI are important, is real. For our students, we as Chesapeake Public Schools are looking to make AI tools available. We're also being very purposeful about that, because we want to make sure we choose tools that truly are beneficial to all students, enhance their experience, and make them better learners and better citizens. Students also just have to accept the fact that it's here, it's real, and they should genuinely be playing with it. I think the last survey I saw said 74% of kids admit to using AI, so we know they're using it, whether it's on our devices or their personal devices.

Speaker 3:

They acknowledge it, and I do think that is going to be a really critical skill. Prompt engineering is a job that pays six figures, and there aren't enough prompt engineers out there for the jobs that are open right now. And we know that large companies are saying, hey, within the next 10 years, one of the top five skills we're going to be looking for is the ability to work with AI for productivity.

Speaker 1:

So it makes sense for us to not just have this discussion but, like you just said, to figure out purposeful ways to incorporate it in our schools to help prepare our students for these future jobs.

Speaker 3:

Yeah, and one of my favorite things is to amplify those things that are uniquely human and to enhance the humanity of the experience. That, to me, is the overarching goal.

Speaker 1:

Well, that's great to hear. As a former teacher and now a technology coach, I'm into this stuff, so I could sit here and talk about this all day, but unfortunately we're going to have to go. We are going to speak with a teacher and a technology innovation coach on the next episode, and they're going to share how they're starting to incorporate AI in the classroom. Thank you so much for coming in today to share where we're moving as a district with AI in our classrooms.

Speaker 3:

Fantastic, thank you. It has been a pleasure, and I'm glad you're talking to a teacher and a TIC as well, because their perspectives are going to be invaluable.

Speaker 1:

Yeah, it's going to be great. Awesome. All right, thank you so much. We hope you enjoyed the stories behind our story on this episode of Amplified, the Chesapeake Public Schools podcast. Feel free to visit us at cpschools.com/amplified for any questions or comments, and make sure to follow us wherever you get your podcasts.
