Embracing AI in Education: Balancing Innovation and Ethics

Ever wondered how AI is crafting the future of education? Dr. Brian Arnold, an education technology expert from National University, joins me, Kimberly King, to demystify AI’s impact on learning and teaching. Together, we dissect the evolution of educational practices, casting a light on the power of AI to personalize the educational journey. As large language models like ChatGPT begin to permeate classrooms, we critically examine the shift from traditional knowledge demonstration to an outcomes-based approach that emphasizes competencies.

The landscape of education is a tapestry of diverse opinions, and our discussion traverses the gamut of emotions AI evokes among educators. While some embrace the wave of change, others stand cautious at the water’s edge, and Dr. Arnold and I tackle the tough questions. We address the balance between preserving the sanctity of academic freedom and embracing the imperative of preparing students for an AI-integrated future. The practicalities and challenges of updating assessment methods also make their way to the forefront, as we underscore the importance of equipping our educators with the right tools to navigate this transition.

The ethical dimension of AI in the classroom cannot be overstated, and it’s here that Dr. Arnold and I take a deep dive into the misconceptions surrounding AI—debunking the fear-inducing myths fueled by dystopian narratives. We dissect the ethical and moral nuances, and the conversation naturally flows toward envisioning the future role of teachers in this new era. Closing our insightful dialogue, we reflect on how AI can ignite creativity and critical thinking, reshaping not just how students learn but how they engage with the world around them. Join us in this exploration for a fresh perspective on AI’s role in education.

  • 0:02:02 – Expert in Media Arts and Technology (102 Seconds)
  • 0:12:41 – Educators’ Disposition Toward AI Obstacles (87 Seconds)
  • 0:17:20 – Integrating AI in Teaching Challenges (63 Seconds)
  • 0:24:08 – Ethical Considerations in Using AI (72 Seconds)
  • 0:29:11 – Understanding and Communicating with AI (59 Seconds)
  • 0:36:30 – Fostering Creativity and Critical Thinking (93 Seconds)

0:00:01 – Announcer

You are listening to the National University Podcast.

0:00:10 – Kimberly King

Hello, I’m Kimberly King. Welcome to the National University Podcast, where we offer a holistic approach to student support, well-being and success – the whole human education. We put passion into practice by offering accessible, achievable higher education to lifelong learners. And coming up on today’s episode, we’re discussing using AI technology as a teacher. According to a recently published World Economic Forum article, with the rapidly accelerating integration of artificial intelligence (AI) in our work life and classrooms, educators all over the world are re-evaluating the purpose of education in light of these outsized implications. At the same time, we see huge opportunities for teachers to use these technologies to enhance their own teaching practice and professional experience. Such an interesting and relevant topic coming up on today’s show.

On today’s episode, we’re discussing using AI technology as a teacher, and joining us is Dr. Brian Arnold. Dr. Arnold is the Department Chair for Global Innovation Social Emotional Learning within the Sanford College of Education at National University. He earned his PhD in educational technology and educational psychology from Michigan State University, spent the first half of his career focused on media arts, film, games and design, and ended up in teaching and administration in those disciplines. His current research focuses on humane emerging technologies, and one of his early career highlights includes working as an editor for Nickelodeon Animation Studios in the late 1990s on shows like SpongeBob SquarePants. So interesting. We welcome you to the podcast, Dr. Arnold. How are you?

0:02:06 – Doctor Brian Arnold

I’m great. Thank you for having me. I’m looking forward to talking about this topic.

0:02:10 – Kimberly King

Yes, what a great background. Why don’t you fill our audience in a little bit on your mission and your work before we get to today’s show topic?

0:02:18 – Doctor Brian Arnold

Okay, well, I appreciate you launching me with that short biography. Basically, I started off in the field of media arts. I was interested in animation and writing. I was a horrible singer. I was a bad clarinet player. I was looking for all the things that I could express myself in and I realized that what I was looking for was teaching, but I always loved the technology. I’ve been a gamer since back in the day, since my Commodore VIC-20.

0:02:44 – Kimberly King

You’re aging yourself now, by the way.

0:02:25 – Doctor Brian Arnold

It’s okay, I didn’t use punch cards, but it was close, right. So, basically, I’ve always been a fan of this kind of stuff, so I’ve always been interested in the technology side of things and the experiential side of things, and so I had my eye on AI as it was popping, and when it hit the scene, I thought, wow, this is an important, big deal. Not alone there. And my mission, what drives me, what gets me out of bed in the morning, is looking for ways for people to have the conversations that develop their digital literacy, that empower them to use these tools in ways that benefit them as much as they benefit society and the institutions that develop these tools.

0:03:36 – Kimberly King

I love it. You’re the right person for this position here and I love that you have moved into the teaching arena. So today we’re talking about using artificial intelligence as a teacher and so, Doctor, what is AI technology and how can we use it to help education?

0:03:54 – Doctor Brian Arnold

That’s an excellent question, and you probably would still be served by using good old Google. But essentially, the AI that we use isn’t so much the behind-the-scenes AI that helps suggest what you might want to purchase next on Amazon; it’s a little bit more of AI wrapped in what’s called an LLM, or large language model, which is essentially a chatbot, which makes it easier for us- the user, the consumer, the learner, the educator- to interact with the technology. On the artificial intelligence itself, we could probably talk for eight hours and no one would be listening, but essentially it’s a prediction machine. It’s good at looking at a large body of data- like, hey, go read the internet up to 2022. Okay, boss, got it. All right, I’ve read it all.

So when you see a pattern, what do you think comes next? So, I pledge allegiance to… what’s the next word, right? There’s like a 0.05% chance it’s porcupine and a 99.9% chance it’s flag, right. So it has that ability to make predictions, and therein lies its power. So what is AI for our purposes as educators, as learners, as consumers? It’s a powerful prediction machine that at this point mostly makes words for us, the words that we ask it to make. It is already making images and sounds and video for us, but not at the scale that it’s been making words for us. So that is my simplest definition that I can share with you on that topic. There are worlds more to cover, but excellent question, thank you.
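Dr. Arnold’s “pledge allegiance” example can be sketched in a few lines of code. This is a deliberately toy word-counting model- my own illustration, nothing like a real LLM’s neural network- but it shows next-word prediction as probabilities over candidate words in the same spirit:

```python
from collections import Counter, defaultdict

# Toy next-word predictor: count which word follows each word in a tiny
# corpus, then report the probability of each candidate continuation.
# A real LLM learns these probabilities with a neural network trained on
# a vast corpus; the simple counting here is only an illustration.
corpus = (
    "i pledge allegiance to the flag . "
    "we salute the flag . the flag waves ."
).split()

following = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev][nxt] += 1

def predict_next(word):
    """Return each observed continuation of `word` with its probability."""
    counts = following[word]
    total = sum(counts.values())
    return {w: c / total for w, c in counts.items()}

print(predict_next("the"))  # → {'flag': 1.0}
```

In a model trained on the whole internet rather than two sentences, “flag” would merely dominate the distribution- the 99.9% versus 0.05% split he describes- instead of being the only candidate.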

0:05:34 – Kimberly King

Yeah, but I like that you kind of brought in what we type into Google now and that’s also sort of prediction as well, and they start to see your patterns. So that’s a good analogy.

0:05:44 – Doctor Brian Arnold

Well said.

0:05:45 – Kimberly King

How can AI personalize learning and assessment for students?

0:05:51 – Doctor Brian Arnold

That’s really the big question, I think that’s facing educators and I think that those answers aren’t easy. It’s not a matter of upgrading to the new version of Windows. It’s really taking off the doorknob and building a new house on it. It’s really reimagining how we teach and how we learn and, from my experience, from the discussions that we’ve had so far, part of that is starting with focusing on the outcomes that you want your students to learn and, beyond that, really focusing on the skills and the knowledge, rather than the method they demonstrate those skills and knowledge right.

So for a long time, the workhorse of the essay has served us tirelessly as the artifact that demonstrates we have critical, organized thinking. And that wasn’t really true before AI, and it’s way less true now that AI can make the words for us.

So being able to organize our thoughts, being able to clearly communicate- those are still essential skills, and the challenge before us as educators is to find ways to use technologies like AI or other emerging technologies to teach, challenge and engage students so that they can still learn those skills and demonstrate them in an authentic assessment, an authentic activity that genuinely measures that skill. And just to circle back for a half second on my own topic: there is some sort of human drive to not let go of the things that have worked for us in the past, and I think one of the critical questions is to really just ask yourself, is this the best way to measure what I want to learn, or is it the easiest or most familiar way? These are uncomfortable conversations and uncomfortable answers, but I think it’s the work that’ll get us where we need to go.

0:07:54 – Kimberly King

Again, it’s kind of like the wild wild west. We’re just dipping our foot in the water and seeing what’s going to work, what’s going to be productive and what doesn’t work, too, I guess. Right, you’re trying to find those boundaries.

0:08:07 – Doctor Brian Arnold

Absolutely, it’s a moving target.

0:08:10 – Kimberly King

Right, exactly. So how do I choose AI tools when they’re changing so often with that moving target?

0:08:18 – Doctor Brian Arnold

That’s a really great question. I know- what I mean when I say it’s a great question is it’s a big question, it’s unanswerable. But what I can start with is, you know, the technology has changed, but the ways in which we make decisions and our goals haven’t changed a whole lot, and human behavior hasn’t changed a whole lot. So how did you choose your technology before? One way is to wait for a vendor to show up and say, look, I’ve got this deal on a great widget- and that doesn’t often end well. A better approach can be to really look at what your needs are, what your learners’ needs are, and say, I need a widget that can do X, and shop around for it, and then test it and see how it does, and have some criteria for evaluating whether it works over the long haul. And also, you know, we’re not all the same, right? So not all learners will benefit the same way from the same intervention.

As a lifelong gamer, I get asked, oh, is this game good for learning? Well, for some players- players who like to explore- this is a great game. For players who like to socialize, this is a great game, and it’s not the same thing. Similarly with these tools. So knowing your audience, knowing your learners, knowing their level- and when I say knowing, I don’t mean read an article about it, but actually get your fingers on the keyboard and play with the thing. Put yourself in a chair, spend some time, just see what it does and what it doesn’t do and what your pain points are, so that you can predict that for your audience.

0:09:49 – Kimberly King

Good point, and I think you know we all really probably learn best by jumping right in and trying it. As you said, get your fingers on the keyboard and really just trial and error, instead of just reading about it or talking about it. So how does AI technology support students with special needs?

0:10:10 – Doctor Brian Arnold

So again, sort of a theme hopefully from this discussion is, you know, human beings haven’t changed, the technology has- so neither have some of the principles governing good design for learning. For those of you that are familiar with universal design for learning, it’s basically designing so that all kinds of learners with all sorts of abilities can learn from the content. The most popular example usually thrown out is the curb cut effect: curbs didn’t used to have those little dips in them, but at the corners they started cutting the curb so that wheelchairs could get on and off at the intersection, right? How many moms with joggers, skateboarders, roller skaters, somebody with a heavy package- everyone benefits from the curb cut, but it was originally designed as an accommodation.

Similarly, subtitles for the hearing impaired- this is something that was a specific accommodation, but I raised my daughter with her reading the subtitles while watching television. And often someone’s being noisy while I’m watching TV, and I can still understand what’s going on because I can read the subtitles. So how does this technology benefit those with special needs? It can create an accommodation, but the best kind of accommodations are the ones that everyone benefits from, and that’s often found if you dig into the field of universal design for learning, or UDL.

0:11:40 – Kimberly King

And talk to me a little bit about your daughter. Does she have special needs, and is that why you taught her how to read the subtitles?

0:11:49 – Doctor Brian Arnold

No, that’s a great question. No, I just, you know, I come from a media arts background, but I’m a little conflicted about putting my kid in front of the television. So I’m like, all right, where’s the value happening here? So at a young age- and she’s older now, she actually teaches fifth grade- while she was watching, she was also associating the sounds of the words with the text and understanding that relationship. And she did become an avid reader and still is. So correlation, not causation, but yeah.

0:12:17 – Kimberly King

Right, no, and you know what, I mean, in the world we live in today, everything is multitask, it seems. So you know, if you can pat your head and rub your stomach at the same time, or read and watch, and you know that kind of thing is probably, that’s a great tool to have.

0:12:31 – Doctor Brian Arnold

Absolutely, and those people that can do that make me jealous, because if I do two things at once, I’m just doing two things poorly.

0:12:37 – Kimberly King

I agree. The older you get, it gets harder, doesn’t it? Why are so many educators split on this disposition toward AI?

0:12:45 – Doctor Brian Arnold

Right. So this is something that I tried to manifest not happening in the early days- like, no, everyone’s going to jump in, it’s going to be great. But at the end of the day, many educators have initiative fatigue, many educators are overwhelmed, and many educators really love teaching the way that they learned. These are all obstacles to adopting something new, to even being open to something new. And many of them, legitimately, have invested a lot of time and effort into initiatives and technologies that have not benefited them. When COVID came along and everyone went online, a lot of schools went online badly, and so now, from their perception, online’s not a good option because it didn’t work very well. And in their experience, that is valid.

There’s also a lot of fear around what these tools can do. A large amount of that fear is based on not understanding what the tools can do, but some of it is legitimate- once you really get into it, you’re like, nope, that’s a really rational thing to be concerned about. So really, there are always the early adopters, the people who dive in. They love it. It’s new toys. Yay! That was me. And then there are the people who are like, you know what? I don’t need one more thing. And there are also the people who say, this is replacing good things. You know, I don’t want to replace this activity. This is an activity I like. And so my concern with it is that it becomes yet another issue to polarize our general population over, and I think that’s one of the bigger dangers at the societal level. At the educational level, hopefully it leads to some rigorous debate and some challenging conversations, but we have to have those conversations. We can’t just sit in our camps and grumble.

0:14:56 – Kimberly King

Yeah, I agree with you on that, and I think, as you say, everything in this day and age has been so heightened- polarizing, politicizing, all of that. So you’re right, I think we still jump in and we figure it out ourselves, rather than, as you said, sitting in a corner grumbling about it, but really finding out what those boundaries are and what works for people. Well said, yes. What if some teachers just don’t want to use AI technology in their courses? What happens then?

0:15:27 – Doctor Brian Arnold

So I can sort of answer that question from two perspectives. One, from an academic freedom perspective- that really is a strong platform. Like, hey, that’s just not how I do things here. Personally, I do believe that our job as educators is to prepare our learners for the future they will inhabit, and for good or for bad, that future will include AI technologies. The learners that are savvy and able to use those technologies will have an advantage finding work, feeding their families and having a successful life over the ones that don’t.

So maybe you don’t think that it’s necessary in your course, and that may be true, but the bigger picture is, you know, the sort-of joke, not joke: you won’t be replaced by an AI, but you’ll be replaced by someone using one. So you want your students to be the ones that have that advantage. In a holistic sense, you do have the right to say it’s not appropriate, but you should have a good rationale for it other than your own discomfort. If you say, this is not for the benefit of my student, and you can make that rational argument, then by all means. I used to say not everything’s better just because you rub some internet on it. You don’t need AI yogurt; we don’t need everything AI. But you should have a very good reason- no, this genuinely doesn’t make sense from the student’s welfare perspective- and then by all means, you know, hey, let’s use other tools.

0:17:04 – Kimberly King

But I do love what you say. I mean, we use that rational critical thinking on both sides to be able to say, okay- and to really learn about it, instead of dismissing it because you’re afraid of it or uncomfortable with it- but to really use rational thought, which seems to be missing in so many areas these days. So I think we need that as our headline for kind of everything.

0:17:25 – Doctor Brian Arnold

Well, I think people need to speak through their feelings first. This isn’t necessarily about AI again, but, you know- oh, the red-eyed robots are coming for me, or whatever it is that you’re concerned about. Get that out, have the discussion, and go, like, okay, well, yeah, I guess here’s one timeline on which that happens, but here’s like a thousand others where you’re just carrying around this technology in your pocket and going about your business, et cetera, et cetera.

0:17:50 – Kimberly King

Yeah, yeah, I love it. What challenges might teachers face when integrating AI into their teaching practices?

0:17:59 – Doctor Brian Arnold

So, from the perspective of the reluctant adopters, there’s almost always the initial response of cheating, plagiarism… How do I control for this? Wait, I can’t use my old assessments anymore. So there’s this great study- the academician’s last name was Pope, out of Stanford- where they asked K-12 students about their cheating over the last 10 years, and around 60 or 70% of students said, yeah, I cheat from time to time. And then AI came out, and they included the question in the survey, and the percentage of students that cheated was exactly the same.

0:18:42 – Kimberly King

Really? Wow.

0:18:43 – Doctor Brian Arnold

Reasons for cheating are important to get at. Reasons for the students not seeing the value or authenticity of assignments are important. People aren’t doing more cheating with AI. They’re doing the same cheating, but now they have a different, better tool.

0:18:59 – Kimberly King

Interesting.

0:19:00 – Doctor Brian Arnold

What are the challenges? It’s a little bit of a tangent, but what are the challenges? The first thing is well, okay, I can’t use my old assessments now, my old worksheets, my old material, because this technology makes them invalid. You’re correct. So do you have the time and resources to invest in those new tools, those new technologies, the new training? Do you have the time to invest in the time-saving efficiencies that result from mastering the tool? It’s a little bit of a catch-22. These are legitimate, real challenges.

Not every institution is going to somehow find you a magical extra 10 hours for two weeks so that you can dive in and learn these sorts of things. So there’s the teacher’s ability to assess, and trust in that assessment. There’s the teacher’s ability to understand the tools that they’re using and use them effectively, and those can be some fairly profound barriers, even if someone is willing. And then there’s the institution itself, which we’ll probably talk about a little bit more later- the level at which this is supported at their institution makes it easier or harder for them to adopt.

0:20:15 – Kimberly King

So I guess, as we’re just kind of thinking about this, what you’re saying is referencing- making sure that they’re honest about their references, right? I mean, you have to do that when you’re writing essays; when you’re getting something out of an actual book, you reference the title, the author- you reference your work.

0:20:33 – Doctor Brian Arnold

I’m so glad you brought that up. Yeah, exactly. It was never not the policy- or, reframed in the positive, it’s always been true- that you have to be transparent about your sources, so that just continues on with this new tool. Now, this new tool is more than just supplying you with a fact. It might have reformatted something for you, it might have offered you some other things. So exactly how you’re able to be transparent about that has changed.

And if your institution doesn’t have clear rules about it, you get a lot of what I affectionately call sneaking your AI. You’re not sure if you’re going to get in trouble or not, so you don’t say anything, and this is a very human thing to do. So if you want your classroom or your institution to be a place where that transparency happens, you just need to be clear with your stakeholders: this is okay, that’s not okay, and here’s how you do it. Most people will be pretty straightforward. And if you can present yourself- again, these are just good teaching things- not as the sage on the stage, the lord of all things AI, but as the guide alongside- let’s explore this together, let’s figure out what makes sense together, let’s test it together- you get a lot more buy-in, and I think you get a lot more authenticity and more transparency.

0:21:46 – Kimberly King

Perfect, I’m glad you answered that. How can AI technology free teachers from prep to allow them to work directly with students?

0:21:55 – Doctor Brian Arnold

Okay, so I think I talked about this a little bit, but basically, in theory, these tools- these prediction tools, these creation tools- should be massive productivity boosters if used with understanding and familiarity. It takes some time to get there, but there’s a lot of work that K-12 teachers put into prepping for their courses that could be automated: specifically, preparing individualized learning for specific learners, creating remediation for certain learners, even serving as a virtual tutor for those students who need some help after hours. These are all things that AI-based tools can help out with. So that takes some of the burden off of the instructor during instruction time, so that, rather than chasing worksheets, they’re talking to Susie or Bobby and working with them individually on something a little bit more meaningful.

0:22:50 – Kimberly King

Good point. Yeah, I think time can really certainly help. This is such interesting information and a conversation that is much needed, so we’re going to be right back in just a moment. We have to take a quick break, so don’t go away. And now back to our interview with National University’s Dr. Brian Arnold. And such an interesting conversation. We’re talking about how we can use AI as a teacher and just off mic, a second ago we were just talking about how movies probably do this an injustice. Right, they always show the worst of the worst. Talk a little bit about that.

0:23:27 – Doctor Brian Arnold

Thank you. Yeah, so movies require conflict to drive the story. So if they’re going to feature an artificial intelligence, it’s going to turn into an angry red-eyed robot who’s chasing our protagonist around until he can valiantly defeat it, reasserting the belief that humanity is the most important thing and that technology is evil and scary and bad. That makes a great story- don’t get me wrong, some of my favorite stories- but that’s not necessarily a prediction tool for what our world is going to look like in the future. Maybe a slight cautionary tale, but just not a roadmap for the future.

0:24:02 – Kimberly King

Good point, yep, it’s definitely entertaining and that’s why they call it that right. What ethical and moral considerations should teachers keep in mind when using AI with students?

0:24:15 – Doctor Brian Arnold

So I feel like one of the tricks of that question is that it’s a bit loaded. Ethics and morals are often interchanged, and they’re not the same thing. A system of ethics is a perspective for looking at a complex, unresolvable issue and working through what the implications are if we take this path versus that path. Morality is: this is right and this is wrong- it’s absolute and it’s concrete, and it’s subjective. So the easiest, most tempting path when creating rules about AI is to go down the moral path, and maybe that works for your institution.

In general, it comes across as a bit more of an ethical issue. So one of the things we talked about already is that good scholarship still requires transparency. That’s a good ethic to observe, because the implications of not following it are that the peer review, the trustworthiness of the scholarship, the quality, goes down. The other piece- and this is a little bit of a pivot into something I like to talk about, humane emerging technologies- is the ethical consideration of how we can arm our learners with the skills and the digital literacy to make intelligent choices about the tools that they’re using, so they can use them in a prosocial way- use them in a way that benefits themselves and others without harming others- and how they can find the tools that empower them rather than harvest them as data.

0:25:52 – Kimberly King

Good point, and again a lot- I feel like it is having conversations and talking out loud, just like you were saying. I mean really just finding out what happens next. What is this?

0:26:06 – Doctor Brian Arnold

Absolutely. And it can be really empathic too, because you find out oh wait, you know what? Four out of five people had that same thought, had that same concern. And this fifth person had an idea I hadn’t tried before. And then you’re moving forward rather than again kind of stewing in your corner.

0:26:20 – Kimberly King

Right, right, right, and it kind of humanizes, it kind of almost normalizes it when you find that other people are having the same issues. So how does AI change the teacher’s role in the classroom?

0:26:31 – Doctor Brian Arnold

So it doesn’t have to change the teacher’s role if that’s not something desired by the teacher. However, in general- I’m going to borrow from business a little bit- it kind of decentralizes learning. The teacher isn’t necessarily always the person with all the answers. They are the person who helps direct the student along their journey of learning and discovery and enrichment and growth. So there again, it’s more of that guide alongside- that person who says, here’s some boundaries, here’s some resources, why don’t you try this direction, hey, I noticed you struggled with X, Y and Z- and less about giving you line edits for the 900th time because you forgot the Oxford comma in the third paragraph.

0:27:16 – Kimberly King

Yeah, right, and that’s good. I actually think that’s a good conversation to remind teachers and students you know what that role is too. It’s not just a you know, but really just to guide them. I love that, yeah, and recognize what their talents are and you know where they need help. So what skills or mindsets do teachers need to effectively implement all of the AI in their teaching?

0:27:45 – Doctor Brian Arnold

So I think that the mindset is, you know, the Dweckian growth mindset: that you can make mistakes, you can fall, you can stumble and you can try again, and that’s okay- that you’re not a lesser person for that experience, that your students can see you make a mistake, that your students can see something not work out, as long as you frame it for them, like, hey, we’re going to try this, it may or may not work out. And the other one is just, you know, cultivating curiosity. The skills involved right now in using the AI- because it’s a large language model and because we type in our questions- are the ability to frame questions, to think them through, to understand the computer thinking of the machine on the other end, and to get familiar enough with the tool to understand what happens when you ask it to do something, because it’s generative.

So you can ask it a question like, how many fingers does a normal person have on their left hand? It’ll say five, and you can say, what are some other body parts? Just from that second question, it knows you’re talking about the hand- it remembers the first question. But it also has sort of a degrading memory at this point, so six or seven questions from now, it might forget you mentioned the hand. It doesn’t really forget, but it stops looking at that as a source of information. So being able to get the logic and the way that the software, quote unquote, thinks- again, I don’t mean to anthropomorphize it, it doesn’t think, it’s not sentient- that’s a really big skill, and that just comes with practice. So the skill of communication, the skill of being able to write clearly, and the skill of being able to evaluate the response you get from it- to know what it’s telling you and what it’s not, and that it’s not a truth machine, it’s not a fact machine. It makes words for you, and it wants to make the words that you want.
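The “degrading memory” Dr. Arnold describes is usually called the context window: the model is only shown a fixed amount of recent conversation. A rough sketch with a made-up four-turn window (real systems count tokens rather than turns, and the limit is far larger; the numbers here are mine):

```python
# Sketch of a context window: only the most recent turns are sent back
# to the model, so earlier turns silently fall out of view.
MAX_TURNS = 4  # hypothetical window size; real limits are measured in tokens

history = []

def ask(question):
    """Record the question and return the context the model would see."""
    history.append(question)
    return history[-MAX_TURNS:]

ask("How many fingers does a normal person have on their left hand?")
ask("What are some other body parts?")   # "hand" is still in the window
ask("What about the feet?")
ask("And the arms?")
context = ask("How many on the right one?")  # the hand question has dropped out
print(any("hand" in turn for turn in context))  # → False
```

As he says, it doesn’t really forget: the older turns simply aren’t included in what the model reads, so it has nothing left to infer the hand from.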

0:29:50 – Kimberly King

So a question for you about that, to your example, about talking about your hand and how many digits and whatever. So a few questions after. What would happen if you keep asking it and it forgets that you’re talking about the hand? Will it ask you or will it just answer something in general?

0:30:08 – Doctor Brian Arnold

And that's kind of the fun black-box nature of the tool. It's really good at inferring things, but if it doesn't have something to infer from, it will make something up, and that's the classic hallucination effect.

0:30:20 – Kimberly King

Right right.

0:30:21 – Doctor Brian Arnold

Yeah. So it might move on to the body in general. It might move on to a starfish that has five hands. I mean, I have no idea, I don't know what its internal logic is, but it'll fill in that gap and start giving you something, and you're like, oh, this doesn't work, it's crazy, it's broken. And it's not.

It's just that you probably want to start over with a brand-new prompt. Again, these are just the experience skills you get from working with the tool. And know that ChatGPT is one tool among many that are commercially available, and they each have their own quirks, their own strengths and weaknesses, affordances and constraints. So just getting in there, fingers on the keyboard, playing with it and seeing what works and what doesn't, that develops the skills and the savvy. In the same way, most of us can jump into a word processing program, and when something's not going right, we have a pretty good idea of where to look under the hood, or when to call tech support. But if you're brand new, you don't know.

0:31:14 – Kimberly King

Right, right, Okay, that makes sense and you kind of just answered this a little bit. But how can teachers model responsible use of AI tools? Is that kind of what you’re talking about? Looking under the hood and just knowing kind of where to go to find those tools?

0:31:28 – Doctor Brian Arnold

Yeah, but also being open and transparent and candid with the students. Hey, I generated these worksheets for you all using ChatGPT; let me know if you see something that doesn't look right, or let me know if you'd like something different. Hey, I had to write a report for my job, and I'd written a rough draft that wasn't really tracking very well, so I dropped it into ChatGPT and cleaned it up a little bit, and I got a much better piece, and then I edited it again. I didn't just take it at face value. Just walking them through your experience shows it's not evil, it's not good, it's not bad, and modeling for them what you're doing so they know what's okay and what's not okay.

0:32:06 – Kimberly King

Okay, okay. What criteria can teachers use to evaluate the effectiveness of AI tools?

0:32:15 – Doctor Brian Arnold

So this circles back to: it's based on your learners' needs and wants, and understanding that what you think your learners want and need doesn't always turn out to be true. Additionally, those learner wants and needs change from moment to moment and over time. So be keenly aware that these are not silver-bullet solutions. You try something out and you're like, hey, wow, this works really well with the boys in fifth grade, but it's not so great with some of the other kids in ninth grade, and some of these tools are just straight-up inappropriate for kids under a certain age. There's a framework I always refer to called TPACK, Technological Pedagogical Content Knowledge. It was developed before AI, but the idea still holds: look at what you need to teach, what the teaching environment looks like, what content you need to impart, what the students need to learn, and how this tool helps or doesn't help. Because sometimes it's like, wow, wouldn't it be great if we moved into this virtual space and we had this gamified thing and this and that, and you're like, well, how does that in any way address the learning outcomes or the content?

And they're like, well, the students were really into it. Great, okay. So if engagement is your goal, then you've achieved it with this tool, and I realize I'm kind of out on a ledge here. But how do you choose the right tool? Use it for long enough to discard the novelty effect. Almost all students like something new for a short period of time, so look at it over a couple of weeks. Are they still using it? Is it still effective? Is it still getting you toward your learning goals, or has it now become a more complicated way to do the thing you used to do before, and you still have to chase them for their homework?

0:34:18 – Kimberly King

Right, right. So some things never change, right, no matter how advanced and technological you get. What are the long-term implications of using AI in education?

0:34:30 – Doctor Brian Arnold

Great question. The short answer is we don't know. I think the popular sentiment is that we're not replacing teachers. I think there might be a change in the way teachers work, and this is again me with my science fiction writer hat on; this is by no means a fact or even a validated research theory. Teachers may be able to work effectively with more students than before, and simultaneously give them more personalized attention than they could before. Those are sort of the utopian views. The dystopian view might be that teachers are now expected to do a lot more than they did before, and for the ones who aren't tech savvy, some of whom might say, I'm here to teach, not to be an IT person, it might affect their employability in the long term. I think much of what's new and alien now will seem silly in the future.

I like to give the example of a teacher receiving a paper today: she holds it up, she weighs it in her hand, and she's like, wait a minute, did you use a word processor? You didn't tell me you used a word processor. That's a tremendous advantage over the other students.

AI will just be one tool among many that we leverage to get learning done, but that learning will look different, and I’ll come to a conclusion on this. I think in the future, answers will become a lot less focal and important and the ability to craft insightful questions is going to be highlighted and a more valued skill than the way it’s been traditionally.

0:36:26 – Kimberly King

That's a really good point. And to your point about the word processor, I love that. Lately, talking about AI, we've talked about going from an abacus to a calculator and having the AI skills there too, but I also love the move from a typewriter to a word processor, and having that ability to spell-check and all of that. We've advanced in all these ways, and now it just feels like it's come along faster, and we just have to embrace it. So how can AI technology foster creativity and critical thinking in students? You were just talking about that; you kind of flipped the script on the way we ask those questions and the way we go and do the research.

0:37:08 – Doctor Brian Arnold

So, thank you. I'll tackle critical thinking first and then go to creativity. For critical thinking, it's being transparent with your students and saying, hey, let's take one of our standard essay questions, ask ChatGPT, and then analyze its result. Let's see what works, and you can do this with almost any age of learner, any level: hey, is this a good answer? Why or why not? And really get critical with it, with what it is and what it isn't. As far as creativity goes, this is one of those buzzwords, like innovation, where you ask six people for their definition and you get ten answers.

For me, creativity is a novel approach to something, and this is a great tool for defeating the blank page, right? I can only give examples of things I've already used ChatGPT for. I'll say, hey ChatGPT, tell me the story of Goldilocks and the three bears, but make it science fiction, and, you know, make them hamsters, and make it a romantic comedy. And it'll just spit back something, usually honestly pretty good, but not necessarily something that you're going to use.

But suddenly you're thinking about things differently; it's flipped it. Or let's say you're working on a story, and I know not all creativity is stories, but you're like, okay, I've been working on this story forever, and it's this dark piece about a person who struggles. So I'll upload the text: now rewrite this as a comedy, you know, as a musical. And it'll give you something.

But that sort of gets you sparked and creative in the right direction. And now, with the ability to generate images, it's just a great tool to create and add illustrations to what you're doing, to really start thinking and tweaking. Kim, you mentioned how quickly all this stuff evolves. I was using ChatGPT, which I think harnesses DALL·E 3, it might've changed already, to generate images, and they added a new tool while I blinked. So, let's see, I wanted a picture. I work in social emotional learning, and I wanted to create an "SEL fish," right? SEL fish. So I put it in there and it created one. I'm like, oh, I really need some fins. And I looked up and there's a tool: you can highlight the part of the image you want to change and say, add a fin here to the fish.

Yeah, so now it's empowering you to be a little bit of a designer, a little bit of an artist. Now, are artists and designers and writers upset? Absolutely. Should they be? Absolutely. That's a whole complicated ball of wax. But as far as ways to be creative, it just opens up some things, shortcuts you a little bit down the road past skills that take years to learn, to at least get you to see things in a new and creative way.

0:40:10 – Kimberly King

Wow, I love it. It is fostering your creativity, really, to your point. I like the Goldilocks analogy there too, but yeah, you just add to it and really go have fun with it. But again, give it that credibility there, that credit-

0:40:28 – Doctor Brian Arnold

No, I think you nailed it. Go, have fun with it. That is a hundred percent the way to use these tools.

0:40:33 – Kimberly King

Yeah, oh well, this has been a delightful conversation. I really appreciate your knowledge and if you want more information, you can visit National University’s website. It’s nu.edu, and thank you so very much for your time, Doctor. I really appreciate it.

0:40:48 – Doctor Brian Arnold

My pleasure.

0:40:50 – Kimberly King

You’ve been listening to the National University Podcast. For updates on future or past guests, visit us at nu.edu. You can also follow us on social media. Thanks for listening.