How Can AI Support Inclusive Education? - Tamsyn Smith
#7


Priten: Welcome to Margin of Thought, where we make space for the questions that matter.

I'm your host, Priten, and together we'll explore questions that help us preserve what matters while navigating what's coming.

Priten Shah: Technology and education looks different depending on where you're standing.

Today we're crossing the Atlantic. Tamsyn Smith is a senior learning designer and team lead at the University of Southampton in England, and she's also halfway through a PhD investigating how generative AI can support inclusive education.

Her journey spans from learning basic programming at age seven to teaching 11 to 19 year olds, to now designing learning experiences in higher education.

She focuses on accessibility.

How do we ensure that education technology serves all students, including those with disabilities or learning differences?

And what happens to foundational skills when students become too reliant on generative AI?

Let's get started.

Tamsyn Smith: So, my name's Tamsyn Smith, and I'm a senior learning designer and team lead at the University of Southampton in England.

But my background has always been in pedagogy.

So prior to working in higher education, I worked as a teacher, mostly with sort of 16 to 19 year olds, but before that also with 11 to 16 year olds.

So I spent about a decade as a classroom teacher before moving into higher education, and alongside my role as a senior learning designer and team lead, I'm also halfway through a PhD where I'm looking at how generative AI can be used to support inclusive education.

Priten Shah: Very cool.

Awesome.

So I'm sure we'll have lots to talk about today.

Let's start with you as a student.

What's the very first, uh, tech tool you remember using?

Tamsyn Smith: Oh, gosh, now it's gonna reveal my age, because it was probably using CD-ROMs to do research. We didn't have, you know, sort of library searches in the way you would have now.

So just being able to use that. Although I was very fortunate; I actually learned some programming in BASIC when I was probably about seven years old.

I studied computing when I was about 13 or 14, which was quite unusual.

A lot of schools were starting to introduce IT and word processing, but I was actually doing computing.

But yeah, it's, it's a long time ago now.

Priten Shah: That's amazing though.

You just barely hear stories of students encountering that level of computer science literacy, um, at that age now.

Yeah, what are the biggest shifts that you've seen in education technology?

Tamsyn Smith: Yeah, when I was at university, when I was a college-age student, internet access came in in my second year, and so I spent an entire summer in the basement learning how to navigate the internet. But I think that was really unusual.

So actually learning that we could communicate with people in different places so easily, whereas previously at school I'd communicated with people in other schools via letter, that was really exciting, being able to use email.

But I think really the biggest change I've noticed has been the introduction of learning management systems or virtual learning environments.

So being able to easily share resources with students, use discussion forums, and use some of those interactive learning tools has been a massive shift.

It was not something that I experienced when I was a student, but it's what I work with all the time now and it's what I've worked with kinda throughout my time as a teacher.

I introduced Moodle in the college I was working in, and so, yeah, I think that's probably one of the biggest shifts I've seen.

But then there have been so many shifts since then: the introduction of mobile technology, all students having a cell phone in the classroom, and, yeah, people having easy access to tablets and laptops.

Priten Shah: When folks are talking about what AI means for education, how we're gonna integrate it, how we're gonna safeguard education from it.

there's some folks who say this is just similar to the transitions we've always gone through, right?

Whether it be the internet, um, whether it be the computer.

Do you agree with that, that this is just yet another technological step in our progression as humans? Or does something feel different about this?

Tamsyn Smith: Oh, I'm really torn on that one.

I think when people first had calculators in the classroom, everyone worried that pupils wouldn't learn how to do the basics of maths.

They would just rely on their calculators so easily that they wouldn't do the computation themselves to be able to work out the mathematics.

But I think we've realized that's not the case.

You have to have a basic understanding before you use a calculator.

And I think now, with the evolution of gen AI tools, people still need to be able to do something manually.

They need to be able to do things the old fashioned way.

Uh, and so I think it is a shift, but we need to make sure that students know how to formulate a good essay without relying on gen AI.

Because how can you know whether the output is any good if you couldn't create something yourself?

So I think there's a lot of tension there and people are really concerned.

But I think we will spend time learning how to do things, because if you can't do it yourself, you can't assess whether the output from your computer is any good.

Priten Shah: You know, as I talk to different teachers, um, and educators, we're all largely agreed that that process is very important before you get to use the tools.

One thing that I'm noticing that's different is students seem to be even more resistant than I have noticed in the past about, uh, gathering those skills or learning those skills, because the larger message just seems to be that, well, AI will do it for us. And now it's like the compounded effect of computers, internet, mobile devices, and AI, right?

It's not just that AI exists in a vacuum.

And it's that it exists in your pocket, that it exists in your headphones, right?

So what are some ways that you think we can navigate that conversation with students?

Tamsyn Smith: I mean, I certainly think we've had all of these technological changes alongside a huge number of social changes.

So if we look back over the last decade, you know, the last five years, COVID-19 has been something everyone's been talking about.

All of those huge changes that it made in people's education.

So the students who are starting college right now will have missed out on some fundamental time in education, when they'd have been learning how to form friendships, how to navigate, how to write assignments.

And so they've got gaps in their knowledge, and some of the privileged students will have had fewer gaps.

They will have had access to tutors.

Perhaps they will have had access to the internet at home and reliable wifi and their own device.

And so perhaps they will have been less impacted, but I've seen, with students at a whole range of ages, that that has had a big impact on how they're studying, and perhaps people's attitudes towards gen AI would be different if we hadn't had the impact of COVID-19.

So I think those things have come around at the same time, and we're still seeing the repercussions of education moving wholly online and students not necessarily having a good online education experience.

It was emergency remote teaching.

It was not something that staff were trained for.

It was not something we'd planned for.

So, so I think we've had those two cataclysmic shifts at the same time.

But I think students are unwilling.

So many students are saying: will this be on the test? Do I have to know this? How can I get the highest grade?

And I think grade grubbing is a real problem, where students say, well, I'm paying for this education, I want to get a good grade, or, if they're younger, I need to get a high grade to please my parents.

So there are these other factors: they're not seeing learning as a process, they're seeing it as an output.

Priten Shah: I've been trying to talk to, um, educators at all levels, right?

Because this is a problem that you face differently in K-12 versus in higher education. But largely I think we're failing at the task of answering why learn this, right?

Like, I feel like that's really where we all need to kind of refocus and say, okay.

In the past, we said, oh, because it's gonna be on the test.

We gave them that answer at times, right?

And so it's only natural that that's what they're repeating.

We said, oh, you'll need it for your jobs.

And now they're like, oh, I don't need it for my job because I'll have AI at my disposal during my job.

And sometimes we just said, because I said so. Like, that was also, you know, in our most frustrating moments; I know all of us have at least at some point said that.

So, lots of educators have spent time trying to explain that to students.

And we also don't have that kind of time in our classrooms to spend, you know, hours explaining to a college freshman why their education is going to be important.

At least it sounds like you're agreeing that we need to get the students on board with it, partially, um, for us to be able to do any of this.

Right.

And I think that the example of COVID is pertinent, because, um, it shook everything up, and you're right that we never fully settled from it. It exposed a lot of questions and a lot of fracture points, and we kind of all tried to just move along from it, um, and said, oh, things are back to normal.

But I think it brought up really important questions.

It created long lasting impacts.

but also we didn't fully get to say, okay, what does it mean if technology becomes a larger part of our educational systems?

What would it look like if university had stayed online forever?

Like there was a clear turning point.

Um, and then we had three years to kind of deal with the same thing.

and it was still really hard.

And, you know, most institutions like barely got through it.

This one is one where like there's a new turning point every day.

There wasn't like a moment in time where things dramatically changed.

That's part of the fear, is that it took us that long to kind of adapt to this one turning point.

How do we work with our teachers and students to continuously adapt as the landscape of not only education but larger society is changing dramatically? Like, the way folks did their jobs three years ago and the way they do them today are very different.

And sometimes it's subtle enough where we don't fully feel it right away, but when you look back at your emails from three years ago, you immediately see conversations were very different.

Tamsyn Smith: Yeah.

And I think we have lots of concerns about how our students might be changing.

So there's a reluctance to do things in person.

People say, oh, but it's easier online, you know, and there's so many advantages.

You know, I can talk to you hundreds of miles away, and it's really easy.

But we want students to come to campus and form social relationships and have those conversations with people that are outside of their immediate friendship group.

But it's just so much easier to socialize online.

Yeah.

And so students are perhaps not getting the breadth of experience that we want them to have. If they're in a classroom, you can go round easily and hear those private conversations or the small group conversations that are going on.

Whereas if you are teaching online, if you go in and out of a breakout room, they notice you're there as soon as you arrive.

So the conversation stops, it changes, and you don't get the bigger picture of which group is in trouble: they don't know the problem they're supposed to be solving, they don't understand it, right?

You can see that in a classroom.

You can't see those things as easily online.

So we've had all these shifts, and staff don't know how to deal with them.

They worry that students know a lot more than they do in terms of using technology.

But actually, when it comes to using technology for academic purposes, many of our students are not as skillful as we think they are.

They may be excellent at using WhatsApp or any of the other apps they have on their smartphone.

They're great at TikTok, but if you ask them to use a device for academic study, then chances are they're not that skillful.

And so we need to support our students' digital capabilities to make sure that they can use the tools.

We can't assume they know how to.

Priten Shah: And that's true for teachers and students, right?

And I think that that's part of the challenge here: we're catching up two different populations who work in the same system, who come with very different backgrounds and different problems.

And no one is fully there in terms of being able to navigate these questions. One of the things I noticed in the survey that you filled out was you pointed out that teachers don't often read the terms and conditions. Um, what does that mean for your role?

How does that come up?

Tamsyn Smith: Yes.

I think often you have those educators at different ends of the curve.

You know, you've got your early adopters and your laggards, and the early adopters, they see a new tool, they want to use it.

They're like, oh, ChatGPT, must try this.

Oh, I'm gonna try Gemini.

I'm gonna try it.

They're just like magpies.

They're looking for something new and shiny and they just want to have a go with it.

The role that I'm in, I have to be there and say, okay, hold on.

Put the brakes on.

Let's take a step back before you start using this with the students.

What data are you putting in there?

Where is that data going?

What is happening with that data?

Because if somebody's using a tool to create a resource, and they get some output from it and they want to use it with their students, they can make an informed judgment about whether it's pedagogically appropriate. But if they're telling the students they have to log in and use a tool online to create some kind of
assignment or learning resource, the students need to know what's happening with their data. So the staff have to be informed, and they have to make all those choices around that.

Yeah.

Is the tool fully accessible, if you've got a learner who uses a screen reader, who has hearing problems, or anything like that?

We have to be mindful of all of those things.

So we have to look really closely at things like the DPA and all the other parts of the terms and conditions to think, is this actually a tool that's suitable for my classroom?

I'm not sure that all teachers do that.

I know certainly when I was an enthusiastic teacher in my early twenties, I'd see something new.

I'd think, Hey, this is exciting.

My students are gonna like doing this.

I didn't go and read the terms and conditions, I'd sign up, I'd start using it, and I'd get feedback from the students about whether they liked it or not.

It never crossed my mind that there might be ethical issues about the data and what was going in there.

And so I think that's something that we really need to pay careful attention to.

I mean, I have an 8-year-old daughter, and I'm quite surprised at how much use she's making of generative AI tools at school.

I'm assuming it's all done on the teacher's account. She doesn't have her own email address, but I don't know how she's accessing it, and I don't know whether her teacher has looked into that or thought about it.

Priten Shah: Let's start with data privacy concerns when we're talking about any tech tool, but especially when we're talking about AI tools.

How do you help teachers navigate that?

Tamsyn Smith: For the educators that I'm working with, we always talk in terms of supported and unsupported tools: which ones are the ones where we have support at the institution, where there are people who know how that tool works, who have looked into all of those terms and conditions and data issues. The institution that I work at is a Microsoft institution, so we support our staff with using Copilot, and they know that if they use Copilot at my institution, then the data is secure within our system.

It's not going outside of it.

So if they are choosing to upload a university document, we don't have to worry about whether it's being made public or being used in other people's searches and so on. So it's issues around that: what data are people uploading, and where is it going?

You know, if somebody decided they were going to use it to try and assess a piece of a student's work, and they chose to upload that essay, which is not a practice that I would recommend, but I know these things happen, at least we know that that is within our instance, within our Copilot.

Whereas if they chose to put that into ChatGPT or Gemini, I couldn't give them the same guarantee, because they're not systems that we are working with.

So I think it's knowing: is there something that's supported at your institution, where people are saying, yes, this is the one that we are telling you is okay to use, and within our institutional restrictions you will be okay with this tool?

Priten Shah: What do you say to teachers, especially at larger institutions where they haven't caught up yet? We still work with individual teachers who say, my school hasn't even brought up AI.

We haven't had a single professional development session on it.

Um, our school doesn't have a budget for tools or, you know, navigating these conversations.

So I'm left with students who are using these tools on their own, who I know are gonna go home and use these tools to do my assignments.

And I'd like to figure out what role they play in my classroom, because the students want to use the technology.

So when the institution isn't providing that support, um, I find those cases to be the hardest for individual teachers, because there are professionals like you who can spend time looking through those data privacy agreements, helping the institutions as a whole navigate, you know, which platforms are worth the risks that come with using the technology.

And then there are, of course, folks who don't have resources like that, and they have to make those decisions on their own.

Tamsyn Smith: It's the same as other areas of digital capability.

You know, I don't care whether a student uses Word or Google Docs for their word processing, because at the end of the day they'll create a document, right?

And so I think it's thinking about what skills we want people to have.

So do the educators, do the students, know how to write prompts effectively?

Do they know what goes into crafting an effective prompt?

And actually you can do that in a classroom with no computers.

You can talk to students, you can use a framework.

I mean, you know, there's hundreds of frameworks out there that you can use for prompt crafting.

And so I think it's getting people to think about, well, what is it they want to achieve, and what are they going to have to include in a prompt to get that?

And so I think getting students and colleagues to think about carefully crafting those prompts is really helpful.

And I think it's got multiple benefits: they're thinking really precisely about what output they want, so they're going to get a better result.

But I think the other concern that I hear a lot from the people I work with are the environmental concerns.

Right.

And I think if we remind people that the fewer prompts you use, the less energy is wasted, then that's sort of, yeah, hopefully a win, and they'll really think about what needs to go into a prompt.

And I think it comes back to other digital capabilities that we teach students.

If we are teaching them library skills, research skills, we're telling them, well, you need to think carefully about what it is you're searching for.

You can't just say you want books on history; that's gonna give you thousands. So what is it you want to know?

Right.

It's the same process.

So it's all part of that academic research skills.

Priten Shah: Right.

And then what about the data privacy aspect? How do we get students and teachers to make healthier decisions when it comes to data privacy? Because I think oftentimes we hear a couple of different things, right?

You hear from them, oh, this tool is made for schools. That's the tagline.

It's made for schools.

And so it's safe enough to use within our system, within our classroom.

And then sometimes we hear a blanket like, oh, it's AI, so it uses the data, so we shouldn't use this.

We do see both extremes of it, where it's like the bar is very, very low for us to think that this is safe enough for our schools.

On the other end, nothing is ever gonna meet our criteria, because it requires data.

Right.

And inherently tech tools will collect some data in order to function.

These tools would not be as useful as they are if they took no data from us.

Hopefully they don't have to take too much data.

Yeah.

Tamsyn Smith: What data are you giving away when you sign up for a tool that's free or freemium?

So how much information do they gather from you at the start?

Is it simply that you have an email address and log on?

Or do they want to know your name, where you are based, what your job role is?

You know, how much of that personal data about you as a user is it gathering?

And I think, you know, you can generally find that out fairly easily.

But what else?

Are there any cookies?

What's it gathering from you that you're not thinking about?

And then the other side is what data are you choosing to put in there?

And I would always say to an educator: if they wouldn't be prepared to share it on their Facebook page or put it on their work profile, then they probably shouldn't be putting it into a gen AI tool.

I wouldn't be sharing photos of my students, so I wouldn't put them into a Gen AI tool.

Yeah, I mean, it might be a large group photo, or something where it's the backs of students doing an activity.

Yeah, that's fine, but not the students' faces.

So I think, you know, what personal information might be given away about other people?

And if it's not something that I would normally consider sharing, if I wouldn't put it on the LMS, if I wouldn't put it somewhere publicly, then I wouldn't put it into ChatGPT.

So I, I think it's getting people to think about things in that way.

Would they share a list of all of their students' grades? No, they wouldn't.

So let's not put that in when we don't know what might happen with it right now.

Priten Shah: Right, right.

Yeah.

I think that those kinds of, uh, heuristics are really useful for teachers, 'cause I think it all feels new.

And so I think everybody feels like they're starting from scratch when they're making these decisions.

But I like the idea of pulling on the social media heuristic to kind of guide them, at least in the short term.

Okay, here's a quick and easy gut check you can do.

That's very valuable, and I hope folks are listening to that.

I wanna make sure we get time to talk about your research.

So tell me a little bit about what that looks like.

I know you mentioned you're thinking about AI and its possibilities for inclusive education.

Tamsyn Smith: Yeah.

So, I think many institutions in the UK are not as advanced as those in North America in terms of implementing Universal Design for Learning.

But it's something that I'm particularly passionate about.

I think it helps to solve a lot of the problems that we experience in classrooms, where we have put up barriers that prevent students from learning in the way that they would like to.

So I think a lot of the focus around gen AI tools has been telling students, hey, you can use this tool and it will make things better for you.

But actually, if a student already faces some challenges in their education, perhaps because they have a disability, then telling them there's a tool that can help them is great, but it's still making them do extra work.

As the educator in the classroom, we should be doing the work.

We should be opening the doors to make sure as many students as possible can achieve.

So it's looking really at how we can minimize those barriers for time-pressed teachers, to make things easier for their students, because I think that's what I hear from most educators now: they really want things to be better, but they don't have any time.

Everyone's workloads have increased.

I think workloads got bigger during the pandemic and then never came back down again. That's the story I hear from my colleagues here.

So I think we can look at ways that simplify how educators can present resources to students, and help them come up with ideas.

My team has a number of Lego Serious Play kits, which are absolutely fantastic, but we've also got a lot of educators who've never used them before.

They say, well, what could I do? And we say, well, it's really great if you've got some hands-on activities that get the students discussing and thinking, and it's more fun.

Talking with a gen AI tool, asking questions, getting ideas: it can help with idea generation, with thinking, I've got these resources, this is the learning outcome I want, how might I use these resources?

And so we're making better use of the resources that we've got.

If we've got a video, it might not already have subtitles, so we can use it to help generate those captions.

It still probably needs some human editing to make them, yeah, as accurate as they need to be.

But then we can upload those again and quickly create a transcript and a summary.

So it's coming up with all of those alternative formats really easily.

Again, we need to be mindful of what that video is that we're uploading.

But it may be that, you know, I've recorded something explaining a point for students, and therefore I'm happy with uploading it.

So I think it's all those possibilities of how we can create alternative formats and make content more accessible, but also being mindful that if we are asking for image generation, is it coming up with an image that's appropriate?

If I ask for a picture of a soldier, will it always be a man? If I ask for a picture of a nurse, will it always be a woman?

So what stereotypes are we getting?

But my recent experience is that the tools are improving so quickly that, you know, with those prompts you will get a variety of different images back, and they won't all be the stereotypical image in your head.

I think there are so many possibilities.

Priten Shah: You talk about busy teachers. So the narrative around AI is largely that it will help them, well, you know, lessen their workload.

Free up more time.

Are you seeing that over there?

Because I'm not seeing much of that here yet.

It largely seems to be adding to teachers' workload, more so than taking away.

Is that a universal experience?

Tamsyn Smith: I think it is that eternal optimism, where actually we always know that when a new technology comes in, you've got that steep learning curve.

You say, oh, well it's gonna be time upfront, but you'll gain your time back later.

But actually in education there's never a later, there's always something more that gets added to someone's workload.

Right?

So yeah, it's feeding the capitalist system really.

We save the time that we can then use for something else that's making money for someone.

So I think there are some ways where people are saving time.

We have quite a lot of manual processes at my institution where somebody might be uploading some data somewhere and then it has to go somewhere else and somebody has to do some processing.

And I think where people are looking at streamlining processes, and how some simple outputs can come from that data, then they are having some time savings with using AI tools.

But right now, I think people haven't got enough experience.

They don't know what the possibilities are.

They're worried about the risks. And although there's a lot of progress in the tools, in terms of people's behavior and attitudes it's quite slow moving. I've encountered a lot of educators who tried ChatGPT when it first came out, and then when I speak to 'em, they say, I'm not using gen AI because it will hallucinate.

It's told me things that are clearly incorrect.

It doesn't know the latest information.

So this tool is rubbish.

And I say, well, actually, have you tried this?

You know, I've done some research recently.

It has generated some excellent, really interesting references for papers that I've gone away and read.

And they were exactly what I wanted.

And it's not perfect every time, but it's certainly better than it was a year ago.

And I think we have to remind staff that it's changing very quickly.

They do need to keep going back and checking because what they thought was right yesterday won't necessarily be true tomorrow.

Priten Shah: I think without institutional support, the burden that's falling on individual educators is so high.

Because as somebody who spends most of my day, like thinking about AI and education, I can barely keep up.

Like, there's constantly a new tool, there's constantly a new model, there's constantly a new lawsuit against one, right?

That scares me a little bit, because I don't think we have the full resources to make sure that everybody can navigate those changes rapidly enough.

Yeah, I think we face the same thing.

We had teachers who've tried only the free model from when it initially came out, and then we have other ones who are, like, trying out the $200 a month plan that gives you the deep research tools.

I mean, it's an astronomical difference in quality of output.

And so buy-in is very, very different.

If there is one thing that teachers can take away from your experience, in terms of how to make sure that the way they're navigating all the rapid changes happening in education is fair to our students, what would it be?

Tamsyn Smith: Don't assume that students are misusing gen AI tools.

I know from research we've done at my institution, where we've asked students about their usage, that yes, they're using it, but they're actually very ethical.

They understand, you know, that they're studying at university to get an education.

So yes, they might use it to help with their research.

They may be using it to help cut the word count down if they've written too much.

But most of our students are honest at heart. There are all these concerns about academic integrity.

But whatever technology we were using in the past, students who were determined to cheat would've cheated.

They would've paid, you know, essay mills.

They would've found someone else to write it for them.

They would've plagiarized something.

So if someone's determined, it's a different way of cheating.

But I think most of our students really do want to do the work.

We need to look at why. If they are using gen AI to generate their essays or assignments, why are they doing it?

And is it because we're actually expecting too much of our students?

Are we giving them just too much busy work without giving them time to think about what it is they need to be learning?

It's talking with our students about their usage and understanding what they're doing.

We shouldn't just assume that they're going straight to cheating.

Priten Shah: that's very valuable advice.

Thank you.

I really appreciate your time.

I feel like there's a lot of overlap between the kind of work we do, just in obviously different contexts, but, um, it was reassuring in some ways to hear that the experiences are global.

Tamsyn Smith: We've got global students on my course, 25 of us, and I went to a webinar we did yesterday, and one of the folks is working in Canada, and he was talking about the same problems people have got in the Middle East.

And so I think, yeah, they're all the same problems, and whether people are working with 13 year olds or 23 year olds, it's kind of the same issues.

Priten Shah: Yeah.

That's amazing.

The last thing I'll leave you with is: I teach part-time in higher ed.

I'm working with students on just how AI can be productively used in their lives. We did a lesson on biases last week, and we tried to do image generators.

They still failed.

I was quite disappointed, because I'd seen the image generators were getting better.

We had every single student try to do a picture of a doctor and a nurse, and it was largely, like, 90% male to female.

The other one I like to do is have them ask AI to explain the same concept to their mother versus their father

and see what analogies and metaphors it uses.

That's an aha moment for a lot of them.

'cause it's a little bit more insidious.

It's not so direct.

Yeah.

When you're like, oh, help me explain AI to my mom, it'll give me, like, oh, when you're doing laundry, or when you're cooking a meal. And when you're like, help me explain AI to my dad, it immediately goes, oh, when you're in the garage with these tools.

That's a fun one to get them to start making that connection that they need to read a little bit deeper, um, to really see where that bias shows up.

And even then, those are still not as obvious as the more egregious levels of bias and stereotyping, but still fairly obvious.

Tamsyn Smith: I'm sure you've seen the global Barbies. We've got a number of African colleagues on my course, and that was one of the first things they talked about when we discussed all the stereotypes you get.

And they were like, yeah, look at this Barbie that's allegedly from the country I'm from.

And we're like, ah.

Priten Shah: I think a lot of the cultural stuff is really interesting to see.

Um, and so yeah, you don't really realize unless it's a stereotype that you can, like, fully see, or you have enough global awareness, which are both sometimes, um, lacking.

I appreciate your time.

Tamsyn Smith: Awesome.

That's great.

Thank you so much.

It's been really interesting talking to you this evening.

Priten Shah: Same here.

Have a good night.

Tamsyn Smith: Thank you.

Bye

Priten Shah: Tamsyn emphasizes that ethical ed tech has to account for more than just efficiency or engagement.

It has to work for everyone.

Her research on using AI to support inclusive education, combined with our concerns about students losing foundational skills, captures the tension we all feel.

How do we harness technology's potential while protecting what we can't afford to lose?

Keep listening this season as we continue exploring what responsible innovation in education really means.

And if you're interested in learning more about responsible tech integration, pre-order my upcoming book, Ethical Ed Tech, at [email protected].

Priten: Thanks for listening to Margin of Thought.

If this episode gave you something to think about, subscribe, rate, and review us.

Also, share it with someone who might be asking similar questions.

You can find the show notes, transcripts, and my newsletter at priten.org.

Until next time, keep making space for the questions that matter.