From Cookie-Cutter to Captivating: Empower Learning through Principled Approaches to Assessment
Discuss principles and showcase inspiring examples from educational institutions including Iowa Workforce Development and Utah State University. Discover how forward-thinking formative and summative assessments ignite student engagement, enable precise outcome measurement, and optimize the learning experience.
Good morning, everybody. Happy Friday. Thank you for joining us this morning. So, we're happy to be here. We'll just make real quick introductions. My name is Neil Legler.
I'm the director of the Center for Instructional Design and Innovation at Utah State University, which is our centralized instructional design and faculty support for all things technology and academics. And I'm Tony Hoskin. I'm the VP of curriculum at Atomic Jolt. I've been here for about three years, and prior to that I was at Utah State University in the teacher education and leadership department.
And then down in the audience — if you'll raise your hand — this is Joel Duffin. His name's up here, and he's a great resource if you have questions. Also Justin Ballback, there on some of the technology stuff. So yeah, today we're talking about some of the principles that we've used at Utah State University to guide assessment.
And also giving some case studies, some examples of what some teachers have been doing — in this case, particularly using Atomic Assessments. And then Tony, I believe, has a sneak preview of some cool new stuff. Okay. So, around the room, you had a couple of topics. We're just gonna run through some of those topics today, and then you'll have a chance later to discuss these a little bit more amongst yourselves, and we'll be able to post those to a Jamboard.
Hopefully you'll learn from each other more than you learn from us today, because that's always the fun part of these conferences. So one of the things that guides the way we try to design assessments, when we have a chance to take a role in it, is this. We recently went through as a team and read this book by James Zull, The Art of Changing the Brain. Have any of you had a chance to read this one? If you haven't, it's a pretty cool book. It's built as a cognitive science of learning around the biology of the brain. And one of the neat things he does in this book is he takes Kolb's learning cycle — concrete experience, reflective observation, abstract hypotheses, active testing — and he maps that onto the physical brain. So when we have an experience as humans, when we're learning, basically the experience gets registered back here in the sensory and post-sensory area of the brain, in the back.
And then it starts to work its way down into the amygdala region, the temporal integrative cortex. That's where you start to respond emotionally, a little bit immediately, to that experience, and you start to connect it to your immediate surroundings and the way you're feeling. And then there's the front integrative cortex — and I'm not a neuroscientist, so I am quite possibly butchering this.
If you are one, please forgive me if I'm butchering something. But up here in the front integrative cortex — this is the control center of your brain, where you're controlling some of those impulses, and it's also kind of the logical reasoning center of the brain. That's where you start to form some of these abstract hypotheses around what it is you're experiencing. And then you go up to the premotor and motor part of the brain, where you're actually acting out
your ideas, your thinking. In Kolb's cycle, that's the active testing phase. And then it feeds back, and that becomes the end of the cycle, where you have that concrete experience again, where you can respond and see how it worked out for you. And then it continues to cycle. So when we have this section over here about learning — learning requires cycle completion —
we really like to think of assessment as being an important part of that cycle completion. Because if we're sitting up here talking, that's maybe the concrete experience, right? And maybe you get a little bit of that reflective observation. But you need something to complete that cycle, where you do the active testing and get that feedback. Okay, we'll talk through this real quick. Obviously this is nothing new to anybody here.
But we look at testing as both a formative and a summative tool. Testing plays a major role in learning. It's one of the things we're really starting to understand better and better as time goes on, although there have been many studies going back to, like, the nineteen thirties on this idea of retrieval practice. Testing as a learning tool has been shown to work time and time again: they've taken people who just read material versus people who read and self-tested.
And they've seen that for the people who read and self-tested, basically at the point they self-tested, they stopped forgetting. Whereas for the people who had just read the content, later on in post-tests, their level of recall was quite a bit lower than the people who had self-tested. So retrieval practice — testing — is important as a learning tool because it interrupts that process of forgetting. And then there are other concepts too, like the idea of spacing.
If you've read the book Make It Stick, it goes into a lot of these things. If you give a little bit of time to start to forget the material, and then you test it, sometimes that can actually make it stick a little bit better in long-term memory. So one of the things we try to do at Utah State is encourage our faculty — and I don't think we're unique in this; I think instructional design shops around the world are doing this — to use varied and interspersed assessment. To do more than just have a bunch of lectures and two midterms and a final, or one big massive summative test, but to make assessment something that happens throughout, and make it something that's mixed up.
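To make that spacing idea concrete, here's a toy expanding-interval review rule in Python — my own sketch of the general principle, not anything prescribed by Make It Stick or used by any particular tool:

```python
def next_interval(days, recalled):
    """Expanding-interval review: double the gap after a successful
    recall, drop back to one day after a miss."""
    return days * 2 if recalled else 1

# A learner who mostly recalls sees the review gaps grow, with a reset
# after each miss. Starting from a 1-day gap:
schedule, gap = [], 1
for recalled in [True, True, False, True, True]:
    gap = next_interval(gap, recalled)
    schedule.append(gap)
# schedule is now [2, 4, 1, 2, 4]
```

The doubling rule is arbitrary; real spaced-repetition systems tune the multiplier per item and per learner, but the shape — longer gaps after success, shorter after failure — is the point.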
So we see students responding in different ways. I have a couple of examples. This one's a bit notorious in our office, because we have an excellent Latin instructor — may you all have the opportunity to meet him one day. Just go and look up Mark Damen.
He's got stuff online, he's got a website, he has recorded a bunch of stuff. He's just a delight of a person. And he has spent so much time on his classes.
So he's put together in his courses some really scripted lectures, and then throughout his whole course he follows those up with all of these various exercises. And again, we actually transferred these, in some cases, from Classic Quizzes — if we had started in Atomic Assessments, we could have had some of these as drag and drops. But he's built, throughout his entire course, all of these opportunities for people to fill in the blanks, answer questions from the lecture, circle options, put in translations.
And then he has his grammar drills, where he has them identify the parts of speech. I don't know if everything's gonna pull up on the Wi-Fi here, but he's really done a good job of building assessment in and giving students the chance to just practice, practice, practice all the way through the learning. I'll go to the one that's loaded here so you can take a picture. And then I have another example here too. This is a stats class, an intro to stats, that goes over the normal curve.
In this case, what she's done is she's used Atomic Assessments to combine the lesson material on the normal curve and structure it in with practice exercises. I'm totally not familiar with this content, but she's got it all set up so that they can check answers — ta-da — and then continue to try. Yes, I see you have a question back there: "How do they embed the assessment?" So there are two ways with Atomic Assessments that you can do this.
In this case, what she did was she actually built the content in Atomic Assessments, because it has the ability to plug in passages and videos and images and calculators and rulers and all that, and then to plug in questions and intersperse those. What this is using under the hood is Learnosity, which is what a lot of the publishers are using. So if you ever see your kids do their math homework and it looks like this,
they're using, in a lot of those cases, Learnosity. This is wrapped up and put into Canvas in a way that we can have these interactive activities, but it can also embed into a page. So we do have people who start with their Canvas pages — like the Latin example here. He started with the Canvas page, and then you have a little bit of our CIDI design tools, progress things, and all that other stuff. But then we've embedded the activity into the page as an assignment. Yeah.
This is down here. Yep. So up here is Canvas and Kaltura, and then down here is Atomic Assessments. Okay.
So then another principle we look at is trying to approach assessment thinking like an assessor. This comes from the book on backward design, Understanding by Design, by Wiggins and McTighe — McTighe? I've never figured out how that's supposed to be pronounced. It's really trying to go into our assessment thinking about more than just having something to give them a grade, but actually thinking: what kind of evidence do we need to see that they've met our objectives? What specific characteristics in student responses are we actually looking for that are gonna help us understand that they really are getting it?
And does this really help us to infer that they've learned, beyond maybe just multiple choice? Are we really trying to make our assessment authentic? We spent some time a little while back trying to get a little bit of training. None of us in our office are psychometricians, but we spent some time trying to get a little bit of consulting. Any psychometricians in here? All right, even better.
Okay. So there actually are standards around assessment. There are these organizations — I don't know what all the acronyms up here stand for — but there are specific standards of educational testing that include validity, reliability, fairness, and security. And in designing good assessment, especially summative assessment, this specifically applies.
We're looking for validity: does it actually measure what we're intending it to measure? Does it do so consistently, so that if we offered the test to people of different backgrounds, we would get a roughly consistent result, and if we gave it to the same student today and next week, we'd get a roughly consistent result? Is it not biased against people with specific characteristics — is it fair? And also, is it secured against cheating? And then, in that same consulting, I'm citing some work done here by Caveon.
I don't know if they're here at this conference today, but they're the ones we were talking to for this. They gave us this acronym — tips for test items: they should be congruent; accurate, so the correct answers are correct and we don't have false incorrect answers that can be construed as correct; relevant to the material being taught; and appropriately difficult, to differentiate between a student who's knowledgeable and one who's not. So if you've played around in Canvas with the quiz statistics, where they have the differentiation score, we'll pull those up sometimes with faculty and look at them and say, okay,
which of your assessment items is everybody getting, or nobody getting — which ones don't have that correct differentiation? And then we have an example here. Again, these are Atomic Assessments examples. Our college algebra course, math ten, has created a bunch of competency-based assessments. They require students to go through and take these until they've mastered them — scored up to a certain level — and then they're able to move on.
And they've done a really good job going in and tagging all of their items, and within all of their items, building out datasets, so that each item can have a different set of variables and a different set of results. And every time they take the exam, it uses the item tagging. They go a little bit — I mean, this is a bit crazy here — but they really use item sampling in a big way to pull from their item banks and vary it quite a lot.
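As a rough illustration of two of those ideas — the differentiation (discrimination) statistic and dataset-driven item variation — here's a small Python sketch with made-up data. It is not Canvas's or Atomic Assessments' actual implementation; the formula shown is the common upper-lower discrimination index:

```python
import random

def discrimination_index(scores, item_correct, frac=0.27):
    """Upper-lower discrimination: the share of top overall scorers who
    got this item right, minus the share of bottom scorers who did.
    Values near zero (or negative) flag items that don't separate
    knowledgeable students from the rest."""
    n = max(1, int(len(scores) * frac))
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    top, bottom = ranked[:n], ranked[-n:]
    return (sum(item_correct[i] for i in top) -
            sum(item_correct[i] for i in bottom)) / n

def make_variant(template, dataset):
    """Pick one variable set from an item's dataset, so every attempt at
    the 'same' item can show different numbers."""
    values = random.choice(dataset)
    return {"stem": template.format(**values), "answer": values["answer"]}

# Hypothetical college-algebra item with three variable sets.
template = "Solve for x: {a}x + {b} = {c}"
dataset = [
    {"a": 2, "b": 3, "c": 11, "answer": 4},
    {"a": 3, "b": 1, "c": 10, "answer": 3},
    {"a": 5, "b": 5, "c": 30, "answer": 5},
]
```

An item where the top and bottom groups perform the same scores near zero — exactly the "everybody gets it or nobody gets it" case worth reviewing with faculty.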
Okay, any questions? Because I'm going quickly. Okay. And then the other thing — this again kind of comes from that Make It Stick book — is the idea of desirable difficulty, and I look at this as being more in the formative assessment realm. What can we do in our assessments so that it's hard enough that it presents a challenge and makes you reach a little bit, to interrupt that process of forgetting, but not so hard that it's defeating? And maybe, if we can, can we make it so that students are motivated to go back and try again?
So we have another example here where the teacher's gone through and built — this is a political science course. He's gone through and just built his modules: his vocabulary items, his mapping items. He's built all of those into an assessment, so that as they go through the activity, they're assessed as they go and they're learning as they go. But again, they have the opportunity to check their answers multiple times.
And I have another example here, of an educational statistics course, where again she's giving them three opportunities to test. When I look at this, I think of my own son, who's now fourteen. He was in eighth grade last year, and there was a game he did for school that had him tag all of the countries in the world, and then all of the states and capitals and things in the country. And he spent probably hours at home on his own, just sitting there, practicing it over and over and over, seeing how fast he could do it, seeing if he could ace it, taking the borders off and seeing if he could do it without the borders. He didn't have to do it, but it was set up in such a way that it became fun.
And so now his knowledge of world geography, at least for right now, is pretty awesome — until maybe, you know, political things move the borders. But it's great. He can tell me where any country happens to be. So it's really exciting.
So those are some of the main principles that we use to guide our assessment. Now what we're gonna do is have you go back into the groups you were grouped into at the beginning, and just discuss some of these principles. Share with each other, maybe, what you're doing at your school to apply these principles, what you're doing in your design. And there is a Jamboard here — you can either pull it up on your laptop or on your phone; we've got the QR code. See if you can pull together an idea and post it up there.
And then we'll just pull that up and take a look at it. So we'll take about, what, three minutes? Okay. Okay, so we're gonna go ahead and wind the discussions down. We're happy to hear that you're all discussing principles of assessment and how it's being used.
We will check the Jamboard later today as well, and keep it open if you want to refer back to that discussion later. If there are questions, you can post them on there; we'll check it again later today and later next week as well. We're gonna go ahead and move on to a case study with Iowa Workforce Development.
So, Neil did a fabulous job discussing the principles and how those are utilized in higher education. I wanted to showcase a few options for how those principles are integrated into adult basic education. Before we go too far into the case study, I want to do a quick raise of hands: who even knows what adult basic education is? Oh, wonderful. Okay.
Now let's take it a step further: who is an adult basic education provider, or is partnered with an adult education provider at a community college? Great. We had more than I was expecting. So I wanted to give just a quick foundational knowledge of what this is before I dive into the case study, because it's a slightly different world than higher education. Adult basic education provides foundational skills to adult learners who have not completed primary or secondary education.
So we're mostly working with people who haven't completed high school, for a variety of reasons. Another large population right now is refugees. They're coming over with little to no formal education; they may not be able to read or write in their native language. That can be a huge barrier to testing — to formative and summative assessments. And we're working with a variety of age ranges.
You can have eighteen and up — Iowa had a ninety-two-year-old join the class. So you're trying to imagine digital literacy and how this is gonna work in Canvas, right? We partnered with Iowa Workforce Development to develop a statewide distance education curriculum and build in formative assessments along the way, to assist with teacher-led instruction and also allow students to work independently. This was also a struggle. They'd go home and they're like, I don't know what to do now.
I can't even navigate Canvas, let alone go back and forth between discussions, quizzes, and assignments. So we needed to find a way to streamline that so we could get valid formative and summative assessment results. Again, the largest barrier was literacy — being able to actually read the question in order to answer the question — and then also just digital literacy skills, navigating in Canvas. And again, it's always difficult to find challenging and novel questions that don't overwhelm people. So I'm gonna go ahead and hop into some of these examples so you can see how we implemented some of these items.
This is a publicly facing course — let me actually go back. If you want to access this course, so you can look on your phone or your laptop and follow along more easily, here's the QR code. There's also a link here for the presentation afterwards, if you want to access it later.
Okay, I'm gonna go ahead and hop in now. A couple people are taking pictures — okay. So you can see that we've kept the learners on one page for the entire virtual class: the pre-class, the post-class, and class time.
So when you move into the actual — this is moving around a little bit. So when the student attends class, we built out facilitator instructions, because this specifically was a reading and writing course, and we know that explicit teacher-led instruction is an important component of successful reading. However, with distance education, working on Zoom and having them navigate at the same time was really difficult. So we recommended that they pull up Zoom on — often they'd have a phone and a laptop.
So they'd pull up Zoom on the phone, and then they would have Canvas on their laptop, and they wouldn't have to leave the page. And it did wonders. They could come here, and on the virtual class assignment — again, an embedded Atomic Assessment in there — the teachers could do the "I do" model first, then the students could move to question number two and do it together at the same time. So it really didn't require high digital literacy skills, or literacy skills, to successfully complete the assignment. During virtual class the students could complete it, and then the teachers could go into the reporting and tracking and see how the students did, to inform instruction right away, so they didn't have to wait.
And then we also found that some of the seasoned learners were struggling with all of this text. So there are also accessibility options in here, where they can change the color scheme and also the font size — they can make the text huge so they can see it. So that's not a barrier for them. And then after class, they already knew how to navigate to that page because of the instructor. They could just scroll down and complete the post-class activities right on the page.
We're gonna go ahead and skip down to one of these so you can see what it looks like. Again, literacy is a barrier — they can't even read the question, so it's a struggle. So we went ahead and added an audio player option so they can play the question stem before they respond. If they're beyond the emerging-reader stage and can read it themselves, they don't have to use the support. Some of the other question types we utilized: a lot of them have reading disabilities, so there's an option to have a line reader.
It makes the text feel not as overwhelming to the students, so they can read line by line. They could also read a paragraph at a time, and that was extremely helpful — they weren't overwhelmed by the text before they even got to the questions and didn't just want to give up. Some other options: vocabulary is huge in reading.
And we want to actually test their vocabulary in the assessment, not their reading ability. So again, we've built some assessments out on the page where they can listen to the question stem, then come down here and also listen to the vocabulary word. So we're actually testing vocabulary, not reading ability. Some other novel approaches: we don't want to over-test people, and they get bored with just multiple choice questions and such. So we've built in a lot of different question types.
We've also given them the option to audio record. They really struggled with audio recording in discussions and such, because there were multiple clicks and steps that would lose them. So we went ahead and just added an audio recorder question, so they could practice their speaking abilities. They may not be able to type it, but they could speak it.
And they're also practicing their English language learning skills as well. Some of these question types were already showcased a little bit by Neil, so we'll skip through some of those. Another item that the students really loved was the opportunity to have a little bit of scaffolding. So they can highlight in the sentence, you know, the subject and then the verb — but they might have forgotten what that means. So they can go ahead and use the first hint, and if that wasn't helpful enough, they can use the second hint.
And the students are really empowered to complete the assignment independently. This was huge for them. Then after the post-class, the teachers could go ahead and look at tracking progress, and they could inform discussion for the next class, or reach out and say, hey, this student needs one-on-one support before we even start the next class, so they can stay caught up — and that increases retention as well. So, any questions on that? Yeah — "Especially for the highlighter options, how does that work with issues like visual impairments?" Sure. So the majority of question types — and Joel, correct me if I'm wrong — but token highlight does work with a screen reader. It's Section 508 compliant and meets WCAG 2.0 standards. So those question types they could technically use with a screen reader that they'd have. Good question.
And this is actually open content, so it's available to use. We did enhance it with Atomic Assessments, but you could, you know, move it into Quizzes if you wanted — you just kind of lose something; the digital literacy needs to be a little bit higher at that point. Yeah.
And if you have any other questions afterwards, you can go ahead and grab me. I'm gonna skip through just a little bit — I went ahead and put screenshots in here just in case the internet wasn't working for me. So we're gonna go ahead and move on to the authoring aids. We've talked about all of these assessment principles, and you as instructional designers know how to implement them.
But do you have the time to actually do it? That's the question. Also, can you get faculty buy-in? Right? You can teach the faculty about assessment principles, but do they have the time to create it? Even on paper it takes time, but then to migrate it to a digital system — it's really overwhelming for the faculty, and also sometimes for the designer, to do it at scale. So I wanted to showcase the authoring aid. This is in beta right now.
It's usable, it's available, so we wanted to show what it looks like. Atomic Assessments has over forty different question types that you can use at your pleasure. The authoring aid is focusing on three right now. So, do you guys wanna create an assessment and see how fast it works? Yes?
I am so excited about this, for me personally, and I hope that you enjoy it just as much. So the three options that we have are multiple choice, true false, and also a cloze drag and drop. This is just kind of the first part of connecting AI and integrating it with Atomic Assessments. And then it can also translate the items. My mind is baffled by this — it's just a click of a button.
You can just translate it and it's there. You don't have to copy and paste and go into ChatGPT and then go back and ask, is this correct? So, who wants to be my little guinea pig? Who wants to choose what question type we use — multiple choice, true false, or cloze drag and drop? Drag and drop? Okay. Who would like to choose a topic? Is there an assessment that you're currently working on? Anybody got a question right off the top? If not, I'll come up with one. Okay.
Anatomy it is. So, anatomy — and then let's focus on lungs. We just go ahead and pop in a subcategory. If you have some text from your actual content, you can copy and paste that in, and then you can drill down further into a subtopic.
So we're just gonna go ahead and do something that's already in here, which is the pleural space. Let's see what it comes up with. This is helping the teacher, in a scaffolded approach, right, to create principled assessments. Okay, so let's go ahead and pick a difficulty — or let's actually pull up —
Okay, a difficulty level. Do you want easy, medium, or hard? Easy. Now, this is one of my favorites: it has Bloom's taxonomy framework right in here.
So you can select: do you want a knowledge-based question, comprehension, application, analysis? What do we wanna pick? Synthesis? Synthesis, okay. Let's see what it comes up with.
We can do easy synthesis. And then we can pick the number of blanks. Since it's easy, let's just go ahead and do, you know, three blanks — we won't blow it out of the water — but let's provide a lot of possible options so we can see what it comes up with. I'm gonna go ahead and generate this response. It's going to take a second because of the awesome Wi-Fi here.
So while it's doing that, I'm gonna go ahead and do a multiple choice question as well, so you can see what that looks like. "So is this generating just one question right now?" Yes. Right now the authoring aid does one item at a time; eventually we want to expand it to do a full assignment, right, and then allow the teachers the option to go in and make edits as they see fit.
Good question. So let me go ahead and do a similar type of question with multiple choice, so you can see some of the feedback that happens in here as well. We'll have that one baking in the oven while this one's already ready. So you can see — voila, we're done with the question. It pops up a question stem, and you can go over to the left-hand side right away and say, oh, I wanna make some changes.
I wanna add an image, or maybe I wanna change the text. You can change the possible responses if there's something you don't love or want to take away. You can also tell the authoring aid, hey, I don't like this distractor — can you pick a new one? You can tell it why you don't like it, and it'll populate a different response for you.
And then you as a teacher can go ahead and check things, manipulate it, make sure it's what you want, change things on the left-hand side, then save it, and you're done with the item. It took literally two clicks. Super great. Now let's hop on over and look at the multiple choice question really quick, because that's done baking as well. So you can see — something that I feel takes such a long time is even using ChatGPT.
I have to copy and paste so many different things and put them in so many different boxes. So this just streamlines it, so it can happen fast and effectively for the teachers. You can go through here and check: do I like the distractors? Do I like the incorrect responses? Do I like the correct response? And then again, you can just go ahead and save it, or you can make adjustments.
It literally takes a couple of clicks and you're done. Any questions? Is that just making your day? Do you wanna see the translation? It's actually really neat, and it's super quick. So — I saved that item — I'm gonna go ahead and just select translate. I'll go into the item itself.
What language do you want? Spanish. So that took one click, two clicks, three clicks. It'll probably take a second to do its translation magic because of the Wi-Fi, but we'll see what it looks like in just a second. While we're waiting on that, any questions or discussion points? Yeah, go ahead. — That's still being worked out. At the moment, you can go in and beta test it right now within the platform, and then we'll kinda go from there.
Good question. Joel, if you have a better answer, you can answer that. For those — I will say this isn't using this, but we have had an instructor at our institution, a CS instructor. In Atomic Jolt, you can actually go in and get the raw JSON for a question, and then you can use that, if you want, to manipulate it and kinda make custom question types and things.
But he went in, copied the raw JSON, plugged it into ChatGPT, and said, here's the format — can you now create questions for me in this format? So he asked it to create a question in that format, and it turned it out in the JSON, and then he just copied and pasted that JSON and got all the distractors and everything in one fell swoop.
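A minimal Python sketch of that trick — note that the JSON field names here are invented for illustration, not Atomic Assessments' real item schema:

```python
import json

# Hypothetical item schema used as the example shown to the model.
example_item = {
    "type": "mcq",
    "stem": "Which gas do plants absorb during photosynthesis?",
    "options": ["Oxygen", "Carbon dioxide", "Nitrogen", "Helium"],
    "correct": 1,
}

def build_prompt(example, topic):
    """Show the model the exact JSON shape and ask for a new question."""
    return ("Here is the JSON format for a quiz question:\n"
            + json.dumps(example, indent=2)
            + f"\nCreate one new question about {topic} in exactly this "
              "format. Reply with JSON only.")

def parse_item(reply):
    """Validate the model's reply before pasting it into the platform."""
    item = json.loads(reply)
    if not {"type", "stem", "options", "correct"} <= item.keys():
        raise ValueError("missing fields")
    if not 0 <= item["correct"] < len(item["options"]):
        raise ValueError("correct-answer index out of range")
    return item
```

The validation step matters: a model will occasionally return malformed JSON or an answer index that points at nothing, and catching that before import is cheaper than debugging a broken quiz.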
So it's something you can do — one of our CS guys did it. Any other questions? Yeah, go ahead. "Where do you draw these questions from? What resources are you using to come up with these questions?" So this is being integrated with AI.
Specifically, ChatGPT. It's basically honing in and helping a teacher focus on what needs to happen in order to create a principle-based assessment, and then we'll be able to take it a step further. "To play devil's advocate a little bit here — do we know that these questions meet the standards of validity, reliability, fairness, security?" That's where we have to be good human beings.
That's why it's called an author aid. I'm I'm just I'm that's so that's my issue with using check check GPT to create assessments is. You anyway, I don't Yeah. Yeah. Yeah.
Oh, I totally understand. Yeah, it comes with all the caveats. Let's go ahead and have Justin pop in here. So, first off, this is a product we offer as a service,
and we're also in this community. Okay. I mean, I have a million other questions, and I'm not gonna, you know, monopolize all the time here, but I was just wondering that. Just kind of talking to myself.
Sure, why don't you go ahead and stay after, and we'd love to chat. Another brief response to that is people are taking, like, multiple generative models: you get the output of one, and then you ask ChatGPT, essentially, is this valid, is this accurate? Gotcha. And you can actually get better results from combining multiple, which is great. Interesting. Yeah.
Typically, especially for student interactions, you don't wanna rely on single-shot interactions from your AI, even though single-shot is typically what you're gonna see out of the box. Right. You're gonna have one iteration act as the researcher to check for accuracy.
So you run it through all of those before you have the final version. And so is that what Atomic Jolt's gonna do? Because teachers aren't gonna do that, because they don't know. Yeah, this all happens behind the scenes. Okay.
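A review loop like the one just described, where one pass generates and another pass acts as the checker, might look roughly like this. The stub functions stand in for real model calls; this is a sketch of the general pattern, not Atomic Jolt's actual implementation:

```python
from typing import Callable

def review_loop(draft: str,
                reviewer: Callable[[str], str],
                reviser: Callable[[str, str], str],
                max_rounds: int = 3) -> str:
    """Run a draft item through reviewer/reviser passes, the way you might
    chain one model's output into another model acting as a fact-checker.
    `reviewer` returns an empty string when it finds no problems."""
    for _ in range(max_rounds):
        critique = reviewer(draft)
        if not critique:          # reviewer found nothing to fix
            return draft
        draft = reviser(draft, critique)
    return draft

# Stub "models" so the sketch runs without any API; real use would wrap LLM calls.
def toy_reviewer(text: str) -> str:
    return "missing answer key" if "Answer:" not in text else ""

def toy_reviser(text: str, critique: str) -> str:
    return text + "\nAnswer: B"

final = review_loop("Q: Which gas do plants absorb?", toy_reviewer, toy_reviser)
```

The `max_rounds` cap matters: without it, a reviewer that never returns a clean bill would loop forever.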
Good. So when you type something in, there are always layers happening before you ever get a response. Good questions.
And there's a question right up here too. Are you using GPT-4? Is that correct? Yeah. I just ask because it's a little bit more accurate with the information that it has. It is not, generally, a search engine.
So the accuracy is always in question. Correct. Does that make sense? Because it's word prediction. Yes. And, sorry, I don't mean to do your job.
Yeah, me either. So we build a search for Canvas that has all of your content. So we're able to enhance the AI through that product, because we already have access to all of the data that we put into Canvas. And so now we can direct the AI based on the data that we already have.
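Directing the AI with data you already have is the basic retrieval-augmented pattern: search the stored course content first, then hand the best matches to the model as context. A toy sketch, with crude word overlap standing in for a real search index or embeddings:

```python
def retrieve(query: str, passages: list[str], k: int = 2) -> list[str]:
    """Rank stored course passages by crude word overlap with the query.
    A real system would use a proper search index; this just illustrates
    grounding the AI in content you already have."""
    q = set(query.lower().split())
    scored = sorted(passages,
                    key=lambda p: len(q & set(p.lower().split())),
                    reverse=True)
    return scored[:k]

def grounded_prompt(query: str, passages: list[str]) -> str:
    """Build a generation prompt that cites retrieved course content."""
    context = "\n".join(retrieve(query, passages))
    return (f"Using only this course content:\n{context}\n\n"
            f"Write a quiz question about: {query}")

course = [
    "Photosynthesis converts light energy into chemical energy.",
    "The mitochondria is the site of cellular respiration.",
    "Ancient Rome was founded on the Italian peninsula.",
]
p = grounded_prompt("photosynthesis light energy", course)
```

The payoff is the one the panel describes: the model answers from your Canvas content rather than from whatever its training data happens to predict.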
Good questions. Thank you. Oh, go ahead. So as a teacher, I have to create my own content. And this might be a negative question:
how is this gonna save me time? Because it sounds like I'm gonna have to do a lot of additional work rather than just inputting my own content. Well, does your content include assessments? Yes. So this would be where, if you have your content, you can just go ahead and use your content to create assessment items. Right? Say again? Never mind.
It would be easier. And I think it also depends on the quality of question that you want and the cognitive load that you're able to offer. Right? If you can come up with those questions, type them in, and you know what's happening, then yeah, go ahead and type it right into Canvas. When I create assessments, I feel drained after a couple. The cognitive load is high, and it takes me a long time to create different assessment types as well.
And so this, for me personally, helps speed along the assessment process, because I just get burned out. Like, okay, another distractor. What's actually a good distractor? What feedback can I offer that's actually helpful to the student instead of correct, incorrect? It's slightly exhausting. So to be able to just click a couple of buttons, for me, it's helpful to have the assessment there, and then I have a base point, so I can modify it from there and do a lot more, a lot faster. But again, everybody works differently, so it might not be faster for you. Thank you, everybody. We're at time, so you're free to go, but we'll stay up here and take questions.
It's a pretty cool book, The Art of Changing the Brain. It's built as kind of a cognitive science of learning around the biology of the brain. One of the neat things that he does in this book is he takes Kolb's learning cycle, which goes around the brain here: concrete experience, reflective observation, abstract hypotheses, active testing. And he maps that onto the physical brain. So when we have an experience as humans, when we're learning, basically the experience gets registered back here in the sensory and post-sensory area of our brain, in the back.
And then it starts to work its way around, down into the amygdala region, the temporal integrative cortex. And that's where you start to respond emotionally, a little bit immediately, to that experience, and you start to connect it to your immediate surroundings and the way you're feeling. And then the frontal integrative cortex, and I'm not a neuroscientist, so I am quite possibly butchering this.
If you are one, please forgive me if I'm butchering something. But up here in the frontal integrative cortex, this is the control center of your brain, where you're kind of controlling some of those impulses. It's also the logical reasoning center of the brain; that's where you start to form some of these abstract hypotheses around what it is you're experiencing. And then you go up to the premotor and motor part of the brain, where you're actually acting out
your ideas, your thinking. In Kolb's cycle, that's the active testing phase. And then it feeds back, and that becomes the end of the cycle, where you have that concrete experience again, where you can respond and see how that worked out for you. And then it continues to cycle. So we have this section over here about how learning requires cycle completion.
We really like to think of assessment as being an important part of that cycle completion, because if we're sitting up here talking, that's maybe the concrete experience, right? And maybe you get a little bit of that reflective observation, but you need something to complete the cycle, where you do the active testing and get that feedback. Okay, we'll talk through this real quick. Obviously this is nothing new to anybody here.
But we look at testing as both a formative and a summative tool. Testing plays a major role in learning. It's one of the things we're really starting to understand better and better as time goes on, although there have been many studies going back to, like, the nineteen thirties on this idea of retrieval practice. Testing as a learning tool has been shown to work time and time again: they've taken people who just read material versus read and self-tested,
and they've seen that the people who read and self-tested basically stopped forgetting at the point they self-tested. Whereas the people who had just read the content, later on in post-tests, their level of recall was quite a bit lower than the people who had self-tested. So retrieval practice, testing, is important as a learning tool because it interrupts that process of forgetting. And then there are other concepts too, like the idea of spacing.
If you've read the book Make It Stick, it goes into a lot of these things. If you give a little bit of time to start to forget the material, and then you test it, sometimes that can actually make it stick a little bit better in long-term memory. So some of the things we try to do at Utah State: we encourage our faculty, and I don't think we're unique in this, I think instructional design shops around the world are doing this, but we try to encourage our faculty to use varied and interspersed assessment. Do more than just have, you know, a bunch of lectures and two midterms and a final, or one big massive summative test; make it something that happens throughout, and make it something that's mixed up.
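If you wanted to automate that spacing idea, the simplest version is a Leitner-style scheduler: each successful retrieval moves an item into a box with a longer review interval, and a miss resets it for frequent review. The intervals below are made-up placeholders, not anything from the talk:

```python
# Minimal Leitner-style scheduler: correct answers push an item to a box with
# a longer review interval; misses send it back to box 0 for frequent review.
INTERVALS_DAYS = [1, 3, 7, 14, 30]   # assumed spacing schedule, for illustration

def next_review(box: int, correct: bool) -> tuple[int, int]:
    """Return (new_box, days_until_next_review) after one retrieval attempt."""
    box = min(box + 1, len(INTERVALS_DAYS) - 1) if correct else 0
    return box, INTERVALS_DAYS[box]
```

So a student who keeps answering correctly is retested at 1, 3, 7, 14, then 30 days, which is exactly the desirable, widening gap Make It Stick argues for.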
So we see students responding in different ways. I have a couple of examples. This one's a bit notorious in our office, because we have an excellent Latin instructor. May you all have the opportunity to meet him one day. Just go and look up Mark Damen.
He's got stuff online. He's got a website. He has recorded a bunch of stuff. He's just a delight of a person, and he has spent so much time on his classes.
So he's put together in his courses some really well-scripted lectures, and then throughout his whole course, he follows those up with all of these various exercises. And again, we actually transferred these in some cases from, like, Classic Quizzes; if we had actually started in Atomic Assessments, we could have had some of these as drag and drops. But he's built, throughout his entire course, all of these opportunities for people to fill in the blanks, answer questions from the lecture, circle options, put in translations.
And then he has his grammar drills, where he has them identify the parts of speech. I don't know if everything's gonna pull up on the wifi here, but he's just really done a good job of building assessment in and giving students the chance to practice, practice, practice all the way through. I'll go to the one that's loaded here so you can take a picture. And then I have another example here too. This is a stats class, an intro to stats, that goes over the normal curve.
And in this case, what she's done is she's used Atomic Assessments to combine the lesson material on the normal curve and structure it in with practice exercises. I'm totally not familiar with this content, but she's got it all set up so that they can check answers, ta-da, and then continue to try. Yes, I see you have a question back there. How do they embed the assessment? So there are two ways with Atomic Assessments that you can do this.
In this case, what she did was she actually built the content in Atomic Assessments, because it has the ability to plug in passages and videos and images and calculators and rulers and all that, and then to plug in questions and intersperse those. Because what this is using is actually, like, wrapped-up Learnosity, which is what a lot of the publishers are using. So if you ever see your kids do their math homework and it looks like this,
they're using, in a lot of those cases, Learnosity. This is wrapped up and put into Canvas in a way that we can have these interactive activities, but it can also embed into a page. So we do have people who start with their Canvas pages, like the Latin example here: he started with the Canvas page, and then you have a little bit of our CIDI design tools, progress things, and all that other stuff. But then we've embedded the activity into the page as an assignment. Yeah.
This is down here. Yep. So up here is Canvas and Kaltura, and then down here is Atomic Assessments. Okay.
So then, some other principles that we look at: trying to approach assessment thinking like an assessor. And this comes from the book on backward design, Understanding by Design, by Wiggins and McTighe. I've never figured out how that's supposed to be pronounced. It's really about going into our assessment thinking about more than just having something to give them a grade, but actually thinking: what kind of evidence do we need to see that they've met our objectives? What specific characteristics in student responses are we actually looking for that are gonna help us understand that they really are getting it?
And does this really help us to infer that they've learned, beyond maybe just multiple choice? Are we really trying to make our assessment authentic? We spent some time a little while back trying to get a little bit of training. None of us in our office are psychometricians, but we spent some time trying to get a little bit of consulting. Any psychometricians in here? Alright.
Okay. So there actually are standards around assessment. There are these organizations in education, and I don't know what the acronyms up here stand for, but there are some specific standards of educational testing that include validity, reliability, fairness, and security. And so in designing good assessment, especially summative assessment, this specifically applies.
We're looking for validity: does it actually measure what we're intending it to measure? Does it do so consistently, so that if we offer the test to people of different backgrounds, we would get a roughly consistent result, and if we gave it to the same student today and next week, we'd get a roughly consistent result? Is it not biased against people with specific characteristics, is it fair, and is it secured against cheating? And then, in that same consulting, I'm citing some work done here by Caveon,
which I don't know if they're at this conference today, but they are the ones we were talking to for this. They gave us this acronym, tips for test items: they should be congruent; accurate, so the correct answers are correct and we don't have flawed incorrect answers that can be construed as correct; relevant to the material being taught; and appropriately difficult, to differentiate between a student who's knowledgeable and one who's not. So if you've played around in Canvas in the statistics, where they have the differentiation score, we'll pull those up sometimes with faculty, look at those, and say, okay,
which of your assessment items are ones that everybody's getting, or nobody's getting, that don't have this correct differentiation. And then we have an example here. Again, these are Atomic Assessments examples. Our college algebra course, Math 10, has created in their course a bunch of competency-based assessments. So they require students to go through and take these until they've mastered them, to score up to a certain level, and then they're able to move on.
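That differentiation idea has a standard back-of-the-envelope form, the upper/lower-group discrimination index: compare how the strongest and weakest students did on a single item. A rough sketch (the 27% split is a common convention in the measurement literature, not something specific to Canvas):

```python
def discrimination_index(scores: list[tuple[int, float]]) -> float:
    """Upper/lower-27% discrimination index: proportion of top scorers who got
    the item right minus the proportion of bottom scorers who did.
    `scores` pairs each student's item result (1/0) with their total test score."""
    ranked = sorted(scores, key=lambda s: s[1])      # order by total test score
    n = max(1, round(len(ranked) * 0.27))            # size of each comparison group
    lower, upper = ranked[:n], ranked[-n:]

    def prop_correct(group):
        return sum(item for item, _ in group) / len(group)

    return prop_correct(upper) - prop_correct(lower)

# Ten students: strong students got the item right, weak students missed it,
# so this item discriminates well.
data = [(1, 95), (1, 90), (1, 88), (1, 80), (0, 75),
        (1, 70), (0, 60), (0, 55), (0, 50), (0, 40)]
di = discrimination_index(data)
```

Items near 0 (everybody or nobody gets them) are the ones the speakers say they flag with faculty; values near 1 separate knowledgeable students from the rest.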
And so they've done a really good job going in and tagging all of their items, and within all of their items, building out datasets, so that each item can have a different set of variables and a different set of results. And every time they take the exam, it uses the item tagging. I mean, this is a bit crazy here, but they really use item sampling in a big way to pull from their item banks and vary it quite a lot.
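The mechanics being described (pull items by tag from a bank, then render each with a randomly chosen dataset so the numbers vary per attempt) can be sketched like this. The bank layout and field names are illustrative, not the actual Atomic Assessments data model:

```python
import random

def render(template: str, dataset: dict) -> str:
    """Fill an item template's placeholders with one dataset's values."""
    return template.format(**dataset)

def sample_exam(bank, blueprint, seed=None):
    """Draw items per tag according to a blueprint, then render each with a
    randomly chosen dataset so the numbers differ between attempts."""
    rng = random.Random(seed)
    exam = []
    for tag, count in blueprint.items():
        for item in rng.sample(bank[tag], count):
            exam.append(render(item["template"], rng.choice(item["datasets"])))
    return exam

# A tiny illustrative bank: two tagged items, each with variable datasets.
bank = {
    "linear-equations": [
        {"template": "Solve {a}x + {b} = 0.",
         "datasets": [{"a": 2, "b": 4}, {"a": 3, "b": 9}]},
        {"template": "What is the slope of y = {m}x + {c}?",
         "datasets": [{"m": 5, "c": 1}, {"m": -2, "c": 7}]},
    ],
}
exam = sample_exam(bank, {"linear-equations": 2}, seed=42)
```

Two students (or two attempts) get the same blueprint but different numbers, which is what makes repeated competency attempts meaningful.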
Okay, any questions? Because I'm going quick. Okay. And then the other thing, and this again kind of comes from that Make It Stick book, is the idea of desirable difficulty, which I look at as being more in the formative assessment realm. What can we do in our assessments to create that sense that it's hard enough to present a challenge and make you reach a little bit, to interrupt that process of forgetting, but not, like, so hard that it's defeating? And maybe even, if we can, can we make it so that students are motivated to go back and try again?
And so we have another example here, where the teacher's gone through and built, and this is a political science course, his modules, his vocabulary items, his mapping items. He's built all of those into an assessment, so that as they go through the activity, they're assessed as they go and they're learning as they go. But again, they have the opportunity to check their answers multiple times.
And I have another example here of an educational statistics course, where again she's giving them three opportunities to test. And I just think, when I look at this, of my own son, who's now fourteen. He was in eighth grade last year, and there was a game that he did for school that actually had him tag all of the countries in the world, and then all of the states and capitals and things in the country. And he spent probably hours at home on his own, just sitting there, practicing it over and over and over, seeing how fast he could do it, seeing if he could ace it, taking the borders off and seeing if he could do it without the borders. And he didn't have to do it, but it was set up in such a way that it became fun.
And so now his knowledge of world geography, at least for right now, is pretty awesome. Until maybe, you know, political things move the borders. But it's great. He can tell me where any country happens to be. So it's really exciting.
So those are some of the main principles that we use to guide our assessment. Now what we're gonna do is have you go back into the groups that you were grouped into at the beginning, and just discuss some of these principles. Share with each other, maybe, what you're doing at your school to apply these principles, what you're doing in your design. And there is a Jamboard here, so you can either pull it up on your laptop or on your phone; we've got the QR code. See if you can pull together an idea and post it up there.
And then we'll just pull that up and take a look at it. So we'll take about, what, three minutes? Okay. Okay, so we're gonna go ahead and wind the discussions down. We're happy to hear that you're all discussing principles of assessment and how they're being used.
We'll keep the Jamboard open and check it later today as well, if you wanna refer back to that discussion later. If there are questions, you can post them on there; we'll check it again later today and later next week as well. We're gonna go ahead and move on to a case study with Iowa Workforce Development.
So Neil did a fabulous job discussing the principles and how those are utilized in higher education. I wanted to showcase a few examples of how those principles are integrated into adult basic education. So before we go too far into the case study, I wanna do a quick raise of hands: who even knows what adult basic education is? Oh, wonderful. Okay.
Now let's take it a step further: who is an adult basic education provider, or is partnered with an adult education provider at a community college? Great. We had more than I was expecting. So I wanted to give just some quick foundational knowledge of what this is before I dive in, because it's a slightly different world than higher education. Adult basic education provides foundational skills to adult learners who have not completed primary or secondary education.
So they're mostly working with people who haven't completed high school, for a variety of reasons. Another large population right now is refugees. They're coming over with little to no native education either; they may not be able to read or write in their native language. That can be a huge barrier to formative and summative assessments. And then we're working with a variety of age ranges.
So you can have eighteen and up; Iowa had a ninety-two-year-old join the class. And so you're trying to imagine digital literacy and how this is gonna work in Canvas, right? So we partnered with Iowa Workforce Development to develop a statewide distance education curriculum and build in formative assessments along the way, to assist with teacher-led instruction and also allow students to work independently. This was also a struggle: they'd go home and they're like, I don't know what to do now.
I can't even navigate Canvas, let alone go back and forth between discussions, quizzes, and assignments. So we needed to find a way to streamline that so we could get valid formative and summative assessment results. Again, the largest barrier was literacy: being able to actually read the question in order to answer it, and then also just digital literacy skills, navigating in Canvas. And again, it's always difficult to find challenging and novel questions that don't overwhelm people. So I'm gonna go ahead and hop into some of these examples so you can see how we implemented some of these items.
So, for the students, this is a publicly facing course. Let me actually go back. If you wanna access this course, so you can look on your phone or your laptop and follow along more easily, here's the QR code. There's also a link here for the presentation afterwards if you want to access it later.
Okay, I'm gonna go ahead and hop into it now. A couple people are taking pictures. Okay. So you can see that we've kept the learners on one page for the entire virtual class: the pre-class, the post-class, and class time.
So when the student attends class, and this is moving around a little bit, we built out facilitator instructions, because we know this specifically was a reading and writing course, and we know that explicit teacher-led instruction is an important component of successful reading. However, with distance education, working on Zoom and having them navigate was really difficult. So we recommended that they pull up Zoom on their phone; often they'd have a phone and a laptop.
So they pulled up Zoom, and then they would have Canvas on their laptop, and they wouldn't have to leave the page. And it did wonders. They could come here, and on the virtual class assignment, again with an embedded Atomic Assessment in there, the teachers could do the "I do" model first, then the students could move to question number two and do it together at the same time. So it really didn't require high digital literacy skills, or literacy skills, to be able to successfully complete the assignment. During virtual class the students could complete it, and then the teachers could go into the reporting and tracking and see how the students did, to inform instruction right away, so they didn't have to wait.
We also found that some of the seasoned learners were struggling with the all-text pages. So there are accessibility options in here, where they can change the color scheme and also the font size, so they can make the text huge and they can see. So that's not a barrier for them. And after class, they already knew how to navigate to that page because of the instructor; they could just scroll down and complete the post-class activities right on the page.
We're gonna go ahead and skip down to one of these so you can see what it looks like. Again, with the literacy barrier, they can't even read the question, so it's a struggle. So we went ahead and added an audio player option, so they can play the question stem before they respond. If they're beyond the emerging-reader stage and can read it themselves, they don't have to use the support. Some of the other question types that we utilized: a lot of them have reading disabilities, so there's an option to have a line reader.
It makes the text feel not as overwhelming to the students. They can read line by line, or a paragraph at a time, and that was extremely helpful for not being overwhelmed by the text before they even got to the questions and just wanting to give up. Some other options: vocabulary is huge in reading,
and we want to actually test their vocabulary in the assessment, not their reading ability. So again, we've built some assessments out on the page where they can listen to the question stem, then come down here and also listen to the vocabulary word. So we're actually testing vocabulary, not reading ability. Some other novel approaches: again, we don't wanna over-test people, and they get bored with just multiple choice questions and such, so we've built in a lot of different question types.
We've also given them the option to audio record. They really struggled with audio recording in discussions and such, because there were multiple clicks and steps that would lose them. So we went ahead and just added an audio recorder question, so they could practice their speaking abilities. They may not be able to type it, but they could speak it,
and then they're also practicing their English language learning skills as well. Some of these question types were already showcased a little bit by Neil, so we'll skip through some of those. Another item that the students really loved was the opportunity to provide a little bit of scaffolding. So they can highlight in the sentence, you know, the subject and then the verb, but they might have forgotten what that means. So they can go ahead and use the first hint, and if that wasn't helpful enough, they can go ahead and use the second hint.
And the students were really empowered to be able to complete the assignment independently. This was huge for them. Then after the post-class, the teachers could go ahead and look at tracking progress, and they could inform discussion for the next class, or reach out and say, hey, this student needs one-on-one support before we even start the next class, so they can stay caught up, which increases retention as well. So any questions on that? Yeah, especially for the highlighter options:
how does that work with issues like visual impairments? Sure. So the majority of question types, and Joel, correct me if I'm wrong, but token highlight does work with a screen reader. It is 508 compliant and meets WCAG 2.0 standards. So those question types they could technically use with a screen reader that they would have. Good question.
And this is actually open content, so it's available to use. We did enhance it with Atomic Assessments, but you could, you know, move them into quizzes if you wanted. You just kind of lose that; the digital literacy needs to be a little bit higher at that point. Yeah.
And if you have any other questions afterwards, you can go ahead and grab me. I'm gonna skip through just a little bit; I went ahead and put screenshots in here just in case the internet wasn't working for me. So we're gonna go ahead and move on to the authoring aids. We've talked about all of these assessment principles, and you as instructional designers know how to implement them.
But do you have the time to actually do it? That's the question. Also, can you get faculty buy-in? Right? You can teach the faculty about assessment principles, but do they have the time to create it? Even on paper it takes time, but then to migrate it to a digital system, it's really overwhelming for the faculty, and also sometimes for the designer to do it at scale. So I wanted to showcase the authoring aid. This is in beta right now.
It's usable. It's available. So we wanted to show what this looks like. Atomic Assessments has over forty different question types that you can use at your pleasure. The authoring aid is focusing on three right now. So do you guys wanna create an assessment and see how fast it works? Yes.
I am so excited about this, personally, and I hope that you enjoy it just as much. So the three options that we have are multiple choice, true/false, and also a cloze drag and drop. This is just the first part of connecting AI and integrating it with Atomic Assessments, and then it can also translate the items. My mind is baffled by this: it's just a click of a button.
You can just translate it, and it's there. You don't have to copy and paste, go into ChatGPT, come back, and ask, is this correct? So, who wants to be my little guinea pig? Who wants to choose what type of question we use: multiple choice, true/false, or cloze drag and drop? Drag and drop? Okay. Who would like to choose a topic? Is there an assessment that you're currently working on? Anybody got a question right off the top? If not, I'll come up with one. Okay.
Anatomy, it is. So anatomy and then let's focus on lungs. So we just go ahead and pop in a subcategory. If you have some text from your actual content, you can copy and paste that in. And then you can drill down further into a subtopic.
So we're just gonna go ahead and do something that's already in here, which is the pleural space. Let's see what it comes up with. This is helping the teacher, in a scaffolded approach, right, to create principled assessments. Okay, so let's go ahead and pick a difficulty level, or let's actually pull it up.
Okay, a difficulty level. Do you want easy, medium, or hard? Easy. Now, this is one of my favorites: it has the Bloom's taxonomy framework right in here.
So you can select, do you want a knowledge-based question, comprehension, application, analysis? What do we wanna pick? Synthesis. Synthesis, okay. Let's see what it comes up with.
Easy synthesis it is. And then we can pick the number of blanks. Since it's easy, let's just go with, you know, three blanks. We won't blow it out of the water, but let's provide a lot of possible options so we can see what it comes up with. I'm gonna go ahead and generate this response. It's going to take a second because of the awesome wifi here.
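Under the hood, a wizard like this presumably just folds the choices made so far (question type, topic, subtopic, difficulty, Bloom's level, number of blanks) into one generation request. A hypothetical sketch of that assembly step, not the actual implementation:

```python
def authoring_request(qtype: str, topic: str, subtopic: str,
                      difficulty: str, bloom: str, blanks=None) -> str:
    """Fold the wizard's choices into a single generation prompt.
    `blanks` only applies to cloze-style question types."""
    parts = [
        f"Write one {difficulty} {qtype} question",
        f"about {subtopic} (topic: {topic})",
        f"at the {bloom} level of Bloom's taxonomy.",
    ]
    if blanks is not None:
        parts.append(f"Use exactly {blanks} blanks.")
    return " ".join(parts)

# The demo's choices: easy synthesis cloze item on the pleural space, 3 blanks.
req = authoring_request("cloze drag-and-drop", "anatomy", "the pleural space",
                        "easy", "synthesis", blanks=3)
```

The point of the scaffolding is that every one of those clicks becomes an explicit constraint on the model, instead of the teacher hand-writing a prompt each time.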
So while it's doing that, I'm gonna go ahead and do a multiple choice question as well, so you can see what that looks like. Is this generating just one question right now? Yes. Right now the authoring aid is doing one item at a time; eventually they want to expand it to do a full assignment, right, and then allow the teachers those options to go in and make edits as they see fit.
Good question. So let me go ahead and do a similar type question with multiple choice so you can see some of the feedback that happens in here as well. And we'll have this going at the same time baking in that oven while this one's already ready. So you can see, voila, we're done with the question. So you can see that it it pops up a question stem, you can go over to the left hand side right away, and you say, oh, I wanna make some changes.
I wanna add an image. Maybe I wanna change the text. You can change the possible responses if there's something that you don't love or if you wanna take away. You can also, you know, tell the authoring aid, Hey, I don't like this distractor. Can you pick a new one? You can tell it why you don't like it, and it'll populate a different response for you.
And then you, as a teacher, can go ahead and, you know, check things, manipulate it, make sure it's what you want. You can change things on the left-hand side, then you go ahead and save it and you're done with the item. It took literally two clicks. Super great. Now let's hop on over and look at the multiple choice question really quick, because that's done baking as well. So you can see, this is something that I feel takes such a long time, even using ChatGPT.
I'm like, I have to copy and paste so many different things and put it in so many different boxes. So this just streamlines it so it can happen fast and effectively for the teachers. So you can see you can go through here and check. Do I like the distractor? Do I like the incorrect response? Do I like the correct response? And then again, you can just go ahead and save it. You can make adjustments.
It literally takes a couple of clicks and you're done. Any questions? Is that just making your day? Do you wanna see the translation? It's actually really neat, and it's super quick. So, I saved that item. I'm gonna go ahead and just select translate. I'll go into the item itself.
What language do you want? Spanish. So that took one click, two clicks, three clicks. It'll probably take a second to do its translation magic because of the wifi, but we'll see what it looks like in just a second. While we're waiting on that, any questions or discussion points? Yeah, go ahead. That's still being slightly worked out. At the moment, you can go in and beta test it right now within the platform, and then we'll kinda go from there.
Good question. Joel, if you have a better answer, you can answer that. For those asking, I will say this isn't using this, but we have had an instructor at our institution, a CS instructor. In Atomic Jolt, you can actually go in and get the raw JSON for a question, and you can use that if you want to manipulate it and kinda make custom question types and things.
But he went in, copied the raw JSON, plugged it into ChatGPT, and said, here's the format. Can you now create questions for me in this format? It turned out the questions in that JSON, he copied and pasted that JSON back in, and he got all the distractors and everything in one fell swoop.
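As an aside for readers, the instructor's trick just described, copy an item's raw JSON, ask a model for new items in the same shape, paste the result back, can be sketched roughly like this. The template fields below are invented for illustration, not Atomic Assessments' actual schema, and the model call itself is left out:

```python
import json

# Hypothetical question template, standing in for the raw JSON you
# would copy out of an existing item. (Field names are invented.)
TEMPLATE = {
    "stem": "Which structure forms the floor of the pleural space?",
    "options": ["Diaphragm", "Pericardium", "Mediastinum", "Peritoneum"],
    "answer": 0,
}

def build_prompt(template: dict, topic: str, n: int) -> str:
    """Ask the model for n new questions matching the template's JSON shape."""
    return (
        "Here is the JSON format for one of my quiz questions:\n"
        + json.dumps(template, indent=2)
        + f"\nCreate {n} new questions about '{topic}' as a JSON list, "
        "using exactly the same keys."
    )

def valid_item(item: dict, template: dict) -> bool:
    """Sanity-check a generated item before pasting it back into the tool."""
    return (
        set(item) == set(template)
        and isinstance(item.get("options"), list)
        and isinstance(item.get("answer"), int)
        and 0 <= item["answer"] < len(item["options"])
    )

prompt = build_prompt(TEMPLATE, "the pleural space", 3)

# An item in the wrong shape is caught before it reaches the quiz:
bad = {"stem": "Example stem?"}
good = {"stem": "Example stem?", "options": ["A", "B"], "answer": 1}
print(valid_item(bad, TEMPLATE), valid_item(good, TEMPLATE))
```

The shape check matters because, as the speakers note below, a raw model is a word predictor: it will sometimes return JSON that almost, but not quite, matches the format you showed it.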
So it's something you can do. One of our CS guys did it. Other questions? Yeah, go ahead. Where do you draw these questions from? Like, what resources are you using to come up with these questions? So this is being integrated with AI.
Specifically ChatGPT. And so it's basically honing in and helping a teacher focus on what needs to happen in order to create an assessment, a principle-based assessment. And then we'll be able to take it a step further. To play devil's advocate a little bit here, do we know that these questions meet the standards of validity, reliability, fairness, security? That's where we have to be good human beings.
That's why it's called an authoring aid. That's my issue with using ChatGPT to create assessments. Anyway. Yeah.
Oh, I totally understand. Yeah. It comes with all the caveats. Let's go ahead and have Justin pop on in here. So, first topic: we have a service product.
And we're also in this community. So when you, okay. I mean, I have a million other questions, and I'm not gonna, you know, monopolize all the time here, but I was just wondering that. Just kind of talking to myself.
Sure. Why don't you go ahead and stay after, and we'd love to chat. Just another brief response to that: people are taking, like, multiple generative models, getting the output of one, and then asking ChatGPT, essentially, is this valid, or is this accurate? Gotcha. And you can actually get better results from combining multiple models, which is great. Interesting. Yeah.
Typically, especially for student interactions, you don't wanna do single-shot interactions with your AI. Single-shot is typically what you're gonna get out of the box, since you're gonna have just one iteration. Instead, the AI can act as the researcher to check for accuracy.
So you run it through all of those layers before you have the final version. And so is that what Atomic Jolt is gonna do? Because teachers aren't gonna do that; they don't know how. Yeah, this is all behind the scenes. Okay.
Good. So you're gonna type something in, let's just say, whatever it is, and there are always layers happening before you ever get a response. Good questions.
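For illustration, the generate-then-review layering the speakers describe could be sketched as follows. This is a generic stub, not Atomic Jolt's actual pipeline; both "models" here are placeholder functions standing in for separate LLM calls:

```python
# Generic generate-then-review loop: one "model" drafts an item,
# another reviews it, all hidden behind the scenes before the
# teacher sees anything. Both models are stubbed for illustration.

def draft_item(topic: str) -> dict:
    # Stub for the drafting model.
    return {"stem": f"What is the {topic}?", "reviewed": False}

def review_item(item: dict) -> tuple[bool, str]:
    # Stub for the reviewer model. A real reviewer call would be
    # prompted to check clarity, difficulty, and factual accuracy.
    ok = item["stem"].endswith("?")
    return ok, "" if ok else "Phrase the stem as a question."

def authoring_pipeline(topic: str, max_rounds: int = 3) -> dict:
    """Run the hidden draft/review layers before anything reaches the teacher."""
    item = draft_item(topic)
    for _ in range(max_rounds):
        ok, feedback = review_item(item)
        if ok:
            item["reviewed"] = True
            break
        # A real pipeline would re-prompt the drafter with the feedback here.
    return item

print(authoring_pipeline("pleural space")["reviewed"])
```

The point of the loop is exactly what the audience member raised: a single-shot response goes out unchecked, while a reviewer pass gives the system at least one chance to catch a bad item before the final version.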
And there's a question right up here too. Are you using GPT-4? Is that correct? Yeah. I ask because it's a little bit more accurate with the information that it has. It is not a search engine, though.
So the accuracy is always in question. Correct. Does that make sense? Because it's doing word prediction. Yes. And, sorry, I don't mean to step on your job.
Yeah, me either. So we build a search for Canvas that has all of your content. So we're able to enhance the AI through that, because we already have access to all of the data that you put into Canvas. And so now we can direct the AI based on the data that we already have.
Good questions. Thank you. There's, oh, go ahead. So as a teacher, I have to create my own content. And this might be a negative question:
how is this going to save me time? Because it sounds like I'm gonna have to do a lot of additional work rather than just inputting my own content. Well, does your content include assessments? Yes. So this would be where, if you have your content, you can just go ahead and use your content to create assessment items. Right? Say again? Never mind. Never mind.
It would be easier. And I think it also depends on the quality of question that you want and the cognitive bandwidth that you're able to offer. Right? If you can come up with those questions and type them in, and you know what's happening, and you're good to go, then, yeah, go ahead and type it right into Canvas. When I create assessments, I feel drained after a couple. The cognitive load is high, and it takes me a long time to create different assessment types as well.
And so this, for me personally, helps just speed along the assessment process, because I get burned out, like, okay, another distractor. What's actually a good distractor? What feedback can I offer that's actually helpful to the student instead of just correct or incorrect? It's slightly exhausting. So to be able to just click a couple of buttons, for me, it's helpful to just have the assessment there. Then I have a base point, so I can modify it from there, and I can do a lot more, a lot faster. But again, everybody works differently, so it might not be faster for you. Thank you, everybody. We're at time, so you're free to go, but we'll stay up here and take questions one on one.