How Districts Bring EdTech Evidence To Budget Conversations


Every day, district administrators research, vet, budget for, and procure digital learning resources for their students and teachers. Over the past few years, with education technology use becoming more pervasive, answering the question of what’s working has become critical.

In this webinar, Bruce Neff (Oak Grove School District, California) and Wade Buchs (School City of Mishawaka, Indiana) discussed how they use evidence to inform and improve their edtech decisions and budgets.

Video Transcript
Go ahead and get started. Hello, everyone. Thank you for joining us today. My name is Selma Ibrahim, and I'm going to be moderating our session. Today we're going to be discussing how districts bring edtech evidence to budget conversations.

And we have two guests with us here today who are going to be talking about their experience in their districts, so I'm looking forward to that. Before we get started, just a couple of notes. Yes, we are recording this webinar; a link to this webinar will be sent to all of you within seventy-two hours, so be on the lookout for that.

We're also going to have some time for Q&A at the end, so we encourage you to submit your questions and comments along the way. You do not need to wait till the end; we'll pick those up as we go through our presentation today. Alright. So, just a quick overview of LearnPlatform and what we do.

LearnPlatform helps K-12 stakeholders generate evidence and drive efficiency with that evidence to inform their decision making, and that's really central to our conversation here today as well. We do this on different levels: we work directly with districts, and we partner with states and education service agencies.

We also work with providers, offering third-party evaluation and support as they build evidence as well. And a little bit of context to lead us into this conversation: we have been tracking edtech usage and engagement since 2018. As you can see, that number had been steadily rising, but with school closures around March of 2020, we saw it grow very quickly.

That growth has continued and been consistent even beyond the pandemic. So this really opens up the question: with all of these tools out there and being used within our school districts, how many of them are working, and how many are positively impacting our students, so that we can communicate that information and drive continuous improvement in learning? That's just a little bit of context as we walk into the conversation today. We have two great panelists with us here, and I'm going to allow them to introduce themselves. We'll start with you, Wade, and go from there.

Awesome. Thank you, Selma. My name is Wade Buchs. I am the curriculum integration specialist here at School City of Mishawaka. We're located in northern Indiana, about halfway between Fort Wayne and Chicago, which, depending on where you are, may mean a lot or may not mean much at all.

But we're basically smack dab in the middle of the state, about twenty minutes from Michigan. And again, I'll talk a little more about our district itself, but we're more of an urban district, kind of outside the suburbs of the Chicago area.

And Bruce?

Yeah, thanks, Wade. Bruce Neff, a tech specialist at Oak Grove School District in San Jose, California.

We are a K-8 district, we've been with LearnPlatform for a little over a year now, and we're excited to continue our journey with them.

Okay, thank you both. We'll start our conversation here shortly. But before we do, I'm going to allow our presenters to share a little bit more about their journey in evaluating their edtech products; they have a couple of slides to walk through as well.

Yeah, awesome. If you want to go to the next slide, I can go ahead and get started here. As I mentioned, School City of Mishawaka is what I would consider a moderate-sized district in Indiana. We service just over five thousand students, K-12.

And we are a very urban district in a very rural area. We have two very large, unified suburban-rural districts around us, and then we are a completely urbanized district in the city of Mishawaka. I joined School City of Mishawaka three years ago, when my family and I relocated to the area from a different district in Indiana. At that other district, LearnPlatform was just becoming big about four years ago; we were starting to hear more about it.

I was responsible for bringing LearnPlatform to that district, and then I relocated to School City of Mishawaka. SCM had just purchased LearnPlatform and had been using the product and the services in 2017-18 and 2018-19 before I moved here; I arrived in 2019-20. So that first year, the pandemic year, was my first year on the job. One of the interesting things that I was, and continue to be, responsible for is supporting teachers when it comes to usage of products.

Those products can be the curriculum products we purchase, any OERs (open educational resources) we use, and also edtech. Part of my job is looking at edtech usage and supporting the student interaction with those tools, but also what the teacher side should look like. Naturally, that leads to me analyzing the platforms and figuring out what's working and what's not. The nice part is that because I get to see the big picture of curriculum, assessment, and edtech, I get to see what impact our products and our assessment platforms have as our students make their journey, year after year, through our district. LearnPlatform has been a great asset in that sense: the impact reporting we've been using has really helped us dive deeper into researching and figuring out what is making a difference and what isn't.

So, just to show some examples: one of the most impactful things, no pun intended, that we've used has been impact reporting, which focuses on edtech usage and comparisons with either our district assessments or statewide assessments. One of our biggest focuses has been to look not just at an internal measure of progress on the platform, but also at the progress in comparison, the correlation, between the edtech platform and the assessment that goes alongside it. This first slide shows an early example from my first year of what it looks like to start making some comparisons. We had used Reading Eggs, a product from Edmentum, with our K-3 students, and a K-2 assessment of phonological awareness called PALS, based out of Virginia, to figure out, based on assessment scores and usage, whether students grew on the assessment if they used the product more. What I love about Impact is the way it breaks it down.

I have a nice, clean chart here, and it shows me: we don't know. What was nice is I love the way it's labeled, "examine other factors." Either we can say it works for all kids or not for all kids, but really a lot of our results live in that gray area, or in this case that yellow area. So LearnPlatform helped us start breaking it down by grade.

Look at your kindergarten and your first and second grades: do we see more impact at a particular grade level? That started leading to conversations about the product being more impactful at kindergarten. Selma, will you go to the next slide for me? That led us to digging deeper. We looked at more grade levels and more things, and this time we looked at i-Ready online instruction. What this drove us to was a deeper conversation about what minutes mean, and what lessons completed, actual progress on the platform, means. So we started to run comparisons there.
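To make the kind of comparison Wade describes concrete, here is a minimal sketch in Python of bucketing students by platform usage and comparing diagnostic growth by grade. The file name, column names, and minute thresholds are illustrative assumptions, not LearnPlatform's actual schema or methodology.

```python
import pandas as pd

# One row per student: platform usage plus fall/spring diagnostic
# scores. The file and every column name here are hypothetical.
df = pd.read_csv("usage_and_diagnostic.csv")

# Growth on the diagnostic from fall to spring.
df["growth"] = df["spring_score"] - df["fall_score"]

# Bucket students by average weekly minutes on the platform. The
# 15- and 30-minute cut points are assumed targets, not vendor guidance.
df["usage_group"] = pd.cut(
    df["weekly_minutes"],
    bins=[0, 15, 30, float("inf")],
    labels=["low", "moderate", "met_target"],
)

# Average growth by grade and usage group: the same slicing that
# surfaced a stronger kindergarten signal in the reports Wade mentions.
summary = (
    df.groupby(["grade", "usage_group"], observed=True)["growth"]
      .agg(["mean", "count"])
      .round(1)
)
print(summary)
```

A table like this only shows association, not causation, which is why ambiguous results get an "examine other factors" label rather than a verdict.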

And the best part about looking at this data over a continuum of years is that LearnPlatform has been housing our data; all of our impact reports are there. This being the third year of pulling data, we're able to see consistencies, not just among cohorts but among grade levels, for online tools and edtech. What we've seen specifically in this example: our curriculum for mathematics is Ready Math, and we use i-Ready online instruction. Our kindergarten, first, second, and third grades, even fourth a little bit, have always shown really good growth on their diagnostic when they do more minutes and complete more lessons on the platform.

But we've noticed dips in the past three years with the fifth, sixth, and seventh grades, so we've started investigating what that might look like. It's been nice to have all those things in one place. Do you want to go to the next slide, please, Selma? So we broke it down even more and even more, working with our CSM to ask more questions.

Right? Like: what demographics can I really dive into? What is the marker of a good study? Those services, working with LearnPlatform, have been very helpful to me. I'm a one-man show: I service curriculum and instruction for the entire district. I'm one of four in my department, but realistically, especially in the COVID years, it's one of two, because people get pulled into other things like school services. So being able to bounce ideas off people at the company, and really dive into the best impact work we can do, has been very helpful over the past three years.

So you see the different categories we've built in there. It wasn't just grade level; it was gender, ethnicity, the breakdown by schools, looking at Section 504 and exceptional learners, which is really a special ed population, and looking at high ability. All those types of things have started to play a role in the way we service students with our edtech. And this three-year journey, next slide, Selma, has led us to having these deep, rich conversations with our principals.

We're wrapping up our middle-of-the-year (MOY) meetings right now with our building admin, and I'm able to go in with a lot of data; we spend anywhere from an hour and a half to two hours looking at just one or two of the edtech platforms and products we have. That journey has been supported, obviously, by LearnPlatform and by the data we can quickly get. Creating this material on a whim and providing it isn't something I could do on my own; I understand the idea behind it, but the product and the support to do it were something we lacked. Now we've made data very viewable and very adjustable for principals, for admin, for teachers, and it naturally helps drive our conversations. In this example, I pulled up a graph where we're looking at Reading Eggs minutes, but looking at the impact amongst teachers and buildings. Not to share with the public, obviously, which is why I cut the names out, but to have a conversation: are there certain teachers getting more impact with the product? Why might that be? Let's take a look at usage minutes in your building.

It seems like these teachers are using it more but getting less out of it; can we reallocate those instructional minutes to something else? This is not something I'd be able to easily visualize on my own, so being able to distill it has been a huge boon. I wanted to show those slides as part of our three-year journey, because really early on it was, hey, what can we even do with the product? But this has transitioned into some really deep and rich conversations, which is the whole point of data conversations: having tangible, usable, visible data that we can talk about.

Thank you, Wade. That's really helpful.

We'll dig into a little bit more of those processes and what led you to where you are in a moment. But before I do, I'm going to kick it over to Bruce.

Great. Yeah, thanks. Thanks, Wade.

It's great to hear the perspective of someone who's been on this journey for a number of years. Here in Oak Grove, like I mentioned, we just began this journey a little over a year ago and got into our first impact reports at the beginning of this year, a few months into this academic year. Again, we're a K-8 district, just under ten thousand students, with a high second-language-learner population, a demographic of students that's very typical for California as a whole.

When we initially engaged with LearnPlatform, we came from an angle of student privacy concerns with a lot of the tools that are out there, making sure the things our students access are safe, and they've been very helpful with that. But then we stumbled upon this wonderful gem of impact reports and rapid cycle evaluations, where we can actually use this tool to see the value of those different tech products. So we decided we would see what this is about by selecting two specific products: one that is purchased at the district level, to do a deep dive on that, as well as one that is traditionally purchased at the site level.

I've been in education for twenty years; I started off as a fourth grade teacher. One of the products we used was Renaissance Learning's product, Accelerated Reader, and that was a site-purchased product. As a teacher, I always kind of felt: yeah.

This is very effective. It's a good tool. It's going to help my kids. But there was no way to really see that at a larger scale, to see how it was directly impacting my students, or whether it was a worthwhile investment for my school at the time. So I was really curious how this was going to go, because we had been using it for a very long time.

At the district level, we selected a product we use for our interim assessments. The proposition is that it correlates directly with predictors of how you're going to perform on your state testing, and we've been using it for a number of years on the idea that, yeah, this should work. But a lot of that information tends to come from the vendors, as opposed to what's really going on, and since there are so many variables involved, how can you really see it? So once we saw this opportunity to look at some objective data, it became very interesting.

So we said, yeah, let's look at these two things and see what this does, not really sure what it was going to show. If you can go ahead and go on to the next slide there. The district-wide product is used not only for our formative assessments but also as a supplement for English language arts; the expectation, or the assumption, I might say, was that it would be used in the classroom setting to supplement core curriculum. And me being a teacher, and my wife is actually a teacher in my district too, and my kids went through the district, there's a big difference we find between the perception of what should be happening in the classroom versus what is actually happening.

So really our big focus here was: are they actually using it? That was always the big question. When we looked at the district product, we could see that, yeah, there's some usage, and the sample size was good. For target usage we had a number to go off of, but that was based on what we felt they should be doing, and the way these numbers came out was still, in my opinion, kind of low. When I started to break down this data more, not necessarily shown here: since we are using this particular product for the benchmark assessments, I began to think to myself, how long does it take students to take those benchmark assessments, and if you scale that up, how much of that usage time is actually attributable to the required testing? And a big aha came into play here.

I said, well, it seems to me that they're not using it nearly as much outside of those required moments. So this is something I'll be bringing up during discussions in future meetings at the district level. On the right-hand side is the individual product, the Accelerated Reader product; again, I had a personal interest in how that was going to play out. It's also a very popular one that had become kind of an automatic, purchase it, purchase it, purchase it, without consideration of whether it's really doing what we think it is. You'll notice it doesn't really give us much: well, what happened there? What was the overall effect? This was a big aha for us too, because this was our first go at this. I came to realize quickly that we didn't have a target usage, because this is a site-level decision. And if you were to ask the teachers, I myself, being a fourth grade teacher, had an expectation, but even my teammate across the hall wouldn't have the same number.

So it comes down to understanding what that target might be, and maybe even holding our vendors accountable: you say this is an effective tool; what is the dosage that will make it help us out? That was a big aha for me. But go ahead and go to the next slide for me, please. Looking at some of these things again through the stories that come through the teachers and the students, my own kids and things like that: I go in and work with students all the time and hear how they feel about these things. Do you like this thing? Do you use it a lot? There were assumptions I was making.

I thought, well, it's probably very popular early on. You can see on the left there that the district-wide product was, yeah, very popular early on, but as you go up the grades, it diminishes quite rapidly. And that directly correlated with the story my kids would tell: with this particular one, it gets stale, it just doesn't keep my interest as much, and you can see that shown here.

Another big aha was in terms of our usage goal. When I crunched the numbers, and I apologize that it's a bit small, only six percent of students met what our expectation for usage would have been outside of those benchmark tests, which in my opinion isn't very much. Even accounting for the students who are very motivated and are going to want to do everything, that still felt low to me. So that's another interesting conversation, and this particular district product is a large investment for our district. It will be interesting to see how that goes, and to bring this objective data to the table and say: perhaps we should start looking at other things that may meet these goals.

And get a little more bang for our buck out of that. Go ahead and go to the next slide; I'm not sure if I have another one left. But yeah, there are some interesting questions that I think will come up during discussions around this. If nothing else, it really brought to light some assumptions I had made, and moved me from a subjective viewpoint, or the stories being told, to objective data that supports what's going on, so we can make decisions around that.
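As a rough illustration of the two checks Bruce describes, netting required benchmark-testing time out of total platform minutes and then computing the share of students who met a usage goal, here is a back-of-the-envelope sketch. Every file name, column, and number in it is an assumption for illustration, not a figure from Oak Grove or any vendor.

```python
import pandas as pd

# One row per student with total minutes logged on the district
# product for the year. File and column names are hypothetical.
df = pd.read_csv("district_product_usage.csv")

# Assumptions standing in for real district numbers: three benchmark
# windows per year at roughly 45 minutes per administration.
BENCHMARK_WINDOWS = 3
MINUTES_PER_BENCHMARK = 45
required_minutes = BENCHMARK_WINDOWS * MINUTES_PER_BENCHMARK

# Minutes left after netting out required testing: a rough proxy
# for voluntary, supplemental use of the product.
df["supplemental_minutes"] = (df["total_minutes"] - required_minutes).clip(lower=0)

# Share of students whose supplemental use met an assumed yearly
# goal (600 minutes is illustrative, not a vendor recommendation).
USAGE_GOAL_MINUTES = 600
met_goal = (df["supplemental_minutes"] >= USAGE_GOAL_MINUTES).mean()
print(f"Students meeting the supplemental-usage goal: {met_goal:.0%}")
```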

Yep. Thanks.

Thank you, Bruce. That makes a lot of sense. Alright.

Well, first of all, just a quick reminder to everyone to drop your questions in the Q&A so we can get to those later in the discussion today. I'm also going to stop sharing my screen so you can see our wonderful panelists as we start this discussion. The first thing I wanted to touch on, you've both really already touched on, so I'm just going to highlight it and give you the opportunity to add anything else: understanding the types of evaluations you ran, what types of products you focused on, and what your goals were in focusing on those products.

Sure, yeah, I can follow up on that pretty quickly, because I think I touched on it. Again: commonly used products, and a lot of trying to challenge the perceptions.

That was my goal. There's a lot of perception. Oftentimes you hear a disconnect: decisions may be made at a district level, but is that really what the expectation is down at the classroom level with the teachers, and down at the level of the student experience? There's perception, and then there's having something that tests it against objective data.

It was like, wow, this is fantastic. It really makes me excited about using this process again with other educational tools to see whether they tell that same story. So it's been really good.

And just to follow up on that, a lot of our drive has been that anything with a dollar sign attached to it is something that has questions around it.

An interesting thing about our district: I don't know how districts in other states are funded, but ours is funded through ADM, so student count matters, and funding also gets determined based on the assessed value of a district. Being a very odd district, extremely urban but very residential, our assessed value is super low. So the money that comes into our district is very low, and what happens is we have to do referendums every five years.

We have to go back to the people and say: we need more money. It's just the nature of the beast with the way funding works. But you find yourself very quickly needing to justify things to people who are going: what are you talking about? What do you need more money for? Let me explain to you what that is. And I would argue that most districts have someone who has to have those conversations. That person's not me.

I'm just responsible for providing the information and the data to paint a clear picture of what it is we need. So a lot of my work has to do with turning over every stone and saying: here's exactly what's going on; let's paint a very clear picture. Most of the time it shows exactly what we said it would, and people say: that's perfect. Good.

Thanks, I just wanted it to be clear. There have been a few times where we felt a stone should have been turned over but we had left it unturned; we had hard data on why we're doing what we're doing. But the nice part is that with the referendum, we're held accountable for every dollar we spend, every five years, with a vote. So ours has been very driven by that, but the byproduct is that we get to talk about hard educational decisions with teachers.

They may not live in the community and may not have that stake in it, but they feel strongly about what's best. And I think Bruce hit on that feeling piece. People obviously don't get into education for the money, or for any reason other than their passion for what they do.

Those feelings have led us to a lot of great things, but they can also present barriers if people get attached to something. So being able to have a very clear conversation is extremely important, and what we liked about LearnPlatform is the ability to paint that clear picture quickly.

I want to come back to the notion of challenging perceptions, and of coming back to teachers with hard data.

We'll revisit that in a moment. But since I think it really relates to what you've just shared: can you tell us a little about what evaluation looked like before you started running RCEs, and what has changed for you with that shift?

Yeah, sure. Like I said, we're very early on this journey; we're really just starting to get into this process of doing the RCEs, the rapid cycle evaluations.

I'm excited about the potential that will bring us. But in the past, it had always been just word of mouth, really, that sort of subjective story: the squeaky-wheel-gets-the-grease mentality, or the that's-the-way-we've-always-done-it mentality, or perhaps: I went to a conference and they say this is going to be the thing, and they have some sort of research, but it isn't based on my data, on what I need. And that's the big separating factor here.

It's maybe their efficacy report, based on the ivory-tower theory that it should work for this demographic. But does it really work for us? That's the excitement here: when we run these reports, it's about us, not about some group of people out there. So, yeah, it was really just staff meetings. In our district, like I mentioned before, up until recent times a lot of the purchasing came at the site level, so it was the site administration or the teachers who would get out there and say: yes, this product is the best, because I've used it forever.

I don't know what else is out there, and so that's just what perpetuates the cycle. But now we can look at it: is it really doing what we think it's doing? That's a big part of it. So now I'm getting to pivot.

I'm excited about the potential. For me, my conversations will begin during this budget year, in about another month or so, and we'll really see how that plays out. I think Wade can probably speak a little more to his experience, because he's been down this road a few times already.

Yeah. I saw in the questions that we had a question about RCEs, so I can address rapid cycle evaluations in my answer here. Like Bruce said, we've been doing this for a few years now. I came in in the middle of a referendum, in the sense that we were in the middle of a five-year period. So I didn't have to worry early on; things were happy, rainbows and sunshine, the referendum had passed, and we were able to keep moving while the money was flowing.

But as we approach another one, you're planning two years out to start having some conversations, so you need to start laying the groundwork now for why we need what we need. When Selma asked what this looked like before: oddly enough, we all have our pet passions, and my background happens to be in statistics and analysis.

I have a math degree, and that's what I taught before I got into this kind of role. So I liked thinking about those types of questions and asking them, but I wasn't really good at presenting the results in a digestible way for people to see; I could kind of cobble some things together. Early on, it was me and Google Sheets, hitting buttons and hoping. If you're familiar with Google Sheets, there's an Explore button at the bottom, and you're just hitting it, hoping it's going to give you something that looks halfway decent that you can use. What really helped were the conversations between my CSM and me at the time, a different individual a few years ago, who helped me get connected with more people who had developed impact reporting at LearnPlatform.

A lot of conversations with people who have really good experience. I had the background; I just couldn't get to the end goal, and they helped me ask specific questions. There were some tools I went through with the platform for understanding what it is I'm trying to do: am I looking at a comparative study, where I'm comparing two groups, or a correlational study, where I have one group and I'm looking at a metric? That really helped identify what I was trying to accomplish. So early on it was a lot of fumbling, and I thought I had really good answers, or at least decent answers.

It turns out they were okay, but we don't have time for fumbling in this world. It's very fast-paced, and we have a lot going on. As thin as classroom resources are, people like Bruce and myself are stretched even thinner, in the sense that I train teachers on how to teach.

I'm an instructional coach as well, so I don't have the time to develop decent-looking reports and really data-mine. The platform helps me do that; it takes that burden off of me and says: look, we can do this part for you. The interpretation is what you need to focus on.

Let us help you understand what we need and what you need to give us, then let us do the lift, and you do the fun work of the analysis later. So, hopefully, I've addressed rapid cycle evaluation: it's basically doing that often, because otherwise, think about a statewide assessment. At the end of the year you get your data, and then the kids leave you, and you have to make decisions based on that data. Rapid cycle is more of a rhythm; our rhythm happens to be beginning, middle, and end of year.

With some of our assessment data, we can actually do it more frequently. With some of our intervention work, we can do it almost on two-week cycles. So we have a two-, three-, or four-week-cycle group where we're looking at data, processing their assessments, and seeing if they're growing based on the amount of time they're putting in. In my mind, that's the way we use rapid cycle: a more regimented, rhythmic way of looking at data and interpreting it, while knowing what we need to have in place to make those things happen.

So, I don't know, Selma, Bruce, do you want to expand on that a little bit? But that's my interpretation of rapid cycle: don't wait till the end; let's do it now. Then, if we implement a fix or something to look at, we're able to see later on whether that fix actually did something.

Yeah, just to close on that: I guess my analogy for rapid cycle evaluations is your formative assessment versus your summative.

We take our benchmarks; hopefully that's a predictor. We look at that data, we analyze, we make some adjustments, and we do it again.

And this affords that opportunity, because a lot of the data collection, the nuance of the numbers, the crunching, goes into making the product come out digestible for the layman, not just for someone who has a minor in math, which I do as well; it's difficult to make that material come out in a way people can understand clearly. LearnPlatform is so effective at pulling that together, with a lot of better tools on the tech side of things, and then they generate a report for you that is understandable, and that's key. And you can do it rapidly; that's the keyword, rapid. And it's a cycle, circular. Keep doing it.

Keep doing it.

Yeah, that's exactly right. We're going to have to hire you on to explain our rapid cycle evaluations. It's really about generating, yes, that practical, relevant evidence to help you make data-informed decisions, and doing that iteratively so that you can continue to improve.

And really, we're focusing on understanding what tools are working for which students and teachers, in what context. So we're looking at it a little more holistically than just, is it working or not? But good question. Alright, let's lay it all on the table: with respect to rapid cycle evaluation, have you found the process easier or harder than you expected?

Like I said, for me, I'm actually still working toward running these myself, but the support from LearnPlatform when we ran that first cycle was fantastic.

They were very supportive, asking a lot of good questions that were relevant to us, a lot of questions I wasn't really clear on myself.

Even just going through that process of learning it gave me a lot of those aha moments too: wow, we don't really look closely enough at these points, and we really need to pay attention, because that does make a big difference. So, again, since I'm still early on, I'm going to let Wade speak to this, because he's gone through it numerous times; I'm mostly just excited about doing more of it.

Yeah. I forget the name of the curve, or of the little stages, but early on, when you don't know how to do something and you're trying to do it anyway, that's "unconsciously incompetent," and that's where we landed. So when you ask whether it was easier or harder: I didn't really know what I was getting into, so I didn't know how hard it was going to be.

What happened was, getting started wasn't bad. But once we started, through my conversations with LearnPlatform, opening things up: have you thought about this? What about this? Have you gone this route? That was where it started becoming more complicated, but more exciting, because we had a thought partner to work through it with. And yes, we had opened up more options and there was more work to be done, but we had tangible ways to work toward it. I think that was the biggest thing.

So it was more, I think, than what we realized. Bruce mentioned it earlier, and I think you hinted at it when you explained rapid cycle: there are so many things to focus on. Whether it's, hey, thirty minutes a week and this kid will grow: what does that mean, though? What kind of thirty minutes? Is it thirty minutes in the classroom or thirty minutes at home? What kid are we talking about? Because I don't believe all kids are the same. What kind of background conversations did you have? What grade levels? I think all those pieces play a role.

Maybe in the aggregate it makes a difference, but I don't think we were ready as a district to ask those deep questions, especially when the community was demanding it. So this has really pushed us. There's never been a time where I've been stuck.

I think that's the important thing: we've always had a tangible next step to work toward. So if you're the person who wants an end goal, I don't think this is going to provide you with an end goal, but it's going to provide you with something that will benefit you moving forward. And I can say that after almost three years now of working and partnering with LearnPlatform.

Oh, that's great. Thank you.

So, Wade, I know you've been doing this for a while. As your evaluations have progressed from where you started to where you are now, how have the things you've been considering evolved, and how has your perspective changed?

Yeah. There's no magic bullet that's going to fix anything, and I think that's the one thing we've learned. I love the chart we looked at at the beginning.

You had it on your slide, and I have it printed off somewhere: the huge uptick in edtech when we went into COVID. Right? Edtech wasn't going to solve everything. Most of that was probably a Zoom or a Google Meet or something; we were using more of a platform to reach an audience. Our kids were back in school almost a hundred percent in Indiana starting in September.

So edtech only became more of a tool, because the question was: how do we keep kids seated and engaged while not interacting with others? There's no magic bullet that fixes that. Every kid is unique; every kid is different. And you are actually being irresponsible if you know every kid is different and you're not diving into small cohorts of kids. We try not to, you never want to, look at a kid as a bucket or a label. But the kid moves.

Right? I might have a student with low achievement on this assessment but high achievement on that assessment, on the same platform. Why might that be? We're responsible for making those calls and for digging deeper into that. So I think what we've learned as time has gone on is that we're responsible for figuring out how to take that one-size-fits-all and make it fit every unique piece we have in the district, the unique piece being the student. And, as Bruce mentioned earlier, and this is not a knock on any edtech product, their job is to market.

And it's true: the data shows that in the aggregate the product works. We're responsible for figuring out how it works for us. We're not talking about millions of kids; we're talking about thousands, or hundreds, of kids. So what can we do to help this product work better? We know it's helpful; a lot of it is the conversation around the product. If anything, this has gotten us really close with our edtech providers. They want us to keep them, and they want to stay in our classrooms.

I'm extremely close with all of our reps, because we're the ones communicating with them about what data we need: hey, it's not working for this kid. I show them impact reports all the time. So the cool thing is, this isn't just seen by our district; it's seen by our reps.

I'm not looking for a discount; I'm looking for an answer as to why this may be happening. It's really opened up a lot of conversations with edtech providers: what are you seeing in your classrooms with the use of our product, and why might that be? Now we get better answers, and we can talk about what support from the company might look like, to provide to our teachers, to help them use the product better and better understand those types of things. So if nothing else, it's just opened more conversation.

Alright.

Well, I'm going to ask just a couple more things, and then we'll jump over; I think we have a question. I'm going to revisit challenging perceptions and communicating with teachers here. What are your plans for sharing and applying your findings, or how have you shared and applied them, both from the teacher perspective and in terms of driving your budget decisions? How do you communicate that information out, and where does it go from there?

Yeah. For us, I'm really excited about the budget meetings where decisions are being made at the district level. We have to do a lot of LCAP planning for the funding we get here in California.

Decisions are made around these big products, and being able to point at the effectiveness of a tool, making these decisions not based on tradition but on objective data, and being able to break out of that comfort zone and challenge ourselves to ask why something is not working for a particular demographic, and having the tool to break that down, is just so wonderful. The lift to manipulate that data isn't that difficult, because of the way LearnPlatform can integrate with our systems and the data that's out there. So it's not like I have to give too much, and we'll get to that; hopefully that answers Susan's question later, specifically. I'm excited about sitting down with the site administrators because, again, I come from being a teacher.

I remember the perspective on those decisions, sitting at the staff meeting: raise your hand if you still want that thing. That's how it went down. Now I can sit down with them when they're trying to make decisions on budgets, which will change dramatically in the next few years. Right now there's a lot of exploration and opportunity because of some one-time funds due to the current crisis we're in, with the learning recovery funds and the ESSER funds that are out there, but that is going to go away.

And the pendulum is going to swing back, hard and fast. We need to prepare ourselves ahead of time, so that when those hard choices are made, we can make them with objective data that's going to help our students improve, as opposed to: well, how do you feel about that? So I'm excited to start that conversation, and next year, when it really gets tough, we'll be prepared, because we will have done this.

Yeah, that's exactly where we are as well. I love that Bruce mentioned the hand raise.

We were all there, right? Majority wins, so we kept it. What's been really interesting is the way we've been able, over the past three years, and honestly with longitudinal data now, to say: hey, look, I don't think this product's right for these grades, and here's why.

So we've really been able to hone in. It isn't that the product was bad; it's that the product was fine for this demographic or for these groupings of kids, and that's been really helpful. We've been able not to cut a product out entirely, but to really cut down on where it exists. A lot of times that made people happy, because the teachers were fighting the battle and it wasn't working; we were trying to fit a square peg into a round hole. I think that's been the biggest thing for us.

We've been able to slim down, but slim down in the right ways, so we're not just cutting because something isn't working somewhere. It's working here but not there? We're not going to get rid of it. It's working here: let's keep it. It's not working there: let's figure out why.

And if it needs to go a different way, we go a different way.

That really is consistent with what we've been hearing from the other districts we've been talking to, especially in the context of that funding cliff that is very clearly going to come at some point: making the right choices at that point in time, and, of course, using the products in the right context. Right? It's very rarely "this product works" or "this product doesn't work"; it's usually a little more nuanced than that. Alright.

So I'm going to jump over to our Q&A. I think we have a question in here now that I'll share with you all: besides setting up with LearnPlatform, what other data did you have to share in order to get the information you needed to determine the effectiveness of a product?

I can speak quickly to mine, and then I'll let Bruce talk about his experience.

The data we always needed: you always have to look at the edtech usage. A lot of times the edtech tool has an internal assessment measure, lessons completed, minutes, those types of things. Early on, when Bruce was talking about whether a product was being used with fidelity, he was tracking specifically the minutes on the platform; that was the one graph he showed. And that's a great way to get started.

Do we have fidelity of use, at least, to get started? We confirmed that, and then we said: okay, now let's actually compare it to something. We tried to use as external a metric as we could; we like being detached from the platform when we do correlational studies. With a correlational study, you're actually comparing two variables.

You're looking at a comparison involving the edtech product, and for us the other side was always an assessment variable. Inside the platform, when you actually go to run the impact report, and I've gotten good at this over time, it just takes a lot of clicks to figure out what you want, to be able to put the right variables in the right spots.
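For readers curious what that correlational pairing might look like in miniature, a sketch like the following joins a usage export to an external assessment export and computes the correlations. The file names, columns, and join key are hypothetical, and this is far simpler than a full impact report.

```python
import pandas as pd

# Two hypothetical exports joined on a student ID: usage pulled from
# the edtech product, and scores from an external assessment.
usage = pd.read_csv("edtech_usage_export.csv")   # student_id, minutes, lessons_completed
scores = pd.read_csv("assessment_export.csv")    # student_id, assessment_score
df = usage.merge(scores, on="student_id", how="inner")

# Correlate each usage variable with the external metric. This only
# flags patterns worth a closer look; correlation is not causation.
for col in ["minutes", "lessons_completed"]:
    r = df[col].corr(df["assessment_score"])
    print(f"{col} vs assessment_score: r = {r:.2f}")
```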

But as Bruce also mentioned, just getting started, there was always someone to support you with that. Early on, I didn't even run them myself; my CSM just told me what they needed, I got it to them, and they got the reports back to me. Over time, I wanted things more quickly.

So giving me the power to actually do it myself has been helpful, and the training on that was always included. If I had questions, I asked. There was a weird one one time where it kept defaulting to a control variable it wanted; now I know how to avoid that moving forward, so I get the data I expect to get out of it.

Yeah, thanks, Wade. I know we're running out of time, so I'll defer a little to Wade, because he has the experience with it. But basically, in our experience early on, just to reiterate: good data in, good data out. And these cycles go quickly, so you learn quickly.

What did I miss? I knew that really fast on that first report. I had to say: oh, I missed that, because I didn't put good data in, and now I don't have a good measurable effect. That's the nice thing about this. It's not going to take me six months to figure out the answer.

I can run it quick, oops, tweak it, run it again. So that's my initial experience with it.

Yeah, I think we're running low on time.

That is true, so I'm going to take just one more. We have a question around how you're encouraging usage at the school level, to ensure the impact of a product isn't low just because of lack of use. In the cases where you have context around optimum usage, how are you encouraging it?

I'll just speak quickly on this.

In my experience, there's so much stuff out there. I attend conferences a lot; I present at conferences. When teachers go to, say, in California, something called CUE, vendors are just throwing things at them. It's what we like to call the Wild West; there's so much stuff out there.

I think the issue we may have is that so many things get thrown at us, but so much of your time as a classroom teacher is taken up by the things you're required to do that products get lost in the shuffle and you're not sure about them. So having a tool you can engage with even at the class level if need be, or at the school level, to give teachers some outcomes and say, this is how this thing may have helped you, or this didn't help you: I think that takes a stressor off of them and gives them the willingness to move away from tradition.

Again, moving away from that subjective, qualitative approach to an objective, quantitative approach helps change conversations, and it makes things better. Then they feel comfortable letting go of that thing they've been using.

Yeah, I thought Bruce covered that great. I have nothing to add to that one, except that we're fighting for kids' time all the time.

It all comes down to what makes the most sense in the classrooms. But, no, I think Bruce covered that well.

Alright. Well, we have come to the end of our time together today.

First, I just want to thank you both, Wade and Bruce, for joining us today. This has been very helpful, and I'm sure it will help some of your peers across districts take on the same challenges you've been taking on. I've put up on the screen a few resources that LearnPlatform offers, for you to take a look at and reference if need be. And just a quick reminder that we have some upcoming webinars, as well as on-demand webinars from the past.

So if you're interested in learning more, there's plenty to dig into. But thank you so much, everyone, for joining. We appreciate your time. Thank you, everyone.
