How K-12 Leaders Use Rapid-cycle Evaluation


CHALLENGE: Understanding EdTech Impact on Student Outcomes

A district tech director doesn’t know if students are meeting prescribed usage goals for an edtech product, and the curriculum and instruction director isn’t sure that students who spend more time on that product score higher on assessments.

HOW WE HELP

Running an Outcomes Analysis brings together SIS, usage, and achievement data. The analysis reveals which grade levels are and aren't meeting recommended usage goals, as well as which student groups saw higher achievement.
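In script or spreadsheet form, the core of an Outcomes Analysis can be sketched as below: join student, usage, and achievement records by student ID, then compare average scores for students who met a usage goal against those who did not. All field names, student data, and the 30-minutes-per-week goal are illustrative assumptions, not a real district's data.

```python
# Minimal Outcomes Analysis sketch (hypothetical data throughout).
from statistics import mean

# SIS roster, weekly product usage, and assessment scores keyed by student ID
sis = {"s1": {"grade": 3}, "s2": {"grade": 3}, "s3": {"grade": 4}, "s4": {"grade": 4}}
usage_minutes_per_week = {"s1": 45, "s2": 10, "s3": 35, "s4": 5}
assessment_score = {"s1": 82, "s2": 70, "s3": 88, "s4": 68}

USAGE_GOAL = 30  # assumed recommended minutes per week

def outcomes_by_usage(sis, usage, scores, goal):
    """Split students by whether they met the usage goal, then average scores."""
    met, missed = [], []
    for student_id in sis:
        (met if usage[student_id] >= goal else missed).append(scores[student_id])
    return {"met_goal_avg": mean(met), "missed_goal_avg": mean(missed)}

result = outcomes_by_usage(sis, usage_minutes_per_week, assessment_score, USAGE_GOAL)
print(result)  # students meeting the goal average 85; the others average 69
```

A real analysis would of course control for prior achievement and student characteristics; this sketch only shows the basic join-and-compare step.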

TAKING ACTION

The tech director will use the data to work with grade levels not meeting the recommended usage goal to increase usage. The curriculum director will gather teacher feedback on the tool to inform product coaching and implementation changes. Both administrators decide to run more rapid-cycle evaluations (RCEs) in the coming months to see whether the results of the analysis change.

CHALLENGE: Determining if Tools are Supporting Curricular Goals

A district curriculum and instruction director isn't sure that a supported, high-value product for grades 3-5 is being used at the intended level. They want to make sure that the product is effective and worth the money.

HOW WE HELP

Running a Usage Analysis with Fidelity brings together usage and SIS data with a specific, recommended usage goal for the core product. The analysis shows that grade 3 students are not using the tool at the recommended level, while students in grades 4 and 5 are.
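The grade-level comparison at the heart of this analysis is straightforward: for each grade, compute the share of students meeting the recommended usage goal. The sketch below uses made-up minutes and an assumed 60-minute weekly goal purely for illustration.

```python
# Hypothetical Usage Analysis with Fidelity: share of students per grade
# meeting a recommended weekly usage goal. All numbers are illustrative.
from collections import defaultdict

GOAL_MINUTES = 60  # assumed recommended minutes per week

students = [  # (grade, weekly minutes on the product) -- made-up data
    (3, 20), (3, 35), (3, 70),
    (4, 65), (4, 80),
    (5, 90), (5, 60),
]

def fidelity_by_grade(records, goal):
    """Return, per grade, the fraction of students at or above the goal."""
    totals, met = defaultdict(int), defaultdict(int)
    for grade, minutes in records:
        totals[grade] += 1
        if minutes >= goal:
            met[grade] += 1
    return {grade: met[grade] / totals[grade] for grade in totals}

rates = fidelity_by_grade(students, GOAL_MINUTES)
print(rates)  # grade 3 falls short of the goal while grades 4 and 5 meet it
```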

TAKING ACTION

The C&I director decides to work with grade 4 and 5 teachers to better understand their implementation and product usage, gathering structured feedback that could inform product coaching for grade 3 teachers. The C&I director will also change the dosage level for grade 3 students to see how outcomes may differ, something they can evaluate later with an Outcomes Analysis.

CHALLENGE: Ensuring Financial Return on Investment

A district budget manager is reviewing the large number of digital tools their district purchased in the last year. Working with others in the district, they identify a few priority products and want to understand whether the district is getting full value from them, based on the licenses purchased as well as indirect costs like educator time.

HOW WE HELP

A Cost Analysis brings together pricing information, student counts, usage data, and a recommended usage goal. The analysis reveals that, for one of the priority products, more licenses were purchased than needed, and a share of them were never used at all. In addition, some grade levels were not using the product at the recommended level.
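The arithmetic behind a finding like this is simple to reproduce: compare licenses purchased against students actually using the product, then put a dollar figure on the gap. The license counts and price below are invented for illustration only.

```python
# Hypothetical Cost Analysis sketch: spend on unused licenses and
# effective cost per active student. All figures are made-up assumptions.

licenses_purchased = 500
price_per_license = 12.00     # assumed annual price per license
active_students = 410         # students with any recorded usage

unused_licenses = licenses_purchased - active_students
wasted_spend = unused_licenses * price_per_license
cost_per_active_student = (licenses_purchased * price_per_license) / active_students

print(f"Unused licenses: {unused_licenses}")                       # 90
print(f"Spend on unused licenses: ${wasted_spend:.2f}")            # $1080.00
print(f"Cost per active student: ${cost_per_active_student:.2f}")  # $14.63
```

A fuller analysis would also fold in indirect costs like educator time, but even this back-of-the-envelope version is often enough to prompt a license-agreement review.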

TAKING ACTION

The budget manager decides to revisit the license agreement with the edtech provider, and to gather teacher feedback on the tool to balance the quantitative data. The budget manager then prompts the curriculum and instruction director to work with educators not using their licenses to determine whether more product coaching, or a different product, is needed.

CHALLENGE: Multiple Products Perform Similar Functions

A district curriculum and instruction director knows the district pays for multiple products designed to meet similar needs. They want to determine whether the district should continue to implement and support all of the products, and for which student groups.

HOW WE HELP

An Outcomes Analysis brings together SIS, usage and achievement data, and reveals that one product is more effective than another product for the specified group of students. 

TAKING ACTION

Based on these findings, the C&I director decides to transition the group of students to the more effective product, and increase support and resources for the product across the district. They also gather structured teacher feedback to bring in their perspectives, ensuring a balance of report findings and qualitative evidence.

CHALLENGE: Context-specific Usage and Implementation Goals

A district technology director wants to understand the ideal product usage for a specific group of students in their district. Not all edtech tools should be implemented or used the same way for every student group, so context-specific insights are needed.

HOW WE HELP

An Outcomes Analysis brings together achievement data, usage data, and an identification of students who did and did not use the product. The analysis reveals that students using the product for more time per week show more growth on a local assessment than students who did not use the product.

TAKING ACTION

The technology director decides to first check in with teachers about their experiences implementing the product and weigh that feedback alongside the new analysis findings. Next, the tech director establishes a new recommended usage goal and updates product coaching accordingly.

Get started on your path to more effective edtech.


Request a demo and see how an edtech effectiveness solution can help your organization make evaluation a standard practice across your edtech ecosystem.

