Six Practical Tips for District Administrators to Use ESSA-Aligned Research Studies

    Recently, we brought together a panel of LearnPlatform by Instructure research experts, who regularly work with both district administrators and edtech providers, for a webinar discussion about the ESSA levels of evidence framework. The goal of this conversation was to bring ESSA-aligned research out of the realm of academia and provide district administrators with practical, actionable explanations and advice they could put to use in their own edtech evaluation processes right away.

    Watch the full webinar on demand whenever you want.

    This is a nuanced topic, and there were many great nuggets throughout the discussion, but we have put together six key takeaways below:

    1. ESSA Level 4 is all about theoretical rationale for impact and suggested implementation

    ESSA Level 4 evidence, sometimes referred to as a logic model or theory of change, simply asks for the reasoning, based on existing research, for why a program should lead to positive outcomes. Think of a Level 4 study as a high-level program overview. For a district administrator, it should lay out who should be using the tool, what will be needed to use it with fidelity, and how use and the intended outcomes can be measured. Every reputable edtech program should be able to achieve Level 4 evidence.

    2. Generalizability is the key difference among ESSA Levels 3, 2, and 1

    Levels 3, 2, and 1 are about conducting original research that puts the logic model produced for Level 4 to use and tests its rationale. The design of the study, and how broadly applicable its results may be, determines what level of evidence a study will receive. Level 3 studies are very context-specific: they only consider the users of a product, without a comparison group, so results cannot be attributed to the tool with confidence. Level 2 studies do include a comparison group that has not used the tool, so their results can be interpreted as more causal, and therefore more broadly applicable, than Level 3 results. Level 1 studies are randomized controlled trials (RCTs) with a minimum sample size requirement; because of the randomization and minimum sample size, they can be relied upon to produce generalizable, repeatable results.

    3. Context is critical when reviewing ESSA-aligned research studies

    Level 1 studies are often thought of as the “gold standard” in education research, but Level 2 and 3 studies can be similarly valuable as long as you are mindful of context. With Level 1 studies, the findings are broadly applicable even when the study setting differs from your own. However, a Level 2 or 3 study conducted in a district setting that closely resembles yours, or that addresses a challenge very similar to the one you are facing, can be just as valuable as a Level 1 study completed in a vastly different setting.

    4. Look for quality, quantity, and recency in the research base behind a program

    It’s critical to consider the complete body of research behind a program. Research should be an ongoing, formative process as student needs change and products evolve over time. Avoid focusing only on products that have a Level 1 study; such studies are time-consuming and expensive to produce, and you will have a difficult time finding many products, especially those newer to market, with one in hand. A product with several Level 2 or 3 studies completed in the past year or two may be just as research-based as a product with a single Level 1 RCT study from several years ago, especially if that study was conducted pre-pandemic.

    Looking to read more on research from a provider’s perspective?

    Check out: A Practical Pathway to Building Evidence - One Education Researcher’s Perspective

    5. Use provider-led research to understand how to implement programs with fidelity

    Provider-led research is relatively new, but it can be hugely valuable for district administrators. These studies are often focused on fidelity and usage, and they help administrators understand which metrics to track when getting started with a tool. When utilizing provider-led research, always look for an explanation of the implementation method used in the study, and pay attention to study authorship: reputable research is clearly credited to a credentialed researcher or team, and it is often reviewed by, or conducted in conjunction with, a third party.

    6. Build the case for doing your own research in your district

    Engaging in your own research efforts can be a game-changer when it comes to understanding academic and financial return on investment (ROI) and making data-based decisions that positively impact student outcomes. By doing your own research, you ensure that you understand a product’s impact in your unique context, demonstrate to your stakeholders that you are using limited district funds responsibly and effectively, and take a formative approach that allows you to course-correct quickly if needed. Be strategic when planning your own research by focusing your efforts on the edtech tools you have invested in most heavily. Explore our offerings to learn how we support district-led research.


    Want to find out more about how LearnPlatform supports educators in using ESSA-aligned evidence to make critical edtech decisions? Check out our ESSA Evidence badges.
