Evidence Based vs. Research Based: Understanding the Key Differences

In the world of education, practices and programs are often categorized as either “evidence based” or “research based.” But what exactly is the difference between these two terms? Understanding the nuances is critical for identifying truly effective resources backed by rigorous evidence.

In this guide, we’ll examine evidence based vs. research based in depth, look at definitions and standards, and outline the benefits of favoring an evidence-based approach.

Defining Evidence Based Programs and Practices

The terms “evidence based” and “research based” are commonly used to describe educational programs, curricula, strategies, assessments, and interventions. But they have distinct meanings:

Evidence based programs or practices refer to resources that have been scientifically proven to be effective through rigorous controlled research studies.

To be considered evidence based, a program is studied under controlled conditions, often comparing skill growth against a control group not receiving the intervention. Multiple high-quality studies must consistently show positive outcomes and effectiveness to deem a program as evidence based.

Research based programs are developed using existing research in the field, but have not yet met the scientific standards to be designated as evidence based. The strategies may have a logical rationale, but the overall approach has simply not been studied in an experimental, controlled research setting.

The Role of Assessments

Evidence based principles also apply to academic and behavioral assessments. An assessment process is considered evidence based when:

  • The skills measured align to research on relevant competencies.
  • The assessment methods are grounded in scientific theory and research.
  • Administration and interpretation follow research-backed best practices.

Using evidence based assessments provides confidence the tool is appropriate, reliable, fair, and effective for informing teaching and interventions.

Standards for Educational Assessments

Evidence based assessment relies on measurement tools rigorously developed to meet scientific standards. Key standards include:

  • Validity: Research confirms the assessment is appropriate for its intended use.
  • Reliability: The assessment produces consistent results across repeated administrations.
  • Fairness: The assessment is unbiased and appropriate for diverse student subgroups.

Evidence based tools meet these research standards for validity, reliability, fairness, and more.
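To make the reliability standard concrete: one common way test developers quantify it is internal consistency, often estimated with Cronbach’s alpha. The sketch below implements the standard alpha formula in plain Python; the function name and the toy score matrix are our own illustration, not taken from any particular assessment.

```python
# Cronbach's alpha: a widely used internal-consistency reliability estimate.
# scores[i][j] = respondent i's score on item j.
def cronbach_alpha(scores):
    k = len(scores[0])   # number of items
    n = len(scores)      # number of respondents

    def variance(xs):
        # Sample variance (n - 1 denominator).
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

    # Variance of each item across respondents.
    item_vars = [variance([row[j] for row in scores]) for j in range(k)]
    # Variance of each respondent's total score.
    total_var = variance([sum(row) for row in scores])

    # alpha = k/(k-1) * (1 - sum of item variances / variance of totals)
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)


# Perfectly consistent items yield alpha = 1.0.
print(cronbach_alpha([[1, 1], [2, 2], [3, 3]]))
```

Values near 1.0 indicate the items move together (high internal consistency); values near 0 suggest the items are not measuring a common construct. Published evidence based assessments report this kind of statistic (along with test-retest and inter-rater figures) in their technical manuals.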

The Benefits of an Evidence Based Approach

Favoring evidence based over research based programs offers a number of advantages:

  • Proven results: Evidence based programs have demonstrated improved outcomes through scientific study. Research based programs may be logical but lack proof.

  • Informed decision making: Evidence provides objective data to guide spending and adoption of new programs.

  • Accountability: Evidence based programs help meet legal requirements for research-backed interventions.

  • Equity: Relying on evidence rather than assumptions helps avoid perpetuating biases.

  • Efficient spending: Funding programs backed by rigorous evidence avoids wasting resources on unproven approaches.

  • Ongoing improvement: Collecting evidence allows continually optimizing approaches to improve student outcomes.

An evidence based approach provides concrete proof of what works in education for optimal student growth and achievement.

Finding Evidence Based Programs and Practices

So how can you identify evidence based programs and tools? Here are some steps:

  • Consult reputable clearinghouses: Check databases like the What Works Clearinghouse or Best Evidence Encyclopedia that curate evidence based programs.

  • Look for citations: Reputable programs will cite experimental research studies supporting their effectiveness.

  • Ask developers: Reach out to publishers for summaries of research evidence and underlying studies.

  • Review independently: Search for peer-reviewed articles detailing research on the program’s impact.

  • Check national standards: Ensure assessments meet standards set by AERA, APA, and NCME for evidence based tests.

Taking time to verify evidence can ensure your school adopts proven, high-quality programs.

Comparing Evidence Based and Research Based Resources

Let’s examine some key points of comparison:

  • Status: Evidence based programs meet rigorous scientific standards; research based programs do not qualify as evidence based.
  • Research: Evidence based effectiveness is proven through controlled studies; research based programs are grounded in background research but lack direct experimental evidence.
  • Results: Evidence based programs consistently demonstrate improved outcomes; research based programs offer no guarantee of impact.
  • Decision making: Evidence based programs provide objective data to guide adoption; research based decisions rely more on subjective criteria.
  • Accountability: Evidence based programs help meet legal requirements for evidence based interventions; research based programs do not document the effectiveness required by some policies.

While research based programs have merits, evidence based programs provide the highest level of proven benefits and accountability.

An Evidence Based Example: DESSA Assessments

DESSA offers an example of evidence based, universal social-emotional assessments. Extensive research validating DESSA includes:

  • Reliability studies confirming consistent results over time
  • Validity studies showing that the SEL skills DESSA measures align to research
  • Fairness studies showing lack of bias across student groups
  • Field studies documenting links between SEL and student outcomes

This rigorous evidence makes DESSA assessments fully evidence based. Such concrete validation gives schools confidence in improving SEL programming.

Key Takeaways

  • Evidence based practices meet strict standards through experimental studies demonstrating effectiveness. Research based practices lack this direct evidence.

  • Assessments should also be evidence based, with validity, reliability, and fairness backed by research.

  • Favoring evidence based over research based provides measurable benefits for student success and educational equity.

  • Look to reputable clearinghouses, published research, and developer evidence when selecting proven programs.

Adopting evidence based practices takes the guesswork out of determining what works. Relying on objective evidence provides the greatest returns on educational investments. While no program is a silver bullet, emphasizing evidence based approaches gives schools the best shot at helping all students thrive.


What Should You Know and Ask When Programs Use These Terms?

At this point, if someone uses the term “scientifically based,” they probably don’t know that this term has functionally been expunged as the “go-to” standard in federal education law.

At the same time, as an informed consumer, you can still ask what the researcher or practitioner means by “scientifically based.” Then—if the practitioner is recommending a specific program, and endorsing it as “scientifically based,” ask for (preferably refereed) studies and their descriptions of the:

  • Demographic backgrounds and other characteristics of the students participating in the studies (so you can compare and contrast these students to your students);
  • Research methods used in the studies (so you can validate that the methods were sound, objective, and that they involved control or comparison groups not receiving the program or intervention);
  • Outcomes measured and reported in the studies (so you can validate that the research was focused on student outcomes, and especially the student outcomes that you are most interested in for your students);
  • Data collection tools, instruments, or processes used in the studies (so that you are assured that they were psychometrically reliable, valid, and objective—such that the data collected and reported are demonstrated to be accurate);
  • Treatment or implementation integrity methods and data reported in the studies (so you can objectively determine that the program or intervention was implemented as it was designed, and in ways that make sense);
  • Data analysis procedures used in the studies (so you can validate that the data-based outcomes reported were based on the “right” statistical and analytic approaches);
  • Interpretations and conclusions reported by the studies [so you can objectively validate that these summarizations are supported by the data reported, and have not been inaccurately- or over-interpreted by the author(s)]; and the
  • Limitations reported in the studies (so you understand the inherent weaknesses in the studies, and can assess whether these weaknesses affected the integrity of and conclusions—relative to the efficacy of the programs or interventions—drawn by the studies).

Moving on: If a researcher or practitioner describes a program or intervention as “evidence-based,” you need to ask them whether they are using the term as defined in ESEA/ESSA 2015 (see the definition below).

Beyond this, we need to recognize that—relatively speaking—there are far fewer educational programs or psychological interventions used in schools that meet the experimental or quasi-experimental criteria in the Law.

Thus, it would be wise to assume that most educational programs or psychological interventions will be considered “evidence-based” because of these components in the Law:

‘(ii)(I) demonstrates a rationale based on high-quality research findings or positive evaluation that such activity, strategy, or intervention is likely to improve student outcomes or other relevant outcomes; and

‘(II) includes ongoing efforts to examine the effects of such activity, strategy, or intervention.”

As such, as an informed consumer, you need to ask the researcher or practitioner (and evaluate the responses to) all of the same questions as outlined above for the “scientifically based” research assertions.

In essence, if a researcher or practitioner uses the term “research-based,” they probably don’t know that the “go-to” term, standard, and definition in federal education law is “evidence-based.”

At the same time, as an informed consumer, a researcher or practitioner’s use of the “research-based” term should raise some “red flags”—as it might suggest that the quality of the research supposedly validating the recommended program or intervention is suspect.

Regardless, as an informed consumer, you will still ask the researcher or practitioner (and evaluate the responses to) all of the same questions as outlined above for the “scientifically based” research assertions.

Ultimately, after (a) collecting the information from the studies supposedly supporting a specific program or intervention, and (b) answering all the questions above, you need to determine the following:

  • Is there enough objective information to conclude that the “recommended” program or intervention is independently responsible for the student outcomes that are purported and reported?
  • Is there enough objective data to demonstrate that the “recommended” program or intervention is appropriate for MY student population, and will potentially result in the same positive and expected outcomes?

[The point here is that the program or intervention may be effective—but only with certain students. . . and not YOUR students.]

  • Will the resources needed to implement the program be time- and cost-effective relative to the “Return-on-Investment”?

[These resources include, for example, the initial and long-term cost for materials, professional development time, specialized personnel, coaching and supervision, evaluation, parent and community outreach, etc.]

  • Will the “recommended” program or intervention be acceptable to those involved (e.g., students, staff, administrators, parents) such that they are motivated to implement it with integrity and over an extended period of time?

[There is extensive research on the “acceptability” of interventions, and the characteristics or variables that make program or intervention implementation likely or not likely.]

“Scientifically based” versus “Evidence-based” versus “Research-based”

As I provide consultation services to school districts across the country (and world), I continually hear people using three related terms to describe their practice—or their selection of specific services, supports, instruction, strategies, programs, or interventions.

The terms are “scientifically-based,” “evidence-based,” and “research-based” . . . and many educators seem to be using them interchangeably.

And so—because these terms are critical to understanding how to objectively evaluate the quality of a program or intervention being considered for implementation—I provide a brief history (and their definitions, when present) of these terms below.

As this series is focusing on the Elementary and Secondary Education Act (ESEA), I will restrict this brief analysis to (a) the 2001 version of ESEA (No Child Left Behind; ESEA/NCLB); (b) the current 2015 version of ESEA (Every Student Succeeds Act; ESEA/ESSA); and (c) ESEA’s current “brother”—the Individuals with Disabilities Education Act (IDEA 2004).

“Scientifically based”

This term appeared in ESEA/NCLB 2001 twenty-eight times, and it was (at that time) the “go-to” definition in federal education law when discussing how to evaluate the efficacy, for example, of research or programs that states, districts, and schools needed to implement as part of their school and schooling processes.

Significantly, this term was defined in the law. According to ESEA/NCLB:

The term scientifically based research—

(A) means research that involves the application of rigorous, systematic, and objective procedures to obtain reliable and valid knowledge relevant to education activities and programs; and

(B) includes research that—

(i) employs systematic, empirical methods that draw on observation or experiment;

(ii) involves rigorous data analyses that are adequate to test the stated hypotheses and justify the general conclusions drawn;

(iii) relies on measurements or observational methods that provide reliable and valid data across evaluators and observers, across multiple measurements and observations, and across studies by the same or different investigators;

(iv) is evaluated using experimental or quasi-experimental designs in which individuals, entities, programs, or activities are assigned to different conditions and with appropriate controls to evaluate the effects of the condition of interest, with a preference for random-assignment experiments, or other designs to the extent that those designs contain within-condition or across-condition controls;

(v) ensures that experimental studies are presented in sufficient detail and clarity to allow for replication or, at a minimum, offer the opportunity to build systematically on their findings; and

(vi) has been accepted by a peer-reviewed journal or approved by a panel of independent experts through a comparably rigorous, objective, and scientific review.

The term “scientifically based” is found in IDEA 2004 twenty-five times—mostly when describing “scientifically based research, technical assistance, instruction, or intervention.”

The term “scientifically based” is found in ESEA/ESSA 2015 ONLY four times—mostly as “scientifically based research.” This term appears to have been replaced by the term “evidence-based” (see below) as the “standard” that ESEA/ESSA wants used when programs or interventions are evaluated for their effectiveness.

“Evidence-based”

This term DID NOT APPEAR in either ESEA/NCLB 2001 or IDEA 2004.

It DOES appear in ESEA/ESSA 2015—sixty-three times (!!!), most often when describing “evidence-based research, technical assistance, professional development, programs, methods, instruction, or intervention.”

Moreover, as the new (and current) “go-to” standard when determining whether programs or interventions have been empirically demonstrated as effective, ESEA/ESSA 2015 defines this term.

As such, according to ESEA/ESSA 2015:

(A) IN GENERAL.—Except as provided in subparagraph (B), the term ‘evidence-based’, when used with respect to a State, local educational agency, or school activity, means an activity, strategy, or intervention that—

‘(i) demonstrates a statistically significant effect on improving student outcomes or other relevant outcomes based on—

‘(I) strong evidence from at least 1 well-designed and well-implemented experimental study;

‘(II) moderate evidence from at least 1 well-designed and well-implemented quasi-experimental study; or

‘(III) promising evidence from at least 1 well-designed and well-implemented correlational study with statistical controls for selection bias; or

‘(ii)(I) demonstrates a rationale based on high-quality research findings or positive evaluation that such activity, strategy, or intervention is likely to improve student outcomes or other relevant outcomes; and

‘(II) includes ongoing efforts to examine the effects of such activity, strategy, or intervention.”

(B) DEFINITION FOR SPECIFIC ACTIVITIES FUNDED UNDER THIS ACT.—When used with respect to interventions or improvement activities or strategies funded under Section 1003 [School Improvement], the term ‘evidence-based’ means a State, local educational agency, or school activity, strategy, or intervention that meets the requirements of subclause (I), (II), or (III) of subparagraph (A)(i).

“Research-based”

This term appeared five times in ESEA/NCLB 2001; it appears four times in IDEA 2004; and it appears once in ESEA/ESSA 2015. When it appears, the term is largely used to describe programs that need to be implemented by schools to support student learning.

Significantly, the term research-based is NOT defined in either ESEA law (2001, 2015), or by IDEA 2004.


What is the difference between research and evidence-based nursing practice?

Although the purposes of nursing research (conducting research to generate new knowledge) and evidence-based nursing practice (utilizing the best evidence as the basis of nursing practice) seem quite different, an increasing number of research studies have been conducted with the goal of translating evidence effectively into practice.

What is the difference between research based and evidence based?

Research-based – Parts or components of the program or method are based on practices demonstrated effective through Research. Evidence-based – The entire program or method has been demonstrated through Research to be effective. What this boils down to is that evidence-based is PREFERRED over research-based.

What is the difference between science based and evidence based?

Science-based – Parts or components of the program or method are based on Science; research-based and evidence-based are as defined in the previous answer.

What is evidence based practice?

If sufficient research suggests that the program or practice is effective, it may be deemed “evidence-based.” Evidence-informed (or research-based) practices are practices that were developed based on the best research available in the field.
