Levels of Evidence for the Learning Profession
Will’s Note: ONE DAY after publishing this first draft, I’ve decided that I mucked this up, mashing up what researchers, research translators, and learning professionals should focus on. Within the next week, I will update this to a second draft. You can still read the original below (for now):
Some evidence is better than other evidence. We naturally trust ten well-designed research studies more than one. We trust a well-controlled scientific study more than a poorly controlled one. We trust scientific research more than opinion research, unless all we care about is people’s opinions.
Scientific journal editors have to decide which research articles to accept for publication and which to reject. Practitioners have to decide which research to trust and which to ignore. Politicians have to know which lies to tell and which to withhold (kidding, sort of).
To help themselves make decisions, journal editors regularly rank each article on a continuum from strong research methodology to weak. The medical field routinely uses a level-of-evidence approach to making medical recommendations.
There are many taxonomies for “levels of evidence,” or “hierarchy of evidence” as the concept is commonly called. Wikipedia offers a nice review of the hierarchy-of-evidence concept, including some important criticisms.
Hierarchy of Evidence for Learning Practitioners
The suggested models for level of evidence were created by and for researchers, so they are not directly applicable to learning professionals. Still, it’s helpful for us to have our own hierarchy of evidence, one that we might actually be able to use. For that reason, I’ve created one, adding in the importance of practical evidence that is missing from the research-focused taxonomies. Following the research versions, Level 1 is the best.
- Level 1 — Evidence from systematic research reviews and/or meta-analyses of all relevant randomized controlled trials (RCTs) that have ALSO been utilized by practitioners and found both beneficial and practical from a cost-time-effort perspective.
- Level 2 — Same evidence as Level 1, but NOT systematically or sufficiently utilized by practitioners to confirm benefits and practicality.
- Level 3 — Consistent evidence from a number of RCTs using different contexts and situations and learners; and conducted by different researchers.
- Level 4 — Evidence from one or more RCTs that utilize the same research context.
- Level 5 — Evidence from one or more well-designed controlled trials without randomization of learners to different learning factors.
- Level 6 — Evidence from well-designed cohort or case-control studies.
- Level 7 — Evidence from descriptive and/or qualitative studies.
- Level 8 — Evidence from research-to-practice experts.
- Level 9 — Evidence from the opinion of other authorities, expert committees, etc.
- Level 10 — Evidence from the opinion of practitioners surveyed, interviewed, focus-grouped, etc.
- Level 11 — Evidence from the opinion of learners surveyed, interviewed, focus-grouped, etc.
- Level 12 — Evidence curated from the internet.
Let me consider this Version 1 until I get feedback from you and others! In the meantime, some key points:
- Some evidence is better than other evidence
- If you’re not an expert in evaluating evidence, get insights from those who are. Particularly valuable are research-to-practice experts (those who have considerable experience in translating research into practical recommendations).
- Opinion research in the learning field is especially problematic because the field comprises both strong and poor conceptions of what works.
- Learner opinions are problematic as well because learners often have poor intuitions about what supports their own learning.
- Curating information from the internet is especially problematic because it’s difficult to distinguish between good and poor sources.
Trusted Research to Practice Experts
(in no particular order, they’re all great!)
- (Me) Will Thalheimer
- Patti Shank
- Julie Dirksen
- Clark Quinn
- Mirjam Neelen
- Ruth Clark
- Donald Clark
- Karl Kapp
- Jane Bozarth
- Ulrich Boser