Posts

This is a guest post by Brett Christensen of Workplace Performance Consulting (www.workplaceperformance.ca/)

In this post, Brett tells us a story he recounted at a gathering of Debunker Club members at the 2018 ISPI conference in Seattle. It was such a telling story that I asked him if he would write a blog post sharing his lessons learned with you. It’s a cautionary tale about how easy it is to be fooled by information about learning that is too good to be true.

One thing to know before you read Brett’s post. He’s Canadian, which explains two things about what you will read, one of which is that he uses Canadian spellings. I’ll let you figure out the other thing.

______________________________

How I Was Fooled by Dale’s Cone

Why do we debunk?

A handful of members of the Debunker Club had the rare opportunity to meet in person on the morning of 09 April 2018 at the Starbucks Reserve Roastery in sunny (sic) Seattle prior to the second day of the International Society for Performance Improvement's (ISPI) annual conference.

After introducing ourselves and learning that we had a “newbie” in our midst who had learned about the meeting from a friend’s re-tweet (see Networking Power on my blog), Will asked “Why do you debunk?” I somewhat sheepishly admitted that the root cause of my debunking desires could be traced back to a presentation I had done with a couple of colleagues in 2006 which was very early in my training and performance career. This was before I had discovered ISPI and before I understood and embraced the principles of evidence-based practice and scientific rigour.

We were working as e-Learning Instructional Designers (evangelists?) at the time, and we were trying hard to communicate the benefits of e-Learning when it was designed correctly, which as we all know includes the design of activities that assist in transfer of learning. When we discovered Dale's Cone, with the bad, bad, bad numbers, it made total sense to us. Insert foreboding music here.

The following image is an example of what we had seen (a problematic version of Dale’s Cone):

One of many bogus versions of Dale’s Cone

Our aim was to show our training development colleagues that Dale's Cone (with the numbers) was valid and that we should all endeavour to design activity into our training. We developed three different scenarios, one for each group. One group would read silently, one would read to each other out loud, and the last group would have an activity included. Everyone would then do a short assessment to measure transfer. The hope (Hypothesis? Pipe dream?) was to show that the farther down the cone you went, the higher the transfer would be.

Well! That was not the outcome at all. In fact, if I remember correctly, everyone had similar scores on the exercise, and the result was the exact opposite of what we were looking for. Rather than dig deeper into that when we got back home, we were on to the next big thing, and Dale's Cone faded in my memory. Before I go on, I'd like to point out that we weren't total "hacks!" Our ISD process was based on valid models, and we applied Clark and Mayer's (2007) principles in all our work. We even received a Gold e-Learning Award from the Canadian Society for Training and Development, now the Institute for Performance and Learning (I4PL).

It wasn’t until much later, after being in ISPI for a number of years, that I had gotten to know Will, our head debunker, and read his research on Dale’s Cone! I was enlightened and a bit embarrassed that I had been a contributor to spreading bad “ju-ju” in the field. But hey – you don’t know what you don’t know. A couple of years after I found Will and finished my MSc, he started The Debunker Club. I knew I had to right my wrongs of the past and help spread the word to raise awareness of the myths and fads that continue to permeate our profession.

That’s why I am a debunker. Thank you, Will, for making me smarter in the work I do.

______________________________

Will’s Note: Brett is being much too kind. There are many people who take debunking very seriously these days. There are folks like De Bruyckere, Kirschner, and Hulshof, who wrote a book on learning myths. There is Clark Quinn, whose new debunking book is being released this month. There are Guy Wallace, Patti Shank, Julie Dirksen, Mirjam Neelen, Ruth Clark, Jane Bozarth, and many, many, many others (sorry if I’m forgetting you!). Now, there is also Brett Christensen, who has been very active on social media over the last few years, debunking myths and more. The Debunker Club has over 600 members, and over 50 people have applied for membership in the last month alone. And note, you are all invited to join.

Of course, debunking works most effectively if everybody jumps in and takes a stand. We must all stay current with the learning research and speak up gently and respectfully when we see bogus information being passed around.

Thanks, Brett, for sharing your story! Most of us must admit that we have been taken in by bogus learning myths at some point in our careers. I know I have, and it’s a great reminder to stay humble and skeptical.

And let me point out a feature of Brett’s story that is easy to miss. Did you notice that Brett and his team actually did rigorous evaluation of their learning intervention? It was this evaluation that enabled Brett and his colleagues to learn how well things had gone. Now imagine if Brett and his team hadn’t done a good evaluation. They would never have learned that the methods they tried were not helpful in maximizing learning outcomes! Indeed, who knows what would have happened when they learned years later that the Dale’s Cone numbers were bogus. They might not have believed the truth of it!

Finally, let me say that Dale’s Cone itself, although not really research-based, is not the myth we’re talking about. It’s when Dale’s Cone is bastardized with the bogus numbers that it becomes truly problematic. See the link above entitled “research on Dale’s Cone” for many other examples of bastardized cones.

Thanks again Brett for reminding us about what’s at stake. When myths are shared, the learning field loses trust, we learning professionals waste time, and our organizations bear the costs of many misspent funds. Our learners are also subjected to willy-nilly experimentation that hurts their learning.

 

 

The Danger

Have you ever seen the following “research” presented to demonstrate some truth about human learning?

Unfortunately, all of these diagrams present misleading information. Worse, these fabrications have been rampant over the last two or three decades, and they seem to have accelerated in the age of the internet. Indeed, a Google image search for “Dale’s Cone” produces about 80% misleading results, as you can see below from recent searches.

Search 2015:

 

Search 2017:

 

This proliferation is a truly dangerous and heinous result of incompetence, deceit, confirmatory bias, greed, and other nefarious human tendencies.

It is also hurting learners throughout the world—and it must be stopped. Each of us has a responsibility in this regard.

 

New Research

Fortunately, a group of tireless researchers—whom I’ve had the honor of collaborating with—has put a wooden stake through the dark heart of this demon. In the most recent edition of the scientific journal Educational Technology, Deepak Subramony, Michael Molenda, Anthony Betrus, and I (my contribution was small) produced four articles on the dangers of this misinformation and its genesis. After working separately over the years to debunk this bit of mythology, the four of us have come together in a joint effort to rally the troops—people like you, dedicated professionals who want to create the best outcomes for your learners.

Here are the citations for the four articles. Later, I will have a synopsis of each article.

Subramony, D., Molenda, M., Betrus, A., and Thalheimer, W. (2014). The Mythical Retention Chart and the Corruption of Dale’s Cone of Experience. Educational Technology, Nov/Dec 2014, 54(6), 6-16.

Subramony, D., Molenda, M., Betrus, A., and Thalheimer, W. (2014). Previous Attempts to Debunk the Mythical Retention Chart and Corrupted Dale’s Cone. Educational Technology, Nov/Dec 2014, 54(6), 17-21.

Subramony, D., Molenda, M., Betrus, A., and Thalheimer, W. (2014). The Good, the Bad, and the Ugly: A Bibliographic Essay on the Corrupted Cone. Educational Technology, Nov/Dec 2014, 54(6), 22-31.

Subramony, D., Molenda, M., Betrus, A., and Thalheimer, W. (2014). Timeline of the Mythical Retention Chart and Corrupted Dale’s Cone. Educational Technology, Nov/Dec 2014, 54(6), 31-24.

Many thanks to Lawrence Lipsitz, the editor of Educational Technology, for his support, encouragement, and efforts in making this possible!

To get a copy of the “Special Issue” or to subscribe to Educational Technology, go to this website. (Note, 2017: I don’t think the journal is being published anymore.)

 

The Background

There are two separate memes we are debunking, what we’ve labeled (1) the mythical retention chart and (2) the corruption of Dale’s Cone of Experience. As you will see—or might have noticed in the images I previously shared—the two have often been commingled.

Here is an example of the mythical retention chart:

 

Oftentimes, though, this is presented as text:

“People Remember:

  • 10 percent of what they read;
  • 20 percent of what they hear;
  • 30 percent of what they see;
  • 50 percent of what they see and hear;
  • 70 percent of what they say; and
  • 90 percent of what they do and say.”

Note that the numbers proffered are not always the same, nor are the factors alleged to spur learning. For example, the graphic says people remember 30 percent of what they hear, but in the text the percentage is 20 percent. In the graphic, people remember 80 percent when they are collaborating, but in the text they remember 70 percent of what they SAY. I’ve looked at hundreds of examples, and the variety is staggering.

Most importantly, the numbers do NOT provide good guidance for learning design, as I will detail later.

Here is a photocopied image of the original Dale’s Cone:

Edgar Dale (1900-1985) was an American educator who is best known for developing “Dale’s Cone of Experience” (the cone above) and for his work on how to incorporate audio-visual materials into the classroom learning experience. The image above was photocopied directly from the 1969 edition of his book, Audio-Visual Methods in Teaching.

 

You’ll note that Dale included no numbers in his cone. He also warned his readers not to take the cone too literally.

Unfortunately, someone somewhere decided to add the misleading numbers. Here are two more examples:

 

I include these two examples to make two points. First, note how one person clearly stole from the other. Second, note how sloppy these fabricators are. They include a Confucius quote that directly contradicts what the numbers say. On the left side of the visuals, Confucius is purported to say that hearing is better than seeing, while the numbers on the right of the visuals say that seeing is better than hearing. And, by the way, Confucius did not actually say what he is alleged to have said! What seems clear from looking at these and other examples is that people don’t do their due diligence—their ends seem to justify their means—and they are damn sloppy, suggesting that they don’t think their audiences will examine their arguments closely.

By the way, these deceptions are not restricted to the English-speaking world:

 

Intro to the Special Issue of Educational Technology

As Deepak Subramony and Michael Molenda say in the introduction to the Special Issue of Educational Technology, the four articles presented seek to provide a “comprehensive and complete analysis of the issues surrounding these tortured constructs.” They also provide “extensive supporting material necessary to present a comprehensive refutation of the aforementioned attempts to corrupt Dale’s original model.”

In the concluding notes to the introduction, Subramony and Molenda leave us with a somewhat dystopian view of information trajectory in the internet age. “In today’s Information Age it is immensely difficult, if not practically impossible, to contain the spread of bad ideas within cyberspace. As we speak, the corrupted cone and its attendant “data” are akin to a living organism—a virtual 21st century plague—that continues to spread and mutate all over the World Wide Web, most recently to China. It therefore seems logical—and responsible—on our part that we would ourselves endeavor to continue our efforts to combat this vexing misinformation on the Web as well.”

Later, I will provide a section on what we can all do to help debunk the myths and inaccuracies embedded in these fabrications.

Now, I provide a synopsis of each article in the Special Issue.


Synopsis of First Article:

Citation:
Subramony, D., Molenda, M., Betrus, A., and Thalheimer, W. (2014). The Mythical Retention Chart and the Corruption of Dale’s Cone of Experience. Educational Technology, Nov/Dec 2014, 54(6), 6-16.

The authors point out that, “Learners—both face-to-face and distant—in classrooms, training centers, or homes are being subjected to lessons designed according to principles that are both unreliable and invalid. In any profession this would be called malpractice.” (p. 6).

The article makes four claims.

Claim 1: The Data in the Retention Chart is Not Credible

First, there is no body of research that supports the data presented in the many forms of the retention chart. That is, there is no scientific data—or other data—that supports the claim that people remember some particular percentage of what they learned. Interestingly, while people have relied on research citations from 1943, 1947, 1963, and 1967 as the defining sources of the data, the numbers—10%, 20%, 30%, and so on—actually appeared as early as 1914 and 1922, when they were presented as information long known. A few years ago, I compiled research on actual percentages of remembering. You can access it here.

Second, the fact that the numbers are all divisible by 5 or 10 makes it obvious to anyone who has done research that these are not numbers derived from actual research. Human variability precludes such round numbers. In addition, as Dwyer pointed out as early as 1978, there is the question of how the data were derived—what were learners actually asked to do? Note, for example, that the retention chart always claims to measure—among other things—how much people remember by reading, hearing, and seeing. How people could read without seeing is an obvious confusion. What are people doing when they only see and don’t read or listen? Also problematic is how you’d create a fair test to compare situations where learners listened to or watched something. Are they tested on different tests (one where they see and one where they listen), which seems to allow bias, or on the same test, in which case one group would be at a disadvantage because they aren’t taking the test in the same context in which they learned?

Third, the data portrayed don’t relate to any other research in the scientific literature on learning. As the authors write, “There is within educational psychology a voluminous literature on remembering and learning from various mediated experiences. Nowhere in this literature is there any summary of findings that remotely resembles the fictitious retention chart.” (p. 8)

Finally, as the authors say, “Making sense of the retention chart is made nearly impossible by the varying presentations of the data, the numbers in the chart being a moving target, altered by the users to fit their individual biases about desirable training methods.” (p. 9).

Claim 2: Dale’s Cone is Misused.

Dale’s Cone of Experience is a visual depiction that portrays more concrete learning experiences at the bottom of the cone and more abstract experiences at the top of the cone. As the authors write, “The cone shape was meant to convey the gradual loss of sensory information” (p. 9) in the learning experiences as one moved from lower to higher levels on the cone.

“The root of all the perversions of the Cone is the assumption that the Cone is meant to be a prescriptive guide. Dale definitely intended the Cone to be descriptive—a classification system, not a road map for lesson planning.” (p. 10)

Claim 3: Combining the Retention Chart Data with Dale’s Cone

“The mythical retention data and the concrete-to-abstract cone evolved separately throughout the 1900’s, as illustrated in [the fourth article] ‘Timeline of the Mythical Retention Chart and Corrupted Dale’s Cone.’ At some point, probably around 1970, some errant soul—or perhaps more than one person—had the regrettable idea of overlaying the dubious retention data on top of Dale’s Cone of Experience.” (p. 11). We call this concoction the corrupted cone.

“What we do know is that over the succeeding years [after the original corruption] the corrupted cone spread widely from one source to another, not in scholarly publications—where someone might have asked hard questions about sources—but in ephemeral materials, such as handouts and slides used in teaching or manuals used in military or corporate training.” (p. 11-12).

“With the growth of the Internet, the World Wide Web, after 1993 this attractive nuisance spread rapidly, even virally. Imagine the retention data as a rapidly mutating virus and Dale’s Cone as a host; then imagine the World Wide Web as a bathhouse. Imagine the variety of mutations and their resistance to antiviral treatment. A Google Search in 2014 revealed 11,000 hits for ‘Dale’s Cone,’ 14,500 for ‘Cone of Learning,’ and 176,000 for ‘Cone of Experience.’ And virtually all of them are corrupted or fallacious representations of the original Dale’s cone. It just might be the most widespread pedagogical myth in the history of Western civilization!” (p. 11).

Claim 4: Murky Provenance

People who present the fallacious retention data and/or the corrupted cone often cite other sources—that might seem authoritative. Dozens of attributions have been made over the years, but several sources appear over and over, including the following:

  • Edgar Dale
  • Wiman & Meierhenry
  • Bruce Nyland
  • Various oil companies (Mobil, Standard Oil, Socony-Vacuum Oil, etc.)
  • NTL Institute
  • William Glasser
  • British Audio-Visual Society
  • Chi, Bassok, Lewis, Reimann, & Glaser (1989).

Unfortunately, none of these sources is a real source of the data. The attributions are false.

Conclusion:

“The retention chart cannot be supported in terms of scientific validity or logical interpretability. The Cone of Experience, created by Edgar Dale in 1946, makes no claim of scientific grounding, and its utility as a prescriptive theory is thoroughly unjustified.” (p. 15)

“No qualified scholar would endorse the use of this mish-mash as a guide to either research or design of learning environments. Nevertheless, [the corrupted cone] obviously has an allure that surpasses logical considerations. Clearly, it says something that many people want to hear. It reduces the complexity of media and method selection to a simple and easy to remember formula. It can thus be used to support a bias toward whatever learning methodology might be in vogue. Users seem to employ it as pseudo-scientific justification for their own preferences about media and methods.” (p. 15)


Synopsis of Second Article:

Citation:
Subramony, D., Molenda, M., Betrus, A., and Thalheimer, W. (2014). Previous Attempts to Debunk the Mythical Retention Chart and Corrupted Dale’s Cone. Educational Technology, Nov/Dec 2014, 54(6), 17-21.

The authors point to earlier attempts to debunk the mythical retention data and the corrupted cone. “Critics have been attempting to debunk the mythical retention chart at least since 1971. The earliest critics, David Curl and Frank Dwyer, were addressing just the retention data.  Beginning around 2002, a new generation of critics has taken on the illegitimate combination of the retention chart and Edgar Dale’s Cone of Experience – the corrupted cone.” (p. 17).

Interestingly, we found only two people who attempted to debunk the retention “data” before 2000. This could be because we failed to find other examples that existed, or it might be because there simply weren’t that many people sharing the bad information back then.

Starting in about 2002, we noticed many sources of refutation. I suspect this has to do with two things. First, it is easier to search human activity in the internet age, which makes refutations easier to find. Second, the internet also makes it easier for people to post erroneous information and share it with a universal audience.

The bottom line is that there have been a handful of people—in addition to the four authors—who have attempted to debunk the bogus information.


Synopsis of Third Article:

Citation:
Subramony, D., Molenda, M., Betrus, A., and Thalheimer, W. (2014). The Good, the Bad, and the Ugly: A Bibliographic Essay on the Corrupted Cone. Educational Technology, Nov/Dec 2014, 54(6), 22-31.

The authors of the article provide a series of brief synopses of the major players who have been cited as sources of the bogus data and corrupted visualizations. The goal here is to give you—the reader—additional information so you can make your own assessment of the credibility of the research sources provided.

Most people—I suspect—will skim through this article with a modest twinge of voyeuristic pleasure. I did.


Synopsis of Fourth Article:

Citation:
Subramony, D., Molenda, M., Betrus, A., and Thalheimer, W. (2014). Timeline of the Mythical Retention Chart and Corrupted Dale’s Cone. Educational Technology, Nov/Dec 2014, 54(6), 31-24.

The authors present a decade-by-decade outline of examples of the reporting of the bogus information, from 1900 to the 2000s. The outline represents great detective work by my co-authors, who have spent years searching databases, reading articles, and reaching out to individuals and institutions in search of the genesis and rebirth of the bogus information. I’m in continual awe of their exhaustive efforts!

The timeline includes scholarly work such as the “Journal of Education,” numerous books, academic courses, corporate training, government publications, military guidelines, etc.

The breadth and depth of examples demonstrates clearly that no area of the learning profession has been immune to the disease of poor information.


Synopsis of the Exhibits:

The authors catalog 16 different examples of the visuals that have been used to convey the mythical retention data and/or the corrupted cone. They also present about 25 text examples.

The visual examples are black-and-white canonical versions, and given these limitations, can’t convey the wild variety of examples available now on the internet. Still, they show in their variety just how often people have modified Dale’s Cone to support their own objectives.


My Conclusions, Warnings, and Recommendations

The four articles in the special issue of Educational Technology represent a watershed moment in the history of misinformation in the learning profession. The articles use two examples—the mythical retention data (“People remember 10%, 20%, 30%…”) and the numerical corruptions of Dale’s Cone—and demonstrate the following:

  1. There are definitively bogus data sources floating around the learning profession.
  2. These bogus information sources damage the effectiveness of learning and hurt learners.
  3. Authors of these bogus examples do not do their due diligence in confirming the validity of their research sources. They blithely reproduce sources or augment them before conveying them to others.
  4. Consumers of these bogus information sources do not do their due diligence in being skeptical, in expecting and demanding validated scientific information, in pushing back against those who convey weak information.
  5. Those who stand up publicly to debunk such misinformation—though nobly fighting a good fight—do not seem to be winning the war against this misinformation.
  6. More must be done if we are to limit the damage.

Some of you may chafe at my tone here, and if I had more time I might have been more careful in my wording. But still, this stuff matters! Moreover, these articles focus on only two examples of bogus memes in the learning field. There are many more! Learning styles, anyone?

Here is what you can do to help:

  1. Be skeptical.
  2. When conveying or consuming research-based information, check the actual source. Does it say what it is purported to say? Is it a scientifically-validated source? Are there corroborating sources?
  3. Gently—perhaps privately—let conveyors of bogus information know that they are conveying bogus information. Show them your sources so they can investigate for themselves.
  4. When you catch someone conveying bogus information, make note that they may be the kind of person who is lazy or corrupt in the information they convey or use in their decision making.
  5. Punish, sanction, or reprimand those in your sphere of influence who convey bogus information. Be fair and don’t be an ass about it.
  6. Make or take opportunities to convey warnings about the bogus information.
  7. Seek out scientifically-validated information and the people and institutions who tend to convey this information.
  8. Document more examples.

To this end, Anthony Betrus—on behalf of the four authors—has established www.coneofexperience.com. The purpose of this website is to provide a place for further exploration of the issues raised in the four articles. It provides the following:

  • A series of timelines
  • Links to other debunking attempts
  • A place for people to share stories about their experiences with the bogus data and visuals

The learning industry also has responsibilities.

  1. Educational institutions must ensure that validated information is more likely to be conveyed to their students, within the bounds of academic freedom, of course.
  2. Educational institutions must teach their students how to be good consumers of “research,” “data,” and information (more generally).
  3. Trade organizations must provide better introductory education for their members; more myth-busting articles, blog posts, videos, etc.; and push a stronger evidence-based-practice agenda.
  4. Researchers have to partner with research translators more often to get research-based information to real-world practitioners.


Below is another example of the misuse of the now-infamous bogus percentages by a speaker at a prominent international conference in the workplace learning field, this time in an online session in January 2009.

I have documented this problem starting in 2002. The following posts illustrate this problem.

A manager at Qube Learning joins the list of folks who have been fooled, and who foolishly and irresponsibly re-gift this faulty information. Point: If you can't verify the credibility of the so-called "research" you come across, don't share it.

[Slide image: Cone_January2009]

And this follow-up slide:

[Slide image: Cone_January2009b]

It's a shame we have to keep revisiting this bogus information. I truly wish I didn't have to do this.

Of course, even if you and I wipe this bogus-information example off the face of the earth, there will be more misinformation we'll have to deal with. It's okay. It's the nature of living I think. The learning point here is that all of us in the learning-and-performance field must be vigilant. We must be skeptical of claims. We must build structures where we can test these bogus claims in the crucible of an evidence-based marketplace. It is only then that we will be able to build a fully-worthy profession.

Keep sending me your examples. Thanks to the helpful soul who sent me this example.

Interestingly, just today a major player in our field asked me for permission to publish the original blog post (the one debunking the bogus-percentage myth) in their company newsletter (which goes out to over 100,000 people). They too had been using this misinformation in their work and now want to correct their mistake. I salute their action.