This is a guest post by Brett Christensen of Workplace Performance Consulting (www.workplaceperformance.ca/)

In this post, Brett tells us a story he recounted at a gathering of Debunker Club members at the 2018 ISPI conference in Seattle. It was such a telling story that I asked him if he would write a blog post sharing his lessons learned with you. It’s a cautionary tale about how easy it is to be fooled by information about learning that is too good to be true.

One thing to know before you read Brett’s post. He’s Canadian, which explains two things about what you will read, one of which is that he uses Canadian spellings. I’ll let you figure out the other thing.

______________________________

How I Was Fooled by Dale’s Cone

Why do we debunk?

A handful of members of the Debunker Club had the rare opportunity to meet in person on the morning of 09 April 2018 at the Starbucks Reserve Roastery in sunny (sic) Seattle prior to the second day of the International Society of Performance Improvement’s (ISPI) annual conference.

After introducing ourselves and learning that we had a “newbie” in our midst who had learned about the meeting from a friend’s re-tweet (see Networking Power on my blog), Will asked, “Why do you debunk?” I somewhat sheepishly admitted that the root cause of my debunking desires could be traced back to a presentation I had done with a couple of colleagues in 2006, very early in my training and performance career. This was before I had discovered ISPI and before I understood and embraced the principles of evidence-based practice and scientific rigour.

We were working as e-Learning Instructional Designers (evangelists?) at the time, trying hard to communicate the benefits of e-Learning when it is designed correctly, which as we all know includes designing activities that assist in the transfer of learning. When we discovered Dale’s Cone – with the bad, bad, bad numbers – it made total sense to us. Insert foreboding music here.

The following image is an example of what we had seen (a problematic version of Dale’s Cone):

One of many bogus versions of Dale’s Cone

Our aim was to show our training development colleagues that Dale’s Cone (with the numbers) was valid and that we should all endeavour to design activity into our training. We developed three different scenarios, one for each group. One group would read silently, one would read to each other out loud, and the last group would have an activity included. Everyone would then complete a short assessment to measure transfer. The hope (Hypothesis? Pipe dream?) was to show that the farther down the cone you went, the higher the transfer would be.

Well! That was not the outcome at all. In fact, if I remember correctly, everyone had similar scores on the exercise, and the result was the exact opposite of what we were looking for. Rather than dig deeper into that when we got back home, we were on to the next big thing, and Dale’s Cone faded in my memory. Before I go on, I’d like to point out that we weren’t total “hacks!” Our ISD process was based on valid models and we applied Mayer and Clark’s (2007) principles in all our work. We even received a Gold e-Learning Award from the Canadian Society for Training and Development, now the Institute for Performance and Learning (I4PL).

It wasn’t until much later, after being in ISPI for a number of years, that I got to know Will, our head debunker, and read his research on Dale’s Cone! I was enlightened and a bit embarrassed that I had contributed to spreading bad “ju-ju” in the field. But hey – you don’t know what you don’t know. A couple of years after I found Will and finished my MSc, he started The Debunker Club. I knew I had to right the wrongs of my past and help spread the word to raise awareness of the myths and fads that continue to permeate our profession.

That’s why I am a debunker. Thank you, Will, for making me smarter in the work I do.

______________________________

Will’s Note: Brett is being much too kind. There are many people who take debunking very seriously these days. There are folks like De Bruyckere, Kirschner, and Hulshof, who wrote a book on learning myths. There is Clark Quinn, whose new debunking book is being released this month. There are Guy Wallace, Patti Shank, Julie Dirksen, Mirjam Neelen, Ruth Clark, Jane Bozarth, and many, many, many others (sorry if I’m forgetting you!). Now, there is also Brett Christensen, who has been very active on social media over the last few years, debunking myths and more. The Debunker Club has over 600 members, and over 50 people have applied for membership in the last month alone. And note, you are all invited to join.

Of course, debunking works most effectively if everybody jumps in and takes a stand. We must all stay current with the learning research and speak up gently and respectfully when we see bogus information being passed around.

Thanks, Brett, for sharing your story!! Most of us must admit that we have been taken in by bogus learning myths at some point in our careers. I know I have, and it’s a great reminder to stay humble and skeptical.

And let me point out a feature of Brett’s story that is easy to miss. Did you notice that Brett and his team actually did a rigorous evaluation of their learning intervention? It was this evaluation that enabled Brett and his colleagues to learn how well things had gone. Now imagine if Brett and his team hadn’t done a good evaluation. They would never have learned that the methods they tried were not helpful in maximizing learning outcomes! Indeed, who knows what would have happened when they learned years later that the Dale’s Cone numbers were bogus? They might not have believed the truth of it!

Finally, let me say that Dale’s Cone itself, although not really research-based, is not the myth we’re talking about. It’s when Dale’s Cone is bastardized with the bogus numbers that it becomes truly problematic. See the link above entitled “research on Dale’s Cone” for many other examples of bastardized cones.

Thanks again, Brett, for reminding us about what’s at stake. When myths are shared, the learning field loses trust, we learning professionals waste time, and our organizations bear the cost of misspent funds. Our learners are also subjected to willy-nilly experimentation that hurts their learning.

______________________________

I’ve been at the helm of Work-Learning Research, Inc. for almost 20 years. Ever since I began to have a following as a research-to-practice consultant, I’ve been approached by vendors to “research” their products. The great majority of those who approach me are basically asking me to tell the industry that their products are good. I tell these vendors that I don’t do that kind of “research,” but if they want a fair, honest, and research-based evaluation of their product for their own benefit—advice not for public consumption but for their own feedback and deliberations—I can do that for them. Some take me up on this, but most don’t.

I recently got another request and I thought I’d share what this looks like (I’ve removed identifying information):

Vendor:

I’m reaching out as the co-founder of [GreatNewCompany], a [high-tech blankety-bling] platform. We’re trying to create a product that [does incredibly wonderful things to change the world of learning].

I wanted to ask if you’d consider reviewing our product? I know you’ve spoken to [this industry luminary about such-and-such] and wondered if this was an area of research you’d planned to do more work in?

A free account has access to almost all features but is just limited to [25] unique recipients [https URL generously offered]. If you need more access to perform a comprehensive review or have any questions then please let me know.

I understand that this isn’t a small ask as it’d take a decent amount of your time but thought I’d see if you found us interesting.

Gentleman Researcher/Consultant:

I do review products, but not for public consumption. I do it to provide feedback to developers, not for marketing purposes.

My cost is [such-and-such] per hour.

Let me know if you’re interested.

Vendor:

Thanks for letting me know – it’s appreciated.

We’d be interested in some consultancy on helping raise awareness of our product and to better reach more customers. We’re not sure if we’re just failing at marketing or whether our product just doesn’t have the broad appeal. Do you think you’d be a good fit helping us with that?

Thanks.

Gentleman Researcher/Consultant:

It’s a crazy market now, with lots of new entries. Very hard to gain visibility and traction.

I don’t schlep for others. I run a high-integrity consultancy here. SMILE.

One recommendation I make is to actually do good research on your product. This helps you to learn more and it gives you something to talk about in your content marketing efforts. A way to stand above the screaming crowd.

I can help you with high-integrity research, but this usually costs a ton…

Vendor:

Hi Will,

Thanks again for the thoughts, sounds like we’re a bad fit for the kind of consultancy that we need so I appreciate you being open about that.

Cheers!

THE END

A happy ending?

================

Conclusions:

  • Be careful when you hear about product endorsements. They may be paid for.
  • Remember, not all communications that are called “research” are created equal.
  • Look for consultants who can’t be bought. You want valid advice, not advice tilted toward those who pay the consultants.
  • Look for vendors who tell true stories, who honestly research their products, and who learn from their experience.
  • Be skeptical of communications coming out of trade associations when those messages are paid for, directly or indirectly (through a long-standing commercial relationship between the vendor and the association).
  • Be even more skeptical of best-in-industry lists where those listed pay to be listed. Yes! These exist!
  • In general, be skeptical and look to work with those who have integrity. They exist too!