21st December 2022

Neon Elephant Award Announcement

Dr. Will Thalheimer, Principal at TiER1 Performance, Founder of Work-Learning Research, announces the winner of the 2022 Neon Elephant Award, given this year to Donald Clark, for writing the book Learning Experience Design: How to Create Effective Learning that Works, and for his collation of Great Minds on Learning (both in the podcast series with John Helmer and in Donald’s tireless work researching and curating critical ideas and thinkers in his Plan B blog).

Click here to learn more about the Neon Elephant Award…

 

2022 Award Winner – Donald Clark

Donald Clark is a successful entrepreneur, professor, researcher, author, blogger, and speaker. He is an internationally-renowned thinker in the field of learning technology, having worked in EdTech for over 30 years; having been a leader in many successful learning-technology businesses (both as an executive and board member); and having written extensively on a wide range of topics related to learning and development—in books, articles, and his legendary blog. Relatively early in his career, Donald earned success as one of the original founders of Epic Group plc, a leading online learning company in the UK, an enterprise subsequently floated on the stock market in 1996 and sold in 2005. Since then, as Donald has described it, he has felt “free from the tyranny of employment,” using this privilege to the advantage of the learning field. Donald has become an advocate for research-based practices and intelligent uses of learning technologies. He has also founded, run, and supported learning-technology enterprises, which has further helped spread good learning practices.

Donald Clark’s most recent book—Learning Experience Design: How to Create Effective Learning that Works—stands above and apart from most writing on learning experience design. It is thoroughly inspired by the scientific research on learning and by real-world experience in using and developing learning technologies. Chapter after chapter, it offers an inspiring introduction, a robust review of best practices, and a concise set of practical recommendations. Anyone practicing learning experience design should buy this book today—study it and apply its recommendations. Your learners and organizations will thank you. You will build wildly more effective learning!

Donald Clark’s compilations of our field’s best thinkers and ideas are legendary—or should be! Over many decades, he has tirelessly curated an almost endless treasure trove of golden nuggets on many of the most important ideas in the learning field. This past year, he has brought these to a larger audience through his excellent collaboration with John Helmer, known most famously for his Learning Hack podcast. Donald Clark and John Helmer’s Great Minds on Learning webcast and podcast collaboration is fantastic. Donald’s written reviews provide us with an overview of the deep history of the learning field. Here is a blog post that lists many of Donald’s reviews of the great thinkers in our field.

Notable contributions from Donald Clark:


With Gratitude

In his decades of work, Donald Clark has been a tireless advocate for improvement and innovation in the field of learning-and-development and learning technology. He often refers to his work as “provocative,” and he deserves admiration for (1) urging the learning field to embrace scientifically-informed practices, (2) urging us to be more forward-looking in our embrace of learning technologies (particularly AI), and (3) being one of our field’s preeminent historians—reminding us of the rich and valuable work of researchers, writers, and practitioners from past centuries to today. It is an honor to recognize Donald as this year’s winner of the Neon Elephant Award.

Click here to learn more about the Neon Elephant Award…

 

 

21st December 2021

Neon Elephant Award Announcement

Dr. Will Thalheimer, Principal at TiER1 Performance, Founder of Work-Learning Research, announces the winners of the 2021 Neon Elephant Award, given to two people this year: Clark Quinn, for writing the book Learning Science for Instructional Designers, and Patti Shank, for writing the book Write Better Multiple-Choice Questions to Assess Learning—and to both for their many years translating learning research into practical recommendations.

Click here to learn more about the Neon Elephant Award…

2021 Award Winners – Clark Quinn and Patti Shank

Clark Quinn, PhD, is an internationally-recognized consultant and thought-leader in learning science, learning technology, and organizational learning. Clark holds a doctorate in Cognitive Psychology from the University of California at San Diego. Since 2001, Clark has been consulting, researching, writing, and speaking through his consulting practice, Quinnovation (website). Clark has been at the forefront of some of the most important trends in workplace learning, including his early advocacy for mobile learning, his work with the Internet Time Group advocating for a greater emphasis on workplace learning, and his many efforts to bring research-based wisdom to elearning design. With the publication of his new book, Clark again shows leadership—now in the cause of giving instructional designers a clear and highly-readable guide to the learning sciences.

Clark is the author of numerous books. The following are representative:

 

Patti Shank, PhD

Patti Shank, PhD, is an internationally-recognized learning analyst, writer, and translational researcher in the learning, performance, and talent space. Dr. Shank holds a doctorate in Educational Leadership and Innovation, Instructional Technology, from the University of Colorado, Denver, and a Master’s degree in Education and Human Development from George Washington University. Since 1996, Patti has been consulting, researching, and writing through her consulting practice, Learning Peaks LLC (pattishank.com). As the best research-to-practice professionals tend to do, Patti has extensive experience as a practitioner, including roles such as training specialist, training supervisor, and manager of training and education. Patti has also played a critical role collaborating with the workplace learning field’s most prominent trade associations—working, sometimes quixotically, to encourage the adoption of research-based wisdom for learning.

Patti is the author of numerous books, focusing not only on evidence-based practices, but also on online learning, elearning, and learning assessment. The following are her most recent books:


With Gratitude

In their decades of work, both Patti Shank and Clark Quinn have lived careers of heroic effort, perseverance, and passion. Their love for the learning-and-development field is deep and true. They don’t settle for half the truth, and they don’t settle for half measures. Rather, they show their mettle even when they get pushback, even when times are tough, even when easier paths might call. It is an honor to recognize Patti and Clark as this year’s winners of the Neon Elephant Award.

 

Click here to learn more about the Neon Elephant Award…

 

 

21st December 2020

Neon Elephant Award Announcement

Dr. Will Thalheimer, President of Work-Learning Research, Inc., announces the winners of the 2020 Neon Elephant Award, given to Mirjam Neelen and Paul Kirschner for writing the book Evidence-Informed Learning Design: Use Evidence to Create Training Which Improves Performance, and for their many years publishing their blog 3-Star Learning Experiences.

Click here to learn more about the Neon Elephant Award…

2020 Award Winners – Mirjam Neelen and Paul Kirschner

Mirjam Neelen is one of the world’s most accomplished research-to-practice practitioners in the workplace learning field. On the practical side, Mirjam has played many roles. As of this writing, she is the Head of Global Learning Design and Learning Sciences at Novartis. She has been a Learning Experience Design Lead at Accenture and at the Learnovate Centre in Dublin, an Instructional Designer at Google, and an Instructional Design Lead at Houghton Mifflin Harcourt. Mirjam utilizes evidence-informed wisdom in her work and also partners with Paul A. Kirschner on the 3-Star Learning Experiences blog to bring research- and evidence-informed insights to the workplace learning field. Mirjam is a member of the Executive Advisory Board of The Learning Development Accelerator.

Paul A. Kirschner is Professor Emeritus at the Open University of the Netherlands and owner of kirschner-ED, an educational consulting practice. Paul is an internationally recognized expert in learning and educational research, with many classic studies to his name. He has served as President of the International Society for the Learning Sciences and is an AERA (American Educational Research Association) Research Fellow, the first European to receive this honor. He has published several very successful books, including Ten Steps to Complex Learning, Urban Myths about Learning and Education, and More Urban Myths about Learning and Education; this year he published How Learning Happens: Seminal Works in Educational Psychology and What They Mean in Practice with Carl Hendrick, as well as the book he and Mirjam are honored for here. Kirschner previously won the Neon Elephant Award in 2016 for the book Urban Myths about Learning and Education, written with Pedro De Bruyckere and Casper D. Hulshof. Also, Paul’s co-author on the Ten Steps book, Jeroen van Merriënboer, won the Neon Elephant Award in 2011.

Relevant Websites

Mirjam and Paul’s book, Evidence-Informed Learning Design, was published only ten months ago but has already swept the world as a book critical to learning architects and learning executives in their efforts to build the most effective learning designs. In my book review earlier this year I wrote, “Mirjam Neelen and Paul Kirschner have written a truly beautiful book—one that everyone in the workplace learning field should read, study, and keep close at hand. It’s a book of transformational value because it teaches us how to think about our jobs as practitioners in utilizing research-informed ideas to build maximally effective learning architectures.”

Mirjam Neelen and Paul Kirschner are the kind of research translators we should honor and emulate in the workplace learning field. They are unafraid in seeking the truth, passionate in sharing research- and evidence-informed wisdom, dogged in compiling research from scientific journals, and thoughtful in making research ideas accessible to practitioners in our field. It is an honor to recognize Mirjam and Paul as this year’s winners of the Neon Elephant Award.

 

Click here to learn more about the Neon Elephant Award…

 

 

12th December 2019

Neon Elephant Award Announcement

Dr. Will Thalheimer, President of Work-Learning Research, Inc., announces the winner of the 2019 Neon Elephant Award, given to David Epstein for writing the book Range: Why Generalists Triumph in a Specialized World, and for his many years as a journalist and science-inspired truth teller.

Click here to learn more about the Neon Elephant Award…

 

2019 Award Winner – David Epstein

David Epstein is an award-winning writer and journalist, having won awards for his writing from such esteemed bodies as the National Academies of Sciences, Engineering, and Medicine; the Society of Professional Journalists; and the National Center on Disability and Journalism—and has been included in the Best American Science and Nature Writing anthology. David has been a science writer for ProPublica and a senior writer at Sports Illustrated, where he helped break the story on baseball legend Alex Rodriguez’s steroid use. David speaks internationally on performance science and the uses (and misuses) of data, and his TED talk on human athletic performance has been viewed over eight million times.

Mr. Epstein is the author of two books:

David is honored this year for his new book on human learning and development, Range: Why Generalists Triumph in a Specialized World. The book lays out a very strong case for why most people will become better performers if they focus broadly on their development rather than focusing tenaciously and exclusively on one domain. If we want to raise our children to be great soccer players (aka “football” in most places), we’d be better off having them play multiple sports rather than just soccer. If we want to develop the most innovative cancer researchers, we shouldn’t just train them in cancer-related biology and medicine, we should give them a wealth of information and experiences from a wide range of fields.

Range is a phenomenal piece of art and science. Epstein is truly brilliant in compiling and comprehending the science he reviews, while at the same time telling stories and organizing the book in ways that engage readers and make complex concepts understandable. In writing the book, David debunks the common wisdom that performance is improved most rapidly and effectively by focusing practice and learning on a single narrow domain. Where others have only hinted at the power of a broad developmental pathway, Epstein’s Range builds up a towering landmark of evidence that will remain visible on the horizon of the learning field for decades if not millennia.

We in the workplace learning-and-development field should immerse ourselves in Range—not just in thinking about how to design learning and architect learning contexts, but also in thinking about how to evaluate prospects for recruitment and hiring. It’s likely that we currently undervalue people with broad backgrounds and artificially overvalue people with extreme and narrow talents.

Here is a nice article where Epstein wrestles with a question that elucidates an issue we have in our field—what happens when many people in a field are not following research-based guidelines. The article is set in the medical profession, but there are definite parallels to what we face everyday in the learning field.

Epstein is the kind of person we should honor and emulate in the workplace learning field. He is unafraid in seeking the truth, relentless and seemingly inexhaustible in his research efforts, and clear and engaging as a conveyor of information. It is an honor to recognize him as this year’s winner of the Neon Elephant Award.

 

Click here to learn more about the Neon Elephant Award…

 

 

15th December 2018

Neon Elephant Award Announcement

Dr. Will Thalheimer, President of Work-Learning Research, Inc., announces the winner of the 2018 Neon Elephant Award, given to Clark Quinn for writing the book Millennials, Goldfish & Other Training Misconceptions: Debunking Learning Myths and Superstitions, and for his many years advocating for research-based practices in the workplace learning field.

Click here to learn more about the Neon Elephant Award…

 

2018 Award Winner – Clark Quinn, PhD

Clark Quinn, PhD, is an internationally-recognized consultant and thought-leader in learning technology and organizational learning. Dr. Quinn holds a doctorate in Cognitive Psychology from the University of California at San Diego. Since 2001, Clark has been consulting, researching, writing, and speaking through his consulting practice, Quinnovation (website). Clark has been at the forefront of some of the most important trends in workplace learning, including his early advocacy for mobile learning, his work with the Internet Time Group advocating for a greater emphasis on workplace learning, and his collaboration on the Serious eLearning Manifesto to bring research-based wisdom to elearning design. With the publication of his new book, Clark again shows leadership—now in the cause of debunking learning myths and misconceptions.

Clark is the author of numerous books, focusing not only on debunking learning myths, but also on the practice of learning and development and mobile learning. The following are representative:

In addition to his lifetime of work, Clark is honored for his new book on debunking the learning myths, Millennials, Goldfish & Other Training Misconceptions: Debunking Learning Myths and Superstitions.

Millennials, Goldfish & Other Training Misconceptions provides a quick overview of some of the most popular learning myths, misconceptions, and mistakes. The book is designed as a quick reference for practitioners—to help trainers, instructional designers, and elearning developers avoid wasting their efforts and their organizations’ resources in using faulty concepts. As I wrote in the book’s preface, “Clark Quinn has compiled, for the first time, the myths, misconceptions, and confusions that imbue the workplace learning field with faulty decision making and ineffective learning practices.”

When we think about how much time and money has been wasted by learning myths, when we consider the damage done to learners and organizations, when we acknowledge the harm done to the reputation of the learning profession, we can see how important it is to have a quick reference like Clark has provided.

Clark’s passion for good learning is always evident. From his strategic work with clients, to his practical recommendations around learning technology, to his polemic hyperbole in the revolution book, to his longstanding energy in critiquing industry frailties and praising great work, to his eLearning Guild participatory leadership, to his editorial board contributions at eLearn Magazine, and to his excellent new book, Clark is a kinetic force in the workplace learning field. For his research-inspired recommendations, for his tenacity in persevering as a thought-leader consultant, and for his ability to collaborate and share his wisdom, we in the learning field owe Clark Quinn our grateful thanks!

 

 

Click here to learn more about the Neon Elephant Award…

Updated July 3rd, 2018—a week after the original post. See end of post for the update, featuring Rob Brinkerhoff’s response.

Rob Brinkerhoff’s “Success Case Method” needs a subtle name change. I think a more accurate name would be the “Brinkerhoff Case Method.”

I’m one of Rob’s biggest fans, having selected him in 2008 as the Neon Elephant Award Winner for his evaluation work.

Thirty-five years ago, in 1983, Rob published an article where he introduced the “Success Case Method.” Here is a picture of the first page of that article:

In that article, the Success-Case Method was introduced as a way to find the value of training when it works. Rob wrote, “The success-case method does not purport to produce a balanced assessment of the total results of training. It does, however, attempt to answer the question: When training works, how well does it work?” (page 58, which is visible above).

The Success Case Method didn’t stand still. It evolved and improved as Rob refined it based on his research and his work with clients. In his landmark 2006 book detailing the methodology, Telling Training’s Story: Evaluation Made Simple, Credible, and Effective, Rob describes how to first survey learners and then sample some of them for interviews, selecting them based on their level of success in applying the training. “Once the sorting is complete, the next step is to select the interviewees from among the high and low success candidates, and perhaps from the middle categories.” (page 102).
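To make that sampling step concrete, here is a minimal sketch in Python of how survey respondents might be sorted into success strata and then sampled for interviews. The field names, ratings, and cutoffs are hypothetical illustrations of the general idea, not drawn from Rob’s materials.

import random

# Hypothetical survey records: each respondent self-rates (1-5) how
# successfully they applied the training. Names, fields, and cutoffs
# below are illustrative only.
respondents = [
    {"name": "A", "application_success": 5},
    {"name": "B", "application_success": 1},
    {"name": "C", "application_success": 3},
    {"name": "D", "application_success": 4},
    {"name": "E", "application_success": 2},
]

def sort_into_strata(survey):
    """Sort respondents into high-, middle-, and low-success groups."""
    strata = {"high": [], "middle": [], "low": []}
    for r in survey:
        score = r["application_success"]
        if score >= 4:
            strata["high"].append(r)
        elif score <= 2:
            strata["low"].append(r)
        else:
            strata["middle"].append(r)
    return strata

def select_interviewees(strata, n_per_group=2):
    """Sample interview candidates from the high- and low-success groups,
    and perhaps from the middle, as the method recommends."""
    selected = []
    for group in ("high", "low", "middle"):
        pool = strata[group]
        selected.extend(random.sample(pool, min(n_per_group, len(pool))))
    return selected

interviewees = select_interviewees(sort_into_strata(respondents))
print([r["name"] for r in interviewees])

The point of the sketch is simply that interviewees are drawn deliberately from both ends of the success distribution, and perhaps from the middle, not only from the successes.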

To call this the Success Case Method seems more aligned with the original naming than with the actual recommended practice. For that reason, I recommend that we simply call it the Brinkerhoff Case Method. This gives Rob the credit he deserves, and it more accurately reflects the rigor and balance of the method itself.

As soon as I posted the original post, I reached out to Rob Brinkerhoff to let him know. After some reflection, Rob wrote this and asked me to post it:

“Thank you for raising the issue of the currency of the name Success Case Method (SCM). It is kind of you to also think about identifying it more closely with my name. Your thoughts are not unlike others and on occasion even myself. 

It is true the SCM collects data from extreme portions of the respondent distribution including likely successes, non-successes, and ‘middling’ users of training. Digging into these different groups yields rich and useful information. 

Interestingly the original name I gave to the method some 40 years ago when I first started forging it was the “Pioneer” method since when we studied the impact of a new technology or innovation we felt we learned the most from the early adopters – those out ahead of the pack that tried out new things and blazed a trail for others to follow. I refined that name to a more familiar term but the concept and goal remained identical: accelerate the pace of change and learning by studying and documenting the work of those who are using it the most and the best. Their experience is where the gold is buried. 

Given that, I choose to stick with the “success” name. It expresses our overall intent: to nurture and learn from and drive more success. In a nutshell, this name expresses best not how we do it, but why we do it. 

Thanks again for your thoughtful reflections. We’re on the same page.”

Rob’s response is thoughtful, as usual. Yet my feelings on this remain steady. As I’ve written in my report on the new Learning-Transfer Evaluation Model (LTEM), our models should nudge appropriate actions. The same is true for the names we give things. Mining for success stories is good, but it has to be balanced. After all, if evaluation doesn’t look for the full truth—without putting a thumb on the scale—then we are not evaluating; we are doing something else.

I know Rob’s work. I know that he is not advocating for, nor does he engage in, unbalanced evaluations. I do fear that the name Success Case Method may give permission or unconsciously nudge lesser practitioners to find more success and less failure than is warranted by the facts.

Of course, the term “Success Case Method” has one brilliant advantage. Where people are hesitant to evaluate for fear of uncovering unpleasant results, the name “Success Case Method” may lessen the worry of moving forward and engaging in evaluation—and so it may actually enable the balanced evaluation that is necessary to uncover the truth of learning’s level of success.

Whatever we call it, the Success Case Method or the Brinkerhoff Case Method—and this is the most important point—it is one of the best learning-evaluation innovations in the past half century.

I also agree that since Rob is the creator, his voice should have the most influence in terms of what to call his invention.

I will end with one of my all-time favorite quotations from the workplace learning field, from Tim Mooney and Robert Brinkerhoff’s excellent book, Courageous Training:

“The goal of training evaluation is not to prove the value of training; the goal of evaluation is to improve the value of training.” (p. 94-95)

On this we should all agree!

Guy Wallace has been an exemplar of the highest quality in the performance-improvement field for decades. His 31-page bio is a testament to his incredible work experience. He has worked with other industry luminaries, including Dick Hanshaw, Geary Rummler, Dick Clark, and Dale Brethower. Not only has he been at the center of the move from training to performance—represented in the long arc of ISPI—he has also been capturing that history for years.

I highly recommend his video series.

The only blemish in that series is the video interview he released this week, featuring me. Legacy schmegacy! Seriously though, I am honored. Thank you Guy for all you do and have done!

And Guy’s still going strong in his work, offering optimal methodologies in performance analysis/assessment and curriculum architecture.

The Debunker Club, with over 600 members devoted to squashing the myths in the learning field, is offering a FREE webinar with noted author and learning guru Dr. Clark Quinn on myths and misconceptions, based on his new book, just released last month: Millennials, Goldfish & Other Training Misconceptions: Debunking Learning Myths and Superstitions (available from Amazon here).

DATE:

  • June 6th

TIME:

  • 10AM (San Francisco, USA)
  • 1PM (New York, USA)
  • 6PM (London, UK)
  • 10:30PM (Mumbai, India)
  • 3AM June 7th (Sydney, Australia)

REGISTER NOW:

Series of Four Interviews

I was recently interviewed by Jeffrey Dalto of Convergence Training. Jeffrey is a big fan of research-based practice. He did a great job compiling the interviews.

Click on the title of each one to read the interview:

An exhaustive new research study reveals that the backfire effect is not as prevalent as previous research once suggested. This is good news for debunkers, those who attempt to correct misconceptions. This may be good news for humanity as well. If we cannot reason from truth, if we cannot reliably correct our misconceptions, we as a species will certainly be diminished—weakened by realities we have not prepared ourselves to overcome. For those of us in the learning field, the removal of the backfire effect as an unbeatable Goliath is good news too. Perhaps we can correct the misconceptions about learning that every day wreak havoc on our learning designs, hurt our learners, push ineffective practices, and cause an untold waste of time and money spent chasing mythological learning memes.

 

 

The Backfire Effect

The backfire effect is a fascinating phenomenon. It occurs when a person is confronted with information that contradicts an incorrect belief that they hold. The surprising finding is that such attempts at persuading others with truthful information may actually make the believer believe the untruth even more strongly than if they hadn’t been confronted in the first place.

The term “backfire effect” was coined by Brendan Nyhan and Jason Reifler in a 2010 scientific article on political misperceptions. Their article caused an international sensation, both in the scientific community and in the popular press. At a time when dishonesty in politics seems to be at historically high levels, this is no surprise.

In their article, Nyhan and Reifler concluded:

“The experiments reported in this paper help us understand why factual misperceptions about politics are so persistent. We find that responses to corrections in mock news articles differ significantly according to subjects’ ideological views. As a result, the corrections fail to reduce misperceptions for the most committed participants. Even worse, they actually strengthen misperceptions among ideological subgroups in several cases.”

Subsequently, other researchers found similar backfire effects, and notable researchers working in the area (e.g., Lewandowsky) have expressed the rather fatalistic view that attempts at correcting misinformation were unlikely to work—that believers would not change their minds even in the face of compelling evidence.

 

Debunking the Myths in the Learning Field

As I have communicated many times, there are dozens of dangerously harmful myths in the learning field, including learning styles, neuroscience as fundamental to learning design, and the myth that “people remember 10% of what they read, 20% of what they hear, 30% of what they see…etc.” I even formed a group to confront these myths (The Debunker Club), although, and I must apologize, I have not had the time to devote to enabling our group to be more active.

The “backfire effect” was a direct assault on attempts to debunk myths in the learning field. Why bother if we would make no difference? If believers of untruths would continue to believe? If our actions to persuade would have a boomerang effect, causing false beliefs to be believed even more strongly? It was a leg-breaking, breath-taking finding. I wrote a set of recommendations to debunkers in the learning field on how best to be successful in debunking, but admittedly many of us, me included, were left feeling somewhat paralyzed by the backfire finding.

Ironically perhaps, I was not fully convinced. Indeed, some may think I suffered from my own backfire effect. In reviewing a scientific research review in 2017 on how to debunk, I implored that more research be done so we could learn more about how to debunk successfully, but I also argued that misinformation simply couldn’t be a permanent condition, that there was ample evidence to show that people could change their minds even on issues that they once believed strongly. Racist bigots have become voices for diversity. Homophobes have embraced the rainbow. Religious zealots have become agnostic. Lovers of technology have become anti-technology. Vegans have become paleo meat lovers. Devotees of Coke have switched to Pepsi.

The bottom line is that organizations waste millions of dollars every year when they use faulty information to guide their learning designs. As professionals in the learning field, it is our responsibility to avoid the danger of misinformation! But is this even possible?

 

The Latest Research Findings

There is good news in the latest research! Thomas Wood and Ethan Porter just published an article (2018) that could not find any evidence for a backfire effect. They replicated the Nyhan and Reifler research, expanded tenfold the number of misinformation instances studied, modified the wording of their materials, utilized over 10,000 participants, and varied their methods for obtaining those participants. Even so, they did not find any evidence for a backfire effect.

“We find that backfire is stubbornly difficult to induce, and is thus unlikely to be a characteristic of the public’s relationship to factual information. Overwhelmingly, when presented with factual information that corrects politicians—even when the politician is an ally—the average subject accedes to the correction and distances himself from the inaccurate claim.”

There is additional research to show that people can change their minds, that fact-checking can work, and that feedback can correct misconceptions. Rich and Zaragoza (2016) found that misinformation can be fixed with corrections. Rich, Van Loon, Dunlosky, and Zaragoza (2017) found that corrective feedback could work, if it was designed to be believed. More directly, Nyhan and Reifler (2016), in work cited by the American Press Institute Accountability Project, found that fact-checking can work to debunk misinformation.

 

Some Perspective

First of all, let’s acknowledge that science sometimes works slowly. We don’t yet know all we will know about these persuasion and information-correction effects.

Also, let’s please be careful to note that backfire effects, when they are actually evoked, are typically found in situations where people are ideologically inclined to a system of beliefs with which they strongly identify. Backfire effects have been studied most often in situations where someone identifies as a conservative or a liberal—when this identity is singularly or strongly important to their sense of self. Are folks in the learning field such strong believers in a system of beliefs and self-identity that they might easily suffer from the backfire effect? Maybe sometimes, but perhaps less likely than in the area of political belief, which seems to consume so many of us.

Here are some learning-industry beliefs that may be so deeply held that the light of truth may not penetrate easily:

  • Belief that learners know what is best for their learning.
  • Belief that learning is about conveying information.
  • Belief that we as learning professionals must kowtow to our organizational stakeholders, that we have no grounds to stand by our own principles.
  • Belief that our primary responsibility is to our organizations not our learners.
  • Belief that learner feedback is sufficient in revealing learning effectiveness.

These beliefs seem to undergird other beliefs, and I’ve seen in my work how they can make it difficult to convey important truths. Let me be clear, though: it is speculative on my part that these beliefs have substantial influence; this is a conjecture. Note also that, given that the research on the “backfire effect” has now been shown to be tenuous, I’m not claiming that fighting such foundational beliefs will cause damage. On the contrary, it seems like it might be worth doing.

 

Knowledge May Be Modifiable, But Attitudes and Belief Systems May Be Harder to Change

The original backfire-effect research suggested that people believed untruths more strongly when confronted with correct information, but this misses an important distinction. There are facts, and then there are attitudes, belief systems, and policy preferences.

A fascinating thing happened when Wood and Porter looked for—but didn’t find—the backfire effect. They talked with the original researchers, Nyhan and Reifler, and they began working together to solve the mystery. Why did the backfire effect happen sometimes but not regularly?

In a recent episode (January 28, 2018) of the “You Are Not So Smart” podcast, Wood, Porter, and Nyhan were interviewed by David McRaney, and they nicely clarified the distinction between factual backfire and attitudinal backfire.

Nyhan:

“People often focus on changing factual beliefs with the assumption that it will have consequences for the opinions people hold, or the policy preferences that they have, but we know from lots of social science research…that people can change their factual beliefs and it may not have an effect on their opinions at all.”

“The fundamental misconception here is that people use facts to form opinions and in practice that’s not how we tend to do it as human beings. Often we are marshaling facts to defend a particular opinion that we hold and we may be willing to discard a particular factual belief without actually revising the opinion that we’re using it to justify.”

Porter:

“Factual backfire, if it exists, would be especially worrisome, right? I don’t really believe we are going to find it anytime soon… Attitudinal backfire is less worrisome, because in some ways attitudinal backfire is just another description for failed persuasion attempts… that doesn’t mean that it’s impossible to change your attitude. That may very well just mean that what I’ve done to change your attitude has been a failure. It’s not that everyone is immune to persuasion, it’s just that persuasion is really, really hard.”

McRaney (Podcast Host):

“And so the facts suggest that the facts do work, and you absolutely should keep correcting people’s misinformation because people do update their beliefs and that’s important, but when we try to change people’s minds by only changing their [factual] beliefs, you can expect to end up engaging in belief whack-a-mole, correcting bad beliefs left and right as the person on the other side generates new ones to support, justify, and protect the deeper psychological foundations of the self.”

Nyhan:

“True backfire effects, when people are moving overwhelmingly in the opposite direction, are probably very rare, they are probably on issues where people have very strong fixed beliefs….”

 

Rise Up! Debunk!

Here’s the takeaway for us in the learning field who want to be helpful in moving practice to more effective approaches.

  • While there may be some underlying beliefs that influence thinking in the learning field, they are unlikely to be as strongly held as the political beliefs that researchers have studied.
  • The research seems fairly clear that factual backfire effects are extremely unlikely to occur, so we should not be afraid to debunk factual inaccuracies.
  • Persuasion is difficult but not impossible, so it is worth making attempts to debunk. Such attempts are likely to be more effective if we take a change-management approach, look to the science of persuasion, and persevere respectfully and persistently over time.

Here is the message that one of the researchers, Tom Wood, wants to convey:

“I want to affirm people. Keep going out and trying to provide facts in your daily lives and know that the facts definitely make some difference…”

Here are some methods of persuasion from a recent article by Flynn, Nyhan, and Reifler (2017) that have worked even with people’s strongly-held beliefs:

  • When the persuader is seen to be ideologically sympathetic with those who might be persuaded.
  • When the correct information is presented in a graphical form rather than a textual form.
  • When an alternative causal account of the original belief is offered.
  • When credible or professional fact-checkers are utilized.
  • When multiple “related stories” are also encountered.

The stakes are high! Bad information permeates the learning field and makes our learning interventions less effective, harming our learners and our organizations while wasting untold resources.

We owe it to our organizations, our colleagues, and our fellow citizens to debunk bad information when we encounter it!

Let’s not be assholes about it! Let’s do it with respect, with openness to being wrong, and with all our persuasive wisdom. But let’s do it. It’s really important that we do!

 

Research Cited

Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32(2), 303–330.

Nyhan, B., & Reifler, J. (2016). Do people actually learn from fact-checking? Evidence from a longitudinal study during the 2014 campaign. Available at: www.dartmouth.edu/~nyhan/fact-checking-effects.pdf

Rich, P. R., Van Loon, M. H., Dunlosky, J., & Zaragoza, M. S. (2017). Belief in corrective feedback for common misconceptions: Implications for knowledge revision. Journal of Experimental Psychology: Learning, Memory, and Cognition, 43(3), 492–501.

Rich, P. R., & Zaragoza, M. S. (2016). The continued influence of implied and explicitly stated misinformation in news reports. Journal of Experimental Psychology: Learning, Memory, and Cognition, 42(1), 62–74. http://dx.doi.org/10.1037/xlm0000155

Wood, T., & Porter, E. (2018). The elusive backfire effect: Mass attitudes’ steadfast factual adherence. Political Behavior, Advance Online Publication.