Conferences can be beautiful things—helping us learn, building relationships that help us grow and bring us joy, prompting us to see patterns in our industry we might miss otherwise, helping us set our agenda for what we need to learn more fully.
Conferences can be ugly things—teaching us myths, reinforcing our misconceptions, connecting us to people who steer us toward misinformation, immersing us in echo chambers of bad thinking and vendor-infested shark tanks that lead us to buy products that are unhelpful or actually harmful, and pushing us to set our learning agenda on topics that distract us from what's really important.
Given this dual reality, your job as a conference attendee is to be smart and skeptical, and work to validate your learning. In the Training Maximizers model, the first goal is ensuring our learning interventions are built from a base of “valid, credible content.” In conferences, where we curate our own learning, we have to be sure we are imbibing the good stuff and avoiding the poison. Here, I’ll highlight a few things to keep in mind as you attend a conference. I’ll aim to make this especially relevant for this year, 2018, when you are likely to encounter certain memes and themes.
Drinking the Good Stuff
- Look for speakers who have a background in two things: (1) studying the scientific research (not opinion research), and (2) working with real-world learning professionals to implement research-based practices.
- If speakers make statements without evidence, ask for the evidence or the research—or be highly skeptical.
- If things seem almost too good to be true, warn yourself that learning is complicated and there are no magic solutions.
- Be careful not to get sucked into group-think. Just because others seem to like something doesn't necessarily make it good. Think for yourself.
- Remember that correlation does not mean causation. Just because some factors seem to move in the same direction doesn’t mean that one caused the other. It could be the other way around. Or some third factor may have caused both to move in the same direction.
Prepare Yourself for This Year’s Shiny Objects
- Learning Styles — Learning styles is bogus, but it keeps coming up every year. Don't buy into it—learn about it first. The Debunker.Club has a nice post on why we should avoid learning styles. Read it. And don't let people tell you that learning styles are bad but learning preferences are good. They're pulling the wool over your eyes.
- Dale's Cone with Percentages — People do NOT remember 10% of what they read, 20% of what they hear, 30% of what they see (or anything similar). Here's the Internet's #1 URL debunking this silly myth.
- Neuroscience and Learning — It's a very hot topic, with vendors touting neuroscience to impress you. But at this time, neuroscience has nothing to say about learning design.
- Microlearning — Because it's a hot topic, vendors and consultants are yapping about microlearning endlessly. But microlearning is not one thing—it's many things. Here's the definitive definition of microlearning, if I do say so myself.
- AI, Machine Learning, and Big Data — Sexy stuff certainly, but it’s not clear whether these things can be applied to learning, or whether they can be applied now (given the state of our knowledge). Beware of taking these claims too seriously. Be open, but skeptical.
- Gamification — We are almost over this fad, thankfully. Still, keep in mind that gamification, like microlearning, comprises multiple learning methods. Gamification is NOT one thing.
- Personalization — Personalization is a great idea, if carried out properly. Be careful: what someone calls personalization may be just another way of saying learning styles. Also, don't buy into the idea that personalization is new. It's quite old. See the work of Skinner and Keller in the mid-1900s.
- Learning Analytics — There is a lot of movement in learning evaluation, but much of it is a wrong-headed focus on pretty dashboards or an exclusive focus on business impact. Look for folks who are talking about how to get better feedback to make learning better. I'll tout my own effort to develop a new approach to gathering learner feedback—but beware, and do NOT just do smile sheets (says the guy who wrote a book on smile sheets)! Beware of vendors telling you to focus only on measuring behavior and business results. Read why here.
- Kirkpatrick-Katzell Four-Level Model of Evaluation — A constant in the workplace learning field for the past 60 years. But even with recent changes, it still has too many problems to be worthwhile. See the new Learning-Transfer Evaluation Model (LTEM), a worthy replacement.
Wow! So much to be worried about.
Well, sorry to say, I am surely missing some stuff. It's up to you to be smart and skeptical while staying open to new ideas.
You might consider joining the Debunker Club, folks who have agreed on the importance of debunking myths in the learning field.
An exhaustive new research study reveals that the backfire effect is not as prevalent as previous research once suggested. This is good news for debunkers, those who attempt to correct misconceptions. This may be good news for humanity as well. If we cannot reason from truth, if we cannot reliably correct our misconceptions, we as a species will certainly be diminished—weakened by realities we have not prepared ourselves to overcome. For those of us in the learning field, the removal of the backfire effect as an unbeatable Goliath is good news too. Perhaps we can correct the misconceptions about learning that every day wreak havoc on our learning designs, hurt our learners, push ineffective practices, and cause an untold waste of time and money spent chasing mythological learning memes.
The Backfire Effect
The backfire effect is a fascinating phenomenon. It occurs when a person is confronted with information that contradicts an incorrect belief they hold. The surprising finding is that attempts to persuade others with truthful information may actually make them believe the untruth even more strongly than if they hadn't been confronted in the first place.
The term “backfire effect” was coined by Brendan Nyhan and Jason Reifler in a 2010 scientific article on political misperceptions. Their article caused an international sensation, both in the scientific community and in the popular press. At a time when dishonesty in politics seems to be at historically high levels, this is no surprise.
In their article, Nyhan and Reifler concluded:
“The experiments reported in this paper help us understand why factual misperceptions about politics are so persistent. We find that responses to corrections in mock news articles differ significantly according to subjects’ ideological views. As a result, the corrections fail to reduce misperceptions for the most committed participants. Even worse, they actually strengthen misperceptions among ideological subgroups in several cases.”
Subsequently, other researchers found similar backfire effects, and notable researchers working in the area (e.g., Lewandowsky) have expressed the rather fatalistic view that attempts at correcting misinformation were unlikely to work—that believers would not change their minds even in the face of compelling evidence.
Debunking the Myths in the Learning Field
As I have communicated many times, there are dozens of dangerously harmful myths in the learning field, including learning styles, neuroscience as fundamental to learning design, and the myth that “people remember 10% of what they read, 20% of what they hear, 30% of what they see…etc.” I even formed a group to confront these myths (The Debunker Club), although, and I must apologize, I have not had the time to devote to enabling our group to be more active.
The “backfire effect” was a direct assault on attempts to debunk myths in the learning field. Why bother if we would make no difference? If believers of untruths would continue to believe? If our actions to persuade would have a boomerang effect, causing false beliefs to be believed even more strongly? It was a leg-breaking, breath-taking finding. I wrote a set of recommendations to debunkers in the learning field on how best to be successful in debunking, but admittedly many of us, me included, were left feeling somewhat paralyzed by the backfire finding.
Ironically perhaps, I was not fully convinced. Indeed, some may think I suffered from my own backfire effect. In reviewing a scientific research review in 2017 on how to debunk, I implored that more research be done so we could learn more about how to debunk successfully, but I also argued that misinformation simply couldn’t be a permanent condition, that there was ample evidence to show that people could change their minds even on issues that they once believed strongly. Racist bigots have become voices for diversity. Homophobes have embraced the rainbow. Religious zealots have become agnostic. Lovers of technology have become anti-technology. Vegans have become paleo meat lovers. Devotees of Coke have switched to Pepsi.
The bottom line is that organizations waste millions of dollars every year when they use faulty information to guide their learning designs. As professionals in the learning field, it is our responsibility to avoid the danger of misinformation! But is this even possible?
The Latest Research Findings
There is good news in the latest research! Thomas Wood and Ethan Porter just published an article (2018) reporting that they could not find any evidence for a backfire effect. They replicated the Nyhan and Reifler research, expanded tenfold the number of misinformation instances studied, modified the wording of their materials, utilized over 10,000 participants, and varied their methods for recruiting those participants. Across all of this, they found no evidence of a backfire effect.
“We find that backfire is stubbornly difficult to induce, and is thus unlikely to be a characteristic of the public’s relationship to factual information. Overwhelmingly, when presented with factual information that corrects politicians—even when the politician is an ally—the average subject accedes to the correction and distances himself from the inaccurate claim.”
There is additional research to show that people can change their minds, that fact-checking can work, that feedback can correct misconceptions. Rich and Zaragoza (2016) found that misinformation can be fixed with corrections. Rich, Van Loon, Dunlosky, and Zaragoza (2017) found that corrective feedback could work, if it was designed to be believed. More directly, Nyhan and Reifler (2016), in work cited by the American Press Institute Accountability Project, found that fact checking can work to debunk misinformation.
First of all, let’s acknowledge that science sometimes works slowly. We don’t yet know all we will know about these persuasion and information-correction effects.
Also, let's please be careful to note that backfire effects, when they are actually evoked, are typically found where people are ideologically committed to a system of beliefs with which they strongly identify. Backfire effects have been studied mostly in situations where someone identifies as a conservative or liberal—where this identity is singularly or strongly important to their sense of self. Are folks in the learning field such strong believers in a system of beliefs and self-identity that they would easily suffer from the backfire effect? Maybe sometimes, but probably less often than in the realm of political belief, which seems to consume many of us.
Here are some learning-industry beliefs that may be so deeply held that the light of truth may not penetrate easily:
- Belief that learners know what is best for their learning.
- Belief that learning is about conveying information.
- Belief that we as learning professionals must kowtow to our organizational stakeholders, that we have no grounds to stand by our own principles.
- Belief that our primary responsibility is to our organizations, not our learners.
- Belief that learner feedback is sufficient in revealing learning effectiveness.
These beliefs seem to undergird other beliefs, and I've seen in my work how they can make it difficult to convey important truths. Let me be clear: it is speculative on my part—a conjecture—that these beliefs have substantial influence. Note also that, given that the research on the "backfire effect" has now been shown to be tenuous, I'm not claiming that challenging such foundational beliefs will cause damage. On the contrary, it seems like it might be worth doing.
Knowledge May Be Modifiable, But Attitudes and Belief Systems May Be Harder to Change
The original backfire research suggested that people believed falsehoods more strongly after being confronted with correct information, but that framing misses an important distinction. There are facts, and then there are attitudes, belief systems, and policy preferences.
A fascinating thing happened when Wood and Porter looked for—but didn’t find—the backfire effect. They talked with the original researchers, Nyhan and Reifler, and they began working together to solve the mystery. Why did the backfire effect happen sometimes but not regularly?
In a recent episode (January 28, 2018) of the "You Are Not So Smart" podcast, Wood, Porter, and Nyhan were interviewed by David McRaney, and they nicely clarified the distinction between factual backfire and attitudinal backfire.
“People often focus on changing factual beliefs with the assumption that it will have consequences for the opinions people hold, or the policy preferences that they have, but we know from lots of social science research…that people can change their factual beliefs and it may not have an effect on their opinions at all.”
“The fundamental misconception here is that people use facts to form opinions and in practice that’s not how we tend to do it as human beings. Often we are marshaling facts to defend a particular opinion that we hold and we may be willing to discard a particular factual belief without actually revising the opinion that we’re using it to justify.”
“Factual backfire, if it exists, would be especially worrisome, right? I don’t really believe we are going to find it anytime soon… Attitudinal backfire is less worrisome, because in some ways attitudinal backfire is just another description for failed persuasion attempts… that doesn’t mean that it’s impossible to change your attitude. That may very well just mean that what I’ve done to change your attitude has been a failure. It’s not that everyone is immune to persuasion, it’s just that persuasion is really, really hard.”
McRaney (Podcast Host):
“And so the facts suggest that the facts do work, and you absolutely should keep correcting people’s misinformation because people do update their beliefs and that’s important, but when we try to change people’s minds by only changing their [factual] beliefs, you can expect to end up, and engaging in, belief whack-a-mole, correcting bad beliefs left and right as the person on the other side generates new ones to support, justify, and protect the deeper psychological foundations of the self.”
“True backfire effects, when people are moving overwhelmingly in the opposite direction, are probably very rare, they are probably on issues where people have very strong fixed beliefs….”
Rise Up! Debunk!
Here’s the takeaway for us in the learning field who want to be helpful in moving practice to more effective approaches.
- While there may be some underlying beliefs that influence thinking in the learning field, they are unlikely to be as strongly held as the political beliefs that researchers have studied.
- The research seems fairly clear that factual backfire effects are extremely unlikely to occur, so we should not be afraid to debunk factual inaccuracies.
- Persuasion is difficult but not impossible, so it is worth making attempts to debunk. Such attempts are likely to be more effective if we take a change-management approach, look to the science of persuasion, and persevere respectfully and persistently over time.
Here is the message that one of the researchers, Tom Wood, wants to convey:
“I want to affirm people. Keep going out and trying to provide facts in your daily lives and know that the facts definitely make some difference…”
Here are some methods of persuasion from a recent article by Flynn, Nyhan, and Reifler (2017) that have worked even with people’s strongly-held beliefs:
- When the persuader is seen to be ideologically sympathetic with those who might be persuaded.
- When the correct information is presented in a graphical form rather than a textual form.
- When an alternative causal account of the original belief is offered.
- When credible or professional fact-checkers are utilized.
- When multiple “related stories” are also encountered.
The stakes are high! Bad information permeates the learning field and makes our learning interventions less effective, harming our learners and our organizations while wasting untold resources.
We owe it to our organizations, our colleagues, and our fellow citizens to debunk bad information when we encounter it!
Let’s not be assholes about it! Let’s do it with respect, with openness to being wrong, and with all our persuasive wisdom. But let’s do it. It’s really important that we do!
Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32(2), 303–330.
The Debunker Club, formed to fight myths and misconceptions in the learning field, is currently seeking public comment on the possibility that so-called neuroscience-based recommendations for learning and education are premature, untenable, or invalid.