The Backfire Effect is NOT Prevalent: Good News for Debunkers, Humans, and Learning Professionals!


An exhaustive new research study reveals that the backfire effect is not as prevalent as previous research once suggested. This is good news for debunkers, those who attempt to correct misconceptions. This may be good news for humanity as well. If we cannot reason from truth, if we cannot reliably correct our misconceptions, we as a species will certainly be diminished—weakened by realities we have not prepared ourselves to overcome. For those of us in the learning field, the removal of the backfire effect as an unbeatable Goliath is good news too. Perhaps we can correct the misconceptions about learning that every day wreak havoc on our learning designs, hurt our learners, push ineffective practices, and cause an untold waste of time and money spent chasing mythological learning memes.

 

 

The Backfire Effect

The backfire effect is a fascinating phenomenon. It occurs when a person is confronted with information that contradicts an incorrect belief they hold. The surprising finding is that such attempts at persuading others with truthful information may actually make believers believe the untruth even more strongly than if they had never been confronted in the first place.

The term “backfire effect” was coined by Brendan Nyhan and Jason Reifler in a 2010 scientific article on political misperceptions. Their article caused an international sensation, both in the scientific community and in the popular press. At a time when dishonesty in politics seems to be at historically high levels, this is no surprise.

In their article, Nyhan and Reifler concluded:

“The experiments reported in this paper help us understand why factual misperceptions about politics are so persistent. We find that responses to corrections in mock news articles differ significantly according to subjects’ ideological views. As a result, the corrections fail to reduce misperceptions for the most committed participants. Even worse, they actually strengthen misperceptions among ideological subgroups in several cases.”

Subsequently, other researchers found similar backfire effects, and notable researchers working in the area (e.g., Lewandowsky) have expressed the rather fatalistic view that attempts at correcting misinformation were unlikely to work—that believers would not change their minds even in the face of compelling evidence.

 

Debunking the Myths in the Learning Field

As I have communicated many times, there are dozens of dangerously harmful myths in the learning field, including learning styles, neuroscience as fundamental to learning design, and the myth that “people remember 10% of what they read, 20% of what they hear, 30% of what they see…etc.” I even formed a group to confront these myths (The Debunker Club), although, and I must apologize, I have not had the time to devote to enabling our group to be more active.

The “backfire effect” was a direct assault on attempts to debunk myths in the learning field. Why bother if we would make no difference? If believers of untruths would continue to believe? If our actions to persuade would have a boomerang effect, causing false beliefs to be believed even more strongly? It was a leg-breaking, breath-taking finding. I wrote a set of recommendations to debunkers in the learning field on how best to be successful in debunking, but admittedly many of us, me included, were left feeling somewhat paralyzed by the backfire finding.

Ironically perhaps, I was not fully convinced. Indeed, some may think I suffered from my own backfire effect. In reviewing a scientific research review in 2017 on how to debunk, I implored that more research be done so we could learn more about how to debunk successfully, but I also argued that misinformation simply couldn’t be a permanent condition, that there was ample evidence to show that people could change their minds even on issues that they once believed strongly. Racist bigots have become voices for diversity. Homophobes have embraced the rainbow. Religious zealots have become agnostic. Lovers of technology have become anti-technology. Vegans have become paleo meat lovers. Devotees of Coke have switched to Pepsi.

The bottom line is that organizations waste millions of dollars every year when they use faulty information to guide their learning designs. As professionals in the learning field, it is our responsibility to avoid the dangers of misinformation! But is this even possible?

 

The Latest Research Findings

There is good news in the latest research! Thomas Wood and Ethan Porter have just published an article (2018) in which they could not find any evidence for a backfire effect. They replicated the Nyhan and Reifler research, expanded tenfold the number of misinformation instances studied, modified the wording of their materials, utilized over 10,000 participants, and varied their methods for obtaining those participants. Still, they found no evidence for a backfire effect.

“We find that backfire is stubbornly difficult to induce, and is thus unlikely to be a characteristic of the public’s relationship to factual information. Overwhelmingly, when presented with factual information that corrects politicians—even when the politician is an ally—the average subject accedes to the correction and distances himself from the inaccurate claim.”

There is additional research to show that people can change their minds, that fact-checking can work, that feedback can correct misconceptions. Rich and Zaragoza (2016) found that misinformation can be fixed with corrections. Rich, Van Loon, Dunlosky, and Zaragoza (2017) found that corrective feedback could work if it was designed to be believed. More directly, Nyhan and Reifler (2016), in work cited by the American Press Institute Accountability Project, found that fact-checking can work to debunk misinformation.

 

Some Perspective

First of all, let’s acknowledge that science sometimes works slowly. We don’t yet know all we will know about these persuasion and information-correction effects.

Also, let’s be careful to note that backfire effects, when they are actually evoked, are typically found in situations where people are ideologically committed to a system of beliefs with which they strongly identify. Backfire effects have been studied most often in situations where someone identifies as a conservative or a liberal, and where that identity is singularly or strongly important to their sense of self. Are folks in the learning field such strong believers in a system of beliefs and self-identity that they would easily suffer from the backfire effect? Maybe sometimes, but perhaps less likely than in the area of political belief, which seems to consume so many of us.

Here are some learning-industry beliefs that may be so deeply held that the light of truth may not penetrate easily:

  • Belief that learners know what is best for their learning.
  • Belief that learning is about conveying information.
  • Belief that we as learning professionals must kowtow to our organizational stakeholders, that we have no grounds to stand by our own principles.
  • Belief that our primary responsibility is to our organizations not our learners.
  • Belief that learner feedback is sufficient in revealing learning effectiveness.

These beliefs seem to undergird other beliefs, and I’ve seen in my work how they can make it difficult to convey important truths. Let me be clear, though: it is speculative on my part that these beliefs have substantial influence; it is a conjecture. Note also that, given that the research on the “backfire effect” has now been shown to be tenuous, I’m not claiming that challenging such foundational beliefs will cause damage. On the contrary, it seems like it might be worth doing.

 

Knowledge May Be Modifiable, But Attitudes and Belief Systems May Be Harder to Change

The original backfire-effect research suggested that people believed falsehoods even more strongly after being confronted with correcting information, but that framing misses an important distinction. There are facts, and there are attitudes, belief systems, and policy preferences.

A fascinating thing happened when Wood and Porter looked for—but didn’t find—the backfire effect. They talked with the original researchers, Nyhan and Reifler, and they began working together to solve the mystery. Why did the backfire effect happen sometimes but not regularly?

In a recent episode (January 28, 2018) of the “You Are Not So Smart” podcast, Wood, Porter, and Nyhan were interviewed by David McRaney, and they nicely clarified the distinction between factual backfire and attitudinal backfire.

Nyhan:

“People often focus on changing factual beliefs with the assumption that it will have consequences for the opinions people hold, or the policy preferences that they have, but we know from lots of social science research…that people can change their factual beliefs and it may not have an effect on their opinions at all.”

“The fundamental misconception here is that people use facts to form opinions and in practice that’s not how we tend to do it as human beings. Often we are marshaling facts to defend a particular opinion that we hold and we may be willing to discard a particular factual belief without actually revising the opinion that we’re using it to justify.”

Porter:

“Factual backfire if it exists would be especially worrisome, right? I don’t really believe we are going to find it anytime soon… Attitudinal backfire is less worrisome, because in some ways attitudinal backfire is just another description for failed persuasion attempts… that doesn’t mean that it’s impossible to change your attitude. That may very well just mean that what I’ve done to change your attitude has been a failure. It’s not that everyone is immune to persuasion, it’s just that persuasion is really, really hard.”

McRaney (Podcast Host):

“And so the facts suggest that the facts do work, and you absolutely should keep correcting people’s misinformation because people do update their beliefs and that’s important, but when we try to change people’s minds by only changing their [factual] beliefs, you can expect to end up, and engaging in, belief whack-a-mole, correcting bad beliefs left and right as the person on the other side generates new ones to support, justify, and protect the deeper psychological foundations of the self.”

Nyhan:

“True backfire effects, when people are moving overwhelmingly in the opposite direction, are probably very rare, they are probably on issues where people have very strong fixed beliefs….”

 

Rise Up! Debunk!

Here’s the takeaway for us in the learning field who want to be helpful in moving practice to more effective approaches.

  • While there may be some underlying beliefs that influence thinking in the learning field, they are unlikely to be as strongly held as the political beliefs that researchers have studied.
  • The research seems fairly clear that factual backfire effects are extremely unlikely to occur, so we should not be afraid to debunk factual inaccuracies.
  • Persuasion is difficult but not impossible, so it is worth making attempts to debunk. Such attempts are likely to be more effective if we take a change-management approach, look to the science of persuasion, and persevere respectfully and persistently over time.

Here is the message that one of the researchers, Tom Wood, wants to convey:

“I want to affirm people. Keep going out and trying to provide facts in your daily lives and know that the facts definitely make some difference…”

Here are some methods of persuasion from a recent article by Flynn, Nyhan, and Reifler (2017) that have worked even with people’s strongly-held beliefs:

  • When the persuader is seen to be ideologically sympathetic with those who might be persuaded.
  • When the correct information is presented in a graphical form rather than a textual form.
  • When an alternative causal account of the original belief is offered.
  • When credible or professional fact-checkers are utilized.
  • When multiple “related stories” are also encountered.

The stakes are high! Bad information permeates the learning field and makes our learning interventions less effective, harming our learners and our organizations while wasting untold resources.

We owe it to our organizations, our colleagues, and our fellow citizens to debunk bad information when we encounter it!

Let’s not be assholes about it! Let’s do it with respect, with openness to being wrong, and with all our persuasive wisdom. But let’s do it. It’s really important that we do!

 

Research Cited

Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32(2), 303–330.

Nyhan, B., & Reifler, J. (2016). Do people actually learn from fact-checking? Evidence from a longitudinal study during the 2014 campaign. Available at: www.dartmouth.edu/~nyhan/fact-checking-effects.pdf

Rich, P. R., Van Loon, M. H., Dunlosky, J., & Zaragoza, M. S. (2017). Belief in corrective feedback for common misconceptions: Implications for knowledge revision. Journal of Experimental Psychology: Learning, Memory, and Cognition, 43(3), 492-501.

Rich, P. R., & Zaragoza, M. S. (2016). The continued influence of implied and explicitly stated misinformation in news reports. Journal of Experimental Psychology: Learning, Memory, and Cognition, 42(1), 62-74. http://dx.doi.org/10.1037/xlm0000155

Wood, T., & Porter, E. (2018). The elusive backfire effect: Mass attitudes’ steadfast factual adherence. Political Behavior, advance online publication.

 

Learning Styles Notion Still Prevalent on Google


Two and a half years ago, in writing a blog post on learning styles, I did a Google search using the words “learning styles.” I found that the top 17 search items were all advocating for learning styles, even though there was clear evidence that learning-styles approaches DO NOT WORK.

Today, I replicated that search and found the following in the top 17 search items:

  • 13 advocated/supported the learning-styles idea.
  • 4 debunked it.

That’s progress, but clearly Google is not up to the task of providing valid information on learning styles.

Scientific Research that clearly Debunks the Learning-Styles Notion:

  • Kirschner, P. A. (2017) Stop propagating the learning styles myth. Computers & Education, 106, 166-171.
  • Willingham, D. T., Hughes, E. M., & Dobolyi, D. G. (2015). The scientific status of learning styles theories. Teaching of Psychology, 42(3), 266-271.
  • Pashler, H., McDaniel, M., Rohrer, D., & Bjork, R. (2008). Learning styles: Concepts and evidence. Psychological Science in the Public Interest, 9(3), 105-119.
  • Rohrer, D., & Pashler, H. (2012). Learning styles: Where’s the evidence? Medical Education, 46(7), 634-635.

Follow the Money

  • Still no one has come forward to prove the benefits of learning styles, even though it’s been over 10 years since $1,000 was offered, and 3 years since $5,000 was offered.

Recording of Webinar — On Transfer Research for 2018


Holy Cow Batman! Yesterday’s Webinar, which I co-hosted with Emma Weber of Lever Learning, was overbooked and some people were unable to connect. To help make amends, here is the recording of the webinar:

 

 

Click Here to View Webinar on YouTube

 

Apologies that we were not able to record the actual polling results (the responses of those who attended live to the questions we asked). Still, I think it’s pretty good as webinar recordings go.

Emma and I send our heartfelt apologies. We know some of you notified your teams, changed your schedules, and stayed up late or stayed late at work to watch. We are considering offering an encore engagement in January for those who might want to participate more intimately than a recording can provide. Watch this blog for details or sign up for my list to be notified.

Learner-Feedback Current Practices Survey 2017-2018


New Meta-Analysis on Debunking — Still an Unclear Way to Potency


A new meta-analysis on debunking was released last week, and I was hoping to get clear guidelines on how to debunk misinformation. Unfortunately, the science still seems somewhat equivocal about how to debunk. Either that, or there’s just no magic bullet.

Let’s break this down. We all know misinformation exists. People lie, people get confused and share bad information, people don’t vet their sources, incorrect information is easily spread, et cetera. Debunking is the act of providing information or inducing interactions intended to correct misinformation.

Misinformation is a huge problem in the world today, especially in our political systems. Democracy is difficult if political debate and citizen conversations are infused with bad information. Misinformation is also a huge problem for citizens themselves and for organizations. People who hear false health-related information can make themselves sick. Organizations whose employees make decisions based on bad information can hurt their bottom line.

In the workplace learning field, there’s a ton of misinformation that has incredibly damaging effects. People believe in the witchcraft of learning styles, neuroscience snake oil, traditional smile sheets, and all kinds of bogus information.

It would be nice if misinformation could be easily thwarted, but too often it lingers. For example, the idea that people remember 10% of what they read, 20% of what they hear, 30% of what they see, etc., has been around since 1913 if not before, but it still gets passed around every year on bastardized versions of Dale’s Cone.

A meta-analysis is a scientific study that compiles many other scientific studies using advanced statistical procedures to enable overall conclusions to be drawn. The study I reviewed (the one that was made available online last week) is:

Chan, M. S., Jones, C. R., Jamieson, K. H., & Albarracin, D. (2017). Debunking: A meta-analysis of the psychological efficacy of messages countering misinformation. Psychological Science, early online publication (print page numbers not yet determined). Available here (if you have journal access: http://journals.sagepub.com/doi/10.1177/0956797617714579).

This study compiled scientific studies that:

  1. First presented people with misinformation (except a control group that got no misinformation).
  2. Then presented them with a debunking procedure.
  3. Then looked at what effect the debunking procedure had on people’s beliefs.

There are three types of effects examined in the study:

  1. Misinformation effect = Difference between the group that just got misinformation and a control group that didn’t get misinformation. This determined how much the misinformation hurt.
  2. Debunking effect = Difference between the group that just got misinformation and a group that got misinformation and later debunking. This determined how much debunking could lessen the effects of the misinformation.
  3. Misinformation-Persistence effect = Difference between the group that got misinformation-and-debunking and the control group that didn’t get misinformation. This determined how much debunking could fully reverse the effects of the misinformation.
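To make these three contrasts concrete, here is a minimal sketch in Python (my own illustration, not code or data from the meta-analysis) showing how each effect is computed from the average belief-in-misinformation ratings of the three groups. The group means are invented purely to show the arithmetic.

    # Hypothetical mean belief-in-misinformation ratings (higher = more misinformed).
    control_only = 2.0       # no misinformation, no debunking
    misinfo_only = 4.5       # misinformation, no debunking
    misinfo_debunked = 3.0   # misinformation followed by a debunking message

    # 1. Misinformation effect: how much the misinformation hurt.
    misinformation_effect = misinfo_only - control_only       # 2.5

    # 2. Debunking effect: how much debunking lessened the misinformation.
    debunking_effect = misinfo_only - misinfo_debunked        # 1.5

    # 3. Misinformation-persistence effect: what debunking failed to undo.
    persistence_effect = misinfo_debunked - control_only      # 1.0

    print(misinformation_effect, debunking_effect, persistence_effect)

Read this way, a study can show a healthy debunking effect while the persistence effect stays above zero, which is exactly why debunking can “work” and still leave people partly misinformed.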

They looked at three sets of factors.

First, the study examined what happens when people encounter misinformation. They found that the more people thought of explanations for the false information, the more they would believe this misinformation later, even in the face of debunking. From a practical standpoint then, if people are receiving misinformation, we should hope they don’t think too deeply about it. Of course, this is largely out of our control as learning practitioners, because people come to us after they’ve gotten misinformation. On the other hand, it may provide hints for us as we use knowledge management or social media. The research findings suggest that we might need to intervene immediately when bad information is encountered to prevent people from elaborating on the misinformation.

Second, the meta-analysis examined whether debunking messages that included procedures to induce people to make counter-arguments to the misinformation would outperform debunking messages that did not include such procedures (or that included less potent counter-argument-inducing procedures). They found consistent benefits: these counter-argument-inducing procedures helped reduce misinformation. This suggests strongly that debunking should induce counter-arguments to the misinformation. And though specific mechanisms for doing this may be difficult to design, it is probably not enough to present the counter-arguments ourselves; we need to get our learners to process those counter-arguments themselves, to some sufficient level of mathemagenic (learning-producing) processing.

Third, the meta-analysis looked at whether debunking messages that included explanatory information for why the misinformation was wrong would outperform debunking messages that included just contradictory claims (for example, statements to the effect that the misinformation was wrong). They found mixed results here. Providing debunking messages with explanatory information was more effective in debunking misinformation (getting people to move from being misinformed to being less misinformed), but these more explanatory messages were actually less effective in fully ridding people of the misinformation. This is a conflicting finding, so it’s not clear whether greater explanations make a difference, or how they might be designed to make a difference. One wild conjecture: perhaps explanations will be effective where they induce relevant counter-arguments to the misinformation.

Overall, I came away disappointed that we haven’t been able to learn more about how to debunk. This is NOT these researchers’ fault. The data is the data. Rather, the research community as a whole has to double down on debunking and persuasion and figure out what works.

People certainly change their minds on heartfelt issues. Just think about the acceptance of gays and lesbians over the last twenty years. Dramatic changes! Many people are much more open and embracing. Well, how the hell did this happen? Some people died out, but many other people’s minds were changed.

My point is that misinformation cannot possibly be a permanent condition and it behooves the world to focus resources on fixing this problem — because it’s freakin’ huge!

————

Note that a review of this research in the New York Times painted this in a more optimistic light.

————

Some additional thoughts (added one day after original post).

To do a thorough job of analyzing any research paradigm, we should, of course, go beyond meta-analyses to the original studies being meta-analyzed. Most of us don’t have time for that, so we often take the short-cut of just reading the meta-analysis or just reading research reviews, etc. This is generally okay, but there is a caveat that we might be missing something important.

One thing that struck me in reading the meta-analysis is that the authors commented on the typical experimental paradigm used in the research. It appeared that the actual experiment might have lasted 30 minutes or less, maybe 60 minutes at most. This includes reading (learning) the misinformation, getting a ten-minute distractor task, and answering a few questions (some treatment manipulations, that is, types of debunking methods; plus the assessment of their final state of belief through answers to questions). To ensure I wasn’t misinterpreting the authors’ message that the experiments were short, I looked at several of the studies compiled in the meta-analysis. The research I looked at used very short experimental sessions. Here is one of the treatments the experimental participants received (it includes both misinformation and a corrective, so it is one of the longer treatments):

Health Care Reform and Death Panels: Setting the Record Straight

By JONATHAN G. PRATT
Published: November 15, 2009

WASHINGTON, DC – With health care reform in full swing, politicians and citizen groups are taking a close look at the provisions in the Affordable Health Care for America Act (H.R. 3962) and the accompanying Medicare Physician Payment Reform Act (H.R. 3961).

Discussion has focused on whether Congress intends to establish “death panels” to determine whether or not seniors can get access to end-of-life medical care. Some have speculated that these panels will force the elderly and ailing into accepting minimal end-of-life care to reduce health care costs. Concerns have been raised that hospitals will be forced to withhold treatments simply because they are costly, even if they extend the life of the patient. Now talking heads and politicians are getting into the act.

Betsy McCaughey, the former Lieutenant Governor of New York State has warned that the bills contain provisions that would make it mandatory that “people in Medicare have a required counseling session that will tell them how to end their life sooner.”

Iowa Senator Chuck Grassley, the ranking Republican member of the Senate Finance Committee, chimed into the debate as well at a town-hall meeting, telling a questioner, “You have every right to fear…[You] should not have a government-run plan to decide when to pull the plug on Grandma.”

However, a close examination of the bill by non-partisan organizations reveals that the controversial proposals are not death panels at all. They are nothing more than a provision that allows Medicare to pay for voluntary counseling.

The American Medical Association and the National Hospice and Palliative Care Organization support the provision. For years, federal laws and policies have encouraged Americans to think ahead about end-of-life decisions.

The bills allow Medicare to pay doctors to provide information about living wills, pain medication, and hospice care. John Rother, executive vice president of AARP, the seniors’ lobby, repeatedly has declared the “death panel” rumors false.

The new provision is similar to a proposal in the last Congress to cover an end-of-life planning consultation. That bill was co-sponsored by three Republicans, including John Isakson, a Republican Senator from Georgia.

Speaking about the end of life provisions, Senator Isakson has said, “It’s voluntary. Every state in America has an end of life directive or durable power of attorney provision… someone said Sarah Palin’s web site had talked about the House bill having death panels on it where people would be euthanized. How someone could take an end of life directive or a living will as that is nuts.”

That’s it. That’s the experimental treatment.

Are we truly to believe that such short exposures are representative of real-world debunking? Surely not! In the real world, people who get misinformation often hold that misinformation over months or years while occasionally thinking about the misinformation again or encountering additional supportive misinformation or non-supportive information that may modify their initial beliefs in the misinformation. This all happens and then we try our debunking treatments.

Finally, it should be emphasized that the meta-analysis also only compiled eight research articles, many using the same (or similar) experimental paradigm. This is further inducement to skepticism. We should be very skeptical of these findings and my plea above for more study of debunking — especially in more ecologically-valid situations — is reinforced!

Major Research Review on eLearning Effectiveness


Is elearning effective? As effective as classroom instruction — more or less effective? What about blended learning — when elearning and classroom learning are combined?

ELearning Research Report Cover 2017.


These critical questions have now been answered, and the answers are available in the research report, Does eLearning Work? What the Scientific Research Says!

In this research review, I looked at meta-analyses and individual research studies, and was able to derive clear conclusions. The report is available for free, it includes an executive summary, and research jargon is kept to a minimum.

Click here to download the report…

 

 

 

Note that the August 10, 2017 version of this report incorrectly cited the Rowland (2014) study in a footnote and omitted it from the list of research citations. These issues were fixed on August 11, 2017. Special thanks to Elizabeth Dalton who notified me of the issues.

Research Reflections — Take a Selfie Here; The Examined Life is Worth Living!


As professionals in the learning field, memory is central to our work. If we don’t help our learners preserve their memories (of what they learned), we have not really done our job. I’m oversimplifying here — sometimes we want to guide our learners toward external memory aids instead of memory. But mostly, we aim to support learning and memory.

Glacier View

You might have learned that people who take photographs will remember less than those who do not take photographs. Several research studies showed this (see, for example, Henkel, 2014).

The internet buzzed with this information a few years ago:

  • The Telegraph — http://www.telegraph.co.uk/news/science/science-news/10507146/Taking-photographs-ruins-the-memory-research-finds.html
  • NPR — http://www.npr.org/2014/05/22/314592247/overexposed-camera-phones-could-be-washing-out-our-memories
  • Slate — http://www.slate.com/blogs/the_slatest/2013/12/09/a_new_study_finds_taking_photos_hurts_memory_of_the_thing_you_were_trying.html
  • CNN — http://www.cnn.com/2013/12/10/health/memory-photos-psychology/index.html
  • Fox News — http://www.foxnews.com/health/2013/12/11/taking-pictures-may-impair-memories-study-shows.html

Well, that was then. This is now.

Research Wisdom

There are CRITICAL LESSONS to be learned here — about using science intelligently… with wisdom.

Science is a self-correcting system that, with the arc of time, bends toward the truth. So, at any point in time, when we ask science for its conclusions, it tells us what it knows, while it apologizes for not knowing everything. Scientists can be wrong. Science can take wrong turns on the long road toward better understanding.

Does this mean we should reject scientific conclusions because they can’t guarantee omniscience, can’t guarantee truth? I’ve written about this in more depth elsewhere, but I’ll say it here briefly: recommendations from science are better than our own intuitions, especially in regard to learning, given all the ways we humans are blind to how learning works.

Memory With Photography

Earlier studies showed that people who photographed images were less able to remember them than people who simply examined the images. Researchers surmised that people who off-loaded their memories to an external memory aid — to the photographs — freed up memory for other things.

We can look back at this now and see that this was a time of innocence; that science had kept some confidences hidden. New research by Barasch, Diehl, Silverman, and Zauberman (2017) found that people “who could freely take photographs during an experience recognized more of what they saw” and that those “with a camera had better recognition of aspects of the scene that they photographed than of aspects they did not photograph.”

Of course, this is just one set of studies… we must be patient with science. More research will be done, and you and I will benefit by knowing more than we know now, and with more confidence… but this will take time.

What is the difference between the earlier studies and this latest set of studies? As argued by Barasch, Diehl, Silverman, and Zauberman (2017), the older studies did not give people the choice of which objects to photograph. In the words of the researchers, people did not have volitional control of their photographing experience. They didn’t go through the normal process we might go through in our real-world situations, where we must decide what to photograph and determine how to photograph the objects we target (i.e., the angles, borders, focus, etc.).

In a series of four experiments, the new research showed that attention was at the center of the memory effect. Indeed, people taking photographs “recognized more of what they saw and less of what they heard, compared with those who could not take any photographs” (emphasis added).

Interestingly, some of the same researchers, just the year before had found that taking photographs actually improved people’s enjoyment of their experiences (Diehl, Zauberman, & Barasch, 2016).

Practical Considerations for Learning Professionals

You might be asking yourself, “How should I handle the research-based recommendations I encounter?” Here is my advice:

  1. Be skeptical, but not too skeptical.
  2. Determine whether the research comes from a trusted source. Best sources are top-tier refereed scientific journals — especially where many studies find the same results. Worst sources are survey-based compilations of opinions. Beware of recommendations based on one scientific article. Beware of vendor-sponsored research. Beware of research that is not refereed; that is, not vetted by other researchers.
  3. Find yourself a trusted research translator. These people — and I count myself among them — have spent substantial time exploring the practical aspects of the research, so they are likely to have wisdom about what the research means — and what its boundary conditions might be.
  4. Pay your research translators — so they can continue doing their work.
  5. Be good and prosper. Use the research in your learning programs and test it. Do good evaluation so you can get valid feedback to make your learning initiatives maximally effective.

Inscribed in My High School Yearbook in 1976

Time it was, and what a time it was, it was
A time of innocence, A time of confidences
Long ago, it must be, I have a photograph
Preserve your memories; They’re all that’s left you

Written by Paul Simon

The Photograph Above

Taken in Glacier National Park, Montana, USA; July 1, 2017
And incidentally, the glaciers are shrinking permanently.

Research Cited

Barasch, A., Diehl, K., Silverman, J., & Zauberman, G. (2017). Photographic Memory: The Effects of Volitional Photo Taking on Memory for Visual and Auditory Aspects of an Experience. Psychological Science, early online publication.

Diehl, K., Zauberman, G., & Barasch, A. (2016). How taking photos increases enjoyment of experiences. Journal of Personality and Social Psychology, 111, 119–140.

Henkel, L. A. (2014). Point-and-shoot memories: The influence of taking photos on memory for a museum tour. Psychological Science, 25, 396–402.

Prompting Learning When Our Learners Play Games

 

Another research brief. Answer the question and only then read what the research says:


 

In a recent study with teenagers playing a game to learn history, adding the learning instructions hurt learning outcomes for questions that assessed transfer, but NOT recall. The first choice hurt transfer but not recall. Give yourself some credit if you chose the second or third choices.

Caveats:

  • This is only one study.
  • It was done using only one type of learner.
  • It was done using only one type of learning method.
  • It was done with teenagers.

Important Point:

  • Don’t assume that adding instructions to encourage learning will facilitate learning.

Research:

Hawlitschek, A., & Joeckel, S. (2017). Increasing the effectiveness of digital educational games: The effects of a learning instruction on students’ learning, motivation and cognitive load. Computers in Human Behavior, 72, 79-86.

Doing Research On Our Learning Products


The learning profession has been blessed in recent years with a steady stream of scientific research that points to practical recommendations for designers of learning. If you or your organization are NOT hooked into the learning research, find yourself a research translator to help you! Call me, for example!

That’s the good news, but I have bad news for you too. In the old days, it wasn’t hard to create a competitive advantage for your company by staying abreast of the research and using it to design your learning products and services. Pretty soon, that won’t be enough. As the research becomes more widely known, you’ll have to do more to get a competitive advantage. Vendors especially will have to differentiate their products — NOT just by basing them on the research — but also by conducting research (A-B testing at a minimum) on their own products.
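To illustrate what “A-B testing at a minimum” might look like in practice, here is a minimal sketch with hypothetical names and numbers: it compares post-test pass rates for two versions of a course using a simple two-proportion z-test. It is one reasonable way to run such a comparison, not a prescription.

    from math import sqrt
    from statistics import NormalDist

    def two_proportion_test(passes_a, n_a, passes_b, n_b):
        """Compare pass rates of course version A vs. version B (two-tailed z-test)."""
        p_a, p_b = passes_a / n_a, passes_b / n_b
        pooled = (passes_a + passes_b) / (n_a + n_b)
        se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
        z = (p_b - p_a) / se
        p_value = 2 * (1 - NormalDist().cdf(abs(z)))
        return p_a, p_b, z, p_value

    # Hypothetical data: version A passed 62 of 100 learners; redesigned version B passed 78 of 100.
    print(two_proportion_test(62, 100, 78, 100))

A real evaluation would, of course, randomize learners to versions and, ideally, look at delayed performance rather than immediate pass rates.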

I know of at least a few companies right now who are conducting research on their own products. They aren’t advertising their research, because they want to get a jumpstart on the competition. But eventually, they’ll begin sharing what they’ve done.

Do you need an example of a company that has had its product tested? Check out this page. Scroll down to the bottom and look at the 20 or so research studies that have been done using the product. Looks pretty impressive, right?

To summarize, there are at least five benefits to doing research on your own products:

  1. Gain a competitive advantage by learning to make your product better.
  2. Gain a competitive advantage by supporting a high-quality brand image.
  3. Gain a competitive advantage by enabling the creation of unique and potent content marketing.
  4. Gain a competitive advantage by supporting creativity and innovation within your team.
  5. Gain a competitive advantage by creating an engaging and learning-oriented team environment.

Research Reviews of the Spacing Effect

I’ve been following the spacing effect for over a decade, writing a research-to-practice report in 2006, and recommending the spacing effect to my clients, including in the form of subscription learning (threaded microlearning).
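For readers unfamiliar with how subscription learning puts the spacing effect to work, here is a minimal sketch, assuming an expanding-interval schedule (the intervals and function name are my own illustrative choices, not a research-backed prescription), that generates the dates on which short learning nudges would be sent.

    from datetime import date, timedelta

    def spaced_schedule(start, gaps_in_days=(1, 3, 7, 14, 30)):
        """Return dates for a short initial lesson plus spaced follow-up nudges."""
        sessions = [start]
        for gap in gaps_in_days:
            sessions.append(sessions[-1] + timedelta(days=gap))
        return sessions

    for session_date in spaced_schedule(date(2018, 3, 1)):
        print(session_date.isoformat())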

One of the fascinating things is that researchers remain captivated by the spacing effect, producing about 10 new studies every year and many research reviews.

Here is a list of the research reviews, from most recent to earliest.

  • Maddox, G. B. (2016). Understanding the underlying mechanism of the spacing effect in verbal learning: A case for encoding variability and study-phase retrieval. Journal of Cognitive Psychology, 28(6), 684-706.
  • Vlach, H. A. (2014). The spacing effect in children’s generalization of knowledge: Allowing children time to forget promotes their ability to learn. Child Development Perspectives, 8(3), 163-168.
  • Küpper-Tetzel, C. E. (2014). Understanding the distributed practice effect: Strong effects on weak theoretical grounds. Zeitschrift für Psychologie, 222(2), 71-81.
  • Carpenter, S. K. (2014). Spacing and interleaving of study and practice. In V. A. Benassi, C. E. Overson, & C. M. Hakala (Eds.), Applying science of learning in education: Infusing psychological science into the curriculum (pp. 131-141). Washington, DC: Society for the Teaching of Psychology.
  • Toppino, T. C., & Gerbier, E. (2014). About practice: Repetition, spacing, and abstraction. In B. H. Ross (Ed.), The psychology of learning and motivation: Vol. 60. The psychology of learning and motivation (pp. 113-189). San Diego, CA: Elsevier Academic Press.
  • Carpenter, S. K., Cepeda, N. J., Rohrer, D., Kang, S. H. K., & Pashler, H. (2012). Using spacing to enhance diverse forms of learning: Review of recent research and implications for instruction. Educational Psychology Review, 24(3), 369-378.
  • Kornmeier, J., & Sosic-Vasic, Z. (2012). Parallels between spacing effects during behavioral and cellular learning. Frontiers in Human Neuroscience, 6, Article ID 203.
  • Delaney, P. F., Verkoeijen, P. P. J. L., & Spirgel, A. (2010). Spacing and testing effects: A deeply critical, lengthy, and at times discursive review of the literature. In B. H. Ross (Ed.), The psychology of learning and motivation: Vol. 53. The psychology of learning and motivation: Advances in research and theory (pp. 63-147).
  • Cepeda, N. J., Pashler, H., Vul, E., Wixted, J. T., & Rohrer, D. (2006). Distributed practice in verbal recall tasks: A review and quantitative synthesis. Psychological Bulletin, 132(3), 354-380.
  • Janiszewski, C., Noel, H., & Sawyer, A. G. (2003). A Meta-analysis of the Spacing Effect in Verbal Learning: Implications for Research on Advertising Repetition and Consumer Memory. Journal of Consumer Research, 30(1), 138-149.
  • Dempster, F. N., & Farris, R. (1990). The spacing effect: Research and practice. Journal of Research & Development in Education, 23(2), 97-101.
  • Underwood, B. J. (1961). Ten years of massed practice on distributed practice. Psychological Review, 68(4), 229-247.
  • Ruch, T. C. (1928). Factors influencing the relative economy of massed and distributed practice in learning. Psychological Review, 35(1), 19-45.