Brinkerhoff Case Method — A Better Name for a Great Learning-Evaluation Innovation

Updated July 3rd, 2018—a week after the original post. See end of post for the update, featuring Rob Brinkerhoff’s response.

Rob Brinkerhoff’s “Success Case Method” needs a subtle name change. I think a more accurate name would be the “Brinkerhoff Case Method.”

I’m one of Rob’s biggest fans, having selected him in 2008 as the Neon Elephant Award Winner for his evaluation work.

Thirty-five years ago, in 1983, Rob published the article in which he introduced the “Success Case Method.” Here is a picture of the first page of that article:

In that article, the Success Case Method was introduced as a way to find the value of training when it works. Rob wrote, “The success-case method does not purport to produce a balanced assessment of the total results of training. It does, however, attempt to answer the question: When training works, how well does it work?” (page 58, which is visible above).

The Success Case Method didn’t stand still. It evolved and improved as Rob refined it based on his research and his work with clients. In his landmark 2006 book detailing the methodology, Telling Training’s Story: Evaluation Made Simple, Credible, and Effective, Rob describes how to first survey learners and then select a sample of them for interviews based on their level of success in applying the training. “Once the sorting is complete, the next step is to select the interviewees from among the high and low success candidates, and perhaps from the middle categories.” (page 102).
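
For readers who like to see a procedure spelled out, here is a minimal sketch of that sorting-and-sampling step, written in Python. The field names, the 1–5 rating scale, the cut-off scores, and the group sizes are my own illustrative assumptions, not details from Rob’s book; the point is simply the shape of the procedure: sort survey respondents by reported success, then draw interviewees from the high and low ends, and perhaps from the middle.

```python
import random

# Hypothetical sketch of the Success Case Method's survey-then-interview sampling step.
# The 1-5 "success_rating" field, cut-offs, and group sizes are illustrative assumptions.

def select_interviewees(survey_responses, high_cut=4, low_cut=2,
                        n_per_group=5, include_middle=False):
    """Sort respondents by self-reported success and sample interview
    candidates from the high and low ends (and optionally the middle)."""
    high = [r for r in survey_responses if r["success_rating"] >= high_cut]
    low = [r for r in survey_responses if r["success_rating"] <= low_cut]
    middle = [r for r in survey_responses
              if low_cut < r["success_rating"] < high_cut]

    picks = random.sample(high, min(n_per_group, len(high)))
    picks += random.sample(low, min(n_per_group, len(low)))
    if include_middle:
        picks += random.sample(middle, min(n_per_group, len(middle)))
    return picks

# Example: ten learners with 1-5 self-reported success ratings
survey_responses = [{"name": f"Learner {i}", "success_rating": random.randint(1, 5)}
                    for i in range(1, 11)]
for person in select_interviewees(survey_responses, n_per_group=2):
    print(person["name"], person["success_rating"])
```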

To call this the success-case method seems more aligned with the original naming than with the actual recommended practice. For that reason, I recommend that we simply call it the Brinkerhoff Case Method. This gives Rob the credit he deserves, and it more accurately reflects the rigor and balance of the method itself.

As soon as I posted the original post, I reached out to Rob Brinkerhoff to let him know. After some reflection, Rob wrote this and asked me to post it:

“Thank you for raising the issue of the currency of the name Success Case Method (SCM). It is kind of you to also think about identifying it more closely with my name. Your thoughts are not unlike others and on occasion even myself. 

It is true the SCM collects data from extreme portions of the respondent distribution including likely successes, non-successes, and ‘middling’ users of training. Digging into these different groups yields rich and useful information. 

Interestingly the original name I gave to the method some 40 years ago when I first started forging it was the “Pioneer” method since when we studied the impact of a new technology or innovation we felt we learned the most from the early adopters – those out ahead of the pack that tried out new things and blazed a trail for others to follow. I refined that name to a more familiar term but the concept and goal remained identical: accelerate the pace of change and learning by studying and documenting the work of those who are using it the most and the best. Their experience is where the gold is buried. 

Given that, I choose to stick with the “success” name. It expresses our overall intent: to nurture and learn from and drive more success. In a nutshell, this name expresses best not how we do it, but why we do it. 

Thanks again for your thoughtful reflections. We’re on the same page.”

Rob’s response is thoughtful, as usual. Yet my feelings on this remain steady. As I’ve written in my report on the new Learning-Transfer Evaluation Model (LTEM), our models should nudge appropriate actions. The same is true for the names we give things. Mining for success stories is good, but it has to be balanced. After all, if evaluation doesn’t look for the full truth—without putting a thumb on the scale—then we are not evaluating; we are doing something else.

I know Rob’s work. I know that he is not advocating for, nor does he engage in, unbalanced evaluations. I do fear, though, that the name Success Case Method may give lesser practitioners permission, or unconsciously nudge them, to find more success and less failure than is warranted by the facts.

Of course, the term “Success Case Method” has one brilliant advantage. Where people are hesitant to evaluate for fear of uncovering unpleasant results, the name “Success Case Method” may lessen the worry of moving forward and engaging in evaluation—and so it may actually enable the balanced evaluation that is necessary to uncover the truth of learning’s level of success.

Whatever we call it, the Success Case Method or the Brinkerhoff Case Method—and this is the most important point—it is one of the best learning-evaluation innovations in the past half century.

I also agree that since Rob is the creator, his voice should have the most influence in terms of what to call his invention.

I will end with one of my all-time favorite quotations from the workplace learning field, from Tim Mooney and Robert Brinkerhoff’s excellent book, Courageous Training:

“The goal of training evaluation is not to prove the value of training; the goal of evaluation is to improve the value of training.” (pp. 94-95)

On this we should all agree!

Thankful for So Much!! Paying Off My Student Loans at 60 Years of Age

Today, after turning 60 a few months ago, I finally paid off my student loans—the loans that made it possible for me to get my doctorate from Columbia University. I was in school for eight years from 1988 to 1996, studying with some of the brightest minds in learning, development, and psychology (Rothkopf, Black, Peverly, Kuhn, Higgins, Dweck, Mischel, Darling-Hammond, not to mention my student cohort). If my math is right, that’s 22 years to pay off my student-loan debt. A ton of interest paid too!

I’m eternally grateful! Without the federal government funding my education, my life would have been so much different. I would never have learned how to understand the research on learning. My work at Work-Learning Research, Inc.—attempting to bridge the gap between research and practice—would not have been possible. Thank you to my country—the United States of America—and fellow citizens for giving me the opportunity of a lifetime!! Thanks also must go to my wife for marrying into the forever-string of monthly payments. Without her tolerance and support I certainly would be lost in a different life.

I’ve often reflected on my good fortune in being able to pursue my interests, and wondered why we as a society don’t do more to give our young people an easier road to pursue their dreams. Even when I hear about the brilliant people winning MacArthur fellowships, I wonder why only those who have proven their genius are being boosted. They are deserving of course, but where is our commitment to those who might be teetering on a knife edge of opportunity and economic desperation? I was even lucky as an undergrad back in the late 1970s, paying relatively little for a good education at a state school and having parents who funded my tuition and living expenses. College today is wicked expensive, cutting out even more of our promising youth from realizing their potential.

Economic mobility is not as easy to come by as we might like. The World Bank just released a report showing that, worldwide, only 12% of young adults have been able to obtain more education than their parents. The United States is no longer the land of opportunity we once liked to imagine.

This is crazy short-sighted, and combined with our tendency to underfund our public schools, it has the smell of societal suicide.

That’s depressing! Today I’m celebrating my ability to get student loans two-and-a-half decades ago and pay them off over the last twenty-some years! Hooray!

Seems not so important when put into perspective. It’s something though.

 

 

Reflections This Morning On Brushing My Teeth

I use a toothbrush with a design that research shows maximizes the benefits of brushing. It spins, and spinning is better than oscillating. It also has a timer that tells me when I’ve brushed for two minutes. Ever since a hockey stick broke up my mouth when I was twenty, I’ve been sensitive about the health of my teeth.

But what the heck does this have to do with learning and development? Well, let’s see.

Maybe my toothbrush is a performance-support exemplar. Maybe no training is needed. I didn’t read any instructions. I just used it. The design is intuitive. There’s an obvious button that turns it on, an obvious place to put toothpaste (on the bristles), and it’s obvious that the bristles should be placed against the teeth. So, the tool itself seems like it needs no training.

But I’m not so sure. Let’s do a thought experiment. If I give a spinning toothbrush to a person who’s never brushed their teeth, would they use it correctly? Would they use it at all? Doubtful!

What is needed to encourage or enable good tooth-brushing?

  • People probably need something to compel them to brush, perhaps knowledge that brushing prevents dental calamities like tooth decay, gum disease, bad breath—and may even prevent cognitive decline as in Alzheimer’s. Training may help motivate action.
  • People will probably be more likely to brush if they know other people are brushing. Tons of behavioral economics studies have shown that people are very attuned to social comparisons. Again, training may help motivate action. Interestingly, people may be more likely to brush with a spinning toothbrush if others around them are also brushing with spinning toothbrushes. Training coworkers (or in this case other family members) may also help motivate action.
  • People will probably brush more effectively if they know to brush all their teeth, and to brush near their gums as well—not just the biting surfaces of their teeth. Training may provide this critical knowledge.
  • People will probably brush more effectively if they are set up—preferably by setting themselves up—to be triggered by environmental cues. For example, tooth-brushing is often most effectively triggered when people brush right after breakfast and right before they go to bed. Training people to set up situation-action triggering may increase later follow-through.
  • People will probably brush more effectively if they know that they should brush for two minutes or so rather than just brushing quickly. Training may provide this critical knowledge. Note, of course, that the toothbrush’s two-minute timer may act to support this behavior. Training and performance support can work together to enable effective behavior.
  • People will be more likely to use an effective toothbrush if the cost of the toothbrush is reasonable given the benefits. The costs of people’s tools will affect their use.
  • People will be more likely to use a toothbrush if the design is intuitive and easy to use. The design of tools will affect their use.

I’m probably missing some things in the list above, but it should suffice to show the complex interplay between our workplace tools/practices/solutions and training and prompting mechanisms (i.e., performance support and the like).

But what insights, or dare we say wisdom, can we glean from these reflections? How about these for starters:

  • We could provide excellent training, but if our tools/practices/solutions are poorly designed they won’t get used.
  • We could provide excellent training, but if our tools/practices/solutions are too expensive they won’t get used.
  • Let’s not forget the importance of prior knowledge. Most of us know the basics of tooth brushing. It would waste time, and be boring, to repeat that in a training. The key is to know, to really know, not just guess, what our learners know—and compare that to what they really need to know.
  • Even when we seem to have a perfectly intuitive, well-designed tool/practice/solution let’s not assume that no training is needed. There might be knowledge or motivational gaps that need to be bridged (yes, the pun was intended! SMILE). There might be situation-action triggering sets that can be set up. There might be reminders that would be useful to maintain motivation and compel correct technique.
  • Learning should not be separated from the design of tools/practices/solutions. We can support better designs by reminding the designers and developers of these objects/procedures that training can’t fix a bad design. Better yet, we can work hand in hand with them in prototyping the tool/training bundle to enable the most pertinent feedback during the design process itself.
  • Training isn’t just about knowledge, it’s also about motivation.
  • Motivation isn’t just the responsibility of training. Motivation is an affordance of the tools/practices/solutions themselves; it is born in the social environment; and it is subject to organizational influence, particularly through managers and peers.
  • Training shouldn’t be thought of as a one-time event. Reminders may be valuable as well, particularly around the motivational aspects (for simple tasks), and to support remembering (for tasks that are easily forgotten or misunderstood).

One final note. We might also train people to use the time when they are engaged in automated tasks—tooth-brushing, for example—to reflect on important aspects of their lives, gaining from the learning that might occur or from the thoughts that may enable future learning. And to add a little fun to mundane tasks. Smile for the tiny nooks and crannies of our lives that may illuminate our thinking!

 

Dealing with Emotional Readiness — What Should We be Doing?

I included this piece in my newsletter this morning (which you can sign up for here) and it seemed to really resonate with people, so I’m including it here.

I’ve always had a high tolerance for pain, but breaking my collarbone at the end of February really sent me crashing down a mountain. Lying in bed, I got to thinking about the emotional side of workplace performance. I don’t have brilliant insights here, just maybe some thoughts that will get you thinking.

It had been a very good week skiing with my family in Vermont. On our next-to-last day on the mountain, my wife and I, skiing together, went to look for the kids, who had told us they’d be skiing in the terrain park (where the jumps are). My wife skied down first, then I went. There was a little jump, about a foot high, of the kind I’d jumped many times. But this time would be different.

As I sailed over the jump — slowly because I’m wary of going too fast and flying too far — I looked down and saw, NOT snow, but stairs. WTF? Every other time I took a small jump there was snow on the other side. Not metal stairs. Not dry metal stairs. In mid-air my thought was, “okay, just stay calm, you’ll ski over the stairs back to snow.” Alas, what happened was that I came crashing down on my left shoulder, collarbone splintering into five or six pieces, and lay 20 feet down the hill. I knew right away that things were bad. I knew that my life would be upended for weeks or months. I knew that miserable times lay ahead.

I got up quickly. I was in shock and knew it. I looked up the mountain back at the jump. Freakin’ stairs!! What the hell were they doing there? I was rip-roaring mad! One of my skis was still on the stairs. The dry surface must have grabbed it, preventing me from skiing further down the slope. I retrieved my ski. A few people skied by me. My wife was long gone down the mountain. I was in shock and I was mad as hell and I couldn’t think straight, but I knew I shouldn’t sit down so I just stood there for five or ten minutes in a daze. Finally someone asked if I was okay, and I yelled crazy loud for the whole damn mountain to hear, “NO!” He was nice, said he’d contact the ski patrol.

I’ll spare you the details of the long road to recovery — a recovery not yet complete — but the notable events are that I had badly broken my collarbone, badly sprained my right thumb and mildly sprained my left thumb, couldn’t button my shirts or pants for a while, had to lie in bed in one position or the pain would be too great, watched a ton of Netflix (I highly recommend Seven Seconds!), couldn’t do my work, couldn’t help around the house, got surgery on my collarbone, got pneumonia, went to physical therapy, etc… Enough!

Feeling completely useless, I couldn’t help reflecting on the emotional side of learning, development, and workplace performance in general. In L&D, we tend to be helping people who are able to learn and take action — but maybe not all the people we touch are emotionally present and able. Some are certainly dealing with family crises, personal insecurities, previous job setbacks, and the like. Are we doing enough for them?

I’m not a person prone to depression, but I was clearly down for the count. My ability to do meaningful work was nil. At first it was the pain and the opiates. Later it was the knowledge that I just couldn’t get much work done, that I was unable to keep up with promises I’d made, that I was falling behind. I knew, intellectually, that I just had to wait it out — and this was a great comfort. But still, my inability to think and to work reminded me that as a learning professional I ought to be more empathetic with learners who may be suffering as well.

Usually, dealing with emotional issues of an employee falls to the employee and his or her manager. I used to be a leadership trainer and I don’t remember preparing my learners for how to deal with direct reports who might be emotionally unready to fully engage with work. Fortunately today we are willing to talk about individual differences, but I think we might be forgetting the roller-coaster ride of being human, that we may differ in our emotional readiness on any given day. Managers/supervisors rightly are the best resource for dealing with such issues, but we in L&D might have a role to play as well.

I don’t have answers here. I wish I did. Probably it begins with empathy. We also can help more when we know our learners more — and when we can look them in the eyes. This is tricky business though. We’re not qualified to be therapists, and simple solutions like being nice and kind and keeping things positive are not always the answer. We know from the research that challenging people with realistic decision-making challenges is very beneficial. Giving honest feedback on poor performance is beneficial.

We should probably avoid scolding and punishment and reprimands. Competition has been shown to be harmful in at least some learning situations. Leaderboards may make emotional issues worse, and generally the limited research suggests they aren’t very useful anyway. But these negative actions are rarely invoked, so we have to look deeper.

I wish I had more wisdom about this. I wish there was research-based evidence I could draw on. I wish I could say more than just be human, empathetic, understanding.

Now that I’m aware of this, I’m going to keep my eyes and ears open to learning more about how we as learning professionals can design learning interventions to be more sensitive to the ups and downs of our fellow travelers.

If you’ve got good ideas, please send them my way or use the LinkedIn Post generated from this to join the discussion.

Will Thalheimer Interviewed by Jeffrey Dalto

Series of Four Interviews

I was recently interviewed by Jeffrey Dalto of Convergence Training. Jeffrey is a big fan of research-based practice. He did a great job compiling the interviews.

Click on the title of each one to read the interview:

The Backfire Effect is NOT Prevalent: Good News for Debunkers, Humans, and Learning Professionals!

An exhaustive new research study reveals that the backfire effect is not as prevalent as previous research once suggested. This is good news for debunkers, those who attempt to correct misconceptions. This may be good news for humanity as well. If we cannot reason from truth, if we cannot reliably correct our misconceptions, we as a species will certainly be diminished—weakened by realities we have not prepared ourselves to overcome. For those of us in the learning field, the removal of the backfire effect as an unbeatable Goliath is good news too. Perhaps we can correct the misconceptions about learning that every day wreak havoc on our learning designs, hurt our learners, push ineffective practices, and cause an untold waste of time and money spent chasing mythological learning memes.

 

 

The Backfire Effect

The backfire effect is a fascinating phenomenon. It occurs when a person is confronted with information that contradicts an incorrect belief that they hold. The surprising finding is that attempts at persuading others with truthful information may actually make them believe the untruth even more strongly than if they hadn’t been confronted in the first place.

The term “backfire effect” was coined by Brendan Nyhan and Jason Reifler in a 2010 scientific article on political misperceptions. Their article caused an international sensation, both in the scientific community and in the popular press. At a time when dishonesty in politics seems to be at historically high levels, this is no surprise.

In their article, Nyhan and Reifler concluded:

“The experiments reported in this paper help us understand why factual misperceptions about politics are so persistent. We find that responses to corrections in mock news articles differ significantly according to subjects’ ideological views. As a result, the corrections fail to reduce misperceptions for the most committed participants. Even worse, they actually strengthen misperceptions among ideological subgroups in several cases.”

Subsequently, other researchers found similar backfire effects, and notable researchers working in the area (e.g., Lewandowsky) have expressed the rather fatalistic view that attempts at correcting misinformation were unlikely to work—that believers would not change their minds even in the face of compelling evidence.

 

Debunking the Myths in the Learning Field

As I have communicated many times, there are dozens of dangerously harmful myths in the learning field, including learning styles, neuroscience as fundamental to learning design, and the myth that “people remember 10% of what they read, 20% of what they hear, 30% of what they see…etc.” I even formed a group to confront these myths (The Debunker Club), although, and I must apologize, I have not had the time to devote to enabling our group to be more active.

The “backfire effect” was a direct assault on attempts to debunk myths in the learning field. Why bother if we would make no difference? If believers of untruths would continue to believe? If our actions to persuade would have a boomerang effect, causing false beliefs to be believed even more strongly? It was a leg-breaking, breath-taking finding. I wrote a set of recommendations to debunkers in the learning field on how best to be successful in debunking, but admittedly many of us, me included, were left feeling somewhat paralyzed by the backfire finding.

Ironically perhaps, I was not fully convinced. Indeed, some may think I suffered from my own backfire effect. In reviewing a scientific research review in 2017 on how to debunk, I implored that more research be done so we could learn more about how to debunk successfully, but I also argued that misinformation simply couldn’t be a permanent condition, that there was ample evidence to show that people could change their minds even on issues that they once believed strongly. Racist bigots have become voices for diversity. Homophobes have embraced the rainbow. Religious zealots have become agnostic. Lovers of technology have become anti-technology. Vegans have become paleo meat lovers. Devotees of Coke have switched to Pepsi.

The bottom line is that organizations waste millions of dollars every year when they use faulty information to guide their learning designs. As professionals in the learning field, we have a responsibility to avoid the danger of misinformation! But is this even possible?

 

The Latest Research Findings

There is good news in the latest research! Thomas Wood and Ethan Porter (2018) just published an article in which they could not find any evidence for a backfire effect. They replicated the Nyhan and Reifler research, expanded tenfold the number of misinformation instances studied, modified the wording of their materials, used over 10,000 participants, and varied their methods for recruiting those participants. Still, they found no evidence of a backfire effect.

“We find that backfire is stubbornly difficult to induce, and is thus unlikely to be a characteristic of the public’s relationship to factual information. Overwhelmingly, when presented with factual information that corrects politicians—even when the politician is an ally—the average subject accedes to the correction and distances himself from the inaccurate claim.”

There is additional research showing that people can change their minds, that fact-checking can work, and that feedback can correct misconceptions. Rich and Zaragoza (2016) found that misinformation can be fixed with corrections. Rich, Van Loon, Dunlosky, and Zaragoza (2017) found that corrective feedback could work if it was designed to be believed. More directly, Nyhan and Reifler (2016), in work cited by the American Press Institute Accountability Project, found that fact-checking can work to debunk misinformation.

 

Some Perspective

First of all, let’s acknowledge that science sometimes works slowly. We don’t yet know all we will know about these persuasion and information-correction effects.

Also, let’s please be careful to note that backfire effects, when they are actually evoked, are typically found in situations where people are ideologically committed to a system of beliefs with which they strongly identify. Backfire effects have been studied mostly in situations where someone identifies as a conservative or a liberal—when this affiliation is singularly or strongly important to their self-identity. Are folks in the learning field such strong believers in a system of beliefs and self-identity that they would easily suffer from the backfire effect? Maybe sometimes, but perhaps less likely than in the arena of political belief, which seems to consume many of us.

Here are some learning-industry beliefs that may be so deeply held that the light of truth may not penetrate easily:

  • Belief that learners know what is best for their learning.
  • Belief that learning is about conveying information.
  • Belief that we as learning professionals must kowtow to our organizational stakeholders, that we have no grounds to stand by our own principles.
  • Belief that our primary responsibility is to our organizations not our learners.
  • Belief that learner feedback is sufficient in revealing learning effectiveness.

These beliefs seem to undergird other beliefs, and I’ve seen in my work how they can make it difficult to convey important truths. Let me be clear, though: it is speculative on my part that these beliefs have substantial influence; it is a conjecture. Note also that, given that the research on the “backfire effect” has now been shown to be tenuous, I’m not claiming that fighting such foundational beliefs will cause damage. On the contrary, it seems like it might be worth doing.

 

Knowledge May Be Modifiable, But Attitudes and Belief Systems May Be Harder to Change

The original backfire-effect research found that people believed falsehoods more strongly after being confronted with correct information, but this framing misses an important distinction. There are facts, and then there are attitudes, belief systems, and policy preferences.

A fascinating thing happened when Wood and Porter looked for—but didn’t find—the backfire effect. They talked with the original researchers, Nyhan and Reifler, and they began working together to solve the mystery. Why did the backfire effect happen sometimes but not regularly?

In a recent episode (January 28, 2018) of the “You Are Not So Smart” podcast, Wood, Porter, and Nyhan were interviewed by David McRaney, and they nicely clarified the distinction between factual backfire and attitudinal backfire.

Nyhan:

“People often focus on changing factual beliefs with the assumption that it will have consequences for the opinions people hold, or the policy preferences that they have, but we know from lots of social science research…that people can change their factual beliefs and it may not have an effect on their opinions at all.”

“The fundamental misconception here is that people use facts to form opinions and in practice that’s not how we tend to do it as human beings. Often we are marshaling facts to defend a particular opinion that we hold and we may be willing to discard a particular factual belief without actually revising the opinion that we’re using it to justify.”

Porter:

“Factual backfire, if it exists, would be especially worrisome, right? I don’t really believe we are going to find it anytime soon… Attitudinal backfire is less worrisome, because in some ways attitudinal backfire is just another description for failed persuasion attempts… that doesn’t mean that it’s impossible to change your attitude. That may very well just mean that what I’ve done to change your attitude has been a failure. It’s not that everyone is immune to persuasion, it’s just that persuasion is really, really hard.”

McRaney (Podcast Host):

“And so the facts suggest that the facts do work, and you absolutely should keep correcting people’s misinformation because people do update their beliefs and that’s important, but when we try to change people’s minds by only changing their [factual] beliefs, you can expect to end up, and engaging in, belief whack-a-mole, correcting bad beliefs left and right as the person on the other side generates new ones to support, justify, and protect the deeper psychological foundations of the self.”

Nyhan:

“True backfire effects, when people are moving overwhelmingly in the opposite direction, are probably very rare, they are probably on issues where people have very strong fixed beliefs….”

 

Rise Up! Debunk!

Here’s the takeaway for us in the learning field who want to be helpful in moving practice to more effective approaches.

  • While there may be some underlying beliefs that influence thinking in the learning field, they are unlikely to be as strongly held as the political beliefs that researchers have studied.
  • The research seems fairly clear that factual backfire effects are extremely unlikely to occur, so we should not be afraid to debunk factual inaccuracies.
  • Persuasion is difficult but not impossible, so it is worth making attempts to debunk. Such attempts are likely to be more effective if we take a change-management approach, look to the science of persuasion, and persevere respectfully and persistently over time.

Here is the message that one of the researchers, Tom Wood, wants to convey:

“I want to affirm people. Keep going out and trying to provide facts in your daily lives and know that the facts definitely make some difference…”

Here are some methods of persuasion from a recent article by Flynn, Nyhan, and Reifler (2017) that have worked even with people’s strongly-held beliefs:

  • When the persuader is seen to be ideologically sympathetic with those who might be persuaded.
  • When the correct information is presented in a graphical form rather than a textual form.
  • When an alternative causal account of the original belief is offered.
  • When credible or professional fact-checkers are utilized.
  • When multiple “related stories” are also encountered.

The stakes are high! Bad information permeates the learning field and makes our learning interventions less effective, harming our learners and our organizations while wasting untold resources.

We owe it to our organizations, our colleagues, and our fellow citizens to debunk bad information when we encounter it!

Let’s not be assholes about it! Let’s do it with respect, with openness to being wrong, and with all our persuasive wisdom. But let’s do it. It’s really important that we do!

 

Research Cited

Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32(2), 303–330.

Nyhan, B., & Reifler, J. (2016). Do people actually learn from fact-checking? Evidence from a longitudinal study during the 2014 campaign. Available at: www.dartmouth.edu/~nyhan/fact-checking-effects.pdf

Rich, P. R., Van Loon, M. H., Dunlosky, J., & Zaragoza, M. S. (2017). Belief in corrective feedback for common misconceptions: Implications for knowledge revision. Journal of Experimental Psychology: Learning, Memory, and Cognition, 43(3), 492–501.

Rich, P. R., & Zaragoza, M. S. (2016). The continued influence of implied and explicitly stated misinformation in news reports. Journal of Experimental Psychology: Learning, Memory, and Cognition, 42(1), 62–74. http://dx.doi.org/10.1037/xlm0000155

Wood, T., & Porter, E. (2018). The elusive backfire effect: Mass attitudes’ steadfast factual adherence. Political Behavior. Advance online publication.

 

Big Data and Learning — A Wild Goose Chase?

Geese are everywhere these days, crapping all over everything. Where we might have nourishment, we get poop on our shoes.

Big data is everywhere these days…

Even flocking into the learning field.

For big-data practitioners to NOT crap up the learning field, they’ll need to find good sources of data (good luck with that!), use intelligence about learning to know what it means (blind arrogance will prevent this, at least at first), and then find where the data is actually useful in practice (will there be patience and practice or just shiny new objects for sale?).

Beware of the wild goose chase! It’s already here.

Guest Post from Robert O. Brinkerhoff: 70-20-10: The Good, the Bad, and the Ugly

This is a guest post by Robert O. Brinkerhoff (www.BrinkerhoffEvaluationInstitute.com).

Rob is a renowned expert on learning evaluation and performance improvement. His books, Telling Training’s Story and Courageous Training, are classics.

______________________________

70-20-10: The Good, the Bad, and the Ugly

The 70-20-10 framework may not have much if any research basis, but it is still a good reminder to all of us in the L&D and performance-improvement professions that the workplace is a powerful teacher and poses many opportunities for practice, feedback, and improvement.

But we must also recognize that a lot of the learning that is taking place on the job may not be for the good. I have held jobs in agencies, corporations and the military where I learned many things that were counter to what the organization wanted me to learn: how to fudge records, how to take unfair advantage of reimbursement policies, how to extend coffee breaks well beyond their prescribed limits, how to stretch sick leave, and so forth.

These were relatively benign instances. Consider this: Where did VW engineers learn how to falsify engine-emission results? Where did Wells Fargo staff learn how to create and sell fake accounts to their unwitting customers?

Besides these egregiously ugly examples, we also have to recognize that, in the case of L&D programming intended to support new strategic and other change initiatives, the last thing the organization needs is more people learning how to do their jobs in the old way. AT&T, for example, worked very hard to drive new beliefs and actions to enable the business to shift from landline technologies to wireless; on-the-job learning dragged them backwards and still creates problems today. As Allstate Insurance tries to shift its sales focus away from casualty policies to financial planning services, the old guard teaches the opposite actions, as they continue to harvest the financial benefits of policy renewals. Any organization that has to make wholesale and fundamental shifts to execute new strategies will have to cope with the negative effects of years of on-the-job learning.

When strategy is new, there are few if any on-the-job pockets of expertise and role models. Training new employees for existing jobs is a different story. Here, obviously, the on-job space is an entirely appropriate learning resource.

In short, we have to recognize that not all on-the-job learning is learning that we want. Yet on-the-job learning remains an inexorable force that we in L&D must learn to understand, leverage, guide, and manage.

Purpose of Workplace Learning and Development. Survey Inquiry

Seek Research-to-Practice Experts as Your Trusted Advisors

I added these words to the sidebar of my blog, and I like them so much that I’m sharing them as a blog post itself.

Please seek wisdom from research-to-practice experts — the dedicated professionals who spend time in two worlds to bring the learning field insights based on science. These folks are my heroes, given their often quixotic efforts to navigate through an incomprehensible jungle of business and research obstacles.

These research-to-practice professionals should be your heroes as well. Not mythological heroes, not heroes etched into the walls of faraway mountains. These heroes should be sought out as our partners, our fellow travelers in learning, as people we hire as trusted advisors to bring us fresh research-based insights.

The business case is clear. Research-to-practice experts not only enlighten and challenge us with ideas we might not have considered — ideas that make our learning efforts more effective in producing business results — but they also prevent us from engaging in wasted efforts, saving our organizations time and money, all the while enabling us to focus more productively on the learning factors that actually matter.