One of my favorite people in the workplace-learning space is Clark Quinn. Smart, passionate, and research-based, Clark has been consulting on learning and learning technology for decades. We worked on the Serious eLearning Manifesto together. We attempted—and failed—to build a community of workplace learning thought leaders. We commiserate frequently. I feel that I know some of what is in Clark’s heart. Now you can too!

With publication of his new book, Revolutionize Learning and Development, Clark shares his unbridled passion for our field. His book is a cross between a furious Charles Bukowski poem and a winding Jack Kerouac road trip. His words stream like a molten-steel lava flow of wisdom, love, and thunder.

Clark loves what we do. He just wishes we did it better. He puts the focus on on-the-job performance, saying that our learning solutions should be aimed at creating results in the workplace. Indeed, Clark nimbly changes our name from the Learning and Development team to the Performance and Development team! Learning is just a means to performance.

Clark's wisdom comes from years of immersion in the learning research and from work on practical learning-design issues. Early in the book he explains how the brain works and how its architecture should shape our learning designs. This is very helpful because too many of us in the learning field think that conscious information transfer is all that’s needed, when in fact most of our cognitive work is done subconsciously.

In the book, Clark highlights the need for us to redouble our focus on two areas, performance consulting and development facilitation. Performance consulting looks at how to optimize execution. It requires that we analyze tasks and root causes before we develop appropriate solutions—and, incidentally, it demands that we not always look to training as the solution. Development facilitation is focused on how people develop their knowledge and skills over time. Where performance consulting works directly through our outside efforts, development facilitation helps “people improve while working on their own and together.”


Nuggets From the Book:

  • “Our species has in many ways survived because we learned how to physically augment our resources.” We learned how to overcome our tendency to forget by providing ourselves with informational sources, including such things as reference materials, job aids, and performance support.
  • We in the workplace learning-and-performance field used to have a responsibility to help people execute their job tasks better. Now we also have a responsibility to help them be innovative.
  • We, as workplace learning-and-performance professionals, need to work backwards. “We need to identify the performance we desire and then decide how to distribute information among the individual, network, and resources—digital and real.”
  • “As much as possible, we should resist trying to change the individual, as this is difficult…Our focus must be on how we want people to perform, and we must figure out what can be ‘in the world’… and then what has to be ‘in the head.’”
  • “The principle here is to recognize that, when we want people to perform in a resourced environment, we should develop the formal learning to incorporate the performance resources in the [learning] experience. If we can avoid formal learning, we can and should, but when we can’t, we should develop the resources before we develop the training.”
  • “The very first thing to do is to stop doing what we are doing. That, arguably, is impossible, yet there needs to be fundamental change. We have to stop being order-takers and start being performance consultants and improvement facilitators.”
  • On-the-job learning requires an environment where people feel safe to contribute and make mistakes, diversity is valued, people and groups are open to new ideas, and time is made for reflection. We, as learning professionals, can help enable these tendencies.
  • “A performance support focus is a better starting point for organizations than courses!”
  • “A real revolution in social tools has taken place…these capabilities need to be leveraged for learning as well.”
  • “The room is smarter than the smartest person in the room if you manage the process right. If not, the room might be only as smart as the most dominant person in the room or the one with the most authority.”
  • Workplace learning professionals should consider the “Least Assistance Principle.” It’s counterintuitive for many of us, but it basically suggests that we answer the question, “What’s the least I can do to guide performance?” So instead of jumping in with a training solution, we should consider alternatives first.
  • Formal learning methods are fine for novices but for experts more informal learning methods are needed. See graph below.

Critiques of Our Current Practices:

  • Our tendency to have a course/event mindset keeps us from achieving real change.
  • Our tendency to focus on providing knowledge keeps us from focusing on decision-making and task competency.
  • Instead of designing instruction to make the learning intrinsically interesting, we use all manner of attention-getting gimmicks like throwing rubber squish-balls around the room.
  • One of the key things we do wrong is provide insufficient practice. “And we practice until someone gets it right, instead of practicing until they can’t get it wrong.”
  • As an industry we seem incapable of making job aids, even though they are often much more effective than training alone.


There’s one thing that I regret about the book. Clark needed, and deserved, a better editor/publisher—one who would have wrestled his lightning-bolt wisdom into a tighter package. For example, the book uses the term “To Hand,” but even after going through the book twice, I still don’t get what it means.


Banging the drum for revolution, Clark Quinn has done our field a great favor! His research-based focus is on target. His call for a performance focus captures the high ground.

If you’re new to the idea of a performance focus, the book will help you see through the smoke of current practices.

If you’re a performance true believer, you’ll deepen your passion and restock your armaments with fresh insights and imperatives.

You can buy the book by clicking below.


The world’s best writers use editors to improve their work. Architectural engineers have their calculations reviewed to ensure structural integrity. Doctors seek second opinions on complex cases.

Unfortunately, we in the workplace learning field–and often in the education field–are left in the dark about the strengths and weaknesses of our learning interventions. Mostly we use smile sheets (learner-response forms) to get feedback, even though hundreds of research studies show that smile-sheet results are not correlated with learning results. When we measure learning, we often do it in ways that incorporate severe forms of bias, providing ourselves with false data that pushes us into faulty decision-making.

Learning Audits enable learning professionals to get valid feedback about the strengths and weaknesses of their learning interventions. With the feedback they produce, learning audits help organizations maximize the benefits of their learning.

What is a Learning Audit?

“A learning audit is a systematic review of a learning program to determine the program’s strengths and weaknesses—with the aim to guide subsequent improvements of that learning program and/or other learning programs. Learning audits are conducted in a high-integrity manner to ensure validity and limit bias.”

Learning audits can examine classroom training, elearning, mobile learning, on-the-job learning, self-initiated learning, and academic learning. They can be relatively quick-and-dirty or they can be highly exhaustive. They can cost a little or cost a lot.

Learning audits can utilize the following data-gathering techniques:

  1. Interview learners, learners’ supervisors, learning designers, learning developers, learning deliverers, and other organizational stakeholders.
  2. Focus-group learning stakeholders—most likely in groups of similar individuals.
  3. Survey learning stakeholders.
  4. Job-shadow people as they learn and work on the job.
  5. Research-benchmark the learning program or prototype (or design intentions) based on a validated list of key learning factors (such as the Decisive Dozen).
  6. Analyze organizational artifacts (like company newsletters and bulletin boards) and other communication devices from a learning perspective.
  7. Create a list of the learning media that have been utilized.
  8. Create a list of the available learning media.
  9. Analyze the learning measurement approaches utilized.
  10. Review smile sheet results.
  11. Review results of learning assessments—especially scenario-based decisions, case studies, simulations, and realistic hands-on exercises.
  12. In addition to reviewing results that are assessed during or immediately after learning, seek to review results that assess learning after a delay of a week or more.
  13. Review on-the-job performance results that are routinely captured.
  14. Review business results, especially those that are linked to learning.
  15. Review the quality and use of prompting mechanisms (like job aids).
  16. Review the quality and use of on-the-job learning affordances, including coaching, social media, knowledge-management systems, team learning, etc.
  17. Review the supports in place for creativity-based insight learning.
  18. Review the supports in place for after-learning application.
  19. Develop and deploy improved smile sheets, learning assessments, and performance assessments.
  20. Conduct A-B testing on different versions of the same learning program.
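As a minimal illustration of the last technique, an A-B comparison of delayed-assessment scores could be sketched as follows. The data and the score scale here are entirely hypothetical (not from any actual audit); a real learning audit would gather real assessment results and apply a proper significance test:

```python
from statistics import mean, stdev

# Hypothetical delayed-assessment scores (percent correct, measured
# one week after training) for two versions of the same program.
version_a = [62, 70, 55, 68, 74, 60, 65, 71]
version_b = [78, 82, 69, 85, 77, 80, 74, 88]

def summarize(label, scores):
    """Print the mean, spread, and sample size for one group."""
    print(f"{label}: mean={mean(scores):.1f}, sd={stdev(scores):.1f}, n={len(scores)}")

summarize("Version A", version_a)
summarize("Version B", version_b)

# A simple effect estimate: the difference in group means. A real
# audit would also test statistical significance and check for
# sampling bias before drawing conclusions.
print(f"Difference (B - A): {mean(version_b) - mean(version_a):.1f} points")
```

Even a rough comparison like this gives far more decision-useful feedback than smile-sheet averages, because it measures retained learning rather than immediate reactions.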

What is Research Benchmarking?

Research Benchmarking is the process by which your learning interventions are benchmarked against research-based best practices.

Work-Learning Research excels in research benchmarking because of the exhaustive work we’ve done over more than a decade compiling scientific research on learning, memory, and instruction. By combining research-based knowledge and practical wisdom, our learning audits lead the world in providing actionable recommendations for improvement.

Your learning programs will be benchmarked against the Decisive Dozen, the 12 most important learning factors. In addition, your learning ecosystem will be reviewed to look for learning and application support, learning measurement issues, and business or organizational considerations.

Want to Know More?

If you’d like to discuss learning audits further, contact me, Dr. Will Thalheimer, at 1-617-718-0767 or email me by clicking here.

If you’d like to consider conducting your own learning audits, I encourage you to view this web page.


An article by Farhad Manjoo in the New York Times reports on Google's efforts to improve diversity. This is a commendable effort.

I was struck that while Google was utilizing scientists to devise the content of a diversity training program, it didn't seem to be utilizing research on the learning-to-performance process at all. It could be that Manjoo left it out of the article, or it could be that Google is missing the boat. Here's my commentary:

Dear Farhad,

Either this article is missing vital information–or Google, while perhaps using research on unconscious biases, is completely failing to utilize research-based best practices in learning-to-performance design. Ask almost any thought leader in the training-and-development field and they'll tell you that training by itself is extremely unlikely to substantially change behavior without additional supports.

By the way, the anecdotes cited for the success of Google's 90-minute training program are not persuasive. It's easy to find some anecdotes that support one's claims. Scientists call this "confirmation bias."

Believe it or not, there is a burgeoning science around what successful learning-to-performance solutions look like. This article, unfortunately, encourages the false notion that training programs alone will be successful in producing behavior change.

This article was originally published in Will’s Insight News, my monthly newsletter.

It has been updated and improved to include new information.

Click here if you want to sign up for my newsletter…

Radically Improved Action Planning
Using Cognitive Triggers to Support On-the-Job Performance

Most of us who have been trainers have tried one or more methods of action planning–hoping to get our learners to apply what they’ve learned back on the job. The most common form of action planning goes something like this (at the end of a training program):

“Okay, take a look at this action-planning handout. Think of 3 things from the course you’d like to take away and apply back on the job. This is critically important. If you feel you’ve learned something you’d like to use, you won’t get the results you want if you forget what your goals are. On the handout, you’ll see space to write down your 3 action-planning goals. I’m going to give you 20 minutes to do this because it’s so important!”

Unfortunately, that method is likely to get less than half the follow-through of another, research-based, method!

When we as trainers do action planning, we are recognizing that learning is not enough. We want to make sure that all of our passionate, exhaustive efforts at training are not wasted. If we’re honest with ourselves, we know that if our learners forget everything they’ve learned, then we really haven’t been effective. This goes for e-learning as well. There’s a lot of effort that goes into creating an e-learning course–and, if we can maximize the benefits through effective action planning, then we ought to do it.


Before sharing my radically improved action-planning method, I need to explain the thinking behind it. Look at the above diagram. It shows that the human mind is subject to both conscious and sub-conscious messages. It also shows that the sub-conscious channel uses a broader bandwidth–and that when humans process messages consciously, they often filter those messages in ways that limit their effectiveness.

One of the most important findings from psychological research in the past 10 years–I hate to call it “brain science” because that’s an inaccurate tease–is that much of what controls human thinking comes from, or is influenced by, sub-conscious primes. Speed limit signs (conscious messages to slow down) are not as effective as narrowing streets, planting trees near streets, and other sub-conscious influencers. Committing to a diet may not be as effective as using smaller dishes, removing snacks from eyesight, and shopping at farmers’ markets instead of in the processed-food aisles of grocery stores.

We workplace professionals tend to use the conscious communication channel almost exclusively–we think it’s our job to compile content, make the best arguments for its usefulness, and share information so that our learners acknowledge its value and plan to use it. But if a large part of human cognition is sub-conscious, shouldn’t we use that too? Don’t we have a professional responsibility to be as effective as we can?

My action-planning method does just that. It sets triggers that later create spontaneous sub-conscious prompts to action. I’m calling this “Triggered Action Planning”–a reminder that we are TAP-ping into our learners’ sub-conscious processing to help them remember what they’ve learned. SMILE.

The basic concept is this: We want learners, when they are back on the job, to be reminded of what they’ve learned. We should do this by aligning context–one of the Decisive Dozen research-based learning factors–in our training designs. We can do this by using more hands-on exercises, more real work, more simulations–but we can extend this to action planning as well.

The key is to set SITUATION-ACTION triggers. We want contextual situations to trigger certain actions. So for example, if we teach supervisors to bring their direct reports into decision-making, we want them to think about this when they are having team meetings, when they are discussing a decision with one of their direct reports, etc. The SITUATION could be a team meeting. The ACTION could be delegating a decision, asking for input, etc., as appropriate.

In action planning, it’s even simpler. Instead of just asking our learners what their goals are for implementing what they’ve learned, we also ask them to select situations when they will begin to carry out those goals. So for example:

  • GOAL: I will work with my team to identify a change initiative.
  • SITUATION-ACTION: At our first staff meeting in October, I will work with my team to identify a change initiative.

Remarkably, this kind of intervention–what researchers call “implementation intentions”–has been found to create large, significant effects, often doubling rates of actual follow-through!

I think this research finding is so important to workplace learning that I’ve devoted a whole section of my unpublished tome to considering how to use it. Instead of using the term “implementation intentions”–it’s such a mouthful–I just call this trigger-setting.

The bottom line here is that we may be able to double the likelihood that our learners actually apply what they’ve learned simply by having our learners link situations and actions in their action planning.

New Job Aid for Triggered Action Planning

You can easily create your own triggered-action planning worksheets or e-learning interactions, but I’ve got one ready to go that you can use as is–FREE OF CHARGE BECAUSE I LOVE TO SHARE–or you can just use it as a starting point for your own triggered-action-planning exercises.


Click here to download the triggered-action-planning job aid (as a PDF)

Click here for a Word version (so you can modify)



Gollwitzer, P. M., & Sheeran, P. (2006). Implementation intentions and goal achievement: A meta-analysis of effects and processes. Advances in Experimental Social Psychology, 38, 69-119.

Bjork, R. A., & Richardson-Klavehn, A. (1989). On the puzzling relationship between environmental context and human memory. In C. Izawa (Ed.) Current Issues in Cognitive Processes: The Tulane Floweree Symposium on Cognition (pp. 313-344). Hillsdale, NJ: Erlbaum.

Roediger, H. L., III, & Guynn, M. J. (1996). Retrieval processes. In E. L. Bjork & R. A. Bjork (Eds.), Memory (pp. 197-236). San Diego, CA: Academic Press.

Smith, S. M., & Vela, E. (2001). Environmental context-dependent memory: A review and meta-analysis. Psychonomic Bulletin & Review, 8, 203-220.

Thalheimer, W. (2013). The decisive dozen: Research review abridged. Available at the Work-Learning Research catalog.