In today’s New York Times there is a great article, “Who’s Minding the Mind?”, by Benedict Carey, that sums up a large number of research studies on human cognition showing that human beings are more reactive than we might think. We tend to believe that we are proactive and consciously in control of our thoughts and actions, but these studies show that much of what we do and think is driven by hard-wired, often unconscious processes.

For example, the article cites how sitting near a briefcase (as opposed to a backpack) can make people more competitive. As Carey writes:

New studies have found that people tidy up more thoroughly when there’s a faint tang of cleaning liquid in the air; they become more competitive if there’s a briefcase in sight, or more cooperative if they glimpse words like “dependable” and “support” — all without being aware of the change, or what prompted it.

This basic fact about human behavior is, of course, relevant to the learning-and-performance field. One of the things I’ve talked about for years is the notion of "spontaneous remembering." If we design learning well, we make it more likely that our learners, later on the job, will spontaneously trigger memories of what they’ve learned. We can do this best by requiring our learners to use realistic cues from the learning context when making real-world decisions and taking real-world actions. This is why simulations, when well designed, are so effective.

When learners process learning objectives or prequestions before encountering learning material, they are primed to pay attention to the relevant material. It’s not necessarily a conscious process, but it works.

There are many examples available, but here’s another point: the learner-centric movement of the 1990s and 2000s has relied too heavily on the notion that learners always know best, that they are in conscious control of their learning, and that we just need to let them make the best decisions.

When we realize that our learners are more deterministically driven than we want to believe (it’s about free will, a little, isn’t it?), we have more work to do if we really want to drive maximum performance. Even when our clients consciously want to do something, we may be able to help them reach their goals by setting up learning and performance situations that unconsciously trigger the behavior they want to achieve.

I just came across another sighting of the mythological numbers of memory retention, this time on the webpage of HRDQ.

They claim that "Research shows people remember only 5% of what they hear in a lecture. But they retain 75% when they learn by doing." Bulloney!

If you want to read my full debunking, click here.

If you want to see many bogus sightings, click here and scroll down.

The concept of evidence-based practice is bubbling up all over. I started Work-Learning Research in 1998 to bridge the gap between research and practice. Ruth Clark has been talking about evidence-based recommendations for years. The medical profession has transformed itself to focus on evidence-based medicine. The No Child Left Behind initiative was based on the concept of evidence-based education; though it has been poorly implemented, the core concept is sound. Kevin Kruse, CEO of the custom e-learning company Axiom Health, has trademarked the term “Evidence-Based Training.” Management gurus Jeffrey Pfeffer and Bob Sutton have recently written a best-selling book (Pfeffer & Sutton, 2006, Hard Facts, Dangerous Half-Truths and Total Nonsense: Profiting from Evidence-Based Management). ISPI (the International Society for Performance Improvement) has been promoting evidence-based practices in its CPT certification program since the 1990s.

Evidence-Based Practice is on an upward trajectory, but what does it mean for those of us in the learning-and-performance field? I’m not going to go into depth now, but I want to highlight a few critical points.

  1. Evidence-Based Learning (EBL) DOES NOT simply mean that we follow research-based prescriptions.
  2. Evidence-Based Learning (EBL) requires us also to measure our own performance. In other words, we ought to be routinely gathering good EVIDENCE about how well our learning interventions are working. Only by having feedback loops can we learn from our performance.
  3. Evidence-Based Learning (EBL) requires us also to build continuous cycles of improvement into our practices. After gathering and analyzing the evidence, we need to act on it. Then we need to evaluate, analyze, and act again.
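The continuous cycle described in points 2 and 3 can be sketched in code. This is a minimal, purely illustrative sketch; the function names, metric, and scores are all hypothetical placeholders, not anything prescribed here:

```python
# Hypothetical sketch of a continuous improvement cycle:
# gather evidence about a learning intervention, analyze it against
# a goal, act on the evidence, and repeat. All numbers are invented.

def improvement_cycle(measure, revise, intervention, target=80, max_cycles=10):
    """Repeat measure -> analyze -> act until the target score is met."""
    for cycle in range(max_cycles):
        score = measure(intervention)       # gather evidence
        if score >= target:                 # analyze: is the goal met?
            return intervention, score, cycle
        intervention = revise(intervention, score)  # act on the evidence
    return intervention, measure(intervention), max_cycles

# Toy usage: each revision improves a made-up effectiveness score by 10.
final, score, cycles = improvement_cycle(
    measure=lambda i: i["effectiveness"],
    revise=lambda i, s: {"effectiveness": s + 10},
    intervention={"effectiveness": 50},
)
# After three revisions the score reaches the target and the loop stops.
```

The point of the sketch is simply that measurement, analysis, and action form one loop, not three disconnected activities.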

I spend an inordinate amount of time every year culling practical learning wisdom from the world’s preeminent refereed journals on learning, memory, and instruction. I may be one of the most passionate advocates for learning research living today. Still, I believe that following research-based recommendations provides less than half, probably less than one-third, of the power of a full practice of evidence-based learning.

In fact, I’ve come to believe that, as a field, we have reached only a small measure of our potential because we don’t utilize evidence-based practices. We don’t have adequate feedback loops. We act too often on superstition, on the ideas embedded in commercial messaging, and on our learners’ lowest-common-denominator comfort zones.

The bottom-line recommendation is this: only with true evidence-based practices, not warmed-over attempts to follow a few research-based prescriptions, can you build a maximally effective learning program.

Sorry I can’t provide more specifics in this short blog format. If you want to know more, feel free to get in touch with me.

The most important question that instructional designers can ask is:

“What do learners need to be able to do, and in what situations do they need to do those things?”

While we might discount such a simple question as insignificant, it brilliantly forces us to focus on our ultimate goals and helps us align our learning interventions with the human learning system.

Too many of us design with a focus on topics, content, and knowledge. This tendency pushes us, almost unconsciously, to create learning that is boring, overloaded with information, and bereft of practice in realistic situations.

The Magic Question requires us to be relevant. For workplace learning, it focuses our thinking on learners’ future job situations. For education, it focuses our thinking on the real-world relevance of our academic topics.

The Magic Question in Practice

In practice, the Magic Question forces us to begin our instructional-design efforts by creating not only a list of instructional objectives but also a list of performance situations. For example, if we’re creating leadership training, we not only need to compile objectives like, “For most decisions, it can be helpful to bring your direct reports into decision-making, so as to increase the likelihood that they will bring energy and passion to implementing decisions.” We also need to compile a list of situations where this objective is relevant: for example, in weekly staff meetings, in project meetings, in one-on-one face-to-face conversations, in phone conversations, and so on. We should also note where it does not apply: it holds for general decision making, but not in situations where time is urgent, where safety is at issue, or where legal ramifications are evident.
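To make the pairing of objectives and performance situations concrete, here is a small sketch of how such a list might be captured explicitly. The structure, names, and example situations are all illustrative assumptions, not a tool or format the Magic Question prescribes:

```python
# Hypothetical sketch: pairing an instructional objective with the
# performance situations where it does and does not apply.
# All names and example data below are illustrative.

from dataclasses import dataclass, field

@dataclass
class Objective:
    description: str
    relevant_situations: list = field(default_factory=list)
    excluded_situations: list = field(default_factory=list)

    def applies_to(self, situation: str) -> bool:
        """An objective applies when the situation is listed as relevant
        and is not explicitly excluded."""
        return (situation in self.relevant_situations
                and situation not in self.excluded_situations)

leadership = Objective(
    description=("Bring direct reports into decision-making to increase "
                 "their energy and passion when implementing decisions."),
    relevant_situations=["weekly staff meeting", "project meeting",
                         "one-on-one conversation", "phone conversation"],
    excluded_situations=["time-urgent decision", "safety issue",
                         "legal ramifications"],
)

print(leadership.applies_to("weekly staff meeting"))  # True
print(leadership.applies_to("safety issue"))          # False
```

Even on paper rather than in code, the discipline is the same: every objective carries its own list of situations where it should, and should not, be triggered.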

By framing our instructional-design projects in this way, we come to think about our learning designs in ways that are much more action-oriented, relevant, and practical. The framing makes it more likely that we will align our learning and performance contexts, so that our learners, in their future situations, will spontaneously remember what we’ve taught them. It makes it more likely that we will focus on practice instead of overloading our learners with information. It also makes it more likely that we will use relevant scenarios that more fully engage our learners. Finally, the Magic Question forces our SMEs (subject-matter experts) to reformulate their expertise into potent, practical packages of relevant material. It’s not always easy to bend SMEs to this discipline, but after the pain, they’ll thank you profusely as together you push their content to a much higher level.

Obviously, there is more to be said about how the Magic Question can be integrated into learning-design efforts. Still, as my clients have reported, the Magic Question has within it a simple power to (1) change the way we think about instructional design and (2) transform the learning interventions we build.