Clark Quinn and I have been grappling with FUN-da-mental issues in the learning space over the years, and we finally decided to publish some of our dialogue.

In the latest conversation, Clark and I discuss how the tools in the learning field often don't send the right messages about how to design learning–that they unintentionally push us toward poor instructional designs.

You can read the discussion on Clark's world-renowned blog by CLICKING HERE.




Or, read an earlier discussion on how professionalized we are by clicking here.


Below is the feedback for the Learning-Objective Quick Quiz. If you haven’t taken the quiz yet, I recommend you do so before you read the feedback. Science says you’ll learn more and think more deeply if you do! And don’t you want to think more deeply?

I’m impressed with your persistence!

The feedback file is available by clicking here.

While you’re here, feel free to look around. This is my blog…

= Will Thalheimer


Hello! This is Dr. Will Thalheimer of Work-Learning Research. I'm partnering with Jeff Hurt of Velvet Chainsaw to provide PCMA Convening Leaders participants with an opportunity to see firsthand what subscription learning feels like, and to give you an in-depth experience so you can judge for yourself.

We are lucky to have the help of Count5, the builders of one of the most powerful subscription learning tools, QMINDshare, which we will be using to provide you with valuable post-conference information/engagement.

Our subscription-learning experience will be facilitated by Jeff and me over a period of about two months. It will consist of short interactions (usually less than five minutes each) that you can peruse at your leisure. It's all optional, so don't worry: you will not be overwhelmed. Also, this is a private and confidential experience. Once the experience is over, it's over. You won't have your email or name put on any other list!

Sign up below…

NOTE FROM THE PCMA CONFERENCE MONDAY: Hey everybody. Great session today. Jeff and I thank you for your thoughtful participation! CLICK HERE FOR JEFF AND WILL'S SLIDES

And, if you want to grab Will’s slides and job aids from his Tuesday session, CLICK HERE!


Provide the following information and we’ll get started on the subscription learning within a week or so…


Hello. I created this page to show you book cover ideas and explain why I like them. I would have done this within your online form, but the form only accepted links.


My first sketch.

I used a black background to convey a dark, provocative image.

I used “eyes” to convey that smile sheets are about helping us see…

I used the yellow-orange racing stripe because I thought this theme could be carried out to future books I will write/publish in this series.

Will's Book Cover Idea 1


Second Sketch

City skyline is to connote business and industry.

People are added because it’s the people who are our learners who are important, and because we as instructional designers etc., are not always happy with our smile sheets.

I added the logo just to let you know I have different color logos (and especially because if it was a black spine, I’m not sure my color logo on white would work that well).

Will's Book Cover Idea 2


Cover Example 1

I like this cover because it is dark and foreboding, and because it uses paper to get that effect. I thought perhaps a crumpled smile sheet, which is how smile sheets are traditionally treated, might make an interesting cover.



Cover Example 2

It’s dark, and it has a blurry person in the background. Since smile sheets are supposed to bring things into view, blurry people in the background may convey this.


Cover Example 3

Dark, Foreboding, Artistic



Cover Example 4

Dark, Foreboding, Artistic



Cover Example 5

I like this one because the cover idea is "inspired." I'm not showing you this for the look and feel, but because the graphic brilliantly conveys the idea of the book, which is about mobile learning…



That’s it. Thanks!


Oh, I just noticed that a lot of books from my field are advertised on the right panel here, so you can look at them as well.


= Will




Yo Clark, I really liked your new book, Revolutionize Learning and Development, but there's one thing I'm not sure I'm fully behind: your recommendation that we as learning professionals kowtow to the organization, building our learning interventions aimed solely at meeting organizational needs. I grew up near Philadelphia, so I'm partial to Rocky Balboa, using the interjection "Yo," and rooting for the little guy. What are you thinking? Isn't revolution usually aimed against the powerful?


Will, what is powerful are the forces against needed change.  L&D appears to be as tied to an older age as Rocky is!  I’m not saying a complete abdication to the organization, but we certainly can’t be oblivious to it either.  The organization doesn’t know learning, to be sure, and should be able to trust us on performance support and informal learning too.  But do you really think that most of what is happening under the guise of L&D is a good job on the formal learning side?


Clark, of course not. Much of L&D is like Rocky's brother-in-law Paulie, having an inner heart of gold, but not living up to full effectiveness. I wrote about the Five Failures of Workplace Learning Professionals three years ago, so I'm on the record that we could do better. And yes, there are lots of forces allied against us, so I'm glad you're calling for revolution. But back to the question, Apollo! To whom do we have more responsibility, the organizations we work for or our profession? To whom should we give our Creed?


Will, your proposed bout is a non-starter! It's not either/or; we need to honor both our organization and our profession (and, I'll argue, we're currently doing neither). When we're building our interventions, they should serve the organization's needs, not just its wants. We can't be order takers; we need to go to the mat (merrily mixing my metaphors) to find out the real problem, and use all solutions (not just courses). Mickey'd tell you: you've got to have heart, but also do the hard yards. Isn't the real tension between what we know we should be doing and what we're actually doing?


I am so much in agreement! Why are we always order takers? You want fries with that? Here’s where I think some in our profession go overboard on the organization-first approach. First, like you say, many don’t have a training-request process that pushes their organizations to look beyond training as the singular leverage point to performance improvement. Second, some measurement “gurus” claim that what’s most important is to measure organizational results—while reneging on our professional responsibility to measure what we have the most control over—like whether people can make good work-related decisions after we train them or even remember what we taught them. Honestly, if the workplace learning field was a human being, it would be a person you wouldn’t want to have as a friend—someone who didn’t have a core set of values, someone who would be prone to following any fad or phony demigod, someone who would shift allegiances with the wind.


Now you’re talking; I love the idea of a training-request process! I recall an organization where the training head had a cost/benefit form for every idea that was brought to him.  It’s not how much it costs per bum per seat per hour, but is that bum per seat per hour making a difference!  And we can start with the ability to make those decisions, but ultimately we ought to also care that making those decisions is impacting the organization too.  I certainly agree we have to be strong and fight for what’s right, not what’s easy or expedient.  Serious elearning for the win!


We seem to be coming to consensus; however, you've inspired another question. We agree that we have two responsibilities, one to our professional values and one to our organization's needs. But should we add another stakeholder to this mix? I have my own answer, inherent in one of my many below-the-radar models, but I'd like your wisdom. Here's the question: do we have a responsibility to our learners/performers? If we do have responsibilities to them, what are those responsibilities? And here is perhaps the hardest question: in comparison to the responsibility we have to our organizations, is our level of responsibility to our learners/performers higher, lower, or about the same? Remember, the smaller the ring, the harder it is to run…the more likely we get hit by a haymaker. Good luck with these questions…


Bringing in a ringer, eh? I suppose you could see it either of two ways: it's our obligation to our profession and our organization to consider our learners, or they're another stakeholder. I kinda like the former, as there're lots of stakeholders: society, learners, clients, SMEs, colleagues, the profession, and more. In fact, I'm inclined to go back to my proposition that it's not either/or. Our obligation as professionals is to do the job that needs to be done in ways that responsibly address our learners, our organizations, and all stakeholders. To put it in other words, designing interventions in ways that optimally equip learners to meet the needs of the organization is an integration of responsibilities, not a tradeoff. We need to unify our approach like boxing needs to unify the different titles!


From what I hear, boxing is dying as a spectator sport precisely because of all the discord and multiple sanctioning bodies. We in the learning-and-performance field might take this as a warning—we need to get our house in order, follow research-based best practices, and build a common body of knowledge and values. It starts with knowing who our stakeholders are and knowing that we have a responsibility to the values and goals of our profession. I like to give our learners a privileged place—at the same level of priority as the organization. It's not that I think this is an easy argument in an economic sense, because the organization is paying the bills after all. But too often we forget our learners, so I like to keep them front and center in our learning-to-performance models.

Thanks, Clark, for the great discussion. And thanks for agreeing to host the next one on your blog.

Well, it's not clear whether Oprah's doing subscription learning, but her press release says she is.

Calls to Oprah to guest star on this website have not been returned.

Just kidding! We never called Oprah. Would you return our calls?


The world’s best writers use editors to improve their work. Architectural engineers have their calculations reviewed to ensure structural integrity. Doctors seek second opinions on complex cases.

Unfortunately, we in the workplace learning field, and often in the education field as well, are left in the dark about the strengths and weaknesses of our learning interventions. Mostly we use smile sheets (learner-response forms) to get feedback, even though hundreds of research studies show that smile-sheet results are not correlated with learning results. When we measure learning, we often do it in a way that incorporates severe forms of bias, providing ourselves with false data that pushes us into faulty decision-making.

Learning Audits enable learning professionals to get valid feedback about the strengths and weaknesses of their learning interventions. With the feedback they produce, learning audits help organizations maximize the benefits of their learning.

What is a Learning Audit?

“A learning audit is a systematic review of a learning program to determine the program’s strengths and weaknesses—with the aim to guide subsequent improvements of that learning program and/or other learning programs. Learning audits are conducted in a high-integrity manner to ensure validity and limit bias.”

Learning audits can examine classroom training, elearning, mobile learning, on-the-job learning, self-initiated learning, and academic learning. They can be relatively quick-and-dirty or they can be highly exhaustive. They can cost a little or cost a lot.

Learning audits can utilize the following data-gathering techniques:

  1. Interview learners, learners’ supervisors, learning designers, learning developers, learning deliverers, and other organizational stakeholders.
  2. Focus-group learning stakeholders—most likely in groups of similar individuals.
  3. Survey learning stakeholders.
  4. Job-shadow people as they learn and work on the job.
  5. Research-benchmark the learning program or prototype (or design intentions) based on a validated list of key learning factors (such as the Decisive Dozen).
  6. Analyze organizational artifacts (like company newsletters and bulletin boards) and other communication devices from a learning perspective.
  7. Create a list of the learning media that have been utilized.
  8. Create a list of the available learning media.
  9. Analyze the learning measurement approaches utilized.
  10. Review smile sheet results.
  11. Review results of learning assessments—especially scenario-based decisions, case studies, simulations, and realistic hands-on exercises.
  12. In addition to reviewing results that are assessed during or immediately after learning, seek to review results that assess learning after a delay of a week or more.
  13. Review on-the-job performance results that are routinely captured.
  14. Review business results, especially those that are linked to learning.
  15. Review the quality and use of prompting mechanisms (like job aids).
  16. Review the quality and use of on-the-job learning affordances, including coaching, social media, knowledge-management systems, team learning, etc.
  17. Review the supports in place for creativity-based insight learning.
  18. Review the supports in place for after-learning application.
  19. Develop and deploy improved smile sheets, learning assessments, and performance assessments.
  20. Conduct A-B testing on different versions of the same learning program.

What is Research Benchmarking?

Research Benchmarking is the process by which your learning interventions are benchmarked against research-based best practices.

Work-Learning Research excels in research benchmarking because of the exhaustive work we've done over more than a decade compiling scientific research on learning, memory, and instruction. By combining research-based knowledge and practical wisdom, our learning audits lead the world in providing actionable recommendations for improvement.

Your learning programs will be benchmarked against the Decisive Dozen, the 12 most important learning factors. In addition, your learning ecosystem will be reviewed to look for learning and application support, learning measurement issues, and business or organizational considerations.

Want to Know More?

If you’d like to discuss learning audits further, contact me, Dr. Will Thalheimer, at 1-617-718-0767 or email me by clicking here.

If you’d like to consider conducting your own learning audits, I encourage you to view this web page.



Subscription Learning is the idea of providing short nuggets (usually less than 10 minutes each) of learning content or learning interaction. Often, we think of this as an information-delivery platform. In other words, we think of it as training that is spread over time in tiny packets of training content.

But we really ought to think beyond this limited “training” perspective! Subscription Learning can be so much more.

Just this week, I spoke with Marty Rosenheck of Cognitive Advisors, a cognitive-science-inspired consultancy and learning-development shop. Cognitive Advisors has recently released TREK, a "Learning Experience Manager" that allows organizations to go beyond training and support people as they learn in the workplace.

From the Cognitive Advisors website, here’s what TREK does:

  • Manages, tracks and reports on the full range of learning
  • Is built from the ground-up with the Experience API (Tin Can) to capture learning outside the LMS
  • Offers cloud-based software designed for today’s mobile workforce
  • Aligns all learning experiences to competencies
  • Supports personalized learning paths to tailor learning for each learner

This technology wasn’t possible just a few years ago, and the future of this product category, for which Marty has coined the term "Learning Experience Manager," seems exceedingly bright. The future looks promising because the technology fulfills a real need: providing support for on-the-job learning.

Indeed, one of the problems with the last decade’s flirtation with informal learning (and on-the-job learning) is that it happens willy-nilly, without adequate support. People can learn the wrong things, they can learn too slowly, they can get frustrated and give up, or they can fail to learn altogether. Just as learning researchers have found that discovery learning is not generally effective unless it is "guided," informal learning is often not as effective and efficient as it ought to be.

Real-world Example

Here’s a real-world example of how TREK has been used:

Professionals in the water-quality-assessment field have to get up to speed quickly to do their jobs. And, they have to do their jobs right. There’s no tolerance for poor water-quality testing. The problem is compounded because there’s so much to learn. New folks often feel they’re gazing through muddy waters until they’ve got lots of experience under their belts–and they have clarity about what to do when. They rely on experienced people to answer questions, provide guidance, and monitor their progress.

The downside is that it’s very expensive to keep sending experienced people out with the new folks, and it’s very inefficient and frustrating when inexperienced people have to keep calling their supervisors and mentors from the field.

Enter TREK. The Water Quality Association, working with Cognitive Advisors, piloted a program to provide their members with a structured on-the-job learning path, enabling learners to learn on the job–while being coached by their immediate supervisors.

TREK worked using employees’ smartphone sensors (camera, audio and video recorder, and GPS). Employees captured evidence of their critical actions at each step in their learning path. This evidence was submitted through TREK to each person’s designated manager-coach. As each step was completed, managers were notified and prompted to review their direct reports’ submissions.

Managers provided brief feedback–either written or in a recorded audio nugget–and this feedback was presented to the learners. Managers weren’t left on their own to flounder in their coaching activities. They were provided with coaching guides, checklists, and success criteria within the TREK interface on their smartphones.

Interestingly, what TREK did in this case was to provide support to both learners and the learners’ managers. For both groups, this improved the effectiveness and efficiency of the learning/coaching experience.

Not Just a Technology

As Marty explained to me in discussing this new product category, the reason Learning Experience Management technology is possible now is that several forces have come together. First and foremost, a new data specification has been developed (the Experience API, also known as Tin Can) that enables learning experiences to be collected and categorized, where once we could only capture training-related information. Second, mobile technology is now ubiquitous, thanks to smartphones. Third, cloud computing has become the norm, enabling continuous connectivity between learners and others. Finally, learning analytics, social media, and badging have added to the user experience.
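To make the Experience API piece concrete, here is a minimal sketch of an xAPI ("Tin Can") statement: the actor/verb/object record that a Learning Record Store collects. The learner name, email address, and activity URL below are hypothetical examples invented for illustration; the verb IRI follows the ADL verb vocabulary.

```python
import json

# A minimal xAPI ("Tin Can") statement has three required parts:
# actor (who), verb (did what), and object (to what activity).
# Learner identity and activity URL are hypothetical examples.
statement = {
    "actor": {
        "objectType": "Agent",
        "name": "Example Learner",
        "mbox": "mailto:learner@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "objectType": "Activity",
        "id": "http://example.com/activities/water-sampling-step-3",
        "definition": {"name": {"en-US": "Collect field water sample"}},
    },
}

# In practice, this JSON would be POSTed to a Learning Record Store (LRS);
# here we simply serialize it to show the statement structure.
print(json.dumps(statement, indent=2))
```

The key point is that any experience, not just course completions, can be expressed this way, which is what lets a tool like TREK track on-the-job learning outside the LMS.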

But even given these technological advances, the thing that makes TREK and its performance-support coaching possible is that it aligns with human cognition. The key to providing a great learning experience is to ensure that the learning content is captured and presented in ways that can maximize learning.

Marty illustrated the key link in the process as he described how he leads customers through a cognitive task analysis process he’s perfected over the years. He calls it knowledge harvesting. It reminded me very much of my days building simulations and working with SMEs to extract knowledge. It’s incredibly valuable, but you don’t want to skimp on the process.

Why Is This Subscription Learning?

TREK provides a subscription-learning experience because, by focusing on individual workplace tasks, it provides small chunks of learning that are spaced over time.

The diagram Cognitive Advisors uses to explain their cognitive apprenticeship model highlights a number of small nuggets on the learning-path trajectory.
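The "spaced over time" idea can be sketched as a simple expanding-interval schedule, a common way to model spaced learning. This is an illustrative model only: TREK's actual pacing is driven by the learner's workplace tasks, and the function name and parameters below are hypothetical.

```python
from datetime import date, timedelta

def spaced_schedule(start, n_nuggets, first_gap_days=1, factor=2.0):
    """Generate delivery dates for learning nuggets on expanding intervals.

    Each gap between nuggets grows by `factor`, so review happens
    just as memories would otherwise begin to fade. Parameters are
    illustrative, not drawn from any particular product.
    """
    dates, gap, current = [], first_gap_days, start
    for _ in range(n_nuggets):
        dates.append(current)
        current = current + timedelta(days=round(gap))
        gap *= factor
    return dates

# Five nuggets starting Nov 1: delivered Nov 1, 2, 4, 8, and 16.
sched = spaced_schedule(date(2014, 11, 1), 5)
```

The point of the sketch is the shape of the schedule, small touches whose gaps widen over time, rather than any specific dates or parameters.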


Folks to Watch

Marty Rosenheck and his business partner Colleen Enghauser are folks to watch–as is their company, Cognitive Advisors. Their dedication to creating learning technology aligned with the learning research is inspiring!