I just came across a letter I wrote back in 2001 to the editor of the Association for Psychological Science’s magazine. I’m sharing it because it shows how little progress we’ve made in creating an ecosystem where research translators play a vital role in disseminating research wisdom.

In the letter, I argued that research wholesalers are needed to bridge the chasm between academic researchers and practitioners.

You can read the letter here.

When I started playing the research-translator role full-time in 1998, I was full of hope that the role would allow me to prosper and that many more research translators would join the fold. At that time, only Ruth Clark and I were doing this in the workplace learning field.

Where are we now? Have things gotten better?

Yes! And No! We now have a handful of folks doing research translation full bore outside the academy, earning their living as consultants, speakers, research directors, book writers, workshop presenters, learning strategists, learning evaluators, or some combination. Ruth Clark is semi-retired. I, Will Thalheimer, am still at it. We’ve got Patti Shank, Julie Dirksen, Mirjam Neelen, Donald Clark, Jane Bozarth, and Clark Quinn. We’ve got folks who focus more generally on learning, like Ulrich Boser. It’s not always an easy existence for most of these folks, but they don’t show signs of backing down.

Back in 2001, however, I envisioned something a bit different. Today’s research translators are scratching out a living through sheer entrepreneurial ingenuity. I had envisioned the academy embracing research translators as critical to its mission—and paying them a sustainable salary for their efforts. This is not going to happen any time soon, nor are our trade associations stepping up to provide well-paying roles for research translators. You’d think that the most well-compensated of our trade-association leaders—those bringing home seven-figure incomes and funding dancing musical extravaganzas—could afford to nick their salaries and fund a research translator or two to ensure their members were being presented with the most powerful science-based recommendations.

Unfortunately, the forces in the workplace learning field are misaligned. There is no journalism in our field to hold the powerful accountable. There is little or no learning-measurement accountability to push us toward better learning designs and, hence, toward proven research-based recommendations.

But! There are some damn good people who want to create the most effective learning possible. They are driving excellence even with the perfect storm blowing us hither and yon. I’m probably a bit biased, but I see more and more people who want to know what the research says—who want to build the most effective learning possible. On the flip side, I also see an unwillingness among organizations to pay for research wisdom. Well, they’ll pay for opinion research to find out what everybody else is doing, but they won’t pay for scientific research. They seem to expect that this can be gained quickly from Google.

I’m always an optimist. I figure, if we stand by the river long enough, we will see poor practices washed away.

Anyway, back to that letter. I’m kind of proud of it. I’m happy to have happened upon it today.

Congratulations to Steve Semler who has become Work-Learning Research’s first certification earner by successfully completing the Work-Learning Academy course on Performance-Focused Smile Sheets!

The certification workshop is not yet available to the public, but Steve generously agreed to take the program before its release. Note that his certification verification can be viewed here.

Those who want to be notified of the upcoming release date can sign up here.

The 70-20-10 Framework has been all the rage for the last five or ten years in the workplace learning field. Indeed, I organized a great debate about 70-20-10 through The Debunker Club (you can see the tweet stream here). I have gone on record saying that the numbers don’t have a sound research backing, but that the concept is a good one—particularly the idea that we as learning professionals ought to leverage on-the-job learning where we can.

What is 70-20-10?

The 70-20-10 framework is built on the belief that 10% of workplace learning is, or should be, propelled by formal training; that 20% is, or should be, enabled by learning directly from others; and that 70% comes, or should come, from employees’ learning through workplace experiences.

Supported by Research?

Given all the energy around 70-20-10, you might think that lots of rigorous scientific research has been done on the framework. Well, you would be wrong!

In fact, as of today (April 19, 2019), only one study of the framework has been published in a scientific journal (my search of PsycINFO reveals just one). In this post, I will review that study, published last year:

Johnson, S. J., Blackman, D. A., & Buick, F. (2018). The 70:20:10 framework and the transfer of learning. Human Resource Development Quarterly. Advance online publication.

Caveats

All research has strengths, weaknesses, and limitations—and it’s helpful to acknowledge these so we can think clearly. First, one study cannot be definitive, and this is just one study. Second, this study is qualitative and relies on subjective inputs to draw its conclusions; ideally, we’d like more objective measures. Finally, it gathers data from a small sample of public-sector workers, whereas ideally we’d want a broader, more diverse range of participants.

Methodology

The researchers found a group of organizations that had been bombarded with messages and training encouraging use of the 70-20-10 model. Specifically, starting in 2011, the APSC (the Australian Public Service Commission) encouraged the Australian public sector to embrace 70-20-10.

The specific study “draws from the experiences of two groups of Australian public sector managers: senior managers responsible for implementing the 70:20:10 framework within their organization; and middle managers who have undergone management capability development aligned to the 70:20:10 framework. All managers were drawn from the Commonwealth, Victorian, Queensland, and Northern Territory governments.”

A qualitative approach was chosen, according to the researchers, “given the atheoretical nature of the 70:20:10 framework and the lack of theory or evidence to provide a research framework.”

The qualitative approaches used by the researchers were individual structured interviews and group structured interviews.

The researchers chose people to interview based on their experience using the 70-20-10 framework to develop middle managers. “A purposive sampling technique was adopted, selecting participants who had specific knowledge of, and experience with, middle management capability development in line with the 70:20:10 framework.”

The researchers used a qualitative data-analysis program (NVivo) to help them organize and make sense of the qualitative data (the words collected in the interviews). According to Wikipedia, “NVivo is intended to help users organize and analyze non-numerical or unstructured data. The software allows users to classify, sort and arrange information; examine relationships in the data; and combine analysis with linking, shaping, searching and modeling.”

Overall Results

The authors conclude the following:

“In terms of implications for practice, the 70:20:10 framework has the potential to better guide the achievement of capability development through improved learning transfer in the public sector. However, this will only occur if future implementation guidelines focus on both the types of learning required and how to integrate them in a meaningful way. Actively addressing the impact that senior managers and peers have in how learning is integrated into the workplace through both social modeling and organizational support… will also need to become a core part of any effective implementation.”

“Using a large qualitative data set that enabled the exploration of participant perspectives and experiences of using the 70:20:10 framework in situ, we found that, despite many Australian public sector organizations implementing the framework, to date it is failing to deliver desired learning transfer results. This failure can be attributed to four misconceptions in the framework’s implementation: (a) an overconfident assumption that unstructured experiential learning will automatically result in capability development; (b) a narrow interpretation of social learning and a failure to recognize the role social learning has in integrating experiential, social and formal learning; (c) the expectation that managerial behavior would automatically change following formal training and development activities without the need to actively support the process; and (d) a lack of recognition of the requirement of a planned and integrated relationship between the elements of the 70:20:10 framework.”

Specific Difficulties

With Experiential Learning

“Senior managers indicated that one reason for adopting the 70:20:10 framework was that the dominant element of 70% development achieved through experiential learning reflected their expectation that employees should learn on the job. However, when talking to the middle managers themselves, it was not clear how such learning was being supported. Participants suggested that one problem was a leadership perception across senior managers that middle managers could automatically transition into middle management roles without a great deal of support or development.”

“The most common concern, however, was that experiential learning efficacy was challenged because managers were acquiring inappropriate behaviors on the job based on what they saw around them every day.”

“We found that experiential learning, as it is currently being implemented, is predominantly unstructured and unmanaged, that is, systems are not put in place in the work environment to support learning. It was anticipated that managers would learn on the job, without adequate preparation, additional support, or resourcing to facilitate effective learning.”

With Social Learning

“Overall, participants welcomed the potential of social learning, which could help them make sense of their context, enabling both sense making of new knowledge acquired and reinforcing what was appropriate both in, and for, their organization. However, they made it clear that, despite apparent organizational awareness of the value of social learning, it was predominantly dependent upon the preferences and working styles of individual managers, rather than being supported systematically through organizationally designed learning programs. Consequently, it was apparent that social learning was not being utilized in the way intended in the 70:20:10 framework in that it was not usually integrated with formal or experiential learning.”

Mentoring

“Mentoring was consistently highlighted by middle and senior managers as being important for both supporting a middle manager’s current job and for building future capacity.”

“Despite mentoring being consistently raised as the most favored form of development, it was not always formally supported by the organization, meaning that, in many instances, mentoring was lacking for middle managers.”

“A lack of systemic approaches to mentoring meant it was fragile and often temporary.”

Peer Support

“Peer support and networking encouraged middle managers to adopt a broader perspective and engage in a community of practice to develop ideas regarding implementing new skills.”

“However, despite managers agreeing that networks and peer support would assist them to build capability and transfer learning to the workplace, there appeared to be few organizationally supported peer learning opportunities. It was largely up to individuals to actively seek out and join their own networks.”

With Formal Learning

“Formal learning programs were recognized by middle and senior managers as important forms of capability development. Attendance was often encouraged for new middle managers.”

“However, not all experiences with formal training programs were positive, with both senior and middle managers reflecting on their ineffectiveness.”

“For the most part, participants reported finishing formal development programs with little to no follow up.”

“There was a lack of both social and experiential support for embedding this learning. The lack of social learning support partly revolved around the high workloads of managers and the lack of time devoted to development activities.”

“The lack of experiential support and senior management feedback meant that many middle managers did not have the opportunity to practice and further develop their new skills, despite their initial enthusiasm.”

“A key issue with this was the lack of direct and clear guidance provided by their line managers.”

“A further issue with formal learning was that it was often designed generically for groups of participants…  The need for specificity also related to the lack of explicit, individualized feedback provided by their line manager to reinforce and embed learning.”

What Should We Make of This Preliminary Research?

Again, with only one study—and a qualitative one conducted with a narrow set of participants—we should be very careful in drawing conclusions.

Still, the study can help us develop hypotheses for further testing—both by researchers and by us as learning professionals.

We also ought to be careful about casting doubt on the 70-20-10 framework itself. Indeed, the research suggests that the framework was not always implemented as intended. On the other hand, when a model is shown to be implemented poorly in routine practice, we should become skeptical that it will produce reliable benefits.

Here is a list of reflections the research generated in me:

  1. Why so much excitement for 70-20-10 with so little research backing?
  2. Formal training was found to have all the problems normally associated with it, especially the lack of follow-through and after-training support—so we still need to work to improve it!
  3. Who will provide continuous support for experiential and social learning? In the research case, the responsibility for implementing on-the-job learning experiences was not clear, and so the implementation was not done or was poorly done.
  4. What does it take in terms of resources, responsibility, and tasking to make experiential and social learning useful? Or, is this just a bridge too far?
  5. The most likely leverage point for on-the-job learning still seems, to me, to be managers. If this is a correct assumption—and really it should be tested—how can we in Learning & Development encourage, support, and resource managers for this role?

Sign Up For Will’s News by Clicking Here

I’ve had the distinct honor of being invited to speak at the Learning Technologies conference in London for three years in a row. This year, I talked about two learning innovations:

  • Performance-Focused Learner Surveys
  • LTEM (The Learning-Transfer Evaluation Model)

It was a hand-raising experience!

Most importantly, they have done a great job capturing my talk on YouTube.

Indeed, although I’ve made some recent improvements in the way I talk about these two innovations, the video does an excellent job of capturing the main points I’ve been making about the state of learning evaluation and about how these innovations are tearing down some of the obstacles that have held us back from doing good evaluation.

Thanks to Stella Collins at Stellar Learning for organizing and facilitating my session!

Special thanks to the brilliant conference organizer and learning-industry influencer Donald Taylor for inviting and supporting me and my work.

Again, click here to see the video of my presentation at Learning Technologies London 2019.

While I was in London a few months ago to talk about learning evaluation, I was also interviewed by LearningNews on the topic.

Some of what I said:

  • “Most of us have been doing the same damn thing we’ve always done [in learning evaluation]. On the other hand, there is a breaking of the logjam.”
  • “A lot of us are defaulting to happy sheets, and happy sheets that aren’t effective.”
  • “Do we in L&D have the skills to be able to do evaluation in the first place?…. My short answer is NO WAY!”
  • “We can’t upskill ourselves fast enough [in terms of learning evaluation].”

It was a fun interview, and LearningNews did a nice job editing it. Special thanks to Rob Clarke (and his great team) for the interview, the organizing, and the video work!

Click here to see the interview.

I want to thank David Kelly and the eLearning Guild for awarding me the prestigious title of Guild Master.

The Guild Masters are an amazing list of folks, including lots of research-to-practice legends like Ruth Clark, Julie Dirksen, Clark Quinn, Jane Bozarth, Karl Kapp, and others who utilize research-based recommendations in their work.

Delighted to be included!