21st December 2022

Neon Elephant Award Announcement

Dr. Will Thalheimer, Principal at TiER1 Performance, Founder of Work-Learning Research, announces the winner of the 2022 Neon Elephant Award, given this year to Donald Clark, for writing the book Learning Experience Design: How to Create Effective Learning that Works, and for his collation of the Great Minds on Learning series (both in the podcast series with John Helmer and in his tireless work researching and curating critical ideas and thinkers on his Plan B blog).

Click here to learn more about the Neon Elephant Award…

 

2022 Award Winner – Donald Clark

Donald Clark is a successful entrepreneur, professor, researcher, author, blogger, and speaker. He is an internationally-renowned thinker in the field of learning technology, having worked in EdTech for over 30 years; having been a leader in many successful learning-technology businesses (both as an executive and board member); and having written extensively on a wide range of topics related to learning and development—in books, articles, and his legendary blog. Relatively early in his career, Donald earned success as one of the original founders of Epic Group plc, a leading online learning company in the UK, an enterprise subsequently floated on the stock market in 1996 and sold in 2005. Since then, as Donald has described it, he has felt “free from the tyranny of employment,” using this privilege to the advantage of the learning field. Donald has become an advocate for research-based practices and intelligent uses of learning technologies. He has also founded, run, and supported learning-technology enterprises, which has further helped spread good learning practices.

Donald Clark’s most recent book—Learning Experience Design: How to Create Effective Learning that Works—stands above and apart from most writing on Learning Experience Design. It is fully and thoroughly inspired by the scientific research on learning and by real-world experience in using and developing learning technologies. Chapter after chapter, it shares an inspiring introduction, a robust review of best practices, and a concise set of practical recommendations. Anyone practicing learning experience design should buy this book today—study it and apply its recommendations. Your learners and organizations will thank you. You will build wildly more effective learning!

Donald Clark’s compilations of our field’s best thinkers and ideas are legendary—or should be! Over many decades, he has tirelessly curated an almost endless treasure trove of golden nuggets on many of the most important ideas in the learning field. This past year, he has brought these to a larger audience through his excellent collaboration with John Helmer, known most famously for his Learning Hack podcast. Donald Clark’s and John Helmer’s Great Minds on Learning webcast and podcast collaboration is fantastic. Donald’s written reviews provide us with an overview of the deep history of the learning field. Here is a blog post that lists many of Donald’s reviews of the great thinkers in our field.

Notable contributions from Donald Clark:


With Gratitude

In his decades of work, Donald Clark has been a tireless advocate for improvements and innovation in the field of learning-and-development and learning technology. He often refers to his work as “provocative,” and he deserves admiration for (1) urging the learning field to embrace scientifically-informed practices, (2) urging us to be more forward-looking in our embrace of learning technologies (particularly AI), and (3) being one of our field’s preeminent historians—reminding us of the rich and valuable work of researchers, writers, and practitioners from past centuries to today. It is an honor to recognize Donald as this year’s winner of the Neon Elephant Award.

Click here to learn more about the Neon Elephant Award…


21st December 2021

Neon Elephant Award Announcement

Dr. Will Thalheimer, Principal at TiER1 Performance, Founder of Work-Learning Research, announces the winners of the 2021 Neon Elephant Award, given this year to two people: Clark Quinn, for writing the book Learning Science for Instructional Designers, and Patti Shank, for writing the book Write Better Multiple-Choice Questions to Assess Learning—and both for their many years translating learning research into practical recommendations.

Click here to learn more about the Neon Elephant Award…

2021 Award Winners – Clark Quinn and Patti Shank

Clark Quinn, PhD, is an internationally-recognized consultant and thought leader in learning science, learning technology, and organizational learning. Clark holds a doctorate in Cognitive Psychology from the University of California at San Diego. Since 2001, Clark has been consulting, researching, writing, and speaking through his consulting practice, Quinnovation. Clark has been at the forefront of some of the most important trends in workplace learning, including his early advocacy for mobile learning, his work with the Internet Time Group advocating for a greater emphasis on workplace learning, and his many efforts to bring research-based wisdom to elearning design. With the publication of his new book, Clark again shows leadership—now in the cause of giving instructional designers a clear and highly readable guide to the learning sciences.

Clark is the author of numerous books. The following are representative:

 

Patti Shank, PhD

Patti Shank, PhD, is an internationally-recognized learning analyst, writer, and translational researcher in the learning, performance, and talent space. Dr. Shank holds a doctorate in Educational Leadership and Innovation, Instructional Technology from the University of Colorado, Denver, and a master’s degree in Education and Human Development from George Washington University. Since 1996, Patti has been consulting, researching, and writing through her consulting practice, Learning Peaks LLC (pattishank.com). As the best research-to-practice professionals tend to do, Patti has extensive experience as a practitioner, including roles such as training specialist, training supervisor, and manager of training and education. Patti has also played a critical role collaborating with the workplace learning field’s most prominent trade associations—working, sometimes quixotically, to encourage the adoption of research-based wisdom for learning.

Patti is the author of numerous books, focusing not only on evidence-based practices, but also on online learning, elearning, and learning assessment. The following are her most recent books:


With Gratitude

In their decades of work, both Patti Shank and Clark Quinn have lived careers of heroic effort, perseverance, and passion. Their love for the learning-and-development field is deep and true. They don’t settle for half the truth, and they don’t settle for half measures. Rather, they show their mettle even when they get pushback, even when times are tough, even when easier paths might call. It is an honor to recognize Patti and Clark as this year’s winners of the Neon Elephant Award.

 

Click here to learn more about the Neon Elephant Award…


21st December 2020

Neon Elephant Award Announcement

Dr. Will Thalheimer, President of Work-Learning Research, Inc., announces the winner of the 2020 Neon Elephant Award, given to Mirjam Neelen and Paul Kirschner for writing the book Evidence-Informed Learning Design: Use Evidence to Create Training Which Improves Performance, and for their many years publishing their blog 3-Star Learning Experiences.

Click here to learn more about the Neon Elephant Award…

2020 Award Winners – Mirjam Neelen and Paul Kirschner

Mirjam Neelen is one of the world’s most accomplished research-to-practice practitioners in the workplace learning field. On the practical side, Mirjam has played many roles. As of this writing, she is the Head of Global Learning Design and Learning Sciences at Novartis. She has been a Learning Experience Design Lead at Accenture and at the Learnovate Centre in Dublin, an Instructional Designer at Google, and an Instructional Design Lead at Houghton Mifflin Harcourt. Mirjam utilizes evidence-informed wisdom in her work and also partners with Paul A. Kirschner on the 3-Star Learning Experiences blog to bring research- and evidence-informed insights to the workplace learning field. Mirjam is a member of the Executive Advisory Board of The Learning Development Accelerator.

Paul A. Kirschner is Professor Emeritus at the Open University of the Netherlands and owner of kirschner-ED, an educational consulting practice. Paul is an internationally recognized expert in learning and educational research, with many classic studies to his name. He has served as President of the International Society for the Learning Sciences and is an AERA (American Educational Research Association) Research Fellow (the first European to receive this honor). He has published several very successful books, including Ten Steps to Complex Learning, Urban Myths about Learning and Education, and More Urban Myths about Learning and Education; this year, he published How Learning Happens: Seminal Works in Educational Psychology and What They Mean in Practice with Carl Hendrick — as well as the book he and Mirjam are honored for here. Kirschner previously won the Neon Elephant Award in 2016 for the book Urban Myths about Learning and Education, written with Pedro De Bruyckere and Casper D. Hulshof. Also, Paul’s co-author on the Ten Steps book, Jeroen van Merriënboer, won the Neon Elephant Award in 2011.

Relevant Websites

Mirjam’s and Paul’s book, Evidence-Informed Learning Design was published only ten months ago, but has already swept the world as a book critical to learning architects and learning executives in their efforts to build the most effective learning designs. In my book review earlier this year I wrote, “Mirjam Neelen and Paul Kirschner have written a truly beautiful book—one that everyone in the workplace learning field should read, study, and keep close at hand. It’s a book of transformational value because it teaches us how to think about our jobs as practitioners in utilizing research-informed ideas to build maximally effective learning architectures.”

Mirjam Neelen and Paul Kirschner are the kind of research translators we should honor and emulate in the workplace learning field. They are unafraid in seeking the truth, passionate in sharing research- and evidence-informed wisdom, dogged in compiling research from scientific journals, and thoughtful in making research ideas accessible to practitioners in our field. It is an honor to recognize Mirjam and Paul as this year’s winners of the Neon Elephant Award.

 

Click here to learn more about the Neon Elephant Award…

The LEARNNOVATORS team (specifically Santhosh Kumar) asked if I would join them in their Crystal Balling with Learnnovators interview series, and I accepted! They have some really great people on the series; I recommend that you check it out!

The most impressive thing was that they must have studied my whole career history and read my publication list and watched my videos because they came up with a whole set of very pertinent and important questions. I was BLOWN AWAY—completely IMPRESSED! And, given their dedication, I spent a ton of time preparing and answering their questions.

It’s a two-part series, and here are the links:

Here are some of the quotes they pulled out and/or I’d like to highlight:

Learning is one of the most wondrous, complex, and important areas of human functioning.

The explosion of different learning technologies beyond authoring tools and LMSs is likely to create a wave of innovations in learning.

Data can be good, but also very very bad.

Learning Analytics is poised to cause problems as well. People are measuring all the wrong things. They are measuring what is easy to measure in learning, but not what is important.

We will be bamboozled by vendors who say they are using AI, but are not, or who are using just 1% AI and claiming that their product is AI-based.

Our senior managers don’t understand learning, they think it is easy, so they don’t support L&D like they should.

Because our L&D leaders live in a world where they are not understood, they do stupid stuff like pretending to align learning with business terminology and business-school vibes—forgetting to align first with learning.

We lie to our senior leaders when we show them our learning data—our smile sheets and our attendance data. We then manage toward these superstitious targets, causing a gross loss of effectiveness.

Learning is hard and learning that is focused on work is even harder because our learners have other priorities—so we shouldn’t beat ourselves up too much.

We know from the science of human cognition that when people encounter visual stimuli, their eyes move rapidly from one object to another and back again trying to comprehend what they see. I call this the “eye-path phenomenon.” So, because of this inherent human tendency, we as presenters—as learning designers too!—have to design our presentation slides to align with these eye-path movements.

Organizations now—and even more so in the near future—will use many tools in a Learning-Technology Stack. These will include (1) platforms that offer asynchronous cloud-based learning environments that enable and encourage better learning designs, (2) tools that enable realistic practice in decision-making, (3) tools that reinforce and remind learners, (4) spaced-learning tools, (5) habit-support tools, (6) insight-learning tools (those that enable creative ideation and innovation), et cetera

Learnnovators asked me what I hoped for the learning and development field. Here’s what I said:

Nobody is good at predicting the future, so I will share the vision I hope for. I hope we in learning and development continue to be passionate about helping other people learn and perform at their best. I hope we recognize that we have a responsibility not just to our organizations, but beyond business results to our learners, their coworkers/families/friends, to the community, society, and the environment. I hope we become brilliantly professionalized, having rigorous standards, a well-researched body of knowledge, higher salaries, and career paths beyond L&D. I hope we measure better, using our results to improve what we do. I hope we, more and more, take a small-s scientific approach to our practices, doing more A-B testing, compiling a database of meaningful results, building virtuous cycles of continuous improvement. I hope we develop better tools to make building better learning—and better performance—easier and more effective. And I hope we continue to feel good about our contributions to learning. Learning is at the heart of our humanity!

I’m thrilled and delighted to share the news that Jane Bozarth, research-to-practice advocate, author of Show Your Work, and Director of Research for the eLearning Guild, is pledging $1,000 to the Learning Styles Challenge!!


Jane has been a vigorous debunker of the Learning-Styles Myth for many, many years! For those of you who don’t know, the Learning-Styles Notion is the idea that different people have different styles of learning and that by designing our learning programs to meet each style—that is, to actually provide different learning content or activities to different learners—learning will be improved. Sounds great, but unfortunately, dozens and dozens of research studies and many major research reviews have found the Learning-Styles Notion to be untrue!

 

“Decades of research suggest that learning styles, or the belief that people learn better when they receive instruction in their dominant way of learning, may be one of the most pervasive myths about cognition.”

Nancekivell, S. E., Shah, P., & Gelman, S. A. (2020).
Maybe they’re born with it, or maybe it’s experience:
Toward a deeper understanding of the learning style myth.
Journal of Educational Psychology, 112(2), 221–235.


“Several reviews that span decades have evaluated the literature on learning styles (e.g., Arter & Jenkins, 1979; Kampwirth & Bates, 1980; Kavale & Forness, 1987; Kavale, Hirshoren, & Forness, 1998; Pashler et al., 2009; Snider, 1992; Stahl, 1999; Tarver & Dawson, 1978), and each has drawn the conclusion that there is no viable evidence to support the theory.”

Willingham, D. T., Hughes, E. M., & Dobolyi, D. G. (2015).
The scientific status of learning styles theories.
Teaching of Psychology, 42(3), 266–271.

 

With Jane’s contribution, the Learning Styles Challenge is up to $6,000! That is, if someone can demonstrate a beneficial effect from using learning styles to design learning, the underwriters will pay that person or group $6,000.

The Learning Styles Challenge began on August 4th 2006 when I offered $1,000 for the first challenge. In 2014, it expanded to $5,000 when additional pledges were made by Guy Wallace, Sivasailam “Thiagi” Thiagarajan, Bob Carleton, and Bob’s company, Vector Group.

Thank you to Jane Bozarth for her generous contribution to the cause! And check out her excellent research review of the learning-styles literature. Jane’s report is filled with tons of research, but also many very practical recommendations for learning professionals.


12th December 2019

Neon Elephant Award Announcement

Dr. Will Thalheimer, President of Work-Learning Research, Inc., announces the winner of the 2019 Neon Elephant Award, given to David Epstein for writing the book Range: Why Generalists Triumph in a Specialized World, and for his many years as a journalist and science-inspired truth teller.

Click here to learn more about the Neon Elephant Award…

 

2019 Award Winner – David Epstein

David Epstein is an award-winning writer and journalist, having won awards for his writing from such esteemed bodies as the National Academies of Sciences, Engineering, and Medicine, the Society of Professional Journalists, and the National Center on Disability and Journalism—and having been included in the Best American Science and Nature Writing anthology. David has been a science writer for ProPublica and a senior writer at Sports Illustrated, where he helped break the story on baseball legend Alex Rodriguez’s steroid use. David speaks internationally on performance science and the uses (and misuses) of data, and his TED talk on human athletic performance has been viewed over eight million times.

Mr. Epstein is the author of two books:

David is honored this year for his new book on human learning and development, Range: Why Generalists Triumph in a Specialized World. The book lays out a very strong case for why most people will become better performers if they focus broadly on their development rather than focusing tenaciously and exclusively on one domain. If we want to raise our children to be great soccer players (aka “football” in most places), we’d be better off having them play multiple sports rather than just soccer. If we want to develop the most innovative cancer researchers, we shouldn’t just train them in cancer-related biology and medicine, we should give them a wealth of information and experiences from a wide range of fields.

Range is a phenomenal piece of art and science. Epstein is truly brilliant in compiling and comprehending the science he reviews, while at the same time telling stories and organizing the book in ways that engage and make complex concepts understandable. In writing the book, David debunks the common wisdom that performance is improved most rapidly and effectively by focusing practice and learning on a single narrow domain. Where others have only hinted at the power of a broad developmental pathway, Epstein’s Range builds up a towering landmark of evidence that will remain visible on the horizon of the learning field for decades if not millennia.

We in the workplace learning-and-development field should immerse ourselves in Range—not just in thinking about how to design learning and architect learning contexts, but also in thinking about how to evaluate prospects for recruitment and hiring. It’s likely that we currently undervalue people with broad backgrounds and artificially overvalue people with extreme and narrow talents.

Here is a nice article where Epstein wrestles with a question that elucidates an issue we have in our field—what happens when many people in a field are not following research-based guidelines. The article is set in the medical profession, but there are definite parallels to what we face everyday in the learning field.

Epstein is the kind of person we should honor and emulate in the workplace learning field. He is unafraid in seeking the truth, relentless and seemingly inexhaustible in his research efforts, and clear and engaging as a conveyor of information. It is an honor to recognize him as this year’s winner of the Neon Elephant Award.

 

Click here to learn more about the Neon Elephant Award…

Will’s Note: ONE DAY after publishing this first draft, I’ve decided that I mucked this up, mashing up what researchers, research translators, and learning professionals should focus on. Within the next week, I will update this to a second draft. You can still read the original below (for now):

 

Some evidence is better than other evidence. We naturally trust ten well-designed research studies more than one. We trust a well-controlled scientific study more than a poorly controlled one. We trust scientific research more than opinion research, unless all we care about is people’s opinions.

Scientific journal editors have to decide which research articles to accept for publication and which to reject. Practitioners have to decide which research to trust and which to ignore. Politicians have to know which lies to tell and which to withhold (kidding, sort of).

To help themselves make decisions, journal editors routinely rank each article on a continuum from strong research methodology to weak. The medical field regularly uses a level-of-evidence approach to making medical recommendations.

There are many taxonomies for “levels of evidence” or “hierarchy of evidence” as it is commonly called. Wikipedia offers a nice review of the hierarchy-of-evidence concept, including some important criticisms.

Hierarchy of Evidence for Learning Practitioners

The suggested models for levels of evidence were created by and for researchers, so they are not directly applicable to learning professionals. Still, it’s helpful for us to have our own hierarchy of evidence, one that we might actually be able to use. For that reason, I’ve created one, adding in the importance of practical evidence that is missing from the research-focused taxonomies. As in the research versions, Level 1 is the strongest.

  • Level 1 — Evidence from systematic research reviews and/or meta-analyses of all relevant randomized controlled trials (RCTs) that have ALSO been utilized by practitioners and found both beneficial and practical from a cost-time-effort perspective.
  • Level 2 — Same evidence as Level 1, but NOT systematically or sufficiently utilized by practitioners to confirm benefits and practicality.
  • Level 3 — Consistent evidence from a number of RCTs using different contexts and situations and learners; and conducted by different researchers.
  • Level 4 — Evidence from one or more RCTs that utilize the same research context.
  • Level 5 — Evidence from one or more well-designed controlled trials without randomization of learners to different learning factors.
  • Level 6 — Evidence from well-designed cohort or case-control studies.
  • Level 7 — Evidence from descriptive and/or qualitative studies.
  • Level 8 — Evidence from research-to-practice experts.
  • Level 9 — Evidence from the opinion of other authorities, expert committees, etc.
  • Level 10 — Evidence from the opinion of practitioners surveyed, interviewed, focus-grouped, etc.
  • Level 11 — Evidence from the opinion of learners surveyed, interviewed, focus-grouped, etc.
  • Level 12 — Evidence curated from the internet.

Let me consider this Version 1 until I get feedback from you and others!

Critical Considerations

  1. Some evidence is better than other evidence.
  2. If you’re not an expert in evaluating evidence, get insights from those who are; particularly valuable are research-to-practice experts (those who have considerable experience in translating research into practical recommendations).
  3. Opinion research in the learning field is especially problematic because the learning field comprises both strong and poor conceptions of what works.
  4. Learner opinions are problematic as well because learners often have poor intuitions about what works for them in supporting their learning.
  5. Curating information from the internet is especially problematic because it’s difficult to distinguish between good and poor sources.

Trusted Research to Practice Experts

(in no particular order, they’re all great!)

  • (Me) Will Thalheimer
  • Patti Shank
  • Julie Dirksen
  • Clark Quinn
  • Mirjam Neelen
  • Ruth Clark
  • Donald Clark
  • Karl Kapp
  • Jane Bozarth
  • Ulrich Boser

The 70-20-10 Framework has been all the rage for the last five or ten years in the workplace learning field. Indeed, I organized a great debate about 70-20-10 through The Debunker Club (you can see the tweet stream here). I have gone on record saying that the numbers don’t have a sound research backing, but that the concept is a good one—particularly the idea that we as learning professionals ought to leverage on-the-job learning where we can.

What is 70-20-10?

The 70-20-10 framework is built on the belief that 10% of workplace learning is, or should be, propelled by formal training; that 20% is, or should be, enabled by learning directly from others; and that 70% comes, or should come, from employees’ learning through workplace experiences.

Supported by Research?

Given all the energy around 70-20-10, you might think that lots of rigorous scientific research has been done on the framework. Well, you would be wrong!

In fact, up until today (April 19, 2019), only one study has been published in a scientific journal (my search of PsycINFO reveals only one study). In this post, I will review that one study, published last year:

Johnson, S. J., Blackman, D. A., & Buick, F. (2018). The 70:20:10 framework and the transfer of learning. Human Resource Development Quarterly. Advance online publication.

Caveats

All research has strengths, weaknesses, and limitations—and it’s helpful to acknowledge these so we can think clearly. First, one study cannot be definitive, and this is just one study. Also, this study is qualitative and relies on subjective inputs to draw its conclusions. Ideally, we’d like to have more objective measures utilized. The study also gathers data from a small sample of public sector workers, whereas ideally we would want a wider, more diverse range of participants.

Methodology

The researchers found a group of organizations that had been bombarded with messages and training to encourage the use of the 70-20-10 model. Specifically, the APSC (Australian Public Service Commission), starting in 2011, encouraged the Australian public sector to embrace 70-20-10.

The specific study “draws from the experiences of two groups of Australian public sector managers: senior managers responsible for implementing the 70:20:10 framework within their organization; and middle managers who have undergone management capability development aligned to the 70:20:10 framework. All managers were drawn from the Commonwealth, Victorian, Queensland, and Northern Territory governments.”

A qualitative approach was chosen, according to the researchers, “given the atheoretical nature of the 70:20:10 framework and the lack of theory or evidence to provide a research framework.”

The qualitative approaches used by the researchers were individual structured interviews and group structured interviews.

The researchers chose people to interview based on their experience using the 70-20-10 framework to develop middle managers. “A purposive sampling technique was adopted, selecting participants who had specific knowledge of, and experience with, middle management capability development in line with the 70:20:10 framework.”

The researchers used a qualitative data analysis program (NVivo) to help them organize and make sense of the qualitative data (the words collected in the interviews). According to Wikipedia, “NVivo is intended to help users organize and analyze non-numerical or unstructured data. The software allows users to classify, sort and arrange information; examine relationships in the data; and combine analysis with linking, shaping, searching and modeling.”

Overall Results

The authors conclude the following:

“In terms of implications for practice, the 70:20:10 framework has the potential to better guide the achievement of capability development through improved learning transfer in the public sector. However, this will only occur if future implementation guidelines focus on both the types of learning required and how to integrate them in a meaningful way. Actively addressing the impact that senior managers and peers have in how learning is integrated into the workplace through both social modeling and organizational support… will also need to become a core part of any effective implementation.”

“Using a large qualitative data set that enabled the exploration of participant perspectives and experiences of using the 70:20:10 framework in situ, we found that, despite many Australian public sector organizations implementing the framework, to date it is failing to deliver desired learning transfer results. This failure can be attributed to four misconceptions in the framework’s implementation: (a) an overconfident assumption that unstructured experiential learning will automatically result in capability development; (b) a narrow interpretation of social learning and a failure to recognize the role social learning has in integrating experiential, social and formal learning; (c) the expectation that managerial behavior would automatically change following formal training and development activities without the need to actively support the process; and (d) a lack of recognition of the requirement of a planned and integrated relationship between the elements of the 70:20:10 framework.”

Specific Difficulties

With Experiential Learning

“Senior managers indicated that one reason for adopting the 70:20:10 framework was that the dominant element of 70% development achieved through experiential learning reflected their expectation that employees should learn on the job. However, when talking to the middle managers themselves, it was not clear how such learning was being supported. Participants suggested that one problem was a leadership perception across senior managers that middle managers could automatically transition into middle management roles without a great deal of support or development.”

“The most common concern, however, was that experiential learning efficacy was challenged because managers were acquiring inappropriate behaviors on the job based on what they saw around them every day.”

“We found that experiential learning, as it is currently being implemented, is predominantly unstructured and unmanaged, that is, systems are not put in place in the work environment to support learning. It was anticipated that managers would learn on the job, without adequate preparation, additional support, or resourcing to facilitate effective learning.”

With Social Learning

“Overall, participants welcomed the potential of social learning, which could help them make sense of their context, enabling both sense making of new knowledge acquired and reinforcing what was appropriate both in, and for, their organization. However, they made it clear that, despite apparent organizational awareness of the value of social learning, it was predominantly dependent upon the preferences and working styles of individual managers, rather than being supported systematically through organizationally designed learning programs. Consequently, it was apparent that social learning was not being utilized in the way intended in the 70:20:10 framework in that it was not usually integrated with formal or experiential learning.”

Mentoring

“Mentoring was consistently highlighted by middle and senior managers as being important for both supporting a middle manager’s current job and for building future capacity.”

“Despite mentoring being consistently raised as the most favored form of development, it was not always formally supported by the organization, meaning that, in many instances, mentoring was lacking for middle managers.”

“A lack of systemic approaches to mentoring meant it was fragile and often temporary.”

Peer Support

“Peer support and networking encouraged middle managers to adopt a broader perspective and engage in a community of practice to develop ideas regarding implementing new skills.”

“However, despite managers agreeing that networks and peer support would assist them to build capability and transfer learning to the workplace, there appeared to be few organizationally supported peer learning opportunities. It was largely up to individuals to actively seek out and join their own networks.”

With Formal Learning

“Formal learning programs were recognized by middle and senior managers as important forms of capability development. Attendance was often encouraged for new middle managers.”

“However, not all experiences with formal training programs were positive, with both senior and middle managers reflecting on their ineffectiveness.”

“For the most part, participants reported finishing formal development programs with little to no follow up.”

“There was a lack of both social and experiential support for embedding this learning. The lack of social learning support partly revolved around the high workloads of managers and the lack of time devoted to development activities.”

“The lack of experiential support and senior management feedback meant that many middle managers did not have the opportunity to practice and further develop their new skills, despite their initial enthusiasm.”

“A key issue with this was the lack of direct and clear guidance provided by their line managers.”

“A further issue with formal learning was that it was often designed generically for groups of participants…  The need for specificity also related to the lack of explicit, individualized feedback provided by their line manager to reinforce and embed learning.”

What Should We Make of This Preliminary Research?

Again, with only one study—and a qualitative one conducted on a narrow type of participant—we should be very careful in drawing conclusions.

Still, the study can help us develop hypotheses for further testing—both by researchers and by us as learning professionals.

We also ought to be careful in casting doubt on the 70-20-10 framework itself. Indeed, the research seems to suggest that the framework was not always implemented as intended. On the other hand, when a model tends to be implemented poorly in routine use, we should become skeptical that it will produce reliable benefits.

Here is a list of reflections the research generated for me:

  1. Why so much excitement for 70-20-10 with so little research backing?
  2. Formal training was found to have all the problems normally associated with it, especially the lack of follow-through and after-training support—so we still need to work to improve it!
  3. Who will provide continuous support for experiential and social learning? In the research case, the responsibility for implementing on-the-job learning experiences was not clear, and so the implementation was not done or was poorly done.
  4. What does it take in terms of resources, responsibility, and tasking to make experiential and social learning useful? Or, is this just a bridge too far?
  5. The most likely leverage point for on-the-job learning still seems, to me, to be managers. If this is a correct assumption—and really it should be tested—how can we in Learning & Development encourage, support, and resource managers for this role?

Sign Up For Will’s News by Clicking Here


Happened to notice these two statements printed in vendor literature at a recent conference. I’ve obscured their names just enough so I’m not obviously picking on them, but they will know who they are.

Statement #1: from vendor named “C*g*i*o”

  • “We all know that up to 80% of what learners are taught in training will be lost in 30 days if there is no practice or reinforcement.”

Statement #2: from vendor named “A*ea*”

  • “We have known for more than 150 years that humans forget up to 70% of what they learn within 24 hours!”

These statements are false and misleading. To get a more accurate view of human forgetting, check out this well-researched document.

The Sad Reality of Faux or Misleading Research Citations in Vendor Literature

Buyer beware! Vendors are now utilizing confirmatory-bias methodologies to sprinkle their verbal and visual communications with research-sounding sound bites. Because we are human, this persuasion technique is likely to snare us.

We may even buy a product or service that doesn’t work.

My recommendation: Spend $500 on a research-to-practice expert to save yourself tens or hundreds of thousands of dollars, euros, pounds, etc.