For the Learning 2005 conference, starting today in Orlando within the movie-set sterility of the Disney empire, Elliott Masie of the Masie Center has promised a new type of training-industry conference. No PowerPoints. No exhibit halls. No more conference sessions dominated by vendors and consultants. Instead, Elliott has promised to experiment with the medium, creating a community-driven discussion where conference-goers can learn from each other.

Will this effort at innovation succeed or fall flat?

Certainly, most conferences in the training and e-learning space are a mixed bag. Some excellent sessions. Some engaging. Some spewing misinformation. Some spewing platitudes. But the real problem with most conferences is that there is no way to validate the information learned—and since the information is largely vendor-driven, it is a bit suspect in the first place.

Can a "1500-heads-are-better-than-one" format work? We shall see. I have my doubts. How will we know whether an idea put forward is a good one or not? How will we know whether the person with the idea is a genius or a nut?

In preparing to come to the conference, I have been impressed with the meeting and interactive technologies in place, though I admit to not fully understanding them. Learning 2005 has a "learning wiki" enabling conference participants to connect with each other, comment on the sessions, etc. As of today, however, not much communication has occurred among the participants. This should increase once the sessions plant the seeds for discussion, but not much prework has been done.

In tonight’s keynote, Elliott told some jokes, talked to a humorous computer-generated talking head (very impressive if really computer-generated, but I’m betting on a human comedian behind the funny banter), interviewed a 19-year-old intern to show us how different this generation is (one data point, isn’t it?), gave money and awards to a couple of non-profits, gave an award to CNN for its innovative learning design (which seemed to be for one-on-one coaching, but couldn’t really be, could it?), and talked with the Chairman and Founder of Boston Scientific, John Abele, who offered the best learning tidbits of the evening.

Abele started by answering Elliott’s query about why CEOs worry about learning by saying that it is very simple from a business perspective: "If we can’t do learning better and faster, someone else will." Twice Abele mentioned the importance of give and take to get to wisdom. He talked about a live-demonstration course Boston Scientific developed to change the way the marketplace (doctors and medical institutions) viewed the company’s new medical methods. The amazing thing about the course was that they invited the world’s best surgeons to witness real surgeries and vote on what to do next using a vote-response system. One example he cited was asking the doctors whether the medical device ought to be inserted half an inch farther into the heart. The discussions and arguments that went on were a great learning vehicle. To sharpen these sessions, commentators were chosen for their contrarian views.

Abele also gave the audience advice on choosing a doctor if we’re going in for surgery. Ask the doctor what research papers show the benefits of the surgery, and what research papers show that the procedure doesn’t work that well. Again, the benefits of experts fighting it out.

Finally, Abele talked about how doctors these days are beginning to do simulated surgeries on real people. If you have an abdominal aneurysm, your doctor may "take pictures" of the aneurysm and then basically perform your surgery on your images before he or she does surgery on you. What a great way of aligning the learning and performance contexts, a research topic I’ve written about many times.

Elliott’s main theme in his remarks was that the world is faster and more confusing than ever. Learning design must move from 18 weeks to 18 hours. That’s almost an exact quote, by the way.

The keynote was two hours long, but felt longer. Still, it was much better than listening to some celebrity deliver a canned speech, with no learning content to speak of.

I give Elliott lots of credit for this experiment. Whether it’s a noble effort, an ingenious publicity stunt, or both, I’m looking forward to the sunrise when we get down to the audience-generated learning.

It’s time for a change.

The Work-Learning Research Newsletter is becoming the Work-Learning Journal. In the short term, the content will consist of the same kind of pithy research-based commentary I’ve always aimed to deliver. In the long term, you’ll probably see some additions as well, but those plans will have to remain a secret for now.

The big change for the moment is the delivery mechanism. I’ll still send you an email reminder, but the content will now be contained in a blog-like format. This has several benefits:

  1. You can post your comments, so we can learn from each other.
  2. The content will remain where you can access it—online all the time.
  3. Others can link to it and share our intelligence far and wide.
  4. You can subscribe using your aggregator. If you don’t know what one is, you will within the next year or so.
  5. We can all learn about Web 2.0 and what it means for learning.

In addition to publishing a monthly newsletter, I’m going to write a blog as well. You can check out the beginnings of that effort at

The Newsletter is dead. Long live the Work-Learning Journal.

By the way, if you’re not yet on one or more of our email-reminder mailing lists, you can subscribe at this location.

I’ve been avoiding blogging.

Why? Mostly because I thought blogs were evil—just another contributor to "bad information gone wide." Despite my worries about an expanding universe of vacuous claptrap, I’ve decided to take my own advice and view blogging as just another tool—with strengths and weaknesses.

Why am I starting to blog now?

  1. Kathleen Gilroy, pixie mensch and esteemed leader of the Otter Group, talked me into it when she explained the Web 2.0 idea to me.
  2. I needed a way to convey information quickly and informally.
  3. I needed a way to get the Work-Learning Research newsletter up online permanently.
  4. I wanted to learn firsthand about the potential of this new technology for learning.
  5. I wanted a way to connect with clients, thought-leaders, colleagues, friends.
  6. I wanted a way to learn from others.
  7. I wanted to get started on idea projects that were on the back burner, pushing them toward completion, getting something out there in the event of my early demise.
  8. I wanted to get younger, hipper, and better looking.

I’d like to offer a special thanks to Kathleen and also to someone she introduced me to—Bill Ives, author of Business Blogs: A Practical Guide. Bill’s wisdom has helped me think through my blogging strategy as he has eased me up the learning curve.

This blog, Will at Work Learning, will throw out lots of ideas about research-based learning design and the learning-and-performance industry.

The Work-Learning Journal will offer longer pieces on more in-depth topics. It will also include thought-provoking pieces from other researchers and thinkers in our field.

Please note the name of this blog, Will at Work Learning. Not only is this convenient because it parallels the Work-Learning Research company name, but it also expresses my dearest hope—that this blog will engender my own learning. Please contribute with your comments.

One of our primary goals at Work-Learning Research is to help learning-and-performance professionals improve their learning results. As part of this commitment to practitioners, we like to highlight the work of others who provide valid information with a high-integrity mindset.

One of the best examples of this high-value, high-integrity approach is TMR (Training Media Review), led by Bill Ellet, accompanied by his vast network of insightful professionals. TMR provides third-party reviews of training content, including reviews of e-learning courses, computer-based training, videos, books, training manuals, and more.

TMR is the exclusive provider of media reviews to t+d magazine, published by the American Society for Training and Development.

There are two things that I really like about TMR. The first is that the reviewers are all extremely experienced and well chosen to have the right background to make informed reviews. For example, Patti Shank—one of the e-learning industry’s best thinkers—has recently completed a review of authoring-tool products. The second thing I really like about TMR is its mission to provide trusted information. This focus comes from Bill Ellet’s personal insistence that reviews be unbiased and that the information be valid and informative. TMR observes a strict no-conflict-of-interest policy. TMR receives no commissions when it reviews products. Vendors are never charged a fee for getting their products reviewed. TMR reviewers are not allowed to serve as consultants to companies whose products they review.

Those of you experienced in the industry know that many marketplace reviews and awards programs require applicants to pay a fee, biasing the results. What’s worse is that awards always seem to go to programs with sizzle and no substance. Just because a program gets a gold medal in some well-known industry award contest doesn’t mean it is any good.

TMR separates itself from these pretenders by keeping strict standards and using great reviewers. By being a beacon of value, TMR shines a light for organizations seeking off-the-shelf training media and related products.

In addition to its reviews, TMR also offers discounts to subscribing members on all sorts of great stuff, including books, environmentally-friendly training products, software, and more.

Contact Information for Training Media Review is available at their website.

Among the many changes on the horizon at Work-Learning Research, one of the most exciting for me is our new emphasis on learning audits. As the field moves toward "evidence-based practices" and "evaluation-tested learning," more and more decision-makers are incorporating outside evaluations into their instructional-design repertoires.

Before we unveil our learning-audit offerings in their new finery, we’d like to offer readers of this journal a 25% discount on all audits that are started and completed within this calendar year. This offer only lasts while we have capacity to perform the audits—as of today, we can schedule about 10 more audits through the end of the year.

Whether for your organization, or your clients’ organizations, a learning audit might be just the thing to energize your instructional-design efforts, your e-learning efforts, your strategic organizational-learning initiatives.

Learning Audits aren’t cheap, but the information they produce can be priceless.

If you want to look more closely at our old verbiage on audits, check out this link. Better yet, contact me directly to get started or just ask questions.

Jonathon Levy, currently Senior Learning Strategist at the Monitor Group, tells a story from his days as Vice President at Harvard Business School Publishing. The story is funny and sad at the same time, but it’s very instructive on several fronts.

Levy’s client decided that he would award end-of-year Christmas bonuses based on how successful his employees were in completing the Harvard online courses. Levy advised against it, but the client did it anyway.

The results were predictable, but they might never have been noticed had Jonathon’s Harvard team not built into all their courses a tracking system to give themselves feedback on how learners used the courses. The tracking system showed that learners didn’t read a thing; they just scanned the course and clicked where they were required to click. They just wanted to get credit so they could maximize their bonuses.

Although very little learning took place, everyone was happy.

  • Learners were happy because they got their bonuses.
  • The client (training manager) was happy because he could show a remarkable completion rate.
  • Line managers were happy, because the learners wasted very little time in training.
  • Senior management was happy because they could demonstrate a higher rate of utilization of online learning.

What can we learn from this?

  1. Be careful what you reward, because you might get exactly the behavior you reinforced (course completion, in this case).
  2. Completion rates are a poor measure of training effectiveness. Same as butts in seats.
  3. We need authentic measures of training effectiveness to prevent such silliness.
  4. Instructional-design activities benefit when good tracking and feedback mechanisms are built into our designs.
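The tracking idea in point 4 can be sketched in a few lines. This is a hypothetical illustration, not Harvard’s actual system: it flags learners whose average gap between page views is too short for anyone to have actually read the material. The function name, event format, and 30-second threshold are all my assumptions.

```python
from datetime import datetime, timedelta

def flag_click_throughs(page_events, min_seconds_per_page=30):
    """Hypothetical sketch: given (learner_id, timestamp) page-view events
    (in time order for each learner), return the learners whose average
    time between page views suggests skimming rather than reading."""
    by_learner = {}
    for learner_id, ts in page_events:
        by_learner.setdefault(learner_id, []).append(ts)

    flagged = []
    for learner_id, stamps in by_learner.items():
        if len(stamps) < 2:
            continue  # one page view tells us nothing about pacing
        gaps = [(b - a).total_seconds() for a, b in zip(stamps, stamps[1:])]
        if sum(gaps) / len(gaps) < min_seconds_per_page:
            flagged.append(learner_id)
    return flagged
```

A learner who clicks through ten pages in thirty seconds gets flagged; a learner who spends two minutes per page does not. The design point is the one in the story: the feedback loop only exists because the measurement was built into the course from the start.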

In 2002, I wrote an article entitled, E-Learning’s Unique—And Seemingly Unknown—Capability. In essence, I was talking about the spacing effect—about facilitating learning by enabling learners to reconnect with key learning points over time. I specifically stated the following:

“Among all the learning media, e-learning is the only one that has the potential to have meaningful and renewable contact with learners over time.”

I further argued that e-learning’s connectivity capability was actually valuable because it aligned with the human learning system, enabling e-learning to deliver spaced repetitions, delayed feedback, and shorter retention intervals.

According to research from the preeminent refereed journals, spacing learning material over time adds to the power of repetition, producing improvements of up to 40%. Delayed feedback improves learning by 10 to 25%. Reducing the retention interval improves learning by significant amounts; rates vary depending on many factors, but especially on how much the retention interval shrinks. For example, in Harry Bahrick and family’s (Bahrick, Bahrick, Bahrick, & Bahrick, 1993) classic experiment on remembering foreign-language vocabulary, reducing the retention interval would have decreased forgetting by an average of 17% per year, and by up to 178% for a five-year reduction in the retention interval.

As the title of my earlier article suggested, in 2002 there were very few uses of the spacing effect in e-learning designs. The good news is that it appears that things are changing. I am currently in the process of writing an article on how things have changed.

I have seen the following types of spaced-learning implementations.

  • Email reminders delivering learning material after the primary learning events.
  • Mini-e-learning refreshers dispersed once a month.
  • Encouragement to managers to follow up afterwards.
  • E-learning delivered in chunks over time, as opposed to all at once.
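The email-reminder approach above can be sketched as a simple expanding-interval schedule. This is a hypothetical illustration—the function name and the particular intervals are my assumptions, not a published standard—but it captures the core of the spacing effect: reminders go out at progressively longer gaps after the primary learning event.

```python
from datetime import date, timedelta

def reminder_schedule(course_end, intervals_days=(2, 7, 21, 60)):
    """Hypothetical sketch: return the dates on which spaced follow-up
    reminders should be sent, at expanding intervals after the course ends."""
    return [course_end + timedelta(days=d) for d in intervals_days]

# A course ending October 1 would trigger reminders on roughly
# October 3, October 8, October 22, and November 30.
schedule = reminder_schedule(date(2005, 10, 1))
```

Each reminder would revisit a key learning point, which is exactly the "renewable contact with learners over time" that e-learning makes cheap to deliver.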

But I want more examples.

Please send me your ideas in an email to this email address. SORRY, ALL DONE GATHERING EXAMPLES…

The best examples will be incorporated into an article and will be mentioned in the newsletter, etc.

By the way, the examples don’t have to be pure e-learning examples either. All sorts of training examples are welcome.

What is the median age when children are potty trained? Pick the choice that is closest to the actual figure. Go ahead and pick an answer before you read further—just for the heck of it.

  1. 1 year old
  2. 2 years old
  3. 3 years old


Most Americans are likely to choose the second or third choice because most American children start potty training around two years of age.


What’s interesting is that at least 50% of the world’s babies are potty trained by the time they are 1 year old. This piece of data comes from a front-page story in the Sunday New York Times, October 9, 2005 (citing Contemporary Pediatrics magazine).


And the practice isn’t confined to a few cultures. Parents in more than 75 countries use these super-fast potty-training practices.


While I generally keep to adult-learning topics in this newsletter, I decided to include this topic for several reasons.


  1. If the NY Times runs it on the front page, it must be fit to print (wink-wink).
  2. The topic held the promise of being an attention-grabber.
  3. The topic reveals lessons for our training-and-development practices.

One obvious learning point is that cultural and individual differences exist in learning. The potty-training example illustrates how surprised we can be if we assume everyone learns the way we do. We cannot assume everything generalizes!

A second point is that while performance outcomes may not differ, different training methods may have different side effects. Most kids are going to be potty trained by the time they are five, but just imagine how much solid waste could be avoided if kids used one year of diapers instead of three. Different training methods produce many different types of side effects: cost differentials, organizational morale, individual sense of efficacy, self-esteem, work-family balance, and even pollution! Do you know your training’s side effects?


Let’s not forget that training affects learner attention, even after the training is over. If we train learners on grammar skills, later, when they’re in a meeting, they’re more likely to spend some of their limited working-memory capacity thinking about grammar than if we’d given them no training. This can be a beneficial or a disruptive side effect.


A third point we might glean from the potty example is that learner readiness affects the training investment needed. As the New York Times article pointed out, parents who use the super-fast training methods must be constantly vigilant, looking for the faintest signs that their child is about to pee or poop. Some of the parents who use these methods swear by them, describing how the methods help them feel closer to their children, but they all talk about what an incredible effort the approach takes. As child-guru T. Berry Brazelton was quoted as saying, “I’m all for it, except I don’t think many people can do it.” It’s just too difficult for most parents—especially the typical parent juggling multiple responsibilities.


Similar analogs can be found in workplace learning. Learner readiness can play out in many ways. Those who aren’t ready may need more training, more prerequisite learning opportunities, more effort, and more hand-holding. Lack of readiness may even reverse the training calculus: perhaps some topics, for some people in some situations, are just too costly to train. Alternative interventions may be required, or a decision to withhold training may be appropriate. Do you know how ready your learners are?


The learner-readiness notion is also relevant to public policy. For example, investments in our public schools affect the investments businesses need to make in learning.

Learning Points
  1. We cannot assume that we understand our learners.
  2. Learning designs have different side effects.
  3. Workplace attention is a side effect of training.
  4. The less ready learners are, the more investment needed.
  5. Learner readiness is affected by your elected officials.


Questions to Think About


  1. What are the side effects of your learning interventions?
  2. What level of readiness do your learners have?