Many people, including me, think the Internet is a great enabler of learning. But is it in jeopardy?

Right now the Internet provides relatively equal access for all people and organizations, from rich to poor, from powerful to oppressed. But might there be forces at work undermining Internet equality? Jeff Chester, executive director of the Center for Digital Democracy (www.democraticmedia.org), thinks that the Internet as we know it may soon come to an end.

Writing in The Nation (a left-of-center publication that errs on the side of, and gives voice to, anti-corporate advocates), Chester describes how phone and cable companies are proposing to charge user fees, capture personal data, and provide privileged services to those who can pay the most.

Check out the online article here. Its title: "The End of the Internet."

Jeff Chester is author of a forthcoming book on US media politics, Digital Destiny, which will be published in the fall by The New Press.

Which is better, e-learning or blended learning?

Both constructs are defined at too high a level of granularity to really matter for human learning. To say this more simply: it's not whether a program is e-learning or blended learning that matters. It's the learning methods used in designing these learning interventions that matter.

An e-learning program can be well-designed or poorly designed, depending on what learning methods are utilized in its design. It’s the same with a blended-learning approach. A learning intervention that utilizes meaningful repetitions will produce better learning outcomes than one that doesn’t. A learning intervention that provides realistic retrieval practice will be more effective than one that doesn’t. The key is to know how learning works, and design accordingly.

Imagine being asked the question, “What type of vehicle is going to get better gas mileage, trucks or sports-utility vehicles?” Such a question suggests a poor understanding of the causes of gas mileage. A person with a basic understanding of gas-mileage factors wouldn’t even ask such a question.

The implications of this are as follows:

1. If someone tells you they have a great learning intervention because it’s a blended-learning program, you are entitled to be skeptical. If they are trying to sell you this learning intervention, you are entitled to chuckle quietly—although we recommend nodding slowly and gently with sad empathetic eyes.

2. If you have to purchase or design a learning intervention, and you've been thinking that all you need to do is determine which media or which technology to use, please find someone who understands human learning to help you focus on learning methods and outcomes.

As I type this, I can hear the opening ceremony of the 2006 Winter Olympics emanating from our living room.

Oh damn.

Two weeks of discipline required. If I start watching–if I take just one sip–I’m doomed to fall into the hypnotic seduction of the thing. Earlier this week I wrote a piece suggesting that we embrace popular culture to figure out what grabs people’s attention and imagination. Good advice, but I just don’t have the luxury to spend time watching people in tights for two weeks. I have too many other things to do; too many things to learn.

Learning takes time. Learning requires that we NOT do something else. Just like a good business strategy forces a company to decide what NOT to do, individuals who want to maximize their learning must have a good learning strategy. They must decide what activities to forgo.

Hmmm. What leverage can I gain from this knowledge in terms of instructional design? I don’t know, maybe none. It’s certainly relevant to individuals deciding whether (and how) to spend time learning something. But can I use this nugget to improve the instructional results or informal learning of the learners I am charged to help?

  1. Well, we might remind learners’ managers that learning takes time, and that they can help by protecting learning time.
  2. We can try to make learning more efficient, enabling our learners to forgo fewer other activities.
  3. We can think about whether our learning efforts are really that important, and just cut out those that don’t hit the threshold.
  4. We can measure learning to ensure that it’s really making a difference–instead of just taking this on faith.
  5. We can create compelling learning designs and focus on high-value, highly-relevant content, drawing our learners away from their distractions, vices, addictions; away from the mindless fluff of our entertainment culture; away from their spouses, children, parents. Okay, well maybe we should just make page turners.
  6. We can take over NBC (or whoever's running the Olympics this year) and add some learning content to it. We could add some info about nutrition, exercise, genetics, ethics, international diversity, and the unfair playing field for athletes from countries that lack money or lack snow and ice. We could teach media literacy, and show how the networks–the advertisers really–try to control our minds and our actions.
  7. We could just watch the damn Olympics and take a freakin’ rest for pete’s sake.

Go Bode go!!

Anna Belyaev and her team at Type A Learning Agency have inspired this question:

"What do professions outside the training/development/performance field have to teach us about learning design?"

They ask this of themselves all the time, and now they’ve got me thinking. I’m going to share my thoughts when they arise. Here’s the first one.

I heard Bill Clinton’s eulogy for Coretta Scott King and I was touched by it. It was a simple podcast from NPR. It was just audio. But I was emotionally engaged and I listened with great interest and intent.

Ms. King led an especially meaningful life as a civil rights leader, advocate of nonviolent struggle for racial equality, and wife of Martin Luther King, Jr.

Learning, first and foremost, requires learners to pay attention. This is often a conscious intentional process. It is often an unconscious and unintentional process as well. Either way, attention is one of the leverage points that we can use to spur learning and retention. Usually, if we increase attention we increase learning.

What if we could distill the factors that help keep learners engaged and attentive to our learning messages? Let's take Bill Clinton's eulogy for a minute. What does it do that grabs attention? Can we borrow these factors, or are they specific to eulogies and great public speakers?

Here are some of the factors that may be at work:

  1. Someone had died.
  2. The focus was on a celebrity.
  3. The speaker was a celebrity.
  4. The speaker was a person known for having his own struggles.
  5. The speaker understood his audience.
  6. The speaker got the audience to add energy to the message (clapping and chanting).
  7. The speaker talked about personal matters that most could empathize with.
  8. The speaker utilized religious symbols and meaning.
  9. The speaker repeated themes throughout his speech, but intermixed repetitions with other comments, and varied the surface form of the repetition, but not the underlying theme.
  10. The speaker modulated his voice appropriately, raising it to express power and importance, lowering it to connect personally.
  11. The speaker changed the speed of his delivery effectively, for example speeding up to move quickly through a list of attributes, slowing down, almost stopping, to create a sense of intimacy.
  12. The speaker used humor effectively.
  13. The speaker understood what was acceptable to say in the situation and moment.
  14. The speaker challenged power, creating a heightened sense of tension and importance.
  15. The speaker asked each person in the audience to examine their own actions and their own moral responsibilities.
  16. The speaker made a call to action. He asked for something.
  17. The speaker ended with an intensely moving sentence, tying it back to the person who had died.

What else did you hear that worked to make this a compelling communication? What didn’t work for you? Why did things work or not? Did your political beliefs and convictions affect how you received the message?

Obviously, there are some things on my list above that we may not be able to use in our training. Religious symbolism has its pitfalls. You can’t always find celebrities as voice talent. You can’t always find great public speakers. You can’t kill someone every time you create a course on Microsoft Excel.

But we can at least move in these directions. We can sometimes appeal to people’s sense of responsibility or justice. We can sometimes find appropriate ways to get highly-visible company personalities to get involved. We can sometimes find real-world stories that show the negative consequences of what happens when training recommendations aren’t followed.

I’m a big advocate for research-based or evidence-based instructional design. As we move through our own worlds, we can keep track of what grabs our attention and what spurs our learning. We can also pay attention to fields whose very survival depends upon grabbing attention or creating learning. There wouldn’t be television, theater, radio, newspapers, magazines, pornography, religious services, performance poetry, books, ballet, music, or Cirque du Soleil if people in these professions hadn’t figured out time-tested ways to grab attention and keep interest. The factors they use are there for us all to see and hear (and experiment with and utilize in our instructional designs), if only we begin to look and listen!!

Well, it looks like one of my previous brainstorms was wrong. Check out this link from the American Psychological Association on cell-phone use while driving. Initial research seems to suggest that cell phones DO hurt driving performance. I'm still not sure whether drivers can learn to use cell phones more effectively while driving.

47 years old and I began hating my haircut. I was fine with my hair and then one day I just snapped. I needed a change. It’s painful to fire one’s barber/hairstylist, but I just had to find someone new.

But the problem is this. My hair’s thin and thinning. I can’t use any haircare products because I’m chemically sensitive, so many styles just don’t work. No mousse here. No gels. No spray. My new hair professional has to be somebody who can really think, not just copy a style and apply it to my head. My head requires creativity and deep knowledge.

So I begin the painful process of finding a new hair cutter. Damn I hate this, but I gotta do it, so here goes.

The first guy who cuts my hair is a genius. He looks at my hair. He listens to my strange set of requirements. He talks to me. He cuts. Looks good. We talk.

Here’s how he learned to cut hair. He started out cutting his own hair. He tried different things. He experimented. He built mental models of various cause and effect relationships. He’s not afraid to try different approaches. He also has hair like mine. I like him, but he costs me over $50, which is too much. I’m cheap, and I figure maybe I can find someone else with a better value proposition.

The second person I try has one way of doing things. She’s weirded out by my "no chemicals" request. She tries, but the haircut just doesn’t cut it. She only costs me $20, so maybe I’ll try her again. It could take a little trial and error. She even mentions this.

I’m probably drawing too much from these two data points of anecdotal evidence, but it reminds me of learning research I’ve come across in the past. To help our learners overcome "functional fixedness"—the tendency to limit the range of response sets we consider—it’s helpful to provide learners with multiple contexts and to specifically help them avoid such fixedness by helping prepare them to analyze realistic situations.

The first hairstylist was better able to deal with my wacky hair requirements because he had developed more flexible and more appropriate mental models of how hair-cutting works.

We can help our learners in the same way by:

  1. Providing multiple contexts for practice.
  2. Helping learners understand the underlying principles, not just the obvious surface characteristics of the information to be learned.
  3. Avoiding blocked learning chunks, for example presenting information only in topic sections or chapters without ever forcing learners to deal with all the information together (as they would have to in the real world). In other words, instead of dividing our learning into chapters, we can present it in ways that prompt learners to deal with it more organically, more authentically. This doesn't mean we can't start with topic sections, but we can't end there if we want to prepare learners for the real world (see the sketch below).
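To make the blocked-versus-interleaved idea concrete, here is a minimal sketch in Python. The topic names and practice items are hypothetical, invented only for illustration; the point is simply the difference in ordering.

```python
import random

# Hypothetical practice items for three topics (made up for illustration).
topics = {
    "formulas":   ["f1", "f2", "f3"],
    "formatting": ["m1", "m2", "m3"],
    "charts":     ["c1", "c2", "c3"],
}

# Blocked practice: learners finish all of one topic before seeing the next.
blocked = [item for items in topics.values() for item in items]

# Interleaved practice: items from all topics are mixed together, so each
# question forces the learner to decide which knowledge applies -- closer to
# how the material shows up in the real world.
interleaved = blocked.copy()
random.shuffle(interleaved)

print("Blocked order:    ", blocked)
print("Interleaved order:", interleaved)
```

The learning content is identical in both cases; only the sequencing changes, and that sequencing is exactly the lever this third point is about.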

I’ve noticed functional fixedness in our own performance as learning developers. Almost all instructional-design shops tend to gravitate toward a limited number of learning methodologies when creating their learning programs. They have a functional fixedness toward instructional design. To create the best value and to be more creative (and to prevent themselves from being outwitted by more creative competitors), instructional-design shops need to develop a wider range of learning methods.

It’s the "if you have a hammer, everything looks like a nail" problem. Not only do we have to prevent our learners from falling into this trap, we have to prevent ourselves, as instructional-development houses, from falling into it as well.

Here are some references on functional fixedness for those interested:

Chrysikou, E. G.; Weisberg, R. W. (2005). Following the Wrong Footsteps: Fixation Effects of Pictorial Examples in a Design Problem-Solving Task. Journal of Experimental Psychology: Learning, Memory, and Cognition, 31, 1134-1148.

Solomon, I. (1994). Analogical transfer and "functional fixedness" in the science classroom. Journal of Educational Research, 87, 371-377.

Langer, E. J. (1992). Matters of mind: Mindfulness/mindlessness in perspective. Consciousness and Cognition: An International Journal, 1, 289-305.

Antonietti, A. (1991). Why does mental visualization facilitate problem-solving? In R. H. Logie & M. Denis (Eds.), Mental images in human cognition (pp. 211-227). Oxford, England: North-Holland.

McKelvie, S. J. (1984). Relationship between set and functional fixedness: A replication. Perceptual and Motor Skills, 58, 996-998.

Arnon, R.; Kreitler, S. (1984). Effects of meaning training on overcoming functional fixedness. Current Psychological Research & Reviews, 3, 11-24.

Greeno, J. G.; Magone, M. E.; Chaiklin, S. (1979). Theory of constructions and set in problem solving. Memory & Cognition, 7, 445-461.

This is a review of ZengerFolkman’s ActionPlan Mapper. Let me provide a little background so you’ll understand my conclusions.

In 2002 I wrote an article on e-learning’s unique capability—that it was one of the few learning media that enabled us “to have meaningful and renewable contact with learners over time.” I argued that e-learning was a tool, and that we ought to figure out what it does well and maximize the advantage of that capability—as long as our e-learning methods are aligned with the human learning system. No sense utilizing an e-learning method if it doesn’t facilitate learning and performance.

I wrote about several learning factors that seemed ideal for e-learning. I also challenged the industry to get its butt in gear. At that time I didn’t see many applications of e-learning that took advantage of the connectedness capability. I just reread that article in preparation for writing this blog piece. It was really quite brilliant—even though I must say so myself—and I recommend it highly. You can purchase it for five bucks at www.work-learning.com/catalog/. Go ahead, make me rich.

A few years ago, I also taught an online class entitled Leveraging E-Learning. One of the suggestions I made in that class was that we ought to use our new-found internet/intranet capacity to connect with our learners’ managers as well as our learners. I even developed some rudimentary templates that outlined how this could be done.

Although I’m recounting my former brilliance for you in the hopes that you’ll hire me as your learning consultant in the near future—and to make myself feel good during these dark winter days—true geniuses don’t just rant and rave, they make things.

Today’s training-genius award goes to the folks at ZengerFolkman who developed the ActionPlan Mapper (www.zfco.com/apm.asp). They have given me renewed faith that eventually e-learning will meet its promise.

The ActionPlan Mapper is a web-based hosted solution that is available 24/7. It was designed to help training participants take what they learned and apply it to their jobs. As Kelly Clayton, Product Leader for the ActionPlan Mapper, has said, “What we’re trying to prevent is the Monday-morning problem. People go to a training course, they take notes, they have discussions, they get energized, they’re roaring to go, but when they get back to the job on Monday, they are overwhelmed with their normal workload and the momentum for action fades to oblivion…The ActionPlan Mapper works by prodding the learners, reminding them to stay focused and keep pursuing the action items they previously resolved to accomplish.”


Review Details

From the two intensive demos I’ve seen, the ActionPlan Mapper is a great tool. From a learning-to-performance perspective, it creates some powerful learning effects:

  1. It indirectly reminds learners of what they learned, helping them to remember what they learned long enough to put it into action.
  2. It spurs workplace action by regularly reminding learners that they ought to be working to implement what they learned.
  3. It helps learners keep a focus on their intended post-training actions.
  4. It brings managers into the process of training implementation, making them partners and/or drivers of training application.
  5. It can be used to hold learners accountable for their action plans, helping to significantly lift the priority of training implementation from a “nice-to-do” to a “must-do.”

Although no formal evaluation studies have been completed as of yet on ActionPlan Mapper (maybe they haven’t heard of LearningAudit.com), I’m willing to bet that learners who use ActionPlan Mapper will be at least 50% more likely to utilize (on the job) what they learned in the classroom or in an e-learning course. I actually think performance improvements could be more like 200 to 300% for many post-training situations, but I’m being conservative because results always depend on many variables. Besides, a 50% improvement in on-the-job application is huge already!

The cost of the product seems reasonable to me, especially given the upside I just discussed. For only $40 to $250 per person (depending on several factors), ActionPlan Mapper is yours! The thinking behind the design is that simpler is better. Clayton claims that what ZengerFolkman was aiming for was a product that people would find easy and intuitive to use. As long as they have a web connection, people can use ActionPlan Mapper anytime anywhere to stay in touch with their action-planning projects. This design strategy is paying off as clients are using the tool beyond the training context for development planning, follow-up to performance reviews and strategy sessions, and more.

Description by way of Screen Shots


On the first screen below, participant Bob Sherwin has two action-planning “projects.”

[Screenshot: Bob Sherwin’s list of action-planning projects]

The second screen shows Bob’s goals for his action-planning project, “Becoming a Better Manager.” The grayed-out goals have already been accomplished. Goal 4.4 has a lock next to it to indicate that it is a private goal (viewable only by the participant, not by his or her manager).

[Screenshot: goals for the “Becoming a Better Manager” project]

Participants are prodded and reminded with emails from the system. They can also be encouraged to focus on their goals by their managers. In fact, the system seems ideally structured to encourage conversations on tasks central to business goals and organizational success.

The third screen shows the manager’s view.

[Screenshot: the manager’s view]

More complex systems like Microsoft Project are available for some similar applications, but these are not really suited to the kind of use envisioned by ZengerFolkman. A product offering similar capability in providing training follow-up, FridayFives, is offered by the Fort Hill Company (http://www.ifollowthrough.com/).

A bright idea.

Since it’s easier for me to come up with ideas than it is for these folks to develop these products, here’s an idea, for what it’s worth.

I’d like to see these systems augmented with a parallel structure that provides direct learning reminders and/or practice opportunities. For example, for a leadership course, learners could be given periodic scenarios related to managing people. Learners would have to decide what to do in these leadership situations. These scenarios would help remind learners of what they learned and thus make it much more likely that, when faced with similar situations on the job, they’ll remember how to perform successfully. The learning research clearly shows that such “retrieval-practice” opportunities are powerful prompts for long-term memory. The leadership scenarios would also provide learners with feedback and help them assess their competence, thereby giving them a heads-up about the kinds of information to look for as they attempt to learn on the job.
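As a rough illustration of the idea, here is a minimal sketch of how such a reminder schedule might be generated. The expanding intervals and the example course date are assumptions made purely for illustration; they are not anything ZengerFolkman has built or published.

```python
from datetime import date, timedelta

def reminder_dates(course_end: date, intervals_in_days=(3, 7, 14, 30, 60)):
    """Return dates on which to send a scenario-based retrieval-practice prompt.

    The expanding intervals are illustrative, not a validated schedule.
    """
    return [course_end + timedelta(days=d) for d in intervals_in_days]

# Example: a leadership course ending February 15, 2006 (a made-up date).
for when in reminder_dates(date(2006, 2, 15)):
    print(f"Send leadership scenario and feedback prompt on {when.isoformat()}")
```

Each prompt would carry one short scenario, collect the learner’s decision, and return feedback; that exchange is where the retrieval practice actually happens.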

Other reminder systems and retrieval-practice systems could be developed as well.

Still, bottom line, I love the ActionPlan Mapper concept. It’s simple, but it drives training transfer. It’s relatively inexpensive, but it utilizes one of the uniquely potent characteristics of online learning—the connection we can have to our learners and their managers. Way to go ZengerFolkman!!

Who are the best instructional development shops, developers, etc.?

  1. Who are the best custom e-learning development companies?
  2. Who are the best off-the-shelf e-learning development companies?
  3. Who are the best developers?
  4. Who is being the most innovative?

Just curious…

And what criteria would you use to decide?

January 2006 was a very busy month for me. I spent it on three major initiatives. First, I got down to the business of writing a book on learning and instructional design. Second, I got LearningAudit.com up and running, a Work-Learning Research service providing benchmarking and assessment services. Third, I spent half the month at client sites, involved with two of the most innovative and effective e-learning custom-development houses in the world. I learned a ton.

Celebrity News. Start.

One of my clients told me this month that they hired a firm to help them garner insights about the training/learning/e-learning field, AND that the firm that they hired—obviously a world-class firm—analyzed the evidence and discovered through exhaustive data-analytic techniques that I, cuddly curmudgeon Will Thalheimer, was one of the top 5 analysts in the learning field.

Upon hearing the news, I buzzed with delight. "I made the top 5 mom!" But then I realized that I had no idea what an analyst does. I still have no idea how metrics could be made valid to rate the top analysts. But I guess being in the top 5 is pretty good. Right?

But why not #1? And what if this is like the Oscars and there were five nominees in each category? Maybe I’m just one of the four also-rans. And damn, now I’m going to have to rent a Tux.

And the real kicker is this. I keep trying to tell everyone that I don’t have time to be an industry analyst. Other people care a lot more about who gobbles up whom, who’s going where to work, and who’s sleeping with whom. I have too much real work to do already, providing my clients with insights and keeping up to date with the research on learning. I don’t want to be an analyst! Take me off the list, damn it!!

But here is a little secret I learned about the industry this month (just so I don’t lose my top-five ranking). Some of the best e-learning developers are becoming a bit frustrated with their clients because their clients won’t let them build the most effective learning designs.

So, if you’re out there buying e-learning, I have two pieces of advice. First, hire Work-Learning Research, Inc. (and the top-five guy) to help you find the best provider for your particular needs (shameless plug), and second, once you’ve hired one of these excellent providers, get the hell out of their way!!

I know it’s hard to trust anyone these days living in a society driven by a million little lies, but there are e-learning vendors out there who really know what they’re doing.

Different topic. Start.

Although I am a great proponent of research-based learning design, I have believed for quite some time that research has to be interpreted intelligently to be useful. From my work as a consultant, I have also learned that practitioners regularly create innovative and effective learning, sometimes based on research insights, but often not. That’s okay with me. If it’s effective, it ought to be celebrated.

The book I’m writing will provide a research-based perspective, but I want it to be more. I want it to provide examples of the best learning designs out there, whether they are intentionally research-based or not. If you’d like to nominate folks who are doing world-class work, please comment here or send me a private email. I’ve got some pretty good ideas about who is doing the best work, but I’m sure I’ll miss somebody if you don’t enlighten me.

In your estimation, which companies are building the most effective and innovative learning interventions? Who should I talk with? Who can I learn from?

Final Topic. Groundhog Day. February 2nd. Today.

Raised in Pennsylvania, I used to be partial to Punxsutawney Phil, the groundhog reputed to be able to predict whether winter will be long or short. If he sees his shadow, we get six more weeks of winter. According to the Stormfax Weather Almanac, Phil has seen his shadow 96 times since 1887. He has seen NO shadow 14 times. And 9 times, no report was forthcoming. The bad news is that the groundhog has been correct only 39% of the time.
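Just to put those numbers in perspective, here’s a quick back-of-the-envelope calculation in Python, using only the counts quoted above (the 39% accuracy figure is Stormfax’s; I can’t recompute it without the underlying weather records):

```python
# Counts from the Stormfax Weather Almanac, as quoted above.
shadow = 96      # "six more weeks of winter" predictions
no_shadow = 14   # "early spring" predictions
no_record = 9    # years with no report

predictions = shadow + no_shadow
print(f"Recorded predictions: {predictions}")                           # 110
print(f"Share calling for a long winter: {shadow / predictions:.0%}")   # ~87%

# Taking Stormfax's 39% accuracy figure at face value:
print(f"Approximate correct calls: {round(0.39 * predictions)} of {predictions}")
```

In other words, Phil almost always calls for more winter, and even so he’s right less than half the time.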

This reminds me of the training industry. We see sunny weather for almost all our training interventions, but the reality is that we actually fail more than we will admit. We use crude and inappropriate methods for predicting on-the-job training transfer, just like those deluded souls in Pennsylvania who believe in their large sleepy rodent. We do things because of the commercial benefits, not because of effectiveness, accuracy, or appropriateness. Clymer H. Freas, city editor of the Punxsutawney Spirit newspaper, invented Groundhog Day in 1887 and the town of Punxsutawney has kept it going as a revenue-generating tourist attraction ever since. The predominant messages in our industry are created by or for our vendor elites.

Weird coincidence. Oprah Winfrey hosted the groundhog on her television show in 1995. This should have been an omen for Oprah watchers—the fraud of the groundhog perpetrated by Clymer Freas is very similar to the fraud perpetrated by the author of the Oprah book-club selection "A Million Little Pieces." The author was James Frey, who lied repeatedly in his book but sold it as a non-fiction memoir. Did you notice the synchronicity? Freas and Frey. Probably pronounced the same, too. Oprah originally defended Frey, saying the truth didn’t matter if the outcome was good. Later she pulled a stunning live-TV reversal, ripping into the author and his publisher, and apologizing for not previously upholding the value of truth. "Truth matters," Oprah said simply.

Is there a lesson here for us?

Do you smell a rat in the training industry? Are we destined to continue repeating ourselves like Bill Murray in the movie? Or does truth matter in our work?

If we really want to learn from our work, we need to measure our training outcomes.

Happy Groundhog Day!