I got an email today from someone asking me about a term I created called “Evaluation Objectives.” I realize that I have not actually written anything for public consumption on this, so this blog post will suffice until my book on workplace learning is released. Apologies if the following is not completely clear.

The basic idea is that we ought to have evaluation objectives rather than learning objectives in the traditional sense.

Specifically, we need to decouple our learning objectives from our evaluation objectives so that what we evaluate is directly relevant. Of course our evaluation objectives and learning objectives have to be linked, but not necessarily with a one-to-one correspondence.

AN EXAMPLE

Suppose you want to train managers to be better at championing change efforts.

Traditionally, we might have objectives like:

The learner will be able to describe how people tend to resist change.

Or, put in a more performance-oriented fashion, a traditional objective might read:

The learner will engage in activities that lessen colleagues’ resistance to change.

Examples of evaluation objectives might be as follows:

1. The learner will initiate a change effort within one month after the training ends and be successful in getting 75% of his/her colleagues to sign a public statement of support for the effort.

OR, if real-world compliance cannot be assessed, an evaluation objective might be something like:

2. In the “Change-Management Simulation,” the learner will score at least 65 points out of a possible 90.

OR, if a simulated performance can’t be created, an evaluation objective might focus on ratings by employees.

3. Two months after the training ends, the learners’ colleagues will rate them, on average, at least 4.5 on a 6-point multi-rater 360-degree change-management scale, on each of the 5 indices.

OR, if this can’t be done, an evaluation objective might focus on a series of scenario-based questions.

4. On the 20-question scenario-based quiz on change management given two weeks after the course ends, the learner will answer at least 17 correctly.

NOTE: More than one evaluation objective can be used for any learning intervention.

THE POINT:

Evaluation objectives are NOT tied to individual learning points that have to be learned, though of course they are linked because both should be relevant to the overarching goals of the learning program.

THE BIG BENEFIT:

When objectives focus on the big picture, as compared to when there is a one-to-one correspondence between learning objectives and evaluation items, (1) they are more relevant, (2) learners are more likely to see them as valuable and worth achieving, (3) organizational stakeholders are more likely to see the evaluation results as having face validity, and (4) the evaluation results will give us additional pertinent information on how to improve our learning interventions.

Tonight at the ISPI conference in Orlando, I attended a tribute to Geary Rummler, who recently died after a long and distinguished career in the “Performance-Improvement Field.”

I didn’t know Geary, so I didn’t know how I would react or how long I would stay. I brought my laptop to do emails while I listened. I sat in the back of the cavernous ballroom.

I became transfixed as speaker after speaker who had worked closely with Geary talked about his work and the contributions he made to the field.

The following is my stream of consciousness note-taking with some later annotations. Not worthy of a tribute, but perhaps enough to help me remember some of Geary’s work—and perhaps enough to encourage YOU to take a look at his books and writings.

Notes and Annotations

Entrepreneurial experimentation. Science and sweat. Learning through trial and error.
Creating one-week program on programmed instruction. Didn’t have time to tell, to present objectives, etc. Showed them, had them practice. Curiosity, interested in what did NOT work, not just what DID work. Pre-Testing then Programmed Instruction then Maintenance of Behavior. Annotation: Even way back in the 1960’s and 1970’s someone was thinking about maintaining performance after training, and we are still struggling to get most of the field to do this.

After the initial workshop was developed and deployed, then Management of Behavior Change Workshop. Then General Systems theory workshop. Started small and specific—built up to systems.
During the 1960’s, Geary and colleagues wouldn’t do training without a thorough front-end needs analysis.

Annotation: Hmmm. With today’s pressures, many are eschewing FEA.

Geary rebuffed a man who insulted one of his female colleagues by telling him, “Shut up, she knows more than you do.”

Praxis (a Geary company) had a mission—to make the world a better place by improving the places where people worked. (We didn’t earn that much money, the speaker said.)

One of Geary’s former colleagues sang a song in tribute (a tear-inducing moment).

Geary was an engineer. Geary worked with Tom Gilbert. Helped plan Motorola University. Did coaching of functional managers on the manufacturing curriculum. One of the nice things about Geary: as a consultant he kept learning—he would even tell about the mistakes he made.

Quote from Geary (paraphrased): “Beware of false prophets, the HR people, who would rank and rate you, but don’t really understand the organization.”

Article: “You want performance, not just training.”

Did situation analysis. Asked these questions: (1) What is happening now? (2) What should be happening? Then defined desired outputs: “as-is” versus “to-be.”

This all became Six Sigma, etc. Annotation: Several people said Geary’s work became the basis for the Six Sigma movement, TQM, etc.

Geary said: You’ve got a lot of white space on the org chart you need to manage.

Annotation: People like Geary and his colleagues have been doing very valuable work for years (for example, they got cycle time down from 17 weeks to 5 days), but why hasn’t this spread? Why hasn’t this performance-based approach gradually knocked out the dominant training-based approach?

Rummler’s work got repackaged into Six Sigma. Motorola licensed Geary Rummler’s work and folded it into Six Sigma and TQM.

Geary’s true legacy: Changed the lens we all look through, moving from training to performance improvement.

Provided a common language:
White space, disconnects, organization as a system, cross-functional processes.

Geary Rummler: Managing the white space.

Performance Design Lab: Geary retired, but he came back out of retirement because he knew the world didn’t get it yet and he had to get back in the game, forming the Performance Design Lab in 2000, where the focus was on performance-management systems (maintaining improvement after the performance improvement).

Serious Performance Consulting. A Geary Book.

Geary always had to have a vehicle for packaging his insights into his workshops and his books in ways that would make sense to people.

A speaker said, “He believed it would be possible to create a prosperous society by constructing good systems that would sustain successful organizations and generate superior results.”

Two places that Geary touched seem to want to use Geary’s work to improve the world: the Performance Improvement Institute in Ciudad Obregón and the Sonora Institute of Technology, in the northwest of Mexico.

Geary Quote: “We cannot continue working the way we are, expecting to get different results.”
Speaker quote (paraphrased): “Geary would share his materials more than anyone I know, I think because he was a learner; he wanted us all to know enough so that he could discuss things with us and we could learn together.”

Drinking alcohol and developing relationships with good discussions was a recurring theme. Many stories about drinks being drunk.

“If you put a good performer in a bad system, the system wins every time.” Quote from Geary Rummler.

One speaker, quieted by tears, haltingly spoke about imagining Geary becoming a star in the sky…

Annotation: I really didn’t know that much about Geary’s work, but now I am motivated to learn more. Also, I’m glad I went because it gave me a nice perspective on the field, even knowing that this “history” was filtered through the lens of tribute protocol.

Bottom line: I was touched and I’m motivated to learn more. Thanks to ISPI for providing this, to all the speakers (whose names I apologize for failing to capture), and to all the people who came to Orlando especially for the tribute.

Here are two of Geary's most popular books: Improving Performance: How to Manage the White Space on the Organization Chart and Serious Performance Consulting.

For years I've been compiling research from preeminent refereed journals that shows, time and time again, that aligning the learning and on-the-job performance contexts is key to supporting long-term remembering.

Now, I continue by focusing on cultural and linguistic context.


Read the research report.

Here are the major recommendations:

  1. Utilize decision-making scenarios. Consider using them not just in a minor role—for example, at the end of a section—but integrated into the main narrative of your learning design.
  2. Figure out what the salient cues will be in the workplace situations that your learners will face in utilizing the content you are conveying. As much as possible, simulate those cues in your decision scenarios. Consider using multimedia to augment this effect, relying on excellent acting, directing, and set design to enable the context effects that will trigger remembering.
  3. In simulating workplace cues, consider the range of cues that your learners will pay attention to in their work, including background objects, people and their facial expressions, language cues, and cultural referents.
  4. Determine the most important points you want to get across AND the most important situations in which these points are critical. Then, provide extra repetitions spaced over time on these key points and situations.
  5. Utilize culturally appropriate objects, backgrounds, actors, and narrators in creating your scenarios. Consider not just ethnicity, but the many aspects of culture, including such things as socio-economics, education, international experience, immersion in popular culture, age, etc.
  6. Pilot test new designs using valid evaluation methods to determine the most effective designs for your learners, your workplace situations, and your learning points.

Dear President Obama,

You're a technophile, I have heard. So, I have an improvement to suggest for the FDA, particularly in how it deals with food-safety issues.

Here's what the FDA does now: it asks consumers to phone complaints in to regional hotlines.

In the age of web technology, the FDA's methodology is just plain laughable.

I propose a webpage with a database that would enable citizens to submit food-safety alerts.

This should be damn simple. The post office has a list of all addresses in the country. Why can't the FDA create a list of all foods sold in the U.S. plus a list of all food sellers (grocery stores, restaurants, etc.)?

Consumers who suspect they have some bad food could go online and, within a few clicks, select their product and where they bought it. They could describe the issue, etc.

In the background, the system would monitor products for unusual activities (larger than normal number of alerts) and create an alerting response when something looks wrong.
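In case it helps, here is a minimal sketch of the kind of background monitoring I have in mind. Everything in it (the report fields, the baseline numbers, the threshold rule) is a placeholder assumption for illustration, not an actual FDA system or real data.

```python
# Minimal sketch of the alert-monitoring idea described above.
# All names and numbers here are hypothetical placeholders.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Report:
    product_upc: str   # the UPC printed on the package
    seller: str        # store or restaurant where it was bought
    description: str   # the consumer's description of the problem

def find_suspicious_products(reports, baseline_per_week, multiple=3.0):
    """Flag products whose weekly report count far exceeds their baseline."""
    counts = Counter(r.product_upc for r in reports)
    return [upc for upc, n in counts.items()
            if n > multiple * baseline_per_week.get(upc, 1)]

if __name__ == "__main__":
    this_weeks_reports = [
        Report("52159 00006", "Local Grocery", "off taste"),
        Report("52159 00006", "Corner Market", "chemical smell"),
        Report("52159 00006", "Local Grocery", "sour taste"),
        Report("11111 22222", "Corner Deli", "stale"),
    ]
    typical_reports_per_week = {"52159 00006": 0.5, "11111 22222": 2.0}
    print(find_suspicious_products(this_weeks_reports, typical_reports_per_week))
    # -> ['52159 00006']
```

A real system would obviously need to weight report volume by sales volume and geography, but even a crude threshold like this would surface clusters of complaints far faster than waiting for phone calls to trickle in.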

If the FDA doesn't have the wherewithal to design and create such a system, I would be glad to take this on with my strategic partner Centrax Corporation (they build high-end e-learning and web programs and could whip this up no problem).

Seriously, the FDA could save lives very simply and at a relatively low cost. Let's just do it.

Thank you Mr. President for considering this.

Please let me know what I'm supposed to do with the yogurt in my refrigerator that tastes bad. If you think I'm going to call one of those numbers, you just don't get it.

–A worried citizen/consumer

Update Thursday April 16th

Yesterday I decided I should make those calls. I called the yogurt manufacturer, went to their website, and called my regional FDA hotline person (who called me back today, a day later). Stonyfield Farm has posted the following recall information (their phone complaint line was horribly implemented, with long wait times, and no one has gotten back to me from their online complaint system):

Londonderry, NH – April 3, 2009 – Stonyfield Farm is conducting a voluntary recall of Fat Free Plain Quarts in Stonyfield Farm branded containers limited to specific dates. The products are being recalled because they may contain a presence of food grade sanitizer.

Affected products are limited to Stonyfield Farm 32 ounce Fat Free Plain yogurt UPC # 52159 00006 carrying one of the following product codes printed along the cup bottom that start with the following date codes:
· May 06 09 Time stamped 22:17 thru 23:59 (limited to these specific time stamps only)
· May 07 09 All time stamps

Approximately 44,000 quarts were distributed to retail accounts nationally.

We have received several reports of people noticing an off-taste when eating the product. We have received no reports of illness of any kind after consuming the product.

The issue was a result of human error in not following our Company's standard operating procedures. Stonyfield has taken all the necessary corrective action to prevent this from occurring again.

Consumers are advised not to consume the product and to return opened and unopened containers to the store where it was purchased. Anyone returning these products will be reimbursed for the full value of their purchase.

Customers with questions should contact Stonyfield Farm Consumer Relations at 1-800-Pro-Cows (776-2697) or visit our website at www.stonyfield.com.

This was listed on their website when I checked today. I didn't notice it yesterday (they have a very busy home page), but it probably was there.

Note to Stonyfield Farm: 

I am not satisfied with your announcement stating, "We have received several reports of people noticing an off-taste when eating the product. We have received no reports of illness of any kind after consuming the product."

THAT IS NOT GOOD ENOUGH!! You should (1) tell us what we ingested, and (2) get health experts to provide guidance on what symptoms or dangers we might be subject to.

More:

I just called the Stonyfield Farm Consumer Hotline again (and actually got through to them today), and the guy said it was a food-grade sanitizer, FDA-approved, organic, etc. He told me ingesting it wouldn't hurt me, but I'm not convinced. I told him I wanted to know what it was I ingested. He wouldn't or couldn't tell me. I asked him if I ate a whole container whether it would hurt me…He said no.

Hey Stonyfield. You can do better…

Learning professionals (like me) can often gain insights about our industry from people in the field who have different vantage points than our own. I recently talked with Eric Shepherd, CEO of Questionmark, to get a sense of our industry and how it has been affected by the bad economy. Eric has been a good friend and long-time supporter of my research over the years and I’ve come to value his counsel.

Questionmark is the leading provider of assessment software, according to a recent eLearning Guild study. I thought that, from his perch overseeing all things assessment, Eric might be able to give us some unique insight into the learning-and-performance field in general.

Check out my interview with him at the recent Guild conference. I divided it into two parts to make viewing easier.

Part 1: What trends do you see that we may be missing? 

Part 2: How is the bad economy affecting the learning assessment marketplace? 

Last year I wrote at length about my efforts to improve my own smile sheets. It turns out that this is an evolving effort as I continue to learn from my learners and my experience.

Check out my new 2009 version.

You may remember that one of the major improvements in my smile sheet was to ask learners about the value and newness of EACH CONCEPT TAUGHT (or at least each MAJOR concept). This is beneficial because people respond more accurately to specifics than to generalities; they respond better to concrete learning points than to a vague semblance of the full learning experience.

What I forgot in my previous version was the importance of getting specific feedback on how well I taught each concept. Doh!

My latest version adds a column for how well each concept is taught. There is absolutely no more room to add any columns (I didn't think I could fit this latest one in), so I suppose we've hit the point of diminishing returns on further improvements.
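For anyone curious how concept-by-concept data like this might be tallied, here is a minimal sketch. The concept names, the rating scale, and the column labels are placeholder assumptions of mine, not the actual questions on my smile sheet.

```python
# Hypothetical tally of a concept-by-concept smile sheet.
# Each response rates one concept on value, newness, and how well it was taught.
# The concepts and the 1-5 scale are placeholder assumptions.
from statistics import mean

responses = [
    # (concept, value, newness, how_well_taught)
    ("Spaced repetitions", 5, 4, 5),
    ("Spaced repetitions", 4, 5, 4),
    ("Retrieval practice", 5, 3, 3),
    ("Retrieval practice", 4, 4, 3),
]

for concept in sorted({r[0] for r in responses}):
    rows = [r for r in responses if r[0] == concept]
    print(f"{concept}: value={mean(r[1] for r in rows):.1f}, "
          f"newness={mean(r[2] for r in rows):.1f}, "
          f"taught={mean(r[3] for r in rows):.1f}")
```

The point of the extra column shows up immediately in a tally like this: a concept can be rated valuable and new yet poorly taught, which tells me exactly where to revise.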

Check it out and let me know what you think.

Okay, I made these suggestions on Twitter today, but because it's so ephemeral, I RT them here:

  1. A Better Twitter Query: "What's happening for you?"
  2. Use "MT" when you are suggesting a link to your own stuff.

1.
The Better Query encompasses the original query (still in use today): "What are you doing?" BUT it also captures the way a majority of my Twitter contacts use Twitter: to convey what they find exciting, useful, or notable. Sure, it harks back to the 60's and 70's "What's happening, man?" but those times weren't all bad.

2.
MT=Me Tweet. The MT idea helps people know whether the Tweeter is plugging their own work. This is useful in many ways. If you really like what someone you're following has to say in their longer off-Twitter conveyances, then you'll want to go there (and vice versa). It also enables the Tweeter to follow common interpersonal conventions by flagging self-promotion up front. For example, in normal conversation we might say, while looking apologetic, "Well, I know there are lots of perspectives on this, but here are my thoughts…" thus lubricating the social dialogue. Finally, some smart Twitter programmer will come up with a way to measure MTs, and then self-promoters can be labeled as such (see the sketch below).
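Here is a minimal sketch of what such a measure might look like. The sample tweets, the "MT " prefix convention, and the 50% threshold are my own placeholder assumptions, not an existing Twitter feature.

```python
# Hypothetical measure of "MT" (Me Tweet) usage per user.
# The 50% self-promotion threshold is an arbitrary placeholder.
def mt_ratio(tweets):
    """Fraction of a user's tweets that start with the 'MT ' prefix."""
    if not tweets:
        return 0.0
    mt_count = sum(1 for t in tweets if t.strip().upper().startswith("MT "))
    return mt_count / len(tweets)

def label_self_promoters(users_to_tweets, threshold=0.5):
    """Return the users whose MT ratio exceeds the threshold."""
    return [user for user, tweets in users_to_tweets.items()
            if mt_ratio(tweets) > threshold]

feeds = {
    "alice": ["MT New blog post on smile sheets",
              "MT My latest research report",
              "Lunch!"],
    "bob":   ["Great article by someone else",
              "What's happening for you?"],
}
print(label_self_promoters(feeds))  # -> ['alice']
```

Nothing fancy, but it shows how a simple self-reported convention becomes measurable, which is exactly what would let the community label chronic self-promoters.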

Please RT (Re-Tweet) these ideas if you like them.

(1290 characters, about 9 tweets.)

At the recent eLearning Guild conference in Orlando, I was asked to lead an Espresso Cafe roundtable discussion on a topic of interest.

My topic: The Pluses and Minuses of Social Media and User-generated Content.

I promised folks from my three sessions that I'd post all the results. Here they are:

Pluses:

  1. Users engaged.
  2. Relevant to the users.
  3. Not-distracting, real-world.
  4. Enables learning when training experts not available.
  5. Can augment online courses.
  6. Can capture water-cooler talk (that would have happened anyway).
  7. Opportunity to debunk inaccuracies.
  8. Capture institutional knowledge.
  9. Enables the use of internal experts for informal learning.
  10. Because it's informal, it can be more comfortable to use for people of different languages, cultures, or socio-economic groups.
  11. More of an equal exchange. Leveling the playing field. Creating more democratic or egalitarian organizations.
  12. Novel, interesting.
  13. Quick feedback on what doesn't work.
  14. Not corporate-down, so more likely to be attended to without skepticism, jadedness, etc.
  15. Opportunity to connect with customers.
  16. Keep up with younger workers coming in.
  17. Headquarters experts may not be as trusted as those who work on the ground.
  18. Timely, instant updates.
  19. Get details from someone who actually does the job.
  20. Emotional connection.
  21. Convenience.
  22. No geographic boundaries.
  23. RSS feeds enable more targeted info.
  24. Employees may be able to affect policy.
  25. Could make us improve our policies for fear of law suits. (Like this: stuff that's posted can be used in court. Organization then has impetus to make changes quickly).
  26. Questions coming first is a good learning design.
  27. Can give organization more of a sense of what's going on in the field.
  28. Cheap.
  29. Builds community if people are tackling serious issues together.
  30. Feeling engaged.
  31. Employees have instant access to experts.
  32. Another data source.
  33. Develop connections. Know who knows who AND who knows what.
  34. Enables virtual relationships.
  35. More reflective–learners have to reflect in order to write, which deepens learning.
  36. Wisdom of the crowd.
  37. Opens up links to other things. Sets agenda, letting people know that there are other things.
  38. Generate buzz.
  39. Smile sheets shared. (Rate my teacher. Rate my professor).
  40. Best practices are distributed.
  41. Will make things easier. Info at fingertips.

Minuses:

  1. Might have to get used to it.
  2. How do you make it usable?
  3. Duplicate information.
  4. How to make pertinent information instantly accessible.
  5. Opening up floodgates.
  6. Cultural hurdles and disconnects.
  7. Competes with other channels of information.
  8. Perhaps top-level buy-in is required.
  9. A big distraction. A time sink.
  10. Productivity drain.
  11. One more thing to do.
  12. We are still learning how to utilize wisely.
  13. May need support, maintenance, and the resources thereof.
  14. Information may not translate to behavior without directed support.
  15. How to confirm validity of content.
  16. Info can be used in lawsuits.
  17. Is the time beneficial?
  18. Danger of noise. Hard to get to best information.
  19. Time to create.
  20. Hard to measure. Maybe we're fooling ourselves.
  21. Could be incorrect/bad information.
  22. Could be offensive information.
  23. Must bring people up-to-speed on technology.
  24. Can create cliques.
  25. Time suck–filling up on candy.
  26. Dangers of giving censors power.
  27. Do these media self-select different types of people, biasing information gathered?
  28. Time is our most limited resource. The key organizational-productivity leverage point.
  29. Often implemented without planning, no marketing, no preparation, etc.
  30. Sometimes systems have no purpose. So costs/time not parlayed to maximum effect.
  31. Unnatural groups may not work, may have difficulties.
  32. One or a few can take over.
  33. Example: General in military told story of how soldiers posted how to defuse an IED. Info was wrong. 2 died. Enemies can use information too.
  34. Many see this as the be-all end-all, creating big blind spots, overzealous implementation, poor planning, poor focus.
  35. Potential permanence of information and/or systems.
  36. Personal vs. work issues may arise.

Thanks to all the folks who contributed to my discussions. It was kind of hard to hear, but here are the names to thank: Nancy, Leslie, Terra, Pat, Sonya, Betsy, Michael, David, David, Ann, Joyce, Nancy, Chris, Chris, Richard, John, Susan, Paula, John.

A new journal is forming to support applied e-learning research. Check it out. Get involved.