New Design for My Smile Sheet

Will’s 2016 Update: My latest thinking on smile sheets can be found in my book, Performance-Focused Smile Sheets: A Radical Rethinking of a Dangerous Art Form. See www.SmileSheets.com.

Some of what I wrote here, I still believe. Some I don’t. I’m keeping this up for historical purposes only.

 

Original from 2008

Smile sheets (the feedback forms we give learners after learning events) are an almost inevitable practice for training programs throughout the workplace learning industry. Residing at Donald Kirkpatrick’s 1st level—the Reaction level—smile sheets offer some benefits and some difficulties.

On the plus side, smile sheets (a) show the learners that we respect their thoughts and concerns, (b) provide us with customer satisfaction ratings (with the learners as customers), (c) hint at potential bright spots and trouble spots, and (d) enable us to make changes to improve later programs.

On the minus side, smile sheets (a) do not seem to correlate with learning or behavior (see the meta-analysis by Alliger, Tannenbaum, Bennett, Traver, and Shotland, 1997, which showed very weak correlations), (b) are often biased by being administered to learners in the learning context immediately after learning, and (c) are often analyzed in a manner that treats the data as more meaningful than it really is.

Based on these benefits and difficulties, I recently developed a new smile sheet (one that shows its teeth, so to speak) for my workshop Measuring and Creating Learning Transfer.

It has several advantages over traditional smile sheets:

1. Instead of asking learners to respond globally (which they are not very good at), it asks learners to respond to specific learning points covered in the learning intervention. This not only enables the learners to better calibrate their responses, it also gives them a spaced repetition (improving later memory retrieval on key learning points).

2. The new smile sheet enables me to capture data about the value of the individual key concepts so that changes can be made in future learning interventions.

3. The smile sheet has only a few overall ratings (when lots of separate ratings are used in traditional smile sheets, most of the time we don’t even analyze or use the data that is collected). There is space for comments on specifics, which obviates the need for many specific ratings and yields better data as well. The average value is highlighted, which helps the learners compare the current learning intervention to previous learning interventions they have experienced.

4. The smile sheet asks two critical questions: how likely the learned information is to be utilized on the job, and how likely it is to be shared with others. In some sense, this is where the rubber hits the road, because it asks whether the training is likely to have an impact where it was intended to have an impact.

5. The smile sheet includes some personal touches that assure the learners that the learning facilitator (trainer, professor, etc., or me in this case) will take the information seriously.

6. Finally, the smile sheet is just a starting point for getting feedback from learners. Learners are also sent a follow-up survey two weeks later, asking them to respond to a few short questions.

The learners get the following question only if they answer a previous question suggesting that they had not yet shared what they learned with others.
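The conditional branching described above (showing a follow-up question only when an earlier answer warrants it) is simple survey skip logic. Here is a minimal sketch; the field names and question wording are illustrative assumptions, not the actual survey items:

```python
# Minimal sketch of follow-up-survey skip logic: a question appears
# only when an earlier answer meets a condition. Field names and
# wording below are illustrative, not the actual survey items.

def build_followup(responses):
    """Return the list of follow-up questions a learner should see,
    given their answers so far (a dict of question id -> answer)."""
    questions = [
        "How much of what you learned have you put into practice?",
    ]
    # Branch: ask for obstacles only if the learner reports NOT yet
    # having shared what they learned with others.
    if responses.get("shared_with_others") == "no":
        questions.append(
            "What has kept you from sharing what you learned with others?"
        )
    return questions

# A learner who hasn't shared yet sees the extra question.
print(build_followup({"shared_with_others": "no"}))
```

The same pattern scales to any number of branches: each conditional question checks one or more earlier answers before being added to the survey.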

Why I Like My New Smile Sheet

I’m not going to pretend that I’ve created the perfect assessment for my one-day workshop. As I’ve said many times before, I don’t believe in “perfect assessments.” There are simply too many tradeoffs between precision and workability. Also, my new smile sheet and the follow-up survey are really only an improvement on the traditional smile sheet. So much more can be done, as I will detail below.

I like my new evaluation sheet and follow-up survey because they give me actionable information.

  • If my learners tell me that a concept provides little value, I can look for ways to make it valuable and relevant to them, or I can discard it.
  • If my learners find a concept particularly new and valuable, I can reinforce that concept and encourage implementation, or I can highlight this concept in other work that I do (providing value to others).
  • If my learners rate my workshop highly at the end of the day, but low after two weeks, I can figure out why and attempt to overcome the obstacles.
  • If my learners think they are likely to implement what they learned (or teach others) at the end of the day, but don’t follow through after two weeks, I can provide more reminders, encourage more management support, provide more practice to boost long-term retrieval, or provide a follow-up learning experience (maybe a working-learning experience).

I also like the evaluation practice because it supports learning and performance.

  • It provides a spaced repetition of the key learning concepts at the end of the learning event.
  • It provides a further spaced repetition of the key learning concepts at the beginning of the two-week survey.
  • It reminds learners at the end of the learning intervention that they are expected to put their learning into practice and share what they’ve learned with others.
  • It reminds them after 2 weeks back on the job that they are expected to put their learning into practice and share what they’ve learned with others.
  • It provides them with follow-up support 2 weeks out if they feel they need it.

Limitations

  • My new smile sheet and follow-up survey don’t tell me much about how people are actually using what I’ve taught them in their work. They could be implementing things perfectly or completely screwing things up. They might perfectly understand the learning points I was making or they may utterly misunderstand them.
  • The workshop is an open-enrollment workshop, so I don’t really have access to people on the job. When I run the workshop at a client’s site (as opposed to an open-enrollment format), there can be opportunities to actually put things into practice, give feedback, and provide additional information and support. This, by the way, not only improves my learners’ remembering and performance (and my clients’ benefits), it gives me even richer evaluation information than any smile sheet or survey could.
  • While the smile sheet and follow-up survey include the key learning points, they don’t assess retrieval of those learning points or even understanding.
  • Not everyone will complete the follow-up survey.
  • The design I mentioned not only doesn’t track learning, understanding, or retrieval; it also doesn’t compare results to anything except learners’ subjective expectations. If I were going to measure learning, performance, or even organizational results, I would consider control groups, pretests, etc.
  • There is no benchmarking data with other similar learning programs. I don’t know whether my learners are doing better than if they read a book, took a workshop with Ruth Clark, or went and got their master’s degree in learning design from Boise State.
  • The bottom line is that my smile sheet and follow-up survey are an improvement over most traditional smile sheets, but they certainly aren’t a complete solution.

Learning Measurement is a Critical Leverage Point

Learning Measurement provides us with a critical leverage point in the work that we do. If we don’t do good measurement, we’re not getting good feedback. If we don’t get good feedback, we’re not able to improve what we’re doing.

My workshop smile sheet and follow-up survey attempt to balance workability and information-gathering. If you find value in this approach, great, and feel free to use the links below to download my smile sheet so you can use it as a template for your evaluation needs. If you have suggestions for improvement, send me an email or leave a comment.

References

Alliger, G. M., Tannenbaum, S. I., Bennett, W. Jr., Traver, H., & Shotland, A. (1997). A meta-analysis of the relations among training criteria. Personnel Psychology, 50, 341-358.

45 replies
  1. Mark Sheppard says:

    Will: I like what you’ve done with this sheet, and it’s an improvement on some of the usual fare. I really like Bennett’s 7-level framework (shown in Rothwell & Cookson’s book) for more detailed evaluation work.

  2. Mark Frank says:

    Nice – and very attractively presented. It makes me realise the potential for integrating level 1 assessment into a course/event instead of just having students fill it in as an afterthought when they are keen to get away and avoid the traffic. You could use the level 1 assessment as the basis for a wrap-up and review – maybe you do?

  3. Marcus Barber says:

    Gidday Will
    Much improved and I’ve been heading down a similar path though not as succinctly as you’ve captured here.
    One possible area for improvement, especially given you intend to follow up. I ask the following question ‘What are two areas you believe you are likely to use this information in the next month?’
    When I run the follow-up sessions I can then ask participants ‘You said you’d probably use this in x & y – how did you find it?’
    You might be able to structure yours in such a way that you automatically populate the responses from the initial feedback into the future follow-up, so that you can track application of the learning in the real work environment
    Marcus 🙂

  4. Peter Williams says:

    This is great. I’m dealing with the problem now of improving the “smiley sheets” I use in online and face-to-face training. We all knew they were inadequate but there was a sense of resignation. Now, there’s hope. 🙂 Do you mind if I point other trainers in my workshops to your discussion here?

  5. Paul Left says:

    Great that you are asking for feedback on specific sections of content. Another way to approach this is to ask for feedback in relation to specific learning outcomes. That way it links to the purpose of the learning (and its assessment) as well as the content.

  6. Gaye Mara says:

    Hi, Will,
    Thanks for sharing this. It reminded me of your great best learning practices presentation to ISPI Potomac Chapter a few years ago.
    A lot of useful ideas for us here! One nitpicky suggestion: Use the same rating scheme (lowest # is worst, highest is best) throughout. When one comes to the “Is there value?” questions, that scheme reverses, with “1” as the best rating and “5” as the worst. That created some cognitive dissonance for me, and I suspect it would for others as well.
    Thanks for your good work and your generosity.

  7. Lisa Allen says:

    Thank you! I’m a librarian who works with executives in Human Resource Development. The sessions I teach feel more like training, and I was in desperate need of some ideas for capturing feedback. I really appreciate your sharing your documents!

  8. K Stoneman says:

    Dr. Thalheimer –
    A real improvement over the usual smile sheets. I’ve been toying with a similar idea of a “learning 360” for my participants to evaluate the usefulness of my training, focusing on applying the material, what percentage of the time they are using it, etc.
    I look forward to your continued thoughts on evaluation and needs assessment.

  9. Megan, Three Wheel Bike says:

    I’m dismayed by what seems a primary, possibly sole, focus on level one types of evaluation. That is – learner reaction to a particular learning opportunity. Why do we place so much importance on the subjective feedback of someone who might not even know why his boss signed him up for the class (for example)?

