New Design for My Smile Sheet
Will’s 2016 Update: My latest thinking on smile sheets can be found in my book, Performance-Focused Smile Sheets: A Radical Rethinking of a Dangerous Art Form. See www.SmileSheets.com.
Some of what I wrote here, I still believe. Some I don’t. I’m keeping this up for historical purposes only.
Original from 2008
Smile sheets (the feedback forms we give learners after learning events) are an almost inevitable practice for training programs throughout the workplace learning industry. Residing at Donald Kirkpatrick’s 1st level—the Reaction level—smile sheets offer some benefits and some difficulties.
On the plus side, smile sheets (a) show the learners that we respect their thoughts and concerns, (b) provide us with customer satisfaction ratings (with the learners as customers), (c) hint at potential bright spots and trouble spots, and (d) enable us to make changes to improve later programs.
On the minus side, smile sheets (a) do not seem to correlate with learning or behavior (see the meta-analysis by Alliger, Tannenbaum, Bennett, Traver, and Shotland, 1997, which found very weak correlations), (b) are often biased by being administered to learners in the learning context immediately after learning, and (c) are often analyzed in a manner that treats the data as more meaningful than it really is.
Based on these benefits and difficulties, I recently developed a new smile sheet (one that shows its teeth, so to speak. Smile!) for my workshop Measuring and Creating Learning Transfer.
It has several advantages over traditional smile sheets:
1. Instead of asking learners to respond globally (which they are not very good at), it asks learners to respond to specific learning points covered in the learning intervention. This not only enables the learners to better calibrate their responses; it also gives them a spaced repetition (improving later memory retrieval of key learning points).
2. The new smile sheet enables me to capture data about the value of the individual key concepts so that changes can be made in future learning interventions.
3. The smile sheet has only a few overall ratings. (When traditional smile sheets include many separate ratings, most of the time we never analyze or use the data that is collected.) Space for comments on specifics obviates the need for specific ratings and yields better data as well. The average value is highlighted, which helps the learners compare the current learning intervention to previous learning interventions they have experienced.
4. The smile sheet asks two critical questions: how likely it is that the information learned will be used on the job, and how likely it is that it will be shared with others. In some sense, this is where the rubber meets the road, because it asks whether the training is likely to have an impact where it was intended to have an impact.
5. The smile sheet includes some personal touches that reassure the learners that the learning facilitator (trainer, professor, etc., or me in this case) will take their feedback seriously.
6. Finally, the smile sheet is just a starting point for getting feedback from learners. They are also sent a follow-up survey 2 weeks later, asking them to respond to a few short questions. Here are a few of those questions.
The learners get the following question only if their answer to a previous question indicated that they had not yet shared what they learned with others.
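To make that branching concrete, here is a minimal sketch in Python of how such conditional logic might work if the follow-up survey were scripted. The question wording and the function name are my hypothetical stand-ins, not the actual survey items.

```python
# A minimal sketch (not the actual survey) of the follow-up survey's
# branching: the "why not?" question appears only for learners who
# indicated earlier in the survey that they had not yet shared
# what they learned with others.

def next_questions(shared_with_others: bool) -> list[str]:
    """Return the remaining questions, given the learner's answer to the
    earlier 'Have you shared what you learned?' question. The wording
    here is a hypothetical stand-in for the actual survey items."""
    questions = ["How valuable has the workshop proven over the past two weeks?"]
    if not shared_with_others:
        questions.append(
            "What has kept you from sharing what you learned with others?"
        )
    return questions

# A learner who has not yet shared gets the extra question.
print(next_questions(shared_with_others=False))
```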
Why I Like My New Smile Sheet
I’m not going to pretend that I’ve created the perfect assessment for my one-day workshop. As I’ve said many times before, I don’t believe in “perfect assessments.” There are simply too many tradeoffs between precision and workability. Also, my new smile sheet and the follow-up survey are really only an improvement on the traditional smile sheet. So much more can be done, as I will detail below.
I like my new evaluation sheet and follow-up survey because they give me actionable information.
- If my learners tell me that a concept provides little value, I can look for ways to make it valuable and relevant to them, or I can discard it.
- If my learners find a concept particularly new and valuable, I can reinforce that concept and encourage implementation, or I can highlight this concept in other work that I do (providing value to others).
- If my learners rate my workshop high at the end of the day but low after two weeks, I can figure out why and attempt to overcome the obstacles (a sketch of this comparison follows this list).
- If my learners think they are likely to implement what they learned (or teach others) at the end of the day but don't follow through after two weeks, I can provide more reminders, encourage more management support, provide more practice to boost long-term retrieval, or provide a follow-up learning experience (maybe a working-learning experience).
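To illustrate what acting on that end-of-day versus two-week comparison could look like, here is a small, hypothetical Python sketch that flags concepts whose average rating drops between the smile sheet and the follow-up survey. The 1-5 scale, the threshold, and the data are illustrative assumptions, not my actual forms or results.

```python
# Hypothetical sketch: flag learning concepts whose average rating drops
# between the end-of-day smile sheet and the two-week follow-up survey.
# The 1-5 scale and the 1.0-point threshold are illustrative assumptions.

from statistics import mean

def flag_fading_concepts(end_of_day: dict[str, list[int]],
                         two_weeks: dict[str, list[int]],
                         drop_threshold: float = 1.0) -> list[str]:
    """Return concepts whose average rating fell by at least drop_threshold."""
    flagged = []
    for concept, day_ratings in end_of_day.items():
        later_ratings = two_weeks.get(concept)
        if not later_ratings:
            continue  # no follow-up data collected for this concept
        if mean(day_ratings) - mean(later_ratings) >= drop_threshold:
            flagged.append(concept)
    return flagged

# Example with made-up data for two key learning points.
end_of_day = {"spaced repetition": [5, 4, 5], "learning transfer": [4, 5, 4]}
two_weeks  = {"spaced repetition": [4, 4, 5], "learning transfer": [3, 3, 2]}
print(flag_fading_concepts(end_of_day, two_weeks))
# -> ['learning transfer']: a cue to add reminders, practice, or follow-up.
```

A flagged concept wouldn't tell me what went wrong, only where to look; the "why" still comes from the learners' comments and follow-up conversations.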
I also like the evaluation practice because it supports learning and performance.
- It provides a spaced repetition of the key learning concepts at the end of the learning event.
- It provides a further spaced repetition of the key learning concepts at the beginning of the two-week survey.
- It reminds learners at the end of the learning intervention that they are expected to put their learning into practice and share what they’ve learned with others.
- It reminds them after 2 weeks back on the job that they are expected to put their learning into practice and share what they’ve learned with others.
- It provides them with follow-up support 2 weeks out if they feel they need it.
Limitations
- My new smile sheet and follow-up survey don’t tell me much about how people are actually using what I’ve taught them in their work. They could be implementing things perfectly or completely screwing things up. They might perfectly understand the learning points I was making or they may utterly misunderstand them.
- The workshop is an open-enrollment workshop, so I don’t really have access to people on the job. When I run the workshop at a client’s site (as opposed to an open-enrollment format), there can be opportunities to actually put things into practice, give feedback, and provide additional information and support. This, by the way, not only improves my learners’ remembering and performance (and my clients’ benefits), it gives me even richer evaluation information than any smile sheet or survey could.
- While the smile sheet and follow-up survey include the key learning points, they don't assess retrieval, or even understanding, of those learning points.
- Not everyone will complete the follow-up survey.
- The design I mentioned not only doesn't track learning, understanding, or retrieval; it also doesn't compare results to anything except learners' subjective expectations. If I were going to measure learning, performance, or even organizational results, I would consider control groups, pretests, etc.
- There is no benchmarking data against other similar learning programs. I don't know whether my learners are doing better than if they had read a book, taken a workshop with Ruth Clark, or earned their master's degree in learning design from Boise State.
- The bottom line is that my smile sheet and follow-up survey are an improvement over most traditional smile sheets, but they certainly aren't a complete solution.
Learning Measurement is a Critical Leverage Point
Learning measurement provides us with a critical leverage point in the work that we do. If we don't do good measurement, we don't get good feedback. If we don't get good feedback, we can't improve what we're doing.
My workshop smile sheet and follow-up survey attempt to balance workability and information-gathering. If you find value in this approach, great, and feel free to use the links below to download my smile sheet so you can use it as a template for your evaluation needs. If you have suggestions for improvement, send me an email or leave a comment.
References
Alliger, G. M., Tannenbaum, S. I., Bennett, W. Jr., Traver, H., & Shotland, A. (1997). A meta-analysis of the relations among training criteria. Personnel Psychology, 50, 341-358.