Lectures are widely reviled for putting learners in a passive mode. On the other hand, lectures are relatively easy to implement, even with large numbers of learners. And regardless of the pluses and minuses, lectures are ubiquitous. While there aren’t many lectures in kindergarten, by third grade teachers are talking a lot and learners are listening. The college classroom is dominated by lecture. So are corporate training sessions, conference presentations, church sermons, public meetings, elder hostels, and the local library’s evening speaker series. Lectures aren’t going away anytime soon, nor should they. Like all tools for learning, they provide certain unique advantages and have certain unique limitations.
Lectures can be modified in different ways to increase the amount of active learning—to ensure that learners are more fully engaged, have a more robust understanding of the learning material, are more likely to remember what they learned, and are more likely to utilize the information at a later time.
One such method to increase active learning is the "response card." Response cards are provided to students so that each one can respond to instructor questions. Two types of response cards are available: (1) those that enable each learner to write his or her answer on the card (for example, with a dry-erase marker), and (2) those that enable learners to hold up preprinted answers (True or False; or A, B, C, or D, for example).
While not a lot of good research has been done on response cards, the research seems to suggest that compared with the traditional method of having students raise their hands in response to questions, response cards improve learners’ classroom engagement, the amount they learn, and the amount they retain after a delay (Marmolejo, Wilder, & Bradley, 2004; Gardner, Heward, & Grossi, 1994; Kellum, Carr, & Dozier, 2001; Narayan, Heward, Gardner, Courson, & Omness, 1990; Christle & Schuster, 2003). Learners generally prefer response cards to simple hand-raising. Most of the research has focused on K-12 classrooms, with some research done in community colleges. The research has tended to focus on relatively low-level information and has not tested the value of response cards on higher-order thinking skills.
Getting learners to actively respond in lectures is certainly a worthwhile goal. Research has been fairly conclusive that learners learn better when they are actively engaged in learning (Bransford, Brown, & Cocking, 1999). Response cards may be one tool in the arsenal of methods to generate learner engagement. Of course, electronic keypads can be used in a similar way, at a significantly increased cost, with perhaps some added benefits as well. Still, at less than $30 a classroom, response cards may be worth a try.
Personally, I’m skeptical that audiences in adult training situations would be open to response cards. While 87% of college students rated the cards highly (Marmolejo, Wilder, & Bradley, 2004), the corporate audiences I’ve worked with over the years might find them childish or unnecessary ("hey, why can’t we just raise our hands?"). On the other hand, electronic keypads are more likely to be accepted. Of course, such acceptance—whether we’re talking about response cards or electronic keypads—really depends on the relevance of the material and the questions used. If the questions are low-level rote memorization, adult audiences are likely to reject the instruction regardless of the technology employed.
Making lectures interactive has to be done with care. Adding questions and student responses can have negative consequences as well. When we ask questions, we signal to learners what to pay attention to. If we push our learners to think about low-level trivia, they will do that to the detriment of focusing on more important high-level concepts.
Limitations of the Research
The research on response cards tends to focus on low-level questions that are delivered all too frequently throughout lectures. Learners who have to answer a question every two minutes are being conditioned to focus on trivia, facts, and knowledge. Future research on response cards should focus on higher-level material in situations where more peer discussion is enabled.
Most of the research on response cards suffered from minor methodological difficulties (e.g., weaker-than-preferred comparison designs and small numbers of learners actually tracked) and ambiguity (e.g., in reading the research articles, it was often difficult to tell whether the in-class questions were repeated on the final quizzes—those used as dependent variables—and no inferential statistics were available to test hypotheses).
Bransford, J. D., Brown, A. L., & Cocking, R. R. (1999). How people learn: Brain, mind, experience, and school. Washington, DC: National Academy Press.
Christle, C. A., & Schuster, J. W. (2003). The effects of using response cards on student participation, academic achievement, and on-task behavior during whole-class, math instruction. Journal of Behavioral Education, 12(3), 147-165.
Gardner, R., Heward, W. L., & Grossi, T. A. (1994). Effects of response cards on student participation and academic achievement: A systematic replication with inner-city students during whole-class science instruction. Journal of Applied Behavior Analysis, 27, 63-71.
Kellum, K. K., Carr, J. E., & Dozier, C. L. (2001). Response-card instruction and student learning in a college classroom. Teaching of Psychology, 28(2), 101-104.
Marmolejo, E. K., Wilder, D. A., & Bradley, L. (2004). A preliminary analysis of the effects of response cards on student performance and participation in an upper division university course. Journal of Applied Behavior Analysis, 37, 405-410.
Narayan, J. S., Heward, W. L., Gardner, R., Courson, F. H., & Omness, C. K. (1990). Using response cards to increase student participation in an elementary classroom. Journal of Applied Behavior Analysis, 23, 483-490.