Monday, January 31, 2011

Peer Instruction Using Clickers

Guest blogger: Mazharul Huq

Our trouble with Middle States has raised very serious questions about assessment. Successful assessment goes hand in hand with good teaching techniques. The time has come for us to examine very carefully our teaching methods and the tools we use to assess student learning.

I have been using a variety of technologies in my teaching, ranging from online quizzes to PowerPoint presentations. Of course, the use of technology does not necessarily mean quality instruction. I must confess that, in spite of the technology, my teaching style is not very different from the style used about a hundred years ago for much smaller, more specialized audiences. However, it has been changing for the better over the last five to ten years.

The issues that have been troubling me also troubled Eric Mazur of Harvard University, and they led him to develop something called peer instruction. Peer instruction is built around concept tests: short conceptual questions on the subject being discussed. The students are first asked to answer a short question (usually a multiple-choice question) without any discussion with their peers. Then they are allowed to discuss the question with the students sitting next to them. After the discussion, they answer the question again. The instructor collects the data and analyzes it to determine the improvement in understanding. In the beginning, Eric Mazur used flash cards and a show of hands, which made data collection rather cumbersome and time consuming; he also had graduate assistants to help with classes of a few hundred students. Later on, he used clicker technology to automate data collection.
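Mazur's own analysis tools are not described here, but a minimal sketch of the kind of tally the clicker data supports might look like the following Python snippet. The student IDs, answer choices, and the normalized-gain line are purely illustrative assumptions, not data from any actual class.

    # Tally one concept question: compare the fraction of students answering
    # correctly before and after peer discussion (hypothetical data).

    def fraction_correct(answers, correct_choice):
        """Fraction of students whose answer matches the correct choice."""
        if not answers:
            return 0.0
        return sum(1 for a in answers.values() if a == correct_choice) / len(answers)

    # student ID -> chosen option, before and after discussion (made-up values)
    pre_discussion  = {"s01": "B", "s02": "C", "s03": "B", "s04": "A", "s05": "C"}
    post_discussion = {"s01": "C", "s02": "C", "s03": "C", "s04": "A", "s05": "C"}
    correct_choice = "C"

    pre = fraction_correct(pre_discussion, correct_choice)
    post = fraction_correct(post_discussion, correct_choice)
    print(f"Correct before discussion: {pre:.0%}")
    print(f"Correct after discussion:  {post:.0%}")
    if pre < 1:
        # Hake-style normalized gain for this single question
        print(f"Normalized gain: {(post - pre) / (1 - pre):.2f}")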

A year and a half ago, with the help of a faculty development grant, I developed a number of instruction modules for peer instruction in my General Physics class of about 30 students. A clicker is a hand-held device with a radio transmitter that communicates with a receiver connected to a computer at the instructor’s desk. Students log in with a username and password, which identifies them on the instructor’s computer. The software I used was Notebook, the standard software for the Smartboard. Of course, I did not have a Smartboard at that time, so I used Notebook with an overhead LED projector.

Each module consisted of five questions. Allowing three minutes for the initial response, three minutes for peer-to-peer discussion, and three minutes for the post-discussion response, I could get through a module quite comfortably in a 50-minute period. One great thing about the software was that I could selectively display the results instantaneously, including bar and pie charts of the class’s performance. One drawback was that I could not prevent students from changing their pre-discussion answer after the discussion; in practice, though, that happened in only one or two cases.
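For readers who want a sense of what such an instant display conveys, here is a rough sketch using matplotlib rather than the Notebook software itself; the vote counts are invented for illustration only.

    # Bar chart of pre- vs. post-discussion votes for one question
    # (hypothetical counts; Notebook produced similar charts automatically).
    import matplotlib.pyplot as plt

    choices = ["A", "B", "C", "D"]
    pre_counts = [6, 10, 9, 5]     # votes before peer discussion
    post_counts = [3, 5, 19, 3]    # votes after peer discussion

    x = range(len(choices))
    width = 0.4
    plt.bar([i - width / 2 for i in x], pre_counts, width, label="Before discussion")
    plt.bar([i + width / 2 for i in x], post_counts, width, label="After discussion")
    plt.xticks(list(x), choices)
    plt.ylabel("Number of students")
    plt.title("Concept test responses")
    plt.legend()
    plt.savefig("concept_test_responses.png")  # or plt.show() in an interactive session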

The results were quite encouraging, showing improvement in learning. However, there were a number of cases in which, after discussion with peers, students changed a correct answer to a wrong one. Students often have less confidence in themselves than in their friends. For example, even students who know Newton’s third law very well can fall into a trap with their peers and wrongly conclude: “When a heavy truck collides with a light car, the heavy truck exerts more force on the light car than the light car exerts on the heavy truck.”
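One way to see these reversals in the clicker data is to count answer transitions student by student. The sketch below assumes the same kind of per-student answer dictionaries as before; the data are again made up.

    # Count how students' answers moved between correct and wrong
    # across the peer discussion (hypothetical data).
    from collections import Counter

    pre_discussion  = {"s01": "B", "s02": "C", "s03": "C", "s04": "A", "s05": "C"}
    post_discussion = {"s01": "C", "s02": "C", "s03": "B", "s04": "A", "s05": "C"}
    correct_choice = "C"

    counts = Counter()
    for student, before in pre_discussion.items():
        after = post_discussion.get(student)
        if after is None:
            continue  # student missed the second round
        counts[("correct" if before == correct_choice else "wrong",
                "correct" if after == correct_choice else "wrong")] += 1

    for (before, after), n in sorted(counts.items()):
        print(f"{before} -> {after}: {n} student(s)")

The correct-to-wrong count is exactly the case described above: a student who had Newton’s third law right being talked out of it.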

In this season of assessments, clickers can be a great tool for assessing student learning outcomes. They can be used for everything from student surveys to quizzes and many other innovative assessments. I have a number of suggestions that could encourage faculty to use clickers in classroom instruction.

  • The receivers should be permanently attached to the computers interfaced with Smartboards, so that faculty do not have to carry a receiver and connect it to the desk computer or a laptop.

  • Each student should be issued a clicker that the student can use in all classes. A student is identified in a specific course by his or her username and password for that course. This would spare faculty from carrying 30 or more clickers to class, distributing them, and collecting them at the end of the class.

  • We should also look for better clickers and better software to improve the delivery mechanism.


I hope Nancy Evans will read this blog and implement these suggestions.

Reference: Eric Mazur, Peer Instruction, Prentice Hall, 1997.

1 comment:

  1. This is interesting and can help with assessment, but I think our problem with assessment is more fundamental: it has to do with understanding the process. Once we all understand the process, we can design our assessment accordingly, and the software we use will be more important than the hardware.
