

‘CLICKERS’ (STUDENT RESPONSE SYSTEMS) WORKSHOP INFORMATION


8 Steps to Successfully Running TurningPoint 2008 in the Classroom

1. Plug in the receiver.
2. Open TurningPoint by double-clicking the icon on the desktop.
3. Open your presentation from the Office button.
4. Select a Participant List, if you are tracking students.
5. Reset the session.
6. Run the presentation.
7. Save the session.
8. Generate reports.

Wilmington University Educational Technology

Clickers FAQs – Common Troubleshooting Tips

These tips assume PowerPoint 2007. If you are using a different version of Office, the TurningPoint toolbar will appear differently.

1. I can’t insert any slides and don’t see the TurningPoint tab on the menu bar.

Make sure you open TurningPoint first and then browse for your PPT file to open. DO NOT open Microsoft PowerPoint first, because you won’t see the TurningPoint tools.

2. I can't get my ResponseCard to work. I don’t see any responses.

If you are not receiving responses, check the following settings and programming.

Check whether your receiver is registering in TurningPoint.

o Select Tools from your TurningPoint toolbar.
o Select Settings.
o Select Response Device.
o The receiver will be displayed in the right-hand pane. It will list the receiver ID, version, and channel.
o Note the channel number. The default channel ID is 41.

Make sure your ResponseCard RF is on the same channel as your receiver.

o On the ResponseCard RF, press Ch/GO.
o Enter the channel number.
o Press Ch/GO again.

3. How do I know the receiver is working and plugged in properly?

To determine whether the USB receiver is working properly:

Connect the receiver to the USB port of the presentation computer. The computer detects the receiver and installs the proper drivers.
The computer will then show pop-up text indicating that the device is ready for use.
Under the Tools menu, select “Settings.”
Note the device’s Channel ID.
See FAQ #2.


4. I inserted a new slide with 4 answers, and the graph on the slide disappears. How do I get the graph to reappear?

Under the Insert Object menu, select “Charts, Vertical.”
The chart will reappear on the slide.

5. I closed the settings pane to the right of my slide. How do I get the settings pane to reappear?

Under the Tools menu, select “Enable Settings Pane.”
The Settings pane will now display to the right of the slide.


6. Can I add clipart to slides?

Yes, you can add clipart, pictures, sound, and even custom animation to each slide.

7. If I don’t have a receiver, how can I practice my PPT at home?

You can run a presentation by selecting “Keyboard Keys 0-9.”
When polling opens, type a number key representing an answer choice; the poll will register a response each time you type a number.
Make sure you switch back to “Response Devices” for actual clicker polling.

8. I don’t want to display the charts in my presentation. What do I do?

Select Tools from the TurningPoint toolbar.
Select Settings.
Select Presentation.
Make sure the All Settings radio button is selected.
In the right-hand pane, scroll to Chart Options.
Scroll to Review Only.
Change it to True.
Select Done.

If you do not wish to display the chart in your presentation, set the presentation to Review Only. This will keep the charts from populating during the presentation; however, you will still be able to see the charts in thumbnail view. You will also have the option of returning to the slide to reveal the chart during the presentation if you decide to reveal the results to your audience. This option is only available on the PC.


9. I have 10 students responding, but I only get 9 replies in the poll. How do I check whether all the clickers are working?

The instructor can check the devices before class begins. Ensure the device receiver is properly installed and the device to be tested is available.

To test for device communication:

1. Click Tools on the TurningPoint Ribbon and select Settings. The Settings window opens.
2. Select the Polling Test tab.
3. TurningPoint displays the Polling Test window. Use this window to verify that TurningPoint can receive responses from the devices.
4. Click Start Test.
5. Magnify the responses by placing a check mark in the Magnify Values box.
6. Press a key on each response device to be used. TurningPoint displays the Device ID, Channel, and key entry from each response device in the order in which they were tested.
7. Click End Test.
8. Click Done. The device communication check is now complete.

OR

Have the students participate: open polling, point to each student one at a time, and see whether a response is recorded.

If polling reveals a clicker that isn’t responding, reset its channel by pressing the CH button, entering the digits of the channel ID, and pressing the CH button again.

Still have questions?

Information: http://www.turningtechnologies.com/media/files/productfaqs/TurningPointPCFAQ.pdf

Product guides & manuals. (2010). Retrieved from http://www.turningtechnologies.com/responsesystemsupport/productguidesmanuals/


Turning Technologies
New Users for TurningPoint 4.2.3

Setting Up the Receiver

The ResponseCards communicate with TurningPoint using a radio frequency. The receiver accepts the signal produced by the individual ResponseCards and passes the response to TurningPoint.

The effective range of the RF Receiver is 200 feet (60 meters). The default channel for the RF Receiver is 41.

The receiver must be connected to the computer that will be used to run the presentation.

When RF Receivers are used near each other, each Receiver needs to be set on its own channel.

To change the channel on the receiver:

1 Click Tools on the TurningPoint Ribbon and select Settings.

2 Select Response Device in the upper left-hand corner of the Settings Window.

The channel number is listed under the category ResponseCard Channels.

3 Click the two-digit channel number and select a new channel. (See example below.)

If a receiver has not been detected by the computer, “Empty” will be listed under ResponseCard channels.

ResponseCards must be on the same radio frequency channel as the receiver in order to communicate properly.


Setting the ResponseCard Channel

Standard and LCD ResponseCard

1 Press the Channel (Ch) button.
2 Enter the two-digit channel number.
3 Press the Channel (Ch) button again.
4 A green light displays when the channel change is successful.

XR ResponseCard

1 Follow the menu to Change Channel.
2 Enter the two-digit channel number.
3 Press Enter.

Creating TurningPoint Slides

TurningPoint slides can be used to poll, generate discussion, assess the audience, and immediately view the results of the slide.

To create a basic TurningPoint slide:

1 Click Insert Slide on the TurningPoint Ribbon.
2 Select the desired chart slide type.
3 TurningPoint inserts the new slide into the presentation. The slide has three components:

• Question Area - where the question being asked is typed.
• Answer Area - where up to ten answer choices for the question are typed.
• Results Area - where the results of the slide are displayed.


Converting PowerPoint Slides to TurningPoint Slides

Question slides that were created in PowerPoint can be converted into interactive TurningPoint questions.

To convert a PowerPoint slide into a TurningPoint slide:

1 Begin with a PowerPoint Slide in “Title and Text” (Office 03) layout or “Title and Content” (Office 07) layout.

HINT: Title Area = Question; Text/Content Area = Bulleted Answer Choices

2 From the TurningPoint Ribbon, click Insert Object, mouse over Charts and select the desired chart type.

TurningPoint Settings Pane (Office 2007)

The Settings Pane is a shortcut for frequently used slide settings.

1 Click the eyeglasses icon to access TurningPoint Settings Window.

2 Click the drop down menu to select a Participant List.

3 Set Point Values. The default is 1 for a correct answer.

4 Set correct/incorrect answer choices.


TurningPoint Objects

TurningPoint offers several static and interactive objects that you can optionally add to slides:

• Charts - change the chart type on a slide
• Animated 2D/3D Charts - add dimension to slides; the chart is converted to a Flash object
• Countdown - sets a time limit for answering the question
• Grid - keeps track of how many participants have submitted a response
• Answer Now - a visual cue that the slide is a TurningPoint slide; can provide additional information to participants
• Response Counter - keeps track of how many participants have submitted a response and closes polling when all of the responses are received
• Correct Answer Indicator - displays the correct answer(s) to the audience after polling is closed
• Stats - adds statistical data to your slide
• Text Message - sends a message to ResponseWare users to reinforce the content and their participation

Participant Lists

Participant Lists identify the members of your audience who will use a response device to respond to questions during your presentation. Participant Lists make it possible to track individual results and participation.

• Participant List Wizard - used to create a list
• Import a Participant List - to bring a Participant List in from another location
• Edit a Participant List - to open a list and make changes
• Delete a Participant List - to delete a list from your computer
• Real-Time Registration Tool - to assign devices on the spot
• Participant List Display - to easily review entries in a Participant List


Resetting a Session

Prior to starting the presentation, you must reset the session or all slides.

• Session - clears any previously collected response data from the software and resets the charts
• Current Slide - resets the chart but does not clear any previously collected response data for the selected slide
• All Slides - resets the charts but does not clear any previously collected response data; new response data will be appended to the end of the session

Running a Presentation

To start a presentation, click the Slide Show tab and select “From Beginning” or “From Current Slide.” Clicking the Slide Show icon in the bottom right corner will also start the presentation.

Saving a Session

When a presentation is finished, the collected data may be saved by selecting Save Session.

To save a session:

1 Click Save Session on the TurningPoint Ribbon.
2 Select the desired save location and name the session file.
3 Click Save.


Generating Reports

TurningPoint offers several reports to review collected data.

To generate reports:

1 Click Tools on the TurningPoint Ribbon and select Reports.
2 Double-click a session file. To import a session file, click the folder icon and navigate to the location of the file.
3 Select the desired reports.
4 Click Generate Reports.
5 The reports generate in Microsoft Excel. If you selected multiple reports, one Excel file is created with a worksheet for each report.


Using Student Response Systems to Increase Motivation, Learning, and Knowledge Retention
by David J. Radosevich, Roger Salomon, Deirdre M. Radosevich, and Patricia Kahn

Advances in technology have transformed both students and their learning environments; the technological environment in which 21st-century learners have grown up means that their aptitudes, expectations, and learning styles are very different from those of their teachers (Oblinger and Oblinger 2005). These students expect that their educators will shift from traditional lecture-based teaching to a pedagogy that creates learning environments where students interact with the material, the instructor, and their peers (Dede 2005; Oblinger and Oblinger 2005). At the same time, instructors must integrate a variety of pedagogical approaches and strategies to create rich learning environments that can address cultural, demographic, and skill-based differences among students (Dunn and Griggs 2000) as well as individual learning styles and multiple intelligences (Gardner 1993; Gardner 1999; Gardner 2004).

Student response system (SRS) technology is one of the many tools available to help instructors create a rich and productive learning environment even within the framework of a traditional lecture-based lesson. The SRS presents questions to the class, prompts students to enter responses using a pocket-sized keypad transmitter (Figure 1), and provides aggregated feedback regarding student responses to the instructor. An SRS can be used to assess students’ comprehension of complex material, affording both the instructor and the students immediate feedback so that instruction can be tailored to student needs. Furthermore, the question-and-feedback process has the potential to promote greater student engagement in class discussions, and group activities in which students solve problems together and submit answers using the SRS can promote active learning. The primary goal of this study is to examine the extent to which SRS can impact student motivation and foster active learning.

Background

SRS has been used to enhance learning across several disciplines, including biology (El-Rady 2006), earth sciences (Greer and Heaney 2004), communications (Rice and Bunz 2006), and family and consumer science education (Gentry 2007). A number of studies have demonstrated the acceptability of SRS to students. While some students in one study reported not liking the fact that they cannot "hide" if the SRS is being used to take attendance, most participants reported enjoying the interaction and appreciating the dynamic feedback (Duncan 2006). Graduate students enrolled in two courses (Research Methods and Mediated Communication in Organizations) had similarly favorable reactions to the SRS, indicating that the system reinforced class material and aided in studying for exams (Rice and Bunz 2006).

The immediate feedback produced by SRS can also create more engagement among students. Master's students who initially gave an incorrect response to an SRS-administered question were more attentive to follow-up questions and corresponding explanations (Rice and Bunz 2006). Similarly, Pargas (2005) describes how class participation and collaboration increased when instructors used an SRS as an assessment tool by polling students and obtaining feedback and opinions on specific topics.

Researchers have also reported real pedagogical advantages to the use of SRS, although Duncan (2006) acknowledges that some instructors may find the technology a distraction because it requires them to do two things at once. Abrahamson (2002) describes how this technology can transform the classroom by helping instructors become more aware of students who are having problems with the material. The use of SRS provides opportunities for the instructor to receive immediate feedback, which allows for more focused instruction on the concepts that students have difficulty understanding (Demetry 2005).

http://www.innovateonline.info/index.php?view=article&id=449

While researchers and users of SRS generally indicate that the technology in combination with sound pedagogy can increase learners' motivation and satisfaction, most studies do not provide an empirical examination of those claims using an experimental design. Our study seeks to address this need by empirically examining the effects of SRS on student motivation, student interest, and learning outcomes in our organizational behavior class.

Student Response Systems and Pedagogy

Ample evidence from the learning and psychological literature suggests that providing more practice and feedback enhances the learning process (e.g., Kuh et al. 1994). Studies generally confirm that externally provided feedback enables learners to be more effective (Kulhavy and Stock 1989). Butler and Winne (1995) argue that decreasing the temporal spacing between the presentation of learning exercises and performance feedback may promote a deeper processing of the material by guiding the cognitive activities necessary to learn effectively. Allowing students opportunities to respond to questions and receive immediate feedback on their responses also gives them control over their own learning, which, in turn, facilitates comprehension (Locke and Latham 1990).

The provision of immediate feedback by SRS technology represents a significant advantage in light of the constraints that instructors may otherwise face. Instructors typically provide exposure to practice questions through study guides that are often included with the textbook. Leaving aside the question of whether students actually use these guides, one limitation of this format is that a significant amount of time must pass between the coverage of the relevant material in class and the student's review of the practice questions. Similarly, feedback in the classroom is usually provided by a graded exam or quiz that is returned some time after the test is completed, missing the opportunity to present immediate feedback in a way that would allow students to engage in a deeper process of knowledge construction (Butler and Winne 1995). SRS offers a technological solution to this pedagogical dilemma.

Methodology

Our study was designed to investigate the potential for SRS to increase student motivation and interest and to foster learning. We incorporated an SRS into one section of our organizational behavior class at Montclair State University, embedding multiple-choice questions at key points in the lecture; in turn, we taught another section of the same class without such technology. We then used a survey to compare both student groups in terms of their self-reported interest in the class and their performance expectations for an end-of-semester retention test, while also comparing both groups in terms of their actual performance on the retention test as well as on a midterm exam.

Implementing the SRS

After selecting and setting up an SRS (Exhibit 1), we employed it in the classroom by inserting question prompts in the PowerPoint presentations used during lectures; these prompts cued the instructor to toggle over to the SRS software application to display one or more multiple-choice questions (Figure 2). Students viewed the questions and entered their responses on their keypads within a specified period of time, with the


monitor indicating the frequency count of their responses as they did so (Figure 3). When the time limit expired, the correct answer was shown (Figure 4). Through this format we sought to determine whether the technology could provide sufficient real-time assessment of learning and whether it would allow students to engage in a deeper processing of the material by making adjustments to their knowledge construction.

The SRS also allowed individual student scores to be downloaded to a gradebook application (Figure 5) and offered various reporting options for individual item analysis (Figure 6).

Participants and Procedures

The 145 participants in this study all took the same undergraduate organizational behavior class. Half of the participants (n = 70) were in the control group that took the class in the fall semester without the use of the SRS. The second group (n = 75) took the class in the spring semester with the use of the SRS; this was the testing or "clicker" group. There were no meaningful statistical differences between mean SAT scores and ages for the two groups, suggesting that the two groups were comparable in ability at the beginning of the semester as well as in other demographic characteristics (Table 1).

The quasi-experimental design meant that the groups were not randomized but comprised students enrolled in specific courses through the normal student registration process. Both groups were taught by the same professor and received the same lectures and exams. The only difference between the two groups was that the SRS was used to present multiple-choice questions to the SRS group before and during the lecture. The questions focused on recall, recognition, and potential application of the material covered in the class. The control group had access to the same questions outside of class for independent review.

We established a detailed timeline of procedures for the study (Table 2). During weeks 1-7, both groups received the same lectures using the same PowerPoint slides. However, the SRS group was presented with multiple-choice questions on the material both before and during the lecture; the grades of these students were recorded in the electronic grade book, and they were also provided immediate feedback on their responses. Both the SRS and the control group took a paper-and-pencil midterm exam during week 8. For the remainder of the semester, weeks 9-13, the classes proceeded normally, creating a buffer of time to allow for more effective assessment of knowledge retention. During the final week of class in each semester, students in both groups took a survey in which they indicated their level of interest in the class and their expectation of success on a subsequent retention test (Exhibit 2). Each student then completed a retention test that included items from the midterm exam administered in week 8.

Results

Quantitative Evidence

In the main analyses, we compared the means, standard deviations, and t values for the SRS and the control group (Table 3). The SRS group averaged 28% on the pre-lecture questions and 66.67% on the questions presented during the lectures. Since the control group had access to these questions only outside of class, their responses were not recorded.

On the midterm exam, the SRS group (M = 82.72) scored higher than the control group (M = 78.83) by a statistically significant margin (t(143) = 2.40, p < .05). Thus, exposure to the multiple-choice questions and immediate feedback in class had an important effect on subsequent test performance. A more interesting


finding was the statistically significant (t(143) = 5.40, p < .01) difference between the SRS group (M = 48.47) and the control group (M = 34.86) on the retention test. Although the percentage of course material retained by both groups was not outstanding, using the SRS had an important influence on the extent to which students were able to remember the information six weeks after their midterm exam.

Additional analyses were performed to determine if exposure to the multiple-choice questions delivered with the SRS in class had any influence on students' interest in the class or their expectations for success on the retention test. On a seven-point Likert scale, the results showed that students in the SRS group (M = 4.13) had greater interest in the class than the control group (M = 3.51), which was statistically significant (t(143) = 2.28, p < .05). Similarly, there was a significant difference (t(143) = 2.78, p < .01) between the SRS group (M = 5.11) and the control group (M = 4.36) regarding student expectations for remembering the content from the midterm exam.

In summary, the results indicated that those students who used the SRS as an integral part of class reported greater interest in the class, higher expectations of success on a retention test, and higher levels of test performance on the midterm exam. More importantly, students who used the SRS were able to perform better on a knowledge retention test administered at the end of the semester, five weeks after the material was initially tested in a midterm exam.
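The significance thresholds reported above can be cross-checked from the t statistics and degrees of freedom alone (df = 70 + 75 - 2 = 143). A minimal sketch, not part of the original study materials: with df this large, the t distribution is very close to the standard normal, so the Python standard library's NormalDist gives a good approximation of the two-tailed p-values.

```python
# Sanity-check the reported significance levels from the t statistics alone.
# Values taken from the Results section; df = 70 + 75 - 2 = 143.
# For df this large, t(143) is well approximated by the standard normal,
# so the two-tailed p-value is roughly 2 * P(Z > |t|).
from statistics import NormalDist

_STD_NORMAL = NormalDist()  # standard normal approximation to t(143)

def two_tailed_p(t_stat: float) -> float:
    """Approximate two-tailed p-value for a large-df t statistic."""
    return 2 * (1 - _STD_NORMAL.cdf(abs(t_stat)))

# (comparison, t statistic, reported threshold)
reported = [
    ("midterm exam",          2.40, 0.05),
    ("retention test",        5.40, 0.01),
    ("class interest",        2.28, 0.05),
    ("retention expectation", 2.78, 0.01),
]

for label, t_stat, alpha in reported:
    p = two_tailed_p(t_stat)
    print(f"{label}: t(143) = {t_stat:.2f}, p ~= {p:.4f} (reported p < {alpha})")
```

Each approximate p-value falls below the threshold the article reports, so the stated significance levels are internally consistent with the t statistics.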

Qualitative Evidence

At the end of the semester, students in the SRS class were asked to provide anonymous feedback regarding the SRS as an attachment to the university-issued course evaluations. These comments generally noted increased attention and engagement, appreciation for the opportunity to practice for the test, and the usefulness of the feedback. This student comment was representative:

The clickers [SRS] were great! I could focus on the lecture more instead of daydreaming. Plus, I could compare myself with others. It was a relief to know that I wasn't the only one who did not know all of the answers. I was better prepared for the test because the clickers [SRS] constantly had me in the study-mindset. I only wish all my professors used clickers.

However, the instructor had both positive and negative reactions. On the one hand, the technology offered many benefits. The SRS was very engaging and made the class more interesting, and it was very easy to use in the classroom. The RF keypads functioned effectively since students did not have to point directly at the receiver. Finally, the ability to capture student responses in a gradebook and provide visual feedback to students was a distinct pedagogical advantage; the system allowed the instructor to monitor student learning in real time. As a result, misunderstandings could be addressed immediately and the instructor did not gloss over important material under the assumption that students understood the concepts.

On the other hand, learning the application and entering questions was time-consuming. Furthermore, the professor had to bring spare batteries to class. We concur with Duncan's (2006) recommendation that ample time should be provided for both the student and the instructor to get used to the teaching and learning environment using this technology.

Discussion

Our findings support previous SRS research that demonstrates the benefit of this technology in terms of


student motivation and engagement. For example, our findings are consistent with both Duncan (2006) and Rice and Bunz (2006), all of whom found positive benefits for students in terms of making class more interesting and aiding in exam preparation. However, our findings go beyond previous research and make a unique contribution to the literature on SRS by demonstrating that students who used an SRS retained significantly more of their knowledge from the midterm than did the control group. Thus, the SRS positively impacted not only students’ expectations of success and interest in the class but also their retention of knowledge.

Overall, the findings from this study indicate that SRS can be effective in enhancing student engagement and learning. In the traditional classroom where lecture is the preferred mode of instruction, SRS technology can provide another mode of learning that may help students engage with the material and let instructors see where learning needs more support. These findings are consistent with the notion that externally provided feedback enables learners to be more effective (Kulhavy and Stock 1989). It may be the case, as Butler and Winne (1995) have proposed, that using SRS to provide feedback immediately after the learning exercise may afford students the opportunity to engage in a deeper learning process than is typically experienced in the classroom. That is, the feedback provided by SRS may facilitate more effective comprehension.

Although our study employed a quasi-experimental design to address the impact of SRS on learning outcomes, there were some limitations that must be addressed. For example, the primary distinction between the treatment group and the control group was the presentation of multiple-choice questions in class to the SRS group. The control group was not made responsible for reviewing the multiple-choice questions outside of class. It may be possible that making time in class for these questions to be delivered to the control group, even without an SRS, would have affected the results. Nonetheless, this delivery option still would not have afforded visual and normative feedback as efficiently as the SRS did.

Conclusion

This study provided empirical evidence that SRS may play an important role in increasing student engagement and interest, improving performance on traditional exams, increasing confidence in remembering the material, and, most importantly, increasing retention of that material. It is important to note that the SRS may not necessarily be the most innovative technology available to educators, but this study demonstrates that it is an effective technology when supported by sound learning principles.

Future research could expand on our findings by examining the impact of SRS in different disciplines or using different types of questions beyond factual multiple-choice questions. Alternatively, researchers may also consider the impact of other technologies that can provide immediate feedback. For example, cell phone technology is nearly ubiquitous, easy to use, and inexpensive. This technology, like the SRS, offers users the ability to participate in polling activities. Future research endeavors could help determine if one system is more advantageous than the other.

Technology has become a staple of the 21st-century learning environment, and as technology changes, so do the opportunities for instructors to empower students to engage in successful learning. Based on the findings from this study, SRS offers one such opportunity for educators to adapt to the changing learning environment.

References

Abrahamson, A. L. 2002. An overview of teaching and learning research with classroom communication systems (CCSs). http://www.bedu.com/Publications/Samos.html (accessed September 29, 2009). Archived at http://www.webcitation.org/5bCxqakbF.

Butler, D. L., and P. H. Winne. 1995. Feedback and self-regulated learning: A theoretical synthesis. Review of Educational Research 65 (3): 245-281.

Dede, C. 2005. Planning for neomillennial learning styles. EDUCAUSE Quarterly 28 (1): 7-12. http://connect.educause.edu/Library/EDUCAUSE+Quarterly/PlanningforNeomillennialL/39899 (accessed September 29, 2008). Archived at http://www.webcitation.org/5bCxSXfVw.

Demetry, C. 2005. Use of educational technology to transform the 50-minute lecture: Is student response dependent on learning style? In Proceedings of the 2005 American Society for Engineering Education Annual Conference and Exposition. Washington, DC: ASEE. http://www.asee.org/acPapers/2005-1434_Final.pdf (accessed September 29, 2008). Archived at http://www.webcitation.org/5bCx9XcNY.

Duncan, D. 2006. Clickers: New teaching aid with exceptional promise. Astronomy Education Review 5 (1): 70-88. http://aer.noao.edu/cgi-bin/article.pl?id=194 (accessed September 29, 2008). Archived at http://www.webcitation.org/5bCwiTCCG.

Dunn, R., and S. Griggs. 2000. Practical approaches to using learning styles in higher education. Westport, CT: Greenwood Publishing Group, Inc.

El-Rady, J. 2006. To click or not to click: That's the question. Innovate 2 (4). http://www.innovateonline.info/index.php?view=article&id=171 (accessed September 29, 2008). Archived at http://www.webcitation.org/5bCw2heHk.

Gardner, H. 1993. Frames of mind. New York: Basic Books.

Gardner, H. 1999. Intelligence reframed: Multiple intelligences for the 21st century. New York: Basic Books.

Gardner, H. 2004. Changing minds: The art and science of changing our own and other people’s minds. Boston, MA: Harvard Business School Publishing.

Gentry, D. B. 2007. Using audience response systems in FCS. Journal of Family and Consumer Sciences 99 (2): 42-44.

Greer, L., and P. J. Heaney. 2004. Real-time analysis of student comprehension: An assessment of electronic student response technology in an introductory earth science course. Journal of Geoscience Education 52 (4): 345-351.

Kuh, G. D., K. B. Douglas, J. P. Lund, and J. Ramin-Gyurnek. 1994. Student learning outside the classroom: Transcending artificial boundaries. ASHE-ERIC Higher Education Report No. 8. Washington, DC: George Washington University, School of Education and Human Development. http://www.ericdigests.org/1996-4/student.htm (accessed September 29, 2008). Archived at http://www.webcitation.org/5bCvWc62g.

Kulhavy, R. W., and W. A. Stock. 1989. Feedback and written instruction: The place of response certitude.Educational Psychology Review 1:279-308.

Locke, E. A., and G. P. Latham. 1990. A theory of goal setting and task performance. Englewood Cliffs, NJ:Prentice Hall.

Oblinger, D. G., and J. L. Oblinger. 2005. Educating the Net Generation. EDUCAUSE. http://www.educause.edu/ir/library/pdf/pub7101.pdf (accessed September 29, 2008). Archived at http://www.webcitation.org/5bCtqNo1i.


Pargas, R. P. 2005. Using MessageGrid to promote student collaboration. Paper presented at CELDA 2005, Porto, Portugal, December. http://www.cs.clemson.edu/%7Epargas/messagegrid/PargasMessageGridCELDA2005.pdf (accessed September 29, 2008). Archived at http://www.webcitation.org/5bCv9D8J2.

Rice, R., and U. Bunz. 2006. Evaluating a wireless course feedback system: The role of demographic,expertise, fluency, competency, and usage. Simile 6 (3): 3.

COPYRIGHT AND CITATION INFORMATION FOR THIS ARTICLE

This article may be reproduced and distributed for educational purposes if the following attribution is included in the document:

Note: This article was originally published in Innovate (http://www.innovateonline.info/) as: Radosevich, D., R. Salomon, D. Radosevich, and P. Kahn. 2008. Using Student Response Systems to Increase Motivation, Learning, and Knowledge Retention. Innovate 5 (1). http://www.innovateonline.info/index.php?view=article&id=449 (accessed September 30, 2008). The article is reprinted here with permission of the publisher, The Fischler School of Education and Human Services at Nova Southeastern University.

To find related articles, view the webcast, or comment publicly on this article in the discussion forums, please go to http://www.innovateonline.info/index.php?view=article&id=449 and select the appropriate function from the sidebar.
