
eCOTS 2012 - Virtual Poster #4


"A Two Semester Study: The Effects of an Audience Response System (ARS) on Achievement and Attitudes Towards Statistics in an Introductory Statistics Class"
with Megan Mocko, University of Florida; Brad Hartlaub, Kenyon College; and Tim Jacobbe, University of Florida

Hosted by: David Kline, The Ohio State University

Do Audience Response Systems (ARS) meet the needs of the modern student? Does the instant feedback an ARS provides improve the grades and attitudes of introductory statistics students? Audience Response Systems are handheld remotes that allow students to communicate and participate instantly in the classroom, responding anonymously to questions posed by the instructor. A study was conducted in the Spring semesters of 2009 and 2010 to test the impact of using these remotes. Each year, two sections of Introduction to Statistics I were taught identically except for the use of the remotes: one section used the handheld remotes to respond to questions, while the other discussed the answers to the questions as a class. The two years differed in design: in the first year, one to three unrelated questions were asked per class period, while in the second year the remotes were used to answer three related questions at a time. In addition to completing graded assessments, each student took the Survey of Attitudes Toward Statistics (SATS-36) as a pre- and post-test. The impact of the remotes on graded assessments and on attitudes toward statistics will be compared, and recommendations for future study will be offered.

Download slides


Having trouble viewing? Try: Download MPEG-4 Podcast (.mp4)

Comments

Paul Hewson:

I liked the way you were trying to do a comparative study (with/without), but did wonder if you could give a little more context (what kind of course, what kind of questions). Are you checking procedures or concepts with the questions, for example?

Also, wondered if you had any thoughts about developing question banks?

Megan Mocko:

Paul: The questions that I used were a mix of concept and procedure, more concept than procedure.
I have worked with Pearson on developing question banks for Agresti/Franklin's "Statistics: The Art and Science of Learning from Data".

Paul Hewson:

Dear Megan, thanks. I was curious because I think the SATS would reflect a reasoning/concept course. Maybe you're already doing well in that regard, so the clickers don't show an instant additional benefit.

Michael Granaas:

My limited use of clickers had me excited, but absent data I wasn't sure the excitement was warranted. Whatever the limitations of this study, it shows that we need a better understanding of this technology before it is widely adopted.
Because I like the tech, I hope positive evidence is forthcoming.

Jennifer Kaplan:

Michael - There has been research on clickers and peer instruction published in other science disciplines. You can find a list of books and papers to get you started at http://www.iclicker.com/community/research/

And, yes, I realize this list is hosted by a clicker company, so it will only contain papers that show the value of clickers (hence a biased sample), but as I said, it is a place to start.

Enjoy!

Jennifer Kaplan, University of Georgia

Megan Mocko:

I still believe that clickers can be an effective tool in the classroom; however, I want to give more thought to how to get the most benefit from them. I really liked Camille Peres's talk on Monday that used clickers with simulation. I think this might be an area where clickers could really shine!

Megan Mocko
University of Florida