Community Colleges | Feature
Campus Wide Clicker Implementation: A Faculty Choice
A Q&A with Jennifer Condon, Iowa Central CC
Iowa Central Community College implemented student response
systems campus wide this fall. With the start of the Fall semester this past
August 27, an order of 3,000 response devices began to find its way into the
pockets and backpacks of students. The initiative originated from the interests
of faculty, and the college has kept faculty choice as the key factor in the
deployment and use of the clickers in any course or program. All faculty and
instructors, across all its centers, have the option to implement the clickers
in the way they wish. Here, CT asks Jennifer Condon, the dean of Liberal Arts
& Sciences, about the progress of the clicker initiative and lessons learned.
Mary Grush: How did your initiative to implement clickers campus wide begin?
Jennifer Condon: The idea arose from faculty who were interested in implementing clickers.
I took the lead, and met with several faculty in the Liberal
Arts & Sciences along with instructors from a couple of career and
technology programs interested in implementing clickers. We reviewed several
products and met as a team to select one to move forward with.
So, the idea really came from faculty who expressed an interest in implementing
clickers. I just gave them the opportunity to explore it, so they could make
recommendations to their peers.
As a working group we decided to make a plan to implement
clickers campus wide, and to standardize on one clicker. Turning Technologies
was the vendor we selected. And we gave faculty the choice to implement--all
faculty at every campus location in all courses had this choice. Some
implementation decisions would be made at the program level; others at the
course or instructor level. But the decisions are all faculty-driven, and
certainly not dictated by administration.
Dr. Mark Taylor, a speaker at our January 2012 in-service, modeled the use of response clickers. Then we surveyed faculty in the spring to see how many thought they would like to implement them… there was an overwhelming 'yes'
response. We also had two programs on campus that had been using clickers for two years--those were fairly isolated initiatives, but we had them share their perspectives with all our faculty on what it did for their classes.
Our bookstore purchased and delivered the clickers to
students. The cost to students is $45, and the bookstore has agreed to buy them
back at $20 when students return them. The general consensus is that we will
sell a clicker to every student, and that if a student uses the clicker for
about four semesters, it is very cost-effective.
Grush: Are you getting a lot of implementations by faculty yet? You are only about ten weeks
into the first semester.
Condon: I would say yes, though some faculty or programs are implementing clickers in phases and have not used them much yet. Also, there are always a lot of
new hires in the summer, especially adjunct faculty, and many of them are still unaware of the clickers available on our campus. We've got a good start and expect many more implementations by Spring semester.
Grush: What are some of the challenges you are facing, and what things might help you through those challenges? Was there anything you wish you had done differently?
Condon: I think the Fall semester implementation has not been as smooth overall as we had
hoped--but I think we will see a different picture in the spring for both students and faculty.
Professional development and training is one of the biggest
hurdles--making it available and effective, with enough lead time. Some of the
faculty were a bit overwhelmed with the challenges of implementing the student
response clickers. I think they wanted implementation to be a little more
efficient--and I think a big part of that is a need for better training.
Grush: Did your vendor help with training?
Condon: Turning Technologies does offer training help for us. They attended our faculty
training in August, and we did work their training modules into our conference
for the day. But faculty were in and out, choosing between conference sessions.
I think it would have been more effective to offer Turning Technologies
training by itself on one day, when faculty could focus just on that training
for a longer period of time.
Grush: What are some other hurdles?
Condon: The main hurdles for most faculty are simply students who don't yet
have a clicker, or who forget to bring them--it's going to take a campus
environmental shift toward using the clickers and setting expectations. It's
more of a cultural change than you might expect.
I think clicker use will increase as instructors begin to
develop good strategies for using them. A lot of that will come from their own
sharing, but we will also have more training in January prior to Spring term,
and there will be a better sense of preparedness to implement the clickers.
Grush: What were some of the successes… instructional objectives that have already been served
by the student response system initiative?
Condon: It depends on the course.
Currently the feedback I have is anecdotal. In biology, the
students really appreciate that instant feedback for test reviews. It provides
a different approach in some of those prerequisite courses for programs like
nursing or other health science programs--prerequisite courses where students
often seem to struggle. Where the course content seems to be challenging and
seems to be driven by very objective assessments, the kind of feedback clickers
provide is very beneficial for the students.
For courses in the humanities, which are more discussion-based,
we struggle a little bit more with how to make the clicker apply without
seeming to be 'too much' or 'too little'. I think instructors of those courses
are trying to find that 'medium' place where clickers can be used meaningfully.
Math instructors are considering how to modify instruction
to make clickers more applicable in class. Those instructors are talking with
me and with each other, and they are already sharing ideas.
Grush: Are faculty using the clickers for testing and quizzing?
Condon: Some faculty have replaced paper quizzes with clicker quizzing--but very few so far. It's not integrated into our Moodle LMS yet, but that's in progress. We're hoping to
make testing and quizzing well integrated with the LMS and gradebook--where a
quiz could be uploaded to the LMS, administered in the LMS, and results
exported to the gradebook. All those pieces will be there at some point. For
now, most instructors tend to use the clickers for participation and for
non-graded reviews or non-graded 'pop' quizzes.
One of the most effective strategies I've seen, though, of
using the clickers was done as a pilot by the instructor of one of our 'college
experience' courses. She did her final exam with the clickers instead of a traditional
final. Because the clicker-based final was in effect also a review of all the
key points from the college experience course, any misconceptions or wrong
answers were corrected right there, on the spot, during the final. This gave
the instructor a last chance to review any points the students still weren't
comfortable with. This type of strategy might not work in another course
context, but it was a great strategy for the college experience course.
Grush: Are there any other lessons learned about
operating a very large student response system initiative?
Condon: Students still have to self-register their clicker for every course, so that's a slower
process than we were hoping for. But we are working on making it possible for
the instructor to access the roster to register the clickers for their
class--so that's one little correction that will be really helpful.
Also, students need an orientation that will work across all
their courses. The faculty for The College Experience have agreed to use the
student response clicker in that curriculum, providing a general orientation to
the device in preparation for use in other courses.
Grush: In terms of institutional assessment, is there going to be an institution wide initiative
to use the clickers in some formal way to support retention and student success?
Condon: We are now at the end of our latest 3-year strategic plan. For our next plan, we will
probably have division or departmental goals and objectives that support
broader outcomes like student retention or increased completion. We are not yet
in the process of creating the next strategic plan, but I expect student clicker
use to be one of the action plans that departments could implement.
Grush: Would that put more pressure on faculty to use clickers in prescribed ways, if you were to incorporate clickers in institutional assessments?
Condon: No, actually I don't think the application of such assessments would come from
administration. It would be generated through either department- or
course-specific initiatives. For example, if a science instructor finds that
clickers increase her students' test scores, or that there's a noticeable
difference in learning in the class, and she shares that, then I see a project
coming out of it for the whole department.
Grush: It's probably important to comment again on how the interest in the use of clickers at your
institution really came from faculty. As a dean, could you describe
higher-level administration’s commitment to help the institution and the
student through support of faculty technology initiatives?
Condon: When faculty have an interest in pursuing something new to impact engagement and learning in
the classroom, we're very supportive and look for ways we can help and assist
them. If faculty hadn't had the interest in clickers, we wouldn't be where we
are today. And this will continue to be a faculty choice.