How Do You Measure Success?
Lessons on Assessment and Evaluation from the LEAD Center

If you are a college instructor investing significant effort (and assuming some risk) in using innovative technologies in your courses, or an academic staff member who supports such instructors, you surely want to know whether, and if so why, students are learning better, more, or differently as a result of your efforts. Instructors can find those answers through assessment techniques; those who support instructors can help them by using evaluation methods.

Most people in the academic community tend to use the terms evaluation and assessment interchangeably. Researchers at the University of Wisconsin-Madison's Learning through Evaluation, Adaptation, and Dissemination (LEAD) Center use these terms to describe different, though related, processes.

Evaluation refers to the efforts of applied researchers to help clients achieve their goals for a project (whether a course, a program, or an organization). LEAD researchers help clients decide what data should be obtained, gather and analyze that data, and then report the resulting information in formats that the clients can use to improve their project and/or demonstrate its value to others.

Assessment refers to faculty efforts to obtain information about how and what students are learning in order to improve their teaching and/or to demonstrate to others the degree to which students have accomplished the learning goals for a course. At the LEAD Center, both evaluation and assessment entail the following processes:

  • Articulation of goals
  • Specification of the strategies intended to achieve these goals and the reasons that these strategies are expected to be effective
  • Agreement on the evidence that will convince specified individuals that the strategies have achieved the goals
  • Gathering, interpretation, and use of information.

Evaluation Processes at the LEAD Center

If you have the opportunity to work with LEAD evaluators, they will help you through these four processes, as the faculty who teach the sophomore-level engineering course CS 310 at the University of Wisconsin-Madison will tell you. LEAD is currently evaluating CS 310, a course that focuses on teaching students how to solve engineering problems using a variety of computer applications. For the first time this past fall, all lectures, exercises, and course materials for CS 310 were available online through the course's Web page and a newly developed online video-presentation application called eTEACH, which lets students watch and review lectures and related materials on their own time and at their own pace. Live lectures have been replaced by team labs, where students engage in small-group problem solving. The results of LEAD's year-long formative evaluation will be used to improve both the course and the eTEACH application itself.

The LEAD Center works only with clients who undertake large, long-term projects and have a strong commitment to evaluation. Once the center has determined that a client meets these criteria, LEAD researchers meet with the client to discuss key questions such as:

  • Which goals for student learning or project success does the client want to evaluate?
  • What will the client and the client's colleagues accept as evidence that the goals have been achieved?
  • How much emphasis should be placed on understanding student learning processes and the organizational and cultural factors associated with project success?
  • What data-gathering methods are feasible for obtaining information about both processes and outcomes?
  • Given the limitations of the research design, funding, and timetable, what kind of formative and summative feedback processes and products will optimize the achievement of goals?

During the project, researchers at LEAD use a combination of qualitative and quantitative social science methods, with strong emphasis on inductive analysis. Data-gathering methods include surveys, open-ended structured interviews and focus groups, observations and video recordings, and longitudinal student databases.

Assessment Resources

Drawing on expertise gained in evaluating college-level educational improvement efforts, several LEAD researchers have worked with the National Institute for Science Education's College Level One (CL-1) team to produce assessment resources that help faculty ascertain how well their strategies for improving student learning are working. The Field-tested Learning Assessment Guide (FLAG) and the Student Assessment of Learning Gains (SALG) instrument are located at the CL-1 Web site: www.wcer.wisc.edu/nise/cl1/.

The FLAG provides a collection of classroom assessment strategies that rests on a strong foundation of empirical research and has been tested by extensive use in the classroom. The FLAG is designed around Angelo and Cross's concept of "Classroom Assessment Techniques" (CATs). These are self-instructional modules that introduce techniques for assessing progress toward conceptual, attitudinal, and performance-based course goals in science, mathematics, engineering, and technology (SMET) disciplines.

At the invitation of the CL-1 team, a national group of leading SMET assessment scholars took up the challenge of developing CATs for their assessment specialties, resulting in a set of twelve field-tested and evaluated CATs. The FLAG also includes an introductory primer, an interactive engine that links faculty goals with the most appropriate assessment techniques, and a searchable database of assessment tools, which continues to be expanded.

A particularly popular CAT within the FLAG is the Student Assessment of Learning Gains (SALG), which uses the Web to offer faculty a quick and easy way to obtain both mid- and end-of-semester feedback from students. The SALG is accessible to anyone with a browser and is offered as a free service.

The LT2 Web Site

LEAD researchers also have enjoyed the opportunity to develop case studies for the College Level One team's faculty development resource on effective use of learning technology. This resource, Learning Through Technology (LT2), is located at www.wcer.wisc.edu/nise/cl1/.

LT2 is designed to answer questions such as, "What can I do with learning technology that I can't do now? What are the nuts and bolts of using learning technology? How can I use learning technology so that my students really learn?" In particular, it seeks to serve SMET educators who believe it is important to develop the ranks of future scientists and a technical workforce, prepare teachers to be scientifically knowledgeable, and help all students become scientifically literate members of our society by making appropriate—indeed, transformative—use of the new computer-based learning technologies. The LT2 site is not designed to serve individuals seeking resources on distance learning or on how to translate traditional course content into electronic formats. It offers in-depth case studies, lively first-person accounts, "hallway conversations" about technology, and links to articles and more resources, including a taxonomy of learning technologies.

Using the LEAD Center

UW-Madison established the LEAD Center in fall 1994 to provide third-party evaluation research in support of educational improvement efforts at both the undergraduate and graduate levels. The center takes a client-driven, student-focused approach to evaluation research. LEAD clients are faculty or staff at UW-Madison or at institutions collaborating with the UW. Furthermore, they are individuals who:

  • Can provide or work with LEAD to obtain the resources—usually grants—to pay the full cost of the evaluation research
  • Have well-articulated goals for deeper and more relevant student learning
  • Are developing and testing more effective strategies for achieving these goals
  • Are committed to obtaining and using feedback on student learning experiences and outcomes to improve teaching and fine-tune goals
  • Seek to understand the various factors that are necessary to more effectively institutionalize and disseminate their efforts.

For more information about LEAD and the evaluation and assessment projects affiliated with the center, visit www.cae.wisc.edu/~lead/.

References

T.A. Angelo and K.P. Cross, Classroom Assessment Techniques, Jossey-Bass Publishers (1993).
D. Hestenes, M. Wells, and G. Swackhamer, "Force Concept Inventory," The Physics Teacher, 30, 141-158 (1992).

Assessment and Technology in Physics

Curt Hieggelke is a national leader in the development, use, and dissemination of innovative computer-enhanced introductory physics teaching methods. A "tekkie" from way back, Hieggelke has transformed his introductory physics courses at Joliet Junior College in Illinois into meaningful and exciting learning experiences for his students—whether they are aspiring engineers, scientists, health professionals, or non-science majors. Key to his success are (1) the use of computer-based labs that actively engage his students through real-time acquisition and analysis of data, connections to real-world events, and visualization and simulation, and (2) assessment practices that constantly inform Hieggelke of the value of his innovations.

In the late 1980s, Hieggelke realized that he might go beyond using computers merely for analysis and instead use them to transform the way his students learn physics. In particular, he was excited about the possibilities of using electronic probes that interface with a computer; such devices would enable his students to collect and analyze data themselves, fostering a predict-observe-explain learning process that Hieggelke felt was essential to getting his students to understand—not just memorize and regurgitate—important physics concepts.

After obtaining computers for his students, Hieggelke searched for—and helped develop—a second generation of software tools that would enable the active learning environment he sought. He explains, "The first generation of computer technology was 'do the old lab experiment, hook a computer to it, and let the computer do graphing or fitting.'" The second generation demands the active engagement of students through predicting, observing, and explaining. "Students are really engaged with the experiments," he says. "After they set them up, they can interact with them and see exactly how things changed."

More specifically, these software tools allow students to visualize patterns in data, use graphical representations to avoid getting lost in the details of data setup and collection that accompany most lab activities, and experiment easily with different parameters in the same lab setup.
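The article includes no code, but a minimal sketch can make the predict-observe-explain cycle concrete. The Python snippet below is a hypothetical illustration, not Hieggelke's actual software (he uses commercial probeware such as the Vernier packages linked at the end of this piece); it simulates probe readings for a falling object and overlays a student's prior prediction so the mismatch is immediately visible for discussion:

  import numpy as np
  import matplotlib.pyplot as plt

  # Simulated "motion probe" readings: distance fallen vs. time, with noise.
  rng = np.random.default_rng(0)
  t = np.linspace(0.0, 2.0, 50)                               # time (s)
  observed = 0.5 * 9.8 * t**2 + rng.normal(0, 0.25, t.size)   # distance (m)

  # A student's prediction, committed to before the run:
  # "the object falls at a constant 5 m/s."
  predicted = 5.0 * t

  plt.plot(t, observed, ".", label="observed (simulated probe data)")
  plt.plot(t, predicted, "--", label="prediction: constant 5 m/s")
  plt.xlabel("time (s)")
  plt.ylabel("distance fallen (m)")
  plt.legend()
  plt.show()

In a real lab the readings would come from the probe itself; the pedagogical point is that students commit to a prediction first and then confront the data.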

                Pretest Score    Posttest Score
  Mean Score    49%              73%
  SD            16%              15%

SD = standard deviation; N = number of students

Average JJC Hake gain: .47
Average national Hake gain for students in traditional courses: .23
Average national Hake gain for students in interactive courses: .48
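The Hake gain cited here is the normalized gain widely used in physics education research: the fraction of the possible improvement that students actually achieve, g = (posttest - pretest) / (100 - pretest) for scores in percent. A quick Python sketch (not part of the original article) confirms that the JJC figure follows directly from the mean scores in the table:

  def hake_gain(pre_pct, post_pct):
      # Normalized gain: improvement achieved / improvement possible.
      return (post_pct - pre_pct) / (100.0 - pre_pct)

  jjc = hake_gain(49, 73)   # mean pre- and posttest scores from the table
  print(round(jjc, 2))      # 0.47 -- the reported JJC Hake gain

By this measure, Hieggelke's students (.47) sit essentially at the national average for interactive courses (.48) and roughly double the average for traditional courses (.23).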

Along with his computer-enhanced teaching strategies, Hieggelke also uses assessment activities and guided group work. He blends this computer-independent work with his computer-dependent activities into a synergistic framework for learning. "The hope is that [the computer work] will feed nicely into how I am interacting with the students and how the students are interacting with each other," he says, "even when we are not in lab."

Hieggelke places great importance on the use of formative assessment tools as learning activities. He believes that these activities are critical in directly fostering learning, and in providing information about student learning that instructors need in order to constantly adjust and improve teaching strategies.

The assessment activities Hieggelke uses fall into two general groups: activities that he calls "Tasks Inspired by Physics Education Research" (TIPERs), and pre-/posttests developed recently by physics faculty around the nation. The physics education research on which the TIPERs are based has established that some typical student beliefs are very difficult to modify, and has shown experimentally that certain teaching methods are more effective than others at getting students to make the appropriate modifications.

Hieggelke uses as assessment activities in his course the learning tasks and formats that these education researchers designed to pursue their research questions. Details about the TIPERs can be found at http://tycphysics.org. Hieggelke uses pre-/posttest assessment tools to check whether his teaching activities are reinforcing one another and providing students with challenging, engaging, and effective science learning experiences. His data show that when tested on conceptual understanding, his students perform significantly better than their counterparts in traditional physics courses (see table).

The Force Concept Inventory (FCI), designed by David Hestenes and colleagues, is widely used in the physics community to assess student understanding of the basic issues and concepts of Newtonian dynamics. Its questions are multiple-choice and written in non-technical language, with each correct answer embedded among attractive distractors that specifically target common misconceptions about physics.

Has Hieggelke succeeded in transforming his introductory physics course into meaningful and exciting learning experiences for his students? The proof is in the outcomes. According to Alan Van Heuvelen, a nationally recognized physics educator at Ohio State University's Department of Physics, "When you plot Hieggelke's students' posttest results on the Force Concept Inventory along with the results of students taught by other faculty who use the interactive engagement approach to physics, his students' outcomes compare favorably with those of students taught by Eric Mazur at Harvard University."

For more information, see the LT2 Web site, www.wcer.wisc.edu/nise/cl1/. For examples of some of the software that Hieggelke uses in his classes, see http://vernier.com/cmat/tst.html, http://vernier.com/cmat/rtp.html, and www.wiley.com/college/sokoloff-physics/.
