Assess or Guess?

10 ways you might be fooling yourself about assessment

David Singer is a visiting scholar at MIT in the Department of Brain and Cognitive Sciences. He is a specialist in assessment related to the application of education technology and brain science in education. Prior to his visiting post at MIT, he was the director of Education of the former American Nobel Committee. Singer received his doctorate in education from Boston University (MA). Here, the scholar and assessment authority takes a hard look at how we could be fooling ourselves, even with our best attempts at assessment.

EDITOR’S NOTE: At the Campus Technology 2006 conference in Boston (July 31-August 3), Singer will moderate a panel from MIT about the power of assessment and its proper implementation.

Want to be considered for Campus Technology’s Top 10? Send your countdown and a brief background/bio summary to [email protected]

10

Surveys do not accurately describe students’ real behaviors, attitudes, or learning.

  • The validity, and especially the reliability, of student surveys is often low; one standard reliability check is sketched below.
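
One common way to quantify that reliability problem is an internal-consistency statistic such as Cronbach’s alpha. The sketch below is illustrative only: the survey responses, the number of items, and the 0.7 rule of thumb are assumptions for the example, not figures from the article.

```python
# A minimal sketch of one common reliability check, Cronbach's alpha,
# applied to hypothetical Likert-scale survey responses.
import numpy as np

def cronbach_alpha(responses: np.ndarray) -> float:
    """responses: rows are students, columns are survey items."""
    n_items = responses.shape[1]
    item_variance_sum = responses.var(axis=0, ddof=1).sum()
    total_variance = responses.sum(axis=1).var(ddof=1)
    return (n_items / (n_items - 1)) * (1 - item_variance_sum / total_variance)

# Five students answering four 1-5 Likert items (hypothetical data).
survey = np.array([
    [4, 5, 4, 5],
    [2, 3, 2, 3],
    [5, 4, 5, 4],
    [3, 3, 2, 3],
    [1, 2, 1, 2],
])
alpha = cronbach_alpha(survey)
print(f"Cronbach's alpha: {alpha:.2f}")  # below ~0.7 is often taken as unreliable
```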
9

Only approximately 10 percent of research on the effect of education technologies focuses on student learning or performance.

  • The remaining 90 percent of such research bases its conclusions on student opinion surveys.
8

Assessment usually deals only with the mean change in student attitude or performance.

  • Assessments often assume that the shift in the distribution is uniform for all students when, in fact, students who perform below or above the mean may be affected quite differently by a new educational strategy, as the sketch below illustrates.
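
To make the point concrete, here is a minimal simulation (all numbers are hypothetical assumptions, not data from any study) in which an intervention helps students above the pre-test mean and hurts those below it, yet still produces a positive overall mean gain.

```python
# A minimal sketch of how an overall mean gain can hide opposite effects
# above and below the pre-test mean. All numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
pre = rng.normal(70, 10, size=200)       # pre-test scores

# Assume, for illustration, the intervention helps strong students (+6)
# and hurts weak ones (-2).
above = pre >= pre.mean()
gain = np.where(above, 6, -2)
post = pre + gain + rng.normal(0, 2, size=200)

delta = post - pre
print(f"Mean change, all students:   {delta.mean():+.1f}")
print(f"Mean change, above pre-mean: {delta[above].mean():+.1f}")
print(f"Mean change, below pre-mean: {delta[~above].mean():+.1f}")
```

The overall mean shows a modest improvement even though half the class got worse, which is exactly the pattern a mean-only assessment cannot see.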
7

Even once the methodologies and technologies are in place, it is hard to apply them consistently.

  • Instructors commonly fall back on old habits.
6

Even if the number of students in a research effort is large enough, the results cannot necessarily be generalized to all populations of students.

  • A study of 1,000 students in Biloxi, Mississippi, may have little relevance to 1,000 students in Bangor, Maine, or Beverly Hills, California.
5

Student motivation to try a new educational methodology can decrease the validity of the findings.

  • The so-called “novelty effect” increases student motivation without addressing the real research questions regarding the pedagogy of the educational methods; one way to probe for it is sketched below.
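
One way to look for a novelty effect, sketched below with entirely hypothetical numbers, is to compare learning gains measured while the method is still new against gains measured later in the term, after the novelty has worn off.

```python
# A minimal sketch of probing for a novelty effect: compare early-term
# gains (method still new) with late-term gains (novelty worn off).
# All gains here are hypothetical.
import numpy as np

rng = np.random.default_rng(7)
early_gains = rng.normal(8, 3, size=60)  # weeks 1-4 with the new method
late_gains = rng.normal(3, 3, size=60)   # weeks 10-14, same method

print(f"Early-term mean gain: {early_gains.mean():+.1f}")
print(f"Late-term mean gain:  {late_gains.mean():+.1f}")
# A large drop-off suggests the early gains reflected novelty, not pedagogy.
```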
4

Instructors’ attentiveness and interest in education research often detract from the validity of results.

  • An instructor’s enthusiasm to prove the benefits of a new approach to teaching can, in itself, improve students’ motivation, interest, and therefore performance.
3

There is not a generally well-accepted definition of what constitutes learning in education.

  • The basis for what a student must do to receive a given grade varies dramatically across schools and campuses.
2

Collaborative, interactive, and hands-on learning methods are not effective for all students and may be counterproductive for some.

  • Although these are popular and very effective methods for many students and some courses of study, others may not benefit because of differences in learning and personality styles, and certain students may, in fact, do worse because of them.
1

Improvements in learning often cannot be attributed to the methodologies or technologies that are being investigated.

  • Research in education has historically been of very poor quality, whether because of cost, difficulty, or investigators not knowing how to establish methodological controls; a minimal controlled design is sketched below.
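
The basic control the item calls for is not exotic. Below is a minimal sketch, with hypothetical scores, effect size, and group sizes as assumptions, of random assignment to treatment and control groups followed by a simple significance test.

```python
# A minimal sketch of a basic methodological control: randomly assign
# students to treatment and control, then compare post-test outcomes.
# All numbers are hypothetical.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical pre-existing ability for 100 students.
ability = rng.normal(70, 10, size=100)

# Random assignment is the key control: it balances ability across groups.
shuffled = rng.permutation(100)
treat_idx, ctrl_idx = shuffled[:50], shuffled[50:]

# Assume, for illustration, the new method adds about 3 points.
post_treat = ability[treat_idx] + 3 + rng.normal(0, 5, size=50)
post_ctrl = ability[ctrl_idx] + rng.normal(0, 5, size=50)

# Two-sample t-test: is the difference bigger than chance would explain?
t_stat, p_value = stats.ttest_ind(post_treat, post_ctrl)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```

Without the random assignment step, any observed difference could just as easily reflect who chose to enroll as what the methodology or technology actually did.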
