Choose One From Column B

On the road to better assessing learning and teaching performance in the higher ed environment, who’s adopting ePortfolios, personal response systems, even simulation-and-assessment technology — and why?

THINK BACK TO THE DAYS when you were a college or university student, furiously scribbling in your notebook as Dr. So-and-so lectured on, and you wondered: Is this going to be on the midterm?

Thankfully (from the student perspective, at least), on most campuses across the country, those days are gone. Today, in the age of real-time and anytime electronic assessment, it’s a safe bet that just about everything will "be on the test"—if the "test" is the ongoing assessment of learning and performance, that is. With technologies such as ePortfolios, in-class student response systems, and simulation/assessment software, schools can now assess their students’ ability to grasp course material from the outset, and instructors can gauge the effectiveness of their course delivery methods before students fall behind. The result: better understanding across the board.

The Power of Portfolios

Alongside paper-based tests, ePortfolios have become a highly effective method of assessment in recent years. At the most basic level, the technology relies on a website to which students can upload anything they feel represents their knowledge of a subject—papers, pictures, essays, and more. More sophisticated ePortfolio approaches are integrated into a campus assessment system that allows the collection of scores and the creation of reports.
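For those curious about the mechanics, here is a minimal sketch of the kind of data model such an integrated system might use. All class and field names are invented for illustration; this is not any vendor’s actual schema.

```python
# Minimal, hypothetical ePortfolio data model: students upload artifacts,
# evaluators attach rubric scores, and a report aggregates the results.
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class Artifact:
    title: str                  # e.g., "Term paper on supply chains"
    kind: str                   # "paper", "picture", "essay", ...
    scores: list[float] = field(default_factory=list)   # rubric scores

@dataclass
class Portfolio:
    student: str
    artifacts: list[Artifact] = field(default_factory=list)

    def report(self) -> dict[str, float]:
        """Average score per artifact: the raw material for campus reports."""
        return {a.title: mean(a.scores) for a in self.artifacts if a.scores}

portfolio = Portfolio("J. Doe")
portfolio.artifacts.append(Artifact("Term paper", "paper", scores=[3.0, 3.5]))
print(portfolio.report())   # {'Term paper': 3.25}
```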

Oral Roberts University (OK) is one such case. There, technologists have aligned the assessment of 16 proficiencies in five student learning outcomes within the curriculum. ORU chose ePortfolio technology from Chalk & Wire to handle this process. Today, according to Cal Easterling, director of institutional research and planning, the system enables faculty members to use rubrics to measure how well students are doing, and how that success correlates to other university competencies such as technological know-how, spirituality, and reading comprehension. "For the first time, [both students and instructors] are speaking the same language," he says. Easterling notes that the five outcomes the university has identified are: spiritually alive, intellectually alert, physically disciplined, socially adept, and professionally competent. He adds, "Students are now aware of what these outcomes are, and they understand how their performance in individual classes ultimately relates to each one."

According to Easterling, competencies were hammered out by faculty over a two-year period. Faculty members worked with students to determine which types of documents/artifacts would demonstrate proficiencies, focusing on artifacts that educators were already using, so that students wouldn’t feel like they were doing extra work. Then they built more than 800 rubrics to measure ORU’s five student outcomes, linking the relevant criteria of each rubric to what Chalk & Wire calls a "standard." With these standards in mind, students are scored on each outcome. (They are introduced to the five outcomes from the moment they attend freshman orientation.) Wording related to the outcomes is posted on flags all over the campus, and every time students log on to the ePortfolio system, they are reminded of these outcomes. Beginning next fall, the school will include ePortfolio scores along with grade reports.
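A rough sketch of that linkage, in the spirit of the ORU setup: each scored rubric criterion carries the outcome (the "standard") it maps to, and criterion-level scores roll up into one score per outcome. The sample scores and scale here are assumptions, not Chalk & Wire’s actual data model.

```python
# Hypothetical roll-up of criterion-level rubric scores into per-outcome
# scores. ORU's five outcomes are real; the scores and scale are invented.
from collections import defaultdict
from statistics import mean

# (outcome, score) pairs produced by faculty scoring rubric criteria
criterion_scores = [
    ("intellectually alert", 3.4),
    ("intellectually alert", 2.8),
    ("professionally competent", 3.9),
]

def outcome_scores(scores):
    """Aggregate criterion-level scores into one average per outcome."""
    by_outcome = defaultdict(list)
    for outcome, score in scores:
        by_outcome[outcome].append(score)
    return {o: round(mean(v), 2) for o, v in by_outcome.items()}

print(outcome_scores(criterion_scores))
# {'intellectually alert': 3.1, 'professionally competent': 3.9}
```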

As far as ePortfolio systems go, the ORU system is typical; what makes it stand out is the school’s decision to take it university-wide and to aggregate and disaggregate the data for a variety of purposes. The system was rolled out first in the School of Education in 2003, and then to the schools of nursing and engineering later that year. By 2004, the whole university had embraced the technology. Today, software for the ePortfolio system resides on a back-end server that serves the entire institution. On the front end, the interface appears in the form of a dashboard, so students can view a score for their performance on each outcome.

At Indiana University of Pennsylvania, only one program at the business school has embraced the use of ePortfolios— with a home-grown solution that Lucinda Willis, assistant professor of technology and training, has devised herself. Willis constructed the solution last fall as a way for programming and technology students to demonstrate to potential employers what they had learned at school about software and data management. (Ultimately, most of these students land technology support positions on corporate help desks.)

Is it an 'A' for Big B?

For Blackboard users: A quick update on the company’s new assessment offering.

IF IT SEEMS like course management vendor Blackboard is everywhere these days, that may be because it’s true. The latest example: the Blackboard Outcomes System, which the company unveiled in January. The system facilitates the continuous improvement of academic and administrative processes, with a focus on student learning, academic programs, and co-curricular activities. To do this, the technology leverages the capabilities and user experience of the Blackboard Academic Suite to help institutions make evidence-based decisions, streamline assessment processes, and engage students in their own learning.

The platform is organized around three stages of assessment: plan, measure, and improve. Within this paradigm, the system offers specific tools at each level, to help users meet the challenges of each stage. Some of these tools enable users to build rubrics and curriculum maps, manage standards, collect and evaluate artifacts of student learning, survey student attitudes and interests, and run online course evaluations. Michael Chasen, the company’s president and CEO, says the possibilities are practically endless.
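As one illustration of what a curriculum-mapping tool does, the sketch below relates courses to the outcomes they address and flags coverage gaps. The course and outcome names are invented; this is not Blackboard’s actual interface or API.

```python
# Hypothetical curriculum map: which courses address which program outcomes,
# plus a simple check for outcomes no course covers.
curriculum_map = {
    "ENGL 101": ["written communication"],
    "MATH 210": ["quantitative reasoning"],
    "EDUC 330": ["written communication", "quantitative reasoning"],
}

program_outcomes = {
    "written communication",
    "quantitative reasoning",
    "information literacy",
}

covered = {o for outcomes in curriculum_map.values() for o in outcomes}
print("Outcomes no course addresses:", program_outcomes - covered)
# Outcomes no course addresses: {'information literacy'}
```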

"Educational assessment is driven by the standards and culture specific to the discipline and school; one size does not fit all," he says. "The Blackboard Outcomes System is a flexible, customizable solution with tools that can be tailored to meet the needs of each individual course, program, department, college, system, or administrator."

Pilot programs of the new Blackboard technology were ongoing at press time. Seton Hall University (NJ) is one of these pilot schools, and though the institution has not had enough experience with the technology to discuss results, Paul Fisher, director of the school’s Teaching, Learning and Technology Center, says early indicators are promising in seven participating departments that include Education, Math, Diplomacy, and English.

In particular, Fisher appreciates the system’s ability to customize assessments. Sure, he says, Seton Hall is using the technology for traditional assessments, in order to gauge student knowledge. But the school also has developed student surveys about faculty performance, and has set up the system to quickly deliver responses to individual departments. This fast turnaround has enabled educators to modify their curricula and teaching styles on the fly, ultimately making the entire educational environment more responsive.

"If a faculty member is teaching a course again in the spring, he or she can make course corrections," Fisher says. "Being able to collect data, and having the opportunity to use that data to make changes to our processes more quickly than usual, has been invaluable."

Initially, the IUP system was rolled out to existing freshmen, but each year going forward, says Willis, incoming students will set up ePortfolios of their own. Under this schedule, every student will have a portfolio by 2010. Willis notes that for students who are accustomed to being assessed via traditional measures such as periodic exams, the new technology forces them to master practices and procedures as they are covered in class—not in cram fashion, right before an announced test. In this way, she says, there’s no chance for a student to "fudge" something he or she doesn’t know.

"The portfolio has helped students ‘thread’ their knowledge and prove they can do hands-on stuff before they learn how to assist someone [as part of a corporate help desk]," she says. "It’s like when you learn to bake a cake—you can’t do anything unless you prove that you understand what all the measurement tools are."

Willis notes that budget limitations prevented IUP from investing in a commercial product; instead, the school was able to utilize faculty members’ web development capabilities to build the system, using Adobe’s Dreamweaver. Willis estimates that she spent a few days’ worth of time building, modifying, and implementing the system, but no actual money was spent putting it together.

Real-Time Feedback

While ePortfolios certainly involve modern technology, assessment rarely takes place until an educator logs on to a student’s portfolio and inspects his or her work. For institutions looking for feedback in real time, that simply isn’t fast enough. To accomplish real-time assessment, these schools have turned to audience (or personal) response systems from vendors such as GTCO CalComp (which recently launched Interwrite Learning, a new company focused on its interactive learning solutions), Allegiance, and Turning Technologies.

Calhoun's preliminary research suggests that students who achieve high scores on the PRS system also perform better on exams. The technology has even improved student attendance.

These technologies rely upon handheld clickers and other devices that students use in class to answer questions as they come up. During a lecture or presentation, the instructor poses these questions and gives students a finite period of time to respond. Students enter their responses via the device, which transmits them wirelessly to the instructor’s computer at the front of the room. Software on the computer then tabulates the results, and an assessment of the entire class appears within seconds. Instructors can then analyze the results, look for low scores, and identify individual students who seem to be struggling.
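Conceptually, the tabulation step is straightforward. Below is a hypothetical sketch: collect one answer per device, count the distribution, and flag the devices that answered incorrectly. The device IDs and function names are invented, and the wireless transport is abstracted away.

```python
# Hypothetical tabulation step of a personal response system: one answer
# per clicker, an instant distribution, and a list of devices that missed.
from collections import Counter

def tabulate(responses: dict[str, str], correct: str):
    """responses maps clicker ID -> chosen answer for one question."""
    distribution = Counter(responses.values())
    missed = sorted(cid for cid, ans in responses.items() if ans != correct)
    return distribution, missed

responses = {"clicker-01": "B", "clicker-02": "B", "clicker-03": "D"}
dist, missed = tabulate(responses, correct="B")
print(dist)     # Counter({'B': 2, 'D': 1})
print(missed)   # ['clicker-03'] -- a student who may be struggling
```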

"Why should I wait a week or month to see if a student understands a concept, when I can assess that knowledge immediately?" asks Joe Calhoun, a professor of economics at Florida State University, which uses the Interwrite PRS system from GTCO CalComp. "This technology changes everything." Calhoun’s preliminary research on the technology suggests that there may be a positive correlation between PRS scores and exam scores: Students who participate in class and answer more PRS questions accurately perform better on exams. He adds that the technology has improved attendance, and that when students come to class, they are more involved than they used to be; instead of taking notes and leaving, students ask more questions because they know they’re being quizzed about things they don’t understand.

All of Calhoun’s students are required to purchase $50 "clicker" devices when they sign up for the class (the devices are used in many other departments, so students can use the same clicker for several classes during the same semester, or for multiple classes over several semesters). Students then register the devices on the school’s Blackboard course management system (CMS). (For more on Blackboard’s assessment offerings, see "Is It an ‘A’ for Big B?") At the beginning of every class, Calhoun downloads a class database from the Blackboard server, which tells him which students are using each clicker. After each question, the Interwrite software populates this database with information on students’ performance.
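A hypothetical sketch of that roster step: a class database downloaded before the lecture maps each clicker ID to a student, so per-question results can be credited to the right person. The field names are invented; this is not the actual Interwrite or Blackboard data format.

```python
# Hypothetical roster lookup: credit each clicker's answer to the student
# who registered it, and accumulate per-student performance.
roster = {"clicker-01": "student-ann", "clicker-02": "student-ben"}
performance: dict[str, list[bool]] = {s: [] for s in roster.values()}

def record_question(answers: dict[str, str], correct: str) -> None:
    for clicker_id, answer in answers.items():
        student = roster.get(clicker_id)
        if student is not None:          # ignore unregistered devices
            performance[student].append(answer == correct)

record_question({"clicker-01": "C", "clicker-02": "A"}, correct="C")
print(performance)   # {'student-ann': [True], 'student-ben': [False]}
```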

Perhaps the only drawback to audience response systems is their inability to snuff out cheating. Calhoun admits that a student easily could give his clicker to a buddy and instruct the buddy to answer accordingly, making it look as if both youngsters were in class. To blunt this temptation, he weights clicker responses to be worth about 10 percent of a student’s overall grade. Calhoun says this figure is sizable enough to provide an incentive to come to class, but not so big that it creates a strong incentive to cheat. True, he has caught at least two students cheating every semester since the tools were deployed, but considering that Calhoun teaches thousands of students a year, such minuscule numbers are not worth rethinking the entire system.
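The arithmetic behind that 10 percent figure is worth a quick sketch. Here, the remaining 90 percent is assumed to come from exams; the actual makeup of Calhoun’s grading scheme beyond the 10 percent is not specified.

```python
# Toy grade-weighting example: clicker participation counts for 10 percent
# of the final grade (per the article); the 90 percent exam share is assumed.
CLICKER_WEIGHT = 0.10
EXAM_WEIGHT = 0.90

def final_grade(clicker_pct: float, exam_pct: float) -> float:
    return CLICKER_WEIGHT * clicker_pct + EXAM_WEIGHT * exam_pct

# Skipping class entirely costs at most 10 points: enough to reward
# attendance, not enough to make cheating with a borrowed clicker pay off.
print(final_grade(clicker_pct=100, exam_pct=85))  # 86.5
print(final_grade(clicker_pct=0, exam_pct=85))    # 76.5
```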

"I can’t worry about cheating when, on the whole, my students come to class, use the technology to participate, and learn with it," he says, adding that since he started using the technology, student performance has generally been up. Calhoun’s preliminary data shows that in 32-question summative tests, students who used clickers during the lessons answered anywhere from two to four more questions correctly than students who did not use them. "As a professor, I can’t ask for anything more than that," he says.

Student response works a little differently at Ohio State University, where physics professor Bill Reay uses devices from Turning Technologies. Instead of requiring students to purchase the clickers, Reay bought them himself this past year, using $9,000 in grant money allocated to the acquisition of the technology. Now, when students enter Reay’s lecture, they pick up a clicker from a large bin. Throughout the class, as students answer Reay’s questions, his computer tabulates their responses anonymously.

Clearly, since there’s no way for this strategy to link individual students with individual clickers, assessment information is not used to evaluate individual student performance, nor is it connected to a CMS. Instead, Reay says he utilizes the technology exclusively to assess his own teaching. If, for instance, an assessment reveals that a majority of the class does not grasp a concept, Reay will spend more time on it. If an assessment reveals the class is comfortable with a topic, he can advance the lecture without holding it up for those few who may be lagging behind.
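A sketch of that anonymous, pacing-only use of the data follows; the 70 percent threshold is an invented example, not a rule Reay describes.

```python
# Hypothetical pacing check: no roster at all, just the anonymous answer
# distribution for one question and a move-on/re-teach decision.
from collections import Counter

def ready_to_move_on(answers: list[str], correct: str,
                     threshold: float = 0.70) -> bool:
    share_correct = Counter(answers)[correct] / len(answers)
    return share_correct >= threshold

answers = ["A", "C", "C", "C", "B", "C", "C"]   # anonymous clicker responses
if ready_to_move_on(answers, correct="C"):
    print("Most of the class has it: advance the lecture.")
else:
    print("Re-teach the concept before moving on.")
```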

"Sure, it helps the class. But the assessment also is a great way for me to get a sense of how many students understand a particular concept before I move on," he says. "It helps me get a sense of what needs explaining, and helps them understand everything as best as they can." The bottom line for OSU: The immediacy of the data provided by clickers has made them an invaluable supplement to traditional assessment methods.

What’s Next

The joystick-like quality of in-class clickers certainly has to be part of their appeal: Ask any college-aged youngster about the future of technology, and he or she is bound to sing the praises of video games. Cool graphics, live action—these are two reasons why analysts at Gartner and Forrester predict the video game industry will skyrocket in the coming years. It’s no surprise, then, that the future of assessment tools may look more like video games than traditional educational technologies.

"Unless you assess his performance in front of a simulator, you can’t tell at all how a student would do in a real-world situation." — Dan Smith, DePaul University

Case in point: Capstone and Foundation, two assessment tools for business schools that simulate the dog-eat-dog quality of the corporate world. The technology behind the two tools creates virtual realities designed to teach students real-world lessons about business. It is the brainchild of Dan Smith, a professor of business strategy at DePaul University (IL), who invented the tools in 1986, and then built his own company around them. Today, that company is called Management Simulations.

The Foundation game is simpler than Capstone, requiring less time for students to complete, though both games are appropriate for all business and economics courses. Foundation simulates how accounting works across all phases of corporate decision-making: research and development, marketing, production, human resources, and finance. Capstone tackles the same issues, but the game is made up of more complex and sophisticated stages. As if the two simulation tools were not enough, Smith says teachers can purchase a specialized assessment tool, Comp-XM, to evaluate students at the end of each simulation.

"The downside of traditional tests is that you can only use them to test a student’s cognitive space," says Smith. "Unless you assess his performance in front of a simulator, you can’t tell at all how a student would do in a real-world situation."

Each simulation opens with a memo informing participating students that they’ve been selected to lead a fictional company. From there, four years of balance sheets and initial public offerings (IPOs) play out over the course of four hours. Smith reports that in the classroom environment, roughly 20 percent of all students run their companies into the ground. The benefit, however, is that these failures happen in virtual reality, enabling students to learn from their mistakes so that the same errors don’t happen in the real world.
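To make the idea concrete, here is a toy survive-or-fail business round, loosely inspired by the description above. The demand curve, costs, and starting cash are entirely invented and bear no relation to Management Simulations’ actual engine.

```python
# Toy business simulation: one pricing decision per simulated year moves
# cash up or down; a company that runs out of cash fails, echoing the
# roughly-20-percent failure rate Smith cites. All numbers are invented.
def run_company(prices: list[float], cash: float = 100.0) -> bool:
    """Simulate four years of decisions; return True if the company survives."""
    for price in prices:                    # one decision per simulated year
        units = max(0.0, 100 - 2 * price)   # toy linear demand curve
        cash += units * (price - 10) - 200  # margin over unit cost, minus overhead
        if cash < 0:
            return False                    # ran the company into the ground
    return True

print(run_company([20, 22, 21, 23]))  # True  -- sensible pricing survives
print(run_company([5, 5, 5, 5]))      # False -- selling below cost fails fast
```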

However these simulators and their assessment tools are put to use, they are catching on: According to Smith, more than 60,000 students at 500 business schools participate in the simulations each year, and this year is on pace to match that figure. The technology is easy to set up, free to the institution, and relatively inexpensive for students, costing no more than $50 per license. Considering that the simulation/assessment technology could potentially yield a new crop of business moguls, the return on the investment is clear.

"In general, it’s hard to tell how a business student will perform, once he or she finishes business school," Smith says. "With this technology, everyone can begin to get an idea."
