
ePortfolio | Viewpoint

Reviewers Unhappy with Portfolio 'Stuff' Demand Evidence


“Enough is enough,” say faculty members reviewing portfolio reports that resemble scrapbooks. “Where is the analysis?” they ask. “Where is the thinking?” Evidence-based learning concepts offer a way to re-frame the portfolio process so it produces meaningful and assessable evidence of achievement.

An e-mail comment from one reviewer:

“In reviewing about 100-some-odd accreditation reports in the last few months, it has been useful in our work here at Washington State University to distinguish ‘stuff’ from evidence. We have adopted an understanding that evidence is material or data that has been analyzed and that can be used, as dictionary definitions state, as ‘proof.’ A student gathers ‘stuff’ in the ePortfolio, selects, reflects, etc., and presents evidence that makes a case (or not)… The use of this distinction has been indispensable here. An embarrassing amount of academic assessment work culminates in the presentation of ‘stuff’ that has not been analyzed--student evaluations, grades, pass rates, retention, etc. After reading these ‘self studies,’ we ask the stumping question--fine, but what have you learned? Much of the ‘evidence’ we review has been presented without thought or with the general assumption that it is somehow self-evident… But too often that kind of evidence has not focused on an issue or problem or question. It is evidence that provides proof of nothing. (And I am aware of and distinguishing here the research usage of proof, in which even a rigorous design with meticulous statistics that demonstrate a significant gain suggest but do not prove…)” [Gary Brown, Washington State University, private e-mail, 9-26-10, used with permission]

Accreditation reports sometimes include samples of student portfolio presentations, often a Web page with links to work collected in the portfolio and student summary comments. These presentations are supposed to show student progress toward learning outcomes. But they don’t. The reviewer quoted above was referring to portfolio collections that were not organized around any question; they were not evidence of anything, just “stuff.”

As portfolio implementations become common, there are growing pains. What might have been self-evident to the instructor who assigned the work that ended up in the portfolio--he or she would have known what question or assignment the student was responding to--is not self-evident to a later reviewer. And when students aggregate their best work from several courses, the evidentiary chain is left even further behind: What was the context for the original work? Why did the student do this work? And what does this work show?

Another colleague puts it this way:

“My recurring concern, as so much emphasis is put on portfolio and ePortfolio tools, is around the concurrent need for clear, effective attention to the learning assessment dimensions of any such tools. Portfolios have such an appeal, and it seems to me that this appeal can very easily overshadow any questions about what they significantly may lack, overlook, or avoid, in terms of the evidence-management challenge of learning assessment. My own background is as an anthropologist (cultural) and I often use the analogy of the archaeologist, collecting interesting ranges of appealing artifacts; if in the process, the artifacts are not collected systematically, and within a carefully understood matrix of ‘context,’ then the collection of artifacts becomes a mute gathering of interesting ‘stuff.’ The State archaeologist is often bombarded by avid, amateur collectors who empty their pockets of interesting artifacts, which because they are no longer in situ, remain mute and useless for understanding anything about their potential, deeper meaning.” [Brian Donohue-Lynch, Quinebaug Valley Community College, private e-mail 9-30-10, used with permission.]

What is portfolio evidence? Electronic portfolio technologies can store student work such as written assignments, photos of work on team assignments in or out of the classroom, diagrams, audio clips, and video clips--basically anything in digital form. Students can upload those files from their desktops, laptops, or even their smartphones. Collecting is easy. Interpreting and integrating the collection is hard.

The classic definition of the portfolio process has been “collect, reflect, select, present.” But portfolios have gone big time, and this legacy construct is insufficient for the high-stakes, longitudinal institutional and individual purposes portfolios serve today. Simply collecting a lot of “stuff” and showing it on a Web page supports no claim other than that you have done the work and, presumably, the instructor has accepted it. When students make a claim--for a grade on an assignment, a grade in a course, for a capstone requirement, for graduation, or for career purposes--they must also work within an evidence structure and process that is just as transparent as the scientific or legal process for using evidence.

Evidence in scientific experiments or in legal cases is tagged or annotated in some way: where and how it was found, when it was found or discovered, its relevance to the question at hand, and the people associated with it. Portfolio evidence can have a date stamp, is of course associated with the individual or group that created or discovered it, and can be and often is annotated with comments from the instructor and responses from students. But the other necessary steps in evidence authentication--where it was found or created, how it is relevant to a problem or question, how it fits with other evidence, and finally, what claim it supports--are usually absent. We have defined the process for the easy part, but not for the hard part.
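To make the distinction concrete, here is a minimal sketch, in Python, of how such annotations might be modeled. The field names (artifact, context, relevance, claim) are illustrative assumptions, not the schema of any particular portfolio platform; the point is only that an item lacking context, relevance, and a claim remains “stuff.”

# Illustrative data model for one annotated piece of portfolio evidence.
# Field names are hypothetical, not drawn from any real portfolio product.
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class EvidenceItem:
    artifact: str                  # file name or URL of the collected work
    created_on: date               # date stamp
    creators: List[str]            # individual or group who produced the work
    context: str                   # course, assignment, or question it responds to
    relevance: str                 # why this artifact bears on that question
    related_items: List[str] = field(default_factory=list)  # links to other evidence
    claim: Optional[str] = None    # the claim this evidence is offered to support
    annotations: List[str] = field(default_factory=list)    # instructor and student comments

    def is_evidence(self) -> bool:
        # An item counts as evidence only if it carries context, a statement
        # of relevance, and a claim; otherwise it is still just "stuff."
        return bool(self.context and self.relevance and self.claim)

if __name__ == "__main__":
    item = EvidenceItem(
        artifact="lab_report_3.pdf",
        created_on=date(2010, 9, 15),
        creators=["Student A"],
        context="BIO 201, Lab 3: enzyme kinetics",
        relevance="Shows the ability to design a controlled experiment",
        claim="Meets the program outcome on experimental design",
    )
    print(item.is_evidence())  # True: context, relevance, and claim are all present

Whether such a structure lives in a database, in metadata attached to an uploaded file, or simply in the student’s written framing matters less than that every item carries it.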

Portfolio practices seem to lack the last and most important requirements of academic proof. Portfolio success is entirely dependent on the degree of rigor in processing evidence. If we are moving away from testing as the sole or even primary means of judging student learning, and toward portfolio evidence, then we must describe and follow a more thorough process for using evidence. Academia, very fortunately, has centuries of experience in using evidence well and can now adapt that experience to address the objections voiced in the two e-mails quoted above.

Interestingly enough, evidence in student portfolios can also support claims made by faculty members in requests for promotion or tenure, or serve as evidence of performance during an annual review. The Scholarship of Teaching and Learning (SOTL) community has promoted the idea of teachers as researchers into their own teaching. Once students and faculty think of portfolios not as just another way to work as usual, but as evidence, an entirely new conceptual framework opens.

To build an evidence-based learning process, and thereby make portfolio collections more than just “stuff,” courses must be designed around essential questions that students gather evidence about. When I was learning about research as a graduate student, researchers told me, “The research question is the most important part of developing the research design. If you have the wrong question, your research will be of little value.” A question is a good place to start when designing a course or a sequence of courses as well.

Courses using portfolios need explicit questions for students to work with so that the evidence in their portfolios refers to something. And that referential context must be carried forward--it must remain in situ, as Professor Donohue-Lynch (quoted above) points out. We academics are not accustomed to designing explicit longitudinal instructional units; our expertise is in designing tacit ones: “If you take this sequence of courses, we can assume you know this, this, and this”--and no proof has been required that our assumptions are correct. But portfolios cannot be tacit because they have explicit ‘stuff’ in them. The context for this explicit ‘stuff’ must also be explicit, or the ‘stuff’ is useless. The articulation between courses must then be explicit as well. The portfolio collection from a previous course must be linked explicitly, intentionally, and integratively so that the collection does not degrade into ‘stuff.’

The very best portfolio implementation, therefore, is a complete, institution-wide one, so that this kind of explicit context continuity can be created for the benefit of students, faculty (for review purposes), reviewers, and assessors, as well as for accreditation review, institutional curriculum review, institutional transformation, and the improvement of learning.

