Grading OERs for Class

Social-networking tools and learning analytics can help educators evaluate OERs.

A vast amount of open content is now accessible to educators online, but there is little evaluation data and few guidelines on how to use it. Campus Technology asked Michael Cottam, associate dean of instructional design and new program development at Rio Salado College (AZ), how higher ed could better evaluate open educational resources (OERs) and help course designers leverage them more appropriately.

Campus Technology: How can course developers determine the potential effectiveness of an OER for their course?

Michael Cottam: There's such a wealth of material online under Creative Commons licensing. For an instructional designer, it's a matter of curating it and gathering it into something that will make sense for your particular learners in your particular class. Part of what an instructional designer has to do now is to find the best resources and fit them together in a way that allows learners to meet their objectives in a class.

CT: Is the role of instructional designers not only to guide faculty in how to use technology effectively, but also to help them identify and use OERs?

Cottam: I think it's both now. The skill set around using technology for instruction is certainly still required, but instructional designers also have to be very aware of the OERs that already exist, and guide faculty to use them in an effective way. To do that, we need to be able to evaluate OERs. I think that's a direction the OER community needs to take.

CT: How do you see this happening?

Cottam: The first thing is online feedback. As a consumer, when you investigate any product or service online, it's increasingly common to see other people's reviews of the product right there. Can we do the same thing with OER? To some extent, people have already been peer evaluating OER content. For example, MERLOT has been facilitating peer reviews for years. But we need to leverage social-networking tools more widely. The peer evaluation that happens naturally in an online social environment could inform instructional designers and faculty as they build courses. In a sense, it's the collision of OERs with social networking online that's going to make this work.

Part of the allure of the social web is that you can interact with anybody, anytime. As an academic, you may get feedback from many different perspectives--from swirling students, from practitioners in the professions, as well as from other academics. In a way, the chaos of the web can be informative and beneficial to us as designers and educators.

CT: What else will help evaluate OERs?

Cottam: The other very exciting piece that ties in with evaluating OERs is learning analytics. With learning analytics, I hope we can look more deeply into the effectiveness of specific learning designs and online learning materials--especially OERs--in our classes. Then we can make data-based decisions to improve our courses.

If there are OER learning objects on the web with common assessments, and we are able to gather data on their effectiveness for a large number of learners and institutions, the impact of design improvements can extend beyond a single class or section. Shared data and transparency about learning design and outcomes have the potential to change the way we approach student success.

Analytics have been used in the corporate sector for many years. The impact of analytics can be as great in education as it has been in business and marketing. By pairing learning analytics with a strong methodology, we will know that the results are valid and reliable. If those results are then broadly shared among institutions, they have the potential to effect widespread change in the acceptance and use of OERs.

Editor's note: Michael Cottam will give a hands-on workshop on interactive learning designs with new media at CT Forum, April 30-May 2, in Long Beach, CA.

About the Author

Mary Grush is Editor and Conference Program Director, Campus Technology.
