New Framework Offers Way to Validate OER Commitment, Sincerity

A new report proposes a framework for measuring open educational resource (OER) initiatives, particularly those promulgated by for-profit organizations. "Toward a Sustainable OER Ecosystem: The Case for OER Stewardship" has three purposes, according to its authors: to help ensure that "the OER community's values can be maintained as the movement scales"; to gauge the practices of "new entrants" to the OER field (especially those out to make money from it); and to build educators' confidence in participating in OER, including those who contribute their own materials and may be uncertain about how for-profit publishers will use them.

The CARE framework, as it's called, is applicable to all OER stakeholders — or "stewards," as the framework refers to them. Those include individuals, schools and affiliated organizations, both nonprofit and for-profit.

CARE is an acronym for four processes OER stewards need to maintain to show their "duty of care to the broader OER movement":

  • Contribute: Stewards need to contribute to the work of OER, whether in the form of funding or "in-kind" contributions, to perpetuate the "awareness, improvement and distribution" of OER;
  • Attribute: Stewards make sure to credit those who have created or remixed OER;
  • Release: Stewards help make OER materials available in a form that can be released and used beyond the course and platform in which they were created or distributed; and
  • Empower: Stewards "strive to meet the diverse needs of all learners" and encourage participation by "new and non-traditional voices in OER creation and adoption."

The next step for the framework, according to Doug Levin, co-author of the report and president of EdTech Strategies, is to solicit stakeholder reaction, "to see what questions get raised and where we may have missed the mark or omitted something important."

"This isn't intended to be a document that orgs would sign on to, but there are many ways that it can be used to build a consensus in the field about expectations for those who want to work with OER and gain the support of the community," Levin explained.

Levin, along with his co-authors, Lisa Petrides, CEO and founder of the Institute for the Study of Knowledge Management in Education, and C. Edward Watson, an associate vice president at the Association of American Colleges and Universities, expects to present on the framework at education events over the coming year. The group also plans to release supplemental materials shortly, such as example practices "to help operationalize the framework and make it more concrete," he said. After that, "we'll see where we go."

The paper is posted online at a new website and is distributed under a Creative Commons Attribution-ShareAlike (CC BY-SA) license.

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
