Report: Competency Ed Needs To Show 'Credible Evidence' To Prove Validity

While many competency-based education (CBE) programs do a decent job of documenting the competencies students need to master and the types of assessments used to measure student proficiency, that doesn't go far enough, according to a new report on the topic of assessment in CBE.

Moving forward, the report, published by Pearson, recommended that CBE program designers "should work to clarify the links between the tasks students complete on an assessment and the competencies those tasks are designed to measure." On top of that, CBE programs need to be validated against external standards in order to prove to employers that a competency-based education is "credible evidence of students' career readiness."

The report, titled "Measuring Mastery," provides best practices for how schools can validate assessments and establish performance levels that tie to "real-world mastery." It was written by Pearson researchers Katie McClarty, director of the Center for College & Career Success, and Matthew Gaertner, a senior research scientist. It's being distributed by the American Enterprise Institute (AEI), a private, nonpartisan, not-for-profit organization that researches and attempts to influence policy on government, politics, economics and social welfare. Education is one of AEI's research focuses.

"CBE programs have generally done a good job defining the relevant competencies, that is, what students need to know, and what they'll learn," said McClarty in a prepared statement. "The next critical step will be gathering empirical evidence that documents the relationship between competency mastery and future success."

The report offers four recommendations:

  • CBE programs need to "clearly define" competencies with enough detail and document the evidence that assessments "fully measure" them. The result would be improved transparency "around the processes and expectations" of the program;
  • Conduct research to validate CBE assessments against other assessments. The researchers suggest that CBE programs could collaborate to gather the necessary evidence to show "an empirical relationship";
  • Use the results of empirical research to develop the initial process for setting standards. For example, the report stated, CBE assessments could be given to employees currently working in the relevant fields, and the results of student assessments could be compared to those results; and
  • CBE programs should track graduates' outcomes later in life to gather evidence that a CBE credential represents a level of preparation equivalent to a traditional college degree. Those outcomes could be measured, the report suggested, "in terms of subsequent academic performance or through job attainment, job performance, occupational prestige or earnings."

The report is publicly available on AEI's site.

AEI will host a live and online panel to discuss CBE on May 21 at 3:30 p.m. Eastern time. Presenters will include Pearson's McClarty as well as representatives from Excelsior College, Seton Hall University and the Association of American Colleges & Universities.

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
