4 Steps to Effective Data Use in CTE

While plenty of attention has been directed at collecting and using data to improve instruction and program effectiveness in K-12 and higher education, career and technical education (CTE) itself has received far less focus. MDRC recently issued a 10-page brief laying out the challenges CTE programs face in collecting data (such as a lack of staff dedicated to the job) and four steps they can take "to strengthen their own CTE data-collection and measurement activities." MDRC is a nonprofit that develops solutions to difficult problems in numerous fields, including education.

The information covered in the brief came from a "scan of leading CTE programs" (including those that are part of MDRC's Center for Effective CTE) and interviews with the people running those programs and researching them. Research Associate Hannah Dalporto, the author of the report, also talked with "innovative leaders, consultants and organizations" involved in CTE to understand their data strategies and the obstacles they've faced.

According to the report, creating a data strategy requires answering two questions: What problem does the program address, and how does it do that? Knowing those answers is the first step in figuring out what data to collect and using it to measure whether the program is meeting its goals. That feeds into the creation of what MDRC called a "theory of change," a model that lays out the "essential components and mechanisms" of the program that result in success. From there, the organization collects data to answer questions about those outcomes.

Challenges surface in the process. One is the difficulty of measuring outcomes that require getting data from a number of institutions or schools. "For example," Dalporto noted, "a high school CTE program may feature a work-based learning component such as an internship and also offer classes that count for college credit. To measure outcomes, that program might need to collect data from secondary, postsecondary and workforce data systems."
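
To make that concrete, here is a minimal sketch of that kind of cross-system linkage, written in Python with pandas. The student identifier, field names and figures are all hypothetical, and a real program would be pulling extracts from agency data systems rather than typing records in by hand.

```python
import pandas as pd

# Hypothetical extracts from three separate data systems. In practice these
# would come from secondary, postsecondary and workforce agencies; every
# field name here is invented for illustration.
high_school = pd.DataFrame({
    "student_id": [101, 102, 103],
    "completed_internship": [True, True, False],
})
college = pd.DataFrame({
    "student_id": [101, 103],
    "dual_enrollment_credits": [6, 3],
})
workforce = pd.DataFrame({
    "student_id": [101, 102],
    "employed_within_6_months": [1, 0],
})

# Link the three sources on the shared identifier. Left joins keep every
# CTE participant even when a downstream record is missing.
linked = (high_school
          .merge(college, on="student_id", how="left")
          .merge(workforce, on="student_id", how="left"))

# One example outcome: employment among students who completed an internship.
completers = linked[linked["completed_internship"]]
rate = completers["employed_within_6_months"].mean()
print(f"Employment rate among internship completers: {rate:.0%}")
```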

The report winnowed the steps for effective data collection and usage down to four:

  1. Conduct a needs assessment and develop a theory of change. That should cover what problems the program is trying to address in the community and how the model leads to change and makes an impact.
  2. Define the "priority research questions" and the most important outcomes. Having that information will help corral what data to collect in the limited time available.
  3. Set up data collection processes, using spreadsheets or dedicated software (LaunchPath for work-based learning management and ImBlaze for internship management both receive a nod) and tapping into data already collected by school districts, community colleges and state or national agencies; a minimal spreadsheet-based sketch follows this list.
  4. "Iterate, adapt and update." Only once program people have begun collecting, analyzing and reporting on the data will they identify the gaps and come up with solutions for addressing them.

There's a lot to be gained from "better metrics," Dalporto suggested. One benefit is enabling programs and funders "to pinpoint the near-term measures that predict future workforce or college success." Another is helping the field avoid "the mistakes" of vocational education's past. As an example, "funders are increasingly paying attention to diversity, equity and inclusion and asking for data on related outcomes," the brief stated. "Simply examining the outcomes of subgroups defined by race and ethnicity, gender or socioeconomic status can reveal otherwise hidden inequities."
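
As a rough illustration of that last point, the sketch below uses invented numbers and a generic "subgroup" column standing in for race and ethnicity, gender or socioeconomic status; a respectable overall rate can hide a large gap between groups.

```python
import pandas as pd

# Invented placement records; "subgroup" stands in for whatever demographic
# categories a real program captures.
outcomes = pd.DataFrame({
    "subgroup":      ["A", "A", "A", "B", "B", "B"],
    "placed_in_job": [1, 1, 1, 1, 0, 0],
})

overall = outcomes["placed_in_job"].mean()
by_group = outcomes.groupby("subgroup")["placed_in_job"].mean()

print(f"Overall placement rate: {overall:.0%}")  # 67% looks reasonable in aggregate
print(by_group)  # but group A is at 100% while group B is at 33%
```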

"Building Effective Data Strategies in Career and Technical Education" is openly available on the MDRC website.

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
