Distance Learning Programs Make Case for Quality Assessment

The Dallas County Community College District had grown its online learning programs organically for two decades when Terry Di Paolo, executive dean of online instructional services, decided it was time to take a "holistic view" of the programs to assess quality and create an improvement plan that aligned with its accreditation work. Attendance in 2015 at a Southern Association of Colleges and Schools Commission on Colleges (SACS) workshop introduced him to the Online Learning Consortium's Quality Scorecard. Other institutions at that event assured him that he could use the scorecard system across all seven colleges and various service centers that made up the district.

OLC's Quality Scorecards are worksheets that give schools criteria and benchmarking tools to assess the effectiveness of their online instructional efforts. Broad assessment areas cover administration of online programs; blended learning; quality course teaching and instructional practices; digital courseware instructional practice; and course design, the last a rubric developed and shared by the State University of New York. Each scorecard breaks its area down into detailed criteria; for example, the "administration of online programs" rubric has 75 criteria, each scored on four levels: deficient (zero points), developing (one point), accomplished (two points) and exemplary (three points).

In the case of the Texas college system, over a 14-month period starting in February 2016, the district adopted the OLC Quality Scorecard internally, customizing it to the needs of its large and diverse institution and applying it both to the district overall and to its individual colleges.

During that project the district realized that while individual colleges could evaluate their own efforts against the scorecard criteria, there was no visibility into the district's overall approach to the items being measured. Each college was coming up with its own standards for the various aspects of online instruction and learning. Di Paolo and his small team of reviewers collected all the college scores and converted them into a single score for each item. He also asked each college to provide five ideas for "quick fixes" and five for "long-term improvements" in each category.
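To make the roll-up concrete, here is a minimal sketch in Python of how per-college rubric scores might be collapsed into a single district score per item. The data layout, the criterion names and the rounded-mean rule are all assumptions for illustration only; the article does not describe how Di Paolo's team actually combined the scores.

```python
# Hypothetical sketch: neither the data format nor the aggregation rule
# comes from the case study; both are assumptions for illustration.
from statistics import mean

# Scorecard levels described in the article: deficient (0), developing (1),
# accomplished (2), exemplary (3).
LEVELS = {0: "deficient", 1: "developing", 2: "accomplished", 3: "exemplary"}

# Assumed input: each college's score (0-3) for each scorecard criterion.
college_scores = {
    "College A": {"criterion_01": 2, "criterion_02": 1},
    "College B": {"criterion_01": 3, "criterion_02": 2},
    "College C": {"criterion_01": 2, "criterion_02": 0},
}

def district_score(scores_by_college):
    """Collapse per-college scores into one district score per criterion.

    The rounded mean used here is a placeholder rule, not the district's
    actual method.
    """
    criteria = next(iter(scores_by_college.values())).keys()
    return {
        criterion: round(mean(c[criterion] for c in scores_by_college.values()))
        for criterion in criteria
    }

for criterion, score in district_score(college_scores).items():
    print(f"{criterion}: {score} ({LEVELS[score]})")
```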

In a new case study published on the OLC website, Di Paolo explained that "everything we created we gave back to the college teams with instructions to share." As the district moved through the process, he added, "you really saw a commitment to what we were doing. You could tell this was having an impact — I started to get requests to provide overviews to others in the district who were not directly involved with the scorecard."

By April 2017 the college system had gathered its top administrators in a room to report the results. During this session, participants engaged in conversations with the college teams about where the district's online programs were and where they should go next. As an institution, the case study noted, they realized "they needed to make some changes." Among them was a need to improve how the colleges mapped distance education policies to SACS requirements.

This case study, along with those for two other institutions, is being published for the first time on the OLC website over the next couple of weeks. The idea, according to OLC, is to help schools learn what others have discovered about evaluating the quality of their online learning programs. For example, a profile of Baker College's Baker Online explains how the school used the quality scorecard to help prioritize improvements. The case study for Middle Tennessee State University lays out how the institution used its evaluation to pinpoint areas of its online offerings that needed improvement and to demonstrate deficiencies in order to gain support for the changes.

All three case studies will be available on the OLC Quality Scorecard Suite homepage.

About the Author

Dian Schaffhauser is a former senior contributing editor for 1105 Media's education publications THE Journal, Campus Technology and Spaces4Learning.
