

Adaptive, Flipped Approach to Introductory Statistics Lifts Outcomes in 4-Year Schools


A multi-year pilot in Maryland that aimed to redesign the curriculum for introductory statistics using adaptive learning technology and active learning pedagogy found a spark of success among students in four-year institutions.

The "Adaptive Learning in Statistics" (ALiS) study involved numerous players: Ithaka S+R; Transforming Post-Secondary Education in Math (TPSE Math); the William E. Kirwan Center for Academic Innovation at the University System of Maryland; the University of Maryland, College Park (UMCP); Montgomery College; the Urban Institute; and adaptive learning platform provider Acrobatiq.

According to an Ithaka report, the goal of the project was to find out whether adaptive learning could "significantly improve" course outcomes for students at two- and four-year colleges and universities. Ithaka worked with faculty and other partners at the participating schools, along with the vendor, Acrobatiq, to design courseware that could be delivered in a blended format. The Urban Institute served as an outside evaluator, assessing student outcomes and advising on the evaluation design. Acrobatiq's Probability and Statistics course served as the initial curriculum.

The course was set up to guide students through the content on their own learning paths while generating real-time data for instructors regarding student engagement and performance. A group of instructors was also given extra materials that allowed for "focused instruction" and promoted the use of active learning in their classrooms.
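The report does not detail how the real-time instructor view worked; a generic version of that pattern, tracking each student's engagement and performance and surfacing those who appear to be struggling, might look like the sketch below. The data model, field names and the 0.6 flagging threshold are assumptions for illustration, not details from the Acrobatiq platform.

```python
# Hypothetical sketch of the kind of engagement/performance rollup an
# adaptive platform might surface to instructors in real time. The data
# model and the 0.6 flagging threshold are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class StudentRecord:
    name: str
    pages_completed: int = 0
    quiz_scores: list = field(default_factory=list)

    @property
    def average_score(self):
        return sum(self.quiz_scores) / len(self.quiz_scores) if self.quiz_scores else 0.0

class InstructorDashboard:
    def __init__(self, flag_threshold=0.6):
        self.flag_threshold = flag_threshold
        self.students = {}

    def record_page(self, name):
        # Count a completed courseware page as an engagement signal.
        rec = self.students.setdefault(name, StudentRecord(name))
        rec.pages_completed += 1

    def record_quiz(self, name, score):
        # Store each embedded quiz score for the running average.
        rec = self.students.setdefault(name, StudentRecord(name))
        rec.quiz_scores.append(score)

    def students_to_check_in_with(self):
        """Students whose running quiz average falls below the threshold."""
        return [r.name for r in self.students.values()
                if r.average_score < self.flag_threshold]

dash = InstructorDashboard()
dash.record_page("alice")
dash.record_quiz("alice", 0.9)
dash.record_quiz("bob", 0.4)
print(dash.students_to_check_in_with())  # ['bob']
```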

The participating instructors received just-in-time training online, through virtual learning sessions and a virtual learning community that let them interact with other instructors and share classroom exercises. This was important, noted the Ithaka report, because many of them — especially in the two-year colleges — were part-time and adjunct instructors hired at the last minute to teach intro classes, with little time to prepare and few resources to help them get ready. "Lead instructors" served as mentors at their respective schools in an effort to inspire a "shared culture of learning and collaboration" both within and across the participating institutions.

While planning started in 2015, a "pre-pilot" phase took place during the 2016-2017 academic year, covering a small number of course sections at Montgomery College and a large section at UMCP. A full-scale pilot began in fall 2017 at those two schools as well as six other institutions, and one more university was added in spring 2018. Over those two semesters, a total of 3,808 students and 45 instructors participated in the study.

Between the two sets of pilots, a lot of tweaking went on behind the scenes. For example, the courseware had to be truncated to fit a one-semester, 15-week format. That content work was undertaken by a team of faculty from the two founding institutions, with Acrobatiq's engineering and course management teams providing programming help.

A new module, "Getting Ready Check," was also added to the first unit of the course to serve as a prerequisite assignment. Students worked through 45 questions that let the program assess their readiness for the main content. Based on their responses, the software generated a set of personalized instructional pages with text, illustrations, examples and questions to help them fill gaps in foundational math knowledge.
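The report does not describe the internal mechanics of the check, but the basic pattern — scoring diagnostic responses by topic and assembling remediation pages only for weak topics — can be sketched roughly as follows. The topic names, question bank and mastery threshold below are hypothetical, not taken from the Acrobatiq courseware.

```python
# Hypothetical sketch of a readiness-check pattern: score diagnostic
# questions by topic, then assemble remediation pages only for the
# topics a student has not yet mastered. Topic names, the question
# bank and the 0.7 threshold are illustrative, not from Acrobatiq.
from collections import defaultdict

MASTERY_THRESHOLD = 0.7  # assumed cutoff for "ready" on a topic

def score_by_topic(questions, answers):
    """Return the fraction correct per foundational-math topic."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for q, a in zip(questions, answers):
        total[q["topic"]] += 1
        if a == q["answer"]:
            correct[q["topic"]] += 1
    return {t: correct[t] / total[t] for t in total}

def build_remediation_plan(questions, answers, pages_by_topic):
    """Pick instructional pages for every topic below the threshold."""
    scores = score_by_topic(questions, answers)
    weak = [t for t, s in scores.items() if s < MASTERY_THRESHOLD]
    return [page for t in weak for page in pages_by_topic.get(t, [])]

# Toy three-question bank (the actual check used 45 questions).
questions = [
    {"topic": "fractions", "answer": "b"},
    {"topic": "fractions", "answer": "c"},
    {"topic": "order_of_operations", "answer": "a"},
]
pages_by_topic = {
    "fractions": ["review-fractions-1", "review-fractions-2"],
    "order_of_operations": ["review-pemdas-1"],
}
plan = build_remediation_plan(questions, ["b", "a", "a"], pages_by_topic)
print(plan)  # ['review-fractions-1', 'review-fractions-2']
```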

Likewise, additional adaptive exercises and quiz questions were embedded elsewhere in the content to boost its adaptivity.

Although it wasn't a requirement, the instructors who participated were "strongly" encouraged to use a flipped classroom approach "to take advantage of the adaptive features of the tool."

Interestingly, the report noted, slicing up the courseware proved problematic because of the "highly interconnected nature of the adaptive learning courseware, where all components of courseware content ... are intricately tied to one another." Consistent instructor onboarding was another challenge due to time limitations for those adjunct faculty and patchy use of the training materials and other resources. Also, the lead instructor model "worked well in some cases but not well in others." And simple communication among instructors turned out to be uneven when people had different schedules and worked on different campuses.

The results of the study focused on three metrics:

  • Students' final course grades;
  • Their likelihood of receiving a final course grade of C or better; and
  • Their statistics competency, as measured by performance on the final exam.

While students in four-year schools experienced "statistically significant positive outcomes" for course grade (up 0.16 points on a 4.0 scale), course passing rate (up 3.8 percentage points) and statistics competency, there was no impact for those attending two-year institutions or for those who were first-generation or Pell grant-eligible or who had prior developmental education experience. The two-year students were also "substantially less satisfied" with their course experiences than learners in traditional sections in the same schools.

The researchers offered a number of theories about why the four-year/two-year difference emerged, and the report locked onto a couple of possible causes:

  • Variance in how the course was delivered to and experienced by the students. For example, just 32 percent of instructors at the two-year schools reported fully flipping their courses, whereas most at four-year institutions (82 percent) reported doing so. In regard to student experience, while the study attempted to control for variations in student characteristics, those in two-year colleges faced greater difficulties in reading comprehension and in dedicating "sufficient time to study the learning materials on their own before class."
  • Inconsistent use of the available training by the instructors, particularly for those who work in community colleges, where people might be hired "very close to the start of the semester."

The study continued into fall 2018, with fewer faculty and student participants (18 and 1,256, respectively). By then, the report stated, the implementation process was "smoother," the faculty "more experienced" and the resources "improved." The results "mirrored" the ones from the full-scale study in 2017-2018.

Now, four of the Maryland institutions have chosen to continue the adaptive/flipped format, including two — UMCP and Wor-Wic Community College — which will use it for all of their introductory statistics classes.

Both Ithaka and Urban have issued reports on the findings, which are openly available on their websites.
