Introduction

The Adaptive Learning in Statistics (ALiS) project was a multi-year pilot initiative in which faculty members from multiple two-year and four-year public institutions in Maryland used common adaptive learning courseware in their introductory statistics courses and received training and instructional resources on an active learning and flipped classroom pedagogical approach.[1] The project was organized and led by Ithaka S+R in collaboration with Transforming Post-Secondary Education in Mathematics (TPSE Math), the William E. Kirwan Center for Academic Innovation at the University System of Maryland (Kirwan Center), the University of Maryland, College Park (UMCP), Montgomery College (MC), the Urban Institute, and Acrobatiq.[2]

The hypotheses we sought to test through this pilot were that (a) the adaptive courseware and the provided training and resources would significantly change the educational experiences of faculty and students in this important gateway course and that (b) such modified educational experiences would improve course-level learning outcomes for students and reduce gaps between different subgroups of students.[3] We also sought to demonstrate that (c) the ALiS package, including both the courseware and the online instructor training and support resources, could achieve consistently positive results when implemented across many institutions and instructors at the same time.

The unusual structure of the project—third parties coordinating directly with faculty from multiple institutions, and facilitating engagement among those faculty—also allowed us to gain insight into two additional questions: (d) whether building capacities around incorporating new technology into the classroom among a group of faculty could potentially be a catalyst for broader pedagogical changes within their respective institutions; and (e) whether the relationships built and resources shared among faculty from two-year and four-year institutions could potentially reduce friction in the articulation of math courses and lead to greater statewide coherence in the long run.

There are four high-level takeaways from the project:

  • The analysis of the full-scale intervention in 2017-18 showed statistically significant positive outcomes for students at four-year institutions in course grade, course passing rate, and statistical competency, but no impact for students at two-year institutions or for students in the focus subgroups (i.e. first-generation, Pell grant eligible, and students with prior developmental education experience). These differences in outcomes between the four-year and two-year institutions persisted in the extended pilot (fall 2018).
  • Within each institution type (two-year and four-year institutions), there was notable variability in impact estimates across the individual institutions. Since the analyses tried to correct for differences in student baseline characteristics and other important background information, it is reasonable to conclude that some part of that variability was the result of the different ways in which the course was delivered to and experienced by students. [4] Further research is needed to better understand this variability and exactly what combinations of adaptive learning technology and active learning instructional approach would work better or worse for whom, in what ways, under what circumstances, and why.
  • The training and other resources provided to instructors across multiple campuses through a series of webinars in 2017-18 were not consistently utilized. This may have contributed to the variation in the implementation of the recommended pedagogy from instructor to instructor. The feedback from the lead instructors and others indicated that the training program and support mechanisms put in place for the 2017-18 pilot were not as effective as they needed to be to prepare a large and diverse group of instructors to teach statistics to students with different backgrounds and learning needs using the new tool and instructional approach. The Comprehensive Course Guide (CCG) adopted in fall 2018 (described more fully later in this report) shows some evidence of helping to fill that gap but still requires further testing and development.
  • The cross-institutional communication fostered as part of this initiative enriched the ongoing conversation on improving statistics instruction and math education more generally within and across the participating institutions. At one of the two-year institutions in particular, the lead instructor and department head reported that this project has helped the institution significantly improve the rigor of its statistics curriculum and put in place additional resources to further enhance the teaching and learning experiences of their faculty and students. Going forward, of particular interest to the faculty and administrative leaders involved in the ALiS project is the possibility that greater collaboration between two-year and four-year institutions might lead to improved alignment and consistency in the quality and delivery of math education across the state, and help achieve greater student success.

This report describes the motivation behind the project, key project phases and activities, and implementation processes and challenges, as well as a summary of key results from the formal evaluation of the intervention conducted by researchers at the Urban Institute during the 2017-18 academic year and again in fall 2018. We end the report with our reflections on the study results and some concluding thoughts about possible next steps for future work.

The report includes references and links to related reports and other outputs of the project. Specifically, the course design-related lessons learned in this project formed the basis for a separate document on possible design requirements for an adaptive learning courseware package intended to be used in a multi-institutional setting, which is available on the Ithaka S+R website.[5] In addition, the research design and results from the 2017-18 pilot are presented more fully in a separate report prepared by the Urban Institute that is available on the Urban Institute website (https://www.urban.org/research/publication/evaluation-adaptive-learning-statistics-alis).[6] A slide deck summarizing the results from the extended pilot in fall 2018 is also available on the Urban Institute website (https://www.urban.org/research/publication/evaluation-adaptive-learning-statistics-alis/evaluation-findings-fall-2018).[7] Summaries of the courseware and Comprehensive Course Guide (CCG) are included in this report as Appendices C and E, respectively. Information about how to gain access to the actual courseware and CCG can be obtained through Ithaka S+R and Acrobatiq (now VitalSource).

Motivation and Context

There is a general consensus that a quality postsecondary education and credential are critical to success in today’s rapidly changing economy. However, a growing body of evidence has shown that traditional, entry-level mathematics courses required to progress toward a degree, particularly those standalone, noncredit-bearing remedial courses, constitute a formidable barrier to completion of postsecondary credentials, especially for first-generation and lower-income students.[8] Key reasons for this include the disconnected nature of these course offerings and their misalignment with students’ academic and career aspirations and needs, as well as limited use of evidence-based pedagogy.[9] Math education organizations, policy makers, and higher education leaders across the nation have urged faculty and institutions to re-engineer the way entry-level mathematics is taught.[10]

One of the math redesign efforts that has shown some promising early results in improving student success and has received endorsement by the math education community and reform advisory groups, such as the Charles A. Dana Center at the University of Texas, Austin, is the co-requisite model. There are a variety of ways in which the co-requisite model can be implemented, but the basic idea is that students are placed into credit-bearing math courses based on a more holistic review of their academic preparedness and aspiration and provided with significant wraparound supports—both curricular and non-curricular—to help them make timely progress toward earning their credentials.[11] Along the same lines, statistics is increasingly regarded as more functional and pragmatic mathematics training for students pursuing non-STEM fields of study and career paths,[12] and colleges around the country are beginning to offer credit-bearing statistics in place of algebra for these students.[13]

In a recent study conducted at the City University of New York (CUNY), researchers Alexandra W. Logue, Mari Watanabe-Rose, and Daniel Douglas used a randomized controlled trial to track how 907 students with remedial needs progressed through three different math courses: (1) traditional remedial algebra, (2) traditional remedial algebra plus a weekly workshop, and (3) college-level, credit-bearing, introductory statistics plus a weekly workshop.[14] Logue and her colleagues found that students in the third group (i.e. statistics with workshop) passed at a rate 16 percentage points higher than those assigned to remedial algebra (p<0.01), and also accumulated more credits in the subsequent year (3 credits more on average). Following the academic progress of these students for three years after the experiment, they found that students in the statistics group passed at least as many of their general courses as did the traditional remedial group, with a lower average number of course enrollments, and received an associate’s degree from CUNY or another college at a higher rate (25 percent vs. 17 percent). Moreover, they showed that the course success and graduation rate results did not differ according to students’ race and ethnicity, suggesting that this kind of co-requisite approach using statistics shows great promise for closing achievement gaps.[15]

At the same time, rapid advancement in educational technology has raised the possibility that such technology can help effect the shifts in the curriculum and pedagogy envisioned by math reformers and also allow them to be implemented at scale more effectively. Most recently, a growing market of adaptive learning solutions has received attention from educators, higher education leaders, and policy makers based on their ability to tailor content and feedback to provide personalized learning paths to students at scale.[16] Adaptive learning is an instructional technique designed for “providing personalized learning, which aims to provide efficient, effective, and customized learning paths to engage each student.”[17] Technologies that incorporate adaptive learning principles use a data-driven approach to adjust the path as well as the pace of learning for individual students with a promise for delivering personalized learning at scale, although there are different types and degrees of adaptivity across different products.

The level of sophistication required for adaptive learning technology has resulted in these learning solutions most often being developed and maintained by third-party education technology vendors and/or publishers. There are two general types of adaptive courseware: (1) open-content courseware, which allows instructors to author their own content, create their own learning objectives, and configure their own course sequences, and (2) closed-content courseware, in which course content, assessment, and learning objectives are hard-coded into the platform and customization for instructors is limited.[18] The first type is often affiliated with smaller, start-up vendors while the latter type is often affiliated with textbook publishers, though this is not always the case. There is a wide range of learning solutions that fall in between these two types, many of which allow or require some level of customization prior to course delivery.[19]

Perhaps the most promising aspect of adaptive learning courseware is the opportunity to make learning more active for students.[20] Instead of merely providing content to students through a lecture during class time, for example, content can be delivered to students by courseware before class, with system-generated feedback tailored for individual students to guide their self-study, so that class time with instructors can be repurposed for deeper and more meaningful interaction and active learning. The rich data analytics tool and dashboards provided by these platforms can further assist instructors in their ongoing formative assessment of student learning to make appropriate modifications to their instruction and in-class activities. At least conceptually, adaptive learning technology can both leverage and augment the redesign efforts math reformers are undertaking across the country and have a broader impact on student success, taking advantage of the level of scalability that such digital tools can potentially achieve.

Given that adaptive learning technology is a relatively new area of research in higher education, the existing literature base is somewhat limited and scattered, although some findings and lessons from early projects are emerging. In a recent report published by SRI International, researchers used quasi-experimental methods to explore the learning and cost impacts of adaptive courseware implementations in the Adaptive Learning Market Accelerated Program (ALMAP), initiated and supported by the Bill & Melinda Gates Foundation between summer 2013 and winter 2015 at 14 higher education institutions.[21] A total of 15 gateway general education courses (i.e. psychology, biology, business, marketing, and economics) and seven developmental (remedial) education courses used adaptive courseware to make different kinds of changes in course delivery (i.e. blended adaptive vs. traditional lecture; online adaptive vs. regular online; blended adaptive vs. blended). Their analyses revealed that effects on student learning and course completion were mixed across this collection of courses: only a few courses with adequate data showed higher average course grades for students, although there was a modest but significantly positive average impact on common final exam scores for seven side-by-side comparison courses. The results also showed that the impact appeared to be slightly larger for mathematics and biology courses, while the effect did not vary significantly for other disciplines. Students were generally satisfied with the courses, with two-year institution students having more positive views than those at the four-year institutions. The researchers also found that impacts varied by use: switching from a lecture format to an adaptive blended format had a positive impact on student learning. In terms of the impact on course costs, they found that startup costs were high in most cases, largely driven by the amount of instructor time needed to insert content into the adaptive products. However, ongoing costs did appear to go down in the second and third implementations, which suggests that some economies can be achieved over time and with sufficient scale. In conclusion, the report called for future studies to incorporate student characteristics as well as the specifics of how the courseware was used to allow for a better understanding of the impact of such interventions on student success.

Another study to note is the one conducted at three University of California campuses (UC Davis, UC Santa Barbara, and UC Santa Cruz), where the ALEKS system (an artificially intelligent assessment and learning system)[22] was implemented during the 2015-16 academic year in select mathematics and chemistry courses to understand its impact on student learning.[23] The study found that, when ALEKS was used by students as the designers intended—with systematic efforts to encourage student persistence toward the established achievement goals—results were positive in terms of overall performance in the course and, in some cases, the same positive results were found in at-risk populations (i.e. underrepresented minority, low-income, and first-generation students). The authors of the pilot report noted that, although the analyses revealed positive findings, ALEKS should not be seen as a simple panacea for poor student performance. They called for institutions considering the implementation of such technology into their course offerings to think strategically about all facets of their organizational structures—including the research process, technology infrastructure, finance, and accountability systems—to make the intervention work effectively.

The ALiS project was designed to build on these separate yet overlapping threads of work to contribute to the growing body of knowledge at the intersection of course redesign and adaptive learning technology implementation (see Sidebar 1 for a brief project history). Specifically, the project aimed to test whether redesigning the way introductory statistics is taught using adaptive learning technology would significantly improve course-level outcomes for students across a diverse set of two-year and four-year institutions. The Urban Institute served as the third-party evaluator to assess student outcomes and advise on evaluation design. The project team at Ithaka S+R worked closely with a team of faculty and other partners at the two-year and four-year institutions and the system office in Maryland,[24] as well as the vendor partner Acrobatiq, to design common adaptive learning courseware that would be offered in a blended format. The course design relied on the adaptive courseware to guide students’ study through the content on a personalized learning pathway while simultaneously equipping instructors with real-time data on student engagement and performance. [25] Instructors were also provided with relevant in-class exercises to allow focused instruction and promote active learning in their classrooms.

Moreover, by using the common course and training materials across a large number of institutions in Maryland, the project team hoped to achieve consistent delivery of the course without requiring significant additional resources in terms of instructor time or other expenditures. Just-in-time training and other resources were designed to be delivered to instructors online and through virtual training sessions. This was deemed particularly important for the project team since many part-time and adjunct faculty are often hired close to the start of the semester to teach introductory courses with limited time and resources for preparation. Finally, a communication and organizational strategy was developed in an effort to bring together all participating instructors around a common training and discussion schedule, with designated lead instructors at the institutions regularly interacting with the project team and with each other. Lead instructors were asked to serve as mentors to other instructors at their respective institutions in order to foster a shared culture of learning and collaboration both within and across the participating institutions.

Sidebar 1: A Brief History of the ALiS Project

The project had its origins in conversations dating back several years between the late William G. Bowen (Bill Bowen), then the board chair of ITHAKA, former president of The Andrew W. Mellon Foundation and Princeton University and an economist of higher education, and William E. Kirwan (Brit Kirwan), a distinguished mathematician and institutional leader, then about to complete a twelve-year term as Chancellor of the University System of Maryland and about to become executive director of Transforming Postsecondary Education in Mathematics (TPSE Math). Under Bill Bowen’s leadership, Ithaka S+R was studying and organizing experiments in the use of technology to improve both the effectiveness and the efficiency of teaching in a variety of introductory subjects, including statistics. Meanwhile, Brit Kirwan was fostering similar kinds of innovation in Maryland and had joined with leading mathematicians around the country to form TPSE Math with the goal of improving math education across the nation.[26] Included in the early work in Maryland was participation in an Ithaka S+R study on “Interactive Learning Online at Public Universities,”[27] a study on the integration of MOOCs and other platforms in hybrid formats into face-to-face courses,[28] and a study on the integration of an adaptive learning product into summer bridge programs to improve students’ college math readiness.[29]

Project Phases, Implementation Processes, and Research Results

The planning for the ALiS project and discussions among the emerging group of partners began in the summer of 2015 and continued during the 2015-16 academic year. During that time, a small group of faculty from the University of Maryland, College Park (UMCP) and Montgomery College (MC) reviewed the introductory statistics content endorsed by professional mathematics organizations and compared it to the emerging standards under development in Maryland. Using these as a starting point, the group decided to use Acrobatiq’s existing Probability and Statistics course, itself a successor of Carnegie Mellon University’s Online Learning Initiative (OLI) course.[30] Additional infrastructure was put in place by the project team, including the development of the initial research design and formal agreements with key partners to detail roles and responsibilities. Below we summarize the three project phases in detail, including key activities, implementation processes and challenges, and research results. See Appendix A for a visual project timeline and descriptions of other concurrent–and ongoing–relevant initiatives in Maryland.

The Pre-Pilot Phase (2016-17)

During 2016-17, we carried out a pre-pilot phase of the project, using the Acrobatiq courseware for a small number of sections at MC and one large section at UMCP. The goals of this phase were to work out the kinks in the courseware, experiment with different forms of course delivery, formulate a plan for training new instructors to use the courseware and other resources, and put in place other parts of the infrastructure necessary to carry out the full-scale, multi-institutional test in the subsequent year.

Throughout the year, the pre-pilot faculty at MC and UMCP participated in a series of meetings and workshops hosted by a team of instructional advisors at the UMCP’s Teaching and Learning Transformation Center to discuss their experiences, share lessons, and identify areas for improvement.[31] Informed by these conversations as well as insights gleaned from the evaluation of pre-pilot outcomes conducted by the Urban Institute,[32] the project team engaged in several activities over the summer of 2017 to prepare for the full-scale pilot in 2017-18:

  1. Truncate the courseware content to allow better integration with the one-semester, fifteen-week format. The original version of the courseware had additional content that was not included in the national and Maryland standards that guided the ALiS course design effort.[33] As a result, it was difficult for participating faculty to fit the full Acrobatiq course into their traditional three- or four-credit, one-semester statistics courses. A small team of faculty from UMCP and MC thoroughly reviewed the courseware content and recommended the removal of a number of modules. Over the summer, the same faculty team worked with Acrobatiq’s engineering and course management team to remove those modules from the course, ultimately resulting in the version that was piloted in 2017-18 (i.e. the ALiS course). [34] See Appendix C for the table of contents for the course and a list of learning objectives.
  2. Enhance the adaptivity of the course by creating and adding more adaptive content. A new adaptive module called “Getting Ready Check” was developed and included in the first unit of the courseware.[35] This was a pre-requisite assignment with forty-five questions intended to assess students’ preparedness to engage with the main course content.[36] Based on students’ performance on these questions, the platform generated a set of personalized instructional pages with explanatory text, visual illustrations, examples, and questions to help fill gaps in students’ foundational math knowledge before they engaged with the course content. Additional adaptive exercises and quiz questions were included throughout the courseware in an effort to further enhance its degree of adaptivity.
  3. Develop a plan for instructor training and related mechanisms for sharing resources across many institutions. A plan for instructor training and resource sharing was developed in consultation with faculty leaders at UMCP and MC, the Acrobatiq training support team, and partners at the Kirwan Center.[37] The goal of this plan was fourfold: (a) encourage participating instructors to use the courseware as the primary delivery mechanism of the course content; (b) encourage instructors to set clearer engagement and participation expectations to motivate their students’ full use of the courseware; (c) ensure that instructors become familiar with the key adaptive features and functionalities afforded by the courseware platform to support their instructional practices; and (d) facilitate resource sharing among instructors (e.g. sample syllabi, in-class activities aligned with the courseware content/sequence) to assist their promotion of active learning in the classrooms. See Appendix D for more information about our efforts to train and support instructors.

Although not explicitly spelled out in the pedagogical guidance document (Appendix D), the project team strongly recommended that instructors use a flipped classroom approach to take advantage of the adaptive features of the tool.[38] In a flipped classroom, course content is delivered primarily through the technology, while instructors provide high-quality and timely feedback.[39] This approach posits that both students and instructors play an active role in teaching and learning; students engage with course readings and exercises in the courseware prior to class, and instructors monitor students’ progress on the platform to tailor in-class instruction and activities in ways that address students’ learning needs.

According to a meta-analysis conducted to compare outcomes for different combinations of hybrid, blended, and flipped courses,[40] the only type of mixed-method course that consistently improved student learning was the flipped blended course, a type of blended instruction that delivers content via technology and provides feedback via the instructor.[41] The researchers attributed the improved learning outcomes to instructors’ ability to devote classroom time to concept application, problem solving, and deeper discussions. In many ways, the flipped classroom approach was very much in line with the project’s theory of change, which posited that a thoughtful instructional approach that adapts to students’ ongoing learning needs could improve both how much students learn and how well they retain the content.

The Full Pilot Phase (2017-18)

The full-scale pilot was launched in August 2017 with eight institutions, including five two-year institutions and three four-year institutions.[42] The pilot was then repeated in the spring 2018 semester with the original eight institutions plus an additional four-year institution.[43] In total, 3,808 students and 45 instructors participated in the study across the two semesters. The study design, created and managed by the Urban Institute in consultation with the ALiS project team and participating institutions, required a sizable portion of the pilot-traditional section pairs to be taught by the same instructors. Sixty percent of the matched pairs in fall 2017 and 74 percent in spring 2018 were taught by the same instructors. (See the separate Urban Institute report for a detailed description of the research design and results from the 2017-18 pilot.)

All pilot instructors were invited to the training sessions and provided access to the project-wide “virtual learning community” site to receive important project announcements, interact with fellow instructors, and share active learning exercises and other resources. Regular interactions with the lead instructors enabled the project team to continually collect feedback, address issues in a timely manner, and identify areas for further improvement in the courseware and resources.

There were several implementation challenges, three of which we highlight below:

  1. The truncated version of the courseware required a high degree of maintenance throughout the year. The highly interconnected nature of the adaptive learning courseware, where all components of courseware content – e.g. its explanatory text, exercises, or quizzes – are intricately tied to one another, meant that pulling content out of the course was not an easy endeavor. We realized quickly after the rollout of the courseware in fall 2017 that traces of the removed content were embedded in different parts of the course because, by design, the courseware itself is intended to be used in full from beginning to end, with layered attempts to reinforce important concepts throughout a student’s learning journey. Many of these issues were detected and addressed in a timely manner, but they undoubtedly took considerable time and energy from the participating instructors and the project team. This experience also impacted many students because it caused unnecessary confusion and distraction and, ultimately, undermined their confidence in the courseware and the course as a whole.
  2. Onboarding instructors to the project in a consistent manner was challenging due to their different entry points. The variation across different campus schedules as well as the relatively tight contract terms of many part-time and adjunct faculty made it difficult for the project team to identify all participating instructors early on to onboard them to the project and consistently deliver training webinars as originally planned. In the case of some of the community colleges, some instructors were hired very close to the start of the semester (e.g. a week or even a few days before) with little to no time to prepare for their courses or familiarize themselves with the training materials and resources provided through the project. The project team made recordings of the training sessions available for those who joined the project late, but it was unclear to what degree those resources were accessed. For many instructors, the recordings did not sufficiently prepare them to use the courseware and other related tools extensively or effectively.
  3. The lead instructor model–where a designated lead instructor at each institution took charge of mentoring and guiding other pilot instructors at their own institution–worked well in some cases but not in others. The intervention relied on the lead instructors at the participating institutions to create opportunities for other instructors at their own institutions to surface any questions and concerns, brainstorm solutions to challenges or issues, and share ideas and best practices with each other throughout the pilot year. Based on the project team’s conversations with the lead instructors throughout the year, this model worked well mainly among instructors who shared office spaces and, therefore, were able to meet regularly. Communication among instructors seemed more difficult and less seamless when they were located on different campuses or in different offices, or had very different class schedules and availability.

All in all, there were more coordination challenges in managing this multi-institutional implementation than we had initially anticipated, including issues ranging from integrating the courseware with different schools’ learning management systems to delivering various pieces of project-related communication to a large and diverse group of participants. By spring 2018, the project team was able to streamline these processes somewhat, as everyone gained more experience and instituted better communication mechanisms (e.g. weekly digest emails with tailored messages for different participant audiences). However, we continued to find that the instructor training program lacked the consistency and robustness needed to prepare instructors to teach with the new tool and a different instructional approach.

Results from the 2017-18 pilot.

The measures of success for the ALiS intervention focused on three key metrics of students’ academic performance: (a) final course grade, (b) the probability of receiving a final course grade of C or better, and (c) statistics competency as measured by students’ performance on a set of common final exam questions. We also measured student satisfaction, derived from an end-of-semester survey. Instructors also received a survey on their experience with and perspective on teaching the course. Key findings from the Urban Institute analysis of these measures include:

  • There was a modest learning gain at the aggregate level but notable differences in impact by institution type. Across all colleges and both semesters, when pooling the institutions together and weighting them equally, students in the ALiS sections experienced a modest gain in course grade (0.081 grade points higher out of 4 points, p<0.10) and statistical competency (0.075 standard deviation higher, p<0.01) but were marginally less satisfied with the course (0.093 points lower on a 5-point scale, p<0.10). However, when the analysis was divided by institution type (i.e. four-year vs. two-year institutions), the differences in impact were quite striking. Students at the four-year institutions experienced statistically significant positive impacts across the board compared to their peers in traditional sections, with gains in all three academic performance measures–their course grades (0.161 grade points higher, p<0.01), probability of passing the course with C or better (0.038 percentage points higher, p<0.05), and statistical competency (0.112 standard deviation higher, p<0.01)–as well as in their satisfaction with the overall course experience (0.264 points higher, p<0.01). Students at the two-year institutions, on the other hand, experienced no statistically significant impact on their academic outcomes, and, in fact, were significantly less satisfied with the course (0.355 points lower, p<0.01).
  • Although patterns of impact were generally consistent among individual institutions within each institution type, there were some notable variations. Table 1 below provides a summary of impact estimates at the full pilot, institution type, and individual institution levels. As can be seen in this table, not all institutions in the four-year institution group had significantly positive results for all outcome measures, and not all institutions in the two-year institution group showed neutral results for all academic performance measures.

Table 1. Summary of 2017-18 Impact Estimates at the Full Pilot, Institution Type, and Individual Institution Levels

Note: positive results are indicated by estimates described as higher or above the mean, negative results by estimates described as lower or below the mean, and empty cells indicate no statistically significant result. See the full research report prepared by the Urban Institute for more details.

| Institution | Final Course Grade (GPA points) | Passing Rate (probability of receiving C or better) | Statistical Competency (common final exam performance) | Student satisfaction (student end-of-semester survey) |
|---|---|---|---|---|
| All 9 pilot institutions | 0.081 points higher out of 4 points (p<0.10) | | 0.075 standard deviation above the mean (p<0.01) | 0.093 points lower out of 5 points (p<0.10) |
| All four-year institutions (n=4) | 0.161 points higher out of 4 points (p<0.01) | 0.038 percentage points higher (p<0.05) | 0.112 standard deviation above the mean (p<0.01) | 0.264 points higher out of 5 points (p<0.01) |
| Four-year institution 1 | 0.423 points higher (p<0.01) | 0.160 percentage points higher (p<0.01) | 0.444 standard deviation above the mean (p<0.01) | 0.256 points higher out of 5 points (p<0.10) |
| Four-year institution 2 | 0.405 points higher (p<0.01) | 0.116 percentage points higher (p<0.05) | 0.182 standard deviation above the mean (p<0.05) | |
| Four-year institution 3 | | | 0.195 standard deviation above the mean (p<0.05) | |
| Four-year institution 4 | | | | 0.500 points higher out of 5 (p<0.01) |
| All two-year institutions (n=5) | | | | 0.355 points lower out of 5 points (p<0.01) |
| Two-year institution 1 | | | 0.387 standard deviation below the mean (p<0.01) | 0.327 points higher out of 5 (p<0.05) |
| Two-year institution 2 | | | 0.357 standard deviation above the mean (p<0.01) | 0.830 points lower out of 5 (p<0.01) |
| Two-year institution 3 | | | | |
| Two-year institution 4 | | 0.116 percentage points lower (p<0.10) | | 0.748 points lower (p<0.01) |
| Two-year institution 5 | | | | 0.873 points lower (p<0.01) |

  • Students entering the course with less advantageous backgrounds generally were not harmed by the intervention but were significantly less satisfied. The subgroup analysis of course-level impact indicated that, for the most part, students starting the ALiS course at a disadvantage were not significantly impacted–positively or negatively–by the intervention.[44] However, there were some marginal increases in the gap in statistical competency for Pell grant eligible students and those with prior developmental education coursework experience.
  • Students’ perceived experiences with different components of the course differed vastly by institution type. Pilot students at the four-year institutions were more satisfied than their peers with various components of the course, including the quality of instruction (0.33 points higher out of 5 points; p<0.01), instructor availability (0.66 points higher out of 5 points; p<0.01), and the quality of activities in terms of their engagement (0.66 points higher out of 5 points; p<0.01) and their helpfulness with comprehension (0.33 points higher out of 5 points; p<0.001), though students reported significantly less satisfaction with the courseware compared to their peers in traditional sections using other non-adaptive learning tools (0.33 points lower out of 5 points; p<0.10). In sharp contrast, pilot students at the two-year institutions were relatively less satisfied than their peers with the quality of activities in terms of helping them with comprehension (0.33 points lower out of 5 points; p<0.10), amount learned (0.66 points lower out of 5; p<0.01), general ease of the course (0.33 points lower out of 5 points; p<0.10), their interest in the subject (0.33 points lower out of 5 points; p<0.01), as well as their satisfaction with the courseware (0.99 points lower out of 5 points; p<0.01).
  • Importantly, there was a wide variation in how instructors implemented the recommended pedagogy in their courses.[45] Only about 32 percent of instructors at the two-year institutions reported fully flipping their classrooms in spring 2018, whereas the majority of instructors at the four-year institutions (82 percent) reported doing so. Returning instructors in spring 2018 were more likely to flip their classrooms than first-time instructors regardless of institution type, suggesting that continuity of experience is important for ongoing improvement of implementation (38 percent of returning instructors vs. 17 percent of first-time instructors at the two-year institutions flipped their classrooms, and 100 percent of returning instructors vs. 67 percent of first-time instructors at four-year institutions flipped their classrooms). Moreover, although the majority of instructors reported using the dashboards in the Acrobatiq platform to track student progress at some point during the semester (100 percent at four-year institutions and 89 percent at two-year institutions), the frequency of usage varied widely. Those at the two-year institutions reported using the dashboard more frequently than their peers at the four-year institutions. Seventy-six percent of two-year instructors reported using the dashboard at least once a month, compared to 62 percent of their peers at the four-year institutions. Also, nearly half of the two-year instructors (46 percent) reported using the dashboard weekly or every class period (the strongly suggested frequency), but only 12 percent of four-year instructors reported doing so.

The Extended Pilot Phase (2018-19)

In light of these findings as well as the anecdotal evidence gathered from lead instructors and others, the project team developed the Comprehensive Course Guide (CCG) over the summer of 2018 to address the key shortcomings of the ALiS intervention, with the aim of providing more consistent instructor onboarding and better support for course delivery for participating faculty. The CCG was designed to be a “one-stop-shop” solution to the many-institution, many-instructor implementation challenges that the project team experienced in the first full-scale pilot year. More specifically, it was designed to provide a robust package of online instructor training courseware supported by extensive instructional resources—including weekly summary guides, in-class activities, data sets, extra problem sets, and test question banks—all carefully aligned with the sequence of the ALiS courseware. These materials were intended to guide instructors in promoting more personalized active learning in their classrooms, using the flipped classroom approach to pedagogy and the adaptive features of the courseware (see Appendix E for what the CCG entails).

In the meantime, the project team also sought an extension to the original grant to pilot the courseware again in 2018-19 with the improved and expanded instructor resources. Five of the nine institutions elected to participate and signed a new Memorandum of Understanding.[46] At the same time, a small team of faculty and Acrobatiq staff continued to review and make the necessary edits to the courseware for quality control. For example, sentence structures contained in the courseware content were reviewed thoroughly and simplified in an effort to reduce the reading level requirements of the course in light of the feedback from students and instructors.

The ALiS course and the newly developed CCG were piloted in 2018-19 at the five participating institutions, but formal evaluation by the Urban Institute was only conducted in fall 2018. In total, 18 instructors participated in the fall 2018 study, offering 20 pilot and 7 traditional sections.[47] A total of 1,256 students were enrolled across the pilot and traditional sections (791 in pilot and 465 in traditional sections). In spring 2019, in lieu of formal testing, a relatively simple internal evaluation protocol was developed by the project team and carried out at the end of the semester, which included the collection of course-level outcome data (i.e. final course grades, common final exam scores), phone interviews with several instructors, and a short student survey.[48] Nine instructors participated in the spring 2019 pilot, offering a total of 19 sections to 855 students; there was no traditional group in spring 2019.

Overall, the implementation process was much smoother in 2018-19, with more experienced instructors, improved resources, and established processes. By tapping into the expertise of the experienced lead instructors as well as the improved resources, we were able to streamline the implementation process. The CCG, in particular, made it easier to deliver all of the necessary training materials and resources to participating faculty in a more consistent and timely manner. Those resources were designed to be accessed by individual faculty when they needed them—a more productive form of delivery than trying to impose a common training schedule on a large group of instructors with different schedules and availability. More regular and focused discussion among the lead instructors from all five institutions also facilitated the process of learning and experimentation for the entire instructor group.

The improved and expanded instructor training and resources package was well received by both returning and new pilot instructors. Although the use of the CCG was not systematically tracked, the project team learned from surveys, site visits, and conversations with lead instructors that the CCG resources were frequently used by the instructors. The only new pilot instructor in spring 2019 noted in a follow-up phone conversation that the training course and the other resources enabled her to quickly onboard herself into the project. The returning instructors also welcomed this new development. Many expressed a desire to build further on these resources–e.g. additional problem sets, exercises, test bank questions, and data sets–to help enhance the teaching and learning experience. They suggested that, if the course continues to be used by multiple institutions going forward, it would be useful to have a central team responsible for regularly maintaining and updating these resources.

Results from the fall 2018 pilot.

As noted earlier (see footnote 44), the study design for the fall 2018 study differed from that of the 2017-18 pilot in that no pilot and traditional sections shared the same instructor, and the numbers of schools, students, and instructors were all smaller. Despite these limitations on comparability with the earlier results, the fall 2018 evaluation added useful insights:

  • In terms of impact, the fall 2018 results mirrored those of the full-scale study in 2017-18, particularly on the differential outcomes for students by institution type. There were significant learning gains for students at the four-year institutions, with improvement in the final course grades (0.216 grade points higher, p<0.10) and the probability of passing the course with C or better (0.071 percentage points higher, p<0.01), as well as in their satisfaction with the overall course experience (0.495 points higher, p<0.01). However, there were still no significant improvements at the two-year institutions.[49] Similar to the results from the earlier pilot, students with less advantageous backgrounds were not harmed by participating in the project, though first-generation students were less satisfied.
  • Similar to the previous year’s results, students’ satisfaction with different course components differed by institution type. Pilot students at the four-year institutions were more satisfied than their peers in the traditional sections with various components of the course, including the quality of instruction (0.33 points higher out of 5 points; p<0.01), the quality of activities in terms of their level of engagement (0.33 points higher out of 5 points; p<0.05), the amount learned (0.33 points higher out of 5 points, p<0.01), general ease of the course (0.66 points higher out of 5 points, p<0.01), interest in the subject (0.66 points higher out of 5 points, p<0.01), as well as their willingness to take future math classes (0.33 points higher out of 5 points, p<0.05). In contrast, these results were neutral across the board for pilot students at the two-year institutions. It is worthwhile to note, however, that the student satisfaction results in the first pilot year (2017-18) were far more negative, especially at the two-year institutions, as we described earlier.
  • Compared to instructors in the earlier pilot, the fall 2018 instructors were more likely to utilize the flipped classroom approach, regardless of institution type. All instructors at the two-year institutions reported flipping their classrooms to some degree (78 percent flipped fully, and 22 percent flipped partially, compared to 32 percent in the earlier pilot). The two instructors at the four-year institutions also reported flipping their classrooms fully. In the end-of-semester phone interviews with the project team, the returning instructors reported that they were able to flip their classrooms more effectively and were also able to set clearer expectations for students around participation in order to promote active engagement with the courseware content. Interestingly though, survey findings revealed that in-class time was used differently by instructors at different institutions. The instructors at the four-year institutions reported using more of their in-class time on computer activities and assessments, whereas those at the two-year institutions reported relying more on lectures and paper activities.
  • The returning instructors in fall 2018 reported using the dashboard data more frequently to make decisions on areas to focus on during class, provide targeted feedback to students, and intervene with students who were falling behind.[50] Over 70 percent of the returning instructors reported using the dashboard data at least once a week to make decisions about what areas and topics to elaborate or focus on during class, compared to 12.5 percent of the first time instructors. Moreover, 50 percent of the returning instructors indicated that they used the dashboard data outside of class (e.g. office hours) to provide targeted feedback to students at least once a week, whereas none of the first-time instructors indicated doing so. And about 67 percent of the returning instructors reported using the dashboard data to intervene with students in the class who were falling behind at least once a week, compared to 12 percent of the first-time instructors who reported doing so.

Some Reflections on the Results

Overall, the results collectively suggest that, despite the project team’s attempt to provide a common package of adaptive learning courseware, instructor training, and resources in a consistent manner to all participants, there were still considerable differences in how the course was delivered to and experienced by students across the two institution types as well as across the individual institutions. The introduction of the CCG and the presence of more experienced instructors in fall 2018 did translate into smoother implementation processes overall, and helped enhance the instructors’ ability to incorporate the recommended pedagogy into their courses and use the dashboard with more fidelity to tailor their instruction and provide targeted feedback to students. However, we did not see the broader learning gains that we were hoping to see across all participating institutions. We believe there are at least two possible reasons for the variable learning gains.

First, instructors were not on-boarded to the project in a consistent manner due to the reasons we articulated earlier. For example, the tight contract terms often associated with adjunct faculty at the two-year institutions meant that they had very little time before the beginning of the semester to prepare for their courses and to learn how to use the new tool and the adaptive features it afforded. This variability in the on-boarding process may have in part contributed to the differences in how the intervention was implemented across institutions and instructors. As reported by the instructors formally through the surveys and also informally through conversations, there was great variability in the extent to which the flipped classroom approach was incorporated, as well as the degree to which the dashboard data were used to strategically tailor in-class instruction and provide targeted feedback to students.

Second, the differences in how students experienced the course may be due in part to their variable life circumstances and other important factors that we were not able to fully account for in the project. Although the study design did control for various kinds of baseline student characteristics and other relevant background information, we know from the surveys and other informal feedback that students, especially those at the two-year institutions, faced very real challenges, such as difficulty with reading comprehension and an inability to devote sufficient time to studying the learning material on their own before class, and these challenges could have negatively affected their ability to perform well in the course.[51]

The variable learning gains observed in the study warrant additional research to better understand what combinations of adaptive learning technology and instructional approach work better or worse for whom, in what ways, under what circumstances, and why. However, we do believe that more experience and experimentation with the flipped classroom pedagogy and adaptive learning tools by instructors (and by extension, students themselves) can make a big difference in how they teach the course and how their students experience it. As we saw in the fall 2018 extended pilot, the instructors who had participated in the previous pilot were much more likely than newly joining instructors to use the dashboards to guide their approach to in-class time and their interventions with individual students, both in class and in office hours. Having robust online training materials and resources for instructors to support active learning in the classroom, tailored for their students’ circumstances and learning needs, could help speed up the learning curve for new instructors. Putting in place an effective mentoring process, in which lead instructors or others with similar experience guide the onboarding of new instructors, will also help, though we think some time has to be allowed for first-time users to gain that experience themselves.

Discussion and Conclusion

Viewing the project as a whole, we believe that there is sufficient evidence to support the idea that the original goals, as ambitious as they are, can still be achieved by continuing to develop the work that was started by the broader ALiS project team. In particular, the most compelling evidence includes:

  • The consistently positive impacts on academic outcomes (GPA and passing with a C or better) for students at the four-year institutions;
  • Feedback from the extended pilot (2018-19) about how the availability of the CCG and other resources developed over the summer of 2018 helped speed up the learning curve both for individual instructors and for the lead instructors trying to mentor them;
  • The fact that four of the Maryland institutions have elected to use the ALiS course and resources in 2019-20,[52] including two – UMCP and Wor-Wic Community College (WWCC) – that will use it for all of the introductory statistics sections offered on their campuses;
  • The willingness of these four institutions to continue to work together, with the leadership and support of the Kirwan Center, to develop a faculty learning community devoted to exploring ways to use adaptive learning technology more fully and effectively;
  • And, finally, the experience of Wor-Wic Community College (WWCC), where instructors reported that participating in this project has enabled them to raise the standards for student learning in a way that brings them more in line with other Maryland public institutions and prepares their students better to pursue other curricular and professional interests, either at WWCC or at a four-year institution.

One important implication of this study is that timely human intervention is still needed for at least some students. In the case of ALiS, a course redesign in introductory statistics around adaptive learning technology did not, by itself, produce the kinds of improvements in student learning that we were seeking. While we continue to see the potential for adaptive learning technology to leverage the human resources already being invested in such courses, the level of technology currently available is not robust enough to guide learning in the ways that we envisioned for this project. In particular, the level of automated tutoring in the courseware was limited, which creates problems both for students who are less prepared and therefore struggle and for instructors who are trying to help them keep up. To the extent that many students are unable to fully absorb courseware content on their own, instructors need to devote large amounts of class time to covering that material, often squeezing out the active learning work that is at the heart of the course design.

Moreover, even when instructors are able to identify gaps in learning for individual students, they are not always able to intervene to provide help for those students because of limitations on the amount of time students have available for extra work in any single course. This problem is particularly acute at community colleges, where many students are juggling work and family obligations as well as other courses and academic work.[53] In a recent report published by Ithaka S+R based on a survey of over 10,000 students across seven community colleges, the researchers found that the most pressing challenges facing these students were often related to non-curricular issues, such as balancing work and school responsibilities and ensuring they have enough money to pay for their basic needs such as housing, food, and transportation.[54] It may well be that a different mix of strategies is needed in those cases to accommodate these constraints, perhaps depending less on work outside of class time and providing a balanced mix of self-study and active learning during the in-class periods.[55] If possible, such an approach should be accompanied by other necessary supports—both curricular and non-curricular—to help students succeed in these courses and make timely progress towards their degrees.

The findings from this study call for more work on how to integrate technology and pedagogy, and there are already some promising examples emerging across the field that we can learn from. For example, institutions like Arizona State University and Rio Salado Community College are integrating student support throughout the student life cycle to address common roadblocks, in the form of retention coaching, dedicated online tutoring, automatic alerts and predictive analytics that help faculty and academic advisors support online learners, 24/7 technology support, and individualized, holistic support to help students navigate their coursework and balance their studies with other work and family commitments.[56] We believe that the ALiS intervention could benefit from this kind of integrated support model, though we recognize that this would require more fundamental campus-wide or system-wide redesign efforts in order to realize its maximum effectiveness. Such efforts are especially needed at two-year institutions, where resources are generally not available for extensive instructor training or for significant additional student support, all of which also points back to the need for improved and expanded educational technology as a critical component of such redesign efforts.

Another aspect of the implementation that was not studied explicitly in this project, but that we believe matters a great deal, is the nature and level of departmental and institutional commitment to the intervention and how any one specific intervention fits within the larger institution-wide change initiative. That commitment shows up most obviously in the way that instructors are introduced to the intervention and in the level of importance departmental and institutional leaders attach to a successful outcome as it relates to the overall institutional goal of improving student success. Most of the challenges described above become much more manageable when the department chair and institutional decision makers are prepared to weigh in to help solve those problems and are able to mobilize support and buy-in throughout the institution. Moreover, there are ways, both financial and otherwise, that institutions can recognize and reward instructors and other staff who go the extra mile to make the intervention work and bring their colleagues along to embrace the changes being made.

On a related note, we believe that the common course approach, similar to the one undertaken in the ALiS study, is a promising strategy to improve student outcomes at scale without requiring major new expenditures at each institution.[57] The approach could also potentially reduce variability in course quality and delivery, standardize the student experience in a positive way, and contribute to greater coherence and rigor in high-enrollment introductory or gateway courses across a large number of institutions within a system. This approach has been found to work especially well for institutions with a large adjunct base who bring very diverse backgrounds, experiences, and motivations to their teaching.[58] The use of common courses can also help lower course development costs by preventing duplication and make the course more affordable for institutions, which, in turn, can offer learning materials to students at lower prices. But, again, in order to make this kind of approach really work, a number of important organizational features must be considered, such as the development of a robust instructor training program, facilitation of a learning community to empower and support instructors to serve as guides and resources for each other, and building extra time into faculty contracts so that they can learn new tools and prepare their courses accordingly. Also, the more effective the online training and instructor support resources are, like the CCG developed as part of this project, the more likely it is that two-year institutions and the students they serve will be able to benefit fully from such efforts. In the end, taking these kinds of redesign efforts to scale requires them to be effective across a wide variety of institutions at affordable costs to both institutions and students.

What remains to be done? We believe that the ALiS course is a good base on which to build what could be a truly effective course model in statistics for groups of two-year and four-year institutions even though a lot more work needs to be done to improve both resources and implementation. Such an effort will require true collaboration and some nontrivial investments of time on the part of both instructors and institutional and system-level leaders. The good news is that the culture is already changing in many institutions, and innovations like adaptive learning technology and learning communities can both capitalize on those changes and speed them along. With that, we end this report with three recommendations derived from our study for institutions or systems of institutions interested in undertaking a large-scale course redesign and implementation approach in an effort to improve student outcomes.

Recommendation 1: Ensure that the course redesign process is contextually and culturally sensitive to the diversity of students being served.

In the current pilot, our common course approach neither guaranteed consistent implementation nor yielded similar course outcomes for all students. As much as we continue to support the idea of a common course as a possible scalable education solution, we also believe that local adaptations that take into account students’ unique contexts, both in education and in life, are extremely important in order to design a course that deeply connects with students’ needs and goals, and integrates well with their daily lives. The variations in student experiences and outcomes, as well as in their backgrounds and circumstances, speak to the need for more robust adaptations that consider a variety of issues from the perspective of students at all levels of the course redesign process, including curation of course content and reading levels, thoughtful design of pedagogy and instructional approach, robust student support infrastructure at institutions and departments, as well as features and functionalities afforded by technology tools.[59]

Recommendation 2: Strategically align and integrate scattered efforts and supports within and across institutions to develop a culture of shared responsibility and engagement around the redesign and implementation process.

To achieve the kinds of local adaptations that we describe above to truly meet the needs of diverse students across multiple institutions, and to do so in a way that is efficient and seamless at scale, a lot more work will be needed, not just at the individual instructor and classroom level, but also at the broader departmental, institutional, and system levels. Strategic alignment and integration can help avoid duplication of efforts and ensure that the parties involved have a vested interest in seeing the work succeed. Furthermore, there has to be a system in place to support instructors in their redesign work, and to recognize them for advancing it through promotional or other award opportunities, to make the work both personally and professionally meaningful for them. Some instructors we interacted with through the pilot expressed feelings of isolation at times, especially when they were the only ones trying out the ALiS courseware at their institutions. Thoughtful integration of the redesign efforts at the institutional and system-wide levels is very important for ensuring their longevity and likely success.

Recommendation 3: Establish a central team dedicated to operationalizing and managing the redesign process, drawing on the expertise and shared leadership of faculty and others to make the process authentically inclusive.

Based on our experience with the ALiS project and what we are learning from the field,[60] we believe that this kind of holistic approach to course redesign requires a central team whose role is to manage the course redesign process in both the original design phase and during ongoing continuous improvement efforts. The dedicated central team can take responsibility for coordinating communication and collaboration among various players both within and across institutions.[61] It can also serve as a bridge builder and collaboration facilitator, helping to operationalize often complex, large-scale redesign efforts, drawing on the expertise of faculty, administrators, and other staff along the way to make important decisions for the institutions and/or system.

The key ingredients for student success, we believe, are the ones identified in the ALiS project: strategic use of innovative technology, attention to pedagogy and professional development for instructors, close collaboration among multiple stakeholders within and across institutions including both two-year and four-year institutions, and a commitment to careful assessment of results and continuous improvement. In the end, there is almost certainly no magic wand for any of this, but we continue to believe that smart and caring instructors supported by their institutions and systems, taking advantage of advancing technology, can in fact succeed in moving this particular needle.

Acknowledgements

We are grateful to the Bill & Melinda Gates Foundation for their generous support of this project.

In addition, we express our sincere thanks to the following individuals for their contribution to the project:

  • The Gates Foundation Program Officers Rebecca Hartzler, Kimberly Marshall, and Jim Ptaszynski, as well as Rahim Rajan and Nazeema Alli, for their encouragement and support at different phases of the project;
  • The ALiS Steering Committee—William Brit Kirwan, MJ Bishop, Catharine Bond Hill, and Martin Kurzweil—for their advice and guidance throughout the project;
  • Lisa S. Krueger for spearheading the faculty training and resource development processes of the project as a key member of the project management team;
  • The partners at the William E. Kirwan Center—Nancy O’Neil, Annika Many, and Stephanie Hall—for leading the institutional onboarding processes and providing on-the-ground coordination support and advice on faculty training and resource development;
  • The ALiS course design team members—Matthew Griffin, Susan Mazzullo, Sean Gruber, and Scott Wolpert of the University of Maryland, College Park; Barry Spieler and John Hamman of Montgomery College; and Murray Kimball and Michelle Neil of Acrobatiq—for spearheading the development of the project courseware and faculty training courseware and other resources;
  • Researchers at the Urban Institute—Theresa Anderson, Amanda Briggs, Semhar Gebrekristos, Alice Mei, David Blount, Alphonse Simon, and Matthew Chingos—for designing and carrying out the project evaluation;
  • Benny Johnson at Acrobatiq for offering his research expertise in the project evaluation and sharing platform data for ongoing training and evaluation purposes;
  • The lead instructors—Cristina Voisei (CCBC), Bonnie Kegan (UMBC), Matthew Griffin (UMCP), Susan Mazzullo (UMCP), Mary Lou Townsend (WWCC), Barry Spieler (MC), Jennifer Birdsell (AACC), Lance Revennaugh (FSU), Felice Shore (TU), and Jessica Adams (HCC)—for contributing their time to mentor and guide other pilot instructors and sharing their feedback with the project team to continuously improve the courseware and resources;
  • Eric Frank at Acrobatiq for his enthusiastic engagement with the project and offering reasonable courseware pricing for interested institutions in the coming years;
  • Ben Bederson and Sabrina Kramer at the University of Maryland, College Park, for their contribution during the pre-pilot phase, including the facilitation of the initial faculty learning community;
  • Ithaka S+R colleagues, Kevin Guthrie and Rayane Alamuddin, for their ongoing engagement with and advice on the project;
  • All of the project leads, learning management system administrators, data liaisons, and other instructors across the nine institutions who participated in or provided direct support to the project;
  • Finally, the over 6,000 students from the nine institutions who participated in the study and shared valuable insights about their learning experiences.

Appendix A. Project Timeline and Descriptions of Concurrent Initiatives in Maryland

The project timeline and descriptions of concurrent initiatives in Maryland are available as a separate downloadable document.

 

Appendix B. Description of the Courseware and Key Adaptive Features

The courseware provided by the Acrobatiq platform incorporates research from the learning sciences, with the aim of guiding students through the learning materials in a thoughtful fashion while using data gathered from students learning similar subjects. One of the powerful features of the platform is that it continually collects and uses student data to gain insight into the structure of knowledge and knowledge construction processes, the impact of particular activities and feedback loops on student engagement and learning, as well as techniques for improving student learning gains.

Each unit of the course has features that are designed to support students as independent learners. The main features are (those with * are newly added or enhanced for the ALiS project):

  1. Learning Objectives: Learning objectives are presented at the top of each page to help students prepare for what they are about to learn and check their understanding of the material on each page.
  2. Explanatory Content: Each page consists of short passages of text with information, examples, images, videos, and explanations to guide student learning.
  3. Learn By Doing: These activities provide students a chance to practice the concepts they are learning with helpful hints and feedback to guide them, especially when they are struggling.
  4. Did I Get This?: These activities provide students a chance to engage in “self-check” to assess their own understanding of the material before working on a graded activity.
  5. *Getting Ready: This prerequisite assignment is intended to assess students’ preparedness for the content in the course. Based on their performance, they are presented with personalized instructional pages to prepare them to be successful in future units.
  6. Before You Continue: These surveys allow students to evaluate their own understanding of the learning objectives covered in each section. Students’ responses are not graded and the results will be available for instructor review to inform classroom instruction.
  7. StatTutor: These are process-based activities that use real scenarios to assess students’ knowledge of the material using datasets and statistical tools. Their scores on StatTutor will reflect the percent completed.
  8. *Apply What You Know: These activities are adaptive and designed to present questions for individual students based on their performance in each module. Students will receive feedback as they answer these questions to help them prepare for quizzes. Students receive scores on these assignments.
  9. *Checkpoints and Quizzes: These are short assessments to inform the instructors how well their students have mastered the materials. These are scored assessments.

These features are designed to present students with formative feedback, which provides opportunities for them to test their growing knowledge and get real time feedback along the way. The majority of the built-in features are meant to provide a safe place for students to practice the concepts they are just learning, and they are purposefully sequenced to help students check their retention and discover new linkages between the materials presented throughout the course.
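To make the adaptive personalization described above more concrete, here is a minimal sketch, in Python, of how an “Apply What You Know”-style assignment might choose practice questions for an individual student. The mastery estimates, question bank, and selection rule below are hypothetical illustrations, not Acrobatiq’s actual algorithm.

# A minimal, hypothetical sketch of adaptive question selection: favor learning
# objectives where a student's estimated mastery is lowest. Illustration only;
# this is not Acrobatiq's actual algorithm.
from typing import Dict, List

def select_questions(mastery: Dict[str, float],
                     question_bank: Dict[str, List[str]],
                     n_questions: int = 5) -> List[str]:
    """Return up to n_questions question IDs, drawn first from the learning
    objectives with the lowest estimated mastery (0.0 = no evidence, 1.0 = full mastery)."""
    selected: List[str] = []
    for objective in sorted(mastery, key=mastery.get):  # weakest objectives first
        for question_id in question_bank.get(objective, []):
            if len(selected) == n_questions:
                return selected
            selected.append(question_id)
    return selected

# Hypothetical data for one student in an "Examining Distributions" module.
mastery = {"summarize_categorical": 0.9, "measures_of_center": 0.4, "histograms": 0.6}
bank = {
    "summarize_categorical": ["q1", "q2"],
    "measures_of_center": ["q3", "q4", "q5"],
    "histograms": ["q6", "q7"],
}
print(select_questions(mastery, bank, n_questions=4))  # ['q3', 'q4', 'q5', 'q6']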

Appendix C. ALiS Courseware Table of Contents and Learning Objectives

Course Units & Modules, with Learning Objectives

Unit 1: Learning Strategies and Big Picture
  • Understand the key learning science principles and course features designed to help you learn

Unit 2, Module 1: Examining Distributions
  • Understand the structure of a data set and identify different types of variables
  • Summarize the distribution of a categorical variable
  • Generate numerical measures of center and measures of spread of the distribution of a quantitative variable and use them to summarize the distribution
  • Generate graphical displays of the distribution of a quantitative variable and use them to summarize the overall pattern of the distribution

Unit 2, Module 2: Examining Relationships
  • Classify a data analysis scenario according to the "role-type classification"
  • Summarize the relationship between a categorical explanatory variable and a quantitative response variable by comparing distributions of a quantitative variable across several groups
  • Summarize the relationship between two categorical variables
  • Generate a graphical display for the relationship between two quantitative variables and use it to describe the relationship
  • Understand the role of the correlation coefficient r and its properties
  • Summarize the linear relationship using the least squares regression line
  • Recognize the distinction between association and causation and identify potential lurking variables

Unit 3, Modules 3 & 4: Sampling and Designing Studies
  • Identify the sampling method and its potential limitations
  • Identify the design and other features of a study
  • Understand how the design of a study impacts the type of conclusions that can be drawn
  • Determine how the features of a survey impact the quality of the collected data

Unit 4, Module 5: Introduction to Probability
  • Understand how probability quantifies uncertainty and relate the probability of an event to the likelihood of the event occurring
  • Understand how relative frequency can be used to estimate the probability of an event and apply this approach in practice

Unit 4, Module 6: Random Variables
  • Understand the concept of a random variable and distinguish between discrete and continuous random variables
  • Find the probability distribution function of a discrete random variable and use it to find probabilities
  • Find the mean and variance of a discrete random variable, and apply these concepts in solving real-world problems
  • Fit the binomial model when appropriate, and use it to perform probability calculations
  • Understand how a density function is used to find probabilities involving continuous random variables
  • Find probabilities associated with the normal distribution

Unit 4, Module 7: Sampling Distributions
  • Distinguish between a parameter and a statistic and recognize the concept of sampling variability
  • Determine the sampling distribution of the sample proportion and the sample mean in a given situation and apply it to determine likelihoods

Unit 5, Modules 8 & 9: Estimation & Hypothesis Testing
  • Explain how statistical inference fits into the big picture and recognize the three main forms of statistical inference
  • Determine point estimates in simple cases, and make the connection between the sampling distribution of a statistic and its properties as a point estimator
  • Calculate and interpret the confidence interval for the population mean μ, and recognize the effect of level of confidence and sample size on the precision of the interval estimation
  • Calculate and interpret the confidence interval for the population proportion p, and recognize the effect of level of confidence and sample size on the precision of the interval estimation

Unit 5, Module 9: Hypothesis Testing
  • Recognize the different steps of hypothesis testing and its logic; in particular, determine the hypotheses, interpret the p-value, and draw conclusions
  • Carry out the hypothesis test for the population proportion and draw conclusions in context
  • Carry out the hypothesis test for the population mean and draw conclusions in context
  • Determine the likelihood of making type I and type II errors, and explain how to reduce them, in context
  • Carry out a hypothesis test when you have one categorical variable from a single population to determine whether sample data are consistent with a hypothesized distribution

Unit 5, Module 10: Inference
  • Identify and distinguish among cases where independent samples, matched pairs, and ANOVA are appropriate
  • Carry out a two-sample t-test for comparing two population means when appropriate and draw meaningful conclusions
  • Carry out the paired t-test when appropriate and draw meaningful conclusions

Appendix D. Overview of Instructor Training and Pedagogical Guidance in 2017-18

Instructor Training and Resource Sharing Sessions

Learning Objectives:

Instructors will be able to…

  • Actively participate in and contribute to the project-wide virtual learning community delivered on the BaseCamp platform, as well as their local learning community
  • Deliver a course aligned with the objectives of the First in the World Maryland Mathematics Reform Initiative (University System of Maryland), to ensure transfer of credit across institutions; the ASA Guidelines for Assessment and Instruction in Statistics Education in terms of both what and how to teach introductory college level statistics; and the Charles A. Dana Center New Math Pathways model, including recommended prerequisite mathematics skills, content and design standards;
  • Understand and implement appropriate pedagogical approaches for achieving student learning outcomes;
  • Identify and implement strategies to encourage student participation in the courseware and platform;
  • Access, share, and deliver face-to-face active lessons that use technology to explore concepts and analyze data and that are tied to content and activities in the ALiS courseware delivered through the Acrobatiq adaptive learning platform;
  • Develop strategies to use student learning performance data to promote a student-centered course;
  • Use predictive learning analytics and the adaptive learning platform to create targeted instructional choices.

Sessions Offered in Fall 2017:

  1. Introductory Instructor Workshop (in-person) (May)
  2. Online Tour of the ALiS Virtual Learning Community in BaseCamp (June)
  3. Virtual Office Hours – offered weekly to answer questions about the ALiS project and the Virtual Learning Community (June-December)
  4. Virtual Office Hours and Online Discussions – offered biweekly by veteran instructors to discuss upcoming content, activities, and instructional strategies (June – December)
  5. Virtual Office Hours – Review the components of the ALiS course in Acrobatiq & Open questions (July)
  6. Webinar – ALiS Course Alignment with MMRI, ASA GAISE, and the Dana Center & Pedagogy for Success (August)
  7. Webinar – Acrobatiq Instructor Orientation (Statistics Course Overview, Platform Navigation, Course Settings, Dashboard Overview, Adaptive Features) (August)
  8. Virtual Office Hours – Student Orientation #1 – preparing students to work with Acrobatiq & Open Questions (August)
  9. Virtual Office Hours – Demonstration and Discussion of Sample Activity #1 & Student Data Collection Survey (August)
  10. Virtual Office Hours – Demonstration and Discussion of Sample Activity #2 – #5 (August)
  11. Virtual Office Hours – Course Syllabus, Pacing Guide, Table of Contents and Grading Policy (August)
  12. Virtual Office Hours – Demonstration and Discussion of Sample Activities #6 – #10 (September)
  13. Webinar – The ALiS Common Final Assessment (September)
  14. Webinar – Acrobatiq In-Depth Dashboard Training (September)
  15. Virtual Office Hours – Demonstration and Discussion of Sample Activities #11 – #15 (October)
  16. Virtual Office Hours – Demonstration and Discussion of Sample Activities #16 – #20 (October)
  17. Virtual Office Hours – Demonstration and Discussion of Sample Activities #21 – #25 (October)

 

*Note that the same sessions were offered in Spring 2018 (except for the in-person instructor workshop) through various means, including pre-recorded videos, webinars, and virtual workshops and office hours in order to accommodate instructors with varying levels of experience using the courseware and related resources (there was a good mix of both new and returning pilot instructors).

Pedagogical Guidance

To align practice across multiple instructors and institutions, implement research-based pedagogical techniques, and incorporate the lessons of earlier iterations of the ALiS course, the ALiS project has articulated a set of expectations for teaching the ALiS course and utilizing the platform and courseware.

The expectations are premised on an instructional model that emphasizes the importance of learning by doing, replacing standard lectures with active and personalized instruction aligned with the scope and sequence of the contents in the platform. The concept behind this instructional approach is that the platform can deliver the content to most of the students most of the time, allowing instructors to devote their time in class to encouraging and supporting student application of the content and intervening with students who are struggling. In this model, platform-guided instruction does not replace student-instructor engagement, but rather provides the instructor with better information about students’ needs, particularly those who are not as likely to raise their hands.

The six pedagogical expectations are:

  • The adaptive learning platform must be the primary delivery mechanism for the content. Course content will be reinforced, emphasized and explained by the instructor, but the scope and sequence of the course will be aligned with that of the ALiS courseware.
  • Students should spend most of their in-class time actively doing rather than passively listening or reading. (A suggested allocation of class time is provided below.)
  • Students should have an opportunity to learn statistical thinking and problem solving by utilizing technology to explore core concepts and analyze data.
  • Students need real-time, face-to-face and often personalized guidance and encouragement in order to take best advantage of the adaptive learning tools.
  • Instructors need to encourage and hold students accountable for engagement with the courseware and its features, monitoring their progress through the dashboards.
  • Instructors should utilize the dashboards and other feedback from the platform to inform their instruction and target help to individual students.

In alignment with these expectations, a typical ALiS class session might take place in a computer lab with clusters of workstations so that students can work in groups or individually, depending on the activities for that class period. Instructors would guide the work of students in alignment with the ALiS unit of instruction as presented on the platform and intervene directly to help students who are struggling. When questions or misconceptions affect larger groups of students, instructors would redirect and provide alternative perspectives. Instructors would also introduce group exercises or projects that provide students with an opportunity to apply statistical thinking, encouraging peer-to-peer interaction and assistance.

Suggested percentages of face-to-face class time to be spent on various activities:

  • In Class Activities (including prep/setup): 30-45%. Twenty-five activities at 35 minutes each is about 40%; a set of activities aligned with the courseware will be shared through the virtual learning community.
  • Review of Main Ideas (processing activities, summarizing discussions, reviewing for tests and quizzes): 25-30%. For example, review periods before each test and the final, plus 10 minutes after each activity and at the end of some other class days.
  • Instruction (Q&A, targeted instruction based on dashboard data): 15-20%. Short Q&A, plus 10-15 minute mini-lectures once or twice per week, perhaps as an introduction to content to be read at home.
  • Summative Assessment (tests, quizzes, etc.): 5-10%. For example, three 50-minute tests.
  • Formative Assessment (mini-assessments such as clicker questions, exit cards, minute papers): 5-10%. One 5-minute assessment activity per class session.
  • Administrative (announcements, etc.): 5%. 5-10 minutes per week as needed for pep talks, discussion of policies, grades, etc.
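
As a rough check on the arithmetic in the first item above, here is a short calculation assuming a hypothetical semester of 15 weeks with 150 minutes of scheduled class time per week; these assumptions are ours, for illustration only, and will vary by institution.

# Back-of-the-envelope check of "25 activities at 35 min is about 40%",
# under an assumed (hypothetical) semester of 15 weeks x 150 class minutes per week.
weeks = 15
minutes_per_week = 150
total_class_minutes = weeks * minutes_per_week    # 2,250 minutes of face-to-face time

activity_minutes = 25 * 35                        # 875 minutes of in-class activities
share = activity_minutes / total_class_minutes
print(f"{share:.0%}")                             # prints "39%", i.e. roughly 40%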

Appendix E. Overview of the Comprehensive Course Guide (CCG)

CCG Site Map


CCG Homepage Screenshot

 

 

Acrobatiq Instructor Training Course – Table of Contents

Unit 1: Course Introduction
  • About this Course
  • Module 1: Navigation

Unit 2: Interactive Course Elements
  • Module 2: Introduction to Course Elements
  • Module 3: Adaptives
  • Module 4: Formative Activities
  • Module 5: Statistics Courseware & Stat Tutors
  • Module 6: Summative Assessments – Checkpoints & quizzes

Unit 3: Preparing the Course
  • Module 7: Course Settings
  • Module 8: Scheduling
  • Module 9: Dashboard Prep

Unit 4: Gradebook
  • Module 10: Navigation
  • Module 11: Managing Scores

Unit 5: Preparing to Teach
  • Module 12: Student Experience
  • Module 13: Planning Your Syllabus

Unit 6: Actionable Data
  • Module 14: Learning Estimates
  • Module 15: Student Engagement

Unit 7: LMS Set-up
  • Module 16: Technical Setup (Faculty)
  • Module 17: Technical Setup (Admin)

Acrobatiq Instructor Training Course – Sample Page (Unit 6, Module 15)

Understanding Engagement Graphs

An engagement graph shows for every page of a course (x-axis) how many different students interacted with that page (y-axis). This simple concept is surprisingly powerful, enabling several actionable insights about your course. Below is a list of the five items that are captured in each engagement graph.

unlabelled image

Page Visits — The blue dots track the number of students who visited each page in the course. If a student visited a page multiple times, they are counted only once on these graphs.

Formative Practice — The red dots track the number of students who attempted the nongraded activities on the content pages themselves, labeled Learn by Doing and Did I Get This?

Adaptive Practice — The pink dots represent how many students attempted the Apply What You Know activities in each module.

Summative Assessments — The green dots represent the number of students who attempted the Quizzes and Checkpoints in the course.

Video Play — The turquoise dots represent the number of students who clicked play on the course videos.
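
To illustrate how these counts could be derived, here is a minimal sketch in Python (using pandas) that computes, for each page and event type, the number of distinct students who interacted at least once. The event-log column names and event types are hypothetical and are not the Acrobatiq platform’s actual data model.

# Minimal, hypothetical sketch of the counts behind an engagement graph,
# from an event log with one row per student interaction.
import pandas as pd

def engagement_counts(events: pd.DataFrame) -> pd.DataFrame:
    """For each page and event type, count how many distinct students interacted
    at least once (repeat interactions by the same student count only once)."""
    return (
        events
        .drop_duplicates(subset=["student_id", "page", "event_type"])
        .groupby(["page", "event_type"])["student_id"]
        .count()
        .unstack(fill_value=0)
    )

# Tiny example: three students, two pages.
events = pd.DataFrame({
    "student_id": ["s1", "s1", "s2", "s3", "s1", "s2"],
    "page":       ["p1", "p1", "p1", "p1", "p2", "p2"],
    "event_type": ["page_visit", "page_visit", "page_visit",
                   "formative", "page_visit", "summative"],
})
print(engagement_counts(events))
# Page p1: 2 distinct students visited (s1's repeat visit counts once), 1 attempted formative practice.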

Interpreting Engagement Graphs

Reading-Doing Gap — The gap is the vertical space between reading the page (blue dot) and doing the activities (red dot). All students who viewed the page are included in the blue dot, but only students who also did formative practice on that page are included in the red dot. This gap often widens as the course goes on, meaning fewer students complete the formative practice as the course progresses. This gap between reading and doing is problematic. Research shows that doing has a six times greater effect size on learning than reading alone, so if engagement with the formative practice increases, there is good reason to believe that student outcomes will improve.

unlabelled image

Within Module Streaking — Some engagement graphs will show a downward streaking pattern. This occurs when students begin the module but then drop off as the module progresses. Students then return at the beginning of the next module and begin this pattern again. The dotted vertical lines represent the start of each module.

unlabelled image

Assessment Only  –  In the example below, you can see how the green dots are above the other colors on the graphs. This happens when students only do the summative assessments and are not doing the rest of the items in the course.

unlabelled image

Ideal Engagement  – If every student completed all items in the course, you would see one solid line of color across the top. This example shows that the majority of the students in this class were completing most of the Acrobatiq course.

unlabelled image

How to Improve Student Engagement

Research out of Carnegie Mellon University’s Open Learning Initiative shows that doing has six times the effect size of reading alone. Acrobatiq’s internal research has confirmed these findings with its own course data. Increasing engagement with the formative practice activities is a key to student success.

Changes in approach can successfully improve engagement.

There are several strategies you can use to improve engagement:

  • Grading Policy — How much you weight the different components of an Acrobatiq course can have a big impact on how students engage. For example, if the only grades you count are the summative assessments, students have little incentive to complete the formative activities (a brief illustration follows this list).
  • In-Class Work — Some instructors have found it beneficial to utilize formative questions from Acrobatiq in class as a way to incentivize students to do their work. Making the connection between what is done in Acrobatiq with what is happening in class can help to increase engagement.
  • Leveraging Data — The Learning Dashboard can help you pinpoint specific students who are having trouble engaging so you are able to reach out to them directly.
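
To make the grading-policy point concrete, here is a small hypothetical comparison of two weighting schemes; the component names, weights, and scores are ours, for illustration only.

# Hypothetical illustration of how grading weights change the payoff for doing
# (or skipping) the formative and adaptive work in the courseware.
def course_grade(scores: dict, weights: dict) -> float:
    """Weighted average of component scores (all on a 0-100 scale); weights sum to 1."""
    return sum(scores[component] * weight for component, weight in weights.items())

# A student who does well on summative assessments but skips most formative/adaptive work.
scores = {"summative": 90, "formative": 20, "adaptive": 10}

summative_only = {"summative": 1.0, "formative": 0.0, "adaptive": 0.0}
balanced = {"summative": 0.6, "formative": 0.25, "adaptive": 0.15}

print(course_grade(scores, summative_only))  # 90.0 -- skipping the courseware costs nothing
print(course_grade(scores, balanced))        # 60.5 -- courseware engagement now affects the grade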

 

Pedagogical Guidance

Best Practices for the ALiS Course: Combining Adaptive Learning Technology and a Flipped Classroom Approach

This document provides an overview of the ALiS recommended pedagogy – the flipped classroom approach – which the project team has used to guide the design of a robust student-centered active learning environment that we believe will yield the most positive outcomes. The document begins with a definition of the flipped approach and what research indicates about its efficacy in “moving the needle” in terms of improving student learning outcomes. We then identify the key ingredients for success based on the lessons learned from the ongoing pilot, and end with a set of best practice recommendations for ALiS instructors that will make the AY19 pilot efforts as successful as possible.

To deliver a successful ALiS course, it is essential to pair a flipped classroom approach with adaptive learning technology. Neither is sufficient by itself. Indeed, we believe – and the preliminary data support our belief – that, when carefully aligned and coordinated, this combination can produce synergistic results in terms of improving student learning, especially for at-risk students with developmental needs in math.

What is a Flipped Approach? And Why Flip?

For the purpose of consistency, we define the flipped approach here as a particular kind of blended approach in which the course content is delivered primarily by technology and high quality feedback is provided primarily by the instructor (adapted from Margulieux, McCracken, and Catrambone, 2015). This approach relies heavily on both students and instructors playing an active role in learning and teaching. Students engage with course readings and exercises in the adaptive learning courseware prior to class, and instructors monitor students’ progress on the platform dashboard to tailor in-class instruction and activities in ways that are aimed at addressing students’ learning needs. According to a meta-analysis examining the outcomes for hybrid, blended, and flipped courses, the only type of mixed-method course that consistently improved student learning outcomes relative to a traditional approach was the flipped blended approach (Margulieux et al., 2015). Why? In this type of course, instructors can devote classroom time to concept application, problem solving, and deeper in-class discussions. With a thoughtful design and built-in support structure that adapts to students’ ongoing learning needs, flipped courses that encourage and promote active learning can be highly effective in improving how much and how well students learn.

What are the Key Ingredients for Success?

Based on our own experience with the ongoing pilot in the ALiS project and outside studies, there are four key ingredients for delivering a successful course:

1. Shift the Mindset

Because the idea behind the flipped approach is still countercultural to how universities and colleges have been delivering courses for centuries, it is important for both students and instructors to make a conscious shift in how they view learning and teaching–as well as how they view their roles as learners and teachers. In flipped courses, the instructor learns from students as much as students learn from the instructor, and this openness to learning from one another should be emphasized at the outset and reinforced throughout the semester to motivate students to assume a more active role in their learning. This, in turn, will help instructors identify areas where students need the most guidance and intervene accordingly.

2. Use the Courseware Fully

The design of the ALiS statistics course relies on the use of the adaptive learning courseware, which does a great job of providing accurate and appropriate content, frequent checkpoint quizzes, on-the-spot system-generated feedback, and personalized exercises based on how students engage and perform throughout the modules and units. Thus, it is important that the courseware is used fully and serves as the primary delivery mechanism for the course content. If the courseware is seen simply as an add-on, students may not be motivated to use it. This will result in “missing data” on student learning, and instructors will then be unable to use the dashboard data in a meaningful way. We recommend that instructors make every effort to cover the entire content of the courseware. Further, we recommend that students’ engagement with the courseware comprise a significant portion of their course grade. This will help ensure seamless alignment between students’ pre-class work and in-class activities, which should include interactions with their peers and the use of real-world data and questions.

3. Promote Active Learning

Previous research has repeatedly shown that active practice both inside and outside of the classroom has a positive impact on student learning, greater than that of other educational resources like watching videos or reading text. This is known in the research literature as the “doer effect” (Koedinger, McLaughlin, Jia, and Bier, 2016). Our analysis of the fall 2017 pilot platform data confirmed this positive causal relationship between completion of the in-unit formative practice exercises (the “Learn by Doing” and “Did I Get This?” exercises) and students’ in-course summative scores. It is important for students to engage in these “doing” exercises as much as possible both outside and inside the classroom, so they are provided with ample opportunities to develop statistical thinking.[62] Classroom practice should include opportunities for peer-to-peer and student-to-instructor interactions, and instructors should emphasize sense-making, promote student participation, and elicit and build on student mathematical thinking. Students are expected to communicate their mathematical thinking and to evaluate peers’ thinking. Instruction should focus on key mathematical and statistical ideas, and instructors should ensure that students demonstrate procedural fluency and conceptual understanding and make connections between concepts. The basic idea is that, if the platform can deliver the content and prepare students for in-depth learning, then instructors can devote class time to providing perspective and motivation for learning and to engaging students in real-world problems with timely support and feedback along the way.

4. Develop a Weekly Routine

Finally, to make this kind of flipped approach successful, a high level of coordination, organization, and time management is key. Developing a weekly routine that holds students accountable for their pre-class work and holds instructors accountable for reviewing students’ performance on the pre-class work and making appropriate adjustments to the in-class instruction is a critical first step. The diagram below shows an example of a flipped course weekly routine that helps both students and instructors embody a continuous learning cycle facilitated by the affordances of technology.[63] Although academic calendars may vary, it is important that both students and the instructor adhere to a similar weekly routine throughout the semester.

 

Best Practice Recommendations for ALiS Instructors

Here are some best practices we recommend for ALiS instructors, both outside and inside the classroom.

Outside the Classroom

  • Use the Acrobatiq platform to deliver content and provide the student accountability checks necessary to make the flipped approach successful
  • Require platform activities as a significant portion of course grades and use engagement graphs to promote maximum “doing” outside of class
  • Analyze student learning data on the dashboard on a weekly basis to identify “hot spot” learning objectives for additional in-class remediation (see the sketch below)
  • Regularly communicate with other ALiS instructors to share experiences and resources

Inside the Classroom

  • Explain the benefits and challenges of the flipped classroom approach to students at the beginning (and remind them regularly) and set clear expectations about student engagement both at the outset and throughout the semester
  • Share out-of-class data with students and help them make meaningful connections with in-class work
  • Use in-class activities to provide students with opportunities to apply concepts learned in out-of-class work in the platform and to give on-the-spot, high-quality feedback; make sure in-class activities are explicitly aligned with the learning objectives and content being covered at that time in the platform
  • Reserve enough time and space for students to work through problems with peers and the instructor
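
As an illustration of the weekly “hot spot” review mentioned above, here is a minimal sketch in Python (using pandas) that flags learning objectives where average mastery across a section falls below a threshold. The column names, mastery scale, and threshold are hypothetical, not the Acrobatiq dashboard’s actual fields.

# Minimal, hypothetical sketch of identifying "hot spot" learning objectives
# (low average mastery across a section) for additional in-class remediation.
import pandas as pd

def hot_spots(dashboard: pd.DataFrame, threshold: float = 0.6) -> pd.Series:
    """Return mean mastery per learning objective, keeping only objectives below threshold."""
    means = dashboard.groupby("learning_objective")["mastery"].mean()
    return means[means < threshold].sort_values()

# Tiny example: three students, two learning objectives.
dashboard = pd.DataFrame({
    "student_id": ["s1", "s2", "s3", "s1", "s2", "s3"],
    "learning_objective": ["sampling_distributions"] * 3 + ["confidence_intervals"] * 3,
    "mastery": [0.4, 0.5, 0.6, 0.8, 0.9, 0.7],
})
print(hot_spots(dashboard))  # flags sampling_distributions (mean 0.5); confidence_intervals is not flagged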

We recognize that everything we’ve outlined above is easier said than done. The project team will provide more robust professional development and other resources in mid-August to help ALiS instructors successfully flip their courses and take full advantage of the course design and the adaptive learning technology. We encourage all instructors to regularly communicate with the ALiS project-wide and local lead instructors and other mentors, so that we can work together to create active learning environments that are truly student-centered.

References

Koedinger, K. R., McLaughlin, E. A., Jia, J. Z., & Bier, N. L. (2016, April). Is the doer effect a causal relationship?: how can we tell and why it’s important. In Proceedings of the Sixth International Conference on Learning Analytics & Knowledge (pp. 388-397). ACM.

Margulieux, L. E., McCracken, W. M., & Catrambone, R. (2015). Mixing in class and online learning: Content meta-analysis of outcomes for hybrid, blended, and flipped courses. In Proceedings of Computer Supported Collaborative Learning. International Society of the Learning Sciences.

 

Endnotes

  1. Five two-year institutions (Anne Arundel Community College, Community College of Baltimore County, Harford Community College, Montgomery College, and Wor-Wic Community College) and four four-year institutions (Frostburg State University, Towson University, the University of Maryland – Baltimore County, and the University of Maryland – College Park) participated in the ALiS project.
  2. Note that Acrobatiq was merged into VitalSource in September 2018, https://www.insidehighered.com/digital-learning/article/2018/09/05/vitalsource-acquires-courseware-platform-acrobatiq.
  3. Gateway courses are credit-bearing first college-level courses that can apply to requirements of a degree and are designed to equip students with foundational knowledge and skills necessary to progress through their studies in a variety of fields (adapted from the definition presented in Education First & SHEEO. “Aligning Gateway College Courses,” K-12/Higher Education Alignment, Brief 5, 2015, https://education-first.com/wp-content/uploads/2015/10/05-Higher-Ed-Alignment-Brief-Gateway-Courses.pdf.)
  4. The regression model developed by researchers at the Urban Institute controlled for the following student characteristics: demographics (e.g., age, sex, ethnicity, race), first generation status, Pell grant eligibility, prior developmental education experience, prior statistics experience in high school or college, prior college performance (e.g. cumulative GPA, number of credits earned before the semester), standardized test scores (e.g. Accuplacer score, SAT/ACT percentile), employment (e.g. expecting to work more than 20 hours/week), full-time student status, and other baseline characteristics (e.g. aptitude, self-efficacy, and attitudes). See the Urban Institute research report for additional details about the research design, https://www.urban.org/research/publication/evaluation-adaptive-learning-statistics-alis.
  5. Jenna Joo, “Design Requirements for Next Generation Gateway Mathematics Courseware: A Possible Model for Scalable Implementation,” Ithaka S+R, 7 November 2019, https://sr.ithaka.org/sr-design-requirements-next-generation-mathematics-courseware-11072019/.
  6. Amanda Briggs, Theresa Anderson, Semhar Gebrekristos, Alphonse Simon, and Alice Mei, “Evaluation of Adaptive Learning in Statistics (ALiS): Testing an Online Adaptive Learning Platform at Nine Postsecondary Institutions in Maryland,” Urban Institute, 7 November 2019, https://www.urban.org/research/publication/evaluation-adaptive-learning-statistics-alis.
  7. Urban Institute Study Team, “Testing a Content-Neutral Adaptive Learning Platform in Introductory Statistics Courses: Evaluation Findings From Fall 2018,”https://www.urban.org/research/publication/evaluation-adaptive-learning-statistics-alis/evaluation-findings-fall-2018.
  8. Thomas R. Bailey, “Challenge and Opportunity: Rethinking the Role and Function of Developmental Education in Community College,” New Directions for Community Colleges, 145, 2009, https://ccrc.tc.columbia.edu/publications/challenge-and-opportunity.html; Judith S. Clayton and Olga Rodriguez, “Development, Discouragement, or Diversion? New Evidence on the Effects of College Remediation,” NBER Working Paper No. 18328, 2012, https://ccrc.tc.columbia.edu/publications/development-discouragement-diversion.html; “Developmental Education: A Barrier to Postsecondary Credential for Millions of Americans,” MDRC, 2013, https://www.mdrc.org/publication/developmental-education-barrier-postsecondary-credential-millions-americans.
  9. “The Case for Mathematics Pathways,” Charles A. Dana Center at the University of Texas at Austin, 2019, https://dcmathpathways.org/resources/case-mathematics-pathways.
  10. National Research Council 2013, “The Mathematical Sciences in 2025,” Washington DC: The National Academies Press, 2013, https://doi.org/10.17226/15269.
  11. For an overview of mathematics course redesign around co-requisite model, see Dana Center, “Co-requisite Courses: Narrowing the Gap Between Instruction and Supports,” Dana Center Mathematics Pathways, 2018, https://dcmathpathways.org/sites/default/files/resources/2018-07/Co-req_Supports_2018_07_24.pdf.
  12. The Carnegie Foundation for the Advancement of Teaching has also promoted the use of Statway, which combines remedial mathematics with introductory statistics. See https://carnegiemathpathways.org/statway/.
  13. Paul Fain, “Faster Math Path,” Inside Higher Ed News, October 21, 2013, https://www.insidehighered.com/news/2013/10/21/california-community-colleges-cautious-experiment-accelerated-remediation.
  14. Alexandra W. Logue, Mari Watanabe-Rose, and Daniel Douglas, “Should Students Assessed as Needing Remedial Mathematics Take College-Level Quantitative Courses Instead? Randomized Controlled Trial,” Educational Evaluation and Policy Analysis (2016), https://doi.org/10.3102/0162373716649056.
  15. Alexandra W. Logue, “The Extensive Evidence of Co-Requisite Remediation’s Effectiveness,” Inside Higher Ed, July 17, 2018, https://www.insidehighered.com/views/2018/07/17/data-already-tell-us-how-effective-co-requisite-education-opinion.
  16. See Ithaka S+R’s 2015 overview paper on the diverse and rapidly growing market of adaptive learning solutions: Jessie Brown, “Personalizing Post-Secondary Education: An Overview of Adaptive Learning Solutions for Higher Education,” Ithaka S+R, Last Modified 18 March 2015, https://doi.org/10.18665/sr.221030. Also see Nazeema Alli, Rahim Rajan, and Greg Ratliff, “How Personalized Learning Unlocks Student Success,” EDUCAUSE Review, March/April 2016, p.12-21, https://er.educause.edu/~/media/files/articles/2016/3/marapr16erfullissue.pdf?la=en.
  17. Patsy Moskal, Don Carter, and Dale Johnson, “7 Things You Should Know About Adaptive Learning,” EDUCAUSE Learning Initiative (ELI), January 4, 2017, https://library.educause.edu/resources/2017/1/7-things-you-should-know-about-adaptive-learning.
  18. Ibid.
  19. Association of Public & Land-Grant Universities, “A Guide for Implementing Adaptive Courseware: From Planning Through Scaling,” October, 2018, p. 1, https://www.aplu.org/library/a-guide-for-implementing-adaptive-courseware-from-planning-through-scaling/file.
  20. Ibid, p. 2.
  21. Louise Yarnall, Barbara Means, and Tallie Wetzel, “Lessons Learned from Early Implementations of Adaptive Courseware,” SRI International, 2016, https://www.sri.com/sites/default/files/brochures/almap_final_report.pdf.
  22. What is ALEKS? https://www.aleks.com/about_aleks.
  23. “Adaptive Learning Technology Pilot Report,” University of California, December, 2016, https://www.ucop.edu/institutional-research-academic-planning/_files/BFI-Adaptive-Learning-Technology-Report.pdf.
  24. It is worthwhile to mention that the University System of Maryland (USM) includes the state’s public four-year institutions but not its public two-year institutions. The ALiS project aimed to complement many of the ongoing statewide efforts in Maryland to implement and evaluate innovations across both two-year and four-year institutions.
  25. For more information on Acrobatiq courseware, visit http://acrobatiq.com/products/adaptive-learning-2/.
  26. Josh Logue, “Pushing New Math Paths,” Inside Higher Ed News, April 21, 2016, https://www.insidehighered.com/news/2016/04/21/tpsemath-working-reform-math-education.
  27. William G. Bowen, Kelly A. Lack, Matthew Chingos, and Thomas I. Nygren, “Interactive Learning Online at Public Universities: Evidence from Randomized Trials,” Ithaka S+R, Last Modified 22 May 2012, https://doi.org/10.18665/sr.22464.
  28. Matthew Chingos, Christine Mulhern, Rebecca J. Griffiths, and Richard R. Spies, “Interactive Online Learning on Campus: Testing MOOCs and Other Platforms in Hybrid Formats in the University System of Maryland,” Ithaka S+R, Last Modified 10 July 2014, https://doi.org/10.18665/sr.22522.
  29. Rebecca J. Griffiths, Matthew Chingos, and Christine Mulhern, “Can Online Learning Improve College Math Readiness? Randomized Trials Using Pearson’s MyFoundationsLab in Summer Bridge Programs,” Ithaka S+R, Last Modified 14 December 2015, https://doi.org/10.18665/sr.275477.
  30. The description of the foundational OLI statistics course, along with its learning objectives, course outlines and details can be found in the OLI website, https://oli.cmu.edu/?s=statistics&post_type=product.
  31. Jeffrey R. Young, “Inside an Adaptive-Courseware Experiment, Glitches and All,” EdSurge News, February 7, 2017, https://www.edsurge.com/news/2017-02-07-inside-an-adaptive-courseware-experiment-glitches-and-all.
  32. The Urban Institute research team took a mixed methods approach to evaluate the outcome of the pre-pilot offerings, and collected data from multiple sources, including the baseline and end of semester surveys, common final exams, student-level data from institutions/departments, instructor interviews and surveys, administrator interviews, student focus groups, classroom observations, and student engagement data from the Acrobatiq platform.
  33. These standards include American Statistical Association’s Guidelines for Assessment and Instruction in Statistics Education (GAISE) as well as those developed by the Maryland Mathematics Reform Initiative (MMRI).
  34. As we will discuss later, removing these modules from the existing course was not an easy undertaking due to the interconnected nature of these modules.
  35. We took the principles of the co-requisite model (e.g. strategies to support students as learners are integrated into courses) and those from the latest learning sciences research (e.g. the capacity to assess learning; the adjustment of presentations of content in relation to knowledge of learners) into consideration when developing this module. For more information about the co-requisite model, see the Dana Center’s Math Pathways model (https://dcmathpathways.org/dcmp/dcmp-model). For more information about courseware design features based on the latest learning science research, see the Courseware-in-Context framework (https://go.edsurge.com/rs/590-LFO-179/images/3_FRAMEWORK_How_to_Use_the_CWiC_Framework.pdf).
  36. The following reference document was used to develop the content for this new adaptive module: Roxy Peck, Rob Gould, Jessica Utts, “Mathematics Foundations for Success in Introductory Statistics,” Charles A. Dana Center, August, 2019, https://dcmathpathways.org/sites/default/files/resources/2019-08/Mathematics_Foundations_for_Success_in_Introductory_Statistics_20190809.pdf.
  37. Note that this plan later served as a basis for the comprehensive online instructor training course developed in the summer of 2018.
  38. More explicit guidance around the use of a flipped classroom approach was provided for the extended pilot in 2018-19. See pedagogical guidance document included in Appendix E for more details.
  39. Lauren E. Margulieux, Michael McCracken, and Richard Catrambone, “Mixing In-Class and Online Learning: Content Meta-Analysis of Outcomes for Hybrid, Blended, and Flipped Courses,” CSCL 2015 Proceedings. https://pdfs.semanticscholar.org/282a/4fba9f6edb38727d5d00cb4769b6c4aa5e33.pdf?_ga=2.60248557.1330057319.1568197642-1039987030.1565098129.
  40. For definitions of these “mixed-method” courses, see Lauren E. Margulieux, Keith R. Bujack, W. Michael McCracken, and David Majerich, “Hybrid, Blended, Flipped, and Inverted: Defining Terms in a Two Dimensional Taxonomy,” Hawaii International Conference 2014 Proceedings, https://files.eric.ed.gov/fulltext/EJ1089335.pdf.
  41. Ibid.
  42. The five 2-year institutions were Anne Arundel Community College, Community College of Baltimore County, Harford Community College, Montgomery College, and Wor-Wic Community College. Note that Wor-Wic Community College was not included in the fall 2017 study because it only piloted the courseware in a small honors section without a matching comparison section. The three 4-year institutions were Frostburg State University, University of Maryland, Baltimore County, and University of Maryland, College Park.
  43. Towson University.
  44. Throughout this analysis, the subgroups analyzed were Pell-eligible students versus those not eligible for Pell grants, first generation college students versus those who were not first gen, and students with previous experience with developmental education coursework versus those without any such experience. In each case, the question asked was whether or not the ALiS intervention produced greater positive impacts for subgroup members than the students not in those subgroups. In other words, the question was whether or not the ALiS intervention reduced the current gaps for students in those subgroups.
  45. Note that these questions were only asked in the spring 2018 instructor surveys.
  46. The five institutions that continued to participate in 2018-19 include Community College of Baltimore County, Montgomery College, University of Maryland, Baltimore County, University of Maryland, College Park, and Wor-Wic Community College.
  47. Note that the research design in fall 2018 was less rigorous than the earlier pilot in that it lacked the matched-pair section requirement, which was an important part of the research protocol for the 2017-18 pilot. Many instructors who taught matched-pair sections (both pilot and traditional sections) in 2017-18 noted that switching from one method of teaching to another was difficult and time-consuming. As a result, the project team eased up on this requirement in 2018-19. None of the compared pilot-traditional sections in the fall 2018 evaluation were taught by the same instructors, and therefore, the findings from fall 2018 are not directly comparable to the findings from the 2017-18 pilot. See the Urban Institute’s fall 2018 results slide deck for additional details, https://www.urban.org/research/publication/evaluation-adaptive-learning-statistics-alis/evaluation-findings-fall-2018.
  48. Results from the spring 2018 evaluation were shared with the lead instructors at the five institutions for internal discussion and reflection purposes. No specific findings are included in this report, but we do bring in some insights gleaned from our conversations with instructors in this section and in our final comments and reflections that appear in the Conclusion.
  49. Unfortunately, since there were no well-matched comparison sections in fall 2018 (as noted in footnote 47), the Urban Institute evaluation team was not able to disaggregate these results by individual institution as it did in the 2017-18 study.
  50. Note that the breakdowns of dashboard usage by institution type (two-year vs. four-year institutions) are not provided for fall 2018 because there were only two instructors in the four-year institutions teaching pilot sections.
  51. The Acrobatiq team generated student engagement graphs and shared them with the project team regularly to help guide our ongoing discussion with the lead instructors and others at the participating institutions. The engagement graphs produced at the individual section level showed a wide variation in student engagement patterns throughout the semesters. See Appendix E for sample engagement graphs.
  52. These four institutions are Montgomery College (MC), the University of Maryland, College Park (UMCP), the University of Maryland, Baltimore County (UMBC), and Wor-Wic Community College (WWCC).
  53. In the 2017-18 ALiS pilot, compared to their peers at four-year institutions, students at two-year institutions were more likely to be married with children (15 percent vs. 2 percent), working more than 20 hours per week (54 percent vs. 19 percent), and attending school part-time (43 percent vs. 4 percent).
  54. Melissa Blankstein, Christine Wolff-Eisenberg, and Braddlee, “Student Needs Are Academic Needs: Community College Libraries and Academic Support for Student Success,” Ithaka S+R, Last Modified 30 September 2019, https://doi.org/10.18665/sr.311913.
  55. For example, see Jennifer Gonzalez, “Modifying the Flipped Classroom: The ‘In-Class’ Version,” Edutopia blog post, March 24, 2014, https://www.edutopia.org/blog/flipped-classroom-in-class-version-jennifer-gonzalez.
  56. Allison Bailey, Nithya Vaduganathan, Tyce Henry, Renee Laverdiere, and Lou Pugliese, “Making Digital Learning Work: Successful Strategies from Six Leading Universities and Community Colleges,” The Boston Consulting Group, March 2018, pp. 33-34, https://edplus.asu.edu/sites/default/files/BCG-Making-Digital-Learning-Work-Apr-2018%20.pdf.
  57. One of the important goals of the project was to develop the resources needed to enable diverse institutions to work together to improve student engagement and success without increasing costs for either the institutions or the students. We approached this primarily by standardizing the courseware and supporting resources as much as possible and delivering instructor training and support virtually; hence the emphasis on online training and as-needed support through the CCG and the lead instructors. For this pilot, we focused our evaluation efforts mainly on student performance and satisfaction and did not undertake extensive cost analyses. We believe that cost should be part of the assessment process going forward, so that strategies for containing costs are evaluated alongside student performance and satisfaction.
  58. Allison Bailey et al., “Making Digital Learning Work: Successful Strategies from Six Leading Universities and Community Colleges,” p. 33.
  59. In a recent report, Luminary Labs drew lessons from a three-year U.S. Department of Education initiative aimed at better equipping instructors with techniques, tools, and open educational resources to teach adult learners the advanced math skills needed for modern jobs. The authors noted that the educational technology tools currently available in the market are not designed with adult learners in mind, many of whom are dealing with “adult circumstances,” such as balancing class attendance with work and family responsibilities. Read more: Luminary Labs, “Changing the Equation: Empowering Adult Learners with Edtech,” Power in Numbers: Advancing Math for Adult Learners, September 2019, https://lincs.ed.gov/professional-development/resource-collections/profile-1163.
  60. See pp. 31-32 in Allison Bailey, Nithya Vaduganathan, Tyce Henry, Renee Laverdiere, and Lou Pugliese, “Making Digital Learning Work: Successful Strategies from Six Leading Universities and Community Colleges,” for examples of how some institutions have organized central teams for online programs to build the capabilities and expertise needed to design for quality.
  61. See a section about establishing a backbone function in a collaboration playbook developed by Ithaka S+R and Jeff Selingo: Jenna Joo, Jeff Selingo, and Rayane Alamuddin. “Unlocking the Power of Collaboration: How to Develop a Successful Collaborative Network in and around Higher Education.” Ithaka S+R. Last Modified 17 October 2019. https://doi.org/10.18665/sr.312001.
  62. The American Statistical Association’s 2016 Guidelines for Assessment and Instruction in Statistics Education College Report (GAISE) recommends both what to teach in an introductory college-level statistics course and how to teach it. According to the report, such a course should teach statistical thinking as an investigative process, focus on conceptual understanding, integrate real data, foster active learning, use technology to explore concepts and analyze data, and use assessments to improve and evaluate student learning. All of these recommendations were considered in the design of the ALiS course.
  63. This example was adapted from the flipped classroom pedagogy model developed by Professor Stephen Lu at the University of Southern California. You can read more about Professor Lu’s iPodia program here: http://ipodia.usc.edu/pedagogy/.