Scott Wolpert is a professor of mathematics at the University of Maryland, College Park (UMCP). Professor Wolpert has been part of the ALiS project since the pre-pilot phase in 2016, providing leadership to help shape the direction of the project in the areas of course design, pedagogy, and faculty professional development. He was the chair of the mathematics department at his institution during the ALiS project.
What was the main motivation for you and your institution to participate in this project?
The project offered the opportunity to redesign our largest-enrollment course, elementary statistics, and to work with several Maryland community colleges. From our perspective, elementary statistics is one of the courses where student outcomes differ considerably from our expectations. After consulting with colleagues at Montgomery College, we decided that elementary statistics was the best fit for the project.
Looking at the project as a whole, what was the most important finding or lesson learned for you and your institution?
For us, the most important finding was that the breakdown of time among in-class, in-lab, and out-of-class work is an important factor in student learning. For reasons of scale, we routinely teach elementary statistics in large lectures of up to 300 students, with direct instruction as the primary course delivery approach. The project prompted us to experiment with different approaches, such as reduced lecture time with increased lab time, partially flipped classrooms, open labs, breakout sessions within large lectures, and peer mentors embedded in large lectures to make the learning experience more engaging for students. Based on exam scores and instructors’ experience, we judged that large lectures with peer mentors and breakout sessions were the most effective.
What did you personally find to be the most surprising finding or conclusion?
The most surprising finding for us is the one above: that the approach to teaching and learning is a primary consideration in mathematics instruction. We were also surprised by the difference in measured learning between students at two-year and four-year institutions. I originally expected that the platform would especially help with the wide range of preparation among students at two-year schools.
What was the most challenging part of the project experience for you personally?
By far the most challenging part was that we did not get to the stage of fully utilizing the student learning analytics provided in the platform. I believe the individual student data tools present opportunities that have not yet been fully tapped. The data can cue re-teaching lesson segments and engaging groups of students on individual topics. The data are also immediately helpful when a student asks, “What should I focus my studying on?”
What do you think should happen next at your institution (and beyond) to build on this work?
We are now working to use peer mentors effectively in our large lectures. We may consider expanding peer mentor use to our sophomore differential equations classes, and we are exploring the use of smart platforms and online content in some of our other large-enrollment courses. I would like to see us in another joint project with the Maryland community colleges; there has been positive “spillover” from our working together. While collaborating on the project, we have occasionally taken time out to discuss other courses and transfer issues.