One thing that appeals to me about working for Noel-Levitz is our emphasis on data-informed decision making. Especially in the current economic climate, we need data to direct us to the solutions that have the greatest potential to be cost-effective. Whether it’s recruitment, retention, or the improvement of instruction, we can’t try everything, and we certainly can’t build a program for every issue or unique problem that pops up.
Predictive models are one data tool that can greatly support strong enrollment management programs. Increasingly, I see campuses turning to predictive modeling to support student retention, and Noel-Levitz's research reflects my experience. Our forthcoming 2011 Student Retention Practices Report shows that 50 percent of four-year institutions and 30 percent of two-year institutions are already using retention modeling. Furthermore, of those using this method, nearly 50 percent of two-year publics, nearly 60 percent of four-year publics, and 65 percent of four-year privates rated retention modeling “very effective” or “somewhat effective.”
How can this process strengthen student success? Let me illustrate using the Noel-Levitz predictive modeling tool, the Student Retention Predictor (SRP), as an example.
First, predictive modeling for retention can help campuses customize communication flows to new students. The SRP assigns each student a score from 0 to 1, based on the historical student retention behaviors unique to that campus. Typically, students with model scores above .75 are the most likely to be retained; those with scores below .25 are the least likely. Armed with this information, we can customize our enrolled-student communication flow, increasing outreach to students with lower scores and being less intrusive with students with higher scores.
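To make the segmentation idea concrete, here is a minimal sketch in Python. It assumes each student already has an SRP-style model score between 0 and 1; the tier names and the exact cutoffs (.25 and .75, the typical values mentioned above) are illustrative, not part of the actual tool.

```python
# Illustrative sketch only: the SRP's scoring is proprietary. We assume each
# student already has a model score between 0 and 1 and bucket students into
# outreach tiers using the typical thresholds described in the text.

def outreach_tier(score):
    """Map a retention model score (0-1) to a communication tier."""
    if not 0.0 <= score <= 1.0:
        raise ValueError("model scores fall between 0 and 1")
    if score < 0.25:
        return "high-touch"    # least likely to retain: most outreach
    elif score > 0.75:
        return "light-touch"   # most likely to retain: less intrusive
    else:
        return "standard"

# Example: segment a small (hypothetical) incoming class
students = {"A100": 0.12, "A101": 0.55, "A102": 0.91}
tiers = {sid: outreach_tier(s) for sid, s in students.items()}
```

On a real campus the tiers would feed directly into the communication plan, so each student receives the level of contact their score suggests they need.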
Second, the SRP produces a model that shows what characteristics, functioning together, are most predictive of retention. Campus models typically have four to six variables. For each variable the tool gives us a risk threshold. For example, I have seen high school GPA as a model variable many times. The GPA risk threshold ranges from around 2.7 to above 3.2. Students at the risk threshold typically retain at the same rate as their overall cohort. Knowing the main variables in the model and their risk thresholds can help campuses direct their programming where they can have the greatest impact.
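The risk-threshold idea above can be sketched the same way. In this hypothetical example, the 3.0 GPA cutoff is simply a placeholder within the 2.7-to-3.2 range I mentioned; an actual campus would use the threshold its own model produces.

```python
# Illustrative only: a "risk threshold" is the value at which students retain
# at roughly the same rate as their cohort. The 3.0 cutoff below is a
# hypothetical value within the 2.7-3.2 range mentioned in the text.

GPA_RISK_THRESHOLD = 3.0  # hypothetical, campus-specific value

def flag_for_support(hs_gpa, threshold=GPA_RISK_THRESHOLD):
    """Flag students whose high school GPA falls below the risk threshold."""
    return hs_gpa < threshold

# Example: identify students to target with academic-support programming
cohort = {"B200": 2.5, "B201": 3.4, "B202": 2.9}
flagged = [sid for sid, gpa in cohort.items() if flag_for_support(gpa)]
```

The same flagging logic would be repeated for each of the four to six variables in a campus's model, with each variable's own threshold.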
Let me share with you a few examples of how some of the campuses I work with have used predictive modeling for retention. At one school, the ACT math subscore was one of the six variables in the model. This finding led us to a focused exploration of math placement and curriculum measures. Before the SRP, this campus allowed students to start the math curriculum with little regard for their prior math experience. Within a year they developed a math placement system and piloted a new entry-level math course for students who needed skill development. When we remodeled two years later, ACT math was no longer part of the predictive model, and students were experiencing much higher success levels in quantitative courses.
On another campus, the predictive model revealed an instructive pattern based on the distance from the students’ home addresses to campus. Students basically fell into one of four zones. Students in zone one, the zone surrounding the college, retained at a high rate, as did those in zone three. Students in zone two, the next closest to campus, retained at a lower rate, as did those from the farthest away in zone four. Again, this helped us focus on students whose level of risk might have otherwise gone undetected. After some additional analysis, our intervention on this campus focused on improving access to campus work opportunities for students in zone two, those who lived close enough to go home on the weekends to work, but not so close that they could spend weekend nights on campus. Also, for students in zones two and four, we ramped up our outreach related to co-curricular activities.
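The zone segmentation above can be sketched as follows. The four zones and the pattern (zones two and four retaining at lower rates) come from the campus example; the mileage boundaries here are hypothetical placeholders, since the actual bands were derived from that campus's own data.

```python
# Sketch of the distance-zone segmentation described above. The mileage
# boundaries are hypothetical; only the four-zone structure and the
# "zones 2 and 4 are at risk" pattern come from the campus example.

def distance_zone(miles_from_campus):
    """Assign a student to a distance zone (boundaries are illustrative)."""
    if miles_from_campus < 10:
        return 1   # surrounding the college: retained at a high rate
    elif miles_from_campus < 60:
        return 2   # close enough to go home on weekends: lower retention
    elif miles_from_campus < 200:
        return 3   # retained at a high rate
    else:
        return 4   # farthest from campus: lower retention

# Flag zones 2 and 4 for extra outreach (campus work, co-curricular contact)
distances = [5, 40, 120, 350]
at_risk_zones = [z for z in (distance_zone(m) for m in distances) if z in (2, 4)]
```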
One of my favorite things about predictive modeling is that we often see a variable that surprises us at first glance but makes sense as we deepen our analysis. That was the case with both of the campuses and variables I described above. While many academic and financial variables come as no surprise, they still provide compelling evidence of the need for resources to address these student risk factors. That may mean direct support for academic skill development or more indirect programming in financial literacy to help students make better financial choices.
Whether predictive variables are a surprise or not, knowing what they are allows us to focus more strategically on what matters to students and contributes to their success on our campuses.
If you have any questions about predictive modeling for student retention, or other questions about using data to guide student success initiatives, please e-mail me or leave a comment below. With the current school year ending, now’s the time to look ahead and hone your retention strategies for next year’s class.