The post from which the snip below is taken caused me to think back to 1996, when I did a study about retention rates at Illinois. I had 5 years of data on all courses offered, with both the ten-day enrollment (after the add-drop period had ended) and the final course enrollment. I looked at the ratio of those two numbers, focusing only on undergraduate courses that had 10 or more students in the ten-day count. Students who add with the permission of the instructor after day ten would make that ratio look a little higher; I simply assumed there weren't too many in that category.
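For concreteness, here is a minimal sketch of that calculation, assuming a hypothetical table of course offerings with columns for department, course number, level, ten-day enrollment, and final enrollment. The file and column names are mine for illustration, not from the original study.

```python
import pandas as pd

# Hypothetical data: one row per course offering, with columns
# department, course_number, level, ten_day_enrollment, final_enrollment.
courses = pd.read_csv("enrollments.csv")

# Keep only undergraduate courses with 10 or more students at the ten-day count.
undergrad = courses[(courses["level"] == "undergraduate") &
                    (courses["ten_day_enrollment"] >= 10)].copy()

# Retention ratio: final enrollment divided by ten-day enrollment.
# Late adds with instructor permission would push this slightly above
# its true value, as noted above.
undergrad["retention_ratio"] = (undergrad["final_enrollment"] /
                                undergrad["ten_day_enrollment"])
```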
The finding was that in non-engineering courses that ratio was so high (typically 95% or higher) that it could hardly be improved upon by an outside intervention, such as introducing an online learning component to the class. Even in engineering, the average was something like 91% or 92%. There is a lot of churn in those first ten days, but not a lot afterward, at least at Illinois from fall 1991 through spring 1996.
This study was done in the dark ages before there were ERP systems. Now we have those. I wonder why data like this isn't made publicly available, if not at the course level then aggregated a bit higher (say, all 100-level courses offered in a department lumped together, so as not to cast aspersions on a particular instructor with a lower ratio). If some facts about retention rates at particular institutions were more widely known, the discussion about whether they should be targets of improvement would be a more informed one.
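Continuing the sketch above, that sort of department-level aggregation might look like this, again with the same hypothetical column names.

```python
# Sum enrollments over all 100-level courses in each department, so no
# individual instructor's ratio is exposed; then compute the department ratio.
level_100 = undergrad[undergrad["course_number"].between(100, 199)]

dept_totals = level_100.groupby("department")[
    ["ten_day_enrollment", "final_enrollment"]].sum()
dept_totals["retention_ratio"] = (dept_totals["final_enrollment"] /
                                  dept_totals["ten_day_enrollment"])
```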
Maybe someday we might even ask: if a student survived the course and got a decent grade in it, did he or she actually learn something that will stay learned? That's the retention rate we should care about.