The trouble with being an old timer is that you remember when. Before we had Banner, hard copy rosters were put into our departmental mailboxes around the first day of class. Thereafter, for the next 10 days of the semester, we would get manila cards on which the adds and drops were printed.
One likes to think of technological progress as moving forward, but in reality it is more two steps forward and one step back, and sometimes it's the two steps that are in the wrong direction. I know something about electronic rosters from an administrative perspective. As Assistant CIO for Educational Technology, I was the one who informed faculty that we were canceling the Campus Gradebook service. I got a lot of grief for that. The users were quite wedded to it. As a grade book, it did several things well that the current electronic grade books I know about don't do. It categorized grades - homework, term papers, exams - and could present the data in either a disaggregated or an aggregated view. It handled TA access to the data the way you would want: TAs could only see grades for the section they were responsible for. And because it kept data in a file rather than in a database, instructors would download the file, do all their data entry fairly rapidly on the downloaded copy, and upload it back when they were done. That was much faster than having to wait for a screen refresh after each entry.
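I don't have the Campus Gradebook code, so the following is only a hypothetical sketch of the design just described - grades grouped by category with disaggregated and aggregated views, and TA access scoped to a single section. All the names here are mine, for illustration only.

```python
from collections import defaultdict

class GradeBook:
    """Hypothetical sketch of a categorized grade book with section-scoped TA access."""

    def __init__(self):
        # grades[student][category] -> list of individual scores
        self._grades = defaultdict(lambda: defaultdict(list))
        self._section = {}  # student -> section id

    def record(self, student, section, category, score):
        self._section[student] = section
        self._grades[student][category].append(score)

    def disaggregated(self, student):
        """Per-category view: every individual score, kept separate."""
        return {cat: list(scores) for cat, scores in self._grades[student].items()}

    def aggregated(self, student):
        """Summary view: one total per category."""
        return {cat: sum(scores) for cat, scores in self._grades[student].items()}

    def section_view(self, ta_section):
        """A TA sees only students in the section they are responsible for."""
        return {s: self.aggregated(s)
                for s, sec in self._section.items() if sec == ta_section}

gb = GradeBook()
gb.record("student1", "AB1", "homework", 9)
gb.record("student1", "AB1", "homework", 8)
gb.record("student1", "AB1", "exams", 85)
gb.record("student2", "AB2", "homework", 7)

print(gb.aggregated("student1"))   # {'homework': 17, 'exams': 85}
print(gb.section_view("AB1"))      # student1 only; student2 is in another section
```

The file-based workflow the post describes would amount to serializing this structure out, editing it locally, and loading it back - no per-entry round trip to a server.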
But the truth is that Campus Gradebook was so valued because it provided an electronic roster service. Though that was not its intended function, it was the capability instructors valued most. At the time, there weren't good alternatives. (For the course management systems we were then doing weekly roster updates, which is just not timely enough to be useful.) Ending the Campus Gradebook service was the correct decision. Only one person understood the code; it didn't work with Macs; integrating it with Banner would have been a bear; and we were consolidating services onto an enterprise LMS. These are all sensible reasons, but what the users found in the sequel was worse than what they had before. So I got a piece of their mind from quite a few of them, and I felt like I was walking around with a "kick me" sign on my back.
This gets me to where we are now. As an instructor, I now have access to three electronic roster services. One is Banner, which I believe gives real-time access to the data, but which doesn't allow downloading the roster into a spreadsheet - in my view a fatal flaw. Banner offers two views of the roster: a summary view with a maximum of 50 entries per page (another pain), and a detail view that includes student rank and major. Partly to alleviate these issues with Banner, the campus developed its own roster service, run by the Division of Management Information. The third service is provided by the LMS. This semester I'm using Moodle run by ATLAS, but the same would be true if I were using Illinois Compass. Both DMI and the LMS get daily roster updates from the data warehouse, not directly from Banner.
None of the three alternatives provides an electronic equivalent to the manila cards we used to get. I wonder why. For an instructor to track adds and drops, one must compare a roster downloaded at one time to a roster downloaded earlier. This morning (a Sunday), I downloaded a roster from Moodle and again from DMI. They were not the same. (After comparing the DMI roster to Banner and seeing they were identical, I concluded that ATLAS doesn't update rosters over the weekend. I'm not saying it should. I'm just trying to indicate the instructor issues with the current arrangement.)
I actually am using all three services because each offers a function the others don't. When a student emails me that they've just added, I need to verify that before I respond. For that, Banner with its real-time data is best. I use the Moodle roster to find out whether the student can access the class site, since my response will depend on that. And the DMI roster has the full demographic information in a form I can categorize, which makes it useful to me. But the situation is much more complex than it should be, and a conscientious instructor who is trying to keep up with the adds and drops has to put in an inordinate amount of effort to do so.
My own preferred solution is simple - we have a reading day, so why not an add/drop day? - and then we can get on with it. Recognizing that this is unlikely to happen, I wonder whether we could develop an electronic manila card function. Instructors would value it.
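The electronic manila card could amount to little more than a set difference between two roster snapshots - yesterday's download versus today's. Here is a minimal sketch of that idea; the column name "netid" is my assumption, and any stable student identifier in the downloaded roster would do.

```python
import csv
from io import StringIO

def roster_ids(csv_text, key="netid"):
    """Read a roster CSV and return the set of student identifiers."""
    return {row[key] for row in csv.DictReader(StringIO(csv_text))}

def adds_and_drops(old_csv, new_csv):
    """Compare two roster snapshots; return (adds, drops) as sorted lists."""
    old, new = roster_ids(old_csv), roster_ids(new_csv)
    return sorted(new - old), sorted(old - new)

# Two hypothetical downloads, a day apart.
yesterday = "netid,name\naaa1,Student One\nbbb2,Student Two\n"
today     = "netid,name\nbbb2,Student Two\nccc3,Student Three\n"

adds, drops = adds_and_drops(yesterday, today)
print("added:", adds)      # added: ['ccc3']
print("dropped:", drops)   # dropped: ['aaa1']
```

A roster service that keeps its own daily snapshots could run this comparison server-side and simply email instructors the result - which is essentially what the manila cards did.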
Let me conclude with this, because there is a lot of discussion of consolidation and centralization of IT services, wasteful duplication, and the like. There is the further issue of changing business practices to accommodate what the technology actually does. With instruction, however, bending practice to fit the technology would be a disaster. As Denny Kane explained to me some time ago, Banner takes the section as its base unit. We on campus, however, think of the base unit as the course, which is defined by the instructor teaching it and the meeting time, not by rubric and number. I am teaching two courses this semester, Econ 302 and Econ 490. In the timetable there are nine sections of Econ 302, each an independent course (with some instructors teaching more than one section). In Econ 490, I have two sections (a separate grad section from the undergrad section). There is another 490 course on a different topic taught by a different instructor, who also has two sections.
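The mismatch can be made concrete with a small sketch. Banner stores sections; the campus notion of a course falls out only after grouping sections by rubric, number, and instructor. The field names and CRNs below are invented for illustration, and the instructor labels are placeholders, not real people.

```python
from collections import defaultdict

# Banner's base unit: the section. (Only a few of the nine Econ 302
# sections are shown; CRNs and instructor labels are made up.)
sections = [
    {"rubric": "ECON", "number": "302", "crn": "30101", "instructor": "A"},
    {"rubric": "ECON", "number": "302", "crn": "30102", "instructor": "B"},
    {"rubric": "ECON", "number": "490", "crn": "49001", "instructor": "A"},  # undergrad section
    {"rubric": "ECON", "number": "490", "crn": "49002", "instructor": "A"},  # grad section
    {"rubric": "ECON", "number": "490", "crn": "49003", "instructor": "C"},  # different topic
    {"rubric": "ECON", "number": "490", "crn": "49004", "instructor": "C"},
]

# The campus notion of a course: same rubric and number AND same instructor.
# Two Econ 490 courses on different topics share a rubric and number but
# are distinct courses because the instructors differ.
courses = defaultdict(list)
for sec in sections:
    key = (sec["rubric"], sec["number"], sec["instructor"])
    courses[key].append(sec["crn"])

for (rubric, number, instructor), crns in sorted(courses.items()):
    print(f"{rubric} {number} (instructor {instructor}): sections {crns}")
```

A roster system built on this grouping would show an instructor one course-level roster spanning their sections, instead of forcing them to stitch section rosters together by hand.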
An electronic roster system must accommodate the actual practice. Right now we're only getting a first-order approximation.