I was a seemingly well-adjusted kid, but then around age 14 or 15 the wheels began to come off and I lost my personal sense of gravity. The issues revolved around confidence, control, and motivation. And related to that was the role of school. Was my self-worth tied up in performance there or not? And, of course, there was the issue of whether my parents should make decisions on my behalf or if I could make my own mistakes. I went to Northwestern for grad school in part to get sufficiently far away from New York so that the ties with my parents would be cut completely. (My realistic alternative was Rochester, which had also offered a fellowship, but that was too close to Cornell where I did my undergrad and not a new enough adventure. I had wanted to go to Berkeley, but didn't get money from them.) It was not until my mid-twenties that I learned to feel comfortable inside my own skin. And then my mother had her first hip replacement and all of a sudden I wanted to live closer to my parents. But that is a different story.
I would characterize my approach to life in general and to my job more specifically as collegial. I want to feel free to articulate my view but you don't have to agree. And I want to give you enough space for you to do likewise. I'm not trying to con or cajole anyone. I try to give information and argument for why we've done things as a service provider. I will represent the bad as well as the good and in that sense strive for a balanced approach. In dealing with faculty who use services provided by one of my units, this is the approach that makes the most sense to me. We're here for the long term and we're trying to do well by them, their students, and others on campus. As much as possible I hope I can have a relationship with them on a first name basis. This is how I'd like to be treated if I were the recipient of the service.
It is far less clear, however, that this approach makes sense in other arenas. If we are trying to bring about social change rather than simply provide a good service, then the message must be more idealistic and aspirational. And even if the goals are not so lofty but the audience is external to the campus, for example in seeking grant funding from foundations or in getting recognition around the state, perhaps my tone, which tends not to vary with audience, is not right. In contrast, let's look at the Sakai project and in particular one of the leaders of that project, Brad Wheeler, from Indiana University. I mention Brad because I'm friendly with him and I think he would not disagree with the characterization I'm going to provide.
Brad presents at all the national conferences. I've seen him at Educause, CNI, and of course the SEPP conference. I've gotten him to talk via teleconference with our FSI steering committee. He is highly visible and that is part of his job. He delivers a stump speech that is not unlike what a candidate would do in a political campaign. He is literally selling the idea of "Community Source" software development and as a business school professor has a keen sense of what marketing such an idea requires. He, too, frames the risks inherent in the community source approach. His Educause Review piece on Open Source makes the full argument. But it is clear that the work is written by an apostle, one whose mission is to bring others on board to the Community Source movement and thereby give recognition to the work he and his colleagues have already done.
Sometimes we fail in the ed tech arena. Our services are not as good as we hoped they would be, or the usage of the software is duller than we imagined and hence has little or no impact on learning. These failures may bother our instructors. But they put a huge burden on the staff who are trying to make things better. Hyping a service or an approach will in general raise expectations. Raising the bar has the benefit of affecting performance. (In teaching we are constantly told that instructors should set high expectations for their students.) Of course, that can cut both ways.
As an organization in general, we (CITES) are more like me than like Brad. When Larry Smarr was here running NCSA, he certainly was a good promoter (and he championed some unorthodox ideas in the process; but note that NCSA is mostly an outsiders' organization that is housed here). We try very hard to provide a good, professional service. We don't try at all to inspire. I do, in a dabbling kind of way. I wish I were more out in front, or that somebody else here was, since I'm not sure it is in me to do that.
At the time I got involved with learning technologies, a lot of it did seem inspirational and Burks Oakley here was doing much of the inspiring. We've grown a lot since then and we've also witnessed the bursting of the dot.com bubble. It's not possible to create quite the sense of wonder because the novelty is no longer there. But that maturity does allow a more realistic and nuanced view of what is possible to improve learning. We need a charismatic character to promote yet with a subtle argument. Is that possible?
Tuesday, May 31, 2005
Monday, May 30, 2005
MS Office versus OpenOffice for outreach to K-12
A year or two ago my boss was on a kick to put some pressure on Microsoft in terms of how they were licensing their Office Suite to Higher Ed. At the time, Sun Microsystems was pushing their Star Office as a viable alternative to the Microsoft version. We had more than one discussion at our Cabinet about how we might implement it on campus. For my part, I went so far as to buy myself a very inexpensive Linspire (formerly Lindows, a Linux OS that has a user interface like Windows) computer and try out that OS. My argument was that anybody who was working on a Windows box would use Microsoft Office products. But if somebody was operating in a Linux environment, then using open alternatives would seem reasonable.
The initiative failed miserably. The story I was told is that Sun didn't follow through. But the demographics didn't make sense, in my opinion, so it was doomed from the start. My guess is that the people on campus most likely to try non-Microsoft offerings are Mac users. But Star Office doesn't have a Mac version. We should have tried with Open Office instead. (Sun has played a background role in Open Office but Star Office and Open Office are not the same.) If we did that, however, we would be on our own. Sun wouldn't have been involved.
I did learn some things from this experiment. Failures do have their lessons. I spoke to quite a few people in CITES about providing lab computers cheaply. Most of them (and these folks had real knowledge about support) were dead against things like Sun Ray terminals, which, like Citrix, run desktop software off a server. They said this stuff is too slow or too unreliable. Also, they would really like to buy their computers from a known vendor - Dell has been the standard for quite a while in the CITES computing labs. The feeling was that the Linspire computers which one can buy at Walmart actually have a higher total cost of ownership or that their performance is much lower.
I would concur with the latter verdict. These boxes used chips from AMD rather than from Intel and the box I got was from a company I had never heard of. I don't know what Microsoft gets from Dell or other vendors for the Windows software, but just checking the Dell site this morning there is a starter desktop system that one can get for $300 (with rebate). This is a full system that comes with XP Home Edition. From the Linspire site, I found this very cheapie box, but note that it doesn't come with a mouse, keyboard, monitor, or speakers. If you compare the two processors, the Dell box is 2.4 GHz with a Celeron processor. The cheapie box is 1.5 GHz with an AMD processor. I've got nothing against cheapie, but this is an apples and oranges comparison.
So how does any of this apply to Higher Ed outreach to K-12? The answer is simple. Reliability is worth a lot. The approach that works must have computers that are reliable and that can be maintained by whatever staff the schools have. At present it is my belief that Apple still has the lion's share of the market in the schools, and that is because of its historic giveaway programs. If that is right, that market share itself makes the Mac the low cost product because it is easiest for the schools to support. In recent years when Apple was in trouble, some of the schools may have switched to PCs, in which case they probably have Dells or Gateways. Those are the environments that are there and those are what we should design to now, if we're doing outreach.
The society as a whole must embrace open software first, before it makes sense to use that approach in the schools. Will that happen? I have no idea. There is such an installed base for the Microsoft stuff, I have my doubts. But most kids don't learn Excel before coming to college. (They do learn PowerPoint and Word.) If somebody in the open source universe cleverly de-bundled the spreadsheet and database pieces, they might make some more penetration. Here's betting that is not likely to happen.
Sunday, May 29, 2005
An example of what we could do for K-12
My younger son is a pretty bright kid but his handwriting is terrible. (That is likely genetically determined.) He has had some trouble on his recent math homework, not because he doesn't understand the concepts but because he can't read his own handwriting. So he makes a mistake on a subsequent step (a 3 looks strikingly like a 2) because he bases his judgment not on his recollection of how he came up with the 3, but rather on his reading of the number on his hand written page which is full of erasure marks (it sure looks like a 2). The error follows naturally from the misstep.
I would not have come up with this tutorial if not for my son and watching him do his work. But, of course, I had this idea of outreach from Higher Ed to K-12 on my brain and that was an additional motive, certainly. It's kind of strange to spend almost a full day of a long weekend designing content like this. But this stuff serves as therapy for me and illustrates the idea that it is possible to make intellectual contributions to the lower grades. Clearly this is something other than a derivative of what we teach in Higher Ed. Yet the making of this tutorial was an intellectual challenge (I'll explain why in a bit) and so could be viewed as a mini research project.
Before describing the tutorial itself, here is one other remark about how it differs from the paper and pencil approach. The tutorial is long on interactivity. The student gets feedback at every step and can't go further without getting the preceding step correct. In the paper and pencil world, unless someone else is staring over the student's shoulder while the work is being done, the student gets no feedback at all until the problem has been turned in for evaluation. The feedback elements encourage the student to try things - perhaps deliberately getting something wrong, just to see what happens. The tutorial also emphasizes the recursive structure that is the essence of long division. Within a given cycle, the arithmetic is done off to the side and should be done mostly in the student's head. The results of that arithmetic are then recorded in the long division area. Breaking apart the recording from the arithmetic gives a cleaner look to the result. I hope it also allows the student to think a little more about what is going on, since the student doesn't have to worry about writing down the results.
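To make that recursive cycle concrete, here is a minimal sketch in Python (my own illustration, not the Excel implementation, and the function name is my invention) of the bring-down / divide / multiply / subtract loop the tutorial walks the student through:

```python
def long_division(dividend: int, divisor: int):
    """Walk the long division cycle digit by digit, mirroring the
    bring-down / divide / multiply / subtract loop of the tutorial."""
    quotient_digits = []
    working = 0
    for ch in str(dividend):          # bring down the next digit
        working = working * 10 + int(ch)
        q = working // divisor        # the digit a student would guess
        quotient_digits.append(q)
        working -= q * divisor        # subtract the partial product
    quotient = int("".join(map(str, quotient_digits)))
    return quotient, working          # working is now the remainder
```

Each pass through the loop corresponds to one cycle of the tutorial's steps, with the division done in the student's head and only the result recorded.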
To use the tutorial, macros must be enabled. (In Excel: Tools menu, then Macro, then Security, then set the level to Medium.) Once the workbook is open there will be two worksheets: the Student Version and the Instructor's Manual. These are essentially the same. The Student Version worksheet is protected (but the password is blank so it is a simple matter to unprotect it). The protection prevents the students from clicking on cells and seeing the content (or entering different values into those cells). Also, there are some cells that are hidden from the students' view. The Instructor's Manual allows all the formulas to be inspected by clicking on a cell and looking at the formula bar. Further, the macros associated with buttons can be found by right-clicking on the particular button, and by right-clicking on the spinner buttons one can see the cells they control.
The tutorial accommodates dividends with up to 6 digits and divisors with 1 or 2 digits. Some values are in place when the spreadsheet is open. It is straightforward to put in your own values. Put the dividend into cell B3 and then hit Enter (Return). Put the divisor in cell B4 and again hit Enter. Then push the Set button. This initializes the values in the rest of the spreadsheet.
You can now start the process by looking at Step 1. If the answer to the question is yes, you are done. If not you can proceed to steps 2 through 4 in sequence. At each step you must choose an answer and then evaluate. If your response is wrong, choose another answer and evaluate again. There is some feedback given to guide the student to the right answer. After a while they'll see that getting the answer right the first time is the fastest way to get through the problem. Once you do get the right answer, you can proceed to the next step. When you have completed Step 4, press the Record and Update button. Now you begin again with Step 1, but with a new and smaller dividend.
The first time, try it just to see what it does. The next time through, think about what it takes to make something like this. There are certain "tricks of the trade" (for example, if you don't want a value to be visible, make the font color the same as the background color, in this case white on white) that can get you only so far. Algorithmically, the arithmetic part is not that hard. (And indeed part of the idea is to convince the students that it is not that hard.) It is only a little harder to figure out algorithmically what the full long division layout should look like (all the intermediate dividends and subtrahends). The real difficulty is making the stuff appear at the right time. This is particularly difficult for the quotient area, where two quotients of the same length will nonetheless be generated with a different number of iterations, depending on how many zeroes they have in positions other than the first and last digit.
So to do this correctly, one has to come up with some counting algorithms. That counting is not so straightforward. And that represents the intellectual challenge in making this sort of thing. In general, if one designs an interactive exercise of this sort there are elements of conditional response and if the entire decision tree is fairly long or if there are many possibilities at each branch of the tree, the entirety can become quite complex. This, I believe, is why you don't see so much interactive material of this sort, but rather flatter stuff that is mostly straight presentation. The flatter stuff is intellectually easier to produce. But it is far less compelling to use.
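As a hypothetical illustration of the counting problem (again in Python rather than Excel macros, with names of my own invention): the number of cycles that actually record a subtraction step can differ from the number of digits in the quotient, precisely because of interior zeroes.

```python
def cycle_counts(dividend: int, divisor: int):
    """Return (active cycles, quotient digits). An 'active' cycle is
    one where the working value is at least the divisor, so a real
    divide/multiply/subtract step gets recorded. Interior zeroes in
    the quotient add digits without adding active cycles."""
    working = 0
    active = 0
    for ch in str(dividend):            # bring down the next digit
        working = working * 10 + int(ch)
        if working >= divisor:
            active += 1                 # a recorded subtraction step
            working -= (working // divisor) * divisor
    return active, len(str(dividend // divisor))
```

For example, 20503 divided by 5 gives the quotient 4100 with remainder 3: four quotient digits, but only two active cycles. So the display logic cannot simply count iterations to decide how many quotient cells to reveal; it needs a separate counting scheme.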
Does this tutorial work for the fifth grader? I don't know, but I'm interested in finding out. Of course there are the access issues. But what about the pedagogic issues? Would this help students who did have access to Excel on a computer? And if so which type of students would benefit?
Let me assume for a minute that the students would benefit. And let me assume further that as a consequence the schools would like to have this particular tutorial and many others like it. Could we in Higher Ed come up with a program where tutorials of this sort were either the main product or a significant by-product? I can envision an online network that would be aimed at supporting teachers who used this type of tutorial content. I can imagine both peer teachers and students and faculty in Higher Ed participating. But I'm having a harder time envisioning the structure inside of Higher Ed that would sustain this.
And here is my fear. At the outset I made it quite clear that this tutorial is not a faculty member's research. And though I didn't elaborate on the point, I hope it is clear that because this is just an Excel file, it is fairly accessible. If schools have computers, they likely have Excel. Further, the content is not my view of how the schools should teach math. It is a computer approach to what they in the schools are already doing. I believe that content designed that way has a chance. Content that is part of a faculty member's research project or part of their college level instruction that is then refitted for K-12 has a much lower chance of working. It is not impossible, but there are a lot of constraints that make replicating the college environment difficult in the K-12 setting. The concern, then, is not just that we try these other less likely to succeed approaches, but also that these other approaches end up crowding out what might work.
Saturday, May 28, 2005
Mix and Match
Matt Miller, a guest columnist for the Times Op-Ed page, had an interesting column this morning about paying school teachers in California higher wages (base salary rises from $40K to $60K per year, top-half teachers would average $90K, and the very best would earn $150K), all as an incentive to bring good teachers into poor schools. Presumably this part of the deal would be delightful from the union's point of view. The bargain would be that the schools would be able to fire the underperformers among the teachers. (Miller leads off with how he has heard from the good teachers in the bad schools about just how bad some of the other teachers are.) To me the interesting part of this suggestion is that it is straight "efficiency wage" theory, something I used to do research on ten to fifteen years ago.
Actually, it's a little sliver of an idea in Miller's piece that I want to glom onto. He mentions that before the 1970s women were not free to enter all the professions. Hence highly educated women were school teachers. By working at wages substantially below what their education would have predicted otherwise, they provided a massive subsidy to schools. In other words, the kids who went to the schools at that time got a better education than they were otherwise entitled to.
I was one of those kids. I started first grade in 1960 (in P.S. 31) and then after a year transferred to another school which was newly built (P.S. 203). That was in NYC, what is now called Oakland Gardens in Queens. My recollection is vague. I still remember the names of my teachers from then and I know that in 3rd grade I had my ten minutes of fame as the Sheriff of Hokum County in the play Bandit Ben Rides Again. But any of the particulars of the classroom are now long forgotten. I do have a sense that I got a good education there, for which I'm greatly appreciative.
Now fast forward 30 - 35 years. It's the late 90's and I'm teaching mostly intermediate microeconomics here at Illinois. Most of the kids are in-state. These kids attended elementary school perhaps 10 years earlier, in the late 80's. I have found fault with the education these kids had based on how they behave in my class. Too many of them simply tried to memorize everything. Too few of them tried to "figure things out" on their own. I had always attributed that to their major. Most of the students were either business students or LAS students trying to become business students. But Miller's throwaway comment has me thinking. Was it their teachers who made them this way - inexperienced and insufficiently educated teachers?
Not too long ago I recall talking with my younger son's teacher about my son's long division; he was having problems with that. The teacher said he just needed some drill on that. I'm not crazy about drill, especially when it seems forced. But in fifth grade I had Mrs. Stone and she made us do a speed test every day on multiplication and division (can't remember if it was 3 minutes or 5 minutes). Perhaps drill is not such a bad thing. You can't see the forest until you know what a tree is.
If K-12 matters so much, are there things that we can do in higher ed that would help, particularly in the lower grades? We are beginning to talk about a statewide consortium for the licensing and hosting of course management systems. What if through that we begin to have a program of content sharing across the campuses? What if further that K-12 becomes included and some of the content that is created is for K-12?
How does one address the "access issue" for lower income students? Can one have a program for decent networking and computers in the home? Or do the support issues suggest that there should be computer centers at schools and libraries where kids can get access?
Many years ago I was involved in a little project at Robeson school here in Champaign on the use of the Mallard software. My own view of the benefit of this software is as a good homework tool to let students work at their own pace and "do it till they get it right." Robeson had a lab where an entire class could work on the computers at one time. (They were the only school in the district with such a lab.) So they did Mallard as an ensemble activity.
This particular project was in Social Studies. The Social Studies teachers really didn't want short answer questions for homework. They wanted open-ended questions where the students could discuss the issues. For whatever reason, the math and science classes didn't get involved in the project.
Where is drill when you need it? And more generally on the outreach from Higher Ed to K-12, will we forever be wrecked on the rocks of good intentions or will we come up with something sensible programmatically that is useful? If the latter, I could see turning my personal attention to this issue.
Actually, it's a little sliver of an idea in Miller's piece that I want to glom onto. He mentions that before the 1970s women were not free to enter all the professions. Hence highly educated women were school teachers. By working at wages substantially below what their education would have predicted otherwise, they provided a massive subsidy to schools. In other words, the kids who went to the schools at that time got a better education than they were otherwise entitled to.
I was one of those kids. I started first grade in 1960 (in P.S. 31) and then after a year transferred to another school which was newly built (P.S. 203). That was in NYC, what is now called Oakland Gardens in Queens. My recollection is vague. I still remember the names of my teachers from then and I know that in 3rd grade I had my ten minutes of fame as the Sheriff of Hokum County in the play Bandit Ben Rides Again. But any of the particulars of the classroom are now long forgotten. I do have a sense that I got a good education there, for which I'm greatly appreciative.
Now fast forward 30 to 35 years. It's the late 90's and I'm teaching mostly intermediate microeconomics here at Illinois. Most of the kids are in-state. These kids attended elementary school perhaps 10 years earlier, in the late 80's. I have found fault with the education these kids had based on how they behave in my class. Too many of them simply tried to memorize everything. Too few of them tried to "figure things out" on their own. I had always attributed that to their major; most of the students were either business students or LAS students trying to become business students. But Miller's throwaway comment has me thinking. Was it their teachers that made them this way, teachers who were inexperienced and not sufficiently educated?
Not too long ago I recall talking with my younger son's teacher about my son's long division; he was having problems with that. The teacher said he just needed some drill on that. I'm not crazy about drill, especially when it seems forced. But in fifth grade I had Mrs. Stone and she made us do a speed test every day on multiplication and division (can't remember if it was 3 minutes or 5 minutes). Perhaps drill is not such a bad thing. You can't see the forest until you know what a tree is.
If K-12 matters so much, are there things that we can do in higher ed that would help, particularly in the lower grades? We are beginning to talk about a statewide consortium for the licensing and hosting of course management systems. What if through that we begin to have a program of content sharing across the campuses? What if, further, K-12 were included and some of the content created were for K-12?
How does one address the "access issue" for lower income students? Can one have a program for decent networking and computers in the home? Or do the support issues suggest that there should be computer centers at schools and libraries where kids can get access?
Many years ago I was involved in a little project at Robeson school here in Champaign on the use of the Mallard software. My own view of the benefit of this software is as a good homework tool to let students work at their own pace and "do it till they get it right." Robeson had a lab where an entire class could work on the computers at one time. (They were the only school in the district with such a lab.) So they did Mallard as an ensemble activity.
This particular project was in Social Studies. The Social Studies teachers really didn't want short answer questions for homework. They wanted open-ended questions where the students could discuss the issues. For whatever reasons, the math and science classes didn't get involved in the project.
Where is drill when you need it? And more generally on the outreach from Higher Ed to K-12, will we forever be wrecked on the rocks of good intentions or will we come up with something sensible programmatically that is useful? If the latter, I could see turning my personal attention to this issue.
Friday, May 27, 2005
Smart Data Entry
All three campuses of my University went to Banner, the ERP (Enterprise Resource Planning) software. It has created a variety of complaints among the users in the various departments. One of the big ones concerns data entry. It appears that in many areas there is now manual data entry where there used to be automated data entry (which to me means the data is already somewhere in electronic format and then gets moved to where the user wants it without manual intervention). My goal here is not to talk about Banner. I brought that up simply to indicate that many at the university have been sensitized to data entry issues. I want to talk about smart data entry in the context of the CMS - where we already have it, where it is missing, and what possibilities it would afford if we did have it.
Two areas where current CMS are pretty good on the data entry front are file upload and column import in the grade book. Most of the CMS now support the WebDAV protocol, which lets the user upload a batch of files via one command. On our campus we encourage use of Novell NetDrive as the WebDAV client. This allows the file area of the CMS to appear as a mapped drive on the designer's computer. Then uploading files is as simple as dragging them to a folder. This works reasonably well. There are also plugins, for example one from Microsoft, which allow publishing from a particular application, e.g., PowerPoint, directly into the CMS. I've found that functional when wanting to publish the PowerPoint as a Web page. (I don't think the plugin relies on WebDAV, but it provides similar functionality.)
Most CMS also allow import into the grade book from a CSV (comma-separated values) file. The WebCT Vista tool for this type of import is particularly good; it is a strength of the product. As I've indicated in previous posts, since the functionality already exists in that arena, it would be nice for it to exist in the learning area, for experiments or surveys, and it would be nice to enable the import in that setting for the students. But let's return to the designer.
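Before moving on, to make that CSV import format concrete, here is a sketch in Python of how a designer (or a departmental script) might generate such a file outside the CMS. The column headers, usernames, and scores below are made up for illustration; each CMS documents its own required headers for grade book import.

```python
import csv

# Hypothetical roster scores; in practice these might come from a
# spreadsheet export or a grading script.
scores = [("jdoe2", 87.5), ("asmith3", 92.0), ("bnguyen4", 78.0)]

with open("gradebook_import.csv", "w", newline="") as f:
    writer = csv.writer(f)
    # Header row that the CMS import tool maps columns on (names invented).
    writer.writerow(["Username", "Quiz 1"])
    writer.writerows(scores)
```

The resulting file can then be pulled in with the CMS's import tool, matching the username column against the roster.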
In every other part of the CMS, when creating an object the approach is to fill out a form that is internal to the CMS. There is competition across vendors (and with Sakai and Moodle) over how well designed those forms are, whether they allow a WYSIWYG editor for the HTML fields, whether they enable linking to other content, etc. But that form-based creation of learning objects is the norm goes unquestioned.
This is sensible when those objects don't exist in any other format, in other words when they are created from scratch. But it puts a big burden on the creator when the objects already exist elsewhere, either on her own hard drive or somewhere online. Why not have tools that let the creator pull existing electronic content into the CMS and automatically fill out the form(s), so the creator doesn't have to do it manually?
This holds for just about every tool in the CMS, but the one I want to focus on is the tool that enables links to external resources, because here the form is particularly simple and because a "link scraper" tool would be particularly useful. There are two essential pieces of data for a link: the name and the URL. One can readily imagine a list of links arrayed in those two columns of data and imported into the CMS in the same way that grade book information is imported.
What about the source of those links? One interesting possibility would be the page in our eReserves that lists all the class materials. So there would need to be a way to take a Web page with links on it and from that create a list of links. How to do that is not transparent to me, but I can't believe it would be that hard for a competent programmer. Then the last step would be to populate the links tool with the list. This part I'm least sure about. Do the APIs in the CMS allow this type of thing? I'm not sure. I certainly hope so.
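As a sketch of what such a "link scraper" might look like, here is a short Python program using only the standard library. It pulls (name, URL) pairs out of an HTML page and writes them as the two-column CSV described above. The sample page snippet, file names, and URLs are all hypothetical.

```python
import csv
from html.parser import HTMLParser

class LinkScraper(HTMLParser):
    """Collect (link text, url) pairs from an HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []      # finished (name, url) pairs
        self._href = None    # href of the <a> we are currently inside
        self._text = []      # text fragments collected inside that <a>

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href:
            name = "".join(self._text).strip() or self._href
            self.links.append((name, self._href))
            self._href = None

# A made-up stand-in for an eReserves materials page.
page = ('<ul><li><a href="http://example.edu/ch1.pdf">Chapter 1</a></li>'
        '<li><a href="http://example.edu/ch2.pdf">Chapter 2</a></li></ul>')

scraper = LinkScraper()
scraper.feed(page)

# Write the same two-column format a grade book import uses.
with open("links.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["Name", "URL"])
    w.writerows(scraper.links)
```

Whether the links tool in a given CMS can then ingest that file depends, as noted above, on the APIs the vendor exposes.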
Thursday, May 26, 2005
eReserves and the CMS
I guess we do odd things for no apparent reason, especially before 6 AM. After reading an Op-Ed piece in the Times by David Brooks that I thought was gratuitous, I went to the campus Library site, followed the eReserves link, had to authenticate via the Library proxy server to get in because I didn't turn on the VPN from the family computer at home, but then got in without problem. Then, immediately, a page came up which has a course listing of all the courses that use eReserves. I scrolled down. It looked like a long list. I wondered how many. I copied the list and pasted into Excel. Then I sorted to eliminate the blank spaces. There were a little more than 600 entries. Wow! That is a lot.
I can get into any of these sites. The first one I try is from a colleague I know in Ag Econ from way back when I used to teach the graduate Microeconomics core theory course. The page has old tests on it, scanned and now in PDF format. This strikes me as odd. Why use eReserves for this rather than the CMS? I went into another site, this time an East Asian Languages and Culture course taught by someone I don't know. This time there are chapters from a book, again in PDF. (I suppose most of the content in eReserves is in PDF format.) That made me feel a little better. The book chapters should be in eReserves so the content doesn't linger in the CMS. There is a better Fair Use argument that way.
The Library breaks the links after the semester is over. It occurs to me that this course is for the spring and compulsive as I am I look it up in the timetable. Sure enough, there is no summer listing of the course but it appears in the spring with the instructor on the eReserve list. The spring semester ended a week and a half ago. I wonder when they do break the links.
Last week at FSI we had a nice presentation about eReserves from Stephanie Atkins who is an Assistant Circulation Librarian. After she described our approach to eReserves, where outsiders can't get in at all, but insiders can get into every class site, Steven McDonald, our featured speaker on copyright, asked why we did that rather than restricting the eReserves of one class only to members of that class (say by putting the list of links to eReserves inside the CMS). During the session we talked about technical issues like the Library not getting course rosters and the Voyager software not accommodating them. (Our eReserves actually are not in Voyager. There is a link out from Voyager to our home grown system.)
Afterwards I chatted a little with Stephanie and said that I recall a discussion with Mary Laskowski about this (Mary is the Coordinator for Media Services in the Library). I believe we allow access to all the courses by analogy to what we used to do with paper reserves. If a book was on reserve and a patron wanted the book but wasn't enrolled in the class, the patron could get at the book. That made sense to me. I was in the role of the patron in that example, more than once.
But I'm still scratching my head on this one as it pertains to electronic reserves. On the one hand, I was told that the eReserves are DEFINITELY NOT a repository. This means the content is not supposed to be discoverable. One can't browse the items other than by the classes that use them and there is no search on the items. All of this is done to remain consistent with Fair Use. But, on the other hand, if somehow I do discover an item I want that is in eReserves, then I have access to it. This is not for Fair Use, not by a long shot. This is for the Library's mission to promote access to all. Somehow this approach represents the Library's balance between these two tensions. I'd never have come up with this one on my own. That's one reason why I'm not a Librarian.
In my pointy-headed way of thinking, the access argument just doesn't apply to eReserves. And to make the Fair Use argument tighter, access should be restricted to just members of the class, so on this I agree with Steven and can see that using the CMS for eReserves has some logic to it. But I'm not going to force the issue. There is more to lose in terms of goodwill with the Library than there is to gain by winning a debating point.
There is, however, an issue that really does need to be discussed and understood. Quite a few faculty who have eReserve sites don't have sites in the CMS (and vice versa). So rather than getting the best of both possible worlds, the students get one or the other. Why do some faculty only have eReserve sites? Here are my guesses as to the answer.
(a) This is the natural extension of what these faculty have done historically. They used to use paper Reserves. Now they use eReserves. There is little difference in their perspective on what they have to submit to the Library. And they are all for the convenience this affords the students. This use is sufficient for their needs.
That is probably an explanation for some, but it is not an explanation for all. Stephanie made a point in her presentation that eReserve use is growing, rather dramatically. Something else must explain that.
(b) The instructor has some documents that should be in eReserves for the Fair Use reason. It is more convenient, then, to put all the documents there rather than to have both an Illinois Compass site and an eReserve site. Moreover, there is some help from the Library on the eReserve site, so it is particularly easy to make.
(c) The instructor doesn't know how to make a Web site, in Illinois Compass, Netfiles, or on the department provided Web server. And the instructor doesn't have the inclination to learn those skills. With eReserves, all the instructor has to do is provide a list of references and hard copy for those things not already in the Library's collection.
I don't know the relative importance of (b) versus (c). Both give me some cause for concern, but for different reasons. The essence of (b) is that faculty are too busy and the more course administrative or logistic function the campus provides, the better; but the consequence is that there won't be a vigorous online component to the course. Online will be used only for content distribution - from instructor to students. As my mission is in large part to get instructors to take next steps in using online in their teaching to engage their students, this thwarts my mission. The essence of (c) is that there are many faculty who are not sufficiently literate in information technology. (One would think this is mostly a problem with senior faculty, but I don't know that for a fact.) Though the EdTech office provides good and effective training, many faculty don't avail themselves of it.
I have never had a discussion within CITES, in our advisory committees, or with colleagues in the Library about faculty IT literacy. (At least I can't recall having any such discussion.) On the flip side we talk about faculty development, a lot, and recently about FSI and why more from my campus didn't attend. Part of the answer may be that there is implicit encouragement not to become IT literate via services such as eReserves that don't make those types of requirements on faculty. If that is true I wonder if the campus benefits from the approach.
Wednesday, May 25, 2005
What would we want in a really good CMS? Part 4
Yesterday I had a discussion with a developer at WebCT about their next steps in encouraging sharing of learning objects between different designers, focusing on those objects, like quizzes, that unlike Word or PowerPoint documents really can't exist in a functional way outside the CMS. I tried to argue that content sharing has to be tied to a community of practice, that a community of practice may very well not map into the hierarchy they have inside the Vista software (on which we at Illinois built, following our administrative structure: the campus is divided into colleges, the colleges have departments, etc.), and so they really need to enable communities of practice as a separate structure.
In that discussion we focused narrowly on the sharing of the objects themselves. But one can readily imagine moving the discussion to asking the following questions: What tools are essential to enable a community of practice, which might very well engage in a detailed and deep discussion on how to teach and that may be as or more important than sharing objects? When should the tools be inside the CMS (meaning a log-in is required to access them) and when should they exist outside the CMS, perhaps because the same tools are used in other contexts as well? I'd like to advocate for the position that the CMS should not have a version of all types of tools. Asking it to do that will make it excessively bulky over time and harder for it to remain sharp and current on those functions where the market doesn't provide good alternatives. So the CMS must be selective on its toolset and as a consequence it is reasonable to expect that communities will rely on tools outside the CMS to some degree.
Here's a little hypothetical that might illustrate. Suppose we have a community of practice for sharing learning objects inside the CMS and now one of the members wants to set up a blog on Blogspot.com so members can have meaningful discussions about teaching. But unlike my blog here, where only I can make a post and everyone else can only comment, suppose in this blog we want a more democratic approach: every member can make a post. So one of us begins by setting up the blog on Blogger.com and then, after a while, reaches the part of the setup where they invite members of the community to join the blog. At this juncture they'll come to a point like this where they are supposed to supply a list of email addresses that are comma separated. (Apologies that the image is so degraded. I hope you can get the idea nonetheless.)
How much of a hassle is it to produce such an email list from within the CMS so the blog site creator can plop that list into the invitation form? If it is a hassle, then only knowledgeable people will do it and hence it won't happen very often. If it is a snap, everyone can do it. Obviously, Blogger.com is only one possibility among many, many external communication environments. Each may have their own way of managing users. A good CMS would enable interaction with any or all of them in the sense that porting the user group over should be a snap.
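The export itself is trivial once the CMS exposes the roster. As a sketch (the roster fields and addresses here are invented for illustration; a real CMS would supply them through its own export or API):

```python
# A hypothetical roster as a CMS might export it.
roster = [
    {"name": "J. Doe", "email": "jdoe2@uiuc.edu"},
    {"name": "A. Smith", "email": "asmith3@uiuc.edu"},
]

# Join the addresses into the comma-separated list the invitation
# form expects; the creator then pastes this into the form field.
invite_list = ", ".join(person["email"] for person in roster)
print(invite_list)
```

The hard part is not the joining, it is getting the CMS to hand over the roster in the first place.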
Let me turn to a different area that has captured the attention of many of my staff, PodCasting. I'm going to ignore the iPod and music sharing issues altogether and note that I've "futzed" with this enough to be able to conclude that the same podcasting clients can be used to download PowerPoint presentations or video presentations or really any file type you can think of, and the same RSS feed structure can be used to deliver that type of content. This means that at least in theory we can move from an approach where the student clicks on a link to content on the server and then downloads the content at the time of viewing to the alternative where the student subscribes to content feeds and at regular intervals new content is downloaded automatically. The student can then go to a designated folder on her computer when it is convenient for her and view the new content without any lag due to downloading, because that happened earlier. (Perhaps the CMS is smart enough so that the links there now point to content in that local folder.)
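To illustrate the point that the RSS enclosure mechanism is agnostic about file type, here is a sketch in Python that parses a hypothetical course feed and lists the enclosures a podcasting client would queue for background download. The feed contents, course name, URLs, and file sizes are all invented.

```python
import xml.etree.ElementTree as ET

# A minimal made-up course feed; in practice the CMS would publish this
# and the client would poll it on a schedule.
feed = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Econ 300 Lectures</title>
  <item>
    <title>Week 1 slides</title>
    <enclosure url="http://example.edu/week1.ppt"
               type="application/vnd.ms-powerpoint" length="102400"/>
  </item>
  <item>
    <title>Week 1 video</title>
    <enclosure url="http://example.edu/week1.mp4"
               type="video/mp4" length="20480000"/>
  </item>
</channel></rss>"""

root = ET.fromstring(feed)
# The enclosure element carries any URL/type pair, which is why the same
# feed structure delivers PowerPoint, video, or anything else.
to_download = [(item.findtext("title"), item.find("enclosure").get("url"))
               for item in root.iter("item")]
for title, url in to_download:
    print(title, "->", url)
```

A real client would fetch each URL into the designated local folder, which is what removes the download lag at viewing time.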
As video content becomes more the norm, this type of content distribution will become increasingly valuable. For TEACH Act or Fair Use reasons, however, we will still need to deliver some content (created by someone other than the professor and not licensed for distribution to the class) either in streamed formats that are never permanently downloaded to the student's computer or, if not streamed, in formats that reside only in a temporary cache and are not readily copyable by the student. Indeed, if there were different methods of content distribution depending on who owned the copyright, as I have suggested, this would serve as an excellent mechanism for educating students and instructors alike on copyright issues, something a good CMS should facilitate.
Further, for similar reasons regarding bandwidth and speed, I believe the CMS will have to enable more transactions to occur on the student's local computer rather than back and forth with the server, which has inherent limitations in the speed of such things as page reloads. In addition, these local transactions must involve "communication" between CMS content and content from other packages that would be used to do course work. (In my course, that other package would probably be Excel. The CMS must be flexible enough to integrate with whatever software the instructor wants to use.) At the end of the series of local transactions there will have to be some summary transaction with the server so a record is kept. (This would be both for the experiments I talked about yesterday and for individual student assignments.) But otherwise the CMS must get smarter about using the student's CPU.
Now let's cycle back to the question of educating instructors about new teaching approaches. The normal experience with a CMS is that the first time through an instructor will use a few features and then the next time, after having gained some comfort in the environment, the instructor will expand use to some other features. Currently many CMS have manuals and built-in help on "how" to use these features. What if there were analogous documents written on "why" to use the features, with examples? Many new products on the market provide sample use on the product Web site. Why doesn't the CMS do likewise?
Actually some of the CMS vendors do provide examples of good use, such as WebCT's exemplary course page. The underlying idea for this page is a good one. I think, however, that we are getting little to no use value from it. One can find mention of WebCT on our Illinois Compass homepage (look inside the Help), but there are no links to the WebCT.com site either there or on the EdTech division pages that are devoted to providing user support for Illinois Compass. And it may very well be that if we did this sort of thing we'd want examples from our own campus, as they would have the most relevance.
One last issue to consider is "can you take it with you?" meaning in this context whether an instructor can take the online part of the course to the next institution of employment after switching jobs. The IMS project has helped here, but it is still hard to migrate a course, esepcially one with a fair amount of content. It would be nice to see this problem solved.
In that discussion we focused narrowly on the sharing of the objects themselves. But one can readily imagine moving on to the following questions: What tools are essential to enable a community of practice, one that might engage in a detailed and deep discussion of how to teach, a discussion that may matter as much as or more than the sharing of objects? When should those tools live inside the CMS (meaning a log-in is required to access them) and when should they exist outside it, perhaps because the same tools are used in other contexts as well? I'd like to advocate the position that the CMS should not have its own version of every type of tool. Asking it to do that will make it excessively bulky over time and make it harder for the CMS to remain sharp and current on the functions where the market doesn't provide good alternatives. So the CMS must be selective about its toolset, and as a consequence it is reasonable to expect that communities will rely on tools outside the CMS to some degree.
Here's a little hypothetical that might illustrate. Suppose we have a community of practice for sharing learning objects inside the CMS, and now one of the members wants to set up a blog on Blogspot.com so members can have meaningful discussions about teaching. But unlike my blog here, where only I can make posts and everyone else can only comment, suppose in this blog we want a more democratic approach: every member can make a post. So one of us sets up the blog on Blogger.com and eventually reaches the part of the setup where they invite members of the community to join the blog. At that juncture they'll come to a screen like this one, where they are supposed to supply a comma-separated list of email addresses. (Apologies that the image is so degraded. I hope you get the idea nonetheless.)
How much of a hassle is it to produce such an email list from within the CMS so the blog's creator can plop that list into the invitation form? If it is a hassle, then only knowledgeable people will do it, and hence it won't happen very often. If it is a snap, everyone can do it. Obviously, Blogger.com is only one possibility among many, many external communication environments. Each may have its own way of managing users. A good CMS would enable interaction with any or all of them, in the sense that porting the user group over should be a snap.
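To make the idea concrete, here is a minimal sketch of the roster export a CMS might offer. The roster structure and field names are my own assumptions for illustration, not any real CMS API; the point is only that the output should drop straight into an invitation form like Blogger's.

```python
# Hypothetical sketch: a one-step roster export from a CMS so a course
# community can be invited into an outside tool. The roster format below
# is invented for illustration.

def comma_separated_emails(roster):
    """Turn a course roster into the comma-separated address list
    that invitation forms such as Blogger's expect."""
    return ", ".join(member["email"] for member in roster)

roster = [
    {"name": "A. Smith", "email": "asmith@example.edu"},
    {"name": "B. Jones", "email": "bjones@example.edu"},
]
print(comma_separated_emails(roster))
# asmith@example.edu, bjones@example.edu
```

If producing that one string takes a single menu click rather than hand-copying addresses, the "snap versus hassle" distinction above is exactly the difference.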
Let me turn to a different area that has captured the attention of many of my staff: podcasting. I'm going to ignore the iPod and music-sharing issues altogether and note that I've "futzed" with this enough to conclude that the same podcasting clients can be used to download PowerPoint presentations, video presentations, or really any file type you can think of, and the same RSS feed structure can be used to deliver that content. This means that, at least in theory, we can move from an approach where the student clicks a link to content on the server and downloads it at the time of viewing, to an alternative where the student subscribes to content feeds and new content is downloaded automatically at regular intervals. The student can then go to a designated folder on her computer when it is convenient for her and view the new content without any lag from downloading, because that happened earlier. (Perhaps the CMS is smart enough that its links now point to content in that local folder.)
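The "same RSS structure for any file type" claim rests on the feed's enclosure element, which carries a URL and a MIME type. Here is a small sketch of such a feed built with the Python standard library; the course name and URLs are made up.

```python
# Sketch: one RSS 2.0 feed structure delivering arbitrary file types.
# Each <item> carries an <enclosure> whose url/type can point at a
# PowerPoint file, a video, or anything else. URLs are invented.
import xml.etree.ElementTree as ET

def build_feed(title, items):
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = title
    for item_title, url, mime in items:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = item_title
        # the enclosure is what a podcasting client actually downloads
        ET.SubElement(item, "enclosure", url=url, type=mime, length="0")
    return ET.tostring(rss, encoding="unicode")

feed = build_feed("Econ 101 content", [
    ("Week 1 slides", "http://example.edu/week1.ppt",
     "application/vnd.ms-powerpoint"),
    ("Week 1 lecture", "http://example.edu/week1.mp4", "video/mp4"),
])
print(feed)
```

A podcasting client subscribed to this feed would fetch both enclosures on its regular polling schedule, which is all the automatic-download scenario above requires.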
As video content becomes more the norm, this type of content distribution will become increasingly valuable. For TEACH Act or Fair Use reasons, however, we will still need to deliver some content (content created by someone other than the professor and not licensed for distribution to the class) in formats where it is streamed but not permanently downloaded to the student's computer, or, if not streamed, where it resides only in a temporary cache in a format the student cannot readily copy. Indeed, if there were different methods of content distribution depending on who owned the copyright, as I have suggested, this would serve as an excellent mechanism for educating students and instructors alike on copyright issues, something a good CMS should facilitate.
Further, for similar reasons of bandwidth and speed, I believe the CMS will have to enable more transactions to occur on the student's local computer rather than back and forth with the server, which is inherently slow at things like page reloads. In addition, these local transactions must involve "communication" between CMS content and content from other packages used to do the course work. (In my course, that other package would probably be Excel. The CMS must be flexible enough to integrate with whatever software the instructor wants to use.) At the end of a series of local transactions there will have to be some summary transaction with the server so a record is kept. (This would serve both the experiments I talked about yesterday and individual student assignments.) But otherwise the CMS must get smarter about using the student's CPU.
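A rough sketch of that "work locally, report once" pattern: every interaction is recorded on the student's machine, and only a single summary transaction would go back to the CMS server at the end. The record format here is entirely my own assumption.

```python
# Sketch of local transactions with one summary sent to the server.
# The summary dictionary's shape is invented for illustration.
class LocalSession:
    def __init__(self, student_id):
        self.student_id = student_id
        self.events = []          # every click/answer stays on this machine

    def record(self, question, answer, correct):
        self.events.append((question, answer, correct))

    def summary(self):
        """The single transaction that would go back to the CMS server."""
        right = sum(1 for _, _, ok in self.events if ok)
        return {"student": self.student_id,
                "attempts": len(self.events),
                "correct": right}

s = LocalSession("student42")
s.record("Q1", "4", True)
s.record("Q2", "7", False)
s.record("Q2", "8", True)
print(s.summary())
# {'student': 'student42', 'attempts': 3, 'correct': 2}
```

All the page-reload latency disappears because nothing touches the network until `summary()` is sent, yet the instructor still gets a record for grading.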
Now let's cycle back to the question of educating instructors about new teaching approaches. The normal experience with a CMS is that the first time through, an instructor uses a few features; the next time, after gaining some comfort in the environment, the instructor expands to other features. Currently many CMS have manuals and built-in help on "how" to use these features. What if there were analogous documents, with examples, on "why" to use them? Many new products on the market provide sample uses on the product Web site. Why doesn't the CMS do likewise?
Actually, some of the CMS vendors do provide examples of good use. This is WebCT's exemplary course page. The underlying idea for this page is a good one. I think, however, that we are getting little to no value from it. One can find mention of WebCT on our Illinois Compass homepage (look inside the Help), but there are no links to the WebCT.com site either there or on the EdTech division pages devoted to providing user support for Illinois Compass. And it may very well be that if we did this sort of thing we'd want examples from our own campus, as they would have the most relevance.
One last issue to consider is "can you take it with you?" meaning, in this context, whether an instructor can take the online part of a course to the next institution of employment after switching jobs. The IMS project has helped here, but it is still hard to migrate a course, especially one with a fair amount of content. It would be nice to see this problem solved.
Tuesday, May 24, 2005
What would we want in a really good CMS? Part 3
I'm going to stick with the notion of using the CMS to promote critical thinking, and specifically inquiry-based learning. If the word "experiment" is broadly conceived, then many open-ended assignments can be thought of as experiments. Some of these are experiments at the individual level only. The ones I want to focus on are experiments at the group or entire-class level. In such experiments there is a requirement to pool the results from each student to give meaning to what is found.
Conceivably, the course management system could be viewed as the place to record the observations in the experiment and then to process them to some degree (I will illustrate below). The processed results would then serve as input to the drawing-conclusions and reflection parts of the inquiry process.
Consider this example, which was the first assignment in my econ principles class in spring '04. There were 15 students. I asked each student to identify the top 10 principles textbooks. The rules were that if a student identified a book from the top 10 and nobody else in the class identified the same book, that student would get 5 points. If at least two students identified a book, then nobody in the class would get any points for it. There were no precise instructions on how they should report their books (e.g., by author, by publisher, with or without publication date), nor were there rules about how many books they could suggest (more than 10?). I used the assignment tool in WebCT Vista for this, because I needed to give them the points. (It turned out that one student earned 5 points; all the other submissions had duplicates, and collectively they identified all the textbooks in the top 10.)
I had to compile the results to show what was in their responses. I did this manually and with some intelligence. You can see that compilation here. It looks unexceptional, but note that the headers of the columns are determined by the data; they were not preset. Those headers are the textbook authors. Shouldn't it be possible to go from straight survey data to a representation of the results like this? Would such a representation know to include a response even if there were a typo in the author's name? Or suppose that for a jointly authored book a student listed only one author. (In some cases earlier editions had only one author, and as the senior author matured he brought on a junior partner.) Should the response be included in that case?
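The typo-tolerance part of that compilation is not far-fetched. Here is a sketch, using the standard library's `difflib`, of grouping free-form responses under one column per author even when spellings vary. The similarity cutoff and the sample names are my own assumptions; real survey data would need tuning.

```python
# Sketch: build data-determined columns from free-form survey responses,
# tolerating typos. The 0.8 cutoff is a guess for illustration.
import difflib

def group_responses(responses, cutoff=0.8):
    columns = {}              # canonical author -> list of raw entries
    for raw in responses:
        match = difflib.get_close_matches(raw, list(columns),
                                          n=1, cutoff=cutoff)
        key = match[0] if match else raw   # new column if nothing is close
        columns.setdefault(key, []).append(raw)
    return columns

responses = ["Mankiw", "Mankew", "Krugman", "Mankiw", "Krugmann"]
cols = group_responses(responses)
print(sorted(cols))           # ['Krugman', 'Mankiw']
print(len(cols["Mankiw"]))    # 3
```

The harder judgment calls, such as whether a single-author response should count for a jointly authored book, would still need a human or a smarter matcher, but the routine typo cases fall out almost for free.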
Now envision that, in addition to the representation as I've got it, some other statistics of the data were given, for example the count in each column and the rank statistic. (If you don't know about rank statistics, perhaps you recall the old TV show Family Feud, where the goal was to give responses that were in the top 3 of a previously conducted survey. This is the same idea.) These pieces of information are extremely useful in the reflection part. Where did the students go to get the information? Why did they look there? Did they try to behave unlike their peers in this respect?
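Those two statistics are a few lines each once the responses are pooled. A sketch, with invented sample data, where the most frequent response gets rank 1, Family Feud style:

```python
# Sketch: per-column counts and a Family Feud style rank statistic
# (most frequent response ranked 1). Sample responses are invented.
from collections import Counter

responses = ["Mankiw", "Mankiw", "Mankiw", "Krugman", "Krugman", "Varian"]
counts = Counter(responses)
ranks = {author: rank
         for rank, (author, _) in enumerate(counts.most_common(), start=1)}

print(counts["Mankiw"], ranks["Mankiw"])   # 3 1
print(counts["Varian"], ranks["Varian"])   # 1 3
```

Handing students this table alongside their own answer is what makes the reflection questions bite: each student can see immediately whether they ran with the herd or away from it.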
Current CMS are reasonably good at collecting (text-based) data from students. But a lot more headway could be made in representing the results and "letting the data speak." This is an obvious area for improvement.
Let me turn to a different way the data might improve things. In the first post on this topic, I mentioned Alfred Hubler, the inventor of CyberProf. One of his original ideas (this was before Amazon.com existed) was to use the history of student responses to a physics quiz question to recommend hints to a student working on the problem. In other words, the hints should be based on the past frequency of mistakes. At the time he was implementing the idea, it failed miserably; the server ran very slowly. Back then my home computer was a 70 Megahertz Power Mac. Now the family has a relatively new Dell at home with a CPU over 3 Gigahertz. That's a more than 40-fold increase in crunching power. (Gotta love Moore's Law.) It's time to take Alfred's idea out of mothballs and try again. And in the process, relying on the crunching power of the student's desktop computer makes sense.
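The core of the idea is a frequency table over past mistakes. This sketch is my own reconstruction of the concept, not CyberProf's actual method; the mistake categories and hint text are invented.

```python
# Sketch of hint recommendation from the past frequency of mistakes:
# surface the hint for the error students have made most often.
# Categories and hint wording are invented for illustration.
from collections import Counter

past_wrong_answers = ["forgot units", "sign error", "sign error",
                      "sign error", "forgot units"]
hints = {"sign error": "Check the direction of the force vector.",
         "forgot units": "Did you convert everything to SI units?"}

def recommend_hint(history, hints):
    most_common_mistake, _ = Counter(history).most_common(1)[0]
    return hints[most_common_mistake]

print(recommend_hint(past_wrong_answers, hints))
# Check the direction of the force vector.
```

Run on the student's own machine, a lookup like this costs essentially nothing, which is exactly why the idea deserves a second try now that the server bottleneck is gone.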
Let me make one more point where I don't have a specific recommendation about the software but feel comfortable describing the general issue. A lot of students, for better or worse (mostly worse, in my opinion), treat the professor as the oracle and try to learn at the feet of said professor. They do this by literally sucking up everything the professor spits out. So the student view may be characterized by the impression that the professor possesses truth and their job is to gain truth by listening to the professor. This view is the antithesis of critical thinking.
The experimental approach that I've advocated means that students learn from observation and then reflection, their own and with others in the class. This means that their perspective on the issue should go through a transformation, from very naive and flat to more nuanced and with depth. It would be extremely valuable to mark the stages where the perspective changes or is modified, either to reconcile an observation that wasn't predicted or to account for a conclusion that wasn't anticipated. By considering these markings, the students should see themselves mature as they progress and become more aware of their own learning.
As I said, I'm less sure how this should be done. At one point in the late '90s I thought that portfolio assessment of a student's work would help demonstrate that trajectory to the students. But as ePortfolios have played out, they have lost this function and instead have been used to demonstrate student competence (not well, in my opinion). I hope the longitudinal assessment notion can reemerge.
Are there other areas where CMS can be made much better? Sure. So tomorrow I'll wrap up this topic with some other suggestions.
Monday, May 23, 2005
What would we want in a really good CMS? Part 2
Not too long ago I borrowed a copy of Steve Landsburg's Intermediate Microeconomics text from a colleague in the department to see if it read as well as some of the reviews for it at Amazon.com suggested. Landsburg, a well-known author of The Armchair Economist and a columnist for Slate magazine, is known for writing provocative pieces that are good for students because they challenge students on what they know and what they think they know.
I looked through the first couple of chapters of the book, which starts off with Supply and Demand. To be sure, some of the examples are unusual (focusing on crime) and show that economics can be applied to situations that would seemingly be of interest to a sociologist, not an economist. Otherwise, however, the presentation was quite standard. In particular, first the theory was developed, then examples were given to demonstrate the theory. And at the end of the chapters there are problems for students to work. This is the tried and true method and just about every intermediate text on the market is written this way.
There's only one problem. This is not the best way to teach the stuff. It is much better to first situate the students in an example, one that has meaning for them, one they can work their way through. Lead with the example and then generalize from it to the theory. This is much closer to the way people actually reason, so that is the way we should teach: from example to general concept, not vice versa. We should also, on occasion, show the pitfalls in going that direction (sometimes the generalizing principle doesn't hold). But that one occasionally guesses at hypotheses that are untrue, based on a limited set of examples, doesn't mean we should revert to the traditional approach.
The traditional approach (really theory for theory's sake) is fine for the faculty who as students were destined to become faculty. Most of the students we teach, however, are destined to get a job outside academe and hardly touch on theory after they graduate. If the lessons of the course are to stick with them in some way after they have completed it (and a good economics course can fundamentally affect a person's world view for life) then we must emphasize the primacy of illustrative examples.
How will it happen that instructors who have had traditional training in their doctoral studies become non-traditional (and more effective) as teachers by taking an approach that is outside their own realm of experience? This is a good (and tough) question. One answer I propose is that this change is encouraged by the software the instructors use to make their course content. In other words, the CMS might be the deliverer of specific teaching approaches, including some that are alien to novice faculty.
On this campus, we have some faculty who are strong proponents of "inquiry-based learning" in the spirit of John Dewey. Consider this chain, which is similar to, though not exactly, the methodology they have developed.
1. Interesting Question (from instructor, to initiate the activity).
2. Initial Experiment Design (perhaps a synthesis by the students and the instructor), to address the question in #1.
3. Implementation of Experiment and Observation of Results.
4. First Conclusions
5. Reflection and either a recycling to some earlier stage or moving on to a different question.
Now let's pose the interesting questions, which I'll leave to the next post to address. Can a CMS be designed to promote this type of pedagogy? Can the course materials the instructor makes reflect this type of approach? And (for the worrywarts in the crowd) can this approach still cover the same amount of course content as the more traditional one?
Sunday, May 22, 2005
What would we want in a really good CMS?
A little more than a year ago, Chris Vento, CTO at WebCT, wrote a column in Campus Technology magazine where he argued for open standards in CMS development done by commercial ventures (such as WebCT) as an alternative to open source. I wrote a response to that (partly at the urging of my colleague Steve Acker, who edits the column) arguing that the business discipline commercial companies bring is probably good in many respects, but one consequence is that we get incrementalism in CMS improvements rather than dramatic pedagogic change. The argument is that dramatic change is inherently risky, and the commercial vendors can't afford to fail.
But we in academe can. So we should be the ones to develop the more interesting pedagogic applications. And we should be the ones to pilot them. Many will fail, if not technically then because they don't generate sufficient interest. The ones that succeed can then be brought into the commercial environments. This is the juncture that departs from open source thinking. Consequently, this might sound like sacrilege to many of my colleagues. Why should we give away to the vendors the intellectual property that we've developed? (I've heard that refrain over and over again.) The answer is simple. We move on. When we do, what we developed earlier that at the time was an asset becomes a liability. Now we've got to maintain it, but our interest is elsewhere. Let the vendors maintain it instead. And let them earn a decent return on their software so they have incentive to do just that.
Let's begin by noting that most faculty currently use the CMS to put up their content. But this content is mostly documents; PowerPoint is the quintessential example. In the main, when we talk about instructors improving the online part of their course pedagogically, we talk about them using other tools, such as the discussion board, that add human to human interaction. What about improving the presentation content itself?
Way back when, about ten years ago Alfred Hubler, a faculty member in the Physics department, was working on a software system he developed called CyberProf. His goal was for his system to enable really good content. I got friendly with Alfred over the next few years and I know from discussions with him that Plato was a source of inspiration and that he looked at Mathematica rather than other course management systems as the main rival. The idea that the CMS can enable really good content is why we started to support Mallard in SCALE and why our Biology units on campus now use LON-CAPA.
What does really good content look like? Here I'm going to take an extreme position. It's not the graphical design or the Flash animation that matters. (Elegance and simplicity in presentation do matter, but bear with me for a bit.) What matters is that the content promote critical thinking. The content must help the student visualize the idea and reason through the material. Furthermore, the content should take advantage of the computer and display.
I'm going to be unfair and pick on a product that some of my staff like: StudyMate by Respondus. This is an easy-to-use tool that makes Flash objects one can use in the CMS. These objects remind me of HyperCard, the old Apple tool. Indeed, among the objects one can make with StudyMate are fact cards, flash cards, and simple quiz questions, all nicely designed. These would be really great for my kids (5th and 7th grade) for studying social studies or language arts. But is that what we want to promote in college? I can see it being used in a foreign language class or a class with a huge amount of terminology. But in the main, I don't see it.
A lot of the better online instructional content is done via Java applets or other software that enables animation. For example, here is a nice page on the Monty Hall Paradox. I like that page a lot in terms of what it delivers, both to the student and to other statistics instructors. But it leaves how to make the applet a mystery. So those who can, do. Most instructors can't and therefore don't.
Now I'm going to talk about my own stuff made with Excel, because I know what I did in designing the content. Here Excel serves as an alternative to a CMS, both for the simulation and for automatically graded questions based on the simulation. I think a good CMS should be able to allow this type of content and I'll explain how as I go along and also what Excel contributes in this regard.
This is an exercise from my principles of economics course. You must Enable Macros to have this function properly. Assuming that, open the file and fill in the info on the login sheet. (This file is sitting on your own computer. You are not giving away any personal information. Lie about your age and say you were born after 1980.) Then go to the ReservationPrice worksheet.
The upper pane is "the experiment area" and you should leave it fixed. The lower pane is "the quiz area" and you should scroll down as you progress through the exercise. Push the buttons in the upper pane to change the values in the graph and the table. The price buttons affect the table and the pink point. The scenario button only affects the graph.
After you've finished experimenting, reset both prices to zero and then proceed to answer the questions in the bottom frame. See if you can get to part B of the quiz, if not all the way through. (I'm betting that if you get to part B you will finish the quiz.) The design is such that you can't make progress unless you get the previous question right.
Now let's consider what is going on here. The buttons in the upper pane change the data value in some cell of the spreadsheet. Change the value in the cell and you change the data table, the graph, or both. The quiz questions take the following general form: if the answer is correct, more content is revealed, including a response in bold green and then another question; if the answer is incorrect, other content is revealed in red italic. The assessment of correctness involves satisfying an equality (the first two questions in part A) or an inequality (the questions in part A that ask for a price pair).
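Stripped of the spreadsheet machinery, the reveal logic is quite small. This sketch is my own rendering of the pattern in Python, not what the workbook literally does; the tolerance, the straddle condition, and the feedback strings are assumptions.

```python
# Sketch of reveal-on-correct quiz logic: check an equality or an
# inequality, reveal the next content block only on a correct answer.
# The specific conditions and messages are invented for illustration.
def check_equality(answer, target, tol=1e-6):
    return abs(answer - target) < tol

def check_price_pair(buy_price, sell_price, reservation_price):
    # hypothetical inequality: a correct pair straddles the reservation price
    return buy_price <= reservation_price <= sell_price

def respond(correct):
    if correct:
        return ("That's right.", "reveal next question")
    return ("Not quite. Try experimenting with the buttons again.", None)

print(respond(check_equality(40.0, 40.0)))
print(respond(check_price_pair(30, 50, 40)))   # straddles 40: correct
print(respond(check_price_pair(45, 50, 40)))   # misses 40: incorrect
```

Because each check gates the next block of content, the student cannot skip ahead, which is the "can't make progress unless you get the previous question right" design.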
One other point I'd like to mention, which is something I learned from some of the Plato programmers who came over to the IT organization after Plato was licensed off as NovaNet. The changes render quickly. The entire page does not refresh. Only the content that has changed refreshes. Since the Excel file sits on your local computer, not on the server, this happens very quickly. This quick response is necessary to keep student interest.
Most current course management systems can do some of these functions ---- in their grade book. They can render data graphically via a histogram of the scores. WebCT Vista has a very nice query tool that can query data columns on whether an equality or inequality holds and return all values for which the statement is true. The grade book is much more sophisticated than the quiz engine in this respect. Why not allow data tables that are grade book like, but that are used to generate content a la my economics exercise? There are quite a few courses that would utilize numeric content of this sort, if they saw the pedagogic benefit.
Would instructors create this type of content? (I probably spent a month to make that workbook, but a fair amount of that was on figuring out what to show. You won't find these type of diagrams in any econ text.) My view is this. If they think they have to make it all themselves, they won't do it. (And then it would be rational for the CMS vendors to completely ignore the ideas in this post.) But they might make one or two to show what they have in mind and then either (a) develop a community with colleagues at other schools who do likewise and trade them or (b) have their students do these in lieu of some other course project and then re-use the good ones.
The commercial vendors can't afford to experiment with content ideas that may never pay off. But we in academe can. So we should be the ones to develop the more interesting pedagogic applications. And we should be the ones to pilot them. Many will fail, if not technically then because they don't generate sufficient interest. The ones that succeed can then be brought into the commercial environments. This is the juncture that departs from open source thinking. Consequently, this might sound like sacrilege to many of my colleagues. Why should we give away to the vendors the intellectual property that we've developed? (I've heard that refrain over and over again.) The answer is simple. We move on. When we do, what we developed earlier, at the time an asset, becomes a liability. Now we've got to maintain it, but our interest lies elsewhere. Let the vendors maintain it instead. And let them earn a decent return on their software so they have an incentive to do just that.
Let's begin by noting that most faculty currently use the CMS to put up their content. But this content is mostly documents; PowerPoint is the quintessential example. In the main, when we talk about instructors improving the online part of their course pedagogically, we talk about them using other tools, such as the discussion board, that add human to human interaction. What about improving the presentation content itself?
Way back when, about ten years ago, Alfred Hubler, a faculty member in the Physics department, was working on a software system he developed called CyberProf. His goal was for his system to enable really good content. I got friendly with Alfred over the next few years, and I know from discussions with him that Plato was a source of inspiration and that he looked at Mathematica, rather than other course management systems, as the main rival. The idea that the CMS can enable really good content is why we started to support Mallard in SCALE and why our Biology units on campus now use LON-CAPA.
What does really good content look like? Here I'm going to take an extreme position. It's not the graphical design or the Flash animation that matters. (Elegance and simplicity in presentation do matter, but bear with me for a bit.) What matters is that the content promote critical thinking. The content must help the student visualize the idea and reason through the material. Furthermore, the content should take advantage of the computer and display.
I'm going to be unfair and pick on a product that some of my staff like, StudyMate by Respondus. This is an easy-to-use tool that makes Flash objects one can use in the CMS. These objects remind me of HyperCard, the old Apple tool. Indeed, among the objects one can make with StudyMate are Fact Cards, Flash Cards, and simple quiz questions, all nicely designed. These would be really great for my kids (5th and 7th grade) for studying social studies or language arts. But is that what we want to be promoting in college? I can see it being used in a foreign language class or a class that has a huge amount of terminology. But in the main, I don't see it.
A lot of the better online instructional content is done via Java applets or some other software that enables animation. For example, here is a nice page on the Monty Hall Paradox. I like that page a lot in terms of what it delivers both to the student and to other statistics instructors. But it leaves as a mystery how to make the applet. So those who can, do. Most instructors can't and therefore don't.
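For the curious, the logic behind such an applet is only a few lines. Here is a rough sketch in Python of the underlying simulation (my own stand-in, not the code behind that page):

```python
import random

def monty_hall_trial(switch: bool) -> bool:
    """Play one round; return True if the contestant wins the car."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # The host opens a door that hides a goat and isn't the contestant's pick.
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Switch to the one remaining closed door.
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

def win_rate(switch: bool, trials: int = 100_000) -> float:
    """Estimate the probability of winning under a fixed strategy."""
    return sum(monty_hall_trial(switch) for _ in range(trials)) / trials
```

Running win_rate(True) hovers around 2/3 while win_rate(False) hovers around 1/3, which is the whole paradox. The mystery for most instructors isn't this logic; it's wrapping it in an interactive page.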
Now I'm going to talk about my own stuff made with Excel, because I know what I did in designing the content. Here Excel serves as an alternative to a CMS, both for the simulation and for automatically graded questions based on the simulation. I think a good CMS should be able to allow this type of content and I'll explain how as I go along and also what Excel contributes in this regard.
This is an exercise from my principles of economics course. You must enable macros for it to function properly. Assuming you've done that, open the file and fill in the info on the login sheet. (This file is sitting on your own computer. You are not giving away any personal information. Lie about your age and say you were born after 1980.) Then go to the ReservationPrice worksheet.
The upper pane is "the experiment area" and you should leave it fixed. The lower pane is "the quiz area" and you should scroll down as you progress through the exercise. Push the buttons in the upper pane to change the values in the graph and the table. The price buttons affect the table and the pink point. The scenario button only affects the graph.
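Abstractly, each button writes a value into one cell, and every view derived from that cell redraws. A toy sketch in Python of that pattern (not the actual workbook's VBA; the "price" wiring is invented for illustration):

```python
class SimCell:
    """Toy model of the worksheet pattern: a button writes a value
    into one cell, and views derived from that cell redraw."""

    def __init__(self, value=0):
        self.value = value
        self._listeners = []   # derived views: table entries, plotted points

    def on_change(self, fn):
        self._listeners.append(fn)

    def set(self, value):
        self.value = value
        for fn in self._listeners:
            fn(value)          # redraw only what depends on this cell

# Hypothetical wiring: a "price" cell drives a table entry and a point.
price = SimCell(0)
table, points = {}, []
price.on_change(lambda v: table.update(price=v))
price.on_change(lambda v: points.append(v))   # stand-in for the pink point
price.set(30)                                 # what a button click does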
After you've finished experimenting, reset both prices to zero and then proceed to answer the questions in the bottom frame. See if you can get to part B of the quiz, if not all the way through. (I'm betting that if you get to part B you will finish the quiz.) The design is such that you can't make progress unless you get the previous question right.
Now let's consider what is going on here. The buttons in the upper pane affect the data value in some cell in the spreadsheet. Changing the value in the cell changes the data table, the graph, or both. The quiz questions are of the following general form. If the answer is correct, more content is revealed: a response in bold green and then another question. If the answer is incorrect, some other content is revealed in red italic. The assessment of correctness amounts to satisfying an equality (the first two questions in part A) or satisfying an inequality (the questions in part A that ask for a price pair).
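The gating logic amounts to a list of graded predicates walked in order. Here's a minimal sketch in Python; the prompts and numbers (40, 25) are invented for illustration, not the actual values in the exercise:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Question:
    prompt: str
    grade: Callable[[float], bool]   # an equality or inequality test
    right: str                       # feedback shown in "bold green"
    wrong: str                       # feedback shown in "red italic"

def unlocked(questions, answers):
    """Count how many questions open up: a wrong answer halts
    progress, just as the worksheet withholds later content."""
    n = 0
    for q, a in zip(questions, answers):
        if not q.grade(a):
            break
        n += 1
    return n

quiz = [
    Question("Reservation price?", lambda a: abs(a - 40) < 1e-9,
             "Right, it's 40.", "No, re-read the table."),
    Question("Name a price above cost.", lambda a: a > 25,
             "Any price above 25 works.", "Too low."),
]
```

With this structure, unlocked(quiz, [40, 30]) opens both questions while a wrong first answer opens none, which mirrors the "no progress until you're right" design.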
One other point I'd like to mention is something I learned from some of the Plato programmers who came over to the IT organization after Plato was licensed off as NovaNet. The changes render quickly. The entire page does not refresh; only the content that has changed does. Since the Excel file sits on your local computer, not on the server, this happens very quickly. This quick response is necessary to keep student interest.
Most current course management systems can do some of these functions, but only in their grade book. They can render data graphically via a histogram of the scores. WebCT Vista has a very nice query tool that can ask of a data column whether an equality or inequality holds and return all values for which the statement is true. The grade book is much more sophisticated than the quiz engine in this respect. Why not allow data tables that are grade-book-like, but that are used to generate content a la my economics exercise? There are quite a few courses that would utilize numeric content of this sort, if they saw the pedagogic benefit.
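In spirit, that query tool is nothing more than filtering a data table on a single-column predicate. A toy sketch in Python, with made-up students and scores (I don't know Vista's internals; this is just the idea):

```python
# A grade-book-like table as a list of records; a "query" filters the
# rows on a one-column predicate, as I describe the Vista tool doing.
rows = [
    {"student": "a", "score": 92},
    {"student": "b", "score": 74},
    {"student": "c", "score": 92},
]

def query(table, column, predicate):
    """Return every row for which the predicate holds on the column."""
    return [r for r in table if predicate(r[column])]

honors = query(rows, "score", lambda s: s >= 90)   # inequality query
exact  = query(rows, "score", lambda s: s == 74)   # equality query
```

Letting instructor-authored content hang off the results of queries like these is all my suggestion amounts to.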
Would instructors create this type of content? (I probably spent a month making that workbook, but a fair amount of that was on figuring out what to show. You won't find this type of diagram in any econ text.) My view is this. If they think they have to make it all themselves, they won't do it. (And then it would be rational for the CMS vendors to completely ignore the ideas in this post.) But they might make one or two to show what they have in mind and then either (a) develop a community with colleagues at other schools who do likewise and trade them or (b) have their students do these in lieu of some other course project and then re-use the good ones.
Saturday, May 21, 2005
How many CMS are enough?
I've had some email threads this past week with colleagues around the country about course management systems and which one is the best bet. One wants to know what to do until Sakai gets mature enough (and if it will ever reach that stage). Should his school go to WebCT Vista now and take advantage of its full feature set or wait for Sakai? Another wants to know whether it makes sense to get on the Moodle bandwagon. Has Moodle beaten Sakai to the punch? Is Moodle a viable solution for campuses that are looking for an enterprise CMS? Or is it just a cheapie alternative?
On a different but related front, last Wednesday night there was a meeting between the FSI steering committee representatives and Illinois Community College representatives about pursuing a statewide consortium for the licensing of course management systems. This coming Wednesday there will be another meeting of the Community College group (which has been meeting for the last 8 months or so and does include a few of the four year people, such as me) to review models such as the one developed by the Ohio Learning Network. Under that model, OLN has negotiated a consortium price with Blackboard, WebCT, and soon Desire2Learn, and some of the bigger schools (Kent State and Cincinnati now, Ohio State in the not too distant future) serve as ASPs for any smaller school that wants to participate.
If Illinois adopts a similar model, I'd like to see my campus in this ASP role, on a cost recovery basis of course, just so we can afford some needed redundancy in the personnel who support our CMS. There clearly are campuses that would want to be hosted by us. We'll have to see how this plays out.
On a different front, I periodically go to the NASDAQ Web site to look up Blackboard, the only CMS vendor that is publicly traded (its symbol is BBBB). Blackboard's earnings per share are finally showing a positive value, reflecting the fact that they've had positive net revenues for the last 6 months or so. WebCT also reported (privately to me and my boss) that they were in the black last year. I don't know how Angel or D2L did financially, but both got some lucrative contracts last year.
I can't see how all these companies and the open source ventures can survive. If I were betting (with somebody else's money, with mine I'd rather keep it in a mutual fund) I'd guess that at least a couple of these will merge or be gobbled up by some other company. My guess as to the acquiring firm a couple of years ago would have been one of the big textbook publishers or perhaps a company that makes administrative systems that support the academic enterprise. But now I'm really not sure.
What does seem clear is that higher ed is hurting financially and the companies that thought they had a sure revenue stream by selling enterprise CMS will have to reconsider. The interesting case now is not with the big schools that have already chosen to go enterprise but rather with the small schools that are currently in lower levels of the product offering.
WebCT dropped its standard edition product a year or two ago and drew the enmity of standard edition customers who were quite content with that offering. But WebCT was losing money on the offering, and they are a for-profit business. Blackboard is in the same boat now with their Basic Edition.
Some of these schools will switch to Moodle and in the near term that probably is the most sensible path. But as I hear about one school after another having significant production and reliability problems, the idea of supporting a CMS on a shoestring budget makes less and less sense. This software is becoming mission critical if it isn't there already. Supporting mission critical software requires a substantial investment.
I don't begrudge the commercial vendors making a buck. In that I seem to be unlike my peers. Most of them want to have their cake and eat it too. If the consolidation does come and we all say at the time, "Woe is me," we'll only have ourselves to blame.
Friday, May 20, 2005
Fragments from FSI
Our faculty summer institute ended today. We had a small group this year, but the ones who stayed till the end were quite appreciative. The last couple of days the plenary sessions had more Q&A. Steven McDonald's presentation on copyright was very well received, as was Norm Coombs's presentation this morning.
Yesterday I had lunch with a few attendees, including a woman who sat near the front of the room during the plenary sessions but who never asked a question there. At lunch it also took her time to warm up, but when she did she asked, "Why didn't you plan some activity during the Wednesday night dinner session?" She caught me off guard. The members of the steering committee were very low key about this FSI. They reported it tough to recruit faculty to attend. Also, at last year's meeting, where I had a presentation to deliver before that dinner, everyone kept going to the bar and it was as if my talk was interfering with their blowing off steam. But each cohort is different, and this attendee, quiet as she was, wanted to work and learn. Before the event this thought didn't occur to me, but during FSI it became quite clear that I have an obligation to those who come, including those from other campuses. I didn't do well by her.
I learned something about usability during Norm's talk. He would much rather listen to someone talking extemporaneously than have someone read. This is coming from someone who uses a JAWS screen reader and is quite used to synthesized speech. He did argue for a text transcript, but it was clear it should be created that way rather than having the text first and then reading from it. I've tried it both ways myself and find I can't talk naturally about economics if I'm reading aloud.
Many people were very appreciative about FSI. I got this both at the steering committee dinner last night and at the end of the conference today. Almost everyone I can recall expressing appreciation was not from my campus. The people from Carbondale and Edwardsville said they collaborated (they were in learning technology support) but the rest of their campuses did not. There is something to scratch your head about.
Steve Schomberg, who was the force behind the first FSI, came to the dinner last night. Several years ago he passed the baton to Jim Onderdonk in running FSI. The last year or so, Jim has likewise passed the baton to Wendy Pickering. Similarly, many people on the steering committee now have had the job handed to them from predecessors who had more of an administrative role but less of a direct support responsibility. In contrast, I've been the head facilitator for the whole nine years that we've had FSI. When I was running SCALE, it had an outreach mission and so this made sense. CITES doesn't have such an outreach mission and so there really isn't anyone to pass the baton to if we were to do this again next year.
Simple kludges are still best. During most of the conference, if a presenter had audio on their computer we plugged a wired amplifier into the headphone jack and got sound out of the computer that way. Burks and Curtis brought their own speakers. But for Norm's talk today, for whatever reason, it just didn't work through the headphone jack, so he removed the wireless mic from his sweater and held it near the speaker built into his laptop. It worked fine. Norm is blind but he could navigate that ok. Common sense rules.
Thursday, May 19, 2005
A critique of: Is totally online....?
Today I'm going to critique my own post from yesterday. In other words, I made the argument but I don't necessarily buy the argument. I will begin by continuing to focus on the demand side. Then I will switch and discuss the supply side.
When I was 40, though I didn't realize it at the time, I began to switch careers, from economics faculty member to administrator of teaching with learning technology. The process had fits and starts but for the most part it has been a gradual transformation. I've certainly learned a lot running SCALE, then as Director of CET, and now as Assistant CIO for Ed Tech. There was no formal education associated with this transformation. It was learning by doing, learning by talking with peers, learning by attending conferences (though I feel close to tapped out on that one now) and learning from an assortment of reading in a hodgepodge fashion.
My experience provides one observation suggesting that formal education is not necessary for career switches. One observation is not sufficient to generalize from, so let me make the point differently. Many employers would rather teach their employees on the job than rely on what those employees learned in their formal education prior to working. The reason seems obvious: then the firm knows what it's getting. The formal education is used more as a passport to get in the door. But for someone who has already had a job, been successful at it, and was well regarded in that capacity, the employment history can serve the passport function. The value of another degree is much less. My belief is that many career switches can and should be accomplished without bouts of formal education in between.
Two years ago I did get some rigorous professional development from the Frye Leadership Institute. This is a two week, highly intensive, program that is held on site at Emory University. It is jointly sponsored by Educause, the Information Technology professional organization, and by the Council on Library and Information Resources. Most of the attendees were either IT professionals or Librarians with a smattering of faculty and other administrators also in attendance. I learned later that getting into Frye is highly competitive. The attendees are being groomed to be the next CIOs, University Librarians, or Provosts. We ate meals together, did small group work together, attended presentations ensemble and were variously challenged and provoked by an assortment of very high caliber speakers and guests as well as by the two deans, Deanna Marcum and Rick Detweiler. With the exception of Jacques Steinberg of the New York Times, who at the last minute couldn't come and so did his session by video conference, all other sessions were face to face. (Steinberg talked with us two days before the Jayson Blair fiasco became front page news and it was Steinberg's byline on the article on the front page of the Times that told the story. That experience only added to the aura of Frye.)
We have used a listserv for our Frye class to keep in touch post institute. It was very active the first year. It has almost become extinct this past year. I do keep up with several people I met there by regular email and we meet at conferences and communicate then. But these are add-ons to the core experience. Frye was an expensive, immersive, and transformative two week experience. It was also face to face. My view is that when you look for the creme de la creme of professional development activities, they will be face to face. I think that even Frank Mayadas of the Sloan Foundation, probably the biggest proponent of learning online that we have, would agree on this point. Sloan supports a summer workshop for the top practitioners in the field where they share current research under the Sloan "Five Pillars." The workshop is held at a swank resort in a very comfortable setting and is, of course, face to face.
The point here is that for maximal commitment to learning and total engagement of the learner (meaning the conversations during the breaks in the hallways count as much as what is discussed during the actual presentations) face to face wins. Face to face in this sense is also clearly more expensive and blocks access. Totally online, in contrast, is flexible and thereby promotes access. Intensive, face to face offerings held at swank facilities represent the gold standard in continuing education. Totally online is much more blue collar.
I believe that the gold standard form, though far from utilitarian, exerts a disproportionate influence on how instruction is delivered. The analogy is to the research university and the net it casts. Non-doctoral institutions often have stringent requirements for faculty publication in order to achieve tenure or promotion. Why? If the job of the faculty member is primarily teaching, what is this concern about research? But it clearly is there, and that is because it is how the best (the doctoral degree granting institutions) assure the quality of their faculty. In other words, this trickles down from the top. And that trickle down effect has a big impact on demand. If face to face is perceived as of higher quality, that's what will be offered in professional development programs.
Of course there have been programs with spectacular numbers of totally online students in them. The ones I know about are the SUNY Learning Network, University of Maryland University College, and University of Central Florida. But what of the students in these programs? Are they of the caliber of students who'd be admitted to a graduate program at the University of Illinois? I don't believe we will ever associate the high-enrollment online programs with the educational elite in the way Harvard Business School is so associated.
Now let me turn to the supply issues. Is continuing professional education supposed to be getting students acquainted with the most current research in the field? Or is it supposed to be a form of apprenticeship from experienced practitioners in the field? The two are quite different and likely would be taught by different faculty; the former would be taught by research faculty while the latter would be taught by clinical faculty. Can both types of faculty co-exist at the same institution? Perhaps, but I doubt it. In the main the research faculty populate the research universities and the clinical faculty populate other non-profit universities or work for places like University of Phoenix. There are some areas (though I think the areas are limited) where the demand is for instruction that teaches the current research. So there will be some of this done, possibly totally online. I believe there is much broader demand for courses taught by practitioners.
This doesn't mean there will be no totally online professional education. It just means that the major research universities are likely to be on the sidelines in this industry. And in my opinion that is likely to remain the steady state. The only factor that I see which might upset this equilibrium is if the labor market were to no longer value highly the undergraduate degrees from these institutions. Those undergraduate degrees are a core business of these institutions, even if research is king. It will take a significant dent in that core business to upset the existing order.
Much of higher ed is hurting now due to economic factors. The primary reason is that state governments seem reluctant to contribute their historical share of the cost of education. This doesn't mean, however, that the core business has been hit. The primary effect on demand is with lower income students. They may not attend, and hence less qualified but better able to pay middle or upper income students will be admitted in their place. This is extraordinarily unfortunate as it will severely curtail the upward mobility of the poor and working class. But this factor, in itself, will not drive research institutions toward post-baccalaureate continuing education, online or otherwise.
Now I want to consider a different issue on the supply side. This pertains to improving quality of instruction. Are we at or near the quality frontier on face to face instruction (which includes "Web enhanced" courses)? Or are there many improvements that can be made? If the latter, is this simply a matter of instructors not applying known best practice? Or is it that such quality improvement is an applied research that must happen in situ by the instructor teaching the course? My view is that we are frequently far away from the quality frontier and that situational learning by the instructor is needed to move us closer to the frontier.
Instructors need encouragement to engage in such applied research (and, frankly, junior faculty should not be so encouraged because it will lessen their chance at tenure). Perhaps the greatest form of encouragement is to learn of colleagues who have done likewise, making interesting modifications in the way they teach in response to the learning issues they've encountered in previous offerings of their courses. Innovation in teaching with technology begets other innovation through the charisma and appeal of the early adopters as well as the art of their creations in teaching with technology.
When I got started there was a large cohort of creative faculty who did precisely this. They made substantial changes in their teaching because of technology and then grew coattails from their own efforts. We are not seeing a new generation of instructors emerge to become today's pioneers in instruction. Part of the reason is that in departments where there is a systematic focus to promote the use of learning technologies, attention has turned to totally online instruction. This is true for both the College of Education and the Graduate School of Library and Information Science. The totally online teaching offers minimal spillovers to the face to face alternative.
In my view the top of the food chain in teaching with technology is really good instruction that produces engaged learning. To get there we must hang together, lest we suffer the fate that Ben Franklin predicted. And the issue of whether our efforts should be directed toward face to face or totally online should be dictated by where we think our students likely to be.
When I was 40, though I didn't realize it at the time, I began to switch careers, from economics faculty member to administrator of teaching with learning technology. The process had fits and starts but for the most part it has been a gradual transformation. I've certainly learned a lot running SCALE, then directing CET, and now serving as Assistant CIO for Ed Tech. There was no formal education associated with this transformation. It was learning by doing, learning by talking with peers, learning by attending conferences (though I feel close to tapped out on that one now), and learning from an assortment of reading in hodgepodge fashion.
My experience provides a single observation that formal education is not necessary for career switches, and one observation is not sufficient to generalize from. So let me make the point differently. Many employers would rather teach their employees on the job than rely on what those employees learned in their formal education prior to working. The reason seems obvious: that way the firm knows what it's getting. The formal education is used more as a passport to get in the door. But for someone who has already had a job, been successful at it, and was well regarded in that capacity, the employment history can serve the passport function. The value of another degree is much less. My belief is that many career switches can and should be accomplished without bouts of formal education in between.
Two years ago I did get some rigorous professional development from the Frye Leadership Institute. This is a two-week, highly intensive program held on site at Emory University. It is jointly sponsored by Educause, the Information Technology professional organization, and by the Council on Library and Information Resources. Most of the attendees were either IT professionals or Librarians, with a smattering of faculty and other administrators also in attendance. I learned later that getting into Frye is highly competitive. The attendees are being groomed to be the next CIOs, University Librarians, or Provosts. We ate meals together, did small group work together, attended presentations ensemble, and were variously challenged and provoked by an assortment of very high caliber speakers and guests as well as by the two deans, Deanna Marcum and Rick Detweiler. With the exception of Jacques Steinberg of the New York Times, who at the last minute couldn't come and so did his session by video conference, all other sessions were face to face. (Steinberg talked with us two days before the Jayson Blair fiasco became front page news and it was Steinberg's byline on the article on the front page of the Times that told the story. That experience only added to the aura of Frye.)
We have used a listserv for our Frye class to keep in touch post institute. It was very active the first year. It has almost become extinct this past year. I do keep up with several people I met there by regular email and we meet at conferences and communicate then. But these are add-ons to the core experience. Frye was an expensive, immersive, and transformative two-week experience. It was also face to face. My view is that when you look for the crème de la crème of professional development activities, they will be face to face. I think that even Frank Mayadas of the Sloan Foundation, probably the biggest proponent of learning online that we have, would agree on this point. Sloan supports a summer workshop for the top practitioners in the field where they share current research under the Sloan "Five Pillars." The workshop is held at a swank resort in a very comfortable setting and is, of course, face to face.
The point here is that for maximal commitment to learning and total engagement of the learner (meaning the conversations during the breaks in the hallways count as much as what is discussed during the actual presentations) face to face wins. Face to face in this sense is also clearly more expensive and blocks access. Totally online, in contrast, is flexible and thereby promotes access. Intensive, face to face offerings held at swank facilities represent the gold standard in continuing education. Totally online is much more blue collar.
I believe that the gold standard form, though far from utilitarian, casts a disproportionate net in its influence on how instruction is delivered. The analogy is to the research university and the net it casts. Non-doctoral institutions often have stern requirements for faculty publication in order to achieve tenure or promotion. Why? If the job of the faculty member is primarily teaching, what is this concern about research? But it clearly is there, and that is because that is how the best (the doctoral degree granting institutions) assure the quality of their faculty. In other words, this trickles down from the top. And that trickle-down effect has a big impact on demand. If face to face is perceived as of higher quality, that's what will be offered in professional development programs.
Of course there have been programs with spectacular numbers of totally online students in them. The ones I know about are the SUNY Learning Network, University of Maryland University College, and University of Central Florida. But what of the students in these programs? Are they of the caliber of students who'd be admitted to a graduate program at the University of Illinois? I don't believe we will ever associate the big number online programs with the educational elite in the way Harvard Business School is so associated.
Now let me turn to the supply issues. Is continuing professional education supposed to be getting students acquainted with the most current research in the field? Or is it supposed to be a form of apprenticeship from experienced practitioners in the field? The two are quite different and likely would be taught by different faculty; the former would be taught by research faculty while the latter would be taught by clinical faculty. Can both types of faculty co-exist at the same institution? Perhaps, but I doubt it. In the main the research faculty populate the research universities and the clinical faculty populate other non-profit universities or work for places like University of Phoenix. There are some areas (though I think the areas are limited) where the demand is for instruction that teaches the current research. So there will be some of this done, possibly totally online. I believe there is much broader demand for courses taught by practitioners.
This doesn't mean there will be no totally online professional education. It just means that the major research universities are likely to be on the sidelines in this industry. And in my opinion that is likely to remain the steady state. The only factor that I see which might upset this equilibrium is if the labor market were to no longer value highly the undergraduate degrees from these institutions. Those undergraduate degrees are a core business of these institutions, even if research is king. It will take a significant dent in that core business to upset the existing order.
Much of higher ed is hurting now due to economic factors. The primary reason is that state governments seem reluctant to contribute their historical share of the cost of education. This doesn't mean, however, that the core business has been hit. The primary effect on demand is with lower income students. They may not attend, and in their place less qualified, but better able to pay, middle and upper income students will be admitted. This is extraordinarily unfortunate as it will severely curtail the upward mobility of the poor and working class. But this factor, in itself, will not drive research institutions toward post-baccalaureate continuing education, online or otherwise.
Now I want to consider a different issue on the supply side. This pertains to improving the quality of instruction. Are we at or near the quality frontier on face to face instruction (which includes "Web enhanced" courses)? Or are there many improvements that can be made? If the latter, is this simply a matter of instructors not applying known best practice? Or is it that such quality improvement is applied research that must happen in situ, by the instructor teaching the course? My view is that we are frequently far away from the quality frontier and that situational learning by the instructor is needed to move us closer to the frontier.
Instructors need encouragement to engage in such applied research (and, frankly, junior faculty should not be so encouraged because it will lessen their chance at tenure). Perhaps the greatest form of encouragement is to learn of colleagues who have done likewise, making interesting modifications in the way they teach in response to the learning issues they've encountered in previous offerings of their courses. Innovation in teaching with technology begets other innovation, through the charisma and appeal of the early adopters as well as the art of their creations.
When I got started there was a large cohort of creative faculty who did precisely this. They made substantial changes in their teaching because of technology and then grew coattails from their own efforts. We are not seeing a new generation of instructors emerge to become today's pioneers in instruction. Part of the reason is that in departments where there is a systematic focus on promoting the use of learning technologies, attention has turned to totally online instruction. This is true for both the College of Education and the Graduate School of Library and Information Science. Totally online teaching offers minimal spillovers to the face to face alternative.
In my view the top of the food chain in teaching with technology is really good instruction that produces engaged learning. To get there we must hang together, lest we suffer the fate that Ben Franklin predicted. And the issue of whether our efforts should be directed toward face to face or totally online should be dictated by where we think our students are likely to be.
Wednesday, May 18, 2005
Is totally online at the top of the learning technology food chain?
Both Burks Oakley and Curtis Bonk really promoted totally online in their presentations. In his afternoon talk, one of the schemas that Curtis presented had "virtual university" as the end form in an 11-step transformation process. In this post I want to make two arguments for totally online, based on social need. In the post tomorrow, I'll critique the point of view I articulate here. In both cases I'm sure I wouldn't get the complete endorsement of either Curtis or Burks in what I argue. That is ok. They come at this from a different angle than I do.
Let's begin with the notion of lifelong learning and focus on education that comes after the bachelor's degree. There are two types of formal education to consider: (1) Education that is the production of general human capital, i.e., the degree (or certificate) has value in the labor market. The education is viewed as an investment that is at least implicitly paid for by the higher wages received by the person getting that education. (2) Education that is direct consumption. It makes life better. It contributes to human well being. There need not be a degree or certificate associated with this type of education.
In the first category, the most obvious instance is continuing education within a profession that is mandated by the professional associations and/or licensing bodies. This sort of thing exists, for example, in veterinary medicine. One could envision that it would come to pass in other professions, such as Law, where the Bar Association might require lawyers to take refresher courses to become acquainted with new law, or, like the Department of Motor Vehicles, to assure that previously certified lawyers "can drive" proficiently. It's not hard to imagine that serious ethics training of some sort might be delivered in this manner.
If we focus on continuing professional education of this sort, then totally online has a lot to commend it, especially if there is some external test that would validate the education. Totally online is clearly the least disruptive educational format - the learners can continue in their line of work while they attend classes. And if reputable institutions offered the education, there would be some quality assurance mechanism in the teaching. I fully expect this area to grow. The pace will be dictated by the professional associations themselves.
A second instance would again be professional development in some vertical, but this time not associated with certification by some profession. The executive MBA comes to mind here and one can envision a variety of business certificate programs that might provide a similar function for those who aspire to be business executives. The long term growth of this sort of thing depends critically on how the labor market (which might be internal to a firm, or the public sector) perceives this formal education. Some of that will be tied to general economic conditions, but because the education is readily verifiable, it can serve as a prerequisite for climbing to higher rungs in the job ladder. Again, totally online has a lot to commend it in that the individuals can be working at the same time they are taking courses. I would argue further that in these circumstances less than 100% retention might be a good thing, as completing an online program might also serve as a signal that the employee is serious and diligent.
A third instance would be switching fields of work. Our economy depends critically on the idea of labor mobility. This concept doesn't refer so much to geographic movement as it does to changing from one field of endeavor to another as labor demand patterns change. My own view is that as the population becomes more gray, senior citizens will increasingly find that they continue to work, partly for the income and partly for the sense of fulfillment. The time of retirement might mark a time of career change. If formal education is needed to be successful in the next occupation, there might be substantial demand from seniors who plan to reenter the labor force. Now the argument for totally online is somewhat different. It is a convenience thing. The seniors likely wouldn't be residential students, and to the extent that certain schools concentrated in certain specialty areas, the seniors would rather take the classes online from the good programs than attend the local college, which may have a weaker face-to-face offering.
The other category I want to mention here is that seniors especially, but also others who are time abundant and not income needy, may want formal education to pursue further self-study or small-group study that I would characterize as learning for learning's sake. For example, when I retire I might want to read Plato and Aristotle and be educated in that by philosophers who can direct my thinking in a way that my reading on my own can't achieve. Further, there very well might be emeriti faculty who would find teaching in such an environment matching their own lifestyle. This is another possible growth area.
These are the four areas where I can see my own campus possibly engaging in distance learning in the future, especially if the demographics play out as I've articulated them. I'm much less sanguine about the campus entering the market for an online bachelor's degree or associate's degree, when that is the first postsecondary degree for the individual. Unless the market of 18-22 year olds who want to attend a residential college completely dries up, I see that still being an important market for us in the years to come. But as demographics change, these other areas will likely become increasingly important. That to me is the main reason to emphasize totally online.
Tuesday, May 17, 2005
Faculty Development Activities
This evening concludes the first full day of the Faculty Summer Institute. This is the ninth year of FSI and probably the last time we will do such an event in this format. A few of my impressions follow.
The passions of the attendees are lower than in the past. Whether that is due to budget cuts, to the faculty being not "early adopters" but rather members of the "late majority," or to the novelty factor of educational technology having worn off, I can't say. But to me it clearly is less intense from the attendee perspective. I associate deep learning with a profound emotional response. For this FSI there is no faucet of emotions where I can simply open the tap. We've had good speakers in the plenary sessions. Yet the audience has almost been too respectful of them.
At the Burks Oakley leadoff session, until Burks started to talk about Blogs and RSS feeds, there was little interaction with the audience. At Curtis Bonk's main presentation this morning, and to be sure Curtis puts on quite a show, there was nary a question. The most interaction we've had so far was in Tim Stelzer's presentation on iClicker. I attribute that to Tim's mild mannered demeanor and his comparatively slow pace. The audience could ask questions in his pauses and didn't feel they were interrupting. Attendees from years past would have challenged Burks and Curtis more, their pace notwithstanding. I sat with some folks from Chicago State at lunch. They were genuinely appreciative and eager to talk at the lunch table. So I sense something wrong with the picture. Why didn't they ask questions during the sessions?
On a different point, at the reception last night I talked with a faculty member in the College of Business here who had been a Blackboard user and, after taking some training in WebCT Vista that didn't register, opted instead to use Netfiles (Xythos), mostly because he was distributing lecture notes and answer keys to the students and Netfiles really is fine for that. This faculty member, roughly my age (I'm 50), complained that the training addressed things he didn't care about. I'm of two minds about this. There is the "complete customized view" - the attendee designs the training. And there is the "this is good for you" view - there are things to learn in the new environment that are important, trust us. At play in this particular case, I believe, is a generational issue. This faculty member has had a lot of prior experience with computers and has done some fundamental work in his primary area of research. Seeming stupid about Vista just doesn't match the profile. I will try to help him out next week as a colleague, not as an ed tech support person. With what he wants to do next fall, Netfiles is not sufficient.
The other thing that is noteworthy about FSI is the no-shows in the audience. There were some late arrivals, so I will have to get a new tally tomorrow, but it seemed as if almost a quarter of those registered did not show up. Last year we know that many of the faculty from here didn't go to the plenary sessions and instead only attended the hands-on sessions in the labs. Perhaps the same thing is happening this time around. But the dynamics are different now and we really expected the locals to come in the morning. The attendee commitment to this activity is weak, in large part because the institutional commitment to their faculty development is weak.
Events like FSI are critical if we are to see progress with learning technology, because faculty need to pick up new teaching ideas from somewhere. They won't generate those ideas purely from introspection nor will they generate them from careful empirics on their own teaching. If the institution doesn't make a greater commitment to these instructors, it appears they won't generate them at all. But in tough budget times, faculty development activities look incredibly non-tangible and, hence, the commitment is bound to be weaker. This is a problem.
Monday, May 16, 2005
Getting students to do the online work
I'm about three quarters through the new book Freakonomics. Given all the hype, I suppose I should have guessed that it would read more like a long NY Times Sunday Magazine piece than provide insight into Levitt's thinking. There is very little on whether Levitt comes to a new project with a thesis already formed, which his look at the data then confirms, or if he is a naïf at first as he looks at the data and somehow the key idea emerges as he starts to uncover patterns. The only part of the book where the reader is confronted with data of this sort is in the section on (K-12) teacher cheating. Apart from communicating that finding patterns in the data is near impossible without a prior model of what cheating might look like, there is little that suggests how Levitt came to think of this topic. And I don't mean the circumstances - he was asked by the Chicago Public Schools. I mean what was his operating hypothesis, if any, when he first got involved with the project. (I've seen other economists who were school teachers talk about erasing students' answers on a test when they thought the student had cheated. So I'm fairly confident that Levitt didn't come up with all of this from a blank slate.)
In spite of this deficiency, and my conclusion that the vast majority of the book was written by Stephen Dubner, the other author, the book is an interesting read in giving background information on a bunch of areas that would not seem to be in the domain of economic analysis. The book does emphasize that people respond to incentives, and that is the major orienting point for an economist doing an analysis. And the book spends much of its time talking about "cheating." Apart from the teacher cheating issue, for example, there is an extensive discussion of whether people on the honor system put their buck into the coffee can when they take a bagel in the morning. Since the guy who supplied the bagels collected the coffee cans and kept data on that, there was quite a story to tell. That some cheating occurs goes without saying. That there is so much honesty is the mystery for economists, especially without overt enforcement.
Now I want to turn to those ideas of cheating and honesty as they apply to instruction and learning. The bagel guy in Freakonomics has enough data to hypothesize that people are more honest when they feel good - their job is secure, the weather is nice, the office is a fun place to work. But even with all that good feeling, they get maybe 90% honesty. Suppose your standard in the classroom is more than 90%. And suppose you teach a large class that makes it hard to ensure that all the students feel good about the course. Now what?
I know one instructor (when he told me this I thought, "that's odd") who, rather than use a course management system to have his students do multiple choice homework, has the students fill in bubble sheets outside of class and then bring those bubble sheets to class. Why does he do this? The answer is so that on the day the homework is due the students will attend the live class session. This is what we economists would call an example of "mechanism design." The students respond to incentives; the instructor designs a mechanism that makes the students come to class. This particular mechanism I certainly would not have invented on my own.
I do think that the presentations we've been talking about in the last couple of posts should be blended with online assessment (automated grading), and this is particularly true in large classes. Part of the value of the assessment for the students is so they can benchmark their own learning and see if "they are getting it." But the other part of this type of assessment is simply to monitor the students and ensure they are held accountable for their effort. Now the design issue is harder, because instead of simply designing the presentation, the instructor must design the presentation melded with assessment.
Furthermore, if the instructor teaches the same course repeatedly, either the instructor must change the assessments from one offering of the course to the next, or the instructor must design the assessments to be cheat proof, in the sense that they are immune to a bright but subversive student posting the answers to a Web site for other students to copy and thereby get the "right answers" without having thought about the underlying questions at all.
My view on these issues is, first, that casting the combination of presentation and assessment in the form of a dialog - first presentation, then assessment, then more presentation on what was just covered and answered, etc. - is a great way to make the content engaging. Then, on the assessment piece, one needs multiple versions of similar questions that are hard to catalog in a simple way. A lot of us favor questions that can be solved numerically and then posed with a random number generator. This makes it easier to generate the multiple versions. Otherwise, one needs to write out multiple versions of such questions by hand (and resort to tricks like having the answers appear in random order).
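To make the random number generator idea concrete, here is a minimal JavaScript sketch of what I have in mind. The function name, the question wording, and the particular distractors are my own hypothetical inventions, not taken from any actual course; the point is only that each call draws fresh parameters and shuffles the answer order.

```javascript
// Hypothetical sketch: generate many versions of one numeric question
// by drawing the parameters at random and shuffling the answer order.
// rng is any function returning a uniform draw on [0, 1), e.g. Math.random.
function makeRevenueQuestion(rng) {
  const price = 2 + Math.floor(rng() * 9);            // price from 2 to 10
  const quantity = 10 * (1 + Math.floor(rng() * 20)); // quantity from 10 to 200
  const correct = price * quantity;                   // revenue = price x quantity
  // Distractors modeled on common student slips (adding instead of
  // multiplying, subtracting the quantity, and so on).
  const choices = [correct, correct + price, correct - quantity, price + quantity];
  // Fisher-Yates shuffle so the right answer lands in a random slot.
  for (let i = choices.length - 1; i > 0; i--) {
    const j = Math.floor(rng() * (i + 1));
    [choices[i], choices[j]] = [choices[j], choices[i]];
  }
  return {
    text: `If price is ${price} and quantity sold is ${quantity}, what is total revenue?`,
    choices,
    answerIndex: choices.indexOf(correct),
  };
}
```

Each call produces a fresh variant, so a posted answer key is only good for the one draw the subversive student happened to see.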
Generating this type of content, especially making it high quality, is extremely time consuming. No instructor in their right mind is going to go this route for their entire course; suddenly teaching a single course would become a lifelong effort. But what about doing enough of this sort of thing so that the instructor has a model of the type of content that is desired? Then the making of this content becomes the alternative to the term paper and it is the students who become the creators. Can that work? I think it has a better chance and should be what we promote.
Sunday, May 15, 2005
Online alternatives to straight presentation
Let me begin with something I tried that worked pretty well with IE on a PC when I used it in my class a year ago. I will get to the lack of universality in a bit, but first let's consider this as is. This is a mini lecture featuring some of my Excelets, voice over, and transcription of the voice. (The first slide that comes up has algebra instead of Excel. The other two slides have Excel.) Like yesterday, this presentation downloads first. You can run the executable from the server and then it auto-installs after you click the OK button.
If IE is not your default browser and you want to see how this works, temporarily set IE to be your default browser before clicking the above link. To do this go to Control Panel, Folder Options, File Types, and then set the application for the htm and html extensions to be IE. You can reset to your preferred browser afterwards. Also recall that IE has the security bar that blocks content. You may have to click that bar to enable the content. This will allow the embedded RealPlayer to play the audio. When you click to the second or third slide, choose Open. This will enable the Excelet to display within the page.
Go to the Revenue Comparison slide. The Excel spreadsheet appears inside a window called an IFrame (an inline frame, which behaves like a separate browser window embedded in the page). The reason for using IE is that Excel will then display within the browser, and the reason for using the IFrame is that the size and position of the window can be predetermined. This falls under the category I call "screen management." That is fairly important so the student can focus on what's central and not be distracted by sidebars.
Let's talk about the good and the bad of this slide. If you push the button within the spreadsheet area the graph moves. Something is going on, but what? You can play with that for a bit to try to figure it out on your own. Or you can restart the audio from the beginning and listen to it as you manipulate the graph. The audio is better than the text here because you can listen to the audio and manipulate the graph at the same time. So you can experiment on your own or experiment in a planned way as the narration guides you. The text transcription may seem like overkill. It is there for two reasons. First, it is useful for anyone who is hearing impaired. Second, it makes the page searchable for keywords. This is helpful if the pages are also used as a reference.
In fact, the text is not really a transcription of the audio. The text was written first and then read aloud to generate the audio. I think I sound a little stiffer that way; it is hard for me to talk about the economics and think it through when I'm reading text off the printed page. But the stiffness notwithstanding, this might not be a bad practice for this type of content. Imagine if students made these things in a course project. An instructor might build up a nice multimedia glossary for a course in this manner.
The key for making this work pedagogically is that the Excelet has to convey some interesting bit of economics and then the voice over has to explain what is being conveyed. The two work well in conjunction. I think this is much better than the straight presentation. And it is brief. This is meant for 10 or 15 minutes max.
Now let me turn to the universal design issue. How do we do this? Well, for starters, let's ditch the IFrame, because if we're going to continue to use Excel we have to expect it to launch as a separate application. My view is that now we'll have two windows. We can envision dividing the computer screen in half and having Excel (as representative of any application that might be used for the student to do some work) in the left half, and the browser, which has the navigation elements, a link to the Excel file, the embedded audio player, and the text transcription, in the right half. If you knew Javascript (I don't) you could probably get halfway there simply by having the script open the browser window at the correct size and position. If not, there needs to be some direction to get the two windows set up correctly. They shouldn't overlap, so one can move from one application to the other without hiding any of the information. This won't look beautiful, but it is doable and it is functional.
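For those who do know some Javascript, here is a minimal sketch of the halfway solution. The function name splitScreenFeatures and the file name lesson.html are my own hypothetical choices; the feature-string format is the one the browser's window.open call accepts.

```javascript
// Hypothetical sketch: compute window.open feature strings that tile
// the screen in two non-overlapping halves - the work application's
// window on the left, the lesson page on the right.
function splitScreenFeatures(screenWidth, screenHeight) {
  const half = Math.floor(screenWidth / 2);
  return {
    left:  `left=0,top=0,width=${half},height=${screenHeight}`,
    right: `left=${half},top=0,width=${screenWidth - half},height=${screenHeight}`,
  };
}

// In a browser one might then open the lesson page with, e.g.:
//   window.open('lesson.html', 'lesson',
//               splitScreenFeatures(screen.width, screen.height).right);
```

Because the two feature strings partition the screen at the midpoint, neither window hides information in the other, which is the whole point of the exercise.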
So now we have something that is fairly compelling and can be used regardless of the computer environment. What's next? Well, for starters we should ask whether the students will view this material. We've given at least a few carrots. In the next post we'll talk about the sticks.