Sunday, December 29, 2019

North Is The Dummy

On occasion, I go back and read my old blog posts.  This is typically not a random occurrence but rather a way to follow up on a current thought after recalling that I said something about the issue in the blog.  A few days ago I was thinking about family, how much I knew about my parents' rhythms and how much my kids understand mine.  I've become something of a hermit, partly because of my arthritis, which makes me reluctant to go places where I'll feel physically uncomfortable, but also because I'm out of the routine of engaging in family schtick that endures for more than a moment.  A lot of this reluctance for family banter comes from not getting a decent night's sleep the night before, which leaves me feeling burdened by the slightest thing.  It's kind of sad, really.

With my parents these sorts of issues were less pronounced, until they got very old, in large part because they had routine activities with family and friends that normalized interactions.  One of those was playing tennis.  The other was playing bridge.  I wrote about this some in a post called Devotion, which was crafted after cleaning out my parents' condo in Boca Raton, aided by their caregiver, Beverly.  The post was both a way to process my grief, which was considerable, and to offer up a tribute to Beverly.  My mother remained alive for more than 13 years after my father died, remarkable under the circumstances.  To set the stage I did talk about family life some when we were kids and then again early into their retirement, when both parents were reasonably functional.  Part of that was talking about bridge.

...... My parents (more my mother than my father) were avid if only moderately skilled players. Many of the friends whom they had over on the weekends came to play bridge. Bridge was a big part of my parents' social life. So it's not surprising that my parents taught my brother and me. (My sister, I believe, didn't learn bridge since, 5 years my senior, there wasn't a fourth person around to play with when she was junior high school age. I was too young for that.)

Bridge is a fascinating game to an economist as it bears aspects of incomplete information and communicating privately held information via both the bidding and the play of the cards. When I came to the U of I some of my colleagues (who would soon become my very good friends) had a regular game at lunchtime. I joined right in. One of our number had been a ranked player nationally and he tutored us to raise our level. For a while my game improved steadily as I came to understand the requisite thinking in better play. There is a lot of counting to determine probabilities (or at least a sense of probability) in figuring out the play of each card, using the bidding and the prior play to aid in this determination.

I didn't write this in the piece, but it turns out that bridge is a useful complement for the economic theorist in considering actual decision making.  Economic theory has as its aim to illuminate general principles that explain many seemingly different situations.  In contrast, when playing bridge each hand offers up a unique situation to analyze.  Sequencing the choice of cards to play then becomes an important aspect of the player's thinking.  The aim is to do as well as possible in the given situation.  My mother knew a lot of rules to apply, which eventually turned out to limit her play (and also to limit her in managing interactions with me when I was a teenager).  Partly out of a sense of rebellion and maybe just because it was my intellectual disposition, I wanted to figure out the situation from first principles and the facts that were already known.  Because there is considerable complexity with bridge, I wouldn't get the play right all the time.  But that's what I would try for.  I have a sense that this sort of thinking helped me in my career as an administrator.  Sequencing the moves was important there as well.  I wonder if kids nowadays get practice at sequentially arranging choices in other activities in which they engage.

Recently, I've taken up again reading the bridge column written by Phillip Alder, which is reprinted in the local newspaper.  This was a serendipitous choice, as the paper rearranged where the Daily Jumble appeared, and I already had the habit of doing that puzzle.  Alder's column appeared adjacent to the Jumble.  On those days when I snapped off the Jumble - one, two, three, zing - I wanted some more time to noodle around with the paper.  The bridge column was perfect for that.

It's been a few weeks since I resumed with Alder's column and I want to note some differences for the reader of the column as compared to those who actually played the hand of bridge.  First, the reader does have complete information, as the cards for each player are revealed.  But the reader is then asked by Alder to consider the situation as if looking at only one hand, while also taking into account the information that the bidding and the prior play revealed.  Second, the hand is selected precisely because there is some subtlety in finding the winning play, whether that is for the defense or, more frequently, for the declarer.  This keeps the reader on his toes, to see if he can come up with that winning line of play.  Third, these are typically hands played in some actual duplicate bridge tournament, so the reader gets to compare his imagined choices with what actually happened in the tournament.

There are some interesting conventions employed in the writing of the column.  The first paragraph is always about some famous person or event, taking a particular lesson from that and applying it to the hand in question.  It is a stylistic choice of the writer to begin the piece that way.  It adds a certain charm to the column.  Then the players are arrayed using the convention of directions on a map.  The defense is always East and West, regardless of how the players were actually sitting during the tournament.  The declarer is South.  The open hand during the play, opposite the declarer, is North.  The open hand is called the dummy, ergo the title of my piece.

I puzzled some over why this latter convention is used and whether it makes reading the column easier or not.  I thought of this West Wing episode, where some cartographers argue for using the Peters projection for global maps instead of the Mercator projection.  CJ became completely freaked out by this suggestion.  What was taught in grade school we accept as absolute truth, a fact she emblemizes in this scene.  We don't like our firm beliefs to be disrupted, even if the change is socially necessary.  Armed with that memory, I wondered why the compass positions couldn't match where the players actually sat in the tournament or, if not that, why in the columns the players couldn't be identified as right and left on defense, up as the dummy, and down as the declarer.  Would that really change anything?  Nevertheless, it's not done that way, perhaps another example of the tyranny of the status quo.

* * * * *

A different tradition with my parents, one I didn't write about in that piece on Beverly, was to engage in word play.  Some of that happened spontaneously in conversation.  Making puns in context was a prized activity, one I have passed along to my offspring.  Then, too, instead of playing Scrabble we played anagrams, where you not only made your own words with the letters you had picked up, but you could also steal a word from another player by adding some letters to a word the player had already displayed on the table and then rearranging the letters to form a new word.  Now word play is deeply embedded in me.  I do it compulsively, as a matter of course.

So I start to play around with the title of this post.  I wonder if anyone who reads it would do likewise.  As a preface, this quote might be helpful.

"The past is never dead. It's not even past."
William Faulkner, Requiem for a Nun

I'm guessing that to a lot of readers it will seem now that we are going through a reenactment of the Civil War, albeit this time around it's a cold war not a hot one, and the names have changed so that now the Republicans represent the South in this war.  It seems to me we've been in this war for a very long time, at least since Reagan became president.  The culture wars came first, with William Bennett the government official whom we most associate with the term and The Moral Majority, founded by Jerry Falwell, the non-governmental organization we most associate with the movement.  Subsequently, there was the Contract with America, the TEA Party, and more recently MAGA.  Each time the leadership made an appeal to potential members to join with them as if they were on a holy crusade.  The South understands in its bones that it is engaged in another Civil War, though the aim now is more members of Congress and more Justices who might revoke Roe.

Taken this way, the title of the post can mean that The North is stupid.  It refuses to recognize that this Civil War is going on.  It repeatedly errs by framing things as if we all love America and politics is merely the expression of differences in point of view done in a civil (not capitalized) manner.  Both pundits and politicians then err by accusing Republicans in Congress of venality and hypocrisy while avoiding any talk of politics as war, thereby missing the main point.  During a war all is fair.

We have a long experience of fighting a cold war with the Soviet Union.  We might draw some lessons from that experience.  While there was aggression exercised on both sides then, there was also restraint, which was governed by MAD (Mutually Assured Destruction).  The U.S. had already demonstrated the devastation nuclear weapons were capable of in Hiroshima and Nagasaki.  The arms race ensued immediately after World War II ended.  But the Dr. Strangelove scenario never played itself out in reality.  The subsequent events that the first strike would trigger were too scary to contemplate.  If we tried to apply the lessons from this experience to our domestic politics, what credible threat would The North come up with to restrain the South from its current excesses and thereby eventually restrain itself as well?  Or is that even possible?

I confess that I don't have answers to these questions.  A real answer would require understanding both game theory a la Thomas Schelling and the means by which effective political conflict would occur nowadays.  I'm ignorant on the latter, so I rely on TV shows and movies about political espionage and intrigue to fuel my imagination.  Suppose, for example, it has been discovered that Senator McConnell has squirreled away hundreds of millions of dollars in some offshore tax haven, and via sophisticated electronic warfare techniques those funds can be siphoned away, so he no longer has access.  With hypotheticals like this one can readily construct quite a yarn.  But doing so doesn't get us any closer to imagining what real political cold warfare would be like, with the North fully engaged.  It is, of course, also possible to imagine hot warfare, perhaps as a demonstration of the bad outcome that might ensue.  But it is much harder to consider ways that such warfare would be contained and not escalate, serving subsequently only as a threat against grievous violations of pax politica.

So, I'm guessing this is not an easy problem to solve, even for those who do have suitable expertise to consider realistic alternatives.  But that a problem is difficult to solve shouldn't mean we shy away from considering it altogether, n'est-ce pas?  Shying away might be a different way that the North is the Dummy, and not a very courageous one at that.

Thursday, December 12, 2019

Direct Expenditure On Instruction Per Instructional Unit

The mantra - data driven decision making - has had a big influence on us, some in ways that are obvious to see, others perhaps far less clear.  One new thing I've become aware of is that several students in my class who I would judge are not top notch analytically are nonetheless majoring in econometrics or double majoring in economics and statistics.  Almost surely, this is because the choice of major is driven by the perception of where the good jobs will be after graduation.  Knowing how to manipulate data so its truths can be revealed is an important skill.

But I keep coming back to this Koopmans piece, Measurement without Theory, as a critique of the "data will tell all" view.  As I was trained to be a theorist, I think you need to start with questions that you want the data to inform about.  Those questions then take the shape of a model, where parameters can be estimated with the data and hypotheses can be tested by the data.  Indeed, in the bits of empirical work I did as an administrator, first in the SCALE project, then more than a decade later as the CIO for the College of Business, the questions I was trying to answer drove the inquiry as well as how the data was amassed.  No fancy statistical techniques were employed, yet I knew enough about the data to be able to get at useful answers to my questions.  I'm not saying that you don't need knowledge of econometrics or statistics to understand what's going on.  I'm saying that you need theory too, which is the lesson I drew when Freakonomics was the rage.

There is yet another reason why data won't tell all.  This happens when the information needed to answer the fundamental questions is not present, but is potentially attainable with some effort in data collection. That is the issue I want to address in this piece.  In my previous post I argued that additional instructional resources needed to be put into the first-year experience.  An immediate rebuttal might be something like this - we're a public university and can't afford to teach first-year classes in a more labor intensive way.  Anyone with a traditional view of the public university - meaning geezers like me who remember back to 39 years ago - will likely embrace the rebuttal because exclusively large lecture classes in the first year seemed a staple of how things were done.  In turn, it was how the cost of instruction was kept down.

But things have changed, a lot, since 39 years ago.  For one, as I argued in this post called, The business and ethical dilemmas of undergraduate education at public R1's, tuition has been hyper-inflationary over essentially this entire time interval and now constitutes a major source of revenue for the university.  Purely on the matter of making things transparent to the "customer" (I hate to think of students as customers when they are in my class, but surely the university needs to consider them this way as they or their families pay tuition) there is a need to communicate what they are getting for what they pay.  How much of their tuition goes for direct expenditure on instruction, particularly the instruction they are getting in their classes?  What would a good number look like?  I don't know but my prior is that it should be around 50%.  Having the right data would inform that view. 

Another way that things have changed is the path students take to the degree.  It's now common for students to take the first two years elsewhere, either at a community college or at some other university, and then transfer in, mainly as juniors. Such students have already completed most of their general education requirements and are probably taking fewer large lecture classes.   If the old model had the first two years of college subsidizing the last two years of college, but the transfer students don't pay this subsidy, why should the students who do start at the university pay this subsidy?

I don't want this to be a metaphysics discussion.  I want it to be practical.  Let's begin with this question.  Can we determine a number that measures the size of the subsidy?    Here's a second question. Might some of the subsidy be going elsewhere, e.g., to doctoral education or to research or service?  This is a trickier question to answer.  I don't want to get bogged down by it here, so let me use that question to ask yet a different one.  Might there be other ways where the subsidy manifests?  For example, might some majors subsidize others or some colleges subsidize others?  This you could answer in a fairly straightforward way, with the type of information I'm arguing we need to have.

Now let me talk a bit about methodology.  Instructional units (IUs) are determined by the number of students enrolled in the class times the credit hours the course offers.  The class I'm teaching now has 36 students registered and the course offers 3 credit hours, so the course is generating 108 IUs at present.  Yet I have two phantom students on my roster (students who stopped doing the course work quite early in the semester and who stopped coming to class).  That sort of thing is probably hard to measure from one class to the next, but what are readily measurable are the number of late drops and the number of students who fail the course.  So one can get a more refined view of IUs by excluding those students.
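To make the arithmetic concrete, here is a minimal sketch in Python of the calculation just described.  The 36 students and 3 credit hours come from my own class; treating my two phantom students as if they were late drops is purely for illustration, and the function is my own invention rather than anything DMI provides.

    # Minimal sketch of the IU calculation described above.
    # Netting out late drops and failures gives the more refined view.
    def instructional_units(enrolled, credit_hours, late_drops=0, failures=0):
        raw = enrolled * credit_hours
        refined = (enrolled - late_drops - failures) * credit_hours
        return raw, refined

    # My class: 36 registered, 3 credit hours, two phantom students treated as late drops.
    raw_iu, refined_iu = instructional_units(36, 3, late_drops=2)
    print(raw_iu, refined_iu)  # 108 raw IUs, 102 after the adjustment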

Likewise, enrollments typically vary over the semester.  When I did that SCALE project, the data I got from DMI (the Division of Management Information, which curates institutional data) had 10-day enrollments and final enrollments for all undergraduate sections.  Students can't add a class after day 10 without permission of the instructor, so day 10 is referred to as the add date.  The drop date is much later.  (I think it is day 40 - 8 work weeks into the semester - but I'm doing that off the top of my head, so that number should be verified.)  My preference would be to have looks at enrollments on day 1, day 10, day 40, and then the final enrollments.  The reasons for this are many.  Here are a few of them.

Students engage in some gaming of the course registration process.  They can't tell which classes are easy and which are hard, so they use the first 10 days as a way to sample the classes and their instructors.  Students also have strong time-of-day preferences for when to take classes, but many courses are at capacity early, so they register for something else, at a less desirable time, hoping to get into a more preferred offering at a better time.  Then, students may underperform in a class and so consider dropping it after they learn their scores on the first midterm.  What, then, is the right time during the semester to measure IUs from a theoretical perspective?  I don't have a good answer to that question.  Instead, my preference would be to have several different views to consider and then see if they matter much in doing the expenditure per IU calculation.
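Here is a small sketch, in the same spirit as before, of what having several views might look like, assuming enrollment counts could be pulled at each snapshot.  The counts below are invented for illustration; only the snapshot dates follow the discussion above.

    # Hypothetical enrollment snapshots for one 3-credit-hour section.
    # The counts are invented; the snapshot dates follow the discussion above.
    snapshots = {"day 1": 40, "day 10": 38, "day 40": 36, "final": 34}
    credit_hours = 3

    # One IU figure per snapshot, rather than committing to a single "right" moment.
    iu_views = {when: count * credit_hours for when, count in snapshots.items()}
    print(iu_views)  # {'day 1': 120, 'day 10': 114, 'day 40': 108, 'final': 102}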

Let me talk now about the expenditure side of things.  In my case this is remarkably easy.  I am under contract to teach the one course I am teaching this fall. So my pay in that contract is the appropriate expenditure number.  With full-time faculty, it is somewhat harder.  Teaching load matters as does the fraction of the total time devoted to instruction.  In the model we used when I first started at Illinois, the typical teaching load in the Economics Department was 4 courses per year (fall and spring, summer teaching was extra) or two courses per semester, with the typical allocation that one course was undergraduate and the other was graduate.  Also, the typical time allocation we would state is 50% research, 40% teaching, and 10% service.  If this actually were still the case you'd take the instructor's (9-month) salary, multiply by 40% to get the part of salary devoted to teaching, and then divide that number by 4 to get the part of salary on a per course basis.  
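As a worked example of that allocation, here is another minimal sketch.  The salary figure is hypothetical, since I haven't cited an actual one; the 40% teaching share and the 4-course annual load are the old departmental norms described above.

    # Per-course instructional expenditure under the old 50/40/10 allocation
    # and a 4-course annual load.  The salary figure is hypothetical.
    nine_month_salary = 120_000
    teaching_share = 0.40
    courses_per_year = 4

    salary_per_course = nine_month_salary * teaching_share / courses_per_year
    print(salary_per_course)  # 12000.0 per course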

While I know there is now an official requirement for time reporting, I strongly doubt that most faculty actually track their time allocation.  Further, there is a conceptual issue with measuring cogitation.  If you're thinking of your model while driving to work or while doing the dishes (something I normally did when I was doing economics research), does that time count?  In any event, if you were able to get an hours-per-week measure of the activities, it might produce quite a different salary per course figure.  So the numbers one gets won't be precise, for sure.  Yet it would still be interesting to have those numbers, to get a look at salary per course.

A different issue arises in computing such expenditure for discussion sessions run by TAs.  This is whether their tuition waiver should count in their pay or not. There is some incentive for the campus to count the tuition waiver, as it will raise the expenditure per course number.  But let's face it.  The main reason for using TAs to staff these discussion sections, rather than rely on full time instructors, is because it is cheaper that way.  So I'd like to see the numbers without the tuition waiver included.  And by the philosophy I've articulated above, it really would be good to have both views.  I don't want to argue for a single number.  A vector is better than a scalar here.  We'd get a better sense of what's happening that way.
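To show what the vector view might look like for a TA-led discussion section, here is one more sketch.  The stipend, tuition waiver, enrollment, and credit hours are all invented numbers, used only to illustrate reporting the figure both ways.

    # Expenditure per IU for a TA-led section, reported with and without the waiver.
    # All dollar amounts and enrollment figures are hypothetical.
    stipend = 9_000          # TA pay attributable to this section
    tuition_waiver = 14_000  # value of the waiver, if it is counted
    enrollment = 30
    credit_hours = 3

    ius = enrollment * credit_hours
    per_iu = {
        "without waiver": stipend / ius,
        "with waiver": (stipend + tuition_waiver) / ius,
    }
    print(per_iu)  # {'without waiver': 100.0, 'with waiver': 255.55...}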

Let me close by making one more point.  One of the adjustments that students have made to the current system is to take more credit hours per semester and therefore to devote less time to any one course.  This semester I have one student who self-reported that he is taking 24 credit hours.  Several students reported taking 18 credit hours, and among them most seemed to me not that strong analytically.  This gaming, either by having a double major or by trying to accelerate the time to graduation, is a tyranny of the extensive margin and ends up crowding out deeper learning on the intensive margin, which requires students to put in more time.  Some of the student stress we're seeing is a consequence of this sort of reaction, with the students themselves not perceiving that they should want to find in their studies a form of self-expression.  This gives a measurable reason for wanting to see expenditure on instruction per IU.  The hope is that if we reallocated resources in a way that shows the students we understand their dilemma, we might change the living hell they are currently experiencing into a reasonably nurturing experience.  That should be our goal.

Tuesday, December 10, 2019

Should the U of I Consider Having Pass/Fail Grading for the Entire First-Year Experience?

Last week in the News-Gazette there was an article Demand for mental-health services surging on UI campus.  Many students appear to be under a high level of stress.  I'm seeing it in the one class I still teach.  The issue appears to be national, perhaps global.  Indirectly, you might imagine it a consequence of the rising inequality in society, fueled by the belief that those who get near the top have done so....because they earned it.

This meritocracy view puts added importance on GPA, or so it would seem.  Students then become single-minded regarding their own motivation.  In my class, where students write a weekly blog post, the last post is a review and critique of the course and of their own performance.  This student post, particularly the second paragraph, did a nice job of describing the ethos among the students in the class.  And the comment that followed, in response to my comment, makes particular mention of the grades culture as the primary culprit.  So, one wants to know whether the surging demand for mental health services is a byproduct of the grades culture and, if so, what can be done about it.

I gave a pretty thorough analysis of these issues back in 2015 in a post called, The double-edged sword we refer to as 'high expectations'.  That semester was the first time I saw the lack of intrinsic motivation manifest in a large fraction of my students.  Before that, I actually had pretty good success with the methodology I employed.  Since then, however, not so much.  And what I'm concluding is that it's pretty hopeless to try to address these issues at the course level only.  At the course level, one resorts to incentives, requiring attendance for example, that reinforce the grades culture.  A more systematic solution is needed, one that gets students to step outside their current habits with regard to school, so they can experience what intrinsic motivation feels like.

* * * * *

I was a freshman in college in fall 1972 at MIT.  At the time, MIT had the unfortunate distinction of leading the country in the suicide rate at universities.  They were ahead of the curve in regard to this issue of student stress and mental health.  So, they took some steps to address the issue head on.  The one that I remember the most was moving to pass/fail grading during the first year.  Instead of grades, students would get a written evaluation at mid-semester and again at the end of the semester.  The instructor had to produce these evaluations.

I seem to remember that the pre-meds were somewhat upset with this system, because certain courses that were taken during the first year would be important for their application to medical school, and they wanted to report the grades they received in those courses.  For example, the first semester chemistry course I took was part typical college chemistry for that time but then also part organic chemistry.  And in the second semester, you could take organic chemistry, which I did.  Organic, at that time, was the make-it-or-break-it class for getting into medical school.  So, the compromise MIT opted for was to have "hidden grades" in the evaluation document, which partly defeated the purpose, but maybe was as good as they could do at the time.  There were some other odd consequences of this policy.  I took a probability course in the spring semester.  They wouldn't let me sit for the final exam as I had already amassed enough points via the midterm and problem sets to pass the course. If testing is purely assessment of what has been learned, then this makes sense. But if some learning happens even during the exams, this seems an odd outcome.

No doubt a variety of issues would have to be worked through to change the grading system in this way.  I want to make a different, but related point.  Almost surely there would have to be concomitant changes that require devoting more resources to the first-year experience.  For one, we should take seriously the recommendations of the Boyer Commission Report.  Every first-year student should take at least one seminar class taught by a tenure-track faculty member.  Twenty years ago the university had the Discovery Program that aimed at something like this (though Discovery classes could be taught as a lecture if the instructor so desired).  Owing to various rounds of budget cuts, those classes dwindled.  It seems time to consider restoring them, in light of the mental health situation on campus.

Still, most first-year classes would be large lecture courses.  If it were the TAs who would write those evaluations, they would need to have a manageable number of students to be able to perform this task, which means either reduced section size or fewer sections per TA.  Further, they'd almost certainly need training in how to write effective evaluations of this sort.

So, I'm not holding my breath till the campus moves ahead with this.  But, seriously, if the mental health crisis with students is the canary in the coalmine, we should be asking how to address the root cause.  This seems like a reasonable first conversation to have about that.