Yesterday I attended most of a symposium on the future of Higher Ed that was held at the I-Hotel near campus. I hadn't gotten much sleep for a couple of nights in a row - worried about my class and some of the usual stuff about getting up in the middle of the night and not being able to go back to sleep. So I was a bit more crabby and off my game than usual. Today, after a decent night's sleep, I'm ready to chime in. Most of what I heard seemed like the same old, same old, not all that different from an event like this held fifteen years ago. What was different, however, was the strong emphasis on college cost and student indebtedness. You wouldn't have heard about that when I was the one hosting such a conference.
The common sense "analysis" of the problem is that there is a need to find a low cost way for students to get a college degree, so students can reap the reward from the degree and not have to go into hock to do so. Way back when, that was done via night school. First generation, mainly recent immigrant students would get their degrees this way. They'd get ahead of where they would have been otherwise, and eventually their children would go to a residential college and find the good life from that. This pattern has been disrupted by various changes in the economy. The jobs that those night school graduates had gotten have been drying up and the tuition for college has reached a level where that first generation now has no confidence that they can deliver for their offspring. There is a further issue that appears to be emerging on the other side of the labor market - employers increasingly reporting that they can't find candidates for job openings. You hear about that now and then. You don't hear about anyone taking constructive steps to address the issue.
As an economist steeped in general equilibrium theory and the esoterica of subgame perfection in non-cooperative games, as well as the non-economics but relevant systems approach to thinking about management and organizations (Peter Senge, The Fifth Discipline), I have a tendency to give a nod to common sense but to be quite suspicious of its conclusions, especially when it is applied to complex social problems. There is a strong inclination to mistakenly treat a symptom as the root cause and thereby come up with a quite wrongheaded solution. In this case the problem with the common sense solution is that it treats higher education over here and the labor market over there, taking the latter as fixed while it tries to jerry-rig the former. What is needed, instead, is a conjoined solution.
Economists looking at the problem (at least the one authoring this piece) would begin with a theoretical ideal, describe it, and use it to provide context for looking at solutions, imperfect though they might be, that best approximate the ideal. Here the theoretical ideal is given by making a perfect capital markets assumption - the student can borrow as much as he or she wants and will pay back the loan via the increase in wages earned in the future. If all of this were certain, the amount of college would be determined by the usual marginal-benefit-equals-marginal-cost condition that sets the optimal investment level. (Yesterday at the symposium, there was discussion of the "public good" aspect of a college degree. I concur; there is a strong public good benefit. I would add that there is, or at least can be, a strong consumption good aspect - college to be enjoyed as a thing in itself by the matriculating student. I don't want to deny these other benefits at all, and I agree it is too narrow to focus only on the human capital aspects of college education that have market value. But here I will ignore the public good and consumption benefits, as I don't believe that looking at them helps in considering how a college education should be financed.)
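To make that condition concrete, here is the textbook formulation, with notation of my own choosing (schooling level s, direct cost C(s), wage schedule w(s), borrowing rate r, working horizon of T years) - a sketch for illustration, not anything presented at the symposium, and it ignores for simplicity that earnings begin only after schooling ends:

$$\max_{s}\;\sum_{t=1}^{T}\frac{w(s)}{(1+r)^{t}} - C(s)
\qquad\Longrightarrow\qquad
\sum_{t=1}^{T}\frac{w'(s^{*})}{(1+r)^{t}} = C'(s^{*}).$$

Under the perfect capital markets assumption the student borrows against the future wage stream at rate r, and the chosen amount of college s* is where the discounted marginal wage gain just equals the marginal cost of another unit of schooling.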
The first step in moving away from the perfect capital markets ideal is recognizing that all of this is not certain, and that tuition costs are far more certain than the student's future wages in the labor market. So there are risks - first in whether an entering student completes college and then, having attained the degree, in future earnings. These risks are currently borne by the students themselves (and their families). They are risks that can't be diversified away, at least not as things are currently done. And for middle- and low-income students in particular, college as it is currently financed has become too much risk to carry.
The search for low-cost ways to attend college, when viewed this way, is the search for a safety play in this otherwise high-risk universe. But socially, that might be quite wrong. Instead, what might be preferred is to keep the riskiness of the system as is, but to get some other parties to bear the risk. To me that seems closer to the optimum, and the obvious candidates to bear the risk are the future employers, especially those who are currently sitting on a pile of cash and can diversify the risk much better than the students can. This is the impetus for an employer-pay-for-college approach.
Readers, who properly should be skeptical about suggestions such as this one, might ask: if this is so much better than the low-cost college alternative, why doesn't it happen of its own accord? It is a fair question. The answer is that it does happen now, in niche areas, but is certainly not ubiquitous. Looking at the niches might help in ascertaining what would need to be done to make the practice more widespread.
Let's consider two quite distinct areas. One is graduate professional education for mid-career employees who have climbed the job ladder and are highly valued by their employers. In this case, the productivity risk is much less than for new college grads. Further, these employees are implicitly bonded to their employer. Even if the professional education raises their future productivity substantially, they are apt to remain with their current employer, so the current employer has a reasonable belief that it can capture a good share of the productivity gains, thereby rationalizing paying for the professional education.
The other is the remuneration of NCAA athletes in the revenue-generating sports. The pro teams don't overtly pay the college tuition and incidental costs. But implicitly they do, especially by not maintaining minor league/development leagues for players who are strong prospects to make it at the professional level and by having a draft system, so that players leaving college are not immediately free agents. The draft system means that younger pro players are bonded to their teams.
Between these two examples I believe there are the elements of what an employer-pay system must do. First, of course, it must bring a good chunk of the wages it would pay employees (say in the first five years of employment after graduation) forward to the college years. The employers absorb the productivity risk this way. In return for that, they are entitled to extract some risk premium, which would be done by paying those graduates substantially less in that five-year period than they would otherwise earn, the wage reduction covering the college cost and then some.
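To see the arithmetic, here is a back-of-the-envelope sketch. The numbers in it (a $100,000 college cost, a $60,000 free-agent wage, a 20% risk premium) are purely hypothetical illustrations of mine, not data from the symposium or anywhere else.

```python
# Back-of-the-envelope: how big a wage reduction covers the college cost plus a risk premium?
# All figures below are hypothetical illustrations, not data.

college_cost = 100_000      # tuition and incidentals the employer pays up front
market_wage = 60_000        # what the graduate would earn per year as a free agent
bonded_years = 5            # length of the bonding period after graduation
risk_premium_rate = 0.20    # extra return the employer demands for bearing the productivity risk

# Total the employer needs to recoup over the bonded period
target_recovery = college_cost * (1 + risk_premium_rate)   # 120,000

# Annual wage reduction and the resulting bonded-period salary
annual_reduction = target_recovery / bonded_years          # 24,000 per year
bonded_wage = market_wage - annual_reduction               # 36,000 per year

print(f"Employer recoups ${target_recovery:,.0f} by paying "
      f"${bonded_wage:,.0f}/yr instead of ${market_wage:,.0f}/yr for {bonded_years} years")
```

Discounting and wage growth over the five years would change the exact figures but not the logic: the bonded wage sits below the free-agent wage by just enough to return the college cost plus a premium for the risk the employer absorbs.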
Second, the system must address the issue that some other employer will poach the productive employee - getting the productivity reward without incurring the cost of paying for college. If this can happen freely (as is the case now), it would kill the incentives for employer pay. So there must be some legal form of bonding to the employer. Concomitantly, there needs to be some cartel pricing for the wages of new employees, so the solution works both for the students and the employers. Tilted one way, as already argued, the employers won't have an incentive to make this sort of investment. Tilted the other way, it would look like indentured servitude. What is needed is to minimize the tilt in either direction.
Third, the system must address how particular students get matched with their future employers and how the matching can become at least approximately efficient. Now we have a system based on recruiting and internships as the primary identifying mechanisms, followed by a standard market where near-graduates are free agents who collect offers (if they are lucky enough to appear attractive to potential employers) and choose the one they prefer. Employers, in turn, use both the recruiting and the internships as screens to whittle down the list of their preferred job candidates and from there make offers accordingly. Can such a system survive if the wages paid get taken out of the equation?
I suspect not. The incentives to cheat would be too great. Under-the-table payments would emerge as a way for an employer to attract preferred candidates. That would unravel the entire thing. If that is right, a different approach to matching would be necessary, such as a draft like the ones the pro sports leagues run or, instead, something like the National Resident Matching Program, where ultimately the match is determined by the process rather than by who is the highest bidder. This would still afford a reason for pre-play communication like internships, so each side would have more information about the other when approaching the matching process. But it would preclude any bidding that would undermine the system.
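For concreteness, here is a minimal sketch of the deferred-acceptance procedure that the NRMP-style approach rests on (the Gale-Shapley algorithm, in its simple one-to-one, student-proposing form). The students, employers, and preference lists are invented for illustration only; the real NRMP handles couples, multiple positions per program, and other complications this toy version ignores.

```python
# Minimal deferred-acceptance (Gale-Shapley) sketch, student-proposing,
# in the spirit of the National Resident Matching Program.
# Students, employers, and preference lists below are invented for illustration.

student_prefs = {
    "alice": ["acme", "bolt", "core"],
    "ben":   ["bolt", "acme", "core"],
    "cara":  ["acme", "core", "bolt"],
}
employer_prefs = {
    "acme": ["ben", "alice", "cara"],
    "bolt": ["alice", "ben", "cara"],
    "core": ["alice", "cara", "ben"],
}

def deferred_acceptance(student_prefs, employer_prefs):
    # rank[e][s] = how employer e ranks student s (lower is better)
    rank = {e: {s: i for i, s in enumerate(prefs)} for e, prefs in employer_prefs.items()}
    free = list(student_prefs)                   # students not yet tentatively matched
    next_choice = {s: 0 for s in student_prefs}  # next employer each student will propose to
    held = {}                                    # employer -> student tentatively held

    while free:
        s = free.pop()
        e = student_prefs[s][next_choice[s]]     # propose to the best employer not yet tried
        next_choice[s] += 1
        if e not in held:
            held[e] = s                          # employer holds the first proposal it receives
        elif rank[e][s] < rank[e][held[e]]:
            free.append(held[e])                 # employer trades up; displaced student is free again
            held[e] = s
        else:
            free.append(s)                       # proposal rejected; student tries the next employer

    return {student: employer for employer, student in held.items()}

print(deferred_acceptance(student_prefs, employer_prefs))
# {'alice': 'acme', 'ben': 'bolt', 'cara': 'core'}
```

The relevant property is that the outcome is pinned down by the submitted rankings, not by whoever bids the highest wage - which is exactly what the bonded-wage scheme would need in order to survive.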
Can this work? First, let me say this essay is only meant as a brief sketch of what might be done. It is not a fully fleshed out plan, not by a long shot. In adding flesh to the ideas here and in making tweaks to what has already been suggested (Why 5 years of bonding to the employer? Why not 3 years or 7 years instead?), many other related issues will likely emerge. (For example, the system might reasonably serve those future employers who are large and have substantial accumulated cash balances. But what of new ventures just beginning to make their way? How would they attract talent in this system?) Second, my purpose in this piece was not to answer whether this can work. I don't know the answer to that question. My purpose was to ask, why aren't we even attempting to consider this sort of alternative?
It is hubris to refer to your own thinking as out of the box. I do so here not as self-promotion, but because I'm frustrated that others don't seem to be producing non-obvious but interesting possible solutions to the problem. Instead, there is only the conventional thinking, which makes it seem like higher education is now in a race to the bottom, following in the footsteps of K-12 over the past twenty years or more. There was much talk at the symposium about crystal balls and making prognostications for where higher education will be in five years. My prognostication is decline, unless we do something serious and substantial to reverse that. So put on your thinking caps and come up with your own out-of-the-box alternatives. That's what we need to make things better.