Saturday, February 23, 2019

The Clouding Out Hypothesis

I finally bit the bullet and bought a new iMac for my home office. Actually, it was an act of necessity, as the hard drive on the old machine was failing.  It may have been failing for some time, as many things, particularly startup, were very slow.  But the coup de grace was the inability to launch Microsoft Word, which is one of those apps I use quite a lot, though there is kind of a love-hate relationship there.  Like many others, I hate auto-format.  I want to make formatting choices for myself, not have them imposed by the software.  Partly for that reason, I've taken to doing much of my writing within the Blogger post box, which still does browser spell check but otherwise is neutral.  On that score, I prefer it to Word.  But .docx is a standard, and people expect to be able to communicate with others using Word documents as file attachments in email.

In the interim, while the old machine was failing, then making the purchasing decision, then waiting to receive the new computer and set it up, the workaround I used was to take a Word document that somebody had sent me, drag it to Google Drive, and then edit it in Google Docs.  It's a funny choice, because the university has a contract with Microsoft for Office 365, so I easily could have used their online version of Word and would then have had the document stored in OneDrive.  But I didn't do that, and now I'm puzzling a bit as to why not.  In this case I'm not sure whether my fears are rational or paranoid, but I don't completely trust any online provider that holds information of mine.  I don't trust Google more than Microsoft, but I've been using Blogger for years and years and Google Drive for quite a while. So, trust or not, that horse has left the barn.  (Live in the Midwest long enough and you start using idioms you have no right to use otherwise.)  It's no big deal to add one or two more documents to a Google Drive account that already has quite a few. Start a similar relationship with another company, however, and the old fear crops up again. So I took the path of least resistance rather than confront the fear.

But then my desire for convenience trumped the old fear.  I do most of my stuff in a browser these days and I typically use three different browsers, so I can maintain different identities in each.  Firefox has been my main browser for a number of years.  I use it for both my U of Illinois identity and my lanny.arvan Google identity.  (This blog is accessed via that Google identity.)   I use it for the bulk of my transactions.  I then use Chrome for other things, and with a different Google identity, prof.arvan.  And I use Safari, mostly for the volunteer work I do with yet a different Google identity.

I was kind of dreading having to set up all of this on the new computer, when I learned that browsers now have this "sync" capability.  If you log into the browser on a different machine, but with the same account, all the information the browser stores on the first computer is accessed by the other machine.  This is hugely convenient.   I was up and running on the new iMac much faster than I had anticipated.  But it is also extremely frightening.  Do I trust Mozilla to protect the information I have in the browser?  Likewise, do I trust Chrome?

I should note that the University has a contract with Box.com, and I've put a fair amount of my content there, some of it for teaching, a bit for sharing pdf files with the rest of the world, and then some for redundancy of content on my home computer.  At the time Box was chosen (previously the campus had a contract with a company called Xythos and ran a service called Netfiles based on that, but the service reached end of life), it seemed a secure alternative with some nice functionality regarding previewing and file sharing.  I don't know what Box's reputation is now, but its entire business is secure file sharing, so the incentives are in line for them to run a trustworthy service, at least as long as they have a good chunk of the market that demands their product. So I'm more comfortable with it, but I really shouldn't use it for non-university functions.

With Apple it is a little different.  I have been reluctant to use iCloud, pretty much for the same reason I'm reluctant to use Microsoft's Cloud services, but one reason for me to buy another Mac rather than to switch back to a PC is that I like the application Messages, so I can read and write text messages on the computer.  I prefer that to using the phone.  I don't completely understand the technology at play here, but I gather iCloud must be enabled to make this work. So while I don't store my files in iCloud....

Let me note that software installs are now all downloaded from the Cloud and are actually extremely quick.  The entire MS Office Suite was downloaded and installed in under 10 minutes.  We have a pretty good Internet connection.  (I just ran a speed test, which reports 260 Mbps for download and 23 Mbps for upload.)  Which gets me to the last topic I want to discuss - antivirus and malware protection software.  On the old iMac, after the university cancelled its contract with McAfee, I found an alternative, Sophos, which was free for home use.  I can't tell by usage what is good and what is bad software in this domain, but I can say you could set up a custom scan for an external drive, which I use, so I liked that, and in real time it would tell you the file it was scanning and how many files remained. I later learned that it reports problems when there is a file it can't scan (such as Microsoft DMG files for the installation of Office components).   So I made an account at Sophos and downloaded a free version for my new iMac.  I ran a scan, which worked fine, but a summary of the results was reported on their Web site (in my account).  I found this very frightening.  If they are collecting my scan results in summary form, what else might they be collecting but not telling me about?

So I uninstalled their app - they do provide their own uninstall program - and looked for some other antivirus program I might use.  I did a quick Google search on "antivirus for mac" (without the quotes) and then I made a mistake, probably from being stressed out about this and some other things.  I clicked on an ad listing, rather than the first real listing.  So I ended up buying a product I hadn't heard of before called TotalAV.  It was the number one rated product on the ad site.  But I read a review of it this morning, which found it mediocre.  It also said the low price was for the first year only and that they auto-renew for the second year at a much higher price.  While that is a concern, my real question is whether these antivirus companies are harvesting your data without your knowing about it.  What other company do you allow to do a full scan of your device?

And, then, if they are capturing a lot of information about you, does it really matter whether your content is on a local machine or in the Cloud?

“Just because you're paranoid doesn't mean they aren't after you.”
― Joseph Heller, Catch-22

Sunday, February 17, 2019

Restoring Usage of Whom and the Subject-Object Distinction

I post a rhyme to Twitter on a daily basis.  Sometimes my focus is entirely on the box that contains the Tweet.  But at other times I do check out what else is on the home page, both the items that follow my Tweet and the content of the sidebars.  Invariably, I will then get dismayed by the content in the right sidebar where there is a box, "Who to follow."  It is not that I'm opposed to getting suggestions of people who write Tweets that might interest me, though I typically ignore these suggestions.  (I'm far behind in my reading as it is.)   Rather, it is this misuse of the word who.  The label of that box should say, instead, whom to follow.  Below I will make the case for this as well as offer an explanation why there seems to be confusion about using the word whom in a sentence.  That confusion leads to avoidance in usage.  (There is a similar though not identical problem with (non) use of the word me.  We can talk about that at another time.)

Before getting to my analysis, however, let me make the following observation.  There very well may now be a crisis regarding large numbers of people who are unable to distinguish truth from fake news, fact from myth, and who can't discriminate situations where reasonable people can disagree from other situations where there should be no disagreement. This is not my observation.  I'm simply echoing it here.  What I want to ask here is this.  What is the source of this lack of discernment?  I'm willing to believe there are multiple causes rather than just one.  Might one of those causes be that people don't understand basic elements of English grammar, so they don't have the right language at hand to consider these distinctions?  If so, would it then make sense to teach English grammar more thoroughly than is done now?  Or would this be yet another area where students get taught but don't really learn, in this case because the bulk of the students are not doing enough independent reading, and without sufficient reading there isn't enough practice to cement the grammatical ideas in their understanding?

I really don't know.  Indeed, I don't consider myself a grammarian, though my mother was a foreign language teacher and as a consequence I probably got more grammar lessons than most kids did.  In writing this piece, I'm doing a lot of little Google searches, so I don't say wrong things about the grammar.  While I had an intuitive feel that what Twitter does with the phrase "Who to follow" is wrong, I had to work through the argument to support that intuition.  Here is the justification.

Consider this sentence.  You should follow these people.   In the sentence, you is the subject and these people is the object.  Should and follow are both verbs.  I looked up should in the dictionary and it described it as an auxiliary verb - a verb that modifies other verbs.  It's not a term that I had heard before (or perhaps I did but have long since forgotten it).  In this case should modifies follow.   So, should follow is the properly modified verb in the sentence.  I take it that this sentence is what the folks at Twitter mean with the phrase "Who to follow," but if these people is the object of my sentence, then the correct pronoun to relate to these people is whom, not who.  Whom is the objective form of the pronoun.

Let's slightly complicate things by considering a sentence with two clauses, while still aiming to imitate the sentence in the prior paragraph.  These are the people whom you should follow.   In this case, these are the people is the independent clause, while whom you should follow is the dependent clause.  Whom is a relative pronoun in the dependent clause that relates to these people from the independent clause.  And it is still whom rather than who, because it is the object of the dependent clause in spite of the word order.   The word order is one factor that can confuse people.

Consider this more abbreviated form of the sentence.  These are the people whom to follow.   There are still two clauses here.  But in this case the subject of the dependent clause, you, is implicit as is the auxiliary verb, should, while now the preposition to is added, to distinguish from the case where these people are the ones doing the following.  That sentences might have implicit subjects and auxiliary verbs is another factor that can confuse people.

The last step is to observe that our use of language does tend to abbreviate things even more.  The entire independent clause can be made implicit.  Then what is left is the phrase as it should be in the Twitter sidebar - Whom to follow.

Anybody who actually has read to this point must be asking, do we really need to belabor this?   I understand the analysis, but this is no big deal.  I'm guessing that this would be the typical reaction to what I've said so far.  Here is why it might be a big deal.

First, the people who made this mistake aren't hayseeds.  They work at one of the major tech companies in the world.  Twitter may be smart software in many other ways. (I will leave the analysis of that proposition to others who are more tech savvy than I am.)   Yet regarding language use, this is a pretty elementary mistake.  If Twitter personnel can make this mistake, one might argue that most anyone could make the mistake.  (Let's hope that those who teach English grammar would not.)

Second, there is the question of why the mistake was made.  There seem to be two possible explanations.  One is that the people at Twitter were not capable of doing an analysis like the one above.  The other is that they were quite capable of doing the analysis, but they did it quickly, so they didn't do it carefully, or perhaps they didn't do it at all.  It's this second explanation that I find frightening and that we might take as the canary in the coal mine.   Where else in their work are people taking shortcuts instead of thinking things through?  How lazy do people get as thinkers from quite frequently not putting in the time to do the analysis?  And how much pressure are people under at work to not put the time in on any one task, because there is so much other stuff to be done?

Of course, I don't know the answer to these questions.  But I have an intuition.   It is based partly on my teaching, where students have told me they skimmed pieces I recommended that they read, and partly on some online discussions that I occasionally participate in, where much of the commentary seems quite shallow and where it takes an effort to get a few participants to delve deeper into the topic; when that happens, it is more the exception than the rule.   The intuition is that largely we are making mistakes like the mistake in the Twitter sidebar, and with much greater frequency than we care to admit.

What if that is true?  Is there anything we could then do about it to reverse the trend?   Before getting to my proposed answer, let me observe just how odd this is.  These days it's impossible not to see pieces about artificial intelligence and viewing AI as a big jobs killer.  Computers can do the work more reliably and do it cheaper than people can, especially if the work itself is routinized.  Yet people who are working seem to be so incredibly busy that they have no time to think!  What is wrong with this picture?

Language is fundamental to thinking.  Thought is internalized conversation with oneself.  If people are going to think through things well, they first need to use language well.  This provides the basis for an argument that people should learn English grammar reasonably well, as a thing in itself, because it is an enabler of their thinking.  Yet the bigger reason to learn grammar well is as an emblem.  People need to embrace being thoughtful and to realize that thinking doesn't come on the cheap.  It takes time and patience to be thoughtful.  Making a big to-do about using whom where appropriate, and understanding when the relative pronoun is the object within a dependent clause, then serves as a reminder for people to be careful in their thinking, regardless of the setting.

Further, we know that as people learn, once proficiency has developed those thought processes become autonomous and then can occur comparatively quickly, so that people who continue to practice being thoughtful can direct their thinking at increasingly complex matters.  Making a big deal about the subject-object distinction is not just a pedantic matter, even if it might seem so at first.  It is about encouraging people to be thoughtful in how they go about their work and their life outside of work.

So, we should make a big deal of what may seem a very small thing.

Monday, February 11, 2019

Ring Around The Rosie In The College Economics Classroom

I'm not yet sure whether I will be teaching this fall, nor, if I do teach, whether it will be for the Economics Department.  But for the sake of this piece let's say it will happen and it will be the course I had been teaching on The Economics of Organizations.  Periodically, I consider experiments I might try in class to improve things.  These are not experiments in the sense of the scientific method.  They are reflective practice, where once I've tried something I then do an informal evaluation of it to see how the students react.  If it isn't a complete disaster, I will likely try it again the next time I teach and then also do an informal evaluation. At least, that's how it's worked in the past. My future teaching is more uncertain now, yet I'm still intrigued by the possibility of this experiment, if I do indeed teach this time around.

The underlying motivation is whether some sense of social conscience can be taught, with the teaching done in a way where it will matter to the students.  The ideal I have in mind is that a student who is doing well in the course, and in school more generally, takes it upon herself or himself to assist a teammate who is struggling, with my class and quite possibly with school more broadly considered.  Further, the good student is sufficiently discerning so as to distinguish the struggling student from another teammate who really is shirking but is otherwise well adjusted.  Some fairly recent experience I've had suggests that the good student will instead confound the two situations and treat them both as if shirking has occurred. The good student becomes embittered as a consequence (shirking seems so widespread).  Imagine the difference in mindset and attitude if the good student did put in the effort to help the struggling student and could see some benefit coming from that effort. That ideal outcome may only be the delusion of a doddering old professor.  Yet it is such delusions that have motivated my previous experiments.  So why not another one now?

Now, I confess this is not a well researched idea.  It comes, instead, from a couple of different family experiences.  The first was when one of my kids was young and struggling to learn to read, even as this same kid had qualified to be in the "gifted program" at his school.  This discord impacted the family in a big way.  Eventually, we pulled the kid from the gifted classroom and took him to a reading specialist outside the school setting.  The specialist did help.   She used a multisensory approach, something I had never heard of before, but apparently one that works well with kids who have dyslexia.  I have some vague memories of my son having the fingers of one hand in a metal tray with sand in it while they were working through the reading exercises.  I do not understand why this was helpful, but apparently it was.   The other experience happened much later in life.  I was raised Jewish, but my wife is Methodist.  When visiting some of her family, they would offer a prayer before dinner.  During the prayer, people at the table held hands, showing solidarity with one another.  I'm not big on prayer in the open, but the holding of hands like that made an impression on me.

So, I suppose that, subliminally, I've wondered for some time whether adult learners could be partially engaged by their sense of touch, rather than making all the teaching and learning purely about their cognitive processes. Indeed, a few years ago, after seeing a lecture by Harry Boyte, I wrote a long blog post because that lecture had a significant impact on my own thinking.  I was physically uncomfortable during the talk (my neck was hurting), so I didn't engage Boyte after the talk was over, but then we did have an email thread, where I shared that post with him and we talked about related ideas. I liked very much his metaphor of educating the head, the hands, and the heart as the way we produce good citizens.  (Good citizens and social conscience are essentially the same for me, though the former may convey more the acts of good citizenship while the latter might speak more to the mindset that encourages those acts.)  These days it seems that among the people I know educating the hands is about learning to cook, a good life skill, one where I am substantially below average, so I definitely won't be teaching my students that. Where I talk about using the hand in what follows, it is as an instrument of the heart.  I want to focus on using the sense of touch not for other things but for other people.  Holding hands is a way to feel other people and, I hope, to support a cognitive sense of other people as well.

In a nutshell, I'd like to teach a class where students hold hands during the class session to see whether it impacts how the session goes and whether the students appreciate their fellow students more as a result.

But while this would seem perfectly natural in elementary school, it's a bit weird for the college classroom.  So the question is how to initiate this in a way where the students might perceive it as daffy (I do other things that they find daffy, like having them write blog posts), but even so think it benign and non-threatening.   The kids' game/song Ring Around the Rosie is there to help with that.  My hope is that most of the kids in the class would already be familiar with it, which would be a big plus.  My class features students bringing their own experiences and tying them to the subject matter we study.  This would be in a similar vein.

In trying to envision how it would work, I would begin by saying that we're going to play a children's game as a way to help us learn what we want to study.  Before getting to the game I will note that in most of the economics they have been taught, each individual cares only about their own consumption bundle.  Their preferences depend on that, but typically don't directly depend on other people.  Yet in organizations people need to depend on other people - their co-workers, their managers, people they themselves may manage, etc. This dependency necessitates something of a sociological approach.  So we're going to play a game to get the class into the right mindset.

In previous years, I had already experimented with seating the class in discussion mode, by which I mean moving the tablet armchairs from their customary position and instead placing them around the periphery of the room (in more of a horseshoe than a circle, though the exact shape would depend on how many students were there).  This much is now a common practice for me when we are not in lecture mode, which I do sparingly to cover some of the economic models that the students don't seem to be getting from the online homework.   Once seated in discussion mode, I would ask the class - how many learned Ring Around the Rosie when they were kids?  Among those who raised their hands I'd ask - does anybody want to recite it now to the class?  If I could get a taker, that would be great.  If not, I would recite it myself.

Ring around the rosie
A pocket full of posies
Ashes ashes
We all fall down. 

Then I would explain that we are not going to sing it aloud as a group, because that might disturb some of the adjacent classrooms (there have been complaints in the recent past about noise from showing videos during class).  And we won't do the last part where we all fall down, because I would have trouble getting back up again.  I'd hope that would get a laugh or two.  Then I would tell them I'd like them all to hold hands.  Please do it gently, and please note this is a classroom exercise only.  It is not aimed at improving their social lives outside the classroom.  I would conclude this bit by observing that groups work better if they've had a prior bonding experience.  So they should consider the hand holding activity from that perspective.

Then I would move us into discussion and observe that if they have each of their hands holding the hands of other people, one to the right and another to the left, then they don't have a hand free to raise to signal they want to contribute to the discussion.  So, they'd be paired up to free one of their hands.  If we had an even number of students, that would be easier.  With an odd number, one would be paired with me. That might provide its own mild interest.

I probably would want to try this in the second class session.  I typically cover in an abbreviated form Akerlof's model of Labor Contracts as Partial Gift Exchange on that day.   The discussion would be about getting students to give examples of places they worked or organizations they were part of where people did the bare minimum, nothing more, and then other examples where people contributed much more than they had to do.  Then we'd get at their own conjectures for why the one or the other.   We'd then talk about which they would prefer and why that is.

About 5 minutes into the discussion I would make a quick scan of the class and make a mental note about whether the students had relaxed or if the hand holding still made them feel awkward.  I'd also want to subsequently get an impression of whether the hand holding had an impact on the class discussion.  Perhaps it gave some comfort to students, so they were more willing to participate.  In any event, I would make some mental note of my own impression of this.  But it is hard to process this way and conduct a class at the same time.  So I would have students evaluate the class after it was over.

The last time I taught the course I gave those students in attendance the option to fill out a survey after class about the quality of the class session.  They would get a few points for completing the survey, which might end up boosting their grade a little. This meant I would need to take attendance, which I did by means of a class sign-in sheet, and I would have to track the students in the survey to make sure they were actually there.   The last time around I had several Likert-style questions to rate the class discussion in some way, and then one paragraph question for comments.  Early on in the semester the responses to the Likert-style questions were informative, but later in the semester they weren't.  By then I had a pretty well formed impression of my own about the class.  But the responses to the paragraph question were always interesting to see.  Here, I think I'd have a few different paragraph questions about the effect of the hand holding on the class.  Then I'd conclude with one yes-no question.  Should we do this again in the next class session?

I want to add one more thing and then close.  I had a policy of no electronic devices during discussion mode the last time I taught.  This setup might be more extreme, in that it would probably block taking notes on paper as well, at least for those students whose writing hand was being used to hold the hand of their neighbor.  If I said to the class no paper note taking along with no electronic devices, would that work?  Or would it end up making some students uncomfortable because of the lack of note taking, rather than because of the hand holding?  I would need to work this through before actually trying to implement the idea.

That sticking point notwithstanding, the idea does have me intrigued.

Friday, February 01, 2019

Getting Skin in the Income/Wealth Redistribution Game

I am on Elizabeth Warren's mailing list, which goes to my University of Illinois email.  Either I once gave a donation to her political campaign or I wrote some comment on her Website.  At this point I'm not sure which.  I also get email from other political sources, but some time ago I changed the settings for my email to put those in my Junk Mail folder automatically, so I rarely if ever look at them.  For whatever reason, I have never done that with the Elizabeth Warren emails.  They still come to my Inbox.  I get a huge number of solicitations: vendors who think I'm still working, some who think I am in Medical IT (perhaps confounding me with my brother), and some meant for my wife (we're both L. Arvan).  And there are econ prof ones - publish in this journal that is just starting out, for example.  My campus email largely has been commandeered by folks that I would rather not hear from.  This probably explains my lethargy about keeping the Elizabeth Warren mails.  Mainly, I don't even look at the previews, but this message caught my eye.


I'm going to take this on with my own critique, one that I don't see being talked about elsewhere.  (For example, John Cassidy has an analysis of the proposal.  It gives the usual critique of most tax proposals, based on tax avoidance, and also considers the alternative of taxing accrued capital gains, as opposed to only taxing realized capital gains - when the asset is sold.)  Before getting to that, however, I want to note that I'm not close to figuring out which candidate I favor for the upcoming election.  Criticizing Warren's proposal doesn't preclude me from voting for her in the Democratic primary.   We tend to treat supporting a candidate like rooting for our favorite team.  Fans show loyalty, and in that context I understand that.  (Truthfully, I've given up on the Knicks and the football Giants.  It's just too hard to be loyal now.)  But I don't think the same approach should apply to politics.  I prefer to argue out the ideas of candidates and elected officials when I have a different view of things.  And I would prefer that other voters do likewise. Now, onto my argument.

Why isn't there another category?  For example, why not tax wealth over $10 million at 1%?  This is not just a question of where to draw the line, though that is definitely part of it.  It's whether there is any core principle we can invoke about who pays the tax and who is exempt.  In my book, a household worth $45 million is rich.  Yet under Warren's proposal they are exempt.  Why is that?
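To make the arithmetic concrete, here is a small Python sketch of my own (it is not part of the proposal itself).  I'm assuming the brackets as they were widely reported: 2% annually on household wealth above $50 million, rising to 3% above $1 billion.  The alternative schedule adds the hypothetical 1% bracket at $10 million that I asked about above; the names wealth_tax, warren, and with_10m_bracket are my own labels.

```python
def wealth_tax(wealth, brackets):
    """Annual tax from (threshold, marginal rate in percent) pairs,
    listed in ascending order of threshold."""
    tax = 0.0
    # The upper edge of each bracket is the next bracket's threshold.
    uppers = [t for t, _ in brackets][1:] + [float("inf")]
    for (lower, pct), upper in zip(brackets, uppers):
        if wealth > lower:
            # Tax only the slice of wealth that falls inside this bracket.
            tax += pct * (min(wealth, upper) - lower) / 100
    return tax

warren = [(50e6, 2), (1e9, 3)]                       # as reported
with_10m_bracket = [(10e6, 1), (50e6, 2), (1e9, 3)]  # hypothetical

# The $45 million household owes nothing under the proposal...
print(wealth_tax(45e6, warren))            # 0.0
# ...but would owe 1% of the $35 million above the $10 million line.
print(wealth_tax(45e6, with_10m_bracket))  # 350000.0
```

The point of the exercise is just that a household rich by most any standard contributes nothing under the proposal, which is exactly the line-drawing issue raised above.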

Now let me get to my core issue, which is how to view taxation.  Here are two quite different possibilities.

1.  Taxation is a taking by government of private property owned by citizens. 
2.  Taxation is a way for citizens to express their responsibility to their community, their state, and their country. 

The first view is what generates tax avoidance.  The second view is consistent with people willingly assuming their tax obligations.

The issue for me is this.  Are the views people hold about taxation intrinsic to them, or can they be shaped by how we go about taxation and government spending?   Anybody who reads my blog (I want to thank all 10 or 20 of you who do that) knows I believe that these views can be shaped, and that what is needed, as much as or more than a policy change, is a deliberate educational effort that strongly encourages view (2).  My last few posts have been on this theme, and I've written about it, on and off, ever since the Tea Party rose to prominence in the 2010 midterm elections.

Now I want to switch gears some and talk about metaphors, because I think that is the way people might best be convinced to reconsider their views about taxation.  (Once upon a time, I read Lakoff and Johnson.)  And because consistency is the hobgoblin of small minds, I'm actually going to return to sports as my source of metaphor.

For years and years, I've considered paying taxes as like getting hit by a pitch in baseball.  It hurts, maybe a lot, but getting on base helps the team.  Ron Hunt is the name I associate with getting hit by a pitch.  He led the National League in this category a few times.  Sometimes getting hit by a pitch is purely an accident.  When you lead the league in the category, however, there has to be some intentionality to it.  In any event, this is how I thought about taxation till recently, and I would label the good behavior I wanted to see as taking one for the team.  But my views have evolved. I've been involved in volunteer work that is quite intensive, both in the time I put in (mainly writing) to support the organization and in the money donated.  Both are needed.  And I feel good about giving both.   The feeling good part is what I now want to emphasize.

So I've looked for a different example from sports that is more like my volunteer experience.  The example of Jack Twyman and Maurice Stokes readily came to mind.  Maurice Stokes suffered a paralyzing injury that ended his career, and Twyman took care of Stokes financially (with the help of others) for the remainder of Stokes' life.  Both played for the Cincinnati Royals.  They knew each other as teammates and as friends.  Friends help each other when needed.

What would it take for Americans to view their fellow citizens as members of the same team?  And then, beyond that, what would it take for them to offer up help for other Americans who are hurting by willingly paying more in taxes?

On this score, the problem with a wealth tax as Warren proposes it is that the vast majority of us are not sufficiently wealthy for the tax to have any direct effect on us.  If I hadn't been trained as an academic economist, would I care about whether there is another category above $10 million, or about Warren's original proposal?  Maybe a bit, if I felt the rich are demons and Warren's proposal was a way to give them their comeuppance.  (Trump and his cronies are making that view more prominent.)  But as somebody who is financially comfortable, yet not rich, would I make this the make-or-break issue of the campaign for me?  Or would I focus on other things that seem more directly related to my own financial well-being?  If the latter, would my views about taxation change at all?  I doubt they would.

It has seemed to me for some time that the needed way to get those views to change about paying more in taxes is for people to witness others like themselves doing just that.  It then becomes fashionable, the new black, if you will.  It must begin with a vanguard who do it voluntarily and visibly.  Then others get caught up in it.  It diffuses a la Malcolm Gladwell's The Tipping Point.  This is the way to get everyone who can afford to pay more in taxes to do so.

Now consider a different criticism of Warren's proposal: it uses a 10-year time frame for revenue generation while doing nothing to change how people view paying taxes now, thereby leaving a considerable chunk of the population, including many of the very wealthy, who consider taxation from the perspective of (1) above.  Our recent experience shows that even if Democrats win out in 2020, and possibly 2024, the Republicans will make a comeback.  Then what?  My bet would be tax cuts, of the type that would undo Warren's proposal.  We might get a little more realism in the public debate the next time a tax cut proposal comes around.  Tax cuts may very well not stimulate the economy much, if at all.  So the real reason for the cuts would be that rich people hate paying taxes and rich people disproportionately fund political campaigns.  But I wouldn't count on that realism emerging, and I would expect the tax cuts to happen anyway.

If you want to talk about a 10-year time horizon, you need to explain how the approach will endure even after a regime change that puts the Republicans back in charge.

What if, instead, Eisenhower Republicans make a comeback, because the anti-tax views of the wealthy hard right have been discredited?  Would that make for a greater likelihood that the approach will be sustained even if there is a regime change?  If so, then it makes sense to encourage such discrediting.  It seems to me this can only happen if many people who currently hold views about taxation as in (1) change their views to (2).  In turn, this can only happen if many more people have their taxes raised, so they are bearing some share of the responsibility.  It simply can't work if only the uber rich are the targets.

I will close on this note.  The demographics of who votes Democratic and who votes Republican have been changing.  Red districts now typically have average household income below the national average.  For Blue districts it is the reverse.  If we are to heal as a nation, income redistribution should go from Blue districts to Red districts.  Getting that to happen might be a very tough sell, as most people still think about voting their pocketbook rather than voting to help a fellow citizen in need.  We need leadership to get people to change their views, on taxation and on spending.  What I think I'm seeing instead, in both Elizabeth Warren's wealth tax proposal and Alexandria Ocasio-Cortez's proposal to raise the top marginal tax rate to 70%, is some awareness of a historical norm and then an attempt to reproduce that norm in one fell swoop.  I think we need a process, one that changes minds as well as fiscal policy, taking many steps to do so.  The changing of minds is the key factor missing from our current politics.  If by magic our political leaders could listen to me, that's the one message I'd want them to hear.