Sunday, February 17, 2019

Restoring Usage of Whom and the Subject-Object Distinction

I post a rhyme to Twitter on a daily basis.  Sometimes my focus is entirely on the box that contains the Tweet.  But at other times I do check out what else is on the home page, both the items that follow my Tweet and the content of the sidebars.  Invariably, I will then get dismayed by the content in the right sidebar where there is a box, "Who to follow."  It is not that I'm opposed to getting suggestions of people who write Tweets that might interest me, though I typically ignore these suggestions.  (I'm far behind in my reading as it is.)   Rather, it is this misuse of the word who.  The label of that box should say, instead, whom to follow.  Below I will make the case for this as well as offer an explanation why there seems to be confusion about using the word whom in a sentence.  That confusion leads to avoidance in usage.  (There is a similar though not identical problem with (non) use of the word me.  We can talk about that at another time.)

Before getting to my analysis, however, let me make the following observation.  There very well may now be a crisis regarding large numbers of people who are unable to distinguish truth from fake news, fact from myth, and who can't discriminate situations where reasonable people can disagree from other situations where there should be no disagreement. This is not my observation.  I'm simply echoing it here.  What I want to ask is this.  What is the source of this lack of discernment?  I'm willing to believe there are multiple causes rather than just one.  Might one of those causes be that people don't understand basic elements of English grammar, so they don't have the right language at hand to consider these distinctions?  If so, would it then make sense to teach English grammar more thoroughly than is done now?  Or, would this be yet another area where students get taught but don't really learn, in this case because the bulk of students are not doing enough independent reading, so there isn't enough practice to cement the grammatical ideas in their understanding?

I really don't know.  Indeed, I don't consider myself a grammarian, though my mother was a foreign language teacher and as a consequence I probably got more grammar lessons than most kids did.  In writing this piece, I'm doing a lot of little Google searches, so I don't say wrong things about the grammar.  While I had an intuitive feel that what Twitter does with the phrase "Who to follow" is wrong, I had to work through the argument to support that intuition.  Here is the justification.

Consider this sentence.  You should follow these people.   In the sentence, you is the subject and these people is the object.  Should and follow are both verbs.  I looked up should in the dictionary, which described it as an auxiliary verb - a verb that works with and modifies another verb.  It's not a term that I had heard before (or perhaps I did but have long since forgotten it).  In this case should modifies follow.   So, should follow is the complete verb phrase in the sentence.  I take it that this sentence is what the folks at Twitter mean with the phrase "Who to follow," but if these people is the object of my sentence, then the correct pronoun to relate to these people is whom, not who.  Whom is the objective form of the pronoun.

Let's slightly complicate things by considering a sentence with two clauses, while still aiming to imitate the sentence in the prior paragraph.  These are the people whom you should follow.   In this case, these are the people is the independent clause, while whom you should follow is the dependent clause.  Whom is a relative pronoun in the dependent clause that relates to these people from the independent clause.  And it is still whom rather than who, because it is the object of the dependent clause in spite of the word order.   The word order is one factor that can confuse people.

Consider this more abbreviated form of the sentence.  These are the people whom to follow.   There are still two clauses here.  But in this case the subject of the dependent clause, you, is implicit, as is the auxiliary verb, should, while the infinitive marker to is added to distinguish this from the case where these people are the ones doing the following.  That sentences might have implicit subjects and auxiliary verbs is another factor that can confuse people.

The last step is to observe that our use of language does tend to abbreviate things even more.  The entire independent clause can be made implicit.  Then what is left is the phrase as it should be in the Twitter sidebar - Whom to follow.

Anybody who actually has read to this point must be asking: do we really need to belabor this?  I understand the analysis, but it's no big deal.  That, I'm guessing, would be the typical reaction to what I've said so far.  Here is why it might be a big deal.

First, the people who made this mistake aren't hayseeds.  They work at one of the major tech companies in the world.  Twitter's software may be smart in many other ways.  (I will leave the analysis of that proposition to others who are more tech savvy than I am.)   Yet regarding language use, this is a pretty elementary mistake.  If Twitter personnel can make this mistake, one might argue that most anyone could make it.  (Let's hope that those who teach English grammar would not.)

Second, there is the question of why the mistake was made.  There seem to be two possible explanations.  One is that the people at Twitter were not capable of doing an analysis like the one above.  The other is that the people were quite capable of doing the analysis, but they did it quickly, so they didn't do it carefully, or perhaps they didn't do it at all.  It's this second explanation that I find frightening, and we might take it as the canary in the coal mine.   Where else in their work are people taking shortcuts instead of thinking things through?  How lazy do people get as thinkers from frequently not putting in the time to do the analysis?  And how much pressure are people under at work not to put the time in on any one task, because there is so much other stuff to be done?

Of course, I don't know the answers to these questions.  But I have an intuition.   It is based partly on my teaching, where students have told me they skimmed pieces I recommended that they read, and partly on some online discussions that I occasionally participate in, where much of the commentary seems quite shallow; it takes an effort to get a few participants to delve deeper into the topic, and when that happens it is the exception rather than the rule.   The intuition is that we are making mistakes like the one in the Twitter sidebar, and with much greater frequency than we care to admit.

What if that is true?  Is there anything we could then do about it to reverse the trend?   Before getting to my proposed answer, let me observe just how odd this is.  These days it's impossible not to see pieces about artificial intelligence that cast AI as a big jobs killer.  Computers can do the work more reliably and more cheaply than people can, especially if the work itself is routinized.  Yet people who are working seem to be so incredibly busy that they have no time to think!  What is wrong with this picture?

Language is fundamental to thinking.  Thought is internalized conversation with oneself.  If people are going to think through things well, they first need to use language well.  This provides the basis for an argument that people should learn English grammar reasonably well, as a thing in itself, because it is an enabler of their thinking.  Yet the bigger reason to learn grammar well is as an emblem.  People need to embrace being thoughtful and to realize that thinking doesn't come on the cheap.  It takes time and patience to be thoughtful.  Making a big to-do about using whom where appropriate, and understanding when the relative pronoun is the object within a dependent clause, then serves as a reminder for people to be careful in their thinking, regardless of the setting.

Further, we know that as people learn, once proficiency has developed those thought processes become automatic and can occur comparatively quickly, so that people who continue to practice being thoughtful can direct their thinking at increasingly complex matters.  Making a big deal about the subject-object distinction is not just a pedantic matter, even if it might seem so at first.  It is about encouraging people to be thoughtful in how they go about their work and their life outside of work.

So, we should make a big deal of what may seem a very small thing.

2 comments:

Nikki R said...

First an observation: It will be hard to teach young people better grammar when we do not teach the teachers of young people better grammar. It seems to me that concern over subject-object distinction has fallen out of fashion altogether.

On the other hand, teaching discrimination should not be dependent on proper grammar. Nor should it be limited to any particular subject area. Fact checking can be taught in all subject areas, and should be. However, I’m wondering whether the belief in fake news isn’t a deeper problem - a problem of belief. All people seek “news” or data to confirm what they already believe, even scientists. If the news agrees with their beliefs, then it is deemed accurate; if not, the news is clearly “fake”. To counter this sort of normal human tendency, we need to teach people to understand that questioning the beliefs they are taught at home is not heresy. They also need to be exposed to a lot of differences that may seem to conflict with their own traditions, but which can coexist with their own traditions without disrupting them. In our factory style school system where we expect all young people to develop in all areas at the same pace, teaching the acceptance of differences is hard. We don’t tolerate their obvious differences, so how can we expect them to learn such tolerance?

Lanny Arvan said...

Nikki - thanks for your comments. Here are a few different thoughts in response.

Way back when I was in sixth grade, one other kid and I were given a "programmed book" with which we were to self-teach about grammar. I still remember - ring, rang, rung yet bring, brought, brought. The book taught a rule or lesson, then self-tested. There were alternating sections of white and gray. White was for the presentation. Gray was for the response. You were supposed to cover up the next response as you grappled with the question posed. All these years later, I wonder why it wasn't done by the whole class, only us two. But it makes me think this sort of thing is pretty ideal for computer-guided instruction. Whether that could be done in a way that both educates and entertains the kids, I think is possible, but I really don't know all the challenges entailed in the authoring. However, I think the bigger question is whether a well-authored piece of software would have a market.

Confirmation bias is now well understood, though as you say, even people who are well aware of it may not know how to guard against their own bias. Some years ago I wrote a post - Do I have to consume conservative media to consider myself thoughtful? The upshot of that piece is that it was painful to do so, and I viewed the pieces I read as hatchet jobs - and these were supposedly the more high-minded of the conservative stuff. So I stopped reading authors other than on the NY Times Op-Ed page. And recently, I confess, I've been reading less and less of this stuff. Everything one reads these days is so depressing.

My last point is that if there is some hunger in a person for getting diverse views and experiences, reading books is indispensable. But my sense is that reading books for recreation and self-education at the same time is down quite a lot. It can't compete for kids' attention with whatever is on their phones.