Thursday, November 19, 2015

R̶i̶s̶k̶ Uncertainty Assessment

Today I should be grading, but human nature being what it is I'm procrastinating on that task, though I hope to get to it shortly.  Here I want to put together a few things that have been on my radar the last few weeks.

The first is about something that Richard Levin said in his lecture on campus last week.  This was one of those statements that can be taken as a stand-alone point rather than as part of his larger argument.  (I wrote about Levin's lecture in my previous post.)  This point is about the quantitative reasoning piece of general education.  Levin said students should take probability and statistics instead of calculus, since all students need to understand and perform assessments of uncertainty based on available data, while most of them will rarely if ever use calculus once they've graduated from college.  I had previously read this point in a column by Nicholas Kristof, so I gather the view is making the rounds.

Before taking on the argument, let me observe that the economist Frank Knight is associated with the distinction between risk and uncertainty that has found its way into my title.  Economists embrace this distinction, but to my knowledge it has not yet found its way into common usage by the population as a whole.  To avoid philosophical issues, let me give working definitions of these ideas.

Risk is when there is a frequency notion at root, so one can look at historical data to assign probabilities.  Consider the flip of a fair coin, the first textbook example a student gets exposed to in a course on probability.  The probability that it comes up heads is .5.  Underlying this is the Law of Large Numbers, which for the coin-flipping example says the ratio of the number of heads to the number of flips will tend to .5 as the number of flips gets large.  That there is risk in the world provides a rationale for why there are actuaries, who examine historical data and assess probabilities from it.  Insurance premiums are driven by such risk assessment.
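To make the Law of Large Numbers concrete, here is a minimal simulation sketch; the function name and the sample sizes are my own illustration, not something from Levin's lecture:

```python
import random

def heads_ratio(num_flips, seed=0):
    """Flip a fair coin num_flips times and return the fraction of heads."""
    rng = random.Random(seed)
    heads = sum(1 for _ in range(num_flips) if rng.random() < 0.5)
    return heads / num_flips

# The ratio of heads to flips drifts toward .5 as the number of flips grows,
# which is the Law of Large Numbers at work.
for n in (10, 100, 10_000, 1_000_000):
    print(f"{n:>9,} flips: heads ratio = {heads_ratio(n):.4f}")
```

With a handful of flips the ratio can be far from .5; only with many flips does the frequency settle down, which is exactly why the actuary needs a lot of historical data before assigning a probability.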

Uncertainty is when the situation has novelty to it, so one needs to make an assessment based on the situation alone.  If you think of this from an evolutionary perspective, the canonical question is: fight or flight?  And one of the big points here is that you can't totally divorce the assessment of the uncertainty from the consequences.  In other words, fear can manifest when danger is perceived, even when the situation is in fact benign.  Further, past negative experiences (trauma) can alter the uncertainty assessment.  For a trauma victim it becomes plausible that lightning will strike in the same place again.

A different sort of assessment happens for upside consequences.  After the fact we tend to impute causality even when randomness was fundamentally at play, so we feel deserving of our own good fortune.  I wrote about this a while back in a post called Pluck or Luck.  There I made reference to something called The Just World Theory, a certain type of cognitive bias.

Behavioral economics takes as its basis that human beings are not rational and instead possess a variety of these cognitive biases.  For example, Daniel Kahneman, in his book Thinking, Fast and Slow, talks about WYSIATI (What You See Is All There Is): people make their assessments of uncertainty based on their own experience but ignore the information that might be garnered from the experience of others when that information is not immediately at hand.

Look at two of the more hot-button issues in the news now, the plight of the Syrian refugees and the matter of racism on campuses around the country, and at how different the proposed ways of addressing these issues are depending on whether the proponent is Liberal or Conservative.  Underlying those differences must be significant differences in how the probabilities are assessed.  One might therefore hope that if the population as a whole had a better sense of probability and statistics, some of these differences over how to address social issues would erode.  Alas, I think we should be skeptical of this hopeful view.

There are two big issues to confront here that don't have easy answers.  First, many students get through math courses without ever really internalizing what is supposed to be learned there, so it never becomes part of their own thinking.  This starts quite early in school, when students are first exposed to algebra and geometry.  These kids know that they don't know the math, so they look for alternative ways to get through these classes (memorizing homework problems and lectures) that are entirely dysfunctional for producing understanding.  How much college math, whether calculus or probability and statistics, really gets learned by students with such a shaky foundation in their prior math understanding?  Indeed, early probability courses are based to a large extent on counting and approximation, and many students are not good at either.

The other big issue is that probability and statistics are typically taught in a way that is pretty technical but also divorced from decision making.  So while students can become familiar with the mechanics of a probability calculation, they may never learn when to use such a calculation in practice or to trust it when making a decision.  More importantly, the students are not made to confront their own cognitive biases.  If they were, they might actively resist these courses rather than embrace them.  (There is resistance to these courses now because they are hard.  But there is not resistance because the subject matter makes students uncomfortable.)  Most of us don't like to be told that we're prejudiced and in need of awareness training to alleviate that.
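To illustrate what connecting the mechanics to a decision might look like, here is a hedged sketch of a textbook-style insurance choice; the numbers and the risk-neutral rule are invented for the example, not taken from any particular course:

```python
# Hypothetical numbers, invented purely for illustration.
p_loss = 0.01        # assessed probability of a damaging event this year
loss = 20_000.00     # cost if the event occurs
premium = 300.00     # annual insurance premium

# The probability calculation: the expected loss from going uninsured.
expected_loss = p_loss * loss  # 0.01 * 20,000 = 200

# The decision step courses often omit: a risk-neutral rule compares the
# premium to the expected loss, while a risk-averse person may rationally
# pay a premium somewhat above it.
if premium <= expected_loss:
    print("Insuring dominates even for a risk-neutral decision maker.")
else:
    print(f"Premium ({premium:.2f}) exceeds expected loss ({expected_loss:.2f}); "
          f"insuring is worthwhile only given enough risk aversion.")
```

The arithmetic here is trivial.  The part that is rarely taught is the last step: asking whether the calculated number should actually drive the choice, and how one's own attitude toward risk enters in.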

Let me close with a mention of this piece about how ISIS became a force.  It is an interesting read.  There were many unanticipated consequences from past actions.  One wonders whether it would have been possible to be more prescient than we actually were, but we opted out because such actions "didn't fit the current narrative."  You are supposed to learn from your mistakes, but on this matter one senses there is a lot of willful blocking of learning, precisely because the narrative prevents empiricism from occurring.

It would be delightful to discover that I'm wrong here and that teaching probability and statistics broadly would improve matters significantly.  In the absence of evidence to the contrary, however, I'll stick with my skepticism.  More than the choice of subject, the key issue is whether the student is open to what is being taught.  If the student is not open to really engaging with new ideas, the subject matter counts for naught.
