Good Thinking, Part 1 - Fact and Interpretation

We live in an opinionated world. Our culture is fueled by pundits - from television to the internet, we increasingly get information from sources who do not simply tell us what they know but what they believe. Whether it's talk radio or television commentary or (ahem) blog posts, the great majority of what we read involves people purporting to think for us - giving us their conclusions and expecting us to share them. None of this is new, but what is new is the way we've blurred the lines between this punditry and simply "finding the facts." The effort to be objective or give both sides, while always imperfect, has now largely been abandoned by large swaths of our world, across cultural, political, and religious lines. We live in the age of the fundamentalist, whether that fundamentalism is one of the Right or the Left, the Christian or the atheist.

At the same time, we have increasingly lost a desire to think well. Perhaps this is a cause of the pundit's rise; perhaps it is an effect. Regardless, the training in logic and rhetoric that was once a hallmark of a liberal arts education is now often sacrificed in the name of mechanical proficiencies. Our aim is to train people in ways that make them successful economic actors rather than successful thinkers and citizens. I don't mean to romanticize the past - it's not like careful thinking was ever common. However, what desire for it did exist seems increasingly to have been set aside.

What is perhaps most challenging about all of this is that we often don't realize our lack of ability to think well. Psychologists talk about the "Dunning-Kruger Effect," which is a way of comparing how much people know to how much they think they know. You might, for example, give someone a test and ask them how they think they did on the test, after which you compare it to how they actually did. What these tests show, mathematically, is something we all know intuitively but fail to apply to our lives. People who know the least about a topic often feel like they know the most. As you start to learn, your perception of competence actually falls (dramatically) before it starts to rise again. The chart to the left illustrates this effect.


This phenomenon very much applies to our thinking. Most of us assume we are perfectly logical and objective about everything. However, this perception doesn't necessarily match reality. Often that confidence persists precisely because we haven't done much thinking at all. Conversely, those people who have learned to think well are often less confident in their conclusions. This can seem discouraging, but it is actually a sign of thoughtfulness.

With all of that said, it's my hope in this and a few posts that will follow to discuss some principles of thinking well. I don't offer this as if I am an expert in the skills of reasoning. However, I have had the opportunity to learn from some truly great thinkers, and I have found these principles helpful as I have sought to develop my own opinions over the years.

Facts and Beliefs
Taking our cue from the above chart, before we can think more clearly we have to muddy the waters a bit. We need to become less certain in order to pursue true certainty. To that end, I want to start this series by drawing a few distinctions in terms of our ideas. The first is the division between facts and our beliefs about those facts. Our assumption is often that "facts" are what our mind contains. We treat our conclusions as established and self-evident. This is part of why we so often fail when we seek to dialog with others. If what I have are facts, then my opponent is attacking the facts, probably maliciously.

Let me offer a better rubric. There are facts in the world, absolutely, and we do access them. To deny this is to fall into total subjectivity. However, for the most part, I do not simply import those facts into my mind. There are things that get close - for instance, if I say "there is a crack in this cup" while holding up the cup, there is very little distance between the fact in the world and what is in my mind. However, most things don't work that way, especially things that matter. Those things start with facts in the world, but we then filter those facts through a process of interpretation. They require combining facts with other facts and going through a process of reasoning. For instance, if I say "my son must have broken this cup," assuming I wasn't in the room at the time - that isn't as certain. I am combining the fact of the cup with other facts, including those about my children's location and their tendency to break things. What I am now discussing is a belief. It might well still be true, but the process of interpretation matters. If you say "the cup isn't broken," we are disagreeing about something right in front of us, and unless one of us is hallucinating, one of us is obviously wrong. But if you say "actually, I think it was your daughter," we are in a position where what is in question isn't the fact itself but our interpretation of the fact. We have a difference in beliefs.

This distinction can be seen in the chart to the right. As we encounter facts in the world, we are constantly interpreting them, resulting in a series of beliefs we hold. It's important to stress that I'm using the word "belief" differently than some. It's common to equate "belief" with "private preference," as if beliefs can't be challenged or tested. That is not what I have in mind. However, the distinction does matter, because it means our beliefs are not as certain as the facts themselves. They are constantly open to challenge and change.

Already, this distinction helps as we think about handling disagreements well. To disagree about facts requires us to treat the other party as, essentially, either insane or intentionally lying. To disagree about beliefs, however, simply means that we have interpreted the world differently. We'll talk more about what contributes to those different interpretations in the next post. What matters for now is that recognizing we are all interpreting the world makes room for listening and learning.

This is not to say that our beliefs have no moral component, or that we can't be right or wrong. We can interpret the world irresponsibly or selfishly. We can be willfully or unconsciously biased. Incorrect beliefs are incorrect, but because they are beliefs, we should always be humble about how correct we might be.

Degrees of Confidence
To make things even less clear, we need to recognize that not all beliefs can be held with the same degree of confidence. If I saw my son run into the kitchen, yelling that he needed a drink until the cries stopped with a sudden crash, my suspicion of him as the nefarious cup-breaker is pretty well-founded. If I just walked in the door from work and have no idea where he is, while I could still be right, I can't be as sure. We need to be mindful not just of what our beliefs are but of how tightly we hold them.

Let me introduce, then, the pyramid of confidence. This is a mental tool I use to try to rank the different things I believe, placing them in categories. As we move up the pyramid, we find beliefs that are more certain than those that came before. However, it is a pyramid because each step should hold fewer beliefs than the one below it.

Here's how the pyramid works. At the bottom - not really in the structure at all - are beliefs that are impossible. There aren't actually many beliefs in this category, but it's there for completeness. Impossibility includes things that are innately self-contradictory, like a square circle. It also includes things that are so out-of-touch with the world that basic reality would have to change for them to be true. To continue our cup example, it is (as far as I know) impossible that the cup transformed into a bird and flew up to the ceiling before suddenly transforming back into a cup and toppling to the floor. Everything else is possible, meaning it contains nothing innately self-contradictory or obviously ridiculous.

The distinction between possible and plausible, the next step, is that a plausible belief isn't wildly unlikely or out of the ordinary. Barring other evidence, I do not think that the cup was broken by a burglar, even if my son proposes this as an explanation. Assuming nothing else was stolen and there are no signs of forced entry, it's just too bizarre to believe. What is plausible in the cup scenario is a much shorter list than what is possible - one of my children broke it, or my wife, or perhaps it broke in the sink without anyone noticing. Importantly, though, we are not yet to the point where it is reasonable to adopt one of those conclusions over the others.

The point where beliefs start to form is that of the probable. We are weighing the plausible scenarios and asking which one is most likely, and how likely that scenario is. If we weigh those factors and conclude it is pretty likely, we might adopt it as a belief. However, probable is still not the same thing as certain. We should be especially open to other beliefs at this level, and we should also be mindful of how probable we think it is. A scenario that is 50% likely is much different from one that is 90% likely. If I know all three kids were in the kitchen but that my younger son is the most rambunctious and clumsy of them, blaming him might be a good bet, but it is still much less probable than if the other two were downstairs at the time.

The last category is certain, and it requires a bit more clarification because it means something more than just "very probable." Certainty is the result of probability plus time and investigation. In terms of the cup, it requires conversations with all three children and my wife. It probably involves asking my son and letting him propose other scenarios. Even then I don't have perfect certainty, but I have done the due diligence to test alternative scenarios and find them unpersuasive. One point there is worth stressing - perfect certainty is impossible for human beings. It exists only in the mind of God. For the rest of us, certainty means we are as persuaded as a person can reasonably be of a conclusion we hold after testing and (often quite a bit of) time.

All of that might sound silly when discussing cracked cups, but it comes into its own when we consider our deeper beliefs. Let me just give two applications of that pyramid of confidence to our thinking.

First, it means we need to make sure we aren't skipping steps up the pyramid. This is a common mistake - we move from possible or plausible to certain without the diligence it requires. An obvious example of this is the conspiracy theory. It makes a series of arguments for possibility - "person A might have talked to person B, who might have also been involved with group C, and so event D could have been planned by person A all along." None of those steps are impossible, although some of them might strain the bounds of plausibility. However, all of that work does not mean the conspiracy is true, or even particularly likely.

This can also sneak into mainstream ideas. For instance, it is common in academic discussions to see a sort of "certainty creep." One academic writes a paper arguing that a view is possible or plausible. A few years later, another paper states it as an established fact, footnoting the original piece without doing the work of moving it up the pyramid. Nobody ever made the argument or put in the time and investigation; it just became something "everybody knows."

Second, and more personally, it means that we need to be honest about where our beliefs fit. That is why I drew the pyramid in the first place. Our tendency is to talk with equal certainty about all of our opinions, but this is enormously destructive. It kills our ability to dialog and learn.

One specific application of this is that, in areas where we lack expertise, we should never hold views as more than probable. For instance, I have beliefs about economics and sports and foreign policy and philosophy. I have those beliefs because forming beliefs is a normal part of thinking. We usually develop opinions pretty early in the learning process, and that is fine. However, I am not an expert in any of those fields. Not even close. So while I hold those beliefs, I try to be honest with myself and others that they belong pretty far down the pyramid. I do not hold them with the same confidence as beliefs in areas I have studied or thought about for my whole life.

Let me offer a practical example of why this matters. Being a pastor, I often meet people who grew up within a certain sort of fundamentalist Christianity. By fundamentalist here I don't mean a set of beliefs, but rather a cultural impulse. Fundamentalism in this sense speaks with absolute certainty about basically everything. The contents of the Apostles' Creed and views on debatable theological matters and flowcharts of the rapture and specific conclusions about culture and politics are all held with 100% confidence. The problem, of course, is that those conclusions are not equally clear. By failing to admit this, what tends to happen in such a fundamentalism is that a person who questions one area ends up questioning them all. Many crises of faith could be averted by simply admitting that I am not as certain about God's will for this election or the age of the earth as I am that He exists or that Jesus rose from the dead.

A second specific application of our need to be honest about where our beliefs fit in the pyramid is that we should always be seeking to learn and listen to opposing viewpoints. The only way to move towards certainty is through careful testing and study. This testing process is why we must seek out voices with which we disagree. Not exclusively - there are benefits from reading those on our side who are more knowledgeable or thoughtful than we are, and we should be doing this too. However, many of the most fruitful books I've read and conversations I've had have come from positions different from my own. Not that I changed my mind, but that by working through their ideas I learned to be more nuanced and careful about my own.

One last thought - when we do encounter new ideas, all of this should also encourage us to be slow to adopt such new ideas as well. There is at times a bipolar tendency in our thinking. We encounter an interesting thought and it becomes ours. Then we read the other side and conclude that we were in fact totally wrong and we rabidly adopt that position instead. Bouncing from interesting idea to interesting idea, we never go deeply into any of them. The process of good thinking should always be gradual, doubting both our positions and our doubts with equal seriousness.

So that's step one in thinking well - recognizing our beliefs for what they are and understanding how confidently we should hold them. In the next post we'll start to describe ways to test those beliefs as we talk about the process of reasoning and the different forms it can take.
