The outcome of any decision or set of decisions has a luck component. That is to say, the outcome of a decision set is probabilistic. Even if we make good decisions, the outcome may not be what we hoped for. What we would like to do is make the optimal set of decisions, the one that provides the maximum utility given the probability distribution of all possible outcomes. I don't want to get too far into utility and decision theory here, but it is close enough to say that maximizing utility maximizes the likelihood of the desired outcome after adjusting for some subjective level of risk aversion.
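For the curious, the standard textbook way of writing this down (nothing specific to this post, just classical expected-utility theory) is to choose the decision $d^*$ that maximizes expected utility over the outcome distribution:

$$d^* = \arg\max_{d \in D} \sum_{o \in O} P(o \mid d)\, U(o)$$

where $D$ is the set of available decisions, $P(o \mid d)$ is the probability of outcome $o$ given decision $d$, and $U$ is the subjective utility function. The "subjective level of risk aversion" shows up formally as the curvature of $U$: a concave $U$ penalizes gambles relative to sure things.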
Utility is inherently subjective. This is desirable subjectivity. People have different tastes, desires, and perspectives. A significant part of utility theory is dedicated to making that inherent subjectivity quantifiable so that an objective, mathematical analysis can then produce the decision set that maximizes that subjective utility.
However, to be able to calculate the optimal decision set to maximize utility, the subjective preferences must be based on an accurate perception of reality and the rest of the input data also needs to be accurate. Therefore, we would all like to have the best understanding of the world and the most accurate data possible in order to maximize our utility.
Jim writes that it's not necessarily critical whether all the facts in what he reads are accurate, because he has a healthy skepticism and constantly asks himself:
"Does this feel right? Does it jibe with the way humans act and systems operate? Does the thread of logic make sense?"
I think that everybody views the world through these exact same filters. In fact, given the uncertainty we usually have to deal with, there is generally little choice except to use these subjective filters and the results from using them are usually sufficient to get us through life.
Let's consider what we're really saying here. I'm going to rewrite Jim's words a little to explicitly state the subjectivity of our perspective:
Does this feel right to me? Does it jibe with the way I understand that humans act and systems operate? Does the thread of logic make sense to me?
These filters contain undesirable subjectivity. We have to use these filters because we often don't have better information. However, they do not enable us to increase our understanding of the world, nor do they help us make decisions that maximize utility. In fact, the filter biases and errors tend to be self-reinforcing and, as a result, they tend to make our understanding of the world more inaccurate over time.
To see why the errors are self-reinforcing, let's take a simple, hypothetical example. Say you think Bush is a good president. When you read something positive about Bush, it passes the "does this feel right" and other filters, so you say, "yeah, that's quite possibly true." Now you have more evidence that Bush is a good president, so you're even more likely to accept future data that supports that belief. On the other hand, if you read something bad about Bush, your filters make you skeptical, and you tend to reject the evidence, so it has little influence on the filter's future behavior. Eventually, by accepting the news that matches your belief and rejecting the news that doesn't, you end up believing that Bush is a great president and there is no way you could be convinced otherwise.
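To make the feedback loop concrete, here is a small toy simulation in Python (my own illustration, not anything from Jim's piece, and the numbers are arbitrary). The evidence stream is deliberately split 50/50, but each piece is accepted only with a probability proportional to how well it fits the current belief:

```python
import random

# Toy model of a self-reinforcing "does this feel right?" filter.
# The evidence is actually balanced; acceptance depends on the current belief.
belief = 0.6   # initial credence that "Bush is a good president"
step = 0.05    # how far each accepted piece of evidence moves the belief

for _ in range(200):
    evidence_positive = random.random() < 0.5      # the evidence itself is 50/50

    # The filter: agreeable evidence "feels right" and is accepted in
    # proportion to the current belief; contrary evidence is accepted
    # only in proportion to the credence given to the other side.
    accept_prob = belief if evidence_positive else 1 - belief
    if random.random() < accept_prob:
        belief += step if evidence_positive else -step
        belief = min(max(belief, 0.01), 0.99)      # keep it a valid credence

print(f"final credence: {belief:.2f}")
```

Run it a few times: starting above 0.5, the credence usually ends up pinned near 0.99. The point of the 50/50 evidence stream is that even perfectly balanced information gets converted into near-certainty once the filter's output feeds back into its own acceptance criterion.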
There are many, many people who are so strongly convinced that Bush is a good President, that no amount of evidence could convince them otherwise. Of course, the converse is true as well. There are many, many people who are so strongly convinced that Bush is a bad President, that no amount of evidence could convince them otherwise.
This is the definition of having a closed mind. Given that you have to have some filters to get through life, if you can't trust the information to which you have access, the closing of your mind is inevitable.
There are other factors that further exacerbate this trend. Rejecting information is painful. When the filters kick in and reject data, it hurts (it "doesn't feel right"). So people tend to expose themselves to information and data that doesn't create cognitive dissonance. If you're going to ignore everything in an article because of your filters, you might as well skip reading it in the first place. It's a waste of time and unpleasant to boot.
In order to avoid that filter-based psychic pain, people like to associate with those who agree with them. Sometimes they even move to certain areas of the country or world in order to be near those who think like them. Most blogs are just groups of bloggers and readers who pretty much already agree on everything. The Great Guys Blog is a rare exception.
I'm the same way. I'm simply not interested in reading or watching anything that takes significant time and effort if I'm not going to agree with it.
Unless I feel I can trust the contents to be accurate! If I'm confident they will be accurate, I'm willing to put in the effort and tolerate the cognitive dissonance. Otherwise, it's simply pointless for me to read it. Therefore, no Michael Moore for me. No Bellesiles, No Lott, No NY Times, No BBC, No CNN, No LA Times. It's very limiting.
That is why I think it is excruciatingly important for the media and significant authors to provide accurate information as the basis for their arguments. The American mind has been closed, on both the left and the right, because of a lack of trust not only in the politicians, but also in the media and traditional content providers.