It can be perfectly rational to adopt an irrational, incoherent, and internally conflicting worldview. To do so, the adopter may have to reduce cognitive dissonance by ignoring, or at least not taking too seriously, the conflicting evidence.
How can it be rational? One argument: worldview X has worked for the past N years, even when exposed to radical change and conflict in an unfathomably complex universe with essentially unlimited uncertainty. Thus worldview X is robust, even if not optimal. Worldview X has known inconsistencies, but since it is impossible to truly know which particular item is right or wrong, and since the inconsistencies may even be inherent to the system's robustness, tweaking worldview X is risky. Perhaps tweaking X creates a more optimal system, at least in the short term, but it introduces fragility where robustness once existed.
I personally am a risk taker, so I think tweaking is worth it, but I fully understand the other view.