I'm really interested in the classic philosophical question "how should we live?". We all already carry a set of answers to it in our heads, though for most of us those answers are invisible, assumed to be simply "normal". Often the answers we have, whether given to us by our culture or reached through our own conclusions, are confused and at odds with one another. Their relative invisibility makes them even harder to work with.
One way we can divide systems of culture is by their basic approach to knowledge. Some are essentially declarative (that is, they dictate the truth with finality), while others are essentially explorative (that is, they don't claim to know the truth, but rather to be seeking it). An observable pattern in explorative culture and knowledge systems is the tendency to explode into an ever broadening fractal of fields and subfields, subjects gradually separating themselves from each other into specialties. Typically, each of these subfields becomes loosely coupled with its parent, able to support itself and its assertions independently.
Declarative systems of thought, conversely, appear to absorb subjects into themselves, forming one accretion of interdependent statements. This mass of statements may wind up collected into one canonical published source. The Bible is a good example of this process. It is not a specifically religious book; the word "bible" literally translates from Greek as "the books". It contains cutting-edge (circa 400 BCE) thinking on natural philosophy, politics, medicine, law, history, ethics, agriculture, poetry, and spirituality, all in one massive lump.
A cognitive bias is a recurring illogical pattern of thought or judgment that appears in most people. That's a fancy way of saying it's something that people always seem to think about the wrong way, or a common mistake. An easy cognitive bias to observe is “the bandwagon effect”, which is believing something is reasonable or true simply because many other people think that it's reasonable or true.
There is a strong, but rarely explicitly made, suggestion regarding cognitive biases: we should attempt to eliminate them from our thinking. On the surface this seems like a very reasonable idea; after all, biases are illogical and lead to incorrect results. I have grave concerns about this approach to cognition because, in my experience, the correct result is not always the optimal one. For example, people consistently perceive terrain slopes as much steeper than they actually are. It's a well-documented cognitive bias1, and it's also present for a good reason. Walking up even moderate slopes is generally a bad idea, and the bias helps keep us safe.
Before we think of a particular cognitive bias as a bad thing or try to correct it, it's worth applying the principle of Chesterton's Fence: "Let us say, for the sake of simplicity, a fence or gate erected across a road. The more modern type of reformer goes gaily up to it and says, 'I don't see the use of this; let us clear it away.' To which the more intelligent type of reformer will do well to answer: 'If you don't see the use of it, I certainly won't let you clear it away. Go away and think. Then, when you can come back and tell me that you do see the use of it, I may allow you to destroy it.'"2