[<< | Prev | Index | Next | >>]
Tuesday, January 25, 2022
Derationalizing, Part 1: Question the Obvious
My reply to this question from an acquaintance:
As a young person, it's difficult to know about every topic that is discussed. What do you do as a young skeptic who holds few opinions? Do you become the Socrates in every discussion/argument while offering very few of your own opinions?
How can you ever be ready to say "I believe x" if there's so much room to say "I don't know"? It's unlikely that you'll go through every source about any subject, so at what point should you be confident enough to form an opinion?
What happens when you don't know what you don't know?
I guess the question here is: which strategy are people most receptive to in an argument? Which will help you learn the most? Is there a combination of these that allows for maximum learning efficiency?
Imo there is only one answer to all of these, and you touched on it up top w/Socrates:
Make your only objective: to understand why people believe what they do. (And in particular, not to try to convince anyone of anything before you've done that.)
Once you achieve that (which you may or may not ever manage with any given person...), depending on what the answer is, one of three options usually follows: 1) You realize they are right. 2) You use what you've learned about why they believe what they do to guide them to a new belief. 3) You realize the reasons they believe things are entirely irrational and, chances are, you're too honest to do what it would take to change their minds.
Generally speaking, you will find that honest people will self-correct before you get to the end. That is, if you just keep asking them about whatever bit isn't making sense to you--and assuming their answers don't clear it up for you and change your mind--they will crash into their own inconsistencies or unsupported premises in their effort to answer your questions, and will say "Oh, now that you mention it... maybe I had that part wrong." So in practice, it's rare you ever get a chance to exercise (2), because those people self-correct before you get there.
There are like 10 such people on the planet. I know 5 of them.
Most of the time you'll eventually end up at (3), which is a hard end to reach because you'll rightly be very reluctant to accept just how irrational people are, particularly the academics who've been trained to present as rational. Sadly, (1) will be rarer than you hope, too, for the same reason.
To give you a head start on understanding why (most) people believe what they do, here's my distillation after half a century of studying how people think:
1) No matter how many reasons most people have for what they believe, the reality is that belief is a feeling, and all the reasoning is post-rationalization. Deep down, certain things are "obvious" to them, and this feeling of obviousness is the stopping condition that keeps them from ruminating endlessly over the most minute details or remote possibilities. In other words, the feeling of truth or obviousness is a key and necessary feature of cognition, and without it we would be unable to look up from our navel in time to eat. But the problem is, very few people recognize it for what it is--a subconsciously acquired emotional response (as opposed to a necessarily sound inference)--and they blindly heed its impulse, which is to stop thinking about the "obvious" thing in question.
So, most people most of the time are not in a learning mode at all, they are post-rationalizing what they already feel.
2) What most people feel is true is programmed into them by their tribe--in this age, via their chosen media and social networks. People in media know this, and use it (the caveat on (3) above does not apply to them). But it doesn't happen the way you think: It's not about presenting evidence and making rational arguments, because as I said, that's not how people actually think. It's about adopting the desired "truth" as a presupposition and then portraying everything in that light. So "the terrorists stormed the church and held fifty people captive for three days" brings our conscious attention to the described action, but hidden from scrutiny is the choice of "terrorists" vs. "revolutionaries", which carries with it a great deal of emotional implication that directly programs the reader. Even if the depictions don't seem super credible, repetition will make them into reality, because every time someone is exposed to a phrase or story like that, they picture it in their mind's eye, and their cortex integrates that experience as if it had been seen in the real world.
In other words, if you can get people to listen to your stories for enough minutes of the day, they will integrate the presuppositions (the background reality implied by your stories) of those stories as their model of the real world. Doesn't matter if they believe they are reading fact or fiction.
And looping back, once those presuppositions are programmed in, they define what is "obvious", and in turn what people will no longer think about.
Again, most people who make it to positions of serious power understand all of this very well, and the world is run via these strings. The people you'll encounter will mostly be on the wrong end of those strings.
So, you can try asking people why they believe what they do, but mostly their answers won't make sense (they will form a logical chain, but that chain will be just one of many logical chains and not necessarily the most likely one by any obvious metric), so there's a fair chance you won't come to understand why they believe what they do by asking them. Instead, find out what media they consume, and then go pay attention to how that media is crafted, and you will probably have your answer. Which is why, for case (3), there's usually nothing you can do about it.
Just beware, once you pay attention to media at this level enough to see what I'm talking about, you'll have learned to perceive it in the same way you perceive a chair when you walk into a room, without having to analyze the legs and seat and logically deduce that there's a chair. I.e., you won't be able to un-see it, and then you'll spend the rest of your life living in Clown World where you can't believe you're surrounded by people who blindly swallow/follow this crap.
(But you won't be able to take these glasses off.)
All that said, to end on a more productive note and to tie in what I said at the start: Imo the best approach to take is to just ask questions. If you're actively debating someone, focus on the most important question, or at most a very few--people will throw a lot of chaff to distract you away from their inconsistencies. Just ignore the chaff (hard to do!) and laser-focus on the important question--whatever most doesn't make sense to you about their stance. I have actually "won" a few debates this way--the sorts of debates that are not normally ever won by either side--which is as much a credit to them as to my approach, because most people will break down into full cognitive dissonance first:
"I have learned that, for most people, if you inspire within them a conflict between what they feel is right and what they know is right, well, fireworks or waterworks are usually the result." -Rik Ling
Equally important is to be that person who is willing to answer the important questions straight, and to admit error when you can't. You might just make it 11.
Oh, and also actionable: To whatever degree all of the above is accurate and representative of humans in general, including you and me, the primary control you have over what you believe--and over the quality and accuracy of what you believe--is the media you expose yourself to. This is a very tricky decision to make well, because:
"One thing I've learned in the half-century of living and learning is that if society establishes an organized process, cheats will enter, and game the system." -- James Lyons-Weiler
The most rigorous and seemingly trustworthy institutions are also the most insidiously corrupt.
Independent bloggers and journalists are probably your best bet right now. Pick a wide range of ones who think well, and rely on their public position as aggregators to pull in what matters. Avoid the mainstream media like the NPC blue pill it is. (Once you start paying attention to slant, you won't be able to stomach mainstream media anyway. See Clown World.)
Lastly... to make the implied explicit: You can do a lot to clean up your own belief system by simply being willing to entertain any challenge to something you consider "obvious" (at least long enough to make sure it's not a new angle you haven't honestly considered before; and even if you have, it's worth re-visiting if it comes from someone you consider sane). I.e., there's probably no way to avoid the emotional aspect of belief (and the subconscious absorption of presuppositions from media), but you can largely work around/patch it simply by understanding that that's how the (your) mind works.
Which is a really long-winded way of saying: don't be over-confident about anything.