How To Actually Change Your Mind

LessWrong

Unfortunately the universe doesn't agree with me.  We'll see which one of us is still standing when this is over.


A car with a broken engine cannot drive backward at 200 mph, even if the engine is really really broken.


According to Denes-Raj and Epstein, these subjects reported afterward that even though they knew the probabilities were against them, they felt they had a better chance when there were more red beans.  This may sound crazy to you, O Statistically Sophisticated Reader, but if you think more carefully you'll realize that it makes perfect sense.  A 7% probability versus 10% probability may be bad news, but it's more than made up for by the increased number of red beans.  It's a worse probability, yes, but you're still more likely to win, you see.  You should meditate upon this thought until you attain enlightenment as to how the rest of the planet thinks about probability.
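The arithmetic of the ratio bias can be checked directly.  A minimal sketch, assuming the standard Denes-Raj and Epstein setup implied by the 7% and 10% figures above (7 red beans out of 100 versus 1 red bean out of 10):

```python
import random

# The bowl with more red beans: 7 red out of 100 total.
big_bowl = 7 / 100    # 7% chance of drawing red
# The bowl with fewer red beans: 1 red out of 10 total.
small_bowl = 1 / 10   # 10% chance of drawing red

# More red beans, worse odds: the "bigger" bowl really is the worse bet.
assert big_bowl < small_bowl

# A quick simulation makes the gap concrete.
random.seed(0)
trials = 100_000
wins_big = sum(random.random() < big_bowl for _ in range(trials))
wins_small = sum(random.random() < small_bowl for _ in range(trials))
print(wins_big / trials, wins_small / trials)  # roughly 0.07 vs 0.10
```

The extra six red beans change nothing: only the ratio of red beans to total beans matters, which is the point the subjects' felt sense of "more chances to win" gets wrong.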


In practice you can never completely eliminate reliance on authority.  Good authorities are more likely to know about any counterevidence that exists and should be taken into account; a lesser authority is less likely to know this, which makes their arguments less reliable.  This is not a factor you can eliminate merely by hearing the evidence they did take into account.


There is an ineradicable legitimacy to assigning slightly higher probability to what E. T. Jaynes tells you about Bayesian probability, than you assign to Eliezer Yudkowsky making the exact same statement.  Fifty additional years of experience should not count for literally zero influence. But this slight strength of authority is only ceteris paribus, and can easily be overwhelmed by stronger arguments.  I have a minor erratum in one of Jaynes's books - because algebra trumps authority.


"What does you in is not failure to apply some high-level, intricate, complicated technique.  It's overlooking the basics.  Not keeping your eye on the ball."


There's a sadly large number of times when it's worthwhile to judge the speaker's rationality.  You should always do it with a hollow feeling in your heart, though, a sense that something's missing.


Because we don't see the cost of a general policy, we learn overly specific lessons.  After September 11th, the FAA prohibited box-cutters on airplanes - as if the problem had been the failure to take this particular "obvious" precaution.  We don't learn the general lesson: the cost of effective caution is very high because you must attend to problems that are not as obvious now as past problems seem in hindsight.


Of course, one didn't use phlogiston theory to predict the outcome of a chemical transformation.  You looked at the result first, then you used phlogiston theory to explain it. 


Alas, human beings do not use a rigorous algorithm for updating belief networks. 


Curiosity is the first virtue, without which your questioning will be purposeless and your skills without direction.


If your eyes and brain work correctly, you will become tangled up with reality.


Therefore rational beliefs are contagious, among honest folk who believe each other to be honest.  And it's why a claim that your beliefs are not contagious - that you believe for private reasons which are not transmissible - is so suspicious.  If your beliefs are entangled with reality, they should be contagious among honest folk.


If your model of reality suggests that the outputs of your thought processes should not be contagious to others, then your model says that your beliefs are not themselves evidence, meaning they are not entangled with reality.  You should apply a reflective correction, and stop believing.


The spoken sentence is not the fact itself; don't be led astray by the mere meanings of words.


In modern civilization particularly, no one can think fast enough to think their own thoughts. 


But that is moot.  By the time you realize you have a choice, there is no choice.  You cannot unsee what you see.  The other way is closed.


If there were ever a discipline that genuinely demanded X-Treme Nitpicking, it is evolutionary psychology.


John Kenneth Galbraith said:  "Faced with the choice between changing one's mind and proving that there is no need to do so, almost everyone gets busy on the proof."  And the greater the inconvenience of changing one's mind, the more effort people will expend on the proof.


Let me not become attached to beliefs I may not want.


But the affective death spiral turns much deadlier after criticism becomes a sin, or a gaffe, or a crime.  There are things in this world that are worth praising greatly, and you can't flatly say that praise beyond a certain point is forbidden.  But there is never an Idea so true that it's wrong to criticize any argument that supports it.  Never.