Less Wrong

Eliezer Yudkowsky

our minds respond less readily to our will than our hands. Our ability to control our muscles is evolutionarily ancient; our ability to reason about our own reasoning processes is a much more recent innovation. We shouldn’t be surprised, then, that muscles are easier to use than brains.


If you live in an urban area, you probably don’t need to walk very far to find a martial arts dojo.


Some of the machinery is optimized for evolutionary selection pressures that run directly counter to our declared goals in using it. Deliberately we decide that we want to seek only the truth; but our brains have hardwired support for rationalizing falsehoods.


We can try to compensate for what we choose to regard as flaws of the machinery; but we can’t actually rewire the neural circuitry.


humans aren't reflectively blind. We do have a native instinct for introspection. The inner eye isn't sightless, though it sees blurrily, with systematic distortions.


To make rationality into a moral duty is to give it all the dreadful degrees of freedom of an arbitrary tribal custom. People arrive at the wrong answer, and then indignantly protest that they acted with propriety, rather than learning from their mistake.


What other motives are there? Well, you might want to accomplish some specific real-world goal, like building an airplane, and therefore you need to know some specific truth about aerodynamics.


If this is the reason you want truth, then the priority you assign to your questions will reflect the expected utility of their information—how much the possible answers influence your choices, how much your choices matter, and how much you expect to find an answer that changes your choice from its default.
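
To make that priority rule concrete, here is a toy sketch in Python with invented questions and payoff numbers (an illustration, not anything from the essay itself): the expected value of asking a question is roughly the probability that the answer changes your choice, times how much that change is worth.

    # Toy value-of-information calculation; all names and numbers are invented.
    def value_of_information(p_changes_choice, gain_if_changed):
        """Expected gain from learning the answer before acting on the default."""
        return p_changes_choice * gain_if_changed

    # Hypothetical questions: one rarely changes the design but matters enormously
    # when it does; one often changes a trip but matters little; one changes nothing.
    questions = [
        ("Will this wing design stall at low speed?", value_of_information(0.05, 1000.0)),
        ("Does the store have chocolate milk?", value_of_information(0.30, 1.0)),
        ("What color is the store's awning?", value_of_information(0.0, 1.0)),
    ]
    for question, voi in sorted(questions, key=lambda pair: -pair[1]):
        print(f"{voi:7.2f}  {question}")

On these made-up numbers, the rare-but-catastrophic engineering question dominates the list, even though its answer almost never changes the design.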


To seek truth merely for its instrumental value may seem impure—should we not desire the truth for its own sake?—but such investigations are extremely important because they create an outside criterion of verification: if your airplane drops out of the sky, or if you get to the store and find no chocolate milk, it's a hint that you did something wrong. You get back feedback on which modes of thinking work, and which don't.


Another possibility: you might care about what's true because, damn it, you're curious.


curiosity has a special and admirable purity. If your motive is curiosity, you will assign priority to questions according to how the questions, themselves, tickle your aesthetic sense. A trickier challenge, with a greater probability of failure, may be worth more effort than a simpler one, just because it's more fun.


Although pure curiosity is a wonderful thing, it may not linger too long on verifying its answers, once the attractive mystery is gone.


what set humanity firmly on the path of Science was noticing that certain modes of thinking uncovered beliefs that let us manipulate the world—truth as an instrument. As far as sheer curiosity goes, spinning campfire tales of gods and heroes satisfied that desire just as well, and no one realized that anything was wrong with that.


“Cognitive biases” are those obstacles to truth which are produced, not by the cost of information, nor by limited computing power, but by the shape of our own mental machinery.


the mental machinery might be adapted not to particularly care whether something is true, such as when we feel the urge to believe what others believe to get along socially.


A bias is an obstacle to our goal of obtaining truth, and thus in our way.


The creationist practices a very selective underconfidence, refusing to integrate massive weights of evidence in favor of a conclusion they find uncomfortable. I would say that whether you call this “humility” or not, it is the wrong step in the dance.


What about the engineer who humbly designs fail-safe mechanisms into machinery, even though they’re damn sure the machinery won’t fail? This seems like a good kind of humility to me. Historically, it’s not unheard-of for an engineer to be damn sure a new machine won’t fail, and then it fails anyway.


You suggest studying harder, and the student replies: “No, it wouldn’t work for me; I’m not one of the smart kids like you; nay, one so lowly as myself can hope for no better lot.” This is social modesty, not humility. It has to do with regulating status in the tribe, rather than scientific process.


The student says: “But I’ve seen other students double-check their answers and then they still turned out to be wrong. Or what if, by the problem of induction, 2 + 2 = 5 this time around? No matter what I do, I won’t be sure of myself.” It sounds very profound, and very modest. But it is not coincidence that the student wants to hand in the test quickly, and go home and play video games.


The end of an era in physics does not always announce itself with thunder and trumpets; more often it begins with what seems like a small, small flaw . . . But because physicists have this arrogant idea that their models should work all the time, not just most of the time, they follow up on small flaws. Usually, the small flaw goes away under closer inspection. Rarely, the flaw widens to the point where it blows up the whole theory. Therefore it is written: “If you do not seek perfection you will halt before taking your first steps.”


When you argue a lot, people look upon you as confrontational. If you repeatedly refuse to compromise, it’s even worse. Consider it as a question of tribal status: scientists have certainly earned some extra status in exchange for such socially useful tools as medicine and cellphones. But this social status does not justify their insistence that only scientific ideas on evolution be taught in public schools. Priests also have high social status, after all. Scientists are getting above themselves—they won a little status, and now they think they’re chiefs of the whole tribe! They ought to be more humble, and compromise a little.


It is dangerous to have a prescriptive principle which you only vaguely comprehend; your mental picture may have so many degrees of freedom that it can adapt to justify almost any deed.


Where people have vague mental models that can be used to argue anything, they usually end up believing whatever they started out wanting to believe.


This is so convenient that people are often reluctant to give up vagueness.


the purpose of our ethics is to move us, not be moved by us.


“Humility” is a virtue that is often misunderstood. This doesn’t mean we should discard the concept of humility, but we should be careful using it. It may help to look at the actions recommended by a “humble” line of thinking, and ask: “Does acting this way make you stronger, or weaker?”


If you think about the problem of induction as applied to a bridge that needs to stay up, it may sound reasonable to conclude that nothing is certain no matter what precautions are employed; but if you consider the real-world difference between adding a few extra cables, and shrugging, it seems clear enough what makes the stronger bridge.


Humility, in its most commonly misunderstood form, is a fully general excuse not to believe something; since, after all, you can’t be sure. Beware of fully general excuses!


John Kenneth Galbraith said: “Faced with the choice between changing one’s mind and proving that there is no need to do so, almost everyone gets busy on the proof.”


But y’know, if you’re gonna do the same thing anyway, there’s no point in going to such incredible lengths to rationalize it.


The point of thinking is to shape our plans; if you’re going to keep the same plans anyway, why bother going to all that work to justify it?


When you disagree with someone, even after talking over your reasons, the Modesty Argument claims that you should each adjust your probability estimates toward the other's, and keep doing this until you agree. The Modesty Argument is inspired by Aumann's Agreement Theorem, which shows that ideal Bayesians with common priors cannot agree to disagree: once their probability estimates are common knowledge, those estimates must be equal.
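
As a toy sketch of what this procedure amounts to, here is a small Python illustration with invented numbers (naive mutual averaging, not Aumann's actual theorem):

    # Toy sketch of the Modesty Argument's procedure: each party shifts partway
    # toward the other's estimate, and they repeat until they agree.
    def modesty_update(p_mine, p_theirs, step=0.25, tolerance=1e-6):
        """Move both estimates toward each other until they (nearly) meet."""
        while abs(p_mine - p_theirs) > tolerance:
            p_mine, p_theirs = (
                p_mine + step * (p_theirs - p_mine),
                p_theirs + step * (p_mine - p_theirs),
            )
        return p_mine

    # Invented numbers: I give creationism 0.01, the creationist gives it 0.99.
    print(round(modesty_update(0.01, 0.99), 3))  # -> 0.5, whatever the evidence says

Mutual averaging converges to the midpoint of the two starting estimates, regardless of the evidence on either side; that property is what the creationist example below turns on.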


In the former case, the question is absolutely clear, and in the latter case it is not absolutely clear, to me at least, which opens up the possibility that they are different questions.


If I thought creationism was 50% probable, I wouldn't need the Modesty Argument to move me to that estimate.


"Do not believe you do others a favor if you accept their arguments; the favor is to you."  Am I really doing myself a favor by agreeing with the creationist to take the average of our probability distributions?


I regard rationality in its purest form as an individual thing - not because rationalists have only selfish interests, but because of the form of the only admissible question: "Is it actually true?"


If you fail to achieve a correct answer, it is futile to protest that you acted with propriety.


Those who dream do not know they dream; but when you wake you know you are awake. 


Integrating the Modesty Argument as new evidence ought to produce a large effect on someone's life and plans.  If it's being really integrated, that is, rather than flushed down a black hole.