Hackers & Painters

Graham, Paul

The main reason nerds are unpopular is that they have other things to think about. Their attention is drawn to books or the natural world, not fashions and parties. They’re like someone trying to play soccer while balancing a glass of water on his head.


If you leave a bunch of eleven-year-olds to their own devices, what you get is Lord of the Flies. Like a lot of American kids, I read this book in school. Presumably it was not a coincidence. Presumably someone wanted to point out to us that we were savages, and that we had made ourselves a cruel and stupid world. This was too subtle for me.


But in at least some cases the reason the nerds don’t fit in really is that everyone else is crazy. I remember sitting in the audience at a “pep rally” at my high school, watching as the cheerleaders threw an effigy of an opposing player into the audience to be torn to pieces. I felt like an explorer witnessing some bizarre tribal ritual.


What bothers me is not that the kids are kept in prisons, but that (a) they aren’t told about it, and (b) the prisons are run mostly by the inmates. Kids are sent off to spend six years memorizing meaningless facts in a world ruled by a caste of giants who run after an oblong brown ball, as if this were the most natural thing in the world. And if they balk at this surreal cocktail, they’re called misfits.


Bullying was only part of the problem. Another problem, and possibly an even worse one, was that we never had anything real to work on. Humans like to work; in most of the world, your work is your identity. And all the work we did was pointless, or seemed so at the time.


In my high school French class we were supposed to read Hugo’s Les Misérables. I don’t think any of us knew French well enough to make our way through this enormous book. Like the rest of the class, I just skimmed the Cliff’s Notes. When we were given a test on the book, I noticed that the questions sounded odd. They were full of long words that our teacher wouldn’t have used. Where had these questions come from? From the Cliff’s Notes, it turned out. The teacher was using them too. We were all just pretending.


When there is some real external test of skill, it isn’t painful to be at the bottom of the hierarchy. A rookie on a football team doesn’t resent the skill of the veteran; he hopes to be like him one day and is happy to have the chance to learn from him. The veteran may in turn feel a sense of noblesse oblige. And most importantly, their status depends on how well they do against opponents, not on whether they can push the other down.


Nerds aren’t losers. They’re just playing a different game, and a game much closer to the one played in the real world. Adults know this. It’s hard to find successful adults now who don’t claim to have been nerds in high school.


I’ve never liked the term “computer science.” The main reason I don’t like it is that there’s no such thing. Computer science is a grab bag of tenuously related areas thrown together by an accident of history, like Yugoslavia. At one end you have people who are really mathematicians, but call what they’re doing computer science so they can get DARPA grants.


The border between architecture and engineering is not sharply defined, but it’s there. It falls between what and how: architects decide what to do, and engineers figure out how to do it.


What and how should not be kept too separate. You’re asking for trouble if you try to decide what to do without understanding how to do it.


for the hackers this label is a problem. If what they’re doing is called science, it makes them feel they ought to be acting scientific. So instead of doing what they really want to do, which is to design beautiful software, hackers in universities and research labs feel they ought to be writing research papers. In the best case, the papers are just a formality. Hackers write cool software, and then write a paper about it, and the paper becomes a proxy for the achievement represented by the software. But often this mismatch causes problems. It’s easy to drift away from building beautiful things toward building ugly things that make more suitable subjects for research papers.


as anyone who has written a PhD dissertation knows, the way to be sure you’re exploring virgin territory is to stake out a piece of ground that no one wants.


confidence that they can. The only external test is time. Over time, beautiful things tend to thrive, and ugly things tend to get discarded. Unfortunately, the amounts of time involved can be longer than human lifetimes.


There are worse things than having people misunderstand your work. A worse danger is that you will yourself misunderstand your work.


Hackers need to understand the theory of computation about as much as painters need to understand paint chemistry. You need to know how to calculate time and space complexity, and perhaps also the concept of a state machine, in case you want to write a parser. Painters have to remember a good deal more about paint chemistry than that.
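
As a rough sketch of the kind of state machine a parser writer leans on, here is a minimal tokenizer in Python; the states and token names are invented for illustration, not taken from the book.

    # Minimal finite-state-machine tokenizer: the sort of "state machine"
    # a hacker might need when writing a parser. Illustrative only.
    def tokenize(text):
        tokens = []
        state = "start"            # states: start, in_number, in_word
        current = ""
        for ch in text + " ":      # trailing space flushes the last token
            if state == "start":
                if ch.isdigit():
                    state, current = "in_number", ch
                elif ch.isalpha():
                    state, current = "in_word", ch
            elif state == "in_number":
                if ch.isdigit():
                    current += ch
                else:
                    tokens.append(("NUMBER", current))
                    state, current = "start", ""
            elif state == "in_word":
                if ch.isalnum():
                    current += ch
                else:
                    tokens.append(("WORD", current))
                    state, current = "start", ""
        return tokens

    print(tokenize("x1 42 foo"))   # [('WORD', 'x1'), ('NUMBER', '42'), ('WORD', 'foo')]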


a programming language should, above all, be malleable. A programming language is for thinking of programs, not for expressing programs you’ve already thought of.


When I got to Yahoo, I found that what hacking meant to them was implementing software, not designing it. Programmers were seen as technicians who translated the visions (if that is the word) of product managers into code.


Only a small percentage of hackers can actually design software, and it’s hard for the people running a company to pick these out. So instead of entrusting the future of the software to one brilliant hacker, most companies set things up so that it is designed by committee, and the hackers merely implement the design.


Big companies want to decrease the standard deviation of design outcomes because they want to avoid disasters. But when you damp oscillations, you lose the high points as well as the low. This is not a problem for big companies, because they don’t win by making great products. Big companies win by sucking less than other big companies.


At Viaweb I considered myself lucky if I got to hack a quarter of the time. And the things I had to do the other three quarters of the time ranged from tedious to terrifying. I have a benchmark for this, because I once had to leave a board meeting to have some cavities filled. I remember sitting back in the dentist’s chair, waiting for the drill, and feeling like I was on vacation.


It seems surprising to me that any employer would be reluctant to let hackers work on open source projects. At Viaweb, we would have been reluctant to hire anyone who didn’t. When we interviewed programmers, the main thing we cared about was what kind of software they wrote in their spare time. You can’t do anything really well unless you love it, and if you love to hack you’ll inevitably be working on projects of your own.


Because painters leave a trail of work behind them, you can watch them learn by doing. If you look at the work of a painter in chronological order, you’ll find that each painting builds on things learned in previous ones. When there’s something in a painting that works especially well, you can usually find version 1 of it in a smaller form in some earlier painting.


The fact that hackers learn to hack by doing it is another sign of how different hacking is from the sciences. Scientists don’t learn science by doing it, but by doing labs and problem sets. Scientists start out doing work that’s perfect, in the sense that they’re just trying to reproduce work someone else has already done for them. Eventually, they get to the point where they can do original work. Whereas hackers, from the start, are doing original work; it’s just very bad. So hackers start original, and get good, and scientists start good, and get original.


The other way makers learn is from examples. To a painter, a museum is a reference library of techniques. For hundreds of years it has been part of the traditional education of painters to copy the works of the great masters, because copying forces you to look closely at the way a painting is made.


Everyone by now presumably knows about the danger of premature optimization. I think we should be just as worried about premature design — deciding too early what a program should do.


This sounds like a paradox, but a great painting has to be better than it has to be.


For example, when Leonardo painted the portrait of Ginevra de’ Benci in the National Gallery, he put a juniper bush behind her head. In it he carefully painted each individual leaf. Many painters might have thought, this is just something to put in the background to frame her head. No one will look that closely at it. Not Leonardo. How hard he worked on part of a painting didn’t depend at all on how closely he expected anyone to look at it. He was like Michael Jordan. Relentless.


Great software, likewise, requires a fanatical devotion to beauty. If you look inside good software, you find that parts no one is ever supposed to see are beautiful too.


In hacking, like painting, work comes in cycles. Sometimes you get excited about a new project and you want to work sixteen hours a day on it. Other times nothing seems interesting.


To do good work you have to take these cycles into account, because they’re affected by how you react to them. When you’re driving a car with a manual transmission on a hill, you have to back off the clutch sometimes to avoid stalling. Backing off can likewise prevent ambition from stalling. In both painting and hacking there are some tasks that are terrifyingly ambitious, and others that are comfortingly routine. It’s a good idea to save some easy tasks for moments when you would otherwise stall.


When I was a kid I was constantly being told to look at things from someone else’s point of view. What this always meant in practice was to do what someone else wanted, instead of what I wanted. This of course gave empathy a bad name, and I made a point of not cultivating it. Boy, was I wrong. It turns out that looking at things from other people’s point of view is practically the secret of success.


If I could get people to remember just one quote about programming, it would be the one at the beginning of Structure and Interpretation of Computer Programs.8 Programs should be written for people to read, and only incidentally for machines to execute.


So, if hacking works like painting and writing, is it as cool? After all, you only get one life. You might as well spend it working on something great. Unfortunately, the question is hard to answer. There is always a big time lag in prestige. It’s like light from a distant star. Painting has prestige now because of great work people did five hundred years ago.


So while I admit that hacking doesn’t seem as cool as painting now, we should remember that painting itself didn’t seem as cool in its glory days as it does now.


Over and over we see the same pattern. A new medium appears, and people are so excited about it that they explore most of its possibilities in the first couple generations. Hacking seems to be in this phase now.


Is our time any different? To anyone who has read any amount of history, the answer is almost certainly no. It would be a remarkable coincidence if ours were the first era to get everything just right.


Let’s start with a test: do you have any opinions that you would be reluctant to express in front of a group of your peers? If the answer is no, you might want to stop and think about that. If everything you believe is something you’re supposed to believe, could that possibly be a coincidence? Odds are it isn’t. Odds are you just think whatever you’re told.


The statements that make people mad are the ones they worry might be believed. I suspect the statements that make people maddest are those they worry might be true. If Galileo had said that people in Padua were ten feet tall, he would have been regarded as a harmless eccentric. Saying the earth orbited the sun was another matter. The church knew this would set people thinking.


Do we have no Galileos? Not likely. To find them, keep track of opinions that get people in trouble, and start asking, could this be true? Ok, it may be heretical (or whatever modern equivalent), but might it also be true?


The word “defeatist,” for example, has no particular political connotations now. But in Germany in 1917 it was a weapon, used by Ludendorff in a purge of those who favored a negotiated peace. At the start of World War II it was used extensively by Churchill and his supporters to silence their opponents. In 1940, any argument against Churchill’s aggressive policy was “defeatist.” Was it right or wrong? Ideally, no one got far enough to ask that.


I suspect the only taboos that are more than taboos are the ones that are universal, or nearly so. Murder for example. But any idea that’s considered harmless in a significant percentage of times and places, and yet is taboo in ours, is a good candidate for something we’re mistaken about.


Of course, if they have time machines in the future they’ll probably have a separate reference manual just for Cambridge. This has always been a fussy place, a town of i dotters and t crossers, where you’re liable to get both your grammar and your ideas corrected in the same conversation.


Most adults, likewise, deliberately give kids a misleading view of the world. One of the most obvious examples is Santa Claus. We think it’s cute for little kids to believe in Santa Claus. I myself think it’s cute for little kids to believe in Santa Claus. But one wonders, do we tell them this stuff for their sake, or for ours?


Moral fashions don’t seem to be created the way ordinary fashions are. Ordinary fashions seem to arise by accident when everyone imitates the whim of some influential person. The fashion for broad-toed shoes in late fifteenth-century Europe began because Charles VIII of France had six toes on one foot. The fashion for the name Gary began when the actor Frank Cooper adopted the name of a tough mill town in Indiana. Moral fashions more often seem to be created deliberately. When there’s something we can’t say, it’s often because some group doesn’t want us to.


To launch a taboo, a group has to be poised halfway between weakness and power. A confident group doesn’t need taboos to protect it. It’s not considered improper to make disparaging remarks about Americans, or the English. And yet a group has to be powerful enough to enforce a taboo.


I suspect the biggest source of moral taboos will turn out to be power struggles in which one side barely has the upper hand. That’s where you’ll find a group powerful enough to enforce taboos, but weak enough to need them.


We often like to think of World War II as a triumph of freedom over totalitarianism. We conveniently forget that the Soviet Union was also one of the winners.


Although fashions in ideas tend to arise from different sources than fashions in clothing, the mechanism of their adoption seems much the same. The early adopters will be driven by ambition: self-consciously cool people who want to distinguish themselves from the common herd. As the fashion becomes established they’ll be joined by a second, much larger group, driven by fear.9 This second group adopts the fashion not because they want to stand out but because they are afraid of standing out.


To do good work you need a brain that can go anywhere. And you especially need a brain that’s in the habit of going where it’s not supposed to.


Great work tends to grow out of ideas that others have overlooked, and no idea is so overlooked as one that’s unthinkable.


Why? It could be that the scientists are simply smarter; most physicists could, if necessary, make it through a PhD program in French literature, but few professors of French literature could make it through a PhD program in physics.


Training yourself to think unthinkable thoughts has advantages beyond the thoughts themselves. It’s like stretching. When you stretch before running, you put your body into positions much more extreme than any it will assume during the run. If you can think things so outside the box that they’d make people’s hair stand on end, you’ll have no trouble with the small trips outside the box that people call innovative.


Within the US car industry there is a lot of hand-wringing about declining market share. Yet the cause is so obvious that any observant outsider could explain it in a second: they make bad cars. And they have for so long that by now the US car brands are antibrands — something you’d buy a car despite, not because of. Cadillac stopped being the Cadillac of cars in about 1970. And yet I suspect no one dares say this.11 Otherwise these companies would have tried to fix the problem.


When you find something you can’t say, what do you do with it? My advice is, don’t say it. Or at least, pick your battles.


Suppose in the future there is a movement to ban the color yellow. Proposals to paint anything yellow are denounced as “yellowist,” as is anyone suspected of liking the color. People who like orange are tolerated but viewed with suspicion. Suppose you realize there is nothing wrong with yellow. If you go around saying so, you’ll be denounced as a yellowist too, and you’ll find yourself having a lot of arguments with anti-yellowists. If your aim in life is to rehabilitate the color yellow, that may be what you want. But if you’re mostly interested in other questions, being labelled as a yellowist will just be a distraction. Argue with idiots, and you become an idiot.


The most important thing is to be able to think what you want, not to say what you want.


When Milton was going to visit Italy in the 1630s, Sir Henry Wotton, who had been ambassador to Venice, told him that his motto should be “i pensieri stretti & il viso sciolto.” Closed thoughts and an open face. Smile at everyone, and don’t tell them what you’re thinking. This was wise advice.


I admit it seems cowardly to keep quiet. When I read about the harassment to which the Scientologists subject their critics,12 or people branded as anti-Semitic for speaking out against Israeli human-rights abuses,13 or researchers threatened with lawsuits under the DMCA,14 part of me wants to say, “All right, you bastards, bring it on.” The problem is, there are so many things you can’t say. If you said them all you’d have no time left for your real work. You’d have to turn into Noam Chomsky.


The trouble with keeping your thoughts secret, though, is that you lose the advantages of discussion. Talking about an idea leads to more ideas. So the optimal plan, if you can manage it, is to have a few trusted friends you can speak openly to. This is not just a way to develop ideas; it’s also a good rule of thumb for choosing friends. The people you can say heretical things to without getting jumped on are also the most interesting to know.


Perhaps the best policy is to make it plain that you don’t agree with whatever zealotry is current in your time, but not to be too specific about what you disagree with.


Better still, answer “I haven’t decided.” That’s what Larry Summers did when a group tried to put him in this position.16 Explaining himself later, he said “I don’t do litmus tests.” A lot of the questions people get hot about are actually quite complicated. There is no prize for getting the answer quickly.


The spread of the term “political correctness” meant the beginning of the end of political correctness, because it enabled one to attack the phenomenon as a whole without being accused of any of the specific heresies it sought to suppress.


the noun “hack” also has two senses. It can be either a compliment or an insult. It’s called a hack when you do something in an ugly way. But when you do something so clever that you somehow beat the system, that’s also called a hack. The word is used more often in the former than the latter sense, probably because ugly solutions are more common than brilliant ones.


there is a gradual continuum between rule breaking that’s merely ugly (using duct tape to attach something to your bike) and rule breaking that is brilliantly imaginative (discarding Euclidean space).


Those in authority tend to be annoyed by hackers’ general attitude of disobedience. But that disobedience is a byproduct of the qualities that make them good programmers. They may laugh at the CEO when he talks in generic corporate newspeech, but they also laugh at someone who tells them a certain problem can’t be solved. Suppress one, and you suppress the other.


I suspect people in Hollywood are simply mystified by hackers’ attitudes toward copyrights. They are a perennial topic of heated discussion on Slashdot. But why should people who program computers be so concerned about copyrights, of all things? Partly because some companies use mechanisms to prevent copying. Show any hacker a lock and his first thought is how to pick it. But there is a deeper reason that hackers are alarmed by measures like copyrights and patents. They see increasingly aggressive measures to protect “intellectual property” as a threat to the intellectual freedom they need to do their job. And they are right. It is by poking about inside current technology that hackers get ideas for the next generation.


Why are programmers so violently opposed to these laws? If I were a legislator, I’d be interested in this mystery — for the same reason that, if I were a farmer and suddenly heard a lot of squawking coming from my hen house one night, I’d want to go out and investigate. Hackers are not stupid, and unanimity is very rare in this world. So if they’re all squawking, perhaps there is something amiss.


I lived for a while in Florence. But after I’d been there a few months I realized that what I’d been unconsciously hoping to find there was back in the place I’d just left. The reason Florence is famous is that in 1450, it was New York. In 1450 it was filled with the kind of turbulent and ambitious people you find now in America.


It says a great deal about our work that we use the same word for a brilliant or a horribly cheesy solution. When we cook one up we’re not always 100% sure which kind it is. But as long as it has the right sort of wrongness, that’s a promising sign. It’s odd that people think of programming as precise and methodical. Computers are precise and methodical. Hacking is something you do with a gleeful laugh.


Smart-alecks have to develop a keen sense of how much they can get away with. And lately hackers have sensed a change in the atmosphere.


Civil liberties are not just an ornament, or a quaint American tradition. Civil liberties make countries rich. If you made a graph of GNP per capita vs. civil liberties, you’d notice a definite trend. Could civil liberties really be a cause, rather than just an effect? I think so. I think a society in which people can do and say what they want will also tend to be one in which the most efficient solutions win, rather than those sponsored by the most influential people. Authoritarian countries become corrupt; corrupt countries become poor; and poor countries are weak.


Unlike high tax rates, you can’t repeal totalitarianism if it turns out to be a mistake. This is why hackers worry. The government spying on people doesn’t literally make programmers write worse code. It just leads eventually to a world in which bad ideas will win. And because this is so important to hackers, they’re especially sensitive to it. They can sense totalitarianism approaching from a distance, as animals can sense an approaching thunderstorm.


There is such a thing as American-ness. There’s nothing like living abroad to teach you that. And if you want to know whether something will nurture or squash this quality, it would be hard to find a better focus group than hackers, because they come closest of any group I know to embodying it. Closer, probably, than the men running our government, who for all their talk of patriotism remind me more of Richelieu or Mazarin than Thomas Jefferson or George Washington. When you read what the founding fathers had to say for themselves, they sound more like hackers. “The spirit of resistance to government,” Jefferson wrote, “is so valuable on certain occasions, that I wish it always to be kept alive.”


Imagine an American president saying that today. Like the remarks of an outspoken old grandmother, the sayings of the founding fathers have embarrassed generations of their less confident successors. They remind us where we come from. They remind us that it is the people who break rules that are the source of America’s wealth and power. Those in a position to impose rules naturally want them to be obeyed. But be careful what you ask for. You might get it.


For the first week or so we intended to make this an ordinary desktop application. Then one day we had the idea of making the software run on our web server, using the browser as an interface. We tried rewriting the software to work over the Web, and it was clear that this was the way to go. If we wrote our software to run on the server, it would be a lot easier for the users and for us as well. This turned out to be a good plan. Now, as Yahoo Store, this software is the most popular online store builder, with over 20,000 users. When we started Viaweb, hardly anyone understood what we meant when we said that the software ran on the server. It was not until Hotmail was launched a year later that people started to get it.
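
To make “the software runs on the server, with the browser as the interface” concrete, here is a minimal sketch using only Python’s standard library; the handler name, page contents, and port are invented, and Viaweb’s real system was of course far more elaborate.

    # Minimal sketch of web-based software: the logic lives on the server,
    # and the user's only client is a browser pointed at it.
    from http.server import BaseHTTPRequestHandler, HTTPServer

    class StoreBuilderHandler(BaseHTTPRequestHandler):
        def do_GET(self):
            # A real store builder would render the user's pages here;
            # this just returns a placeholder HTML document.
            body = b"<html><body><h1>Your store, built on the server</h1></body></html>"
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        # Any machine with a browser can now be the interface.
        HTTPServer(("", 8000), StoreBuilderHandler).serve_forever()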


With web-based software, most users won’t have to think about anything except the applications they use. All the messy, changing stuff will be sitting on a server somewhere, maintained by the kind of people who are good at that kind of thing. And so you won’t ordinarily need a computer, per se, to use software. All you’ll need will be something with a keyboard, a screen, and a web browser. Maybe it will have wireless Internet access. Maybe it will also be your cell phone. Whatever it is, it will be consumer electronics: something that costs about $200, and that people choose mostly based on how the case looks. You’ll pay more for Internet services than you do for the hardware, just as you do now with telephones.


If I’d had to wait a year for the next release, I would have shelved most of these ideas, for a while at least. The thing about ideas, though, is that they lead to more ideas.


Have you ever noticed that when you sit down to write something, half the ideas that end up in it are ones you thought of while writing? The same thing happens with software.


What big companies do instead of implementing features is plan them.


There was no protection against breakage except the fear of looking like an idiot to one’s peers, and that was more than enough.


This way of writing software is a double-edged sword of course. It works a lot better for a small team of good, trusted programmers than it would for a big company of mediocre ones, where bad ideas are caught by committees instead of the people who had them.


A large part of what big companies pay extra for is the cost of selling expensive things to them.


IBM made a late and half-hearted entry into the microcomputer business because they were ambivalent about threatening their cash cow, mainframe computing. Microsoft will likewise be hampered by wanting to save the desktop. A cash cow can be a heavy monkey on your back.


Desktop software forces users to become system administrators. Web-based software forces programmers to.


Web pages weren’t designed to be a UI for applications, but they’re just good enough.


There are only two things you have to know about business: build something users love, and make more than you spend. If you get these two right, you’ll be ahead of most startups. You can figure out the rest as you go.


If you start out underfunded, it will at least encourage a habit of frugality. The less you spend, the easier it is to make more than you spend.


The best thing software can be is easy, but the way to do this is to get the defaults right, not to limit users’ choices.


Don’t listen to marketing people or designers or product managers just because of their job titles. If they have good ideas, use them, but it’s up to you to decide; software has to be designed by hackers who understand design, not designers who know a little about software. If you can’t design software as well as implement it, don’t start a startup.


It’s a lot easier for a couple of hackers to figure out how to rent office space or hire sales people than it is for a company of any size to get software written.


What’s scary about Microsoft is that a company so big can develop software at all. They’re like a mountain that can walk.


Here is a brief sketch of the economic proposition. If you’re a good hacker in your mid twenties, you can get a job paying about $80,000 per year. So on average such a hacker must be able to do at least $80,000 worth of work per year for the company just to break even. You could probably work twice as many hours as a corporate employee, and if you focus you can probably get three times as much done in an hour.1 You should get another multiple of two, at least, by eliminating the drag of the pointy-haired middle manager who would be your boss in a big company. Then there is one more multiple: how much smarter are you than your job description expects you to be? Suppose another multiple of three. Combine all these multipliers, and I’m claiming you could be 36 times more productive than you’re expected to be in a random corporate job.2 If a fairly good hacker is worth $80,000 a year at a big company, then a smart hacker working very hard without any corporate bullshit to slow him down should be able to do work worth about $3 million a year.
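
Restating the arithmetic in that passage (Graham’s own numbers, just multiplied out):

    # 2 (more hours) x 3 (focus) x 2 (no middle manager) x 3 (smarter than
    # the job description expects) = 36x expected output.
    base_salary = 80_000
    multiplier = 2 * 3 * 2 * 3
    print(multiplier)                  # 36
    print(base_salary * multiplier)    # 2880000 -- roughly the $3 million figure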


A programmer, for example, instead of chugging along maintaining and updating an existing piece of software, could write a whole new piece of software, and with it create a new source of revenue. Companies are not set up to reward people who want to do this. You can’t go to your boss and say, I’d like to start working ten times as hard, so will you please pay me ten times as much?


the official fiction is that you are already working as hard as you can.


A company that could pay all its employees so straightforwardly would be enormously successful. Many employees would work harder if they could get paid for it. More importantly, such a company would attract people who wanted to work especially hard. It would crush its competitors.


To get rich you need to get yourself in a situation with two things, measurement and leverage. You need to be in a position where your performance can be measured, or there is no way to get paid more by doing more. And you have to have leverage, in the sense that the decisions you make have a big effect.


An example of a job with both measurement and leverage would be lead actor in a movie. Your performance can be measured in the gross of the movie. And you have leverage in the sense that your performance can make or break it.


A good hint to the presence of leverage is the possibility of failure. Upside must be balanced by downside, so if there is big potential for gain there must also be a terrifying possibility of loss. CEOs, stars, fund managers, and athletes all live with the sword hanging over their heads; the moment they start to suck, they’re out. If you’re in a job that feels safe, you are not going to get rich, because if there is no danger there is almost certainly no leverage.


You don’t want small in the sense of a village, but small in the sense of an all-star team.


So all other things being equal, a very able person in a big company is probably getting a bad deal, because his performance is dragged down by the overall lower performance of the others.


technical advances tend to come from unorthodox approaches, and small companies are less constrained by convention.


At Viaweb one of our rules of thumb was run upstairs. Suppose you are a little, nimble guy being chased by a big, fat bully. You open a door and find yourself in a staircase. Do you go up or down? I say up. The bully can probably run downstairs as fast as you can. Going upstairs his bulk will be more of a disadvantage. Running upstairs is hard for you but even harder for him.


What this meant in practice was that we deliberately sought hard problems. If there were two features we could add to our software, both equally valuable in proportion to their difficulty, we’d always take the harder one. Not just because it was more valuable, but because it was harder. We delighted in forcing bigger, slower competitors to follow us over difficult ground. Like guerillas, startups prefer the difficult terrain of the mountains, where the troops of the central government can’t follow. I can remember times when we were just exhausted after wrestling all day with some horrible technical problem. And I’d be delighted, because something that was hard for us would be impossible for our competitors.


Here, as so often, the best defense is a good offense. If you can develop technology that’s simply too hard for competitors to duplicate, you don’t need to rely on other defenses. Start by picking a hard problem, and then at every decision point, take the harder choice.


A startup is like a mosquito. A bear can absorb a hit and a crab is armored against one, but a mosquito is designed for one thing: to score. No energy is wasted on defense. The defense of mosquitos, as a species, is that there are a lot of them, but this is little consolation to the individual mosquito.


The all-or-nothing aspect of startups was not something we wanted. Viaweb’s hackers were all extremely risk-averse. If there had been some way just to work super hard and get paid for it, without having a lottery mixed in, we would have been delighted. We would have much preferred a 100% chance of $1 million to a 20% chance of $10 million, even though theoretically the second is worth twice as much. Unfortunately, there is not currently any space in the business world where you can get the first deal.


You’d think that a company about to buy you would do a lot of research and decide for themselves how valuable your technology was. Not at all. What they go by is the number of users you have.


Two things changed. The first was the rule of law. For most of the world’s history, if you did somehow accumulate a fortune, the ruler or his henchmen would find a way to steal it. But in medieval Europe something new happened. A new class of merchants and manufacturers began to collect in towns.10 Together they were able to withstand the local feudal lord. So for the first time in our history, the bullies stopped stealing the nerds’ lunch money.


Understanding this may help to answer an important question: why Europe grew so powerful. Was it something about the geography of Europe? Was it that Europeans are somehow racially superior? Was it their religion? The answer (or at least the proximate cause) may be that the Europeans rode on the crest of a powerful new idea: allowing those who made a lot of money to keep it.


When people care enough about something to do it well, those who do it best tend to be far better than everyone else. There’s a huge gap between Leonardo and second-rate contemporaries like Borgognone. You see the same gap between Raymond Chandler and the average writer of detective novels. A top-ranked professional chess player could play ten thousand games against an ordinary club player without losing once.


“Are they really worth 100 of us?” editorialists ask. Depends on what you mean by worth. If you mean worth in the sense of what people will pay for their skills, the answer is yes, apparently.


It may seem unlikely in principle that one individual could really generate so much more wealth than another. The key to this mystery is to revisit that question, are they really worth 100 of us? Would a basketball team trade one of their players for 100 random people? What would Apple’s next product look like if you replaced Steve Jobs with a committee of 100 random people?6 These things don’t scale linearly.


In more organized societies, like China, the ruler and his officials used taxation instead of confiscation. But here too we see the same principle: the way to get rich was not to create wealth, but to serve a ruler powerful enough to appropriate it.


it was not till the Industrial Revolution that wealth creation definitively replaced corruption as the best way to get rich. In England, at least, corruption only became unfashionable (and in fact only started to be called “corruption”) when there started to be other, faster ways to get rich.


By the nineteenth century that had changed. There continued to be bribes, as there still are everywhere, but politics had by then been left to men who were driven more by vanity than greed. Technology had made it possible to create wealth faster than you could steal it. The prototypical rich man of the nineteenth century was not a courtier but an industrialist.


It’s possible to buy expensive, handmade cars that cost hundreds of thousands of dollars. But there is not much point. Companies make more money by building a large number of ordinary cars than a small number of expensive ones. So a company making a mass-produced car can afford to spend a lot more on its design. If you buy a custom-made car, something will always be breaking. The only point of buying one now is to advertise that you can.


most people who are rich enough not to work do anyway. It’s not just social pressure that makes them; idleness is lonely and demoralizing.


Materially and socially, technology seems to be decreasing the gap between the rich and the poor, not increasing it. If Lenin walked around the offices of a company like Yahoo or Intel or Cisco, he’d think communism had won. Everyone would be wearing the same clothes, have the same kind of office (or rather, cubicle) with the same furnishings, and address one another by their first names instead of by honorifics.


If I had a choice of living in a society where I was materially much better off than I am now, but was among the poorest, or in one where I was the richest, but much worse off than I am now, I’d take the first option. If I had children, it would arguably be immoral not to. It’s absolute poverty you want to avoid, not relative poverty. If, as the evidence so far implies, you have to have one or the other in your society, take relative poverty.


Most of us were encouraged, as children, to leave this tangle unexamined. If you made fun of your little brother for coloring people green in his coloring book, your mother was likely to tell you something like “you like to do it your way and he likes to do it his way.” Your mother at this point was not trying to teach you important truths about aesthetics. She was trying to get the two of you to stop bickering.


When you’re forced to be simple, you’re forced to face the real problem. When you can’t deliver ornament, you have to deliver substance.


Aiming at timelessness is a way to make yourself find the best answer: if you can imagine someone surpassing you, you should do it yourself. Some of the greatest masters did this so well that they left little room for those who came after. Every engraver since Dürer suffers by comparison.


In science and engineering, some of the greatest discoveries seem so simple that you say to yourself, I could have thought of that. The discoverer is entitled to reply, why didn’t you?


When people talk about being in “the zone,” I think what they mean is that the spinal cord has the situation under control. Your spinal cord is less hesitant, and it frees conscious thought for the hard problems.


The danger of symmetry, and repetition especially, is that it can be used as a substitute for thought.


It takes confidence to throw work away. You have to be able to think, there’s more where that came from.


At an art school where I once studied, the students wanted most of all to develop a personal style. But if you just try to make good things, you’ll inevitably do it in a distinctive way, just as each person walks in a distinctive way. Michelangelo was not trying to paint like Michelangelo. He was just trying to paint well; he couldn’t help painting like Michelangelo.


Nothing is more powerful than a community of talented people working on related problems.


In practice I think it’s easier to see ugliness than to imagine beauty. Most of the people who’ve made beautiful things seem to have done it by fixing something they thought ugly.


Great work usually seems to happen because someone sees something and thinks, I could do better than that.


Part of the problem is that if you use a language for long enough, you start to think in it. So any language that’s substantially different feels terribly awkward, even if there’s nothing intrinsically wrong with it. Inexperienced programmers’ judgements about the relative merits of programming languages are often skewed by this effect.


believe what he’s saying. It’s possible to write the same primitive Pascal-like programs in almost every language. If you only ever eat at McDonald’s, it will seem that food is much the same in every country.


lets you do x is at least as good as one that forces you to.


Sure, use a language that lets you write object-oriented programs. Whether you ever actually want to then becomes a separate question.


Languages evolve slowly because they’re not really technologies. Languages are notation. A program is a formal description of the problem you want a computer to solve for you. So the rate of evolution in programming languages is more like the rate of evolution in mathematical notation than, say, transportation or communications. Mathematical notation does evolve, but not with the giant leaps you see in technology.


If some applications can be increasingly inefficient while others continue to demand all the speed the hardware can deliver, faster computers will mean that languages have to cover an ever wider range of efficiencies. We’ve seen this happening already. Current implementations of some popular new languages are shockingly wasteful by the standards of previous decades. This isn’t just something that happens with programming languages. It’s a general historical trend. As technologies improve, each generation can do things that the previous generation would have considered wasteful.


I can already tell you what’s going to happen to all those extra cycles that faster hardware is going to give us in the next hundred years. They’re nearly all going to be wasted.


Inefficient software isn’t gross. What’s gross is a language that makes programmers do needless work. Wasting programmer time is the true inefficiency, not wasting machine time. This will become ever more clear as computers get faster.


Somehow the idea of reusability got attached to object-oriented programming in the 1980s, and no amount of evidence to the contrary seems to be able to shake it free.


Language design is being taken over by hackers. The results so far are messy, but encouraging. There are some stunningly novel ideas in Perl, for example. Many are stunningly bad, but that’s always true of ambitious efforts.


But it would be hard to predict now what kinds of libraries might be needed in a hundred years. Presumably many libraries will be for domains that don’t even exist yet. If SETI@home works, for example, we’ll need libraries for communicating with aliens. Unless of course they are sufficiently advanced that they already communicate in XML.


Lisp is worth learning for the profound enlightenment experience you will have when you finally get it; that experience will make you a better programmer for the rest of your days, even if you never actually use Lisp itself a lot.


The average big company grows at about ten percent a year. So if you’re running a big company and you do everything the way the average big company does it, you can expect to do as well as the average big company — that is, to grow about ten percent a year. The same thing will happen if you’re running a startup, of course. If you do everything the way the average startup does it, you should expect average performance. The problem here is, average performance means you’ll go out of business. The survival rate for startups is way less than fifty percent. So if you’re running a startup, you had better be doing something odd. If not, you’re in trouble.


the reason everyone doesn’t use it is that programming languages are not merely technologies, but habits of mind as well, and nothing changes slower.


My purpose here is not to change anyone’s mind, but to reassure people already interested in using Lisp — people who know that Lisp is a powerful language, but worry because it isn’t widely used. In a competitive situation, that’s an advantage. Lisp’s power is multiplied by the fact that your competitors don’t get it. If you think of using Lisp in a startup, you shouldn’t worry that it isn’t widely understood. You should hope that it stays that way.


Ordinarily technology changes fast. But programming languages are different: programming languages are not just technology, but what programmers think in. They’re half technology and half religion.


If you ever do find yourself working for a startup, here’s a handy tip for evaluating competitors. Read their job listings. Everything else on their site may be stock photos or the prose equivalent, but the job listings have to be specific about what they want, or they’ll get the wrong candidates.


The pointy-haired boss miraculously combines two qualities that are common by themselves, but rarely seen together: (a) he knows nothing whatsoever about technology, and (b) he has very strong opinions about it.


Suppose, for example, you need to write a piece of software. The pointy-haired boss has no idea how this software has to work and can’t tell one programming language from another, and yet he knows what language you should write it in. Exactly. He thinks you should write it in Java. Why does he think this? Let’s take a look inside the brain of the pointy-haired boss. What he’s thinking is something like this. Java is a standard. I know it must be, because I read about it in the press all the time. Since it is a standard, I won’t get in trouble for using it. And that also means there will always be lots of Java programmers, so if those working for me now quit, as programmers working for me mysteriously always do, I can easily replace them.


And if you’d shown people Ruby in 1975 and described it as a dialect of Lisp with syntax, no one would have argued with you. Programming languages have almost caught up with 1958.


Figure 13-2. Alpha nerd: John McCarthy.


But in late 1958, Steve Russell,3 one of McCarthy’s grad students, looked at this definition of eval and realized that if he translated it into machine language, the result would be a Lisp interpreter. This was a big surprise at the time. Here is what McCarthy said about it later: Steve Russell said, look, why don’t I program this eval..., and I said to him, ho, ho, you’re confusing theory with practice, this eval is intended for reading, not for computing. But he went ahead and did it.
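
To give a feel for what Russell built, here is a deliberately tiny eval for a Lisp-like language, sketched in Python rather than 1958 machine code; the handful of forms it handles (quote, if, lambda, and application) are chosen for brevity and are not McCarthy’s exact definition.

    # A toy "eval" for a tiny Lisp-like language: expressions are Python lists
    # (for applications and special forms) and strings (for symbols).
    # A sketch of the idea, not McCarthy's original definition.
    def lisp_eval(expr, env):
        if isinstance(expr, str):                 # symbol: look it up
            return env[expr]
        if not isinstance(expr, list):            # number or other literal
            return expr
        head = expr[0]
        if head == "quote":                       # (quote x) -> x, unevaluated
            return expr[1]
        if head == "if":                          # (if test then else)
            branch = expr[2] if lisp_eval(expr[1], env) else expr[3]
            return lisp_eval(branch, env)
        if head == "lambda":                      # (lambda (params) body)
            params, body = expr[1], expr[2]
            return lambda *args: lisp_eval(body, {**env, **dict(zip(params, args))})
        fn = lisp_eval(head, env)                 # application: evaluate, then apply
        args = [lisp_eval(a, env) for a in expr[1:]]
        return fn(*args)

    env = {"+": lambda a, b: a + b}
    print(lisp_eval(["+", 1, 2], env))                                   # 3
    print(lisp_eval([["lambda", ["x"], ["+", "x", 1]], 41], env))        # 42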


Present-day Fortran is now arguably closer to Lisp than to Fortran I.


when it comes down to it, the pointy-haired boss doesn’t mind if his company gets their ass kicked, so long as no one can prove it’s his fault.


Within large organizations, the phrase used to describe this approach is “industry best practice.” Its purpose is to shield the pointy-haired boss from responsibility: if he chooses something that is “industry best practice,” and the company loses, he can’t be blamed. He didn’t choose, the industry did.


this term was originally used to describe accounting methods and so on. What it means, roughly, is don’t do anything weird. And in accounting that’s probably a good idea. The terms “cutting-edge” and “accounting” do not sound good together.


It’s a mistake to try to baby the user with long-winded expressions meant to resemble English.


something between an insult to his intelligence and a sin against God.


A friend of mine rarely does anything the first time someone asks him. He knows that people sometimes ask for things they turn out not to want. To avoid wasting his time, he waits till the third or fourth time he’s asked to do something. By then whoever’s asking him may be fairly annoyed, but at least they probably really do want whatever they’re asking for.


By delaying learning VRML, I avoided having to learn it at all.


People who do good work often think that whatever they’re working on is no good. Others see what they’ve done and think it’s wonderful, but the creator sees nothing but flaws. This pattern is no coincidence: worry made the work good.


Anything you can do to keep the redesign cycle going is good. Prose can be rewritten over and over until you’re happy with it. But software, as a rule, doesn’t get redesigned enough. Prose has readers, but software has users. If a writer rewrites an essay, people who read the new version are unlikely to complain that their thoughts have been broken by some newly introduced incompatibility.


The language offers abstractions only as a way of saving you work, rather than as a way of telling you what to do.


Visitors to this country are often surprised to find that Americans like to begin a conversation by asking “what do you do?” I’ve never liked this question. I’ve rarely had a neat answer to it. But I think I have finally solved the problem. Now, when someone asks me what I do, I look them straight in the eye and say, “I’m designing a new dialect of Lisp.” I recommend this answer to anyone who doesn’t like being asked what they do. The conversation will turn immediately to other topics.


When I say that design must be for users, I don’t mean to imply that good design aims at some kind of lowest common denominator. You can pick any group of users you want. If you’re designing a tool, for example, you can design it for anyone from beginners to experts, and what’s good design for one group might be bad for another. The point is, you have to pick some group of users. I don’t think you can even talk about good or bad design except with reference to some intended user.


looking down on the user, however benevolently, always seems to corrupt the designer. I suspect few housing projects in the US were designed by architects who expected to live in them. You see the same thing in programming languages. C, Lisp, and Smalltalk were created for their own designers to use. Cobol, Ada, and Java were created for other people to use.


If you think you’re designing something for idiots, odds are you’re not designing something good, even for idiots.


A program, like a proof, is a pruned version of a tree that in the past has had false starts branching off all over it. So the test of a language is not simply how clean the finished program looks in it, but how clean the path to the finished program was. A design choice that gives you elegant finished programs may not give you an elegant design process. For example, I’ve written a few macro defining macros that look now like little gems, but writing them took hours of the ugliest trial and error, and frankly, I’m still not entirely sure they’re correct.


If you open an average “literary” novel and imagine reading it out loud to your friends as something you’d written, you’ll feel all too keenly what an imposition that kind of thing is upon the reader.


this idea is known as Worse is Better. Actually, there are several ideas mixed together in the concept of Worse is Better, which is why people are still arguing about whether worse is actually better or not.


Worse is Better is found throughout the arts. In drawing, for example, the idea was discovered during the Renaissance. Now almost every drawing teacher will tell you that the right way to get an accurate drawing is not to work your way slowly around the contour of an object, because errors will accumulate and you’ll find at the end that the lines don’t meet. Instead you should draw a few quick lines in roughly the right place, and then gradually refine this initial sketch.


What made oil paint so exciting, when it first became popular in the fifteenth century, was that you could make the finished work from the prototype. You could make a preliminary drawing if you wanted to, but you weren’t held to it; you could work out all the details, and even make major changes, as you finished the painting.


Morale is another reason that it’s hard to design something for an unsophisticated user. It’s hard to stay interested in something you don’t like yourself. To make something good, you have to be thinking, “wow, this is really great,” not “what a piece of shit; those fools will love it.”