March 2021
I try to write using ordinary words and simple sentences.
That kind of writing is easier to read, and the easier something is to read,
the more deeply readers will engage with it. The less energy they expend on
your prose, the more they'll have left for your ideas.
And the further they'll read. Most readers' energy tends to flag part way
through an article or essay. If the friction of reading is low enough, more
keep going till the end.
There's an Italian dish called _saltimbocca_ , which means "leap into the
mouth." My goal when writing might be called _saltintesta_ : the ideas leap
into your head and you barely notice the words that got them there.
It's too much to hope that writing could ever be pure ideas. You might not
even want it to be. But for most writers, most of the time, that's the goal to
aim for. The gap between most writing and pure ideas is not filled with
poetry.
Plus it's more considerate to write simply. When you write in a fancy way to
impress people, you're making them do extra work just so you can seem cool.
It's like trailing a long train behind you that readers have to carry.
And remember, if you're writing in English, that a lot of your readers won't
be native English speakers. Their understanding of ideas may be way ahead of
their understanding of English. So you can't assume that writing about a
difficult topic means you can use difficult words.
Of course, fancy writing doesn't just conceal ideas. It can also conceal the
lack of them. That's why some people write that way, to conceal the fact that
they have
[nothing](https://scholar.google.com/scholar?hl=en&as_sdt=0%2C5&q=hermeneutic+dialectics+hegemonic+modalities)
to say. Whereas writing simply keeps you honest. If you say nothing simply, it
will be obvious to everyone, including you.
Simple writing also lasts better. People reading your stuff in the future will
be in much the same position as people from other countries reading it today.
The culture and the language will have changed. It's not vain to care about
that, any more than it's vain for a woodworker to build a chair to last.
Indeed, lasting is not merely an accidental quality of chairs, or writing.
It's a sign you did a good job.
But although these are all real advantages of writing simply, none of them are
why I do it. The main reason I write simply is that it offends me not to. When
I write a sentence that seems too complicated, or that uses unnecessarily
intellectual words, it doesn't seem fancy to me. It seems clumsy.
There are of course times when you want to use a complicated sentence or fancy
word for effect. But you should never do it by accident.
The other reason my writing ends up being simple is the way I do it. I write
the first draft fast, then spend days editing it, trying to get everything
just right. Much of this editing is cutting, and that makes simple writing
even simpler.
**Want to start a startup?** Get funded by [Y
Combinator](http://ycombinator.com/apply.html).
December 2010
Someone we funded is talking to VCs now, and asked me how common it was for a
startup's founders to retain control of the board after a series A round. He
said VCs told him this almost never happened.
Ten years ago that was true. In the past, founders rarely kept control of the
board through a series A. The traditional series A board consisted of two
founders, two VCs, and one independent member. More recently the recipe is
often one founder, one VC, and one independent. In either case the founders
lose their majority.
But not always. Mark Zuckerberg kept control of Facebook's board through the
series A and still has it today. Mark Pincus has kept control of Zynga's too.
But are these just outliers? How common is it for founders to keep control
after an A round? I'd heard of several cases among the companies we've funded,
but I wasn't sure how many there were, so I emailed the ycfounders list.
The replies surprised me. In a dozen companies we've funded, the founders
still had a majority of the board seats after the series A round.
I feel like we're at a tipping point here. A lot of VCs still act as if
founders retaining board control after a series A is unheard-of. A lot of them
try to make you feel bad if you even ask — as if you're a noob or a control
freak for wanting such a thing. But the founders I heard from aren't noobs or
control freaks. Or if they are, they are, like Mark Zuckerberg, the kind of
noobs and control freaks VCs should be trying to fund more of.
Founders retaining control after a series A is clearly heard-of. And barring
financial catastrophe, I think in the coming year it will become the norm.
Control of a company is a more complicated matter than simply outvoting other
parties in board meetings. Investors usually get vetoes over certain big
decisions, like selling the company, regardless of how many board seats they
have. And board votes are rarely split. Matters are decided in the discussion
preceding the vote, not in the vote itself, which is usually unanimous. But if
opinion is divided in such discussions, the side that knows it would lose in a
vote will tend to be less insistent. That's what board control means in
practice. You don't simply get to do whatever you want; the board still has to
act in the interest of the shareholders; but if you have a majority of board
seats, then your opinion about what's in the interest of the shareholders will
tend to prevail.
So while board control is not total control, it's not imaginary either.
There's inevitably a difference in how things feel within the company. Which
means if it becomes the norm for founders to retain board control after a
series A, that will change the way things feel in the whole startup world.
The switch to the new norm may be surprisingly fast, because the startups that
can retain control tend to be the best ones. They're the ones that set the
trends, both for other startups and for VCs.
A lot of the reason VCs are harsh when negotiating with startups is that
they're embarrassed to go back to their partners looking like they got beaten.
When they sign a termsheet, they want to be able to brag about the good terms
they got. A lot of them don't care that much personally about whether founders
keep board control. They just don't want to seem like they had to make
concessions. Which means if letting the founders keep control stops being
perceived as a concession, it will rapidly become much more common.
Like a lot of changes that have been forced on VCs, this change won't turn out
to be as big a problem as they might think. VCs will still be able to
convince; they just won't be able to compel. And the startups where they have
to resort to compulsion are not the ones that matter anyway. VCs make most of
their money from a few big hits, and those aren't them.
Knowing that founders will keep control of the board may even help VCs pick
better. If they know they can't fire the founders, they'll have to choose
founders they can trust. And that's who they should have been choosing all
along.
**Thanks** to Sam Altman, John Bautista, Trevor Blackwell, Paul Buchheit,
Brian Chesky, Bill Clerico, Patrick Collison, Adam Goldstein, James
Lindenbaum, Jessica Livingston, and Fred Wilson for reading drafts of this.
December 2008
A few months ago I read a _New York Times_ article on South Korean cram
schools that said
> Admission to the right university can make or break an ambitious young South
> Korean.
A parent added:
> "In our country, college entrance exams determine 70 to 80 percent of a
> person's future."
It was striking how old fashioned this sounded. And yet when I was in high
school it wouldn't have seemed too far off as a description of the US. Which
means things must have been changing here.
The course of people's lives in the US now seems to be determined less by
credentials and more by performance than it was 25 years ago. Where you go to
college still matters, but not like it used to.
What happened?
_____
Judging people by their academic credentials was in its time an advance. The
practice seems to have begun in China, where starting in 587 candidates for
the imperial civil service had to take an exam on classical literature. [1] It
was also a test of wealth, because the knowledge it tested was so specialized
that passing required years of expensive training. But though wealth was a
necessary condition for passing, it was not a sufficient one. By the standards
of the rest of the world in 587, the Chinese system was very enlightened.
Europeans didn't introduce formal civil service exams till the nineteenth
century, and even then they seem to have been influenced by the Chinese
example.
Before credentials, government positions were obtained mainly by family
influence, if not outright bribery. It was a great step forward to judge
people by their performance on a test. But by no means a perfect solution.
When you judge people that way, you tend to get cram schools—which they did in
Ming China and nineteenth century England just as much as in present day South
Korea.
What cram schools are, in effect, is leaks in a seal. The use of credentials
was an attempt to seal off the direct transmission of power between
generations, and cram schools represent that power finding holes in the seal.
Cram schools turn wealth in one generation into credentials in the next.
It's hard to beat this phenomenon, because the schools adjust to suit whatever
the tests measure. When the tests are narrow and predictable, you get cram
schools on the classic model, like those that prepared candidates for
Sandhurst (the British West Point) or the classes American students take now
to improve their SAT scores. But as the tests get broader, the schools do too.
Preparing a candidate for the Chinese imperial civil service exams took years,
as prep school does today. But the raison d'etre of all these institutions has
been the same: to beat the system. [2]
_____
History suggests that, all other things being equal, a society prospers in
proportion to its ability to prevent parents from influencing their children's
success directly. It's a fine thing for parents to help their children
indirectly—for example, by helping them to become smarter or more disciplined,
which then makes them more successful. The problem comes when parents use
direct methods: when they are able to use their own wealth or power as a
substitute for their children's qualities.
Parents will tend to do this when they can. Parents will die for their kids,
so it's not surprising to find they'll also push their scruples to the limits
for them. Especially if other parents are doing it.
Sealing off this force has a double advantage. Not only does a society get
"the best man for the job," but parents' ambitions are diverted from direct
methods to indirect ones—to actually trying to raise their kids well.
But we should expect it to be very hard to contain parents' efforts to obtain
an unfair advantage for their kids. We're dealing with one of the most
powerful forces in human nature. We shouldn't expect naive solutions to work,
any more than we'd expect naive solutions for keeping heroin out of a prison
to work.
_____
The obvious way to solve the problem is to make credentials better. If the
tests a society uses are currently hackable, we can study the way people beat
them and try to plug the holes. You can use the cram schools to show you where
most of the holes are. They also tell you when you're succeeding in fixing
them: when cram schools become less popular.
A more general solution would be to push for increased transparency,
especially at critical social bottlenecks like college admissions. In the US
this process still shows many outward signs of corruption. For example, legacy
admissions. The official story is that legacy status doesn't carry much
weight, because all it does is break ties: applicants are bucketed by ability,
and legacy status is only used to decide between the applicants in the bucket
that straddles the cutoff. But what this means is that a university can make
legacy status have as much or as little weight as they want, by adjusting the
size of the bucket that straddles the cutoff.
By gradually chipping away at the abuse of credentials, you could probably
make them more airtight. But what a long fight it would be. Especially when
the institutions administering the tests don't really want them to be
airtight.
_____
Fortunately there's a better way to prevent the direct transmission of power
between generations. Instead of trying to make credentials harder to hack, we
can also make them matter less.
Let's think about what credentials are for. What they are, functionally, is a
way of predicting performance. If you could measure actual performance, you
wouldn't need them.
So why did they even evolve? Why haven't we just been measuring actual
performance? Think about where credentialism first appeared: in selecting
candidates for large organizations. Individual performance is hard to measure
in large organizations, and the harder performance is to measure, the more
important it is to predict it. If an organization could immediately and
cheaply measure the performance of recruits, they wouldn't need to examine
their credentials. They could take everyone and keep just the good ones.
Large organizations can't do this. But a bunch of small organizations in a
market can come close. A market takes every organization and keeps just the
good ones. As organizations get smaller, this approaches taking every person
and keeping just the good ones. So all other things being equal, a society
consisting of more, smaller organizations will care less about credentials.
_____
That's what's been happening in the US. That's why those quotes from Korea
sound so old fashioned. They're talking about an economy like America's a few
decades ago, dominated by a few big companies. The route for the ambitious in
that sort of environment is to join one and climb to the top. Credentials
matter a lot then. In the culture of a large organization, an elite pedigree
becomes a self-fulfilling prophecy.
This doesn't work in small companies. Even if your colleagues were impressed
by your credentials, they'd soon be parted from you if your performance didn't
match, because the company would go out of business and the people would be
dispersed.
In a world of small companies, performance is all anyone cares about. People
hiring for a startup don't care whether you've even graduated from college,
let alone which one. All they care about is what you can do. Which is in fact
all that should matter, even in a large organization. The reason credentials
have such prestige is that for so long the large organizations in a society
tended to be the most powerful. But in the US at least they don't have the
monopoly on power they once did, precisely because they can't measure (and
thus reward) individual performance. Why spend twenty years climbing the
corporate ladder when you can get rewarded directly by the market?
I realize I see a more exaggerated version of the change than most other
people. As a partner at an early stage venture funding firm, I'm like a
jumpmaster shoving people out of the old world of credentials and into the new
one of performance. I'm an agent of the change I'm seeing. But I don't think
I'm imagining it. It was not so easy 25 years ago for an ambitious person to
choose to be judged directly by the market. You had to go through bosses, and
they were influenced by where you'd been to college.
_____
What made it possible for small organizations to succeed in America? I'm still
not entirely sure. Startups are certainly a large part of it. Small
organizations can develop new ideas faster than large ones, and new ideas are
increasingly valuable.
But I don't think startups account for all the shift from credentials to
measurement. My friend Julian Weber told me that when he went to work for a
New York law firm in the 1950s they paid associates far less than firms do
today. Law firms then made no pretense of paying people according to the value
of the work they'd done. Pay was based on seniority. The younger employees
were paying their dues. They'd be rewarded later.
The same principle prevailed at industrial companies. When my father was
working at Westinghouse in the 1970s, he had people working for him who made
more than he did, because they'd been there longer.
Now companies increasingly have to pay employees market price for the work
they do. One reason is that employees no longer trust companies to deliver
[deferred rewards](ladder.html): why work to accumulate deferred rewards at a
company that might go bankrupt, or be taken over and have all its implicit
obligations wiped out? The other is that some companies broke ranks and
started to pay young employees large amounts. This was particularly true in
consulting, law, and finance, where it led to the phenomenon of yuppies. The
word is rarely used today because it's no longer surprising to see a 25 year
old with money, but in 1985 the sight of a 25 year old _professional_ able to
afford a new BMW was so novel that it called forth a new word.
The classic yuppie worked for a small organization. He didn't work for General
Widget, but for the law firm that handled General Widget's acquisitions or the
investment bank that floated their bond issues.
Startups and yuppies entered the American conceptual vocabulary roughly
simultaneously in the late 1970s and early 1980s. I don't think there was a
causal connection. Startups happened because technology started to change so
fast that big companies could no longer keep a lid on the smaller ones. I
don't think the rise of yuppies was inspired by it; it seems more as if there
was a change in the social conventions (and perhaps the laws) governing the
way big companies worked. But the two phenomena rapidly fused to produce a
principle that now seems obvious: paying energetic young people market rates,
and getting correspondingly high performance from them.
At about the same time the US economy rocketed out of the doldrums that had
afflicted it for most of the 1970s. Was there a connection? I don't know
enough to say, but it felt like it at the time. There was a lot of energy
released.
_____
Countries worried about their competitiveness are right to be concerned about
the number of startups started within them. But they would do even better to
examine the underlying principle. Do they let energetic young people get paid
market rate for the work they do? The young are the test, because when people
aren't rewarded according to performance, they're invariably rewarded
according to seniority instead.
All it takes is a few beachheads in your economy that pay for performance.
Measurement spreads like heat. If one part of a society is better at
measurement than others, it tends to push the others to do better. If people
who are young but smart and driven can make more by starting their own
companies than by working for existing ones, the existing companies are forced
to pay more to keep them. So market rates gradually permeate every
organization, even the government. [3]
The measurement of performance will tend to push even the organizations
issuing credentials into line. When we were kids I used to annoy my sister by
ordering her to do things I knew she was about to do anyway. As credentials
are superseded by performance, a similar role is the best former gatekeepers
can hope for. Once credential-granting institutions are no longer in the
self-fulfilling prophecy business, they'll have to work harder to predict the
future.
_____
Credentials are a step beyond bribery and influence. But they're not the final
step. There's an even better way to block the transmission of power between
generations: to encourage the trend toward an economy made of more, smaller
units. Then you can measure what credentials merely predict.
No one likes the transmission of power between generations—not the left or the
right. But the market forces favored by the right turn out to be a better way
of preventing it than the credentials the left are forced to fall back on.
The era of credentials began to end when the power of large organizations
[peaked](highres.html) in the late twentieth century. Now we seem to be
entering a new era based on measurement. The reason the new model has advanced
so rapidly is that it works so much better. It shows no sign of slowing.
**Notes**
[1] Miyazaki, Ichisada (Conrad Schirokauer trans.), _China's Examination Hell:
The Civil Service Examinations of Imperial China,_ Yale University Press,
1981.
Scribes in ancient Egypt took exams, but they were more the type of
proficiency test any apprentice might have to pass.
[2] When I say the raison d'etre of prep schools is to get kids into better
colleges, I mean this in the narrowest sense. I'm not saying that's all prep
schools do, just that if they had zero effect on college admissions there
would be far less demand for them.
[3] Progressive tax rates will tend to damp this effect, however, by
decreasing the difference between good and bad measurers.
**Thanks** to Trevor Blackwell, Sarah Harlin, Jessica Livingston, and David
Sloo for reading drafts of this.
October 2005
The first Summer Founders Program has just finished. We were surprised how
well it went. Overall only about 10% of startups succeed, but if I had to
guess now, I'd predict three or four of the eight startups we funded will make
it.
Of the startups that needed further funding, I believe all have either closed
a round or are likely to soon. Two have already turned down (lowball)
acquisition offers.
We would have been happy if just one of the eight seemed promising by the end
of the summer. What's going on? Did some kind of anomaly make this summer's
applicants especially good? We worry about that, but we can't think of one.
We'll find out this winter.
The whole summer was full of surprises. The best was that the
[hypothesis](hiring.html) we were testing seems to be correct. Young hackers
can start viable companies. This is good news for two reasons: (a) it's an
encouraging thought, and (b) it means that Y Combinator, which is predicated
on the idea, is not hosed.
**Age**
More precisely, the hypothesis was that success in a startup depends mainly on
how smart and energetic you are, and much less on how old you are or how much
business experience you have. The results so far bear this out. The 2005
summer founders ranged in age from 18 to 28 (average 23), and there is no
correlation between their ages and how well they're doing.
This should not really be surprising. Bill Gates and Michael Dell were both 19
when they started the companies that made them famous. Young founders are not
a new phenomenon: the trend began as soon as computers got cheap enough for
college kids to afford them.
Another of our hypotheses was that you can start a startup on less money than
most people think. Other investors were surprised to hear the most we gave any
group was $20,000. But we knew it was possible to start on that little because
we started Viaweb on $10,000.
And so it proved this summer. Three months' funding is enough to get into
second gear. We had a demo day for potential investors ten weeks in, and seven
of the eight groups had a prototype ready by that time. One,
[Reddit](http://reddit.com), had already launched, and were able to give a
demo of their live site.
A researcher who studied the SFP startups said the one thing they had in
common was that they all worked ridiculously hard. People this age are
commonly seen as lazy. I think in some cases it's not so much that they lack
the appetite for work, but that the work they're offered is unappetizing.
The experience of the SFP suggests that if you let motivated people do real
work, they work hard, whatever their age. As one of the founders said "I'd
read that starting a startup consumed your life, but I had no idea what that
meant until I did it."
I'd feel guilty if I were a boss making people work this hard. But we're not
these people's bosses. They're working on their own projects. And what makes
them work is not us but their competitors. Like good athletes, they don't work
hard because the coach yells at them, but because they want to win.
We have less power than bosses, and yet the founders work harder than
employees. It seems like a win for everyone. The only catch is that we get on
average only about 5-7% of the upside, while an employer gets nearly all of
it. (We're counting on it being 5-7% of a much larger number.)
As well as working hard, the groups all turned out to be extraordinarily
responsible. I can't think of a time when one failed to do something they'd
promised to, even by being late for an appointment. This is another lesson the
world has yet to learn. One of the founders discovered that the hardest part
of arranging a meeting with executives at a big cell phone carrier was getting
a rental company to rent him a car, because he was too young.
I think the problem here is much the same as with the apparent laziness of
people this age. They seem lazy because the work they're given is pointless,
and they act irresponsible because they're not given any power. Some of them,
anyway. We only have a sample size of about twenty, but it seems so far that
if you let people in their early twenties be their own bosses, they rise to
the occasion.
**Morale**
The summer founders were as a rule very idealistic. They also wanted very much
to get rich. These qualities might seem incompatible, but they're not. These
guys want to get rich, but they want to do it by changing the world. They
wouldn't (well, seven of the eight groups wouldn't) be interested in making
money by speculating in stocks. They want to make something people use.
I think this makes them more effective as founders. As hard as people will
work for money, they'll work harder for a cause. And since success in a
startup depends so much on motivation, the paradoxical result is that the
people likely to make the most money are those who aren't in it just for the
money.
The founders of [Kiko](http://kiko.com), for example, are working on an Ajax
calendar. They want to get rich, but they pay more attention to design than
they would if that were their only motivation. You can tell just by looking at
it.
I never considered it till this summer, but this might be another reason
startups run by hackers tend to do better than those run by MBAs. Perhaps it's
not just that hackers understand technology better, but that they're driven by
more powerful motivations. Microsoft, as I've said before, is a dangerously
misleading example. Their mean corporate culture only works for monopolies.
Google is a better model.
Considering that the summer founders are the sharks in this ocean, we were
surprised how frightened most of them were of competitors. But now that I
think of it, we were just as frightened when we started Viaweb. For the first
year, our initial reaction to news of a competitor was always: we're doomed.
Just as a hypochondriac magnifies his symptoms till he's convinced he has some
terrible disease, when you're not used to competitors you magnify them into
monsters.
Here's a handy rule for startups: competitors are rarely as dangerous as they
seem. Most will self-destruct before you can destroy them. And it certainly
doesn't matter how many of them there are, any more than it matters to the
winner of a marathon how many runners are behind him.
"It's a crowded market," I remember one founder saying worriedly.
"Are you the current leader?" I asked.
"Yes."
"Is anyone able to develop software faster than you?"
"Probably not."
"Well, if you're ahead now, and you're the fastest, then you'll stay ahead.
What difference does it make how many others there are?"
Another group was worried when they realized they had to rewrite their
software from scratch. I told them it would be a bad sign if they didn't. The
main function of your initial version is to be rewritten.
That's why we advise groups to ignore issues like scalability,
internationalization, and heavy-duty security at first. [1] I can imagine an
advocate of "best practices" saying these ought to be considered from the
start. And he'd be right, except that they interfere with the primary function
of software in a startup: to be a vehicle for experimenting with its own
design. Having to retrofit internationalization or scalability is a pain,
certainly. The only bigger pain is not needing to, because your initial
version was too big and rigid to evolve into something users wanted.
I suspect this is another reason startups beat big companies. Startups can be
irresponsible and release version 1s that are light enough to evolve. In big
companies, all the pressure is in the direction of over-engineering.
**What Got Learned**
One thing we were curious about this summer was where these groups would need
help. That turned out to vary a lot. Some we helped with technical advice--
for example, about how to set up an application to run on multiple servers.
Most we helped with strategy questions, like what to patent, and what to
charge for and what to give away. Nearly all wanted advice about dealing with
future investors: how much money should they take and what kind of terms
should they expect?
However, all the groups quickly learned how to deal with stuff like patents
and investors. These problems aren't intrinsically difficult, just unfamiliar.
It was surprising-- slightly frightening even-- how fast they learned. The
weekend before the demo day for investors, we had a practice session where all
the groups gave their presentations. They were all terrible. We tried to
explain how to make them better, but we didn't have much hope. So on demo day
I told the assembled angels and VCs that these guys were hackers, not MBAs,
and so while their software was good, we should not expect slick presentations
from them.
The groups then proceeded to give fabulously slick presentations. Gone were
the mumbling recitations of lists of features. It was as if they'd spent the
past week at acting school. I still don't know how they did it.
Perhaps watching each other's presentations helped them see what they'd been
doing wrong. Just as happens in college, the summer founders learned a lot
from one another-- maybe more than they learned from us. A lot of the problems
they face are the same, from dealing with investors to hacking Javascript.
I don't want to give the impression there were no problems this summer. A lot
went wrong, as usually happens with startups. One group got an "[exploding
term-sheet](http://www.ventureblog.com/articles/indiv/2003/000024.html)" from
some VCs. Pretty much all the groups who had dealings with big companies found
that big companies do everything infinitely slowly. (This is to be expected.
If big companies weren't incapable, there would be no room for startups to
exist.) And of course there were the usual nightmares associated with servers.
In short, the disasters this summer were just the usual childhood diseases.
Some of this summer's eight startups will probably die eventually; it would be
extraordinary if all eight succeeded. But what kills them will not be
dramatic, external threats, but a mundane, internal one: not getting enough
done.
So far, though, the news is all good. In fact, we were surprised how much fun
the summer was for us. The main reason was how much we liked the founders.
They're so earnest and hard-working. They seem to like us too. And this
illustrates another advantage of investing over hiring: our relationship with
them is way better than it would be between a boss and an employee. Y
Combinator ends up being more like an older brother than a parent.
I was surprised how much time I spent making introductions. Fortunately I
discovered that when a startup needed to talk to someone, I could usually get
to the right person by at most one hop. I remember wondering, how did my
friends get to be so eminent? and a second later realizing: shit, I'm forty.
Another surprise was that the three-month batch format, which we were forced
into by the constraints of the summer, turned out to be an advantage. When we
started Y Combinator, we planned to invest the way other venture firms do: as
proposals came in, we'd evaluate them and decide yes or no. The SFP was just
an experiment to get things started. But it worked so well that we plan to do
[all](http://ycombinator.com/funding.html) our investing this way, one cycle
in the summer and one in winter. It's more efficient for us, and better for
the startups too.
Several groups said our weekly dinners saved them from a common problem
afflicting startups: working so hard that one has no social life. (I remember
that part all too well.) This way, they were guaranteed a social event at
least once a week.
**Independence**
I've heard Y Combinator described as an "incubator." Actually we're the
opposite: incubators exert more control than ordinary VCs, and we make a point
of exerting less. Among other things, incubators usually make you work in
their office-- that's where the word "incubator" comes from. That seems the
wrong model. If investors get too involved, they smother one of the most
powerful forces in a startup: the feeling that it's your own company.
Incubators were conspicuous failures during the Bubble. There's still debate
about whether this was because of the Bubble, or because they're a bad idea.
My vote is they're a bad idea. I think they fail because they select for the
wrong people. When we were starting a startup, we would never have taken
funding from an "incubator." We can find office space, thanks; just give us
the money. And people with that attitude are the ones likely to succeed in
startups.
Indeed, one quality all the founders shared this summer was a spirit of
independence. I've been wondering about that. Are some people just a lot more
independent than others, or would everyone be this way if they were allowed
to?
As with most nature/nurture questions, the answer is probably: some of each.
But my main conclusion from the summer is that there's more environment in the
mix than most people realize. I could see that from how the founders'
attitudes _changed_ during the summer. Most were emerging from twenty or so
years of being told what to do. They seemed a little surprised at having total
freedom. But they grew into it really quickly; some of these guys now seem
about four inches taller (metaphorically) than they did at the beginning of
the summer.
When we asked the summer founders what surprised them most about starting a
company, one said "the most shocking thing is that it worked."
It will take more experience to know for sure, but my guess is that a lot of
hackers could do this-- that if you put people in a position of independence,
they develop the qualities they need. Throw them off a cliff, and most will
find on the way down that they have wings.
The reason this is news to anyone is that the same forces work in the other
direction too. Most hackers are employees, and this
[molds](http://software.ericsink.com/entries/No_Great_Hackers.html) you into
someone to whom starting a startup seems impossible as surely as starting a
startup molds you into someone who can handle it.
If I'm right, "hacker" will mean something different in twenty years than it
does now. Increasingly it will mean the people who run the company. Y
Combinator is just accelerating a process that would have happened anyway.
Power is shifting from the people who deal with money to the people who create
technology, and if our experience this summer is any guide, this will be a
good thing.
**Notes**
[1] By heavy-duty security I mean efforts to protect against truly determined
attackers.
The
[image](https://sep.turbifycdn.com/ty/cdn/paulgraham/sfptable.jpg?t=1688221954&)
shows us, the 2005 summer founders, and Smartleaf co-founders Mark Nitzberg
and Olin Shivers at the 30-foot table Kate Courteau designed for us. Photo by
Alex Lewin.
**Thanks** to Sarah Harlin, Steve Huffman, Jessica Livingston, Zak Stone, and
Aaron Swartz for reading drafts of this.
May 2002
"The quantity of meaning compressed into a small space by algebraic signs, is
another circumstance that facilitates the reasonings we are accustomed to
carry on by their aid."
- Charles Babbage, quoted in Iverson's Turing Award Lecture
In the discussion about issues raised by [Revenge of the Nerds](icad.html) on
the LL1 mailing list, Paul Prescod wrote something that stuck in my mind.
> Python's goal is regularity and readability, not succinctness.
On the face of it, this seems a rather damning thing to claim about a
programming language. As far as I can tell, succinctness = power. If so, then
substituting, we get
> Python's goal is regularity and readability, not power.
and this doesn't seem a tradeoff (if it _is_ a tradeoff) that you'd want to
make. It's not far from saying that Python's goal is not to be effective as a
programming language.
Does succinctness = power? This seems to me an important question, maybe the
most important question for anyone interested in language design, and one that
it would be useful to confront directly. I don't feel sure yet that the answer
is a simple yes, but it seems a good hypothesis to begin with.
**Hypothesis**
My hypothesis is that succinctness is power, or is close enough that except in
pathological examples you can treat them as identical.
It seems to me that succinctness is what programming languages are _for._
Computers would be just as happy to be told what to do directly in machine
language. I think that the main reason we take the trouble to develop high-
level languages is to get leverage, so that we can say (and more importantly,
think) in 10 lines of a high-level language what would require 1000 lines of
machine language. In other words, the main point of high-level languages is to
make source code smaller.
If smaller source code is the purpose of high-level languages, and the power
of something is how well it achieves its purpose, then the measure of the
power of a programming language is how small it makes your programs.
Conversely, a language that doesn't make your programs small is doing a bad
job of what programming languages are supposed to do, like a knife that
doesn't cut well, or printing that's illegible.
**Metrics**
Small in what sense though? The most common measure of code size is lines of
code. But I think that this metric is the most common because it is the
easiest to measure. I don't think anyone really believes it is the true test
of the length of a program. Different languages have different conventions for
how much you should put on a line; in C a lot of lines have nothing on them
but a delimiter or two.
Another easy test is the number of characters in a program, but this is not
very good either; some languages (Perl, for example) just use shorter
identifiers than others.
I think a better measure of the size of a program would be the number of
elements, where an element is anything that would be a distinct node if you
drew a tree representing the source code. The name of a variable or function
is an element; an integer or a floating-point number is an element; a segment
of literal text is an element; an element of a pattern, or a format directive,
is an element; a new block is an element. There are borderline cases (is -5
two elements or one?) but I think most of them are the same for every
language, so they don't affect comparisons much.
This metric needs fleshing out, and it could require interpretation in the
case of specific languages, but I think it tries to measure the right thing,
which is the number of parts a program has. I think the tree you'd draw in
this exercise is what you have to make in your head in order to conceive of
the program, and so its size is proportionate to the amount of work you have
to do to write or read it.
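
To make the metric concrete, here is a minimal sketch in Python (chosen only for illustration; the essay doesn't tie the metric to any particular language) that approximates the number of elements in a program by counting the nodes in its parse tree. The function name and the sample programs are mine, not anything from the essay.

```python
import ast

def element_count(source: str) -> int:
    """Approximate the 'elements' metric: count every node in the
    abstract syntax tree (names, literals, operators, blocks, ...)."""
    return sum(1 for _ in ast.walk(ast.parse(source)))

# The same computation phrased two ways: the more succinct phrasing
# has fewer elements, even though its line count barely differs.
loop_version = """
total = 0
for x in xs:
    total = total + x * x
"""
comprehension_version = "total = sum(x * x for x in xs)"

print(element_count(loop_version))           # larger count
print(element_count(comprehension_version))  # smaller count
```

Counting AST nodes isn't exactly the tree you'd draw by hand (Python's grammar adds some bookkeeping nodes), but it tracks the same quantity: the number of parts you have to hold in your head.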
**Design**
This kind of metric would allow us to compare different languages, but that is
not, at least for me, its main value. The main value of the succinctness test
is as a guide in _designing_ languages. The most useful comparison between
languages is between two potential variants of the same language. What can I
do in the language to make programs shorter?
If the conceptual load of a program is proportionate to its complexity, and a
given programmer can tolerate a fixed conceptual load, then this is the same
as asking, what can I do to enable programmers to get the most done? And that
seems to me identical to asking, how can I design a good language?
(Incidentally, nothing makes it more patently obvious that the old chestnut
"all languages are equivalent" is false than designing languages. When you are
designing a new language, you're _constantly_ comparing two languages-- the
language if I did x, and if I didn't-- to decide which is better. If this were
really a meaningless question, you might as well flip a coin.)
Aiming for succinctness seems a good way to find new ideas. If you can do
something that makes many different programs shorter, it is probably not a
coincidence: you have probably discovered a useful new abstraction. You might
even be able to write a program to help by searching source code for repeated
patterns. Among other languages, those with a reputation for succinctness
would be the ones to look to for new ideas: Forth, Joy, Icon.
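
In the same spirit, here is a toy version of the pattern-searching program mentioned above: it reports runs of tokens that occur more than once in a source file, a crude hint at where a new abstraction might be hiding. Everything here (names, thresholds, the sample input) is my own sketch, not something the essay specifies.

```python
import collections
import io
import tokenize

def repeated_ngrams(source: str, n: int = 4, min_count: int = 2):
    """Toy pattern finder: count runs of n consecutive tokens and
    report the ones that appear at least min_count times."""
    tokens = [t.string
              for t in tokenize.generate_tokens(io.StringIO(source).readline)
              if t.string.strip()]
    grams = collections.Counter(tuple(tokens[i:i + n])
                                for i in range(len(tokens) - n + 1))
    return {g: c for g, c in grams.items() if c >= min_count}

sample = (
    "print(len(user.name))\n"
    "print(len(user.email))\n"
    "print(len(user.address))\n"
)
for gram, count in repeated_ngrams(sample).items():
    print(count, " ".join(gram))
```

A real version would want to ignore the particular variable names when matching, but even this crude one points at the repeated shape print(len(...)) that a helper function or macro could collapse.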
**Comparison**
The first person to write about these issues, as far as I know, was Fred
Brooks in the _Mythical Man Month_. He wrote that programmers seemed to
generate about the same amount of code per day regardless of the language.
When I first read this in my early twenties, it was a big surprise to me and
seemed to have huge implications. It meant that (a) the only way to get
software written faster was to use a more succinct language, and (b) someone
who took the trouble to do this could leave competitors who didn't in the
dust.
Brooks' hypothesis, if it's true, seems to be at the very heart of hacking. In
the years since, I've paid close attention to any evidence I could get on the
question, from formal studies to anecdotes about individual projects. I have
seen nothing to contradict him.
I have not yet seen evidence that seemed to me conclusive, and I don't expect
to. Studies like Lutz Prechelt's comparison of programming languages, while
generating the kind of results I expected, tend to use problems that are too
short to be meaningful tests. A better test of a language is what happens in
programs that take a month to write. And the only real test, if you believe as
I do that the main purpose of a language is to be good to think in (rather
than just to tell a computer what to do once you've thought of it) is what new
things you can write in it. So any language comparison where you have to meet
a predefined spec is testing slightly the wrong thing.
The true test of a language is how well you can discover and solve new
problems, not how well you can use it to solve a problem someone else has
already formulated. These two are quite different criteria. In art, mediums
like embroidery and mosaic work well if you know beforehand what you want to
make, but are absolutely lousy if you don't. When you want to discover the
image as you make it-- as you have to do with anything as complex as an image
of a person, for example-- you need to use a more fluid medium like pencil or
ink wash or oil paint. And indeed, the way tapestries and mosaics are made in
practice is to make a painting first, then copy it. (The word "cartoon" was
originally used to describe a painting intended for this purpose).
What this means is that we are never likely to have accurate comparisons of
the relative power of programming languages. We'll have precise comparisons,
but not accurate ones. In particular, explicit studies for the purpose of
comparing languages, because they will probably use small problems, and will
necessarily use predefined problems, will tend to underestimate the power of
the more powerful languages.
Reports from the field, though they will necessarily be less precise than
"scientific" studies, are likely to be more meaningful. For example, Ulf Wiger
of Ericsson did a [study](http://www.erlang.se/publications/Ulf_Wiger.pdf)
that concluded that Erlang was 4-10x more succinct than C++, and
proportionately faster to develop software in:
> Comparisons between Ericsson-internal development projects indicate similar
> line/hour productivity, including all phases of software development, rather
> independently of which language (Erlang, PLEX, C, C++, or Java) was used.
> What differentiates the different languages then becomes source code volume.
The study also deals explicitly with a point that was only implicit in Brooks'
book (since he measured lines of debugged code): programs written in more
powerful languages tend to have fewer bugs. That becomes an end in itself,
possibly more important than programmer productivity, in applications like
network switches.
**The Taste Test**
Ultimately, I think you have to go with your gut. What does it feel like to
program in the language? I think the way to find (or design) the best language
is to become hypersensitive to how well a language lets you think, then
choose/design the language that feels best. If some language feature is
awkward or restricting, don't worry, you'll know about it.
Such hypersensitivity will come at a cost. You'll find that you can't _stand_
programming in clumsy languages. I find it unbearably restrictive to program
in languages without macros, just as someone used to dynamic typing finds it
unbearably restrictive to have to go back to programming in a language where
you have to declare the type of every variable, and can't make a list of
objects of different types.
I'm not the only one. I know many Lisp hackers that this has happened to. In
fact, the most accurate measure of the relative power of programming languages
might be the percentage of people who know the language who will take any job
where they get to use that language, regardless of the application domain.
**Restrictiveness**
I think most hackers know what it means for a language to feel restrictive.
What's happening when you feel that? I think it's the same feeling you get
when the street you want to take is blocked off, and you have to take a long
detour to get where you wanted to go. There is something you want to say, and
the language won't let you.
What's really going on here, I think, is that a restrictive language is one
that isn't succinct enough. The problem is not simply that you can't say what
you planned to. It's that the detour the language makes you take is _longer._
Try this thought experiment. Suppose there were some program you wanted to
write, and the language wouldn't let you express it the way you planned to,
but instead forced you to write the program in some other way that was
_shorter._ For me at least, that wouldn't feel very restrictive. It would be
like the street you wanted to take being blocked off, and the policeman at the
intersection directing you to a shortcut instead of a detour. Great!
I think most (ninety percent?) of the feeling of restrictiveness comes from
being forced to make the program you write in the language longer than one you
have in your head. Restrictiveness is mostly lack of succinctness. So when a
language feels restrictive, what that (mostly) means is that it isn't succinct
enough, and when a language isn't succinct, it will feel restrictive.
**Readability**
The quote I began with mentions two other qualities, regularity and
readability. I'm not sure what regularity is, or what advantage, if any, code
that is regular and readable has over code that is merely readable. But I
think I know what is meant by readability, and I think it is also related to
succinctness.
We have to be careful here to distinguish between the readability of an
individual line of code and the readability of the whole program. It's the
second that matters. I agree that a line of Basic is likely to be more
readable than a line of Lisp. But a program written in Basic is going to
have more lines than the same program written in Lisp (especially once you
cross over into Greenspunland). The total effort of reading the Basic program
will surely be greater.
> total effort = effort per line x number of lines
I'm not as sure that readability is directly proportionate to succinctness as
I am that power is, but certainly succinctness is a factor (in the
mathematical sense; see equation above) in readability. So it may not even be
meaningful to say that the goal of a language is readability, not
succinctness; it could be like saying the goal was readability, not
readability.
What readability-per-line does mean, to the user encountering the language for
the first time, is that source code will _look unthreatening_. So readability-
per-line could be a good marketing decision, even if it is a bad design
decision. It's isomorphic to the very successful technique of letting people
pay in installments: instead of frightening them with a high upfront price,
you tell them the low monthly payment. Installment plans are a net lose for
the buyer, though, as mere readability-per-line probably is for the
programmer. The buyer is going to make a _lot_ of those low, low payments; and
the programmer is going to read a _lot_ of those individually readable lines.
This tradeoff predates programming languages. If you're used to reading novels
and newspaper articles, your first experience of reading a math paper can be
dismaying. It could take half an hour to read a single page. And yet, I am
pretty sure that the notation is not the problem, even though it may feel like
it is. The math paper is hard to read because the ideas are hard. If you
expressed the same ideas in prose (as mathematicians had to do before they
evolved succinct notations), they wouldn't be any easier to read, because the
paper would grow to the size of a book.
**To What Extent?**
A number of people have rejected the idea that succinctness = power. I think
it would be more useful, instead of simply arguing that they are the same or
aren't, to ask: to what _extent_ does succinctness = power? Because clearly
succinctness is a large part of what higher-level languages are for. If it is
not all they're for, then what else are they for, and how important,
relatively, are these other functions?
I'm not proposing this just to make the debate more civilized. I really want
to know the answer. When, if ever, is a language too succinct for its own
good?
The hypothesis I began with was that, except in pathological examples, I
thought succinctness could be considered identical with power. What I meant
was that in any language anyone would design, they would be identical, but
that if someone wanted to design a language explicitly to disprove this
hypothesis, they could probably do it. I'm not even sure of that, actually.
**Languages, not Programs**
We should be clear that we are talking about the succinctness of languages,
not of individual programs. It certainly is possible for individual programs
to be written too densely.
I wrote about this in [On Lisp](onlisp.html). A complex macro may have to save
many times its own length to be justified. If writing some hairy macro could
save you ten lines of code every time you use it, and the macro is itself ten
lines of code, then you get a net saving in lines if you use it more than
once. But that could still be a bad move, because macro definitions are harder
to read than ordinary code. You might have to use the macro ten or twenty
times before it yielded a net improvement in readability.
I'm sure every language has such tradeoffs (though I suspect the stakes get
higher as the language gets more powerful). Every programmer must have seen
code that some clever person has made marginally shorter by using dubious
programming tricks.
So there is no argument about that-- at least, not from me. Individual
programs can certainly be too succinct for their own good. The question is,
can a language be? Can a language compel programmers to write code that's
short (in elements) at the expense of overall readability?
One reason it's hard to imagine a language being too succinct is that if there
were some excessively compact way to phrase something, there would probably
also be a longer way. For example, if you felt Lisp programs using a lot of
macros or higher-order functions were too dense, you could, if you preferred,
write code that was isomorphic to Pascal. If you don't want to express
factorial in Arc as a call to a higher-order function

> (rec zero 1 * 1-)

you can also write out a recursive definition:

> (rfn fact (x) (if (zero x) 1 (* x (fact (1- x)))))

Though I can't off the top of my head think of any examples, I am interested
in the question of whether a language could be too succinct.
Are there languages that force you to write code in a way that is crabbed and
incomprehensible? If anyone has examples, I would be very interested to see
them.
(Reminder: What I'm looking for are programs that are very dense according to
the metric of "elements" sketched above, not merely programs that are short
because delimiters can be omitted and everything has a one-character name.)
**Want to start a startup?** Get funded by [Y
Combinator](http://ycombinator.com/apply.html).
August 2007
_(This is a talk I gave at the last Y Combinator dinner of the summer.
Usually we don't have a speaker at the last dinner; it's more of a party. But
it seemed worth spoiling the atmosphere if I could save some of the startups
from preventable deaths. So at the last minute I cooked up this rather grim
talk. I didn't mean this as an essay; I wrote it down because I only had two
hours before dinner and think fastest while writing.)_
A couple days ago I told a reporter that we expected about a third of the
companies we funded to succeed. Actually I was being conservative. I'm hoping
it might be as much as a half. Wouldn't it be amazing if we could achieve a
50% success rate?
Another way of saying that is that half of you are going to die. Phrased that
way, it doesn't sound good at all. In fact, it's kind of weird when you think
about it, because our definition of success is that the founders get rich. If
half the startups we fund succeed, then half of you are going to get rich and
the other half are going to get nothing.
If you can just avoid dying, you get rich. That sounds like a joke, but it's
actually a pretty good description of what happens in a typical startup. It
certainly describes what happened in Viaweb. We avoided dying till we got
rich.
It was really close, too. When we were visiting Yahoo to talk about being
acquired, we had to interrupt everything and borrow one of their conference
rooms to talk down an investor who was about to back out of a new funding
round we needed to stay alive. So even in the middle of getting rich we were
fighting off the grim reaper.
You may have heard that quote about luck consisting of opportunity meeting
preparation. You've now done the preparation. The work you've done so far has,
in effect, put you in a position to get lucky: you can now get rich by not
letting your company die. That's more than most people have. So let's talk
about how not to die.
We've done this five times now, and we've seen a bunch of startups die. About
10 of them so far. We don't know exactly what happens when they die, because
they generally don't die loudly and heroically. Mostly they crawl off
somewhere and die.
For us the main indication of impending doom is when we don't hear from you.
When we haven't heard from, or about, a startup for a couple months, that's a
bad sign. If we send them an email asking what's up, and they don't reply,
that's a really bad sign. So far that is a 100% accurate predictor of death.
Whereas if a startup regularly does new deals and releases and either sends us
mail or shows up at YC events, they're probably going to live.
I realize this will sound naive, but maybe the linkage works in both
directions. Maybe if you can arrange that we keep hearing from you, you won't
die.
That may not be so naive as it sounds. You've probably noticed that having
dinners every Tuesday with us and the other founders causes you to get more
done than you would otherwise, because every dinner is a mini Demo Day. Every
dinner is a kind of a deadline. So the mere constraint of staying in regular
contact with us will push you to make things happen, because otherwise you'll
be embarrassed to tell us that you haven't done anything new since the last
time we talked.
If this works, it would be an amazing hack. It would be pretty cool if merely
by staying in regular contact with us you could get rich. It sounds crazy, but
there's a good chance that would work.
A variant is to stay in touch with other YC-funded startups. There is now a
whole neighborhood of them in San Francisco. If you move there, the peer
pressure that made you work harder all summer will continue to operate.
When startups die, the official cause of death is always either running out of
money or a critical founder bailing. Often the two occur simultaneously. But I
think the underlying cause is usually that they've become demoralized. You
rarely hear of a startup that's working around the clock doing deals and
pumping out new features, and dies because they can't pay their bills and
their ISP unplugs their server.
Startups rarely die in mid keystroke. So keep typing!
If so many startups get demoralized and fail when merely by hanging on they
could get rich, you have to assume that running a startup can be demoralizing.
That is certainly true. I've been there, and that's why I've never done
another startup. The low points in a startup are just unbelievably low. I bet
even Google had moments where things seemed hopeless.
Knowing that should help. If you know it's going to feel terrible sometimes,
then when it feels terrible you won't think "ouch, this feels terrible, I give
up." It feels that way for everyone. And if you just hang on, things will
probably get better. The metaphor people use to describe the way a startup
feels is at least a roller coaster and not drowning. You don't just sink and
sink; there are ups after the downs.
Another feeling that seems alarming but is in fact normal in a startup is the
feeling that what you're doing isn't working. The reason you can expect to
feel this is that what you do probably won't work. Startups almost never get
it right the first time. Much more commonly you launch something, and no one
cares. Don't assume when this happens that you've failed. That's normal for
startups. But don't sit around doing nothing. Iterate.
I like Paul Buchheit's suggestion of trying to make something that at least
someone really loves. As long as you've made something that a few users are
ecstatic about, you're on the right track. It will be good for your morale to
have even a handful of users who really love you, and startups run on morale.
But also it will tell you what to focus on. What is it about you that they
love? Can you do more of that? Where can you find more people who love that
sort of thing? As long as you have some core of users who love you, all you
have to do is expand it. It may take a while, but as long as you keep plugging
away, you'll win in the end. Both Blogger and Delicious did that. Both took
years to succeed. But both began with a core of fanatically devoted users, and
all Evan and Joshua had to do was grow that core incrementally.
[Wufoo](http://wufoo.com) is on the same trajectory now.
So when you release something and it seems like no one cares, look more
closely. Are there zero users who really love you, or is there at least some
little group that does? It's quite possible there will be zero. In that case,
tweak your product and try again. Every one of you is working on a space that
contains at least one winning permutation somewhere in it. If you just keep
trying, you'll find it.
Let me mention some things not to do. The number one thing not to do is other
things. If you find yourself saying a sentence that ends with "but we're going
to keep working on the startup," you are in big trouble. Bob's going to grad
school, but we're going to keep working on the startup. We're moving back to
Minnesota, but we're going to keep working on the startup. We're taking on
some consulting projects, but we're going to keep working on the startup. You
may as well just translate these to "we're giving up on the startup, but we're
not willing to admit that to ourselves," because that's what it means most of
the time. A startup is so hard that working on it can't be preceded by "but."
In particular, don't go to graduate school, and don't start other projects.
Distraction is fatal to startups. Going to (or back to) school is a huge
predictor of death because in addition to the distraction it gives you
something to say you're doing. If you're only doing a startup, then if the
startup fails, you fail. If you're in grad school and your startup fails, you
can say later "Oh yeah, we had this startup on the side when I was in grad
school, but it didn't go anywhere."
You can't use euphemisms like "didn't go anywhere" for something that's your
only occupation. People won't let you.
One of the most interesting things we've discovered from working on Y
Combinator is that founders are more motivated by the fear of looking bad than
by the hope of getting millions of dollars. So if you want to get millions of
dollars, put yourself in a position where failure will be public and
humiliating.
When we first met the founders of [Octopart](http://octopart.com), they seemed
very smart, but not a great bet to succeed, because they didn't seem
especially committed. One of the two founders was still in grad school. It was
the usual story: he'd drop out if it looked like the startup was taking off.
Since then he has not only dropped out of grad school, but appeared full
length in [Newsweek](http://docs.octopart.com/newsweek_octopart_small.jpg)
with the word "Billionaire" printed across his chest. He just cannot fail now.
Everyone he knows has seen that picture. Girls who dissed him in high school
have seen it. His mom probably has it on the fridge. It would be unthinkably
humiliating to fail now. At this point he is committed to fight to the death.
I wish every startup we funded could appear in a Newsweek article describing
them as the next generation of billionaires, because then none of them would
be able to give up. The success rate would be 90%. I'm not kidding.
When we first knew the Octoparts they were lighthearted, cheery guys. Now when
we talk to them they seem grimly determined. The electronic parts distributors
are trying to squash them to keep their monopoly pricing. (If it strikes you
as odd that people still order electronic parts out of thick paper catalogs in
2007, there's a reason for that. The distributors want to prevent the
transparency that comes from having prices online.) I feel kind of bad that
we've transformed these guys from lighthearted to grimly determined. But that
comes with the territory. If a startup succeeds, you get millions of dollars,
and you don't get that kind of money just by asking for it. You have to assume
it takes some amount of pain.
And however tough things get for the Octoparts, I predict they'll succeed.
They may have to morph themselves into something totally different, but they
won't just crawl off and die. They're smart; they're working in a promising
field; and they just cannot give up.
All of you guys already have the first two. You're all smart and working on
promising ideas. Whether you end up among the living or the dead comes down to
the third ingredient, not giving up.
So I'll tell you now: bad shit is coming. It always is in a startup. The odds
of getting from launch to liquidity without some kind of disaster happening
are one in a thousand. So don't get demoralized. When the disaster strikes,
just say to yourself, ok, this was what Paul was talking about. What did he
say to do? Oh, yeah. Don't give up.
|
January 2016
One advantage of being old is that you can see change happen in your lifetime.
A lot of the change I've seen is fragmentation. US politics is much more
polarized than it used to be. Culturally we have ever less common ground. The
creative class flocks to a handful of happy cities, abandoning the rest. And
increasing economic inequality means the spread between rich and poor is
growing too. I'd like to propose a hypothesis: that all these trends are
instances of the same phenomenon. And moreover, that the cause is not some
force that's pulling us apart, but rather the erosion of forces that had been
pushing us together.
Worse still, for those who worry about these trends, the forces that were
pushing us together were an anomaly, a one-time combination of circumstances
that's unlikely to be repeated — and indeed, that we would not want to repeat.
The two forces were war (above all World War II), and the rise of large
corporations.
The effects of World War II were both economic and social. Economically, it
decreased variation in income. Like all modern armed forces, America's were
socialist economically. From each according to his ability, to each according
to his need. More or less. Higher ranking members of the military got more (as
higher ranking members of socialist societies always do), but what they got
was fixed according to their rank. And the flattening effect wasn't limited to
those under arms, because the US economy was conscripted too. Between 1942 and
1945 all wages were set by the National War Labor Board. Like the military,
they defaulted to flatness. And this national standardization of wages was so
pervasive that its effects could still be seen years after the war ended. [1]
Business owners weren't supposed to be making money either. FDR said "not a
single war millionaire" would be permitted. To ensure that, any increase in a
company's profits over prewar levels was taxed at 85%. And when what was left
after corporate taxes reached individuals, it was taxed again at a marginal
rate of 93%. [2]
Socially too the war tended to decrease variation. Over 16 million men and
women from all sorts of different backgrounds were brought together in a way
of life that was literally uniform. Service rates for men born in the early
1920s approached 80%. And working toward a common goal, often under stress,
brought them still closer together.
Though strictly speaking World War II lasted less than 4 years for the US, its
effects lasted longer. Wars make central governments more powerful, and World
War II was an extreme case of this. In the US, as in all the other Allied
countries, the federal government was slow to give up the new powers it had
acquired. Indeed, in some respects the war didn't end in 1945; the enemy just
switched to the Soviet Union. In tax rates, federal power, defense spending,
conscription, and nationalism, the decades after the war looked more like
wartime than prewar peacetime. [3] And the social effects lasted too. The kid
pulled into the army from behind a mule team in West Virginia didn't simply go
back to the farm afterward. Something else was waiting for him, something that
looked a lot like the army.
If total war was the big political story of the 20th century, the big economic
story was the rise of a new kind of company. And this too tended to produce
both social and economic cohesion. [4]
The 20th century was the century of the big, national corporation. General
Electric, General Foods, General Motors. Developments in finance,
communications, transportation, and manufacturing enabled a new type of
company whose goal was above all scale. Version 1 of this world was low-res: a
Duplo world of a few giant companies dominating each big market. [5]
The late 19th and early 20th centuries had been a time of consolidation, led
especially by J. P. Morgan. Thousands of companies run by their founders were
merged into a couple hundred giant ones run by professional managers.
Economies of scale ruled the day. It seemed to people at the time that this
was the final state of things. John D. Rockefeller said in 1880
> The day of combination is here to stay. Individualism has gone, never to
> return.
He turned out to be mistaken, but he seemed right for the next hundred years.
The consolidation that began in the late 19th century continued for most of
the 20th. By the end of World War II, as Michael Lind writes, "the major
sectors of the economy were either organized as government-backed cartels or
dominated by a few oligopolistic corporations."
For consumers this new world meant the same choices everywhere, but only a few
of them. When I grew up there were only 2 or 3 of most things, and since they
were all aiming at the middle of the market there wasn't much to differentiate
them.
One of the most important instances of this phenomenon was in TV. Here there
were 3 choices: NBC, CBS, and ABC. Plus public TV for eggheads and communists.
The programs that the 3 networks offered were indistinguishable. In fact, here
there was a triple pressure toward the center. If one show did try something
daring, local affiliates in conservative markets would make them stop. Plus
since TVs were expensive, whole families watched the same shows together, so
they had to be suitable for everyone.
And not only did everyone get the same thing, they got it at the same time.
It's difficult to imagine now, but every night tens of millions of families
would sit down together in front of their TV set watching the same show, at
the same time, as their next door neighbors. What happens now with the Super
Bowl used to happen every night. We were literally in sync. [6]
In a way mid-century TV culture was good. The view it gave of the world was
like you'd find in a children's book, and it probably had something of the
effect that (parents hope) children's books have in making people behave
better. But, like children's books, TV was also misleading. Dangerously
misleading, for adults. In his autobiography, Robert MacNeil talks of seeing
gruesome images that had just come in from Vietnam and thinking, we can't show
these to families while they're having dinner.
I know how pervasive the common culture was, because I tried to opt out of it,
and it was practically impossible to find alternatives. When I was 13 I
realized, more from internal evidence than any outside source, that the ideas
we were being fed on TV were crap, and I stopped watching it. [7] But it
wasn't just TV. It seemed like everything around me was crap. The politicians
all saying the same things, the consumer brands making almost identical
products with different labels stuck on to indicate how prestigious they were
meant to be, the balloon-frame houses with fake "colonial" skins, the cars
with several feet of gratuitous metal on each end that started to fall apart
after a couple years, the "red delicious" apples that were red but only
nominally apples. And in retrospect, it _was_ crap. [8]
But when I went looking for alternatives to fill this void, I found
practically nothing. There was no Internet then. The only place to look was in
the chain bookstore in our local shopping mall. [9] There I found a copy of
_The Atlantic_. I wish I could say it became a gateway into a wider world, but
in fact I found it boring and incomprehensible. Like a kid tasting whisky for
the first time and pretending to like it, I preserved that magazine as
carefully as if it had been a book. I'm sure I still have it somewhere. But
though it was evidence that there was, somewhere, a world that wasn't red
delicious, I didn't find it till college.
It wasn't just as consumers that the big companies made us similar. They did
as employers too. Within companies there were powerful forces pushing people
toward a single model of how to look and act. IBM was particularly notorious
for this, but they were only a little more extreme than other big companies.
And the models of how to look and act varied little between companies. Meaning
everyone within this world was expected to seem more or less the same. And not
just those in the corporate world, but also everyone who aspired to it — which
in the middle of the 20th century meant most people who weren't already in it.
For most of the 20th century, working-class people tried hard to look middle
class. You can see it in old photos. Few adults aspired to look dangerous in
1950.
But the rise of national corporations didn't just compress us culturally. It
compressed us economically too, and on both ends.
Along with giant national corporations, we got giant national labor unions.
And in the mid 20th century the corporations cut deals with the unions where
they paid over market price for labor. Partly because the unions were
monopolies. [10] Partly because, as components of oligopolies themselves, the
corporations knew they could safely pass the cost on to their customers,
because their competitors would have to as well. And partly because in mid-
century most of the giant companies were still focused on finding new ways to
milk economies of scale. Just as startups rightly pay AWS a premium over the
cost of running their own servers so they can focus on growth, many of the big
national corporations were willing to pay a premium for labor. [11]
As well as pushing incomes up from the bottom, by overpaying unions, the big
companies of the 20th century also pushed incomes down at the top, by
underpaying their top management. Economist J. K. Galbraith wrote in 1967 that
"There are few corporations in which it would be suggested that executive
salaries are at a maximum." [12]
To some extent this was an illusion. Much of the de facto pay of executives
never showed up on their income tax returns, because it took the form of
perks. The higher the rate of income tax, the more pressure there was to pay
employees upstream of it. (In the UK, where taxes were even higher than in the
US, companies would even pay their kids' private school tuitions.) One of the
most valuable things the big companies of the mid 20th century gave their
employees was job security, and this too didn't show up in tax returns or
income statistics. So the nature of employment in these organizations tended
to yield falsely low numbers about economic inequality. But even accounting
for that, the big companies paid their best people less than market price.
There was no market; the expectation was that you'd work for the same company
for decades if not your whole career. [13]
Your work was so illiquid there was little chance of getting market price. But
that same illiquidity also encouraged you not to seek it. If the company
promised to employ you till you retired and give you a pension afterward, you
didn't want to extract as much from it this year as you could. You needed to
take care of the company so it could take care of you. Especially when you'd
been working with the same group of people for decades. If you tried to
squeeze the company for more money, you were squeezing the organization that
was going to take care of _them_. Plus if you didn't put the company first you
wouldn't be promoted, and if you couldn't switch ladders, promotion on this
one was the only way up. [14]
To someone who'd spent several formative years in the armed forces, this
situation didn't seem as strange as it does to us now. From their point of
view, as big company executives, they were high-ranking officers. They got
paid a lot more than privates. They got to have expense account lunches at the
best restaurants and fly around on the company's Gulfstreams. It probably
didn't occur to most of them to ask if they were being paid market price.
The ultimate way to get market price is to work for yourself, by starting your
own company. That seems obvious to any ambitious person now. But in the mid
20th century it was an alien concept. Not because starting one's own company
seemed too ambitious, but because it didn't seem ambitious enough. Even as
late as the 1970s, when I grew up, the ambitious plan was to get lots of
education at prestigious institutions, and then join some other prestigious
institution and work one's way up the hierarchy. Your prestige was the
prestige of the institution you belonged to. People did start their own
businesses of course, but educated people rarely did, because in those days
there was practically zero concept of starting what we now call a
[_startup_](growth.html): a business that starts small and grows big. That was
much harder to do in the mid 20th century. Starting one's own business meant
starting a business that would start small and stay small. Which in those days
of big companies often meant scurrying around trying to avoid being trampled
by elephants. It was more prestigious to be one of the executive class riding
the elephant.
By the 1970s, no one stopped to wonder where the big prestigious companies had
come from in the first place. It seemed like they'd always been there, like
the chemical elements. And indeed, there was a double wall between ambitious
kids in the 20th century and the origins of the big companies. Many of the big
companies were roll-ups that didn't have clear founders. And when they did,
the founders didn't seem like us. Nearly all of them had been uneducated, in
the sense of not having been to college. They were what Shakespeare called
rude mechanicals. College trained one to be a member of the professional
classes. Its graduates didn't expect to do the sort of grubby menial work that
Andrew Carnegie or Henry Ford started out doing. [15]
And in the 20th century there were more and more college graduates. They
increased from about 2% of the population in 1900 to about 25% in 2000. In the
middle of the century our two big forces intersect, in the form of the GI
Bill, which sent 2.2 million World War II veterans to college. Few thought of
it in these terms, but the result of making college the canonical path for the
ambitious was a world in which it was socially acceptable to work for Henry
Ford, but not to be Henry Ford. [16]
I remember this world well. I came of age just as it was starting to break up.
In my childhood it was still dominant. Not quite so dominant as it had been.
We could see from old TV shows and yearbooks and the way adults acted that
people in the 1950s and 60s had been even more conformist than us. The mid-
century model was already starting to get old. But that was not how we saw it
at the time. We would at most have said that one could be a bit more daring in
1975 than 1965. And indeed, things hadn't changed much yet.
But change was coming soon. And when the Duplo economy started to
disintegrate, it disintegrated in several different ways at once. Vertically
integrated companies literally dis-integrated because it was more efficient
to. Incumbents faced new competitors as (a) markets went global and (b)
technical innovation started to trump economies of scale, turning size from an
asset into a liability. Smaller companies were increasingly able to survive as
formerly narrow channels to consumers broadened. Markets themselves started to
change faster, as whole new categories of products appeared. And last but not
least, the federal government, which had previously smiled upon J. P. Morgan's
world as the natural state of things, began to realize it wasn't the last word
after all.
What J. P. Morgan was to the horizontal axis, Henry Ford was to the vertical.
He wanted to do everything himself. The giant plant he built at River Rouge
between 1917 and 1928 literally took in iron ore at one end and sent cars out
the other. 100,000 people worked there. At the time it seemed the future. But
that is not how car companies operate today. Now much of the design and
manufacturing happens in a long supply chain, whose products the car companies
ultimately assemble and sell. The reason car companies operate this way is
that it works better. Each company in the supply chain focuses on what they
know best. And they each have to do it well or they can be swapped out for
another supplier.
Why didn't Henry Ford realize that networks of cooperating companies work
better than a single big company? One reason is that supplier networks take a
while to evolve. In 1917, doing everything himself seemed to Ford the only way
to get the scale he needed. And the second reason is that if you want to solve
a problem using a network of cooperating companies, you have to be able to
coordinate their efforts, and you can do that much better with computers.
Computers reduce the transaction costs that Coase argued are the raison d'etre
of corporations. That is a fundamental change.
In the early 20th century, big companies were synonymous with efficiency. In
the late 20th century they were synonymous with inefficiency. To some extent
this was because the companies themselves had become sclerotic. But it was
also because our standards were higher.
It wasn't just within existing industries that change occurred. The industries
themselves changed. It became possible to make lots of new things, and
sometimes the existing companies weren't the ones who did it best.
Microcomputers are a classic example. The market was pioneered by upstarts
like Apple. When it got big enough, IBM decided it was worth paying attention
to. At the time IBM completely dominated the computer industry. They assumed
that all they had to do, now that this market was ripe, was to reach out and
pick it. Most people at the time would have agreed with them. But what
happened next illustrated how much more complicated the world had become. IBM
did launch a microcomputer. Though quite successful, it did not crush Apple.
But even more importantly, IBM itself ended up being supplanted by a supplier
coming in from the side — from software, which didn't even seem to be the same
business. IBM's big mistake was to accept a non-exclusive license for DOS. It
must have seemed a safe move at the time. No other computer manufacturer had
ever been able to outsell them. What difference did it make if other
manufacturers could offer DOS too? The result of that miscalculation was an
explosion of inexpensive PC clones. Microsoft now owned the PC standard, and
the customer. And the microcomputer business ended up being Apple vs
Microsoft.
Basically, Apple bumped IBM and then Microsoft stole its wallet. That sort of
thing did not happen to big companies in mid-century. But it was going to
happen increasingly often in the future.
Change happened mostly by itself in the computer business. In other
industries, legal obstacles had to be removed first. Many of the mid-century
oligopolies had been anointed by the federal government with policies (and in
wartime, large orders) that kept out competitors. This didn't seem as dubious
to government officials at the time as it sounds to us. They felt a two-party
system ensured sufficient competition in politics. It ought to work for
business too.
Gradually the government realized that anti-competitive policies were doing
more harm than good, and during the Carter administration it started to remove
them. The word used for this process was misleadingly narrow: deregulation.
What was really happening was de-oligopolization. It happened to one industry
after another. Two of the most visible to consumers were air travel and long-
distance phone service, which both became dramatically cheaper after
deregulation.
Deregulation also contributed to the wave of hostile takeovers in the 1980s.
In the old days the only limit on the inefficiency of companies, short of
actual bankruptcy, was the inefficiency of their competitors. Now companies
had to face absolute rather than relative standards. Any public company that
didn't generate sufficient returns on its assets risked having its management
replaced with one that would. Often the new managers did this by breaking
companies up into components that were more valuable separately. [17]
Version 1 of the national economy consisted of a few big blocks whose
relationships were negotiated in back rooms by a handful of executives,
politicians, regulators, and labor leaders. Version 2 was higher resolution:
there were more companies, of more different sizes, making more different
things, and their relationships changed faster. In this world there were still
plenty of back room negotiations, but more was left to market forces. Which
further accelerated the fragmentation.
It's a little misleading to talk of versions when describing a gradual
process, but not as misleading as it might seem. There was a lot of change in
a few decades, and what we ended up with was qualitatively different. The
companies in the S&P 500 in 1958 had been there an average of 61 years. By
2012 that number was 18 years. [18]
The breakup of the Duplo economy happened simultaneously with the spread of
computing power. To what extent were computers a precondition? It would take a
book to answer that. Obviously the spread of computing power was a
precondition for the rise of startups. I suspect it was for most of what
happened in finance too. But was it a precondition for globalization or the
LBO wave? I don't know, but I wouldn't discount the possibility. It may be
that the refragmentation was driven by computers in the way the industrial
revolution was driven by steam engines. Whether or not computers were a
precondition, they have certainly accelerated it.
The new fluidity of companies changed people's relationships with their
employers. Why climb a corporate ladder that might be yanked out from under
you? Ambitious people started to think of a career less as climbing a single
ladder than as a series of jobs that might be at different companies. More
movement (or even potential movement) between companies introduced more
competition in salaries. Plus as companies became smaller it became easier to
estimate how much an employee contributed to the company's revenue. Both
changes drove salaries toward market price. And since people vary dramatically
in productivity, paying market price meant salaries started to diverge.
By no coincidence it was in the early 1980s that the term "yuppie" was coined.
That word is not much used now, because the phenomenon it describes is so
taken for granted, but at the time it was a label for something novel. Yuppies
were young professionals who made lots of money. To someone in their twenties
today, this wouldn't seem worth naming. Why wouldn't young professionals make
lots of money? But until the 1980s, being underpaid early in your career was
part of what it meant to be a professional. Young professionals were paying
their dues, working their way up the ladder. The rewards would come later.
What was novel about yuppies was that they wanted market price for the work
they were doing now.
The first yuppies did not work for startups. That was still in the future. Nor
did they work for big companies. They were professionals working in fields
like law, finance, and consulting. But their example rapidly inspired their
peers. Once they saw that new BMW 325i, they wanted one too.
Underpaying people at the beginning of their career only works if everyone
does it. Once some employer breaks ranks, everyone else has to, or they can't
get good people. And once started this process spreads through the whole
economy, because at the beginnings of people's careers they can easily switch
not merely employers but industries.
But not all young professionals benefitted. You had to produce to get paid a
lot. It was no coincidence that the first yuppies worked in fields where it
was easy to measure that.
More generally, an idea was returning whose name sounds old-fashioned
precisely because it was so rare for so long: that you could make your
fortune. As in the past there were multiple ways to do it. Some made their
fortunes by creating wealth, and others by playing zero-sum games. But once it
became possible to make one's fortune, the ambitious had to decide whether or
not to. A physicist who chose physics over Wall Street in 1990 was making a
sacrifice that a physicist in 1960 didn't have to think about.
The idea even flowed back into big companies. CEOs of big companies make more
now than they used to, and I think much of the reason is prestige. In 1960,
corporate CEOs had immense prestige. They were the winners of the only
economic game in town. But if they made as little now as they did then, in
real dollar terms, they'd seem like small fry compared to professional
athletes and whiz kids making millions from startups and hedge funds. They
don't like that idea, so now they try to get as much as they can, which is
more than they had been getting. [19]
Meanwhile a similar fragmentation was happening at the other end of the
economic scale. As big companies' oligopolies became less secure, they were
less able to pass costs on to customers and thus less willing to overpay for
labor. And as the Duplo world of a few big blocks fragmented into many
companies of different sizes — some of them overseas — it became harder for
unions to enforce their monopolies. As a result workers' wages also tended
toward market price. Which (inevitably, if unions had been doing their job)
tended to be lower. Perhaps dramatically so, if automation had decreased the
need for some kind of work.
And just as the mid-century model induced social as well as economic cohesion,
its breakup brought social as well as economic fragmentation. People started
to dress and act differently. Those who would later be called the "creative
class" became more mobile. People who didn't care much for religion felt less
pressure to go to church for appearances' sake, while those who liked it a lot
opted for increasingly colorful forms. Some switched from meat loaf to tofu,
and others to Hot Pockets. Some switched from driving Ford sedans to driving
small imported cars, and others to driving SUVs. Kids who went to private
schools or wished they did started to dress "preppy," and kids who wanted to
seem rebellious made a conscious effort to look disreputable. In a hundred
ways people spread apart. [20]
Almost four decades later, fragmentation is still increasing. Has it been net
good or bad? I don't know; the question may be unanswerable. Not entirely bad
though. We take for granted the forms of fragmentation we like, and worry only
about the ones we don't. But as someone who caught the tail end of mid-century
[_conformism_](https://books.google.com/ngrams/graph?content=well-
adjusted&year_start=1800&year_end=2000&corpus=15&smoothing=3), I can tell you
it was no utopia. [21]
My goal here is not to say whether fragmentation has been good or bad, just to
explain why it's happening. With the centripetal forces of total war and 20th
century oligopoly mostly gone, what will happen next? And more specifically,
is it possible to reverse some of the fragmentation we've seen?
If it is, it will have to happen piecemeal. You can't reproduce mid-century
cohesion the way it was originally produced. It would be insane to go to war
just to induce more national unity. And once you understand the degree to
which the economic history of the 20th century was a low-res version 1, it's
clear you can't reproduce that either.
20th century cohesion was something that happened at least in a sense
naturally. The war was due mostly to external forces, and the Duplo economy
was an evolutionary phase. If you want cohesion now, you'd have to induce it
deliberately. And it's not obvious how. I suspect the best we'll be able to do
is address the symptoms of fragmentation. But that may be enough.
The form of fragmentation people worry most about lately is [_economic
inequality_](ineq.html), and if you want to eliminate that you're up against a
truly formidable headwind that has been in operation since the stone age.
Technology.
Technology is a lever. It magnifies work. And the lever not only grows
increasingly long, but the rate at which it grows is itself increasing.
Which in turn means the variation in the amount of wealth people can create
has not only been increasing, but accelerating. The unusual conditions that
prevailed in the mid 20th century masked this underlying trend. The ambitious
had little choice but to join large organizations that made them march in step
with lots of other people — literally in the case of the armed forces,
figuratively in the case of big corporations. Even if the big corporations had
wanted to pay people proportionate to their value, they couldn't have figured
out how. But that constraint has gone now. Ever since it started to erode in
the 1970s, we've seen the underlying forces at work again. [22]
Not everyone who gets rich now does it by creating wealth, certainly. But a
significant number do, and the Baumol Effect means all their peers get dragged
along too. [23] And as long as it's possible to get rich by creating wealth,
the default tendency will be for economic inequality to increase. Even if you
eliminate all the other ways to get rich. You can mitigate this with subsidies
at the bottom and taxes at the top, but unless taxes are high enough to
discourage people from creating wealth, you're always going to be fighting a
losing battle against increasing variation in productivity. [24]
That form of fragmentation, like the others, is here to stay. Or rather, back
to stay. Nothing is forever, but the tendency toward fragmentation should be
more forever than most things, precisely because it's not due to any
particular cause. It's simply a reversion to the mean. When Rockefeller said
individualism was gone, he was right for a hundred years. It's back now, and
that's likely to be true for longer.
I worry that if we don't acknowledge this, we're headed for trouble. If we
think 20th century cohesion disappeared because of a few policy tweaks, we'll be
deluded into thinking we can get it back (minus the bad parts, somehow) with a
few countertweaks. And then we'll waste our time trying to eliminate
fragmentation, when we'd be better off thinking about how to mitigate its
consequences.
**Notes**
[1] Lester Thurow, writing in 1975, said the wage differentials prevailing at
the end of World War II had become so embedded that they "were regarded as
'just' even after the egalitarian pressures of World War II had disappeared.
Basically, the same differentials exist to this day, thirty years later." But
Goldin and Margo think market forces in the postwar period also helped
preserve the wartime compression of wages — specifically increased demand for
unskilled workers, and oversupply of educated ones.
(Oddly enough, the American custom of having employers pay for health
insurance derives from efforts by businesses to circumvent NWLB wage controls
in order to attract workers.)
[2] As always, tax rates don't tell the whole story. There were lots of
exemptions, especially for individuals. And in World War II the tax codes were
so new that the government had little acquired immunity to tax avoidance. If
the rich paid high taxes during the war it was more because they wanted to
than because they had to.
After the war, federal tax receipts as a percentage of GDP were about the same
as they are now. In fact, for the entire period since the war, tax receipts
have stayed close to 18% of GDP, despite dramatic changes in tax rates. The
lowest point occurred when marginal income tax rates were highest: 14.1% in
1950. Looking at the data, it's hard to avoid the conclusion that tax rates
have had little effect on what people actually paid.
[3] Though in fact the decade preceding the war had been a time of
unprecedented federal power, in response to the Depression. Which is not
entirely a coincidence, because the Depression was one of the causes of the
war. In many ways the New Deal was a sort of dress rehearsal for the measures
the federal government took during wartime. The wartime versions were much
more drastic and more pervasive though. As Anthony Badger wrote, "for many
Americans the decisive change in their experiences came not with the New Deal
but with World War II."
[4] I don't know enough about the origins of the world wars to say, but it's
not inconceivable they were connected to the rise of big corporations. If that
were the case, 20th century cohesion would have a single cause.
[5] More precisely, there was a bimodal economy consisting, in Galbraith's
words, of "the world of the technically dynamic, massively capitalized and
highly organized corporations on the one hand and the hundreds of thousands of
small and traditional proprietors on the other." Money, prestige, and power
were concentrated in the former, and there was near zero crossover.
[6] I wonder how much of the decline in families eating together was due to
the decline in families watching TV together afterward.
[7] I know when this happened because it was the season Dallas premiered.
Everyone else was talking about what was happening on Dallas, and I had no
idea what they meant.
[8] I didn't realize it till I started doing research for this essay, but the
meretriciousness of the products I grew up with is a well-known byproduct of
oligopoly. When companies can't compete on price, they compete on tailfins.
[9] Monroeville Mall was at the time of its completion in 1969 the largest in
the country. In the late 1970s the movie _Dawn of the Dead_ was shot there.
Apparently the mall was not just the location of the movie, but its
inspiration; the crowds of shoppers drifting through this huge mall reminded
George Romero of zombies. My first job was scooping ice cream in the Baskin-
Robbins.
[10] Labor unions were exempted from antitrust laws by the Clayton Antitrust
Act in 1914 on the grounds that a person's work is not "a commodity or article
of commerce." I wonder if that means service companies are also exempt.
[11] The relationships between unions and unionized companies can even be
symbiotic, because unions will exert political pressure to protect their
hosts. According to Michael Lind, when politicians tried to attack the A&P
supermarket chain because it was putting local grocery stores out of business,
"A&P successfully defended itself by allowing the unionization of its
workforce in 1938, thereby gaining organized labor as a constituency." I've
seen this phenomenon myself: hotel unions are responsible for more of the
political pressure against Airbnb than hotel companies.
[12] Galbraith was clearly puzzled that corporate executives would work so
hard to make money for other people (the shareholders) instead of themselves.
He devoted much of _The New Industrial State_ to trying to figure this out.
His theory was that professionalism had replaced money as a motive, and that
modern corporate executives were, like (good) scientists, motivated less by
financial rewards than by the desire to do good work and thereby earn the
respect of their peers. There is something in this, though I think lack of
movement between companies combined with self-interest explains much of
observed behavior.
[13] Galbraith (p. 94) says a 1952 study of the 800 highest paid executives at
300 big corporations found that three quarters of them had been with their
company for more than 20 years.
[14] It seems likely that in the first third of the 20th century executive
salaries were low partly because companies then were more dependent on banks,
who would have disapproved if executives got too much. This was certainly true
in the beginning. The first big company CEOs were J. P. Morgan's hired hands.
Companies didn't start to finance themselves with retained earnings till the
1920s. Till then they had to pay out their earnings in dividends, and so
depended on banks for capital for expansion. Bankers continued to sit on
corporate boards till the Glass-Steagall act in 1933.
By mid-century big companies funded 3/4 of their growth from earnings. But the
early years of bank dependence, reinforced by the financial controls of World
War II, must have had a big effect on social conventions about executive
salaries. So it may be that the lack of movement between companies was as much
the effect of low salaries as the cause.
Incidentally, the switch in the 1920s to financing growth with retained
earnings was one cause of the 1929 crash. The banks now had to find someone
else to lend to, so they made more margin loans.
[15] Even now it's hard to get them to. One of the things I find hardest to
get into the heads of would-be startup founders is how important it is to do
certain kinds of menial work early in the life of a company. Doing [_things
that don't scale_](ds.html) is to how Henry Ford got started as a high-fiber
diet is to the traditional peasant's diet: they had no choice but to do the
right thing, while we have to make a conscious effort.
[16] Founders weren't celebrated in the press when I was a kid. "Our founder"
meant a photograph of a severe-looking man with a walrus mustache and a wing
collar who had died decades ago. The thing to be when I was a kid was an
_executive_. If you weren't around then it's hard to grasp the cachet that
term had. The fancy version of everything was called the "executive" model.
[17] The wave of hostile takeovers in the 1980s was enabled by a combination
of circumstances: court decisions striking down state anti-takeover laws,
starting with the Supreme Court's 1982 decision in Edgar v. MITE Corp.; the
Reagan administration's comparatively sympathetic attitude toward takeovers;
the Depository Institutions Act of 1982, which allowed banks and savings and
loans to buy corporate bonds; a new SEC rule issued in 1982 (rule 415) that
made it possible to bring corporate bonds to market faster; the creation of
the junk bond business by Michael Milken; a vogue for conglomerates in the
preceding period that caused many companies to be combined that never should
have been; a decade of inflation that left many public companies trading below
the value of their assets; and not least, the increasing complacency of
managements.
[18] Foster, Richard. "Creative Destruction Whips through Corporate America."
Innosight, February 2012.
[19] CEOs of big companies may be overpaid. I don't know enough about big
companies to say. But it is certainly not impossible for a CEO to make 200x as
much difference to a company's revenues as the average employee. Look at what
Steve Jobs did for Apple when he came back as CEO. It would have been a good
deal for the board to give him 95% of the company. Apple's market cap the day
Steve came back in July 1997 was 1.73 billion. 5% of Apple now (January 2016)
would be worth about 30 billion. And it would not be if Steve hadn't come
back; Apple probably wouldn't even exist anymore.
Merely including Steve in the sample might be enough to answer the question of
whether public company CEOs in the aggregate are overpaid. And that is not as
facile a trick as it might seem, because the broader your holdings, the more
the aggregate is what you care about.
[20] The late 1960s were famous for social upheaval. But that was more
rebellion (which can happen in any era if people are provoked sufficiently)
than fragmentation. You're not seeing fragmentation unless you see people
breaking off to both left and right.
[21] Globally the trend has been in the other direction. While the US is
becoming more fragmented, the world as a whole is becoming less fragmented,
and mostly in good ways.
[22] There were a handful of ways to make a fortune in the mid 20th century.
The main one was drilling for oil, which was open to newcomers because it was
not something big companies could dominate through economies of scale. How did
individuals accumulate large fortunes in an era of such high taxes? Giant tax
loopholes defended by two of the most powerful men in Congress, Sam Rayburn
and Lyndon Johnson.
But becoming a Texas oilman was not in 1950 something one could aspire to the
way starting a startup or going to work on Wall Street were in 2000, because
(a) there was a strong local component and (b) success depended so much on
luck.
[23] The Baumol Effect induced by startups is very visible in Silicon Valley.
Google will pay people millions of dollars a year to keep them from leaving to
start or join startups.
[24] I'm not claiming variation in productivity is the only cause of economic
inequality in the US. But it's a significant cause, and it will become as big
a cause as it needs to, in the sense that if you ban other ways to get rich,
people who want to get rich will use this route instead.
**Thanks** to Sam Altman, Trevor Blackwell, Paul Buchheit, Patrick Collison,
Ron Conway, Chris Dixon, Benedict Evans, Richard Florida, Ben Horowitz,
Jessica Livingston, Robert Morris, Tim O'Reilly, Geoff Ralston, Max Roser,
Alexia Tsotsis, and Qasar Younis for reading drafts of this. Max also told me
about several valuable sources.
**Bibliography**
Allen, Frederick Lewis. _The Big Change_. Harper, 1952.
Averitt, Robert. _The Dual Economy_. Norton, 1968.
Badger, Anthony. _The New Deal_. Hill and Wang, 1989.
Bainbridge, John. _The Super-Americans_. Doubleday, 1961.
Beatty, Jack. _Colossus_. Broadway, 2001.
Brinkley, Douglas. _Wheels for the World_. Viking, 2003.
Brownlee, W. Elliot. _Federal Taxation in America_. Cambridge, 1996.
Chandler, Alfred. _The Visible Hand_. Harvard, 1977.
Chernow, Ron. _The House of Morgan_. Simon & Schuster, 1990.
Chernow, Ron. _Titan: The Life of John D. Rockefeller_. Random House, 1998.
Galbraith, John. _The New Industrial State_. Houghton Mifflin, 1967.
Goldin, Claudia and Robert A. Margo. "The Great Compression: The Wage
Structure in the United States at Mid-Century." NBER Working Paper 3817, 1991.
Gordon, John. _An Empire of Wealth_. HarperCollins, 2004.
Klein, Maury. _The Genesis of Industrial America, 1870-1920_. Cambridge, 2007.
Lind, Michael. _Land of Promise_. HarperCollins, 2012.
Micklethwait, John, and Adrian Wooldridge. _The Company_. Modern Library,
2003.
Nasaw, David. _Andrew Carnegie_. Penguin, 2006.
Sobel, Robert. _The Age of Giant Corporations_. Praeger, 1993.
Thurow, Lester. _Generating Inequality: Mechanisms of Distribution_. Basic
Books, 1975.
Witte, John. _The Politics and Development of the Federal Income Tax_.
Wisconsin, 1985.
|
September 2004
_(This essay is derived from an invited talk at ICFP 2004.)_
I had a front row seat for the Internet Bubble, because I worked at Yahoo
during 1998 and 1999. One day, when the stock was trading around $200, I sat
down and calculated what I thought the price should be. The answer I got was
$12. I went to the next cubicle and told my friend Trevor. "Twelve!" he said.
He tried to sound indignant, but he didn't quite manage it. He knew as well as
I did that our valuation was crazy.
Yahoo was a special case. It was not just our price to earnings ratio that was
bogus. Half our earnings were too. Not in the Enron way, of course. The
finance guys seemed scrupulous about reporting earnings. What made our
earnings bogus was that Yahoo was, in effect, the center of a Ponzi scheme.
Investors looked at Yahoo's earnings and said to themselves, here is proof
that Internet companies can make money. So they invested in new startups that
promised to be the next Yahoo. And as soon as these startups got the money,
what did they do with it? Buy millions of dollars worth of advertising on
Yahoo to promote their brand. Result: a capital investment in a startup this
quarter shows up as Yahoo earnings next quarter—stimulating another round of
investments in startups.
As in a Ponzi scheme, what seemed to be the returns of this system were simply
the latest round of investments in it. What made it not a Ponzi scheme was
that it was unintentional. At least, I think it was. The venture capital
business is pretty incestuous, and there were presumably people in a position,
if not to create this situation, to realize what was happening and to milk it.
A year later the game was up. Starting in January 2000, Yahoo's stock price
began to crash, ultimately losing 95% of its value.
Notice, though, that even with all the fat trimmed off its market cap, Yahoo
was still worth a lot. Even at the morning-after valuations of March and April
2001, the people at Yahoo had managed to create a company worth about $8
billion in just six years.
The fact is, despite all the nonsense we heard during the Bubble about the
"new economy," there was a core of truth. You need that to get a really big
bubble: you need to have something solid at the center, so that even smart
people are sucked in. (Isaac Newton and Jonathan Swift both lost money in the
South Sea Bubble of 1720.)
Now the pendulum has swung the other way. Now anything that became fashionable
during the Bubble is ipso facto unfashionable. But that's a mistake—an even
bigger mistake than believing what everyone was saying in 1999. Over the long
term, what the Bubble got right will be more important than what it got wrong.
**1\. Retail VC**
After the excesses of the Bubble, it's now considered dubious to take
companies public before they have earnings. But there is nothing intrinsically
wrong with that idea. Taking a company public at an early stage is simply
retail VC: instead of going to venture capital firms for the last round of
funding, you go to the public markets.
By the end of the Bubble, companies going public with no earnings were being
derided as "concept stocks," as if it were inherently stupid to invest in
them. But investing in concepts isn't stupid; it's what VCs do, and the best
of them are far from stupid.
The stock of a company that doesn't yet have earnings is worth _something._ It
may take a while for the market to learn how to value such companies, just as
it had to learn to value common stocks in the early 20th century. But markets
are good at solving that kind of problem. I wouldn't be surprised if the
market ultimately did a better job than VCs do now.
Going public early will not be the right plan for every company. And it can of
course be disruptive—by distracting the management, or by making the early
employees suddenly rich. But just as the market will learn how to value
startups, startups will learn how to minimize the damage of going public.
**2\. The Internet**
The Internet genuinely is a big deal. That was one reason even smart people
were fooled by the Bubble. Obviously it was going to have a huge effect.
Enough of an effect to triple the value of Nasdaq companies in two years? No,
as it turned out. But it was hard to say for certain at the time. [1]
The same thing happened during the Mississippi and South Sea Bubbles. What
drove them was the invention of organized public finance (the South Sea
Company, despite its name, was really a competitor of the Bank of England).
And that did turn out to be a big deal, in the long run.
Recognizing an important trend turns out to be easier than figuring out how to
profit from it. The mistake investors always seem to make is to take the trend
too literally. Since the Internet was the big new thing, investors supposed
that the more Internettish the company, the better. Hence such parodies as
Pets.com.
In fact most of the money to be made from big trends is made indirectly. It
was not the railroads themselves that made the most money during the railroad
boom, but the companies on either side, like Carnegie's steelworks, which made
the rails, and Standard Oil, which used railroads to get oil to the East
Coast, where it could be shipped to Europe.
I think the Internet will have great effects, and that what we've seen so far
is nothing compared to what's coming. But most of the winners will only
indirectly be Internet companies; for every Google there will be ten JetBlues.
**3\. Choices**
Why will the Internet have great effects? The general argument is that new
forms of communication always do. They happen rarely (till industrial times
there were just speech, writing, and printing), but when they do, they always
cause a big splash.
The specific argument, or one of them, is the Internet gives us more choices.
In the "old" economy, the high cost of presenting information to people meant
they had only a narrow range of options to choose from. The tiny, expensive
pipeline to consumers was tellingly named "the channel." Control the channel
and you could feed them what you wanted, on your terms. And it was not just
big corporations that depended on this principle. So, in their way, did labor
unions, the traditional news media, and the art and literary establishments.
Winning depended not on doing good work, but on gaining control of some
bottleneck.
There are signs that this is changing. Google has over 82 million unique users
a month and annual revenues of about three billion dollars. [2] And yet have
you ever seen a Google ad? Something is going on here.
Admittedly, Google is an extreme case. It's very easy for people to switch to
a new search engine. It costs little effort and no money to try a new one, and
it's easy to see if the results are better. And so Google doesn't _have_ to
advertise. In a business like theirs, being the best is enough.
The exciting thing about the Internet is that it's shifting everything in that
direction. The hard part, if you want to win by making the best stuff, is the
beginning. Eventually everyone will learn by word of mouth that you're the
best, but how do you survive to that point? And it is in this crucial stage
that the Internet has the most effect. First, the Internet lets anyone find
you at almost zero cost. Second, it dramatically speeds up the rate at which
reputation spreads by word of mouth. Together these mean that in many fields
the rule will be: Build it, and they will come. Make something great and put
it online. That is a big change from the recipe for winning in the past
century.
**4\. Youth**
The aspect of the Internet Bubble that the press seemed most taken with was
the youth of some of the startup founders. This too is a trend that will last.
There is a huge standard deviation among 26 year olds. Some are fit only for
entry level jobs, but others are ready to rule the world if they can find
someone to handle the paperwork for them.
A 26 year old may not be very good at managing people or dealing with the SEC.
Those require experience. But those are also commodities, which can be handed
off to some lieutenant. The most important quality in a CEO is his vision for
the company's future. What will they build next? And in that department, there
are 26 year olds who can compete with anyone.
In 1970 a company president meant someone in his fifties, at least. If he had
technologists working for him, they were treated like a racing stable: prized,
but not powerful. But as technology has grown more important, the power of
nerds has grown to reflect it. Now it's not enough for a CEO to have someone
smart he can ask about technical matters. Increasingly, he has to be that
person himself.
As always, business has clung to old forms. VCs still seem to want to install
a legitimate-looking talking head as the CEO. But increasingly the founders of
the company are the real powers, and the grey-headed man installed by the VCs
is more like a music group's manager than a general.
**5\. Informality**
In New York, the Bubble had dramatic consequences: suits went out of fashion.
They made one seem old. So in 1998 powerful New York types were suddenly
wearing open-necked shirts and khakis and oval wire-rimmed glasses, just like
guys in Santa Clara.
The pendulum has swung back a bit, driven in part by a panicked reaction by
the clothing industry. But I'm betting on the open-necked shirts. And this is
not as frivolous a question as it might seem. Clothes are important, as all
nerds can sense, though they may not realize it consciously.
If you're a nerd, you can understand how important clothes are by asking
yourself how you'd feel about a company that made you wear a suit and tie to
work. The idea sounds horrible, doesn't it? In fact, horrible far out of
proportion to the mere discomfort of wearing such clothes. A company that made
programmers wear suits would have something deeply wrong with it.
And what would be wrong would be that how one presented oneself counted more
than the quality of one's ideas. _That's_ the problem with formality. Dressing
up is not so much bad in itself. The problem is the receptor it binds to:
dressing up is inevitably a substitute for good ideas. It is no coincidence
that technically inept business types are known as "suits."
Nerds don't just happen to dress informally. They do it too consistently.
Consciously or not, they dress informally as a prophylactic measure against
stupidity.
**6\. Nerds**
Clothing is only the most visible battleground in the war against formality.
Nerds tend to eschew formality of any sort. They're not impressed by one's job
title, for example, or any of the other appurtenances of authority.
Indeed, that's practically the definition of a nerd. I found myself talking
recently to someone from Hollywood who was planning a show about nerds. I
thought it would be useful if I explained what a nerd was. What I came up with
was: someone who doesn't expend any effort on marketing himself.
A nerd, in other words, is someone who concentrates on substance. So what's
the connection between nerds and technology? Roughly that you can't fool
mother nature. In technical matters, you have to get the right answers. If
your software miscalculates the path of a space probe, you can't finesse your
way out of trouble by saying that your code is patriotic, or avant-garde, or
any of the other dodges people use in nontechnical fields.
And as technology becomes increasingly important in the economy, nerd culture
is [rising](nerdad.html) with it. Nerds are already a lot cooler than they
were when I was a kid. When I was in college in the mid-1980s, "nerd" was
still an insult. People who majored in computer science generally tried to
conceal it. Now women ask me where they can meet nerds. (The answer that
springs to mind is "Usenix," but that would be like drinking from a firehose.)
I have no illusions about why nerd culture is becoming more accepted. It's not
because people are realizing that substance is more important than marketing.
It's because the nerds are getting rich. But that is not going to change.
**7\. Options**
What makes the nerds rich, usually, is stock options. Now there are moves
afoot to make it harder for companies to grant options. To the extent there's
some genuine accounting abuse going on, by all means correct it. But don't
kill the golden goose. Equity is the fuel that drives technical innovation.
Options are a good idea because (a) they're fair, and (b) they work. Someone
who goes to work for a company is (one hopes) adding to its value, and it's
only fair to give them a share of it. And as a purely practical measure,
people work a _lot_ harder when they have options. I've seen that first hand.
The fact that a few crooks during the Bubble robbed their companies by
granting themselves options doesn't mean options are a bad idea. During the
railroad boom, some executives enriched themselves by selling watered stock—by
issuing more shares than they said were outstanding. But that doesn't make
common stock a bad idea. Crooks just use whatever means are available.
If there is a problem with options, it's that they reward slightly the wrong
thing. Not surprisingly, people do what you pay them to. If you pay them by
the hour, they'll work a lot of hours. If you pay them by the volume of work
done, they'll get a lot of work done (but only as you defined work). And if
you pay them to raise the stock price, which is what options amount to,
they'll raise the stock price.
But that's not quite what you want. What you want is to increase the actual
value of the company, not its market cap. Over time the two inevitably meet,
but not always as quickly as options vest. Which means options tempt
employees, if only unconsciously, to "pump and dump"—to do things that will
make the company _seem_ valuable. I found that when I was at Yahoo, I couldn't
help thinking, "how will this sound to investors?" when I should have been
thinking "is this a good idea?"
So maybe the standard option deal needs to be tweaked slightly. Maybe options
should be replaced with something tied more directly to earnings. It's still
early days.
**8\. Startups**
What made the options valuable, for the most part, is that they were options
on the stock of [startups](start.html). Startups were not of course a creation
of the Bubble, but they were more visible during the Bubble than ever before.
One thing most people did learn about for the first time during the Bubble was
the startup created with the intention of selling it. Originally a startup
meant a small company that hoped to grow into a big one. But increasingly
startups are evolving into a vehicle for developing technology on spec.
As I wrote in [Hackers & Painters](hackpaint.html), employees seem to be most
productive when they're paid in proportion to the wealth they generate. And
the advantage of a startup—indeed, almost its raison d'être—is that it offers
something otherwise impossible to obtain: a way of _measuring_ that.
In many businesses, it just makes more sense for companies to get technology
by buying startups rather than developing it in house. You pay more, but there
is less risk, and risk is what big companies don't want. It makes the guys
developing the technology more accountable, because they only get paid if they
build the winner. And you end up with better technology, created faster,
because things are made in the innovative atmosphere of startups instead of
the bureaucratic atmosphere of big companies.
Our startup, Viaweb, was built to be sold. We were open with investors about
that from the start. And we were careful to create something that could slot
easily into a larger company. That is the pattern for the future.
**9\. California**
The Bubble was a California phenomenon. When I showed up in Silicon Valley in
1998, I felt like an immigrant from Eastern Europe arriving in America in
1900. Everyone was so cheerful and healthy and rich. It seemed a new and
improved world.
The press, ever eager to exaggerate small trends, now gives one the impression
that Silicon Valley is a ghost town. Not at all. When I drive down 101 from
the airport, I still feel a buzz of energy, as if there were a giant
transformer nearby. Real estate is still more expensive than just about
anywhere else in the country. The people still look healthy, and the weather
is still fabulous. The future is there. (I say "there" because I moved back to
the East Coast after Yahoo. I still wonder if this was a smart idea.)
What makes the Bay Area superior is the attitude of the people. I notice that
when I come home to Boston. The first thing I see when I walk out of the
airline terminal is the fat, grumpy guy in charge of the taxi line. I brace
myself for rudeness: _remember, you're back on the East Coast now._
The atmosphere varies from city to city, and fragile organisms like startups
are exceedingly sensitive to such variation. If it hadn't already been
hijacked as a new euphemism for liberal, the word to describe the atmosphere
in the Bay Area would be "progressive." People there are trying to build the
future. Boston has MIT and Harvard, but it also has a lot of truculent,
unionized employees like the police who recently held the Democratic National
Convention for
[ransom](http://www.usatoday.com/news/politicselections/nation/president/2004-04-30-boston-police-convention_x.htm), and a lot of people trying to be Thurston Howell.
Two sides of an obsolete coin.
Silicon Valley may not be the next Paris or London, but it is at least the
next Chicago. For the next fifty years, that's where new wealth will come
from.
**10\. Productivity**
During the Bubble, optimistic analysts used to justify high price to earnings
ratios by saying that technology was going to increase productivity
dramatically. They were wrong about the specific companies, but not so wrong
about the underlying principle. I think one of the big trends we'll see in the
coming century is a huge increase in productivity.
Or more precisely, a huge increase in [variation](gh.html) in productivity.
Technology is a lever. It doesn't add; it multiplies. If the present range of
productivity is 0 to 100, introducing a multiple of 10 increases the range
from 0 to 1000.
One upshot of which is that the companies of the future may be surprisingly
small. I sometimes daydream about how big you could grow a company (in
revenues) without ever having more than ten people. What would happen if you
outsourced everything except product development? If you tried this
experiment, I think you'd be surprised at how far you could get. As Fred
Brooks pointed out, small groups are intrinsically more productive, because
the internal friction in a group grows as the square of its size.
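To make that arithmetic concrete: if you take pairwise communication channels as a rough proxy for friction—my gloss on Brooks's point, not his exact formulation—a few lines of Python show how fast it grows with headcount.

```python
# Sketch: count pairwise communication channels as a stand-in for
# internal friction. Each pair of people is a potential channel,
# so a group of n has n * (n - 1) / 2 of them.
def channels(n):
    return n * (n - 1) // 2

for size in [2, 5, 10, 50, 100]:
    print(f"{size:4d} people -> {channels(size):5d} channels")
```

A team of ten has 45 channels to keep clear; a company of a hundred has nearly 5,000.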
Till quite recently, running a major company meant managing an army of
workers. Our standards about how many employees a company should have are
still influenced by old patterns. Startups are perforce small, because they
can't afford to hire a lot of people. But I think it's a big mistake for
companies to loosen their belts as revenues increase. The question is not
whether you can afford the extra salaries. Can you afford the loss in
productivity that comes from making the company bigger?
The prospect of technological leverage will of course raise the specter of
unemployment. I'm surprised people still worry about this. After centuries of
supposedly job-killing innovations, the number of jobs is within ten percent
of the number of people who want them. This can't be a coincidence. There must
be some kind of balancing mechanism.
**What's New**
When one looks over these trends, is there any overall theme? There does seem
to be: that in the coming century, good ideas will count for more. That 26
year olds with good ideas will increasingly have an edge over 50 year olds
with powerful connections. That doing good work will matter more than dressing
up—or advertising, which is the same thing for companies. That people will be
rewarded a bit more in proportion to the value of what they create.
If so, this is good news indeed. Good ideas always tend to win eventually. The
problem is, it can take a very long time. It took decades for relativity to be
accepted, and the greater part of a century to establish that central planning
didn't work. So even a small increase in the rate at which good ideas win
would be a momentous change—big enough, probably, to justify a name like the
"new economy."
**Notes**
[1] Actually it's hard to say now. As Jeremy Siegel points out, if the value
of a stock is its future earnings, you can't tell if it was overvalued till
you see what the earnings turn out to be. While certain famous Internet stocks
were almost certainly overvalued in 1999, it is still hard to say for sure
whether, e.g., the Nasdaq index was.
Siegel, Jeremy J. "What Is an Asset Price Bubble? An Operational Definition."
_European Financial Management,_ 9:1, 2003.
[2] The number of users comes from a 6/03 Nielsen study quoted on Google's
site. (You'd think they'd have something more recent.) The revenue estimate is
based on revenues of $1.35 billion for the first half of 2004, as reported in
their IPO filing.
**Thanks** to Chris Anderson, Trevor Blackwell, Sarah Harlin, Jessica
Livingston, and Robert Morris for reading drafts of this.
March 2009
A couple days ago I finally got being a good startup founder down to two
words: relentlessly resourceful.
Till then the best I'd managed was to get the opposite quality down to one:
hapless. Most dictionaries say hapless means unlucky. But the dictionaries are
not doing a very good job. A team that outplays its opponents but loses
because of a bad decision by the referee could be called unlucky, but not
hapless. Hapless implies passivity. To be hapless is to be battered by
circumstances — to let the world have its way with you, instead of having your
way with the world. [1]
Unfortunately there's no antonym of hapless, which makes it difficult to tell
founders what to aim for. "Don't be hapless" is not much of a rallying cry.
It's not hard to express the quality we're looking for in metaphors. The best
is probably a running back. A good running back is not merely determined, but
flexible as well. They want to get downfield, but they adapt their plans on
the fly.
Unfortunately this is just a metaphor, and not a useful one to most people
outside the US. "Be like a running back" is no better than "Don't be hapless."
But finally I've figured out how to express this quality directly. I was
writing a talk for [investors](angelinvesting.html), and I had to explain what
to look for in founders. What would someone who was the opposite of hapless be
like? They'd be relentlessly resourceful. Not merely relentless. That's not
enough to make things go your way except in a few mostly uninteresting
domains. In any interesting domain, the difficulties will be novel. Which
means you can't simply plow through them, because you don't know initially how
hard they are; you don't know whether you're about to plow through a block of
foam or granite. So you have to be resourceful. You have to keep trying new
things.
Be relentlessly resourceful.
That sounds right, but is it simply a description of how to be successful in
general? I don't think so. This isn't the recipe for success in writing or
painting, for example. In that kind of work the recipe is more to be actively
curious. Resourceful implies the obstacles are external, which they generally
are in startups. But in writing and painting they're mostly internal; the
obstacle is your own obtuseness. [2]
There probably are other fields where "relentlessly resourceful" is the recipe
for success. But though other fields may share it, I think this is the best
short description we'll find of what makes a good startup founder. I doubt it
could be made more precise.
Now that we know what we're looking for, that leads to other questions. For
example, can this quality be taught? After four years of trying to teach it to
people, I'd say that yes, surprisingly often it can. Not to everyone, but to
many people. [3] Some people are just constitutionally passive, but others
have a latent ability to be relentlessly resourceful that only needs to be
brought out.
This is particularly true of young people who have till now always been under
the thumb of some kind of authority. Being relentlessly resourceful is
definitely not the recipe for success in big companies, or in most schools. I
don't even want to think what the recipe is in big companies, but it is
certainly longer and messier, involving some combination of resourcefulness,
obedience, and building alliances.
Identifying this quality also brings us closer to answering a question people
often wonder about: how many startups there could be. There is not, as some
people seem to think, any economic upper bound on this number. There's no
reason to believe there is any limit on the amount of newly created wealth
consumers can absorb, any more than there is a limit on the number of theorems
that can be proven. So probably the limiting factor on the number of startups
is the pool of potential founders. Some people would make good founders, and
others wouldn't. And now that we can say what makes a good founder, we know
how to put an upper bound on the size of the pool.
This test is also useful to individuals. If you want to know whether you're
the right sort of person to start a startup, ask yourself whether you're
relentlessly resourceful. And if you want to know whether to recruit someone
as a cofounder, ask if they are.
You can even use it tactically. If I were running a startup, this would be the
phrase I'd tape to the mirror. "Make something people want" is the
destination, but "Be relentlessly resourceful" is how you get there.
**Notes**
[1] I think the reason the dictionaries are wrong is that the meaning of the
word has shifted. No one writing a dictionary from scratch today would say
that hapless meant unlucky. But a couple hundred years ago they might have.
People were more at the mercy of circumstances in the past, and as a result a
lot of the words we use for good and bad outcomes have origins in words about
luck.
When I was living in Italy, I was once trying to tell someone that I hadn't
had much success in doing something, but I couldn't think of the Italian word
for success. I spent some time trying to describe the word I meant. Finally
she said "Ah! Fortuna!"
[2] There are aspects of startups where the recipe is to be actively curious.
There can be times when what you're doing is almost pure discovery.
Unfortunately these times are a small proportion of the whole. On the other
hand, they are in research too.
[3] I'd almost say to most people, but I realize (a) I have no idea what most
people are like, and (b) I'm pathologically optimistic about people's ability
to change.
**Thanks** to Trevor Blackwell and Jessica Livingston for reading drafts of
this.