David Epstein’s book Range is about an interesting idea that runs contrary to much of the recent received wisdom on how to gain expertise, and on the consequences of doing so. It’s written in that comfortable, well-researched style that American reporters learn when writing long-form feature articles. It’s also about twice as long as it needed to be, although I suspect that is a result of American publishing economics.

(Photo: Andrew Curry. CC BY-NC-SA 4.0)

Basically, the book starts from two premises, and then works through these in a number of different domains, sometimes via extended profiles of relevant individuals.

Wicked versus kind

The first premise is that there are two types of learning environments. In “kind” learning environments, “patterns repeat over and over, and feedback is extremely accurate and usually very rapid.” (20-21). In “wicked” learning environments,

the rules of the game are often unclear or incomplete, there may or may not be repetitive patterns and they may not be obvious, and feedback is often delayed, inaccurate, or both. (21)

The second is that there are, broadly, two types of researcher: those with breadth, called variously foxes, birds, or T-shaped people, depending on whose model is being referenced; and those with depth, called hedgehogs, frogs, or I-shaped people. Later on, a hybrid shape gets mentioned: someone with breadth but one deep pillar of expertise. (Maybe they are ‘waders’ in the birds-and-frogs model.)

Putting these two together, generalists—people with range—tend to do better in the more complex “wicked” environments.

Literally the day after I’d finished the book, I was pointed to a long new article in Nature that made pretty much exactly the same case. It argued that one of the reasons that the rate of disruptive research was falling was because there was too much depth being funded, and not enough breadth.

Careers

Once you get to the base camp established by these two initial arguments, there are interesting implications for career decisions, and for problem-solving and innovation.

Let me start with the career implications. In a nutshell, the notion of 10,000 hours of practice, and the obsessive ‘Tiger Mum’ approach popularised by Amy Chua, are right only in a small number of areas (chess, certainly; golf, probably) where there is a limited range of patterns to learn.

It’s probably not even true of music, and it doesn’t seem to be true of science. Moving between disciplines doesn’t seem to carry a career penalty or a salary penalty. Epstein marshals the evidence here fairly carefully, and suggests that people have a ‘sunk cost’ fallacy about prior study which is reinforced by universities, institutions that value depth over breadth. Hiring processes, similarly, especially when partially automated, tend to screen in candidates with depth and screen out those with breadth.

Analogies

One of the reasons why inter-disciplinary thinking succeeds better in solving problems is that inter-disciplinary teams are able to draw on a wider range of analogies. Kepler and Darwin get referenced for the ways in which they used analogies when they got stuck on a research problem.

Epstein discusses the research of the psychologist Kevin Dunbar, who set out to understand how productive labs work by following the work of four different labs over the course of a year:

the labs most likely to turn unexpected findings into new knowledge for humanity made a lot of different analogies, and made them from a variety of base domains… The more unusual the challenge, the more distant the analogies, moving away from surface similarities toward deep structural similarities. (118).

Strikingly, two labs were confronted by the same problem during his research. The one where the researchers had very similar expertise and backgrounds got stuck on it; the other, with more breadth, solved it. Dunbar wasn’t able to help: one of the conditions of access was that he wasn’t allowed to say what the other labs were up to.

Cultural incongruity

Culturally, innovative organisations need to be able to balance cultural congruity (a degree of conformity) with a level of cultural incongruity. From time to time, you need to get your coat caught on the nail that sticks out, rather than hammering it down. There’s a long section on NASA and the Challenger disaster (and an intriguing related Harvard Business School case study) that explores what happens when conformity becomes over-weighted in the culture. [1]

It’s hard to like Wernher von Braun, but it is clear from Epstein’s account that von Braun’s open style of sharing engineering information within NASA served the organisation better than the more conventional upward-reporting management style introduced by his successor, William Lucas.

Along the way, of course, Tetlock’s super-forecasters get a namecheck.

Population Bomb

In some of the discussion in Range Epstein is fairly nuanced. He spends time, for example, on the dispute between Paul Ehrlich, the author of The Population Bomb, and the economist Julian Simon, who thought that innovation would prevent Ehrlich’s bleak Malthusian world from happening. (So far it has.)

This was, he suggests, a sterile discussion between two hedgehogs unable to see beyond their own disciplines:

Ehrlich was wrong about population (and the apocalypse), but right on aspects of environmental degradation. Simon was right about the influence of human ingenuity on the food and energy supply, but wrong in claiming that improvements in air and water quality also vindicated his predictions… As each man amassed more information for his own view, each became more dogmatic. (218)

He also dives into the briefly popular notion of ‘grit’, developed by Angela Duckworth. He acknowledges that he’s attracted to the idea because of his own career as a university athlete, where grit helped him overcome the limits of his natural talent.

But he concludes that the ‘grit’ questionnaire doesn’t help. It values persistence over everything, whereas in practice it’s more valuable to know when you have lost your ‘match’ with your environment, and can therefore decide to move on.

Late starts

I lost patience with him a bit while reading a disguised eight-page version of van Gogh’s life and work during his 20s, when he was an unsuccessful picture dealer and had a difficult period as a pastor.

Gauguin, who didn’t take up painting until he was 35, gets a call-out around here as well.

At the end of all of this—and it does go on—he pronounces with a flourish that

they aren’t exceptions by virtue of their late starts, and those late starts did not stack the odds against them. Their late starts were integral to their eventual success. (128)

But, but, but: Picasso, Monet, Matisse, Turner, Malevich, Kandinsky, even Cézanne (whose family required him to study law for two years) all contributed as significantly to innovation in modern art, while having much more conventional artistic training.

Hedgehogs

I had the same sense of cherry-picking in a subsequent chapter about the persistence of Jill Viles, who suffered from a gene mutation, in alerting scientists to a better way of understanding the lamin gene. Her persistence is remarkable, even gritty. Sure, but again, it’s all post hoc: for every Jill Viles there are 10,000 amateur medical researchers who are just plain wrong.

This isn’t to say that hedgehogs aren’t necessary. One of Tetlock’s super-forecasters says she mines the domain experts for facts, but stays away from their opinions. Einstein was a hedgehog, which probably helped him to formulate the Theory of Relativity in the first place, but may have limited his later research:

God does not play dice with the universe, Einstein asserted, figuratively. Niels Bohr, his contemporary who illuminated the structure of atoms (using analogies to Saturn and the solar system), replied that Einstein should keep an open mind and not tell God how to run the universe. (229)

Learning from mistakes

One of the interesting conclusions here is about learning. In Tetlock’s super-forecaster study, he found that both foxes and hedgehogs were quick to update their beliefs after a successful prediction, reinforcing them. But:

When an outcome took them by surprise, foxes were much more likely to adjust their ideas. Hedgehogs barely budged. Some hedgehogs made authoritative predictions that turned out wildly wrong, and then updated their theories in the wrong direction. They became even more convinced of the original beliefs that led them astray. (231)

Put like this, of course, it sounds very like Leon Festinger’s work on cognitive dissonance.

Dropping your tools

What should you do if you find yourself in a wicked environment where you are, as it were, suddenly out of your depth? Epstein has a chapter called ‘Learning to Drop Your Familiar Tools’, which is both a factual description and a metaphor. It draws on the work of the organisational psychologist Karl Weick on smoke-jumpers, the specialists who fight forest fires. When situations got out of hand, often because a fire jumped a fire break, the people who survived were more likely to be the ones who, literally, dropped their tools (their backpacks or chainsaws) to give themselves a better chance of outrunning the fire. But obviously there’s an analogy here as well.

A tweet in response to the Nature paper referenced above suggests eight ways to improve diversity in research thinking, some of which are also mentioned by Epstein. They are captured in a slightly enigmatic graphic.

There’s clearly a significant policy implication here. If we need more breadth and less depth—which is also the clear implication of the Nature article referenced above—we need universities and research institutes to value inter-disciplinary expertise as well as deep disciplinary knowledge. Reading this part of the book, I realised why futures work—definitely a realm of foxes or birds and not hedgehogs or frogs—has struggled to establish itself in academe.

Withered technology

One of my favourite sections of the book was about the Nintendo engineer Gunpei Yokoi, who transformed Nintendo’s business. Yokoi was by his own admission not a very good engineer, but he was a curious tinkerer. His operating principle for Nintendo’s innovation was “lateral thinking with withered technology”, meaning the technologies that were now so familiar that competitors were moving on from them. The Game Boy, which sold 118 million units, was the highlight of this approach:

From a technological standpoint, even in 1989, the Game Boy was laughable. It cut every corner… What its withered technology lacked, the Game Boy made up for in user experience. It was cheap. It could fit in a large pocket. It was indestructible. (196-197)

If there’s a paragraph that summarises the overall conclusion of Range, it comes about two-thirds of the way through the book:

Facing uncertain environments and wicked problems, breadth of experience is invaluable. Facing kind problems, narrow specialization can be remarkably efficient. The problem is that we often expect the hyperspecialist, because of their expertise in a narrow area, to be able to magically extend their skill to wicked problems. The results can be disastrous. (213)

Range is, in short, full of good things. It covers a lot of ground and a lot of research. All the same, it’s still about twice as long as it needs to be. My tip is to skip-read the extended narrative sections.

——

[1] For me, Epstein spends a lot of time on the data discussions around the O-rings and not enough time on the power relations that were playing out in the room during the pre-launch review meeting.