I first came across the work of Stafford Beer in the 1990s, initially around the edges of some reading about the Allende government in Chile, where he and Fernando Flores tried to build a cybernetic information system for the government. And then, because these things come in pairs, I was able to commission a workshop on behalf of a UK government department (in 1998) that used Beer’s Syntegration model as a way to get an expert group to make some decisions about the future of digital media.

(Photo: Andrew Curry, CC BY-NC-SA 4.0)

Once you’re exposed to these ideas, they become a bit infectious, and it wasn’t long until I had worked through a large chunk of his library, even if much of the mathematics that sits behind cybernetic thinking is beyond me.

There were two elements of this that I found engaging. The first is the way information and feedback connect the organisation to its external environment, which builds on Ross Ashby’s earlier work on requisite variety (more on that in a moment). The second is Stafford Beer’s keen interest in designing decision-making systems that reduce, as far as possible, the power differentials in a group process.

Systems and their environments

From a futures point of view, this was of immediate interest. Much of the rationale of futures work is based on a systems idea about the relationship between the organisation and its environment, which is why we use horizon scanning as a core practice. This is summarised by Merrelyn Emery and Tom Devane in a quote I use in presentations quite a lot:

For a system to be viable over time, it must:

  • “constantly scan the relevant environment for changes that might affect its viability”
  • “actively adapt to new information it receives in such a way that it also influences those environments”

From a political point of view, understanding how to try to flatten power differentials was quite exciting too.

But it was also clear that by the 2000s cybernetics had disappeared into a niche, whereas in the 1950s and 1960s, when Ashby and Beer were prominent members of the Cybernetics Society, it had been an influential intellectual current. This was a puzzle that I never really resolved, since by the 2000s the world had moved into a condition where cybernetics could, and should, really have helped.

A new audience

This is a long way in to Henry Farrell’s blog review of Dan Davies’ recent book The Unaccountability Machine. The book seeks to reintroduce cybernetics to a new audience by re-explaining Stafford Beer’s thinking. (I should probably add that Farrell is very clear that he and Davies are mates who used to blog together.)

It’s a long article, and I’ll just manage a few highlights here. But it does have a view on why cybernetics vanished just at the point when it might have become really useful:

it turned out to be much easier to make progress on the information and technology side—all of the fantastic things that could be done with vacuum tubes and then semiconductors—than on the more difficult challenge of understanding the complex workings of modern societies.

“Much easier” is one way of putting it. The other way of putting it is that the political systems that we have lived with for the past 40 years have put a lot of energy into pretending that they’re not political, and one of the things that cybernetics does, I think, is make the political more visible.

‘Variety’ and complexity

Anyway, back to Ross Ashby, whose ‘Principle of Requisite Variety’ is a critically important idea. Farrell summarises:

We live in a complex world which keeps on producing variety that builds on previous variety. That means that there are many, many surprises—complex systems are by their nature difficult to predict. If you do want to anticipate these surprises, and even more importantly, to manage them, you need to have your own complex systems, built into your organization. And these systems need to be as complex as the system that you’re trying to manage.
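Ashby gave this a precise form. On one standard reading of his law, if the environment can present D distinct disturbances and the organisation’s regulator has only R distinct responses, the best it can do is hold outcomes down to roughly D/R states; only a regulator as varied as its environment (R ≥ D) can pin the system to a single goal. Here is a minimal sketch of that arithmetic in Python (the toy regulator and its names are my illustration, not anything from Beer or Farrell):

```python
import math

def residual_variety(D: int, R: int) -> int:
    """Ashby's inequality: a regulator with R distinct responses facing D
    distinct disturbances can hold outcomes to at most ceil(D / R) states.
    Only when R >= D can it pin the system to a single goal state."""
    return math.ceil(D / R)

def regulate(disturbance: int, R: int) -> int:
    """A toy best-case regulator: it cancels what it can of each disturbance;
    the quotient is the residual state the organisation still has to absorb."""
    response = disturbance % R            # the closest matching response
    return (disturbance - response) // R  # the residual outcome state

D, R = 12, 4
outcomes = {regulate(d, R) for d in range(D)}
assert len(outcomes) == residual_variety(D, R)  # 3 residual states, not 1
```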

The word ‘variety’ in Farrell’s summary is a way of describing complexity, and if organisations want to manage variety, they either need to absorb complexity themselves or suppress complexity in the outside world. In Beer’s language, you are either ‘amplifying’ the signals from the outside world or ‘attenuating’ them. Attenuation first:

You reduce the variety of the environment you are trying to deal with, so that the system produces fewer possible states of the world to be anticipated or managed. Or you pretend to yourself that the variety is less than it is, and hope that you aren’t devoured by the unknowns that you have chosen to unknow.

One easy example of this is market segmentation, which works fine while your consumer segments behave in the ways your marketing model describes, but starts unravelling when they don’t. Another is the way in which police forces in some parts of London more or less decriminalised possession of small amounts of cannabis by no longer arresting people for it.
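To make the segmentation example concrete, here is a hypothetical sketch (the segments and thresholds are invented for illustration): the model attenuates thousands of distinct consumers into three states, which is exactly what makes it tractable, and exactly what makes it fragile.

```python
from dataclasses import dataclass

@dataclass
class Consumer:
    age: int
    income: int
    # ...plus all the attributes the model chooses to throw away

def segment(c: Consumer) -> str:
    """Attenuation: collapse a high-variety consumer into one of three states."""
    if c.income > 80_000:
        return "premium"
    if c.age < 30:
        return "youth"
    return "mainstream"

# The marketing model now only has to anticipate three states rather than the
# full combinatorial variety of (age, income, ...). It works until consumers
# start behaving in ways that the discarded attributes would have predicted.
```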

Amplification

Amplification, on the other hand:

Here, crudely speaking, you amp up the variety inside the organizational structures that you have built, so that it better matches the variety of the environment you find yourself in. Very often, this involves building better feedback loops through which different bits of the organization can negotiate with each other over unexpected problems.

I’d add a bit to this: you also build cognitively richer models of what is happening in the outside world so that you have better interpretative frameworks to understand what’s going on out there. Much of the futures language about “rehearsing possible futures”, whether associated with scenarios or not, is about amplification. (Because: blind spots are a form of unconscious or unintended attenuation.)
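As a hypothetical sketch of amplification as a feedback loop (again, my illustration rather than anything from Beer or Davies): the organisation’s internal model grows each time the environment surprises it, so its variety climbs towards the environment’s. Blind spots, in this framing, are the states the model never gets to add.

```python
def amplify(observations, model=None):
    """Amplification via feedback: each surprise enriches the internal model,
    raising its variety towards that of the environment."""
    model = set() if model is None else model
    surprises = 0
    for state in observations:
        if state not in model:
            surprises += 1
            model.add(state)  # the feedback loop: surprise updates the model
    return model, surprises

# First pass through an unfamiliar environment: every new state is a surprise.
model, s1 = amplify(["strike", "drought", "strike", "new entrant"])
# Second pass: the enriched model absorbs the same variety without surprise.
_, s2 = amplify(["drought", "new entrant", "strike"], model)
assert (s1, s2) == (3, 0)
```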

As I say, it is a long article, and much of Beer’s work was about designing systems that helped organisations manage these two processes (hence ‘management cybernetics’). Farrell applies some of this to actually existing current issues, through an American lens, and connects it to other recent books in an interesting way.

Interconnected crises

Farrell argues that cybernetics is potentially a way of thinking that is attuned to the needs of the times, given the interconnected crises we find ourselves in, whether we think of these as a polycrisis or a metacrisis.

Farrell worries that cybernetics is a bit of a technocratic solution to all of this. That depends. I haven’t read Davies’ book yet (it’s on order), but a lot of Beer’s work, especially after the wounding experience of the violent overthrow of the Allende government, was about how you use cybernetic principles to make democracies work better through better forms of participation. We shouldn’t pick up half of his work and ignore the other half.

---

A version of this article is also published on my Just Two Things Newsletter.