EXTRA: A Return of "Management Cybernetics" as a Way Forward Out of Economics-Based Neoliberalism?

The version of <https://braddelong.substack.com/p/a-return-of-management-cybernetics> that I originally submitted to the “Times Literary Supplement”, that they then edited extensively, and that they then bounced…



[Illustration: Stafford Beer's vision of a cybernetic macroeconomic control room, with banks of consoles displaying real-time economic data beneath a large digital world map.]


Dan Davies’s The Unaccountability Machine: Why Big Systems Make Terrible Decisions and How the World Lost Its Mind is a little book, and is a great book.

How is it a little book? Damned if I know. Had I set out to write anything like this book, I could not have done so in less than four times the length.

Why is it a great book? Because it takes a lot of very important but fuzzy ideas about how a world of more than 8 billion people, tightly linked together by economic commodity exchange, lightspeed voice, and political control, can somehow organize itself to be productive, peaceful, and free when there is no way anything in our evolutionary past could possibly have predisposed us to pull and think together at such a scale, and it tries to bring those ideas into focus.

It does so by attempting to revive what was once an important intellectual movement of the post-World War II world, cybernetics—Norbert Wiener’s idea that there should be principles that we can discover about how to make our increasingly large and complex systems of human organization comprehensible to and manageable by human beings. The root is the Greek kybernētikos, meaning “good at steering a boat”. Cybernetics would have been a discipline, metaphorically, about how to steer a boat, or perhaps about how to build a boat that can be steered.

What is this thread? I see three pieces spun together:

First, every organization needs to do five things: operations, regulation, integration, intelligence, and philosophy. Operations is doing the work; regulation is making sure the people doing the work have what they need when they need it; integration is making sure people are pulling in the right direction; intelligence is planning, so that when things happen you can modify operations, regulation, and integration; philosophy is what you are doing all of this for. Davies writes:

Think of soldiers, quartermasters, battlefield commander, reconnaissance and field marshal, or… musicians, conductor, tour manager, artistic director and Elton John.

Second, sometimes you need to get all the things done by simplifying-and-optimizing: delegate some of the work to a suborganization that will report its metrics, and as long as it meets its metrics don’t worry about it–but when it fails to meet its metrics, take a careful look inside at what is going on. This is attenuation: somehow reduce the complexities the organization has to keep trying to deal with so that there is less to do. (But do this badly and you are just pretending things are less complicated than they are.)

Third, most of the time what you really need to get all the things done is to build better feedback loops, which requires amplification. The organization needs to better match in its internal structures the complexity of the environment it is dealing with, so that it sees what it needs to see in time for something to be done about it before it is too late. Note that nowhere in this management cybernetics is a primary task one of making sure that people have the right incentives to act on the information they have (that elimination of “market failures” is the focus of economics). It is, rather, making sure that the flow of information is not neurotic–neither too little for those who must decide to grasp the situation, nor so much that those who must decide drown in it, nor too irrelevant. I wish I could say: “It’s a kind of psychoanalysis for non-human intelligences, with [counterculture-era management cyberneticist] Stafford Beer as Sigmund Freud”. But I cannot. Felix Martin wrote that in his Financial Times review of The Unaccountability Machine. And since I cannot do better, I unabashedly steal it.
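To make the attenuation-and-amplification pair concrete, here is a minimal sketch in Python. It is my illustration, not anything from Beer or Davies, and every name and number in it is invented for the example: a supervisor attenuates by watching only a handful of summary metrics from each delegated unit, and amplifies by opening the richer channel, pulling in the detail, only when a metric goes out of tolerance.

```python
from dataclasses import dataclass, field


@dataclass
class Unit:
    """A delegated sub-organization reporting a few summary metrics."""
    name: str
    metrics: dict[str, float] = field(default_factory=dict)
    detail: list[str] = field(default_factory=list)  # the rich signal, normally ignored

    def report(self) -> dict[str, float]:
        # Attenuation: the supervisor sees only this summary, never `detail`.
        return self.metrics


@dataclass
class Supervisor:
    """Watches summaries; looks inside a unit only when a metric is out of tolerance."""
    tolerances: dict[str, tuple[float, float]]  # metric name -> (low, high)

    def review(self, unit: Unit) -> list[str]:
        findings: list[str] = []
        for metric, value in unit.report().items():
            low, high = self.tolerances.get(metric, (float("-inf"), float("inf")))
            if not (low <= value <= high):
                # Amplification: open the richer channel and pull in the detail.
                findings.append(f"{unit.name}: {metric}={value} outside [{low}, {high}]")
                findings.extend(f"  detail: {line}" for line in unit.detail)
        return findings


if __name__ == "__main__":
    bakery = Unit(
        name="bread-and-beer",
        metrics={"on_time_delivery": 0.62},
        detail=["flour shipments delayed", "bottling line down 3 days"],
    )
    boss = Supervisor(tolerances={"on_time_delivery": (0.95, 1.0)})
    for line in boss.review(bakery):
        print(line)
```

The point of the sketch is only that the supervisor’s information diet stays deliberately thin until a tolerance is breached; set the tolerances badly and you have the bad kind of attenuation, pretending things are simpler than they are.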

Is this more than mere handwaving? I think not quite, but almost: I hope this book will spur the thinking that we actually need to do, for we badly need a revival of the intellectual thread of cybernetics.

Why do we badly need it? Let me back up, and approach that question the long way around:

We East African Plains Apes are neither wise nor smart. We are lucky if we can remember where we left our keys last night.

And yet, working together, we have conquered and now dominate the world. We are an awe-inspiring concept-thinking and nature-manipulating anthology intelligence, whose spatial reach embraces the globe, whose numerical reach now covers more than 8 billion of us, and whose temporal reach–because of writing–now extends back 5000 years. Even 150,000 years ago, weak of tooth and absent of claw as we are, and when our ability to coöperate to work and think together was limited to a band of perhaps 100 with a memory that extended back only 60 years or so, we were not just being eaten by the hyenas: we (or, rather, our very close Neandertal cousins) were also eating them.

But how can we work and think together at 8 billion-plus scale? We no longer just have our families, our neighbors, and our coworkers with whom we interact via networks of affection, dislike, communication, barter, exchange, small-scale plans, and arm-twisting.

Instead, or rather in addition, more and more of what we do is driven by an extremely complex assembly of vast interlocking social and technological mechanisms that we have made but that we do not understand. These mechanisms are extraordinarily, massively, mind-bogglingly productive. How much richer am I than my Ohlone predecessors who were the only people then living on the shore of San Francisco Bay four hundred years ago? A hundredfold richer? More? And what do I do to gain these riches? I know things and tell people stories about the human economy of the past. That is what I do.

But these mechanisms are also horrifyingly alien, inhumanly cruel, and bizarrely incomprehensible. Franz Kafka saw this. As Randall Jarrell wrote: “Kafka says… the system of your damnation… your society and your universe, is simply beyond your understanding…” Purdue Pharma “decides” that a good way to make money for its shareholders is to addict Americans to opiates, and the individual humans who are its components fall into line—and afterwards all protest that that was not what they meant to do. But they did it. Global warming means that Berkeley right now has the climate that Santa Barbara, 300 miles south, had in my youth. Who decided to do this?

And I have not gotten to the fact that this is the timeline with the killer robots and the automated distributed propaganda machines that would make O’Brien of 1984 or Gletkin of Darkness at Noon laugh with joy.

New York Times columnist Ezra Klein says that in trying to understand the latest wave of cultural technologies that are the tech sector’s MAMLMs–Modern Advanced Machine-Learning Models–he is driven to:

metaphors… [from] fantasy novels and occult texts… act[s] of summoning… strings of letters and numbers… uttered or executed… create some kind of entity… [to] stumble through the portal…. Their call… might summon demons and they’re calling anyway…

But what Ezra does not appear to recognize is that his metaphors of finding ourselves in a room with possibly malevolent THINGS that have escaped confining pentacles apply not just to programs running on NVIDIA-designed GPUs. Mary Shelley saw that they applied to science, Marx to the market economy, Kafka to bureaucracy, Adorno to the creation and transmission of culture, Marcuse to modern democracy, and so on. Can we understand and manage these inhuman massive-scale systems that are in the last analysis made up of people doing things for reasons? Can we control or constrain them to give them humanlike souls, or a human face?

So far the answer has been, largely, no. Consider what Gabriel García Márquez thought of as extremely high and definitely worshipful praise of Cuba’s Maximum Leader Fidel Castro:

He has breakfast with… two hundred pages of news…. No one can explain how he has the time or what method he employs to read so much and so fast…. A vast bureaucratic incompetence affecting almost every realm of daily life, especially domestic happiness, which has forced Fidel Castro himself, almost thirty years after victory, to involve himself personally in such extraordinary matters as how bread is made and the distribution of beer…

To which Jacobo Timerman snarked:

Castro… has a secret method… for reading quickly…. Yet, thirty years after the revolution, he hasn’t managed to organize a system for baking bread and distributing beer…

From a cybernetic perspective, most of our economic world today suffers from the inverse of the flaws of Fidel Castro’s system. We are under the dominion of sophisters, calculators, and most of all economists. So we have systems that are highly efficient at managing the wrong things in the wrong way. They are maximizers, where the goal is to make as much money as possible. As Davies writes:

A maximising system… defin[es] an objective function, and throw[s] away all the other information…. [But] the environment is going to change, and something which isn’t in the information set any more is going to lead… [to] destruction…. Every decision-making system set up as a maximiser needs to have a higher-level system watching over it. There needs to be a red handle to pull, a way for the decided-upon to indicate intolerability…
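Davies’s description of a maximiser translates almost directly into code. The sketch below is my own illustration, not the book’s: a naive optimiser that sees only its objective function, wrapped in a higher-level monitor with a “red handle” that the decided-upon can pull when an outcome becomes intolerable. The plans, the profit and harm numbers, and the threshold are all invented for the example.

```python
from typing import Callable, Iterable


def maximise(options: Iterable[dict], objective: Callable[[dict], float]) -> dict:
    """A maximiser: rank options by one number and throw away all the other information."""
    return max(options, key=objective)


def supervised_maximise(
    options: Iterable[dict],
    objective: Callable[[dict], float],
    red_handle: Callable[[dict], bool],
) -> dict:
    """A higher-level system watching over the maximiser.

    `red_handle` is the intolerability signal from the decided-upon: any option
    that trips it is removed from consideration before the objective is applied.
    """
    tolerable = [option for option in options if not red_handle(option)]
    if not tolerable:
        raise RuntimeError("every option trips the red handle; escalate, don't optimise")
    return maximise(tolerable, objective)


if __name__ == "__main__":
    options = [
        {"plan": "push opioid sales", "profit": 9.0, "harm": 10.0},
        {"plan": "steady product line", "profit": 4.0, "harm": 1.0},
    ]
    best = supervised_maximise(
        options,
        objective=lambda o: o["profit"],          # the one amplified signal
        red_handle=lambda o: o["harm"] > 5.0,     # invented intolerability threshold
    )
    print(best["plan"])  # -> "steady product line"
```

The interesting branch is the error: when every option trips the red handle, the right move is to escalate and change the decision-making system rather than keep optimising, which is the force of Davies’s point that every maximiser needs a higher-level system watching over it.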

But all is not lost, at least not with respect to the major shoggoths of our economy. This is what I see as Davies’s major action-item conclusion:

[In] the decision-making system of a modern corporation… one of its signals has been so amplified that it drowns out the others. The ‘profit motive’ isn’t…. Corporations… don’t have motives. What they have is an imbalance…. They aren’t capable of responding to signals from the long-term planning and intelligence function, because the short-term planning function has to operate under the constraints of the financial market disciplinary system…. Take away that pressure [and] it’s quite likely…corporate decision-making systems will be less hostile…. Viable systems fundamentally seek stability, not maximisation…. On any given day, managers spend a lot more time talking to their customers and employees than they do to investors; if they were able to pay attention to what they heard, that would be much healthier…
