[Scpg] Dancing With Systems by Donella Meadows
lakinroe at silcom.com
Sat Aug 18 14:49:07 PDT 2012
Dancing With Systems
by Donella Meadows
Versions of this piece have been published in Whole Earth, winter 2001 and The Systems Thinker, Vol. 13, No. 2 (March 2002).
http://www.sustainer.org/pubs/Dancing.html
The Dance
1. Get the beat.
2. Listen to the wisdom of the system.
3. Expose your mental models to the open air.
4. Stay humble. Stay a learner.
5. Honor and protect information.
6. Locate responsibility in the system.
7. Make feedback policies for feedback systems.
8. Pay attention to what is important, not just what is quantifiable.
9. Go for the good of the whole.
10. Expand time horizons.
11. Expand thought horizons.
12. Expand the boundary of caring.
13. Celebrate complexity.
14. Hold fast to the goal of goodness.
People who are raised in the industrial world and who get enthused about systems thinking are likely to make a terrible mistake. They are likely to assume that here, in systems analysis, in interconnection and complication, in the power of the computer, here at last, is the key to prediction and control. This mistake is likely because the mindset of the industrial world assumes that there is a key to prediction and control.
I assumed that at first too. We all assumed it, as eager systems students at the great institution called MIT. More or less innocently, enchanted by what we could see through our new lens, we did what many discoverers do. We exaggerated our own ability to change the world. We did so not with any intent to deceive others, but in the expression of our own expectations and hopes. Systems thinking for us was more than subtle, complicated mindplay. It was going to Make Systems Work.
But self-organizing, nonlinear, feedback systems are inherently unpredictable. They are not controllable. They are understandable only in the most general way. The goal of foreseeing the future exactly and preparing for it perfectly is unrealizable. The idea of making a complex system do just what you want it to do can be achieved only temporarily, at best. We can never fully understand our world, not in the way our reductionistic science has led us to expect. Our science itself, from quantum theory to the mathematics of chaos, leads us into irreducible uncertainty. For any objective other than the most trivial, we can't optimize; we don't even know what to optimize. We can't keep track of everything. We can't find a proper, sustainable relationship to nature, each other, or the institutions we create, if we try to do it from the role of omniscient conqueror.
For those who stake their identity on the role of omniscient conqueror, the uncertainty exposed by systems thinking is hard to take. If you can't understand, predict, and control, what is there to do?
Systems thinking leads to another conclusion, however: one waiting, shining, obvious as soon as we stop being blinded by the illusion of control. It says that there is plenty to do, of a different sort of "doing." The future can't be predicted, but it can be envisioned and brought lovingly into being. Systems can't be controlled, but they can be designed and redesigned. We can't surge forward with certainty into a world of no surprises, but we can expect surprises and learn from them and even profit from them. We can't impose our will upon a system. We can listen to what the system tells us, and discover how its properties and our values can work together to bring forth something much better than could ever be produced by our will alone.
We can't control systems or figure them out. But we can dance with them!
I already knew that, in a way, before I began to study systems. I had learned about dancing with great powers from whitewater kayaking, from gardening, from playing music, from skiing. All those endeavors require one to stay wide-awake, pay close attention, participate flat out, and respond to feedback. It had never occurred to me that those same requirements might apply to intellectual work, to management, to government, to getting along with people.
But there it was, the message emerging from every computer model we made. Living successfully in a world of systems requires more of us than our ability to calculate. It requires our full humanity–our rationality, our ability to sort out truth from falsehood, our intuition, our compassion, our vision, and our morality.
I will summarize the most general "systems wisdom" I have absorbed from modeling complex systems and from hanging out with modelers. These are the take-home lessons, the concepts and practices that penetrate the discipline of systems so deeply that one begins, however imperfectly, to practice them not just in one's profession, but in all of life.
The list probably isn't complete, because I am still a student in the school of systems. And it isn't unique to systems thinking. There are many ways to learn to dance. But here, as a start-off dancing lesson, are the practices I see my colleagues adopting, consciously or unconsciously, as they encounter systems.
1. Get the beat.
Before you disturb the system in any way, watch how it behaves. If it's a piece of music or a whitewater rapid or a fluctuation in a commodity price, study its beat. If it's a social system, watch it work. Learn its history. Ask people who've been around a long time to tell you what has happened. If possible, find or make a time graph of actual data from the system. People's memories are not always reliable when it comes to timing.
Starting with the behavior of the system forces you to focus on facts, not theories. It keeps you from falling too quickly into your own beliefs or misconceptions, or those of others. It's amazing how many misconceptions there can be. People will swear that rainfall is decreasing, say, but when you look at the data, you find that what is really happening is that variability is increasing–the droughts are deeper, but the floods are greater too. I have been told with great authority that milk price was going up when it was going down, that real interest rates were falling when they were rising, that the deficit was a higher fraction of the GNP than ever before when it wasn't.
Starting with the behavior of the system directs one's thoughts to dynamic, not static, analysis–not only to "what's wrong?" but also to "how did we get there?" and "what behavior modes are possible?" and "if we don't change direction, where are we going to end up?"
And finally, starting with history discourages the common and distracting tendency we all have to define a problem not by the system's actual behavior, but by the lack of our favorite solution. (The problem is, we need to find more oil. The problem is, we need to ban abortion. The problem is, how can we attract more growth to this town?)
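For the data-minded, here is a minimal sketch in Python of the rainfall check above. The figures are invented purely for illustration: compute a rolling mean and a rolling spread, and let the data say which one is actually moving.

import statistics

# Hypothetical annual rainfall totals, chosen so the mean holds roughly
# steady while the swings get wider year by year.
rainfall = [32, 41, 28, 45, 25, 48, 20, 52, 18, 55]

def rolling(values, window, fn):
    # Apply fn to each consecutive window of the series.
    return [fn(values[i:i + window]) for i in range(len(values) - window + 1)]

means = rolling(rainfall, 5, statistics.mean)
spreads = rolling(rainfall, 5, statistics.stdev)

print("rolling mean: ", [round(m, 1) for m in means])
print("rolling stdev:", [round(s, 1) for s in spreads])

# A steady rolling mean with a climbing rolling stdev says "more
# variable," not "less rain": deeper droughts, but greater floods too.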
2. Listen to the wisdom of the system.
Aid and encourage the forces and structures that help the system run itself. Don't be an unthinking intervener and destroy the system's own self-maintenance capacities. Before you charge in to make things better, pay attention to the value of what's already there.
A friend of mine, Nathan Gray, was once an aid worker in Guatemala. He told me of his frustration with agencies that would arrive with the intention of "creating jobs" and "increasing entrepreneurial abilities" and "attracting outside investors." They would walk right past the thriving local market, where small-scale business people of all kinds, from basket-makers to vegetable growers to butchers to candy-sellers, were displaying their entrepreneurial abilities in jobs they had created for themselves. Nathan spent his time talking to the people in the market, asking about their lives and businesses, learning what was in the way of those businesses expanding and incomes rising. He concluded that what was needed was not outside investors, but inside ones. Small loans available at reasonable interest rates, and classes in literacy and accounting, would produce much more long-term good for the community than bringing in a factory or assembly plant from outside.
3. Expose your mental models to the open air.
Remember, always, that everything you know, and everything everyone knows, is only a model. Get your model out there where it can be shot at. Invite others to challenge your assumptions and add their own. Instead of becoming a champion for one possible explanation or hypothesis or model, collect as many as possible. Consider all of them plausible until you find some evidence that causes you to rule one out. That way you will be emotionally able to see the evidence that rules out an assumption with which you might have confused your own identity.
You don't have to put forth your mental model with diagrams and equations, though that's a good discipline. You can do it with words or lists or pictures or arrows showing what you think is connected to what. The more you do that, in any form, the clearer your thinking will become, the faster you will admit your uncertainties and correct your mistakes, and the more flexible you will learn to be. Mental flexibility–the willingness to redraw boundaries, to notice that a system has shifted into a new mode, to see how to redesign structure–is a necessity when you live in a world of flexible systems.
4. Stay humble. Stay a learner.
Systems thinking has taught me to trust my intuition more and my figuring-out rationality less, to lean on both as much as I can, but still to be prepared for surprises. Working with systems, on the computer, in nature, among people, in organizations, constantly reminds me of how incomplete my mental models are, how complex the world is, and how much I don't know.
The thing to do, when you don't know, is not to bluff and not to freeze, but to learn. The way you learn is by experiment–or, as Buckminster Fuller put it, by trial and error, error, error. In a world of complex systems it is not appropriate to charge forward with rigid, undeviating directives. "Stay the course" is only a good idea if you're sure you're on course. Pretending you're in control even when you aren't is a recipe not only for mistakes, but for not learning from mistakes. What's appropriate when you're learning is small steps, constant monitoring, and a willingness to change course as you find out more about where it's leading.
That's hard. It means making mistakes and, worse, admitting them. It means what psychologist Don Michael calls "error-embracing." It takes a lot of courage to embrace your errors.
5. Honor and protect information.
A decision maker can't respond to information he or she doesn't have, can't respond accurately to information that is inaccurate, can't respond in a timely way to information that is late. I would guess that 99 percent of what goes wrong in systems goes wrong because of faulty or missing information.
If I could, I would add an Eleventh Commandment: Thou shalt not distort, delay, or sequester information. You can drive a system crazy by muddying its information streams. You can make a system work better with surprising ease if you can give it more timely, more accurate, more complete information.
For example, in 1986 new federal legislation required U.S. companies to report all chemical emissions from each of their plants. Through the Freedom of Information Act (from a systems point of view, one of the most important laws in the nation), that information became a matter of public record. In July 1988 the first data on chemical emissions became available. The reported emissions were not illegal, but they didn't look very good when they were published in local papers by enterprising reporters, who had a tendency to make lists of "the top ten local polluters." That's all that happened. There were no lawsuits, no required reductions, no fines, no penalties. But within two years chemical emissions nationwide (at least as reported, and presumably also in fact) had decreased by 40 percent. Some companies were launching policies to bring their emissions down by 90 percent, just because of the release of previously sequestered information.
6. Locate responsibility in the system.
Look for the ways the system creates its own behavior. Do pay attention to the triggering events, the outside influences that bring forth one kind of behavior from the system rather than another. Sometimes those outside events can be controlled (as in reducing the pathogens in drinking water to keep down the incidence of infectious disease). But sometimes they can't. And sometimes blaming or trying to control the outside influence blinds one to the easier task of increasing responsibility within the system.
"Intrinsic responsibility" means that the system is designed to send feedback about the consequences of decision-making directly and quickly and compellingly to the decision-makers.
Dartmouth College reduced intrinsic responsibility when it took thermostats out of individual offices and classrooms and put temperature-control decisions under the guidance of a central computer. That was done as an energy-saving measure. My observation from a low level in the hierarchy is that the main consequence was greater oscillations in room temperature. When my office gets overheated now, instead of turning down the thermostat, I have to call an office across campus, which gets around to making corrections over a period of hours or days, and which often overcorrects, setting up the need for another phone call. One way of making that system more, rather than less, responsible might have been to let professors keep control of their own thermostats and charge them directly for the amount of energy they use (thereby privatizing a commons!).
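A minimal sketch of that thermostat story, with every number hypothetical: the same corrective loop, run once on fresh information and once on stale information with too heavy a hand, shows where the oscillation comes from.

def simulate(delay, gain, steps=20, target=20.0, start=26.0):
    # Each step applies a correction based on the temperature error
    # observed `delay` steps ago; delay > 0 means stale information.
    temps = [start]
    for t in range(steps):
        observed = temps[max(0, t - delay)]
        temps.append(temps[-1] - gain * (observed - target))
    return temps

local = simulate(delay=0, gain=0.5)    # professor adjusts her own thermostat
central = simulate(delay=2, gain=0.6)  # distant office: late and heavy-handed

print("local:  ", [round(x, 1) for x in local])
print("central:", [round(x, 1) for x in central])

# The local loop settles quickly toward 20; the delayed, overcorrecting
# loop keeps swinging above and below the target, which is the greater
# oscillation the passage describes.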
Designing a system for intrinsic responsibility could mean, for example, requiring all towns or companies that emit wastewater into a stream to place their intake pipe downstream from their outflow pipe. It could mean that neither insurance companies nor public funds should pay for medical costs resulting from smoking or from accidents in which a motorcycle rider didn't wear a helmet or a car rider didn't fasten the seat belt. It could mean Congress would no longer be allowed to legislate rules from which it exempts itself.
7. Make feedback policies for feedback systems.
President Jimmy Carter had an unusual ability to think in feedback terms and to make feedback policies. Unfortunately he had a hard time explaining them to a press and public that didn't understand feedback.
He suggested, at a time when oil imports were soaring, that there be a tax on gasoline proportional to the fraction of U.S. oil consumption that had to be imported. If imports continued to rise, the tax would rise, until it suppressed demand and brought forth substitutes and reduced imports. If imports fell to zero, the tax would fall to zero.
The tax never got passed.
Carter was also trying to deal with a flood of illegal immigrants from Mexico. He suggested that nothing could be done about that immigration as long as there was a great gap in opportunity and living standards between the U.S. and Mexico. Rather than spending money on border guards and barriers, he said, we should spend money helping to build the Mexican economy, and we should continue to do so until the immigration stopped.
That never happened either.
You can imagine why a dynamic, self-adjusting system cannot be governed by a static, unbending policy. It's easier, more effective, and usually much cheaper to design policies that change depending on the state of the system. Especially where there are great uncertainties, the best policies not only contain feedback loops, but meta-feedback loops–loops that alter, correct, and expand loops. These are policies that design learning into the management process.
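A minimal sketch of the gasoline-tax loop described above, with every number (domestic supply, tax scale, demand response) hypothetical and chosen only to show the loop closing:

DOMESTIC_SUPPLY = 60.0   # hypothetical units of oil produced at home
TAX_SCALE = 2.0          # hypothetical tax dollars per unit of import fraction

def imports(demand):
    # Whatever consumption exceeds domestic supply must be imported.
    return max(0.0, demand - DOMESTIC_SUPPLY)

demand = 100.0
for year in range(8):
    fraction = imports(demand) / demand   # share of consumption imported
    tax = TAX_SCALE * fraction            # the feedback policy: tax tracks imports
    demand -= 25.0 * tax                  # hypothetical demand response to price
    print(f"year {year}: import fraction {fraction:.2f}, tax {tax:.2f}, demand {demand:.1f}")

# As imports shrink, the tax shrinks with them; if imports reached zero,
# the tax would fall to zero. The policy adjusts itself instead of
# waiting for a legislature to revise a fixed number.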
8. Pay attention to what is important, not just what is quantifiable.
Our culture, obsessed with numbers, has given us the idea that what we can measure is more important than what we can't measure. You can look around and make up your own mind about whether quantity or quality is the outstanding characteristic of the world in which you live.
If something is ugly, say so. If it is tacky, inappropriate, out of proportion, unsustainable, morally degrading, ecologically impoverishing, or humanly demeaning, don't let it pass. Don't be stopped by the "if you can't define it and measure it, I don't have to pay attention to it" ploy. No one can precisely define or measure justice, democracy, security, freedom, truth, or love. No one can precisely define or measure any value. But if no one speaks up for them, if systems aren't designed to produce them, if we don't speak about them and point toward their presence or absence, they will cease to exist.
9. Go for the good of the whole.
Don't maximize parts of systems or subsystems while ignoring the whole. As Kenneth Boulding once said, "Don't go to great trouble to optimize something that never should be done at all." Aim to enhance total systems properties, such as creativity, stability, diversity, resilience, and sustainability–whether they are easily measured or not.
As you think about a system, spend part of your time from a vantage point that lets you see the whole system, not just the problem that may have drawn you to focus on the system to begin with. And realize that, especially in the short term, changes for the good of the whole may sometimes seem to be counter to the interests of a part of the system. It helps to remember that the parts of a system cannot survive without the whole. The long-term interests of your liver require the long-term health of your body, and the long-term interests of sawmills require the long-term health of forests.
10. Expand time horizons.
The official time horizon of industrial society doesn't extend beyond what will happen after the next election or beyond the payback period of current investments. The time horizon of most families still extends farther than that–through the lifetimes of children or grandchildren. Many Native American cultures actively spoke of and considered in their decisions the effects upon the seventh generation to come. The longer the operant time horizon, the better the chances for survival.
In the strict systems sense there is no long-term/short-term distinction. Phenomena at different time-scales are nested within each other. Actions taken now have some immediate effects and some that radiate out for decades to come. We experience now the consequences of actions set in motion yesterday and decades ago and centuries ago.
When you're walking along a tricky, curving, unknown, surprising, obstacle-strewn path, you'd be a fool to keep your head down and look just at the next step in front of you. You'd be equally a fool just to peer far ahead and never notice what's immediately under your feet. You need to be watching both the short and the long term–the whole system.
11. Expand thought horizons.
Defy the disciplines. In spite of what you majored in, or what the textbooks say, or what you think you're an expert at, follow a system wherever it leads. It will be sure to lead across traditional disciplinary lines. To understand that system, you will have to be able to learn from–while not being limited by–economists and chemists and psychologists and theologians. You will have to penetrate their jargons, integrate what they tell you, recognize what they can honestly see through their particular lenses, and discard the distortions that come from the narrowness and incompleteness of their lenses. They won't make it easy for you.
Seeing systems whole requires more than being "interdisciplinary," if that word means, as it usually does, putting together people from different disciplines and letting them talk past each other. Interdisciplinary communication works only if there is a real problem to be solved, and if the representatives from the various disciplines are more committed to solving the problem than to being academically correct. They will have to go into learning mode, to admit ignorance and be willing to be taught, by each other and by the system.
It can be done. It's very exciting when it happens.
12. Expand the boundary of caring.
Living successfully in a world of complex systems means expanding not only time horizons and thought horizons; above all it means expanding the horizons of caring. There are moral reasons for doing that, of course. And if moral arguments are not sufficient, then systems thinking provides the practical reasons to back up the moral ones. The real system is interconnected. No part of the human race is separate either from other human beings or from the global ecosystem. It will not be possible in this integrated world for your heart to succeed if your lungs fail, or for your company to succeed if your workers fail, or for the rich in Los Angeles to succeed if the poor in Los Angeles fail, or for Europe to succeed if Africa fails, or for the global economy to succeed if the global environment fails.
As with everything else about systems, most people already know about the interconnections that make moral and practical rules turn out to be the same rules. They just have to bring themselves to believe that which they know.
13. Celebrate complexity.
Let's face it, the universe is messy. It is nonlinear, turbulent and chaotic. It is dynamic. It spends its time in transient behavior on its way to somewhere else, not in mathematically neat equilibria. It self-organizes and evolves. It creates diversity, not uniformity. That's what makes the world interesting, that's what makes it beautiful, and that's what makes it work.
There's something within the human mind that is attracted to straight lines and not curves, to whole numbers and not fractions, to uniformity and not diversity, and to certainties and not mystery. But there is something else within us that has the opposite set of tendencies, since we ourselves evolved out of and are shaped by and structured as complex feedback systems. Only a part of us, a part that has emerged recently, designs buildings as boxes with uncompromising straight lines and flat surfaces. Another part of us recognizes instinctively that nature designs in fractals, with intriguing detail on every scale from the microscopic to the macroscopic. That part of us makes Gothic cathedrals and Persian carpets, symphonies and novels, Mardi Gras costumes and artificial intelligence programs, all with embellishments almost as complex as the ones we find in the world around us.
14. Hold fast to the goal of goodness.
Examples of bad human behavior are held up, magnified by the media, affirmed by the culture, as typical. Just what you would expect. After all, we're only human. The far more numerous examples of human goodness are barely noticed. They are Not News. They are exceptions. Must have been a saint. Can't expect everyone to behave like that.
And so expectations are lowered. The gap between desired behavior and actual behavior narrows. Fewer actions are taken to affirm and instill ideals. The public discourse is full of cynicism. Public leaders are visibly, unrepentantly, amoral or immoral and are not held to account. Idealism is ridiculed. Statements of moral belief are suspect. It is much easier to talk about hate in public than to talk about love.
We know what to do about eroding goals. Don't weigh the bad news more heavily than the good. And keep standards absolute.
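A minimal sketch of that eroding-goals loop, with hypothetical numbers: let the standard drift toward a perception of performance weighted by bad news, and the standard sinks to meet actual behavior.

goal = 100.0         # the standard we hold ourselves to
performance = 70.0   # actual behavior, held flat here for clarity
PESSIMISM = 0.8      # hypothetical weight given to the bad news
DRIFT = 0.3          # hypothetical speed at which the goal chases perception

for year in range(10):
    perceived = PESSIMISM * performance + (1 - PESSIMISM) * goal
    goal += DRIFT * (perceived - goal)   # the eroding-goals feedback
    print(f"year {year}: goal {goal:.1f}")

# With DRIFT = 0 the standard stays absolute, which is the remedy the
# text prescribes: don't let the goal chase the news.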
*****
This is quite a list. Systems thinking can only tell us to do these things. It can't do them for us.
And so we are brought to the gap between understanding and implementation. Systems thinking by itself cannot bridge that gap. But it can lead us to the edge of what analysis can do and then point beyond–to what can and must be done by the human spirit.
_____________________________________________
Donella Meadows authored and co-authored many books, including The Limits to Growth and Beyond the Limits; syndicated "The Global Citizen" column; taught at Dartmouth College; helped found the Balaton Group on sustainability; was a MacArthur Fellow; and lived and worked on an organic farm.
For more information on the work of Donella Meadows, contact the Sustainability Institute, 3 Linden Road, Hartland, VT 05048 [www.sustainer.org].