----------------------------------------

S Y S T E M A N T I C S   2

----------------------------------------

II. SYSTEMS AND INFORMATION


1. To those within a system the outside reality tends to disappear.

System components tend over time to become more and more specialized. As this continues, these elements begin to refuse to recognize any inputs that they consider improperly formatted. This formalization tends to result in the creation of "layers" of information between the elements of a system and the real (external) world.

As more and more layers intrude between part of a system and the external world, reality is repeatedly filtered. Any information which might threaten the stability or security of system elements is rejected. Eventually only the most sensational or irrelevant aspects of the objective world are widely propagated throughout the system.

For one example, consider the system of information production and consumption that is formed by journalists and U.S. citizens. Now recall the trials of O.J. Simpson.


2. Components of systems do not do what the system says they do.

Components are often referred to as if they performed the function for which the entire system was designed. For example, does someone who works in the aerospace industry build fighter jets? No. She may rivet wing assemblies, or manage a computer network, or perform cost accounting, but she is not a fighter jet builder.

Similarly, someone who works for the U.S. Department of Health and Human Services cannot claim the whole-system function of "helping our neediest citizens." Chances are good, in fact, that his sole function is the documentation of work done by people whom he has never met and will never meet, and who themselves do not directly help anyone. As an element of a system, he does not do what the system says he is doing.

In fact, once a system grows to a certain size and complexity, there is a good chance that any individual element may be not only not doing what the system says it is doing, but actually working against the stated purpose of the system. In a large wealth-redistribution system, for example, most employees will actually be taking as salaries and wages a significant part of the money which the system's representatives claim is going to the needy.

Just as the reality external to a system fails to reach the internal components of that system, the true nature of those components is disguised to serve the stated purpose of that system. But it should be noted that this is not necessarily a deliberate act on anyone's part--it is often no more or less than a natural result of a system's having grown beyond its functional size.


3. The system itself does not do what it says it is doing.

Particularly in larger systems, stated goals are often not the actual goals. In small (that is, working) systems, before and just after their creation, goals usually aren't stated; it's enough that the system helps solve the problem it was created to address.

Later, however, after the system has been around long enough to generate a constituency, some persons at some point will try to exercise control over the system by defining its outputs in the way most favorable to themselves. When this happens, a kind of crystallization occurs. The external structure of the system solidifies, creating a hard shell between the core of the system and the outside world.

This "shell" then, as the interface of the system to the outside world, becomes the stated goal of the system. But like an iceberg, it is only the visible part of the system. The much larger part is hidden from view.

Also like an iceberg, it is this hidden part that has the most powerful impact on the outside world. The stated goal of a system is merely what that system says it is doing; it is not necessarily what the system is actually doing. In reality, as more time elapses from the point at which that goal was stated, the actual (inner) goal of the system tends to diverge from the official (stated) goal.

For example, when you go to the grocery store to buy an apple, what is it you actually buy? What you get was picked weeks ago and artificially ripened to fit shipping schedules... but how is that product like what you get when you pluck an apple from a tree yourself?

The difference is the system instituted to satisfy your goal of having an apple to eat. Picking one yourself eliminates the system; you want an apple, you get an apple. But from a grocery store, you get what the grocery store system calls an apple, which is almost nothing like what you get when your true goal is satisfied.


4. The system takes the credit (for any favorable outcome).

If an individual element of a system somehow causes a desired positive effect, the system as a whole will credit itself with having achieved that effect... even if that effect is only incidentally associated with the stated goal of the system. The reasoning (such as it is) goes something like this: Since the individual element was part of the overall system, and since individual elements are incapable of producing the stated goal of that system, therefore if anything even remotely like the goal occurs, the entire system must have produced it.

Interestingly, the converse is also held to be true: if something bad happens, the fault is never the system's. Either "forces" external to the system prevented success, or else some individual element of the system failed. The system exists to meet the goal; this is a noble function; therefore the system itself must not be criticized as faulty.

Convenient, isn't it?


5. Perfection of planning is a symptom of system decay.

This is actually a phenomenon of two separate but related systems dysfunctions. First, when the belief that "once our plans are perfect, we cannot fail" infects a group's leadership, that group begins to fail. The trouble with this kind of thinking is that it assumes that the environment surrounding the system is static, that it will never change. Adapting to a changing reality is hard. It takes constant effort. So there's always pressure to take the easy route, to assume (just for the sake of discussion, of course) that the environment won't change between the time when planning starts and when the plan is complete.

But this is almost never the case. A refusal to accept that the nature of reality is constant change is a death sentence to a plan. A system that refuses to alter its plans to adapt to changes in its environment is a system that has begun to decay.

The other way in which "perfection of planning" is a systems fault is that this mind-set doesn't just fail to respond to changing reality--that would be bad enough--but it actually makes matters worse by diverting resources from real work to counter-productive paper-shuffling. In trying to make a plan perfect, planners need useless information. (Useful information might indicate that the plan is worthless; therefore useful information must be filtered out at all costs.) To gather that useless information will require that some persons be taken off good projects and assigned useless work. Then the plan must be developed, which means meetings, committees, status reports, and the thousand-and-one other means by which productive activity is squelched and reality is filtered from reaching the decision-making elements of a system.

The result: attempting to reach 100 percent perfection of planning eventually requires 100 percent of a system's resources. This not only demonstrates the decay of the system, it hastens that decay.


6. Beware of positive feedback.

Positive feedback is always dangerous.

In its fluffy psychological sense, "positive feedback" is not always a bad thing. Praise, when merited, can be a powerful motivator. But when not merited, it can become a destructive cult of "self-esteem."

In the more objective natural sense, positive feedback consists of a system's own output being "fed back" into the system in a way that reinforces it. The worst result of negative feedback, which limits action, is inactivity. Positive feedback, on the other hand, sends a system the other direction, toward total activity, and most systems are not structurally sound enough to survive that. Mechanical systems can literally shake themselves to pieces. (Note: "positive" and "negative" here describe whether the fed-back signal reinforces or opposes the system's output, not whether the result is good or bad.)
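
To make the contrast concrete, here is a minimal Python sketch. It is my own illustration, not part of the original argument: the function run_loop and the gain values are invented for the example. Each step feeds a fraction of the current activity back into the system; a negative gain damps the loop toward inactivity, while a positive gain drives it toward runaway.

    # Minimal sketch: the same loop run with negative vs. positive feedback
    # gain.  (Illustrative only; the names and numbers are assumptions, not
    # taken from the essay.)

    def run_loop(gain, steps=10, x=1.0):
        """Iterate x <- x + gain * x and record the trajectory."""
        history = [x]
        for _ in range(steps):
            x = x + gain * x          # the feedback term
            history.append(x)
        return history

    print("negative feedback (gain -0.5):", [round(v, 3) for v in run_loop(-0.5)])
    print("positive feedback (gain +0.5):", [round(v, 3) for v in run_loop(+0.5)])
    # Negative feedback settles toward zero (the worst case: inactivity).
    # Positive feedback grows without bound -- the runaway "total activity"
    # described above.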

Example: The original Tacoma Narrows Bridge was built across the Narrows, a windy strait of Puget Sound. On one particularly windy day, the wind rushing through the Narrows made the air vibrate in the same way that blowing across the top of a bottle makes a musical tone. The problem--which the bridge's designers naturally never thought to consider--was that the bridge itself resonated to one of the major harmonic frequencies of the air blowing through the Narrows. Just as playing a note on one guitar string can make the string next to it vibrate, the Tacoma Narrows Bridge began to resonate. The positive feedback of the wind's vibration added to the bridge's motion, causing it to sway, then rock, then flex, then undulate like an ocean wave. Finally the feedback became so great that the bridge tore itself apart.
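
The resonance mechanism can also be sketched numerically. The fragment below (again my own illustration, not a model of the actual bridge; the oscillator, frequencies, and step size are assumptions chosen for clarity) drives an undamped oscillator once away from its natural frequency and once at it, and reports the peak displacement in each case.

    # Hedged sketch of resonance: an undamped oscillator driven at its own
    # natural frequency absorbs energy on every cycle, so its amplitude
    # keeps growing.  Numbers are illustrative only.

    import math

    def peak_amplitude(drive_freq, natural_freq=1.0, dt=0.001, t_end=60.0):
        """Integrate x'' = -w0^2 * x + sin(w * t) with semi-implicit Euler steps."""
        w0, w = natural_freq, drive_freq
        x, v, peak, t = 0.0, 0.0, 0.0, 0.0
        while t < t_end:
            a = -w0 * w0 * x + math.sin(w * t)   # restoring force + periodic forcing
            v += a * dt
            x += v * dt
            peak = max(peak, abs(x))
            t += dt
        return peak

    print("off resonance (drive_freq = 2.0):", round(peak_amplitude(2.0), 2))
    print("on resonance  (drive_freq = 1.0):", round(peak_amplitude(1.0), 2))
    # Driving at the natural frequency lets each push arrive in phase with
    # the motion, feeding the oscillation instead of opposing it, so the
    # peak keeps climbing.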

So it is with other systems. Nazi Germany was one such victim of positive feedback; there, voices that might have acted as brakes on a self-destructive course were silenced. (To a lesser degree, most political rallies partake of the same energizing effect. This is why the freedom of individuals to criticize their government without fear of reprisal is so important.)

For systems, positive feedback is dangerous because systems, by their very nature, are already primed to expand. They don't need the help.


7. The information you have is not the information you want.

The information you want is not the information you need.

The information you need is not available.


----------------------------------------

Background

I. General System Behavior and Morphology

II. Systems and Information

III. System Design Guidelines: Number

IV. System Design Guidelines: Size

V. System Design Guidelines: Longevity

