SYSTEMANTICS
IV. SYSTEM DESIGN GUIDELINES: SIZE
1. A simple system may or may not work.
Things can fail in an infinite number of ways, but they can work in only a few ways. In other words, the odds are against success.
If you can't guarantee success, then the best you can do is try to improve the odds. The most effective way to do so is to follow the old KISS principle: Keep It Simple, Stupid. By minimizing the potential points of failure, you nudge the odds of success in your favor.
Even so, a small system can still fail to work. But it can't hurt to apply sound principles of system design. At worst, it's good practice.
2. A complex system that works is almost always found to have evolved from a simple system that worked.
Big systems built from scratch almost never work. You have to start with a simple system that works and grow it carefully. This might not work, either, but it's your only hope.
3. Big systems either work on their own, or they don't. If they don't, you can't make them work. Pushing on the system won't help.
Big systems are like any other large mass: they have a lot of inertia. Once a system gets big, it becomes highly resistant to major change. If it's not working, tinkering is unlikely to make it work... but tinkering is all that anyone will be permitted to do to that system, because big systems always have constituencies whose functions depend on the system remaining as it is.
In such cases, it is almost impossible to effect meaningful change of an existing system. You have to start over with a new small system that works.
4. A large system produced by expanding the dimensions of a smaller system does not behave like the smaller system. It Kant.
The trouble with the idea of increasing the scope of a small system to make it do more is that--as Immanuel Kant among others pointed out--a sufficient change in degree can constitute a change in kind. To put it another way, enough of a change in the quantity of a thing can mean a change in a quality of that thing.
Consider temperature. If it's 32 degrees Fahrenheit outside, most U.S. citizens would call that "cold." If it warms up by a degree to 33 degrees, most people would still call that "cold." 34, 35, and 36 degrees would probably be considered "cold" as well. But keep warming up, degree by degree, and eventually those small changes in quantity add up to a change in quality--32 degrees might be "cold," but 110 degrees Fahrenheit is "hot."
The same thing applies to systems. A small system may be able to accomplish some task, but expanding the size of that system does not necessarily mean that you will be able to accomplish a lot more of that same task, or do it faster or better. In fact, the whole system might cease to work, or strange and counterproductive behaviors may ensue. This is because sufficient changes in degree can produce changes in kind; expanding the size of a system generates new problems.
An interesting example of this is the Vehicle Assembly Building (VAB) at the Kennedy Space Center in Florida, where rocket stages and crew compartments are mated for spaceflight missions. One of the largest enclosed spaces in the world, the VAB was built so that this process could be performed indoors, protected from weather that might damage exposed internal spacecraft components.
The only problem: The VAB is so large that it generates its own weather.