Command and (Out of) Control:
The Military Implications of Complexity Theory

John F. Schmitt

I shall proceed from the simple to the complex. But in war more than in any other subject we must begin by looking at the nature of the whole; for here more than elsewhere the part and the whole must always be thought of together.

—Carl von Clausewitz

The greatest and most direct military implications of complexity theory are likely to be in the area of command and control. Complexity theory is command and control theory: both deal with how a widely distributed collection of numerous agents acting individually can nonetheless behave like a single, even purposeful entity. The emerging sciences suggest that war is a radically different type of phenomenon—with a different operating dynamic—from the one typically understood in the American military. At the same time, war may have much in common with other types of nonlinear dynamical systems such as, as Clausewitz suggested, commerce. If war is a dramatically different type of phenomenon than commonly understood, then the implications for the way we perform command and control may be—should be—nothing short of profound. As we learn more about the behavior of complex systems, we will likely come to view command and control in fundamentally different terms.

The Prevailing View of Command and Control

Military theorists have routinely turned to science to help understand and explain war. In the verifiable and reliable laws of the natural world they have sought analogies and explanations for the unfathomable occurrences of the battlefield. Most often military theorists have turned to physics—and more specifically to Newtonian mechanics—because it is the most established, most elegant, and most precise of the sciences and because its laws describing the movements of material bodies and the physical forces acting upon them seem to provide ready analogies for military forces engaging one another in combat.

The great Prussian military theorist-philosopher Clausewitz was an avid amateur scientist and relied heavily and explicitly on the physical sciences to provide metaphors for his military concepts. Two of his greatest and most enduring concepts—friction and the center of gravity—come straight out of the science of the day. Of course, science for Clausewitz was Newtonian science.

The Reigning Paradigm: Newton Rules

Not only does science provide metaphors and models for isolated military concepts; in our age it plays an even more fundamental role: Newtonian science provides the overarching paradigm which characterizes modern Western culture. In ways we do not even recognize, because the paradigm is so thoroughly internalized, it shapes both our interpretation of the problems we face and the solutions we generate to those problems.

The Newtonian paradigm is the product of the Scientific Revolution which began in the 16th century and reached its crowning moment with Isaac Newton, who gave his name to the resulting world view. The Newtonian paradigm is the mechanistic paradigm: the world and everything in it as a giant machine. The preferred Newtonian metaphor is the clock: finely tooled gears meshing smoothly and precisely, ticking along predictably, measurably and reliably, keeping perfect time.

The Paradigm Deeply Ingrained

The Newtonian/mechanistic paradigm is so deeply ingrained that it is even reflected in our everyday conversation. When things are going well, we say they are going "like clockwork." When our unit is performing well, we describe it as a "well-oiled machine," or we say we’re "hitting on all cylinders." We refer to our individual contribution by saying we’re "just one cog in the machine." In the Marine Corps, for example, the common descriptor for an individual rifleman is "killing machine." And what is the Marine Corps’ preferred metaphor for itself? It is the "lean, green machine."

We call military actions "operations," a term which has a strong mechanistic/procedural connotation, suggesting either a surgical procedure performed on an anesthetized patient or the systematic functioning of a piece of machinery. An operation conducted with noteworthy efficiency is referred to as a "surgical strike." Much less frequently do we refer to military actions as "evolutions"—a term which has biological connotations rather than mechanistic ones and suggests adaptation and adjustment rather than precise planning and procedure.

Newtonian War

The Western approach to war has been as heavily influenced by the Newtonian paradigm as any other field. So what is war like according to the Newtonian paradigm? Importantly, Newtonian war is deterministically predictable: given knowledge of the initial conditions and having identified the universal "laws" of combat, we should be fully able to resolve the problem and predict the results. All Newtonian systems can eventually be distilled to one simple concept: cause and effect. And in fact, just such efforts to quantify results in war have abounded, starting at least with the famous Lanchester equations and carrying through Dupuy’s Quantified Judgment Model. In other words, Newtonian war is knowable: all the information which describes any situation is ultimately available, and the implications can be fully worked out. That which we cannot directly observe, we should be able to extrapolate.
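To see how literally this quantification can be taken, consider the Lanchester equations themselves. In their "aimed fire" form, each side's rate of loss is simply proportional to the current strength of the opposing force, so once the coefficients and starting strengths are fixed the outcome is fully determined. The sketch below (in Python, with force sizes and effectiveness coefficients invented purely for illustration) is not offered as a serious combat model; it is the Newtonian ideal of war in a dozen lines:

    # Lanchester "aimed fire" model: each side's losses are proportional to the
    # other side's current strength. Given the inputs, the outcome is fixed.
    #   dB/dt = -r * R        dR/dt = -b * B
    def lanchester(blue, red, blue_eff, red_eff, dt=0.01):
        t = 0.0
        while blue > 0 and red > 0:
            blue, red = blue - red_eff * red * dt, red - blue_eff * blue * dt
            t += dt
        return t, max(blue, 0.0), max(red, 0.0)

    # Hypothetical numbers, chosen only to show the determinism of the model.
    t, blue_left, red_left = lanchester(blue=1000, red=800, blue_eff=0.05, red_eff=0.06)
    print(f"battle ends at t={t:.1f}: Blue {blue_left:.0f}, Red {red_left:.0f} remaining")

Run it a thousand times and it returns the same answer every time; there is no fog, no friction, and no enemy choice anywhere in it.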

Newtonian war is linear: a direct and proportional connection can be established between each cause and effect. (Here "linear" refers to the dynamical properties of a system rather than to linear formations or frontages on a battlefield.) Small causes have minor results; decisive outcomes require massive inputs. In the Newtonian view, linearity is a good thing because linear systems are tame and controllable; they do not do unexpected things. If you know a little about a linear system you know a lot, because if you know a little you can calculate the rest.

The Newtonian view of war is reductionist: we understand war by successively breaking it down into parts eventually small enough to understand and control with the expectation that this will allow us to understand and control the whole. The so-called "Principles of War," reduced to the mnemonic MOOSEMUSS to aid memorization (as if that equals understanding), are a prime example of this approach. Linear processes are amenable to such decomposition; nonlinear processes by definition are not.

The Newtonian/mechanistic view of war tends to see a military operation as a closed system not susceptible to perturbations from its surroundings. This leads toward an inward focus—on the efficient internal functioning of the military machine. If war is deterministic and if the machine is operating at peak efficiency, then victory ought to be guaranteed—without any need to consider external factors. The mechanistic view likewise leads to a focus on optimization—finding the optimal solution to any problem (which is based on the Cartesian assumption that an optimal solution exists). War comes to be seen as a one-sided problem to be solved—like an engineering problem or a mathematics problem—rather than as an interaction between two animate forces. In idealized Newtonian war, the enemy, the least controllable variable, is eliminated from the equation altogether.

Newtonian Command and Control

The natural result is a highly proceduralized or methodical approach to the conduct of military operations—war as an assembly line. Newtonian command and control tends to be highly doctrinaire—heavy on mechanistic and elaborate procedures. The mechanistic view recognizes that war may appear disorderly and confusing but is convinced that with sufficient command and control we can impose order, precision, and certainty. We can eliminate unpleasant surprises and make war go "like clockwork." Just as the Scientific Revolution sought to tame nature, the Newtonian approach to command and control—especially with the help of the information-technology revolution—seeks to tame the nature of war.

Newtonian command and control thus tends to involve precise, positive control, highly synchronized schemes and detailed, comprehensive plans and orders. Perhaps the best metaphor is a chess player moving (i.e., controlling) his chess pieces. Control measures abound, compartmentalizing the various components of the military machine and specifying how those compartments cooperate with one another. Synchronization (the timepiece metaphor applied to military operations) is merely the example nonpareil of Newtonian war: the military as one huge, highly efficient and precise machine—ticking along like a fine Swiss watch.

Newtonian command and control is microscopic command and control. Just as classical mechanics studies a system by studying the behavior of each component in the system, Newtonian command and control seeks to control the military system by positively controlling each component in the system. In military lexicon this is known as detailed control. In this setting, "command" and "control" are seen as working in the same direction: from the top of the organization toward the bottom. See figure 1. The top of the organization imposes command and control on the bottom. Commanders are "in control" of their subordinates and the situation, and subordinates are "under the control" of their commanders. The worst thing that can happen in such a system is to "lose" control.

The object of Newtonian command and control is to gain certainty and impose order—to be "in control." Near-perfect intelligence becomes the expectation. We pursue 95-percent certainty within a battlecube 200 miles on each side and we actually expect that we can achieve it. Consider this passage by Richard Dunn from McNair Paper No. 13:

Increased battlefield "visibility"—provided by enhanced C3I—allows us to grasp the battle much more precisely and quickly. Thus, technology has made warfare much more certain and precise than was ever thought possible....For all intents and purposes, commanders can get a technological God’s eye view of the entire battlefield.

We believe we can blow away Clausewitz’s "fog of war," and if we fail to do so, it is only because our information technology is not quite capable enough yet—but we redouble our acquisition efforts and promise ourselves it will be soon.

The Problem: Reality Catches Up

The Newtonian paradigm offers a neat, clean and intellectually satisfying description of the world—and of war. There is only one problem: it does not match most of reality. When distilled to this level, the Newtonian model of war is manifestly ridiculous; reduced to these terms, I think few people would argue that war is actually this way. And yet much of the current American approach to command and control is based precisely on the unquestioned assumption of this model. Futurist Alvin Toffler observes that while some parts of the universe may operate like machines, these are closed systems, and closed systems, at best, form only a small part of the physical universe. Most phenomena of interest to us are, in fact, open systems, exchanging energy or matter (and, one might add, information) with their environment. Surely biological and social systems [of which war is one] are open, which means that the attempt to understand them in mechanistic terms is doomed to failure.

Most of reality, Toffler continues, instead of being orderly, stable, and equilibrial, is seething and bubbling with change, disorder, and process.

The Newtonian paradigm was so compelling, so neat, so logical—in short, so "right"—that it saw and imposed regularities where none existed. For the sake of finding solvable problems, science simplified reality by assuming an idealized world. It connected the discontinuities and linearized the nonlinearities—in short, it simply ignored all the countless inconsistencies and surprises that make the world—and war—such a complex and interesting problem.

The evidence is unmistakable: the Newtonian paradigm no longer satisfactorily describes most of our world (if it ever did). Science is slowly coming to recognize that the world is not remotely an orderly, linear place after all. We need a new paradigm, and once again science may provide the catalyst. It is not after all a Newtonian battlefield: it is a nonlinear dynamical battlefield.

The Emerging View: Nonlinear Dynamical War

So what is war if not a classical Newtonian system? War is fundamentally a far-from-equilibrium, open, distributed, nonlinear dynamical system highly sensitive to initial conditions and characterized by entropy production/dissipation and complex, continuous feedback. Rather than thinking of war as a structure at equilibrium, we should think of it as a standing wave pattern of continuously fluxing matter, energy, and information. War is more a dynamical process than a thing.

The principal law of thermodynamics—the supreme Law of Nature, in fact—is the Second Law which establishes that any natural process involves an overall increase in randomness or disorder—that is, an increase in entropy. The law of increasing entropy applies to war as much as to any other natural phenomenon. The driving force of all natural change in the universe, constructive as well as destructive, is the random and undirected dispersal of energy.

In thermodynamics, equilibrium is the uniform static state of a system in which no further heat transfer is possible. It is the state of maximum entropy. Near equilibrium, systems tend to behave in a fairly linear fashion; it is when the system is forced far from equilibrium that it becomes highly responsive to fluctuations—sensitive to initial conditions—and nonlinear behavior arises. It is here that immeasurably small influences—"countless minor incidents," Clausewitz called them—can cause the system to veer off into an unpredictable and qualitatively different pattern of behavior. It is here that the Second Law can actually become a creative force: by dissipating entropy locally into the environment, a far-from-equilibrium system can spontaneously generate structure, complexity, and even life.
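The thermodynamic shorthand behind this argument is standard in the literature on dissipative structures rather than anything peculiar to war: the entropy change of an open system splits into an internal and an external part,

    dS = d_i S + d_e S,        with d_i S >= 0

Internal entropy production, d_i S, is always nonnegative; that is the Second Law. But the exchange term d_e S can be negative, and when a system is driven hard enough that it exports more entropy than it generates internally, it can, locally and temporarily, become more ordered. That is the sense in which the Second Law, far from equilibrium, becomes a creative force.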

As an open system—continuously exchanging matter, energy, and information with other systems and with the environment at large—war is in a continuous state of flux. It is never at equilibrium, although some manifestations of war may be nearer than others—such as the stalemate of the First World War western front, which may have been as close to thermal equilibrium as any war has ever been. War is driven away from equilibrium by influxes from its environment—in the form of physical matter (or materiel) but also in the form of leadership, political motive, training, creative tactics, or any source of energy or information which tends to inject into the system the capacity to do coherent work. War is damped according to the Second Law and its universal property of entropy—which Clausewitz called "friction"—through the attrition of men and materiel, obviously, but also through fatigue, the loss of morale, poor tactics, uninspired leadership, or any other sump which drains the system of its capacity to do coherent work.

At its most fundamental, war can be thought of as an exchange of matter, information, and especially energy between linked, open hierarchies. Engaging an enemy by fire can be thought of as a transfer of energy from one component to another with the intended result of increasing the entropy of the latter. These exchanges take place in a complex network of simultaneous, distributed linkages between various elements at various levels in each hierarchy. Some of these linkages are tight, some are loose. Some are direct, some are indirect. See figure 2.

Feedback is a pervasive characteristic of practically all open systems, including war. As compared to Newtonian systems, which tend to have minimal feedback mechanisms, war is characterized by a complex, hierarchical system of feedback loops, some designed but many unintended and unrecognized. Whether the loops are positive or negative, the result is that effects feed back into their own causes, undermining any simple proportionality between input and output.
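The basic distinction can be illustrated in a few lines of code. The quantity being fed back and the gains below are purely notional stand-ins; the point is only the shape of the behavior: negative feedback damps a disturbance back toward a reference value, while positive feedback amplifies it.

    # Notional feedback sketch: the same update rule with different gain signs.
    def run(gain, state=1.0, set_point=0.0, steps=6):
        history = [state]
        for _ in range(steps):
            error = state - set_point
            state = state + gain * error   # gain < 0 damps, gain > 0 amplifies
            history.append(round(state, 3))
        return history

    print("negative feedback:", run(gain=-0.5))   # 1.0, 0.5, 0.25, ... (settles)
    print("positive feedback:", run(gain=+0.5))   # 1.0, 1.5, 2.25, ... (runs away)

In war, loops of both kinds are layered on top of one another and entangled, which is why the net effect of any given action is so hard to trace.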

War’s essential dynamic comes from its being a complex, distributed system. Economic theorist F.A. Hayek coined the phrase "extended order" to describe economies driven by individual agents, but the term applies equally to war. War is an extended order: its overall character simply cannot be captured in any one place but emerges from the collective behavior of all the individual agents in the open system interacting locally in response to local conditions and partial information. In this respect, decentralization is not merely one choice of command and control: it is the basic nature of war. Centralized command and control represents an effort to muscle the system into an unnatural position—which is not to say that, given enough energy and effort, it cannot sometimes be made to work after a fashion.

Information in war is, to borrow another of Hayek’s phrases, "essentially dispersed." Again Hayek was writing about economics but he could just as easily have been writing about military command and control:

This dispersed knowledge is essentially dispersed, and cannot possibly be gathered together and conveyed to an authority charged with the task of deliberately creating order.... Much of the particular information which any individual possesses can be used only to the extent to which he himself can use it in his own decisions. Nobody can communicate to another all that he knows, because much of the information he can make use of he himself will elicit only in the process of making plans for action. Such information will be evoked as he works upon the particular task he has undertaken in the conditions in which he finds himself...Only thus can the individual find out what to look for...

The Result: War as a Complex System

According to practically any definition of the term "complexity," war qualifies as a complex phenomenon. In what could qualify as an excellent description of complexity theory, Clausewitz wrote:

The military machine—the army and everything related to it—is basically very simple and therefore seems easy to manage. But we should bear in mind that none of its components is of one piece: each piece is composed of individuals, every one of whom retains his potential of friction...A battalion is made up of individuals, the least important of whom may chance to delay things or somehow make them go wrong.

Complexity theory deals with the study of systems which exhibit complex, self-organizing behavior. A complex system is any system composed of numerous parts, or agents, each of which must act individually according to its own circumstances and requirements, but which by so acting has global effects which simultaneously change the circumstances and requirements affecting all the other agents. Complex systems are based on the individual "decisions" of their numerous agents.

It is not simply the number of parts that makes a system complex (although more parts can certainly contribute to complexity): it is the way those parts interact. A machine can be complicated and consist of numerous parts, but the parts generally interact only in a designed way. This is structural complexity. The type of complexity which most interests us is interactive complexity, in which the parts of a system interact freely in interconnected and unanticipated ways. Each agent within a complex system may itself be a complex system—as in the military, in which a company consists of several platoons and a platoon comprises several squads—creating multiple levels of complexity. But even if this is not so, even if each of the agents is fairly simple in itself, the interaction among the agents creates complexity. This is a significant contradiction of the Newtonian paradigm: simple causes can lead to complicated, disorderly behavior. ("Everything in war is very simple," Clausewitz wrote, "but the simplest thing is difficult.") The result is a system which behaves in nonlinear, complicated, unpredictable and even uncontrollable ways. Each agent often affects other agents in ways that simply cannot be anticipated. With a complex system it is usually extremely difficult, if not impossible, to isolate individual causes and their effects, since the parts are all connected in a complex web. The element of chance, interacting randomly with the various agents, introduces even more complexity and disorder.
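A toy illustration may help. In the sketch below, each "agent" follows one trivial local rule: most of the time it imitates a randomly chosen neighbor, and occasionally it makes an independent choice. Nothing in the rule hints at the shifting blocs and sudden reversals that appear once sixty such agents interact; the aggregate behavior has to be watched rather than deduced. The model is invented purely for illustration and has no military content.

    import random

    # Sixty agents, each holding state "0" or "1". Each step, every agent either
    # copies a random neighbor (usually) or decides independently (rarely).
    random.seed(1)
    N = 60
    agents = [random.choice("01") for _ in range(N)]

    for step in range(12):
        new = []
        for i in range(N):
            if random.random() < 0.05:                  # rare independent "decision"
                new.append(random.choice("01"))
            else:                                       # otherwise imitate a neighbor
                j = (i + random.choice((-1, 1))) % N
                new.append(agents[j])
        agents = new
        print("".join(agents))                          # watch clusters form and drift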

One of the defining features of complex systems is a property known as emergence in which the global behavior of the system is qualitatively different from the behavior of the parts. No amount of knowledge of the behavior of the parts would allow one to predict the behavior of the whole. Emergence can be thought of as a form of control: it allows distributed agents to group together into a meaningful higher-order system. In complex systems, structure and control thus "grow" up from the bottom; they are not imposed from the top. Reductionism simply will not work with complex systems: the very act of decomposing the system—of isolating even one component—changes the dynamics of the system. It is no longer the same system.

War is clearly a hierarchy of complex systems nested one inside another. From the largest military formation down to the individual rifleman, war consists of agents adapting to their environments—which include enemy agents—and in the process changing the environments of all the other agents.

Some of the processes in war may be deterministically predictable, some are deterministically chaotic, and some are probably purely stochastic. There are probably universals—variables or constants which show up in every mix—but no two battles, campaigns, or wars ever exhibit the same mix or system dynamic. Even the same system may behave differently under different regimes or conditions. Under certain parameters—near equilibrium, before bifurcation—the system may actually behave in a fairly Newtonian way. Witness the Gulf War, for example, which I suggest was an unusually linear manifestation of war, in part because of low levels of interaction between the opposing sides. Under other parameters—when the system is forced farther from equilibrium—the same conflict may become very complex or even "go chaotic." The result is an infinitely complicated and continuously changing problem set that qualifies as mathematically unsolvable.
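The point about regimes can be made with the simplest nonlinear system on record, the logistic map, offered here only as an analogy and not as a model of combat. The same one-line rule produces a steady state, a regular oscillation, or chaos depending on nothing more than the value of a single parameter:

    # Logistic map: x -> r * x * (1 - x). One rule, three qualitatively
    # different regimes, selected only by the parameter r.
    def trajectory(r, x=0.2, steps=60):
        values = []
        for _ in range(steps):
            x = r * x * (1 - x)
            values.append(x)
        return values

    for r in (2.8, 3.2, 3.9):          # settles, oscillates, goes chaotic
        tail = [round(v, 3) for v in trajectory(r)[-4:]]
        print(f"r = {r}: final values {tail}")

Near its own version of equilibrium the map is as tame and "Newtonian" as one could wish; push the parameter far enough and the same rule becomes unpredictable in detail.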

Implications

What does all this mean? We know what the command and control implications of Newtonian war are: we have been operating with them for more than a century. But if we treat war as a nonlinear dynamical system, the implications are dramatically different. These implications stem from two fundamental conclusions:

Uncertainty A Sure Thing

Nonlinear dynamics suggests that war is uncertain in a deeply fundamental way. Uncertainty is not merely an initial environmental condition which can be reduced by gathering information. It is not that we currently lack the technology to gather enough information but will someday have the capability. Rather, uncertainty is a natural and unavoidable product of the dynamics of war: action in war generates uncertainty. The only type of war about which we could achieve certainty would be a system at equilibrium, which would not be war at all.

Nonlinear dynamical systems sensitive to initial conditions are intrinsically unpredictable at the microscopic level, but the inability to accurately predict system behavior is not due to insufficient information about the system as was often assumed. Rather, unpredictability is a direct and irreducible consequence of the system’s sensitivity to initial conditions and the nonlinear rules that govern its dynamics. The best we can hope for is to work out probabilities—or, as Hayek suggests, to focus on "prediction of the principle"—and even then the system will surprise us. Promises of a "God’s-eye view" of the battlefield or Admiral Owens’ dream of 95-percent certainty within a 200x200x200-mile battlespace are thoroughly Newtonian concepts that simply do not jibe with the nature of war as a complex phenomenon. The widespread belief that information technology will allow us to blow away the fog of war is a dangerous delusion which fails to understand the complex nature of war.
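The same toy map used earlier makes the point about irreducibility concrete. Two runs that begin one part in a billion apart, an error far below anything a sensor could resolve, retain no resemblance to each other after a few dozen iterations:

    # Sensitivity to initial conditions: two deterministic runs of the logistic
    # map (r = 3.9) whose starting points differ by one part in a billion.
    def iterate(x, r=3.9, steps=50):
        for _ in range(steps):
            x = r * x * (1 - x)
        return x

    print(iterate(0.200000000))   # after 50 steps...
    print(iterate(0.200000001))   # ...a completely different value

No improvement in collection closes this gap; better sensors merely push the divergence a few steps further out.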

Control in War?

Complex systems like war simply cannot be controlled the way machines can. We should not think of command and control as a coercive form of mechanistic control—the way an operator operates a machine. The object of mechanistic command and control is for the top of the organization to be "in control" of the bottom and for the bottom to be "under" the control of the top. The worst thing that can happen is for a commander to "lose" control of the situation. But are the terrain and weather under the commander’s control? Are commanders even remotely in control of what the enemy does? Good commanders may sometimes anticipate the enemy’s actions and may even influence the enemy’s actions by seizing the initiative and forcing the enemy to react to them. But it is a delusion to believe that a commander can really be in control of the enemy or the situation.

Is a kayaker paddling down a raging river really in control of the situation? Does he control the river? Does he really even control his own course? Or does he try to steer his way between and around the rock formations which spell disaster as the rapids carry him along? For the kayaker, success—safely navigating the river—is not a matter of push-button precision. For the kayaker—as for the commander—it is a matter of coping with a changing, turbulent situation. Command in war is less the business of control than it is the business of coping.

Complexity suggests it is a delusion to think that we can be in control in war with any sort of certitude or precision. Complexity further suggests the radical idea that the object of command and control is not to achieve control but to keep the entire organization surfing on the edge of being "out of control" because that is where the system is most adaptive, creative, flexible, and energized.

Macroscopic Command and Control

The turbulence of modern war suggests a need for a looser form of influence—something more akin to the willing cooperation of a soccer team than to the omnipotent direction of the chess player—that provides the necessary parameters in an uncertain, disorderly, time-competitive environment without stifling the initiative of subordinates. Complexity suggests the need for macroscopic command and control. Command and control should not try to impose precise domination over details because the details are inherently uncontrollable. Rather, it should try to provide a broad, meaningful structure to the roiling complexity. Newtonian command and control is microscopic: it attempts to control the system by controlling each particle in the system. Complex war defies microscopic command and control and instead requires macroscopic command and control which "controls" the system by influencing the system parameters and boundary conditions.

Adaptive Command & Control

In a complex, open environment, command and control is fundamentally a process of continuous adaptation. The simple command and control model, the Observation-Orientation-Decision-Action cycle (or OODA loop), essentially describes a process of continuous adaptation to a changing situation. See fig. 3. We might better liken the military organization to a predatory animal—seeking information, learning and adapting in its desire for continued survival—than to some "lean, green machine." Most military actions do not proceed with clockwork mechanics—as "operations"—but instead as "evolutions" along the "edge of chaos."
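Rendered as pseudocode, the OODA loop is not a plan executed to completion but a cycle that never leaves contact with the situation. The function names and the toy "situation" below are hypothetical placeholders; what matters is the structure, in which acting changes the very situation that must be observed next:

    import random

    def ooda_loop(situation, observe, orient, decide, act, done):
        # Continuous adaptation: observe, orient, decide, act -- and repeat,
        # because the act itself alters the situation.
        while not done(situation):
            observations = observe(situation)
            picture = orient(observations)
            decision = decide(picture)
            situation = act(decision, situation)
        return situation

    # Toy usage: drive a drifting "gap" to zero while the environment pushes back.
    random.seed(0)
    final = ooda_loop(
        situation=10.0,
        observe=lambda s: s,
        orient=lambda gap: "too far" if gap > 0 else "close enough",
        decide=lambda view: -1.0 if view == "too far" else 0.0,
        act=lambda step, s: s + step + random.uniform(-0.3, 0.3),
        done=lambda s: s <= 0,
    )
    print("gap closed at", round(final, 2))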

Rather than thinking of "command" and "control" both operating from the top of the organization toward the bottom, we should think of command and control as an adaptive process in which "command" is top-down guidance and "control" is bottom-up feedback. See fig. 4. All parts of the organization contribute action and feedback—"command" and "control"—in overall cooperation. Command and control is thus fundamentally an activity of reciprocal influence involving give and take among all parts, from top to bottom and side to side.

Mission Command & Control

This response to the problem leads to what is known in military terminology as directive or mission command and control, in which control is an emergent property arising spontaneously: unity of effort is not the product of conformity imposed from above but of the spontaneous, purposeful cooperation of the distributed elements of the force. Subordinates are guided not by detailed instructions and control measures but by their understanding of the requirements of the overall mission. Commanders command with a loose rein, allowing subordinates greater freedom of action and requiring them to adapt locally to developing conditions. Mission command and control tends to be decentralized to increase tempo and adaptability. Discipline imposed from above is reinforced with self-discipline throughout the organization. Necessary close coordination is effected locally rather than managed centrally.

The critical challenge in such a system is to create command parameters and other system features which provide the necessary guidance and level of understanding to create unity of effort without unnecessarily constraining the activities of subordinates. In other words, how do we create the modes of agent behavior under which the necessary system control will emerge naturally? Clearly, concepts like Commander’s Intent and Focus of Effort play a key role, as do the extensive education, training, and socialization of individual decision makers.

The Concept Of Evolutions?

Rather than thinking of a military action as an "operation," a predetermined plan unfolding with machinelike order and procedural precision, we should think of the action as an "evolution," a system adapting over time in response to its environment. Better yet, we should think of military action as a form of coevolution, our system evolving in response to what the enemy does and the enemy system evolving at the same time in response to us.

Complexity suggests that, just as evolution does not have a predetermined destination, military plans should not prescribe detailed end-state conditions; the conditions on the ground are always changing in response to developments. We should not think of a plan as a closed-form solution to a problem but as an open architecture which maximizes evolutionary opportunities. A good plan becomes the basis for adaptation through evolution. Planning is "solution by evolution" rather than "solution by engineering."

Synchronization Out Of Sync

One military command and control concept that does not mesh well with complexity theory is synchronization. Synchronization and other Newtonian models are invalid as general operating concepts: they may work moderately well within those narrow parameters under which the system behaves relatively tamely, but synchronization falls flat when faced with a complex system which does not exhibit mechanistic dynamics. In fact, healthy complex adaptive systems tend to behave asynchronously—multiple agents acting independently of one another in response to local conditions. Complexity suggests the superiority of loosely coupled, modular plans which do not rely on synchronized control for their unity of effort. Such plans allow greater latitude in execution and, importantly, are more easily modified and repaired than synchronized ones. Where synchronization occurs, it should be the result of local cooperation between agents rather than of centralized direction.

Satisfice, Don’t Optimize

Complexity suggests it is rarely worth the effort of trying to find the perfect plan or reach the perfect decision. It simply will not happen: there are too many interconnected variables. As John Holland, one of the pioneers of complexity science, has said, in a complex system "there’s no point in imagining that the agents in the system can ever ‘optimize’ their fitness ... The most they can ever do is to change and improve themselves relative to what the other agents are doing." Instead, we should try to satisfice—find a solution that works locally and exploit the results.
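The contrast can be put in a few lines. In the sketch below, the candidate "plans" and their scores are generated at random purely for illustration: the optimizer must examine every option before it can say anything, while the satisficer stops at the first option that clears a "good enough" threshold, trading a little quality for a great deal of time.

    import random

    random.seed(2)
    options = [random.random() for _ in range(10_000)]      # notional candidate plans

    def optimize(options):
        return max(options), len(options)                   # best score, options examined

    def satisfice(options, good_enough=0.9):
        for examined, score in enumerate(options, start=1):
            if score >= good_enough:
                return score, examined                      # first workable answer
        return max(options), len(options)                   # fall back if none qualify

    print("optimize :", optimize(options))
    print("satisfice:", satisfice(options))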

Excellence Can Only Start At The Bottom

Evolution moves from the simple to the complex. Healthy complex systems evolve by chunking together healthy simpler systems. Attempts to design large, highly complex organizations from the top down rarely, if ever, work. This merely confirms what successful military organizations have long recognized: success starts at the small-unit level. Build strong, adaptable squads and sections first. Train and equip them well—which includes giving them ample time to train themselves (i.e., to evolve). Give them the very best leaders. Give those leaders the freedom and responsibility to lead (i.e., let them act as independent agents). Then chunk the teams and squads together into increasingly larger units.

In Closing: Continuous Adaptation

The physical sciences have dominated our world since the days of Newton. Moreover, the physical sciences have provided the mechanistic paradigm that frames our view of the nature of war. While some systems do behave mechanistically, the latest scientific discoveries tell us that most things in our world do not function this way at all. The mechanistic paradigm no longer adequately describes our world—or our wars. Complex systems—including military organizations, military evolutions, and war—most definitely do not behave mechanistically. Enter complexity.

Complexity encourages us to consider war in different terms which in turn point to a different approach to the command and control of military action. It will be an approach that does not expect or pursue certainty or precise control but is able to function despite uncertainty and disorder. If there is a single unifying thread to this discussion, it is the importance of adaptation, both for success on the battlefield and for institutional survival. In any environment characterized by unpredictability, uncertainty, fluidity, and rapid change, the system that can adapt best and most quickly will be the system that prevails. Complexity suggests that the single most important quality of effective command and control for the coming uncertain future will be adaptability.

