Part five in the series of articles on Critical Thinking.
Large Scale Systems are strange things indeed. They can do that which otherwise cannot be done, and thus they populate our modern world, from economies to military forces to modern aviation, medicine, transportation, and the technology sectors; the list goes on. They impact our lives on a daily basis, and yet we know very little about them. This is due largely to the fact that Large Scale Systems behave in ways that fall outside our zone of comprehension, and to the fact that, for some unknown reason, our educational institutions prefer not to teach the generalized, overarching subjects, such as Critical Thinking and General Systems Theory, that would deliver a critical understanding of such systems.
To clarify what we mean, let us first revisit the definition of what is often termed Large Scale Systems. These entities go by a number of names, such as large scale dynamic systems, complex systems, global systems, Superfunction, and the big picture. We will group all of these under "Large Scale Systems" for convenience. Thus we can say that Large Scale Systems are inherently complex because they consist of large numbers of interacting variables. These sets of interacting variables are difficult to understand and are often ignored by conventional math and science, with exceptions such as the military science developed by Colonel John Boyd. Scott Page and other members of the Santa Fe Institute have pointed out, for example, that conventional decision theory does little to address the optimization of the performance of large scale systems, concentrating instead on limited sets of components and players and adding little to our understanding of how large scale systems actually perform.
Our tendency is to jump to conclusions and focus on the most convenient plausible factor; yet J. W. Forrester points out that in large scale systems the observable behavior is often counter-intuitive, that is, the plausible tends to be wrong.
This problem of the plausible often being wrong is easy to understand, however, once we appreciate the unusual behavior of large scale systems. The behavioral attributes of these systems are unpredictable, novel, and sometimes even bizarre. This is because of a characteristic unique to large scale systems referred to as "emergence". These systems exhibit behavior that is non-intrinsic to their constituent parts, behavior that manifests itself only at the global level. Thus large scale systems exhibit system-component discontinuity and consequently cannot be understood using conventional analysis that focuses on the components of the system. We can say, therefore, that our first thinking error is to believe that a component orientation, while plausible, will actually be helpful, leading us into an arena where "the tyranny of the plausible" becomes our biggest challenge.
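To make "emergence" concrete, here is a minimal sketch in Python of Thomas Schelling's well-known segregation model (an illustration of the general idea, not an example drawn from this series). Each agent follows a mild local rule, yet the grid as a whole develops strong clustering that no individual rule specifies; the global pattern is non-intrinsic to the parts.

```python
# A minimal Schelling segregation model: each agent is content if at least
# 30% of its neighbors share its type, and moves to a random empty cell if
# not. No agent "wants" segregation, yet the grid as a whole clusters.
import random

SIZE, EMPTY, THRESHOLD = 20, 0.1, 0.3
grid = [[random.choice([1, 2]) if random.random() > EMPTY else 0
         for _ in range(SIZE)] for _ in range(SIZE)]

def unhappy(r, c):
    kind = grid[r][c]
    like = total = 0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == dc == 0:
                continue
            nr, nc = (r + dr) % SIZE, (c + dc) % SIZE
            if grid[nr][nc]:
                total += 1
                like += grid[nr][nc] == kind
    return total > 0 and like / total < THRESHOLD

for step in range(50):                       # let the system evolve
    movers = [(r, c) for r in range(SIZE) for c in range(SIZE)
              if grid[r][c] and unhappy(r, c)]
    empties = [(r, c) for r in range(SIZE) for c in range(SIZE)
               if not grid[r][c]]
    random.shuffle(empties)
    for (r, c), (er, ec) in zip(movers, empties):
        grid[er][ec], grid[r][c] = grid[r][c], 0

# Global measure: fraction of adjacent same-type pairs (right and down).
pairs = same = 0
for r in range(SIZE):
    for c in range(SIZE):
        if not grid[r][c]:
            continue
        for nr, nc in ((r, (c + 1) % SIZE), ((r + 1) % SIZE, c)):
            if grid[nr][nc]:
                pairs += 1
                same += grid[nr][nc] == grid[r][c]
print(f"same-type neighbor share: {same / pairs:.2f}")
```

Typical runs push the same-type neighbor share well above the roughly 0.5 expected of a random arrangement, even though no individual agent demands a segregated neighborhood. That gap between the local rule and the global outcome is emergence.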
So what about the plausible? Well, with respect to large scale systems, the plausible will almost always be wrong, leading to misdirection, confusion, and catastrophe; much like what we see going wrong with many of our federal, state, and local programs.
The problem of the plausible being wrong confronted us head-on at the national level during the Vietnam conflict, where it was assumed that advanced technology alone was our major force multiplier, and that focused operational training and operational protocols could therefore be deemphasized. Specifically, advancing missile technology would, quite plausibly, replace air combat maneuvering: air-combat vehicles would henceforth be missile-carrying platforms launching beyond-visual-range weapons, entirely replacing the need for projectile weapons and close-in maneuvering. Or so this was believed by many to be true.
But consistent with the tyranny of the plausible, this turned out to be profoundly wrong, prompting a complete redesign of air combat training that reemphasized optimizing maneuver performance and ushered in the age of Top Gun and Red Flag.
The tyranny of the plausible confronted aviation once again in the 1980s, this time in the commercial sector. Up to this time it was believed by most experts that advancing technology would be the primary factor in reducing critical events and improving flight safety. Thus significant resources were expended on components of the air transport system, such as advanced simulation, weather radar, ground proximity warning, and so on. Yet, and this was a teaching moment for all of us, operational performance did not improve. It was almost as if the large scale system we were addressing was trying to tell us something: namely, that working on one component, or a few, in the hope of improvement will have no noticeable effect on mission performance. Remember our previous discussion of non-intrinsic (emergent) properties.
The commercial aviation industry then began to focus on systems-level performance and mission success, with impressive results. This led, among other things, to the application of LOFT (Line Oriented Flight Training), in other words, total mission training and evaluation.
So as operators, managers, and concerned citizens, where does this leave us? Are we destined merely to muddle through, accepting failure at every turn because of our attraction to the plausible?
Fortunately, this is where Critical Thinking comes to the rescue, providing us with effective reasoning skills that transcend the ordinary and formulate solutions to complex problems that would otherwise remain unsolved. Critical Thinking recognizes that things may not be what they initially appear to be, and that to avoid the tyranny of the plausible, one must spend quality time considering that which will "clarify the actual". In clarifying the actual, certain considerations are encouraged to help us understand that while complexity is difficult, and will often not yield to conventional, mainstream ideas, it does open our zone of comprehension to ingenuity and innovation. When we clarify the actual, we activate reasoning protocols that at once reject component-level thinking and embrace systems-level thinking, which in turn considers global attributes. By employing key principles of effective reasoning contained within the Critical Thinking model, we expand our zone of comprehension to formulate non-conventional approaches to the behavioral expectations of large scale systems, and to determine that which will optimize their performance.
Spoiler alert: large scale systems cannot be optimized by focusing only on the component level. While that focus may lend itself to elegant mathematical formulations, it is not productive for the purpose of optimizing the performance of such systems, because large scale systems exhibit non-intrinsic global attributes, often leading to novel behavior.
Because large scale systems are, in a rather bizarre way, functionally discontinuous yet dynamic entities, manipulating (or tinkering with) a single component will have no measurable positive effect. The implications of this particular property are enormous, not the least of which is that we must almost entirely abandon the plausible but wrong-headed approach that conceptualizes a world view in which linear singularities prevail and efforts to coherently model system behavior are ignored. In other words, we must think about the model of the system first, before we begin to think about its constituent parts.
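As a minimal illustration of "model the system first", consider the following Forrester-style stock-and-flow sketch in Python (a toy of my own construction, not a model from this series): an inventory drained by demand and refilled through an ordering delay. Adjusting the component-level knob (the ordering aggressiveness) changes the size of the swings but not their existence; removing the structural delay removes them.

```python
# A toy Forrester-style stock-and-flow loop (illustrative only): inventory
# is drained by demand and refilled by orders that arrive after a delay.
from collections import deque

def simulate(delay_weeks, adjust_rate, weeks=60):
    inventory, target, demand = 100.0, 100.0, 10.0
    pipeline = deque([demand] * delay_weeks)   # orders already in transit
    history = []
    for week in range(weeks):
        if week == 5:
            demand = 15.0      # a one-time demand step perturbs the loop
        inventory += pipeline.popleft() - demand
        # Order to cover demand plus a correction toward the target stock.
        pipeline.append(max(0.0, demand + adjust_rate * (target - inventory)))
        history.append(inventory)
    return history

# Tinkering with the component-level knob (adjust_rate) while the structural
# delay remains: the swings persist. Shorten the delay and they settle fast.
for delay in (1, 4):
    for rate in (0.5, 1.0):
        h = simulate(delay, rate)
        print(f"delay={delay} wk, adjust_rate={rate}: "
              f"swing={max(h) - min(h):.1f}")
```

The point of the sketch is structural: the oscillation is a property of the whole feedback loop, and no amount of adjusting a single component removes it. One must see the loop, the model of the system, before the parts make sense.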
Captured by the tyranny of the plausible, our public policy occupies itself with the manipulation of individual components of the national economic system, such as interest rates, bank reserves, the minimum wage, subsidized commodities, and the like. This will, as we have shown, have no measurable positive effect on the performance of the economic system. The question that arises, therefore, is whether anyone in public office knows what the economic prosperity model of America is, and how one would optimize its performance.
To summarize: in "clarify the actual" we must focus our attention on what it will take to optimize overall system performance, instead of focusing on a few selected components, and thereby achieve mission success. This should be our primary area of inquiry; it must recognize the defining characteristics of large scale systems and implement that which will optimize mission success, as implausible as it may first appear. But the first thing we must insist on as critical thinkers is that we accurately define the problem (i.e., clarify the actual) for which we are seeking a solution. Do this without referencing any possible solutions. This will, more than anything else, help us in our problem-solving efforts. Remember that our efforts are often doomed to failure because we "apply the right solution to the wrong problem". By right solution I mean the one most plausible, particularly among the so-called experts. By wrong problem I mean exactly that: we are working on the wrong problem.
Insist that a problem definition be formally specified. When someone, or more importantly a political operative, offers a government intervention policy, demand that a formal statement of the problem be created and distributed to all concerned, something like the sketch below. This will have the side benefit of immediately reducing the number of attempted government interventions by a considerable margin.
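What might such a formal statement look like? Here is a hypothetical Python sketch (the field names and the crude solution-word check are my own illustration, not a prescribed standard) of a problem definition that is forced to stay solution-free:

```python
# A hypothetical template for a formal, solution-free problem statement.
# Field names and the crude keyword guard are illustrative, not a standard.
from dataclasses import dataclass, field

@dataclass
class ProblemStatement:
    observed_behavior: str                 # what the system actually does
    desired_behavior: str                  # the mission-level outcome sought
    system_boundary: str                   # which interacting parts are in scope
    evidence: list = field(default_factory=list)  # measurements, not opinions

    def __post_init__(self):
        # A rough guard against smuggling a favored remedy into the problem.
        banned = ("solution", "implement", "deploy", "fund", "pass a law")
        text = (self.observed_behavior + " " + self.desired_behavior).lower()
        if any(word in text for word in banned):
            raise ValueError("State the problem without referencing solutions.")
```

Crude as the keyword guard is, the exercise it enforces, stating observed versus desired behavior and the system boundary before any remedy is named, is precisely the discipline being asked for here.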
So enough for now. Next we will cover even more fascinating aspects of critical thinking as it applies to our efforts to optimize the performance of large scale systems subject to human control.
Captain Smith is the author of the "Commitment to Reason" series of books on Critical Thinking. The series is offered by Tate Publishing and Enterprises, LLC, and is available wherever fine books are sold.
Captain Smith is also the author (along with Stephane Larrieu) of "Mission Adaptive Display Technologies and Operational Decision Making in Aviation", published by IGI Global and available nationally. Please visit: IGI-Global.com. This important book takes Critical Thinking to a new level and introduces new forms of adaptive systems.