This principle has been attacked as unprofessional. What do you think? In conventional systems engineering situations, someone who acts unassertively gets steamrolled by those who tout their solution to a problem with arrogance and confidence. When such dictated approaches do not work, people become less enamored with simple fixes. Thereafter, improvements are pursued more thoughtfully and exercised with greater caution. One must watch what happens and be prepared to try something else. However, one is rarely sure just how long to wait before making the next intervention.
Concentrate on the complex system taken as a whole. For example, endeavor to understand how the behavior of the entire system explains the roles of its components, e.g., sub-systems. This may lead to insights about emergent properties that cannot be predicted by the antithesis of holism, namely, reductionism. Reductionism alone cannot accomplish such goals. By the time one subdivides the problem, works on optimizing each resulting sub-system, and reassembles the parts, the complex system and its environment have moved on, and little will perform as desired. This is a fundamental problem with government system acquisitions that take many years to accomplish. For example, many overly ambitious weapons programs have been cancelled after billions of dollars have been expended (White 2008).
Optimization may be impossible in a mathematical sense. Optimizing sub-systems can detract from the potential efficacy of the whole. Instead, try to balance the thrusts of the various sub-systems. In an automobile enterprise, if manufacturing and sales are each rewarded simply for volume, either more cars will be produced than can be sold or so many cars will be sold that manufacturing cannot keep up. Reward collaborations that keep manufacturing and sales abreast of each other while increasing both production and sales.
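The manufacturing-and-sales example can be made concrete with a toy model; all numbers, capacities, and cost terms below are assumptions for illustration, not from the text. Each sub-system rewarded for its own volume is compared with a joint reward for balance.

```python
def enterprise_profit(produced, demand_generated):
    """Whole-system view: revenue from cars actually sold, minus
    holding cost for unsold inventory and goodwill cost for backlog."""
    sold = min(produced, demand_generated)
    revenue = 20 * sold                                  # assumed margin per car
    holding = 5 * max(0, produced - demand_generated)    # unsold cars pile up
    goodwill = 3 * max(0, demand_generated - produced)   # unmet demand costs trust
    return revenue - holding - goodwill

CAPACITY, MAX_DEMAND = 100, 50    # assumed plant and market limits

# Sub-system rewards: each function maximizes its own volume.
local = (CAPACITY, MAX_DEMAND)    # manufacturing floors it; sales sells all it can

# Whole-system reward: jointly pick the balanced pair.
joint = max(((p, d) for p in range(0, CAPACITY + 1, 10)
                    for d in range(0, MAX_DEMAND + 1, 10)),
            key=lambda pd: enterprise_profit(*pd))

print("local optima :", local, "->", enterprise_profit(*local))
print("joint optimum:", joint, "->", enterprise_profit(*joint))
```

With these assumed numbers, the locally rewarded pair (100, 50) yields 750 while the balanced pair (50, 50) yields 1000: each sub-system's maximum is not the enterprise's.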
Most engineers think of systems engineering as multi-disciplinary, with the fields of sensing, information processing and computing, communicating, networking, and the hard sciences of physics and mathematics coming together. In Complex Systems Engineering (CSE), people are considered part of the system. People are difficult, if not impossible, to model or control. Hence, "trans-disciplines," namely, the soft sciences, such as philosophy (Boardman and Sauser 2008), psychology, sociology, organizational change theory, economics, and politics, should be considered.
Embrace POET (Political, Operational, Economic, and Technical) Aspects
Let's face it. In the world's most pressing problems, politics and economics play critical roles, in addition to operational procedures and technical means. CSE must deal with all four POET aspects or results will be unsatisfactory. One may devise a great technical solution that could improve operations, but this will not go anywhere without (political) acceptance from stakeholders. Understand your stakeholders' values to establish win-win scenarios. Someone also needs to agree to pay (economically) for the improvements.
First, realize that every person sees things differently (McCarter and White 2009). No one has an exclusive grasp of the truth about complex systems. Better solutions are attained by leveraging a large group's cognitive diversity than by a panel of experts (Page 2007). The wisdom of the crowd only works when the crowd members do not know each other's opinions (i.e., there is independence of thought). Diversity of perspectives enhances creative problem solving but requires a good group facilitator to ensure that the disparate voices and opinions are heard and shared. In systems engineering we spend too much time arguing over definitions instead of seeking to understand how we use words. Only after this mutual understanding is attained can a group make real progress.
The great emphasis placed on identifying and mitigating risks often comes at the expense of pursuing opportunities. In a complex system, such as a military enterprise, the principal risk is failing to pursue opportunities (White 2011a). Yet a balance must be struck. With many opportunities, the initial expectations of a profitable business relationship can be too high. When visiting a new company, what if grass is taking over the parking lot? Share such impressions of potential prospects in meetings, but do not give away all your advice (especially your intellectual property) for nothing. If the engagement succeeds, no task needing attention is too small; it could lead to other opportunities.
Knowing when to make what decisions is a formidable challenge in managing uncertainty. Think in terms of pattern recognition and the general rules of behavior that underlie complex human systems. However, be wary of the effects of outlier behavior on the systems as well as on your heuristics. Those in authority must make important decisions, at least occasionally, because it is part of their job. Some excellent work has been accomplished in formulating heuristics to help decision makers (Maier and Rechtin 2009). Nevertheless, heuristics remain a fertile research area. System dynamics was invented at MIT by Jay Forrester in the late 1950s (Meadows 2008). Here the importance of time delays is paramount. What initially might seem to be positive effects from your system interventions may ultimately prove to be negative, and vice versa. Insist on believable credentials (e.g., a simple heuristic) before engaging. Ask yourself whether the decision makers really get it. Even if there is no burning platform, are decision makers in enough pain, because their systems are not working, to be willing to break their mold and try something different? Heed early signs that involvements are not gelling. Test first impressions while noting whether promises were made good.
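The role of time delays can be sketched with a minimal stock-management model in the system dynamics spirit; the target, delay, gain, and demand figures are all invented for illustration. A manager orders toward a target inventory, but deliveries arrive only after a delay, so the correction that first looks right later overshoots.

```python
from collections import deque

TARGET, DELAY, GAIN, DEMAND = 100.0, 4, 0.5, 10.0   # assumed parameters
inventory, trace = 40.0, []
pipeline = deque([0.0] * DELAY)        # orders in transit: the time delay

for week in range(40):
    inventory += pipeline.popleft()    # delayed deliveries finally arrive
    inventory -= DEMAND                # steady weekly consumption
    # naive correction: replace demand plus a fraction of the shortfall,
    # ignoring everything already ordered but not yet delivered
    order = max(0.0, DEMAND + GAIN * (TARGET - inventory))
    pipeline.append(order)
    trace.append(inventory)

print(f"peak inventory {max(trace):.1f} vs target {TARGET:.0f}")
```

The early weeks look like the intervention is working (inventory climbs toward the target), but the orders still in the pipeline then drive inventory far past it, illustrating the "positive then negative" reversal.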
How can one expect to interact with stakeholders productively without mutual trust? Establishing trust is difficult, takes a lot of time, and can be lost immediately if a precipitous event is handled poorly. Once lost, trust is also hard to recover. It is imperative to share information; otherwise, progress will suffer. Do not adopt the usual mantra that information is power, nor follow most organizational cultures in protecting information lest you be punished. Instead, try sharing business information, albeit to a limited extent at first. If the other party reciprocates so that you learn more, great! Then share more and more, openly and honestly.
Create an Interactive Environment
Leaders should not try to drive solutions from the top, for they cannot always know what to do. Instead, continually strive to establish and maintain conditions (e.g., a vision of cooperative interactions and suitable reward structures for pursuing it) that ensure informed, vigorous, and sustained engagements among the troops. When people play nicely, remarkable improvements are more likely than under autocratic rule. If there is more conflict than competition and collaboration, then either the leader has failed to convey the vision or the incentives are inappropriate and need to be modified. Embrace the responsibility both to act and to be responsive. Do not follow rules slavishly, but do not fight the system, especially about what you cannot even influence. It may be worthwhile to solicit the inputs of external observers as a check on the organization's direction or degree of success.
The following quotations provide support for the power of relationships, interactive environments, self-organization, and diversity:
When the business was suddenly confronted with a crisis that could destroy it, people came together in a different way and achieved extraordinary things. ... [W]e stopped most of the stupid games we were playing and worked together purposefully. ... The teams then did the work and made many decisions on their own as they went. They often saw better ways to do things and did them. ... Everyone had access to everyone. As we shared the information about our progress, we just kept learning from our mistakes and successes. [A]ll the systems were full of constant feedback. ... [W]e co-created our future together. People didn't resist the changes they were helping to create. (Knowles 2002, 2, 26)
... diversity and accuracy contribute equally to collective predictive performance, and that a crowd's collective prediction must always be at least as good as the average prediction of a member of the crowd. ... three core claims: (1) Diverse perspectives and tools enable collections of people to find more and better solutions and contribute to overall productivity; (2) Diverse predictive models enable crowds of people to predict values accurately; and (3) Diverse fundamental preferences frustrate the process of making choices. (Page 2007, 10, 25)
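Page's claim that the crowd is never worse than its average member can be checked numerically. The identity behind it is that collective squared error equals average individual squared error minus the diversity of the predictions; the "truth" and the five forecasts below are made-up numbers for illustration.

```python
truth = 42.0
predictions = [35.0, 40.0, 49.0, 55.0, 33.0]   # five hypothetical forecasters

crowd = sum(predictions) / len(predictions)     # the collective prediction
collective_error = (crowd - truth) ** 2
avg_individual_error = sum((p - truth) ** 2 for p in predictions) / len(predictions)
diversity = sum((p - crowd) ** 2 for p in predictions) / len(predictions)

print(f"collective error {collective_error:.2f} = "
      f"{avg_individual_error:.2f} (avg individual) - {diversity:.2f} (diversity)")
```

Because diversity is never negative, the collective error can never exceed the average individual error, which is exactly why cognitive diversity carries real predictive weight.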
A hallmark of complex systems is self-organization among their living elements. This is particularly true in natural complex systems such as beehives, anthills, bird flocks, fish schools, and so forth. Human languages also self-organize strongly, based on how people talk. The powerful human rights demonstrations in the Middle East in early 2011 were self-organized using social networking.
Seek Simple Elements
Another property of many complex systems in nature is the simplicity of the participating elements. For example, it is amazing what can be accomplished in beehives and anthills, within their robust societies, when each worker bee or ant seems so limited in capabilities. Imagine what humans might be able to do when they interact in self-organizing ways! One difficulty with intended systems engineering solutions to complex problems is the size and complexity (or maybe just complication) of the individual pieces. Current chaordic (simultaneously chaotic and ordered) efforts also point to this. It might be better to design at a smaller scale and assemble arrangements of small, identical units that are good at adapting.
Some complex systems consist of simple elements, but admittedly many complex systems consist of complex elements. The implication of this "seek simple elements" principle is to consider how one's organization can be transformed so that it has fewer groupings of simpler elements.
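A classic computational illustration of simple, identical elements producing rich aggregate behavior (not drawn from the text, but from the cellular-automaton literature) is Wolfram's rule 30: every cell applies the same three-neighbor rule, yet the whole pattern is intricate and hard to predict.

```python
RULE, WIDTH, STEPS = 30, 31, 15    # Wolfram's rule 30 on a small ring of cells

def step(row):
    """Every cell applies the same tiny rule to itself and its two neighbors."""
    n = len(row)
    return [(RULE >> (4 * row[(i - 1) % n] + 2 * row[i] + row[(i + 1) % n])) & 1
            for i in range(n)]

row = [0] * WIDTH
row[WIDTH // 2] = 1                # a single "on" cell to start
for _ in range(STEPS):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```

Each unit is as limited as a single worker ant, and the rule fits in one line; the complexity lives entirely in the interactions.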
Enforce Layered Architecture
Layering is applied to increase flexibility or to introduce system improvements following changes in environment or implementation technology, for example. What might be better realized in software in one era may be better done in hardware in the next, and vice versa. Each layer is confined to a set of closely knit basic functions, grouped in categories such as applications, networking, communication links, or physical implementations. The interfaces between layers are kept simple and stable, while the realization within a given layer can be adapted more readily to different conditions. As long as the interfaces to that layer remain unchanged, the system continues to operate effectively.
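The stable-interface idea can be sketched in code; the class and method names below are invented for illustration. The application layer depends only on the link layer's interface, so the realization beneath it (an in-memory loopback today, hardware tomorrow) can change without touching the layer above.

```python
from abc import ABC, abstractmethod

class Link(ABC):
    """The stable interface between the application layer and the link layer."""
    @abstractmethod
    def send(self, payload: bytes) -> None: ...
    @abstractmethod
    def receive(self) -> bytes: ...

class LoopbackLink(Link):
    """One realization: a software link that queues payloads in memory."""
    def __init__(self):
        self._queue = []
    def send(self, payload: bytes) -> None:
        self._queue.append(payload)
    def receive(self) -> bytes:
        return self._queue.pop(0)

class Application:
    """Upper layer: written against Link, never against a realization."""
    def __init__(self, link: Link):
        self.link = link
    def echo(self, message: str) -> str:
        self.link.send(message.encode())
        return self.link.receive().decode()

app = Application(LoopbackLink())
print(app.echo("hello"))
```

Swapping in a different Link subclass (say, one wrapping a radio) changes nothing in Application, which is precisely the flexibility layering buys.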
Even complex organizations might be layered to better effect. Members performing similar functions would be tightly coupled within a given layer for high levels of interaction. Each set of fundamentally different functions would be part of its own layer. The various layers would also interact among themselves but with significantly looser coupling.