The Foundation. System Focus
Summary
Safety must be considered in the context of the overall system, not isolated individuals, parts, events or outcomes
Most problems and most possibilities for improvement belong to the system. Seek to understand the system holistically, and consider interactions between elements of the system
"In a system, everything is connected to something; nothing is completely independent." Image: NATS Press Office CC BY-NC-ND 2.0
When one spends any time in an organisation, it is clear that nothing works in isolation. Rather, things work within connected and interacting systems. In a tower and approach unit, for instance, controllers, assistants, engineers, supervisors, managers, and support staff routinely interact with each other and with others such as pilots, drivers and airport staff. People interact with various types of equipment, with much information, and with many procedures. The same applies in an area control operations room, in an equipment room, in an administrative centre, or in a boardroom. In a system, everything is connected to something; nothing is completely independent.
These connections and interactions, along with a purpose, characterise a system. More formally, a system can be described as "a set of elements or parts that is coherently organized and interconnected in a pattern or structure that produces a characteristic set of behaviours, often classified as its 'function' or 'purpose'"[1]. In service organisations, this purpose must be understood from the customer's viewpoint. This means that the customer and their needs must be understood. System performance can then be evaluated against achievement of purpose.
In practice, what constitutes a system is relative, because the boundaries of systems are not fixed and are often unclear; essentially, they are where we choose to draw them for a purpose (e.g. safety investigation, system assessment, design), and people and information cross system boundaries. There are, therefore, multiple perspectives on a system and its boundary, and sometimes its purpose. In a sense, a 'system' is a social construct defined by what it does, not a thing 'out there' that is defined by what it is.
We might think about systems at various levels. Operationally, we may consider the working position or sector. At a higher level we may consider an Ops room, a centre, an organisation, airspace or the aviation system. Systems exist within other systems, and exist within and across organisational boundaries.
While some system components are more visible, others are less visible to the observer. Less visible parts of the system include organisational elements (such as goals, rosters, incentives, rules), and political and economic elements (such as pressures relating to runway occupancy, noise abatement, and performance targets). Again, these interact to form a complex whole.
Despite these interactions, the way that we try to understand and manage socio-technical system performance is often on a component level (a person, a piece of equipment, a unit, a department, etc). A focus on component performance is common in many standard organisational practices. At an individual level, it includes incident investigations that focus only on the controller's performance, behavioural safety schemes that observe individual compliance with rules, individual performance reviews, incentive schemes, etc. The assumption is that if the person tried harder, paid closer attention, and did exactly what was prescribed, then things would go well. However, as the management thinker W. Edwards Deming observed, "It is a mistake to assume that if everybody does his job, it will be all right. The whole system may be in trouble"[2]. Organisational theorist Russell Ackoff added that "it is possible to improve the performance of each part or aspect of a system taken separately and simultaneously reduce the performance of the whole" (p. 36) [3]. A focus on components becomes less effective with increasing system complexity and interactivity.
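Ackoff's point can be made concrete with a small worked example (not from the White Paper). In the Python sketch below, with purely hypothetical numbers, a single processing stage improves its own local metric (effort per job) by batching work, while the end-to-end time experienced by the customer gets worse: the part is 'optimised', the whole is degraded.

```python
# Toy illustration of local optimisation degrading the whole (hypothetical figures):
# jobs arrive one per hour; the stage needs 0.5 h of setup per batch plus 0.5 h per job,
# and measures its own efficiency as processing effort per job. Customers care about
# end-to-end time from arrival to completion.

def run(batch_size, n_jobs=8):
    arrivals = [float(i) for i in range(n_jobs)]        # one arrival per hour
    finish, clock = [], 0.0
    for start in range(0, n_jobs, batch_size):
        batch = arrivals[start:start + batch_size]
        clock = max(clock, batch[-1])                   # wait until the batch has filled
        clock += 0.5 + 0.5 * len(batch)                 # one setup, then per-job work
        finish.extend([clock] * len(batch))
    per_job_effort = 0.5 / batch_size + 0.5             # the stage's local metric
    lead_time = sum(f - a for f, a in zip(finish, arrivals)) / n_jobs
    return per_job_effort, lead_time

for b in (1, 8):
    effort, lead = run(b)
    print(f"batch={b}: stage effort/job={effort:.2f} h, avg end-to-end time={lead:.2f} h")
```

Running this shows the stage's effort per job falling as batches grow, while the average end-to-end time for the customer rises sharply: exactly the kind of component-level 'improvement' that Deming and Ackoff warn against.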
The term 'complex system' is often used in aviation (and other industries), and it is important to consider what is meant by this. According to Snowden and Boone[4], complex systems involve large numbers of interacting elements and are typically highly dynamic and constantly changing with changes in conditions. Their cause-effect relations are non-linear; small changes can produce disproportionately large effects. Effects usually have multiple causes, though causes may not be traceable and are socially constructed. In a complex system, the whole is greater than the sum of its parts and system behaviour emerges from a collection of circumstances and interactions. Complex systems also have a history and have evolved irreversibly over time with the environment. They may appear to be ordered and tractable when looking back with hindsight. In fact, they are increasingly unordered and intractable. It is therefore difficult or impossible to decompose complex systems objectively, to predict exactly how they will work with confidence, or to prescribe what should be done in detail.
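One property mentioned above, non-linearity, can be illustrated with a deliberately abstract example that is not drawn from the White Paper. The logistic map is a standard toy non-linear system; in the sketch below, a change of one part in a million in the starting condition grows into a large difference within a few dozen steps.

```python
# Non-linearity in miniature: the logistic map x_{n+1} = r * x_n * (1 - x_n).
# With r = 3.9 the map behaves chaotically, so tiny differences in conditions
# are amplified rather than averaged away.

def trajectory(x0, r=3.9, steps=30):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = trajectory(0.400000)
b = trajectory(0.400001)   # a change of one part in a million
print(f"difference at start:        {abs(a[0] - b[0]):.1e}")
print(f"difference after 30 steps:  {abs(a[-1] - b[-1]):.3f}")
```

Real socio-technical systems are of course far richer than a one-line equation, but the same qualitative lesson applies: small changes can produce disproportionately large effects, and outcomes cannot be predicted by examining components in isolation.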
This state of affairs differs from, say, an aircraft engine, which we might describe as 'complex' but is actually ordered, decomposable and predictable (with specialist knowledge). Some therefore term such systems 'complicated' instead of complex (though the distinction is not straightforward).
While machines are deterministic systems, organisations and their various units are purposeful 'socio-technical systems'. Yet we often treat organisations as if they were complicated machines, for instance by:
- assuming fixed and universal goals;
- analysing components using reductionist methods;
- identifying 'root causes' of problems or events;
- thinking in a linear and short-term way;
- judging against arbitrary standards, performance targets, and league-tables;
- managing by numbers and outcome data; and
- making changes at the component level.
As well as treating organisations like complicated machines, we also tend to lose sight of the fact that our world is changing at great speed, and accelerating. This means that the way that we have responded to date will become less effective. Ackoff noted that "Because of the increasing interconnectedness and interdependence of individuals, groups, organizations, institutions and societies brought about by changes in communication and transportation, our environments have become larger, more complex and less predictable - in short, more turbulent"[3]. We must therefore find ways to understand and adapt to the changing environment.
Treating a complex socio-technical system as if it were a complicated machine, and ignoring the rapidly changing world, can distort the system in several ways. First, it focuses attention on the performance of components (staff, departments, etc), and not the performance of the system as a whole. We tend to settle for fragmented data that are easy to collect. Second, a mechanical perspective encourages internal competition, gaming, and blaming. Purposeful components (e.g. departments) compete against other components, 'game the system' and compete against the common purpose. When things go wrong, people retreat into their roles, and components (usually individuals) are blamed. Third, as a consequence, this perspective takes the focus away from the customers/service-users and their needs, which can only be addressed by an end-to-end focus. Fourth, it makes the system more unstable, requiring larger adjustments and reactions to unwanted events rather than continual adjustments to developments.
A systems viewpoint means seeing the system as a purposeful whole - as holistic, and not simply as a collection of parts. We try to "optimise (or at least satisfice) the interactions involved with the integration of human, technical, information, social, political, economic and organisational components"[5]. Improving system performance - both safety and productivity - therefore means acting on the system, as opposed to 'managing the people'[6].
With a systems approach, different stakeholder roles need to be considered. Dul et al[7] identified four main groups of stakeholders who contribute or deliver resources to the system and who benefit from it: system actors (employees and service users), system designers, system decision makers, and system influencers. These four groups are the intended readers of this White Paper. As design and management become more inclusive and participatory, roles change and people span different roles. Managers, for instance, become system designers who create the right conditions for system performance to be as effective as possible.
The ten principles give a summary of some of the key tenets and applications of systems thinking for safety that have been found useful to support practice. The principles are, however, integrative, derived from emerging themes in the systems thinking, systems ergonomics, resilience engineering, social science and safety literature. The principles concern system effectiveness, but are written in the context of safety to help move toward Safety-II (EUROCONTROL, 2013[8]; Hollnagel, 2014a[9]; Hollnagel, 2014b[10]). Safety-II aims to 'ensure that as many things as possible go right', with a focus on all outcomes (not just accidents). It takes a proactive approach to safety management, continuously anticipating developments and events. It views the human as a resource necessary for system flexibility and resilience. Such a shift is necessary in the longer term, but there is a transition, and different perspectives and paradigms are needed for different purposes[1].
Each principle is described along with some practical advice for various types of safety-related activities. 'Views from the field' are included from stakeholders - front-line to CEO - to give texture to the principles from different perspectives. There are some longer narratives to give an impression of how safety specialists have tried to apply some of the principles in their work. Since the principles interrelate and interact, we have tried to describe some interactions, but these will depend on the situation and we encourage you to explore them.
Ultimately, the principles are intended to help bring about a change in thinking about work, systems and safety. They do not comprise a method, but many systems methods exist, and these can be selected and used depending on your purpose. Additional reading is indicated to gain a fuller understanding.
Practical advice
- Identify the stakeholders. Identify who contributes or delivers resources to the system and who benefits, i.e. system actors (including staff and service users), system designers, system decision makers, system influencers.
- Consider system purposes. Consider the common or superordinate purpose(s) that defines the system as a whole, considering customer needs. Study how parts of the system contribute to this purpose, including any conflicts or tension between parts of the system, or with the superordinate system purpose(s).
- Explore the system and its boundary. Model the system, its interactions and an agreed boundary, for the purpose, question or problem in mind (concerning investigation, assessment, design, etc.). Continually adapt this as you get data, exploring the differences between the system-as-imagined and the system-as-found (a minimal sketch of one way to record such a model follows this list).
- Study system behaviour and system conditions. Consider how changes to one part of the system affect other parts. Bear in mind that decisions meant to improve one aspect can make system performance worse.
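As referenced in the third point above, one minimal way to record such a model is sketched below in Python. The names (`SystemModel`, `add_stakeholder`, `note_tension`) and the example entries are hypothetical; the sketch simply illustrates keeping purpose, boundary, stakeholders, interactions and tensions in one adaptable place, and is not a prescribed method.

```python
from dataclasses import dataclass, field

@dataclass
class SystemModel:
    """Working record of a system under study: purpose, boundary, stakeholders, interactions."""
    purpose: str
    boundary_note: str                                 # why the boundary was drawn here, for this question
    stakeholders: dict = field(default_factory=dict)   # name -> role (actor / designer / decision maker / influencer)
    interactions: list = field(default_factory=list)   # (from_element, to_element, description)
    tensions: list = field(default_factory=list)       # observed conflicts with the superordinate purpose

    def add_stakeholder(self, name, role):
        self.stakeholders[name] = role

    def add_interaction(self, src, dst, description):
        self.interactions.append((src, dst, description))

    def note_tension(self, description):
        self.tensions.append(description)

# Hypothetical example: an approach unit modelled for an investigation
model = SystemModel(
    purpose="safe, orderly and expeditious traffic flow for airspace users",
    boundary_note="drawn around the approach unit for this investigation; pilots cross the boundary",
)
model.add_stakeholder("controller", "system actor")
model.add_stakeholder("rostering manager", "system decision maker")
model.add_interaction("controller", "pilot", "clearances and readbacks")
model.note_tension("runway occupancy targets vs. spacing on final approach")
```

Because the boundary is a choice made for a purpose, the model is expected to be revised as the system-as-found diverges from the system-as-imagined.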
View from the field
F/O Juan Carlos Lozano, Chairman, Accident Analysis & Prevention Committee, International Federation of Air Line Pilots' Associations (IFALPA)
"Flying a commercial aircraft at 35,000 feet might be perceived as working in a very expensive bubble. But bubbles are fragile. The aviation system cannot afford to be fragile. Aviation is a system that learns from experience, adapts and improves. Some think that improvements only come from technology. But it is people who make the system more resilient. Information sharing is good, but it is not enough. Knowledge and understanding are key. In the same way that pilots, controllers and technicians needs to understand the technology that they work with, aviation professionals - including managers, specialists, support staff, researchers and authorities - must constantly seek to understand how the system works. With an understanding of the interactions between elements of the aviation system, we can make it more effective, enhancing safety and efficiency. The principles that follow in this White Paper can only help in this endeavour."
References
- ^ a b Meadows, D. & Wright, D. (2009). Thinking in systems: A primer. Routledge.
- ^ Deming, W.E. (2000). Out of the crisis. MIT Press.
- ^ a b Ackoff, R. (1999). Ackoff's best: His classic writings on management. John Wiley.
- ^ Snowden, D.J. & Boone, M.E. (2007). A leader's framework for decision making. Harvard Business Review, November, pp. 76-79.
- ^ Wilson, J.R. (2014). Fundamentals of systems ergonomics/human factors. Applied Ergonomics, 45(1), 5-13 (p. 8).
- ^ Seddon, J. (2005). Freedom from command and control (Second edition). Vanguard.
- ^ Dul, J., Bruder, R., Buckle, P., Carayon, P., Falzon, P., Marras., W.S., Wilson, J.R., & van der Doelen, B. (2012). A strategy for human factors/ergonomics: Developing the discipline and professions. Ergonomics, 55(4), 377-395.
- ^ EUROCONTROL (2013).>From Safety-I to Safety-II: A White Paper. EUROCONTROL.
- ^ Hollnagel, E. (2014a). Safety-I and Safety-II. The past and future of safety management. Ashgate.
- ^ Hollnagel, E. (2014b). Human factors/ergonomics as a systems discipline? "The human use of human beings" revisited. Applied Ergonomics, 45(1), 40-44.
Source: Systems Thinking for Safety: Ten Principles. A White Paper. Moving towards Safety-II, EUROCONTROL, 2014.
The following Systems Thinking Learning Cards: Moving towards Safety-II can be used in workshops, to discuss the principles and interactions between them for specific systems, situations or cases.