Safety Avoidance Logic
Poor decisions that hinder system safety and increase risk are often made through a lack of understanding of system safety axioms. Common statements support this safety avoidance logic, and they are often made in the context of safety, where they may be inappropriate. These statements may reflect limited background, knowledge, experience, attitude, perception, misguidance or simply arrogance.
It seems that many people do not understand the axioms of system safety, yet the base of all our efforts in system safety depends on knowing these axioms. There is frustration when the system safety practice is not properly applied, as well as more risk to the system when there is deviation from practice. Note that actual circumstances will vary and good judgment must be used when applying the axioms.
There are legal ramifications concerning the use of safety-related "terms of art." These terms may or may not be appropriate and will vary depending on the particular legal entity (state, country or jurisdiction). This discussion excludes particular legal ramifications.
It is the reader's responsibility to know and understand all legal ramifications associated with the practice of system safety, safety engineering and safety management.
Common Statements Applied
What follows is a discussion of some common statements that may reflect a limited background, knowledge, experience, attitude, perception, misguidance or arrogance, along with counter statements that support a more logical safety-related argument. It is recommended that these limiting statements be abated or challenged when necessary.
"Quantitative analysis is more vigorous and appropriate then qualitative analysis."
Quantitative analysis is rigorous and laborious, and it takes time. However, quantification is needed to develop design requirements, evaluate simulations to determine distributions composed of random variables and test system response. Probabilities may be helpful in determining availability or reliability. A probability actually indicates that an event is possible. Given randomness, and regardless of how high the confidence level is, the event can still occur tomorrow.
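To make the point concrete, here is a minimal sketch; the per-demand probability and exposure counts are hypothetical assumptions, not values from this discussion. It shows how a rare event remains possible on any given demand and becomes likely over enough demands.

# Hedged illustration: a rare per-demand event is still possible on any demand,
# and its chance of occurring at least once grows with exposure. All numbers are assumptions.
p = 1e-6                 # assumed per-demand probability of the adverse event
for demands in (1, 10_000, 1_000_000, 10_000_000):
    at_least_once = 1 - (1 - p) ** demands
    print(f"{demands:>10,} demands: P(at least one occurrence) = {at_least_once:.3g}")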
If all safety resources are allocated toward laborious quantification, consider all the system risks that may not have been identified because an inclusive hazard analysis was not conducted. For complex systems, there may be hundreds of risks and thousands of hazards throughout the lifecycle that may not be addressed. It is important that system safety resources are appropriately applied and that, depending on the circumstances, both quantitative and qualitative analyses be conducted. Qualitative analysis may be more appropriate, considering it is important to identify, eliminate and control risks.
"Accidents, hazards, risks, and outcomes, it's all the same thing....there is only one hazard: fatal injury."
For some systems, there may be a limited number of final outcomes; for example, a single outcome or five to 10 possible results. There can be hundreds or thousands of ways to get to an outcome, which are the adverse flows, paths or cut sets. Each flow, path or cut set represents a system risk. Further, system risks comprise many hazards: initiators, contributors and primary hazards (outcomes). Each system risk represents a potential accident. As a result of accident reconstruction and investigation, it becomes apparent that the accident is the result of many things gone wrong: hazards and system states within an adverse process.
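To illustrate how a single outcome can be reached along many distinct paths, the following sketch enumerates minimal cut sets for a small, purely hypothetical fault tree; the gate structure and event names are illustrative assumptions, not an analysis from this discussion.

from itertools import product

# Hypothetical miniature fault tree: the top event occurs if
# (pump fails AND backup pump fails) OR (power lost) OR (operator error AND alarm fails).
tree = ("OR",
        ("AND", "pump_fails", "backup_pump_fails"),
        "power_lost",
        ("AND", "operator_error", "alarm_fails"))

def cut_sets(node):
    """Return the cut sets (frozensets of basic events) that produce this node."""
    if isinstance(node, str):                      # basic event
        return [frozenset([node])]
    gate, *children = node
    child_sets = [cut_sets(c) for c in children]
    if gate == "OR":                               # any child's cut set suffices
        return [cs for sets in child_sets for cs in sets]
    if gate == "AND":                              # need one cut set from every child
        return [frozenset().union(*combo) for combo in product(*child_sets)]
    raise ValueError(gate)

def minimal(sets):
    """Keep only minimal cut sets (drop any set that contains another)."""
    return [s for s in sets if not any(t < s for t in sets)]

for cs in minimal(cut_sets(tree)):
    print(sorted(cs))   # each line is one distinct path to the same outcome

Each printed set is one adverse path to the single top-level outcome; real systems can have hundreds or thousands of such paths.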
"Operators don't make errors and when the system fails it will be detected."
Novice analysts may assume best-case situations within hazard analyses and, as a result, consider only three or four hazards of low risk. If such an analysis is conducted and all identified risks are low, the effort is considered complete. Depending on contingency response, worst-, mid- and best-case situations must be considered. Different hazards and mitigations may be determined or identified when all possible sequences are addressed. People are imperfect; there will always be errors and, since people create systems, there will be latent errors and real-time hazards to address.
"These are two independent events and the probability of these events occurring is EE -9. We can exclude these events from the analysis."
A probability alone is not a hazard control. Consider all risks and their associated hazards, regardless of the estimated probability. All hazards must be addressed. Rare events can result in catastrophic outcomes. Seemingly incredible independent events do line up and form adverse sequences. Consider that system safety analyses may not have been conducted adequately if common-cause events were not investigated or if hazards were excluded because of an estimated probability.
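One way to see why a claimed 1E-9 for "independent" events can be optimistic is a simple common-cause calculation. The sketch below uses the beta-factor model with hypothetical numbers; neither the model choice nor the values come from this discussion.

# Hedged illustration: two "redundant" channels, each with an assumed per-demand
# failure probability of 1e-3. Under a beta-factor model, a fraction beta of failures
# is assumed to stem from a shared cause that defeats both channels at once.
p = 1e-3
beta = 0.05

naive = p ** 2                                # assumes full independence
independent_part = ((1 - beta) * p) ** 2      # both channels fail for separate reasons
common_cause_part = beta * p                  # one shared cause fails both channels
modeled = independent_part + common_cause_part

print(f"assuming independence: {naive:.1e}")    # about 1e-6
print(f"with common cause:     {modeled:.1e}")  # about 5e-5, dominated by the shared cause

The shared-cause term dominates, which is why an impressive product of small probabilities is no substitute for investigating common-cause events.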
"The Military Standard 882 Z method is good or bad."
There is no one particular safety standard that contains all the appropriate axioms of system safety, safety engineering or safety management. It is important to fully understand all the principles and practices (axioms) within the safety disciplines. Common threads, the axioms, run throughout safety practices.
Once the level of protection has been applied, the safety bar has been raised. There is now an obligation to meet that level of protection, regardless of any new or revised safety standard or specific revised term within the standard. For example, consider the typical safety term "hazard"; it has been redefined multiple times, perhaps as the result of a new expert safety engineer. In some entities, a hazard may be considered a threat, yet "threat" is a security term. A "hazard" remains the potential for harm, including unsafe acts and/or conditions within an adverse sequence.
"It is more important to have specific industry experience rather than system safety experience."
Yes, specific industry experience needs to be accessed while conducting system safety analyses; however, the safety professional is just as important. It is vital that system safety axioms be appropriately applied. A cursory or inappropriate safety analysis may introduce additional risks, many risks may go unaddressed, and/or mitigation may be inadequate. An inadequate, incomplete or inappropriate analysis can establish false confidence regardless of the result.
"We need a team of people to do hazard analysis."
Depending on the analysis or system, a team of people may hinder or adversely affect the analysis. In some situations, it may be more appropriate for a qualified safety professional to initially conduct the analysis and then invite particular people into the process as the analysis progresses. If the analysis team is not properly trained and the meetings not properly facilitated, wheel spinning will occur and resources may be wasted.
"You need a specific credential to do system safety."
Almost anyone with the capability to learn the axioms can practice system safety. On the other hand, there is no such thing as too much knowledge of science, physics, human factors, health, medicine, engineering, technology, mathematics and statistics. An experienced safety professional is constantly learning and has the capability to extract the information needed from other professionals.
"We must apply this particular safety model while conducting analysis."
In the literature, there are many safety models that can be useful. Depending on the analysis and system, some specific safety models may be more appropriate than others. It is often helpful to use many different models to evaluate the system from different points of view. Additional risks, hazards and/or mitigations may be identified. Consider that some models may be abstractions and may or may not be accurate depictions. Models may represent theories or hypotheses that may or may not be true.
"(Abstracted) models of the system are accurate."
Any depiction of the system can be inaccurate. Accuracy can be judged by physical reality, applied physics, measurements, true observations, experimentation and testing.
"It's redundant so it's safe."
Unfortunately, people still make this statement. Considering complex automated systems, it is a challenge to prove redundancy of software, firmware, hardware and human elements. There can be a common connection point or a common event that can defeat redundancy.
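A brief sketch can show how a common connection point limits what redundancy buys; the reliability values below are hypothetical assumptions used only to illustrate the structure of the argument.

# Hedged illustration: two redundant channels fed by one shared power supply.
r_channel = 0.999    # assumed reliability of each redundant channel
r_power = 0.995      # assumed reliability of the shared, non-redundant power supply

redundant_pair = 1 - (1 - r_channel) ** 2     # either channel keeps the function alive
system = r_power * redundant_pair             # the common connection point sits in series

print(f"redundant pair alone: {redundant_pair:.6f}")  # about 0.999999
print(f"with shared supply:   {system:.6f}")          # about 0.995, set by the single point

The redundant pair looks nearly perfect in isolation, but the single shared element caps the overall figure.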
As you would expect, "safety" is a relative term. Nothing is totally safe. Safety implies freedom from all forms of harm, which is not possible. However, safety can imply that the identified risks are acceptable, given the mitigations.
"We do safety better than that group. We know the best way of doing safety."
"Not invented here" syndrome often occurs. People become experts in their fields and automatically consider themselves experts in system safety. They may have even developed some form of system safety application. Again, a false sense of security is established. It is advisable to keep an open mind and gain knowledge of system safety axioms throughout various applications and industries.