
Human Error Reduction in Manufacturing
Ebook · 370 pages · 4 hours


About this ebook

For many years, we considered human errors or mistakes as the cause of mishaps or problems. In the manufacturing industries, human error, under whatever label (procedures not followed, lack of attention, or simply error), was the conclusion of any quality problem investigation. The way we look at the human side of problems has evolved during the past few decades. Now we see human errors as the symptoms of deeper causes. In other words, human errors are consequences, not causes.
The basic objective of this book is to provide readers with useful information on theories, methods, and specific techniques that can be applied to control human failure. It is a book of ideas, concepts, and examples from the manufacturing sector. It presents a comprehensive overview of the subject, focusing on the practical application of the subject, specifically on the human side of quality and manufacturing errors. In other words, the primary focus of this book is human failure, including its identification, its causes, and how it can be reasonably controlled or prevented in the manufacturing industry setting. In addition to including a detailed discussion of human error (the inadvertent or involuntary component of human failure), a chapter is devoted to analysis and discussion related to voluntary (intentional) noncompliance.
Written in a direct style, using simple industry language with abundant applied examples and practical references, this book's insights on human failure reduction will improve individual, organizational, and social well-being.
Language: English
Release date: February 13, 2023
ISBN: 9781636940915
Author

Jose (Pepe) Rodriguez-Perez

Dr. José (Pepe) Rodríguez-Pérez is the president of Business Excellence Consulting Inc. (BEC), a Puerto Rico-based global consulting, training, and remediation firm in the areas of regulatory compliance, risk management, and regulatory training in the FDA-regulated sector. He is also president of BEC Spain. Dr. Rodríguez-Pérez is a biologist and earned his doctoral degree in biology from the University of Granada (Spain). He served as professor and director of the Microbiology Department at one of the Puerto Rico schools of medicine, and he also served as Technical Services manager at an Abbott Laboratories manufacturing plant in Puerto Rico. From 2003 to 2012, he was a professor of graduate studies at the Polytechnic University of Puerto Rico, and he served as a Science Advisor for the FDA from 2009 to 2011. Dr. Rodríguez-Pérez is a senior member of ASQ, as well as a member of AAMI, ISPE, PDA, and RAPS. He is an ASQ-certified Six Sigma Black Belt, Quality Manager, Quality Engineer, Quality Auditor, Quality HACCP Auditor, Biomedical Auditor, and Pharmaceutical GMP Professional. He is also the author of the best-selling books CAPA for the FDA-Regulated Industry, Quality Risk Management in the FDA-Regulated Industry, The FDA and Worldwide Current Good Manufacturing Practices and Quality System Requirements Guidebook for Finished Pharmaceuticals, Human Error Reduction in Manufacturing, and Data Integrity and Compliance, all available from ASQ Quality Press. Contact Dr. Rodríguez-Pérez at pepe.rodriguez@bec-global.com.


    Book preview

    Human Error Reduction in Manufacturing - Jose (Pepe) Rodriguez-Perez

    1

    About Human Error

    Introduction and Some Statistics

    To err is human.

    Human error is a symptom, not a cause.

    Good people plus bad systems = A recipe for error.

    All of us have experienced human errors and mistakes. When we interact with machines or complex systems, we often do things that are different from our intentions. People make errors and mistakes because to err is human, as the adage goes. Does this mean errors will inevitably happen from time to time and there is nothing we can do about them? Can something be done to better understand, and indeed control, this subject?

    Prevention of human error is generally seen as a major contributor to the reliability and safety of processes and systems. At the same time, it is necessary to understand that eliminating human errors is almost impossible without first eliminating human beings. Therefore, the focus must be on controlling and reducing errors and, where possible, on mitigating their consequences.

    Errors are symptoms, and they do have causes. Understanding this concept is fundamental to controlling and reducing the frequency of human errors. Sometimes, we can detect consistent relationships between the frequency of some kinds of errors and specific circumstances. For example, at the beginning of each year, most of us find that we use the old year instead of the new year on documents, including personal documents such as checks and letters, but also in work documents and our verbal communication. As time passes, however, this tendency fades, and by February, we are aware of the year we are living in. Problem fixed? Just wait until next January.

    There are many other interesting and necessary questions, such as: How many possible causes of human error are there? If every error had its own unique cause, each imaginable error would require its own analysis, and the remedy for one error would not apply to others. However, if there are only a few (general) causes, we can apply general rules to them repeatedly and thus effectively control and reduce the rate of human errors and mistakes.

    For the sake of clarity, it is worth describing here the meaning of a few concepts that we are going to extensively use across these pages. An in-depth description and evaluation of these terms can be found in Chapter 2:

    • Human failure refers to any time a human activity deviates from accepted standards, rules, or procedures.

    • Human error refers to an action or decision that was not intended, that involved an involuntary deviation from an accepted standard, and that led to an undesirable outcome.

    • Human violation is a deliberate deviation from a rule or procedure. When the objective is to harm, it becomes sabotage.

    • Human factor is any factor that influences behavior at work in a way that can negatively affect the output of the process the human is involved with. In other words, if human errors and violations are the symptoms, then human factors are the causes.

    Unfortunately, and on many occasions also tragically, we have plenty of examples to illustrate the significance of human failures in general and human errors specifically. (See the section Human Error and Safety: The Importance of Human Factors, which includes descriptions of several such cases.)

    The High Price of Human Error and Medical Safety

    In this introduction, we discuss some examples from the healthcare field, where medical errors have received a lot of exposure and attention for the last 15 years. There are many forms of medical error that can put the health and safety of patients at risk. Improper processing of medications, surgical mistakes, misuse of medical equipment, or inaccurate clinical laboratory results are some examples of common medical errors. In the United States, a 1999 report by the Institute of Medicine (IOM)¹ indicated that hospital medical errors killed between 44,000 and 98,000 people each year.

    In this report, the IOM defined medical error as the failure of a planned action to be completed as intended (error of execution) or the use of a wrong plan, including failure to use a plan to achieve an aim (error of planning). Specific types of medical errors highlighted in the IOM report included errors in the administration of treatment, failure to employ indicated tests, and avoidable delays in treatment. The IOM report agreed that many healthcare-acquired infections are preventable.

    In Minnesota, a 2005 report² analyzing data from 139 hospitals during 2004 found that there were 13 surgical operations on the wrong body part, 31 cases of foreign objects left inside surgical patients, and 21 preventable deaths. Another study, about patient-controlled analgesia pumps, concluded that an improvement of the pump interface focused on human factors reduced the frequency of human errors by 55%.³

    Analyzing medical death rate data over an eight-year period, Johns Hopkins patient safety experts have calculated that more than 250,000 deaths per year in the United States are due to medical errors.⁴ Their figure surpasses the count for respiratory disease, which the U.S. Centers for Disease Control and Prevention (CDC) lists as the third leading cause of death and which kills nearly 150,000 people per year.

    In their study published in 2016, the Johns Hopkins researchers examined four separate studies that analyzed medical death rate data from 2000 to 2008, including one by the U.S. Department of Health and Human Services’ Office of the Inspector General and the Agency for Healthcare Research and Quality. Then, using hospital admission rates from 2013, they extrapolated that based on a total of 35,416,020 hospitalizations, 251,454 deaths stemmed from a medical error, which the researchers say translates to almost 10% of all deaths each year in the United States. According to the CDC, in 2013, 611,105 people died of heart disease, 584,881 died of cancer, and 149,205 died of chronic respiratory disease—the top three causes of death in the United States. The newly calculated figure for medical errors puts this cause of death behind cancer but ahead of respiratory disease.
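    As a rough illustration of the arithmetic behind this extrapolation, the sketch below reproduces the published figures. It is not the researchers' method, and the total number of U.S. deaths in 2013 (roughly 2.6 million) is an assumed denominator that does not appear in the text; it is included only to show how the "almost 10%" share is obtained.

```python
# Back-of-the-envelope check of the extrapolation described above.
# The hospitalization and error-death counts come from the text; the total
# number of U.S. deaths in 2013 (~2.6 million) is an assumed figure used
# only to illustrate the "almost 10% of all deaths" claim.

hospitalizations_2013 = 35_416_020
deaths_from_medical_error = 251_454
total_us_deaths_2013 = 2_600_000  # assumption, not stated in the excerpt

rate_per_hospitalization = deaths_from_medical_error / hospitalizations_2013
share_of_all_deaths = deaths_from_medical_error / total_us_deaths_2013

print(f"Error-related deaths per hospitalization: {rate_per_hospitalization:.2%}")  # ~0.71%
print(f"Share of all U.S. deaths: {share_of_all_deaths:.1%}")  # ~9.7%, i.e., almost 10%
```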

    In a very interesting experiment performed at a Swiss hospital,⁵ 30 nurses and 28 anesthesiologists had to prepare medications for 20 patients using 22 syringes of various drugs. Both groups also had to perform 22 calculations relating to the preparation of drugs. The study assessed human error probabilities (HEPs), the distribution of HEPs, and the dependency of HEPs on individuals and task details. In the preparation tasks, the overall HEP was 3% for nurses and 6.5% for anesthesiologists. In the arithmetic tasks, the overall HEP was 23.8% for nurses and 8.9% for anesthesiologists. A statistically significant difference was noted between the two groups. In both preparation and arithmetic tasks, HEPs were dependent on individual nurses but not on individual anesthesiologists. In every instance, HEPs were dependent on task details.
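    For readers unfamiliar with the metric, the following is a minimal sketch, not taken from the cited study, of how an HEP is typically estimated: observed errors divided by opportunities for error. The opportunity and error counts below are hypothetical and are chosen only to reproduce the 3% preparation figure reported for nurses.

```python
# Minimal sketch of how a human error probability (HEP) is estimated:
# observed errors divided by opportunities for error. The counts below are
# hypothetical (30 nurses x 22 syringes each), chosen only to land near the
# ~3% preparation HEP reported for nurses.

def human_error_probability(errors_observed: int, opportunities: int) -> float:
    """Point estimate of HEP = errors observed / opportunities for error."""
    return errors_observed / opportunities

preparation_opportunities = 30 * 22   # hypothetical: 30 nurses, 22 syringes each
preparation_errors = 20               # hypothetical error count

hep = human_error_probability(preparation_errors, preparation_opportunities)
print(f"Estimated preparation HEP: {hep:.1%}")  # ~3.0%
```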

    During a March 2017 summit held in Germany, the World Health Organization (WHO) launched a new campaign titled Global Patient Safety Challenge on Medication Safety,⁶ aimed at reducing severe and avoidable medication-associated harm across the globe by half over the next five years. WHO estimates that one million patients die every year in hospitals across the world because of avoidable clinical mistakes. The magnitude of this figure places the problem alongside hypertensive heart disease and road deaths as one of the top causes of death in the world today. The global cost associated with medication errors has been estimated at $42 billion annually, or almost 1% of total global health expenditure. In the United States, medication errors cause at least one death every day and injure approximately 1.3 million people annually.

    The High Price of Human Error Across Industries

    For manufacturing industries, there are several very good, comprehensive reference books on this subject, although they focus on the relationship between human factors and process safety. The reason is that process industries, especially chemical ones, account for a large share of disasters provoked by human failures and errors. Major incidents have highlighted the importance of addressing this crucial aspect of performance. For example, safety culture became a key focus within the offshore oil and gas industry after the Piper Alpha disaster in 1988. Human safety analysis has always been a key area for the nuclear industry as well. The development of human factors has accelerated over recent years due to a mix of elements, including major accidents, the increased complexity of industrial systems, regulatory efforts, and social expectations. It has also been recognized that engineering humans out through full automation does not work. What is increasingly required is a proportionate consideration of human capabilities and limitations within any work system.⁷

    Manufacturing industries, especially those involved in the manufacture of medical products such as medicines and medical devices, are also plagued with human errors with very diverse consequences. A large solid-dosage pharmaceutical manufacturing plant with approximately 1,000 employees had more than 1,100 documented deviations (nonconformances) in one year (2017). One-third of them (375) were classified as human errors. In other words, one deviation occurred every eight hours (a work shift), and, on average, the plant documented one human error every single day of the year. Human factors and the FDA-regulated industry are introduced later in this chapter.
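    The rates quoted here follow directly from the yearly totals. The short calculation below is simply a back-of-the-envelope check, under the assumption of round-the-clock operation (8,760 hours per year).

```python
# Quick check of the deviation and human-error rates quoted above,
# assuming round-the-clock operation (8,760 hours per year).

deviations_per_year = 1100
human_errors_per_year = 375
hours_per_year = 365 * 24

hours_per_deviation = hours_per_year / deviations_per_year
human_errors_per_day = human_errors_per_year / 365

print(f"One deviation roughly every {hours_per_deviation:.0f} hours (about one work shift)")
print(f"About {human_errors_per_day:.1f} documented human errors per day")
```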

    The cost of human errors in the food industry can also be very high. Most of these situations end with a high-level recall of the products involved because the errors typically jeopardize the safety of the food. Adding sugar to a sugar-free product, or adding undeclared tree nuts or undeclared milk, can put the lives of sensitive consumers at risk. Risking consumer health with contaminated (or potentially contaminated) food can be disastrous. In the United States alone, dozens of products are recalled each week for these reasons. Most of the time, human error is identified as the primary causal agent of the situation. When properly investigated, affected companies often discover real root causes associated with a wide range of human factors, such as workload, inadequate supervision, the design of the task, inadequate procedures, lack of competence due to ineffective training, and so on.

    Most medical product manufacturers have tried to address the plague of human error, as one vice president of operational excellence once described it to me. My opinion, based on direct knowledge of the corrective and preventive action (CAPA) systems of some of the biggest regulated companies in the United States and Europe, is that no one has been successful in reducing the plague to a controlled state. Human errors continue to be an epidemic for regulated companies. This lack of success comes despite some companies making huge investments in technology (for example, more graphical procedures with many color images, and so on).

    Their lack of success comes from a single factor: they did not change the quality culture, starting from the top of the organization. Some of the required changes include:

    • Promoting the quality of processes over the yield of processes

    • Promoting and requiring personal accountability at all levels

    • Using risk management tools to avoid nonconformances, not to justify the acceptability of using nonconforming products

    I strongly recommend that interested professionals study some of the references included in the Bibliography. Although the oldest ones refer to aviation, nuclear, and industrial accidents, several recent books cover the hospital and healthcare industry, where human errors cost tens of thousands of lives every year in the United States alone.

    Errors versus Defects

    We must differentiate between errors (mistakes) and defects (also known as nonconformances). Regulated companies do not recall products because there were human errors during the manufacturing process. They recall products because their quality systems were unable to detect the human errors, and the nonconforming products were distributed, becoming adulterated and/or misbranded items once they reached patients.

    As stated by Reason⁸ when discussing how to eliminate affordances for error, "most human beings enjoy the experience of free will. They form intentions, make plans, and carry out actions, guided by what appears to them to be internal processes. Yet these apparently volitional activities are subtly constrained by the character of the physical objects with which they interact. The term affordance refers to the basic property of objects that shape the way in which people react to them."

    Norman also explored this concept in his book The Design of Everyday Things.⁹ He described, for example, how man-made objects and procedures offer affordances for error:

    I began to realize that human error resulted from bad design. Humans did not always behave so clumsily. But they do so when the things they must do are badly conceived, badly designed. Does a commercial airliner crash? Pilot error, says the report. Does a Soviet nuclear power plant have a serious problem? Human error, says the newspaper. Do two ships at sea collide? Human error is the official cause. But careful analysis of the events that transpired during these kinds of incidents usually gives the lie to such a story. At the famous nuclear power plant disaster, Three Mile Island, the blame was placed on the humans, on the plant operators who misdiagnosed the problems. But was it human error? Consider the phrase ‘operators who misdiagnosed the problems.’ Aha, the phrase reveals that first there was a problem, in fact a series of mechanical failures. Then why wasn’t equipment failure the real cause? What about the misdiagnoses? Why didn’t the operators correctly determine the cause? Well, how about the fact that the proper instruments were not available, that the plant operators did the action that had always been the reasonable and proper ones to do. How about the pressure relief valve that failed to close…. To me it sounds like equipment failure coupled with serious design error.

    To finish this introduction, it is necessary to stress that human errors cannot be eliminated, nor even significantly reduced, simply by telling the person who made the error or mistake to be more careful. A general admonition or advisory to stop such behavior is a simplistic approach that does not work because it does not address any root cause. Errors cannot be eliminated by simply disciplining the people who make the mistakes.

    A factor very often neglected when considering the causes of human failure is the high frequency of disorders such as attention deficit hyperactivity disorder (ADHD). A report by the National Institutes of Health¹⁰ concluded that almost 25% of U.S. inhabitants had a psychiatric disorder and that almost 60% of them never sought treatment. The study found that 40% of these disorders were mild, 37% moderate, and 22% severe; they included anxiety disorders (18%), mood disorders (9%), and impulse control disorders (9%). This is the picture of the general population from which U.S. companies are recruiting their managers and line workers. More discussion of this topic is included in Chapter 4.

    Some Statistics Related to Human Error

    • Ninety-nine percent of accidental losses (except for natural disasters) begin with a human error.

    • Root causes of the vast majority of accidents are management system weaknesses.

    • Eight percent of men are color blind, while only 0.5% of women have the condition.

    • Eighty percent of medical product recalls due to incorrect expiration date or incorrect lot/batch number are caused by a transposition of digits.

    • One and a half million Americans are injured every year by drug errors in hospitals, nursing homes, and doctors’ offices (patients’ own medication mix-ups are not included), costing the health system more than $3.5 billion (1999).

    • The global cost associated with medication errors has been estimated at $42 billion annually.

    • On average, every hospitalized patient is subject to (at least) one medication error per day.

    • Seventeen hours of work without a break is operationally the same as being legally drunk.

    • The worst period for human errors is 2 a.m. to 5 a.m.

    • About 15% of human errors are due to acquired habits.

    • Human error accounts for 90% of road accidents.

    • Ten percent of all U.S. deaths are due to medical errors.

    • The third highest cause of death in the United States is medical error.

    • An average of 26% of the babies in a neonatal intensive care unit were found to be at risk of being mistaken for another baby in the same unit on any given day.

    • The rate of errors and mistakes for most procedure-based tasks is 1/100.

    • The average worker is interrupted every 11 minutes and then spends almost a third of his/her day recovering from these distractions.

    • Twelve percent of the world’s population is left-handed, with twice as many men as women. Thirty percent of us are mixed-handed and switch hands during some tasks. Ambidextrous people can do any task equally well with either hand, but this is exceptionally rare. However, most manufacturing equipment and utilities are designed for right-handers.

    Human Error and Safety: The Importance of Human Factors

    Most of the bibliographic references mentioned in this book cover the safety implications of human errors. Incredible tragedies such as those described below can be traced back to some type of human failure. Numerous statistics link human error to the vast majority of accidents and accidental losses. A study published in 1999 indicated that at-risk behavior is the root cause of 85% to 90% of all workplace injuries.¹¹

    Between the mid-1970s and the late 1980s, there was a series of major accidents (some of them discussed in this section) whose investigations pointed toward organizational and social factors in addition to technical and engineering factors. Behavior was identified as one of these nontechnical factors. People at all levels of organizations were not doing what they were meant to do. The human failures uncovered during these investigations were a mix of unintentional failures (errors and mistakes) and intentional ones (violations). In the early 1990s, process safety management became a scientific field, and today process safety is inextricably linked to human factors. Previously, regulators and safety professionals were convinced that putting all their focus on the elimination of hazardous conditions was the best way to prevent workplace injuries.

    Organizations with safety-critical operations recognize the value of developing a robust safety culture because it will influence the way people (at all levels) behave at work. In other words, human error and process safety incidents are directly linked to the safety culture of the organization.

    Safety culture is recognized as a significant element of human factors. Investigations into major disasters in nuclear plants (Three Mile Island and Chernobyl), chemical process industries (Piper Alpha), transportation (Space Shuttle and Exxon Valdez), and gas distribution (San Juan Gas) concluded that systems broke down catastrophically despite the provision of complex technical safeguards. The primary cause of those disasters was not an engineering failure, but the action (or inaction) of the designers, managers, and maintenance/operating workers. Taking inappropriate risks and not following procedures are indicators of a weak safety culture. Elements associated with a positive quality culture are described in Figure 1.1 (modified from Center for Chemical Process Safety [2007]).

    Worker safety behavior can be described as either safe or at-risk. What motivates a well-trained worker to take risks is well documented and explained by behavioral science. There is strong evidence that consequences, meaning what happens after a behavior, are the driving force. Research indicates that the main reason people continue unsafe or nonquality behavior, regardless of knowledge, is the positive, immediate, and certain consequences associated with the unsafe behavior.¹² Texting while driving is a perfect example of this.

    Figure 1.2 depicts a list of human failures, and while none of these events will necessarily cause a quality incident immediately, they represent conditions that may eventually allow one to occur. Each failure may result from a variety of underlying human factors that need to be addressed to minimize the probability of those failures happening. Among them, we can mention inadequate or insufficient training, distraction, fatigue, multitasking, carelessness, inadequate procedures, inadequate supervision and staffing, miscommunication, workload, and so on.

    The application of human factors can have a significant impact on reducing the probability of quality incidents and, sometimes, catastrophic accidents. Human factors can be used to create more efficient and safe work systems. If practically all incidents include human failures, addressing the causes of human failures is fundamental to achieving an improvement in both quality and safety. Following is a brief description of several of the major accidents reported in the last 40 years. Originally, almost all of them were attributed to human error, but at the end of their corresponding investigations, key recurring human factors were identified as the root causes of those disasters. The goal is to learn and understand from past disasters how human factors can contribute to improving the performance of the system, thus creating safer and higher-quality processes and systems.

    Flixborough 1974¹³

    The Flixborough disaster was an explosion at a chemical plant close to the village of Flixborough, England. It killed 28 people and seriously injured 36 of the only 72 people on-site at the time. Two months prior to the explosion, the number 5 reactor was discovered to be leaking. It was decided to install a temporary pipe to bypass the leaking reactor and allow continued operation of the plant while repairs were being made. In the absence of a normal 28-inch nominal bore pipe, a smaller, 20-inch nominal bore pipe was used to fabricate the bypass linking the reactor 4 outlet to the reactor 6 inlet. The new configuration was tested for leak-tightness at working pressure by pressurization with nitrogen. For two months after fitting, the bypass was operated continuously at temperature and pressure and gave no trouble. At the end of May, the reactors had to be depressurized and allowed to cool in order to deal with leaks elsewhere. Once those leaks had been dealt with, workers attempted early on June 1 to bring the plant back up to pressure and temperature. On June 1, 1974, there was a massive release of hot cyclohexane in the area of the missing reactor 5, followed shortly by the ignition of the resulting cloud of flammable vapor and a massive explosion in the plant that demolished the site.

    The investigation concluded that this disaster was caused by a well-designed and well-constructed plant undergoing a modification that destroyed its technical integrity. Although the causes of the disaster were complex, the investigation found that the bypass pipe had failed because of unanticipated stresses in the pipe during a pressure surge. The bypass pipe was
