Human reliability


Human reliability (also known as human performance or HU) is related to the field of human factors and ergonomics, and refers to the reliability of humans in fields including manufacturing, medicine and nuclear power. Human performance can be affected by many factors, such as age, state of mind, physical health, attitude, emotions, a propensity for certain common mistakes and errors, and cognitive biases.

Human reliability is very important due to the contributions of humans to the resilience of systems and to the possible adverse consequences of human errors or oversights, especially when the human is a crucial part of large socio-technical systems, as is common today. User-centered design and error-tolerant design are just two of many terms used to describe efforts to make technology better suited to operation by humans.

Common Traps of Human Nature

People tend to overestimate their ability to maintain control when they are doing work.

The common characteristics of human nature addressed below are especially accentuated when work is performed in a complex work environment.[1]

Stress - The problem with stress is that it can accumulate and overpower a person, thus becoming detrimental to performance.

Avoidance of Mental Strain - Humans are reluctant to engage in lengthy concentrated thinking, as it requires high levels of attention for extended periods.

The mental biases, or shortcuts, often used to reduce mental effort and expedite decision-making include:

  • Assumptions – A condition taken for granted or accepted as true without verification of the facts.
  • Habit – An unconscious pattern of behavior acquired through frequent repetition.
  • Confirmation bias – The reluctance to abandon a current solution.
  • Similarity bias – The tendency to recall solutions from situations that appear similar.
  • Frequency bias – A gamble that a frequently used solution will work.
  • Availability bias – The tendency to settle on solutions or courses of action that readily come to mind.

Limited Working Memory - The mind's short-term memory is the “workbench” for problem solving and decision-making.

Limited Attention Resources - The limited ability to concentrate on two or more activities challenges the ability to process information needed to solve problems.

Mind-Set - People tend to focus more on what they want to accomplish (a goal) and less on what needs to be avoided because human beings are primarily goal-oriented by nature. As such, people tend to “see” only what the mind expects, or wants, to see.

Difficulty Seeing One's Own Error - Individuals, especially when working alone, are particularly susceptible to missing errors.

Limited Perspective - Humans cannot see all there is to see. The inability of the human mind to perceive all facts pertinent to a decision challenges problem-solving.

Susceptibility To Emotional/Social Factors - Anger and embarrassment adversely influence team and individual performance.

Fatigue - People get tired. Physical, emotional, and mental fatigue can lead to error and poor judgment.

Presenteeism - Some employees, out of a need to belong to the workplace, will be present despite a diminished capacity to perform their jobs due to illness or injury.

Analysis techniques

A variety of methods exist for human reliability analysis (HRA).[2][3] Two general classes of methods are those based on probabilistic risk assessment (PRA) and those based on a cognitive theory of control.

PRA-based techniques

One method for analyzing human reliability is a straightforward extension of probabilistic risk assessment (PRA): in the same way that equipment can fail in a power plant, so can a human operator commit errors. In both cases, an analysis (functional decomposition for equipment and task analysis for humans) would articulate a level of detail for which failure or error probabilities can be assigned. This is the basic idea behind the Technique for Human Error Rate Prediction (THERP).[4] THERP is intended to generate human error probabilities that would be incorporated into a PRA. The Accident Sequence Evaluation Program (ASEP) human reliability procedure is a simplified form of THERP; an associated computational tool is the Simplified Human Error Analysis Code (SHEAN).[5] More recently, the US Nuclear Regulatory Commission has published the Standardized Plant Analysis Risk - Human Reliability Analysis (SPAR-H) method to take account of the potential for human error.[6][7]
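The decomposition idea can be illustrated with a toy calculation. This is not THERP itself, which draws nominal error probabilities from published data tables and adjusts them with performance shaping factors and dependency models; the step names and probabilities below are purely hypothetical. Assuming independent task steps, the overall human error probability is one minus the product of the per-step success probabilities:

```python
# Toy illustration of the PRA-style decomposition behind THERP.
# Step names and HEP values are hypothetical, chosen only to show
# how per-step probabilities combine; a real THERP analysis would
# take nominal HEPs from data tables and adjust for shaping factors.

def task_error_probability(step_heps):
    """Probability of at least one error across independent task steps."""
    p_success = 1.0
    for hep in step_heps:
        p_success *= (1.0 - hep)
    return 1.0 - p_success

steps = {
    "read procedure step": 0.003,    # hypothetical per-step HEPs
    "select correct control": 0.001,
    "verify indicator": 0.01,
}

p_err = task_error_probability(steps.values())
print(f"Overall task HEP: {p_err:.4f}")
```

The point of the decomposition is that a task-level error probability can be built up from step-level ones, which is what lets human error enter a plant-level PRA alongside equipment failure rates.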

Cognitive control based techniques

Erik Hollnagel has developed this line of thought in his work on the Contextual Control Model (COCOM)[8] and the Cognitive Reliability and Error Analysis Method (CREAM).[9] COCOM models human performance as a set of four control modes: strategic (based on long-term planning), tactical (based on procedures), opportunistic (based on the present context), and scrambled (random). It also proposes a model of how transitions between these control modes occur, based on a number of factors, including the human operator's estimate of the outcome of the action (success or failure), the time remaining to accomplish the action (adequate or inadequate), and the number of simultaneous goals of the human operator at that time. CREAM is a human reliability analysis method based on COCOM.
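As a sketch only, the transition factors named above (outcome of the previous action, available time, number of goals) can be turned into a toy mode selector. The rules and thresholds here are assumptions for illustration; COCOM itself is a qualitative model and does not prescribe this exact logic:

```python
# Illustrative sketch of COCOM-style control mode selection.
# The decision rules below are hypothetical simplifications of the
# qualitative transition factors Hollnagel describes.

STRATEGIC, TACTICAL, OPPORTUNISTIC, SCRAMBLED = (
    "strategic", "tactical", "opportunistic", "scrambled")

def control_mode(time_adequate, num_goals, last_action_succeeded):
    """Pick a control mode from the COCOM transition factors."""
    if not time_adequate and not last_action_succeeded:
        return SCRAMBLED       # no time and failing actions: near-random trial
    if not time_adequate:
        return OPPORTUNISTIC   # driven by salient cues in the present context
    if num_goals > 1:
        return STRATEGIC       # enough time to plan across several goals
    return TACTICAL            # follow known procedures for the single goal

print(control_mode(time_adequate=True, num_goals=1, last_action_succeeded=True))
```

The value of thinking in control modes is that degraded performance is modeled as a shift to a less deliberate mode, rather than as a fixed per-action error probability.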

Related techniques

Related techniques in safety engineering and reliability engineering include failure mode and effects analysis, hazard and operability study (HAZOP), fault tree analysis, and SAPHIRE (Systems Analysis Programs for Hands-on Integrated Reliability Evaluations).

Human Factors Analysis and Classification System (HFACS)

{{Main|Human Factors Analysis and Classification System}}

The Human Factors Analysis and Classification System (HFACS) was developed initially as a framework to understand the role of "human error" in aviation accidents.[10][11] It is based on James Reason's Swiss cheese model of human error in complex systems. HFACS distinguishes between the "active failures" of unsafe acts, and "latent failures" of preconditions for unsafe acts, unsafe supervision, and organizational influences. These categories were developed empirically on the basis of many aviation accident reports.

"Unsafe acts" are performed by the human operator "on the front line" (e.g., the pilot, the air traffic controller, the driver). Unsafe acts can be either errors (in perception, decision making or skill-based performance) or violations (routine or exceptional). The errors here are similar to the above discussion. Violations are the deliberate disregard for rules and procedures. As the name implies, routine violations are those that occur habitually and are usually tolerated by the organization or authority. Exceptional violations are unusual and often extreme. For example, driving 60 mph in a 55-mph zone is a routine violation, but driving 130 mph in the same zone is exceptional.

There are two types of preconditions for unsafe acts: those that relate to the human operator's internal state and those that relate to the human operator's practices or ways of working. Adverse internal states include those related to physiology (e.g., illness) and mental state (e.g., mentally fatigued, distracted). A third aspect of 'internal state' is really a mismatch between the operator's ability and the task demands; for example, the operator may be unable to make visual judgments or react quickly enough to support the task at hand. Poor operator practices are another type of precondition for unsafe acts. These include poor crew resource management (issues such as leadership and communication) and poor personal readiness practices (e.g., violating the crew rest requirements in aviation).

Four types of unsafe supervision are: inadequate supervision; planned inappropriate operations; failure to correct a known problem; and supervisory violations.

Organizational influences include those related to resources management (e.g., inadequate human or financial resources), organizational climate (structures, policies, and culture), and organizational processes (such as procedures, schedules, oversight).
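The four HFACS levels described above can be captured in a small taxonomy, for example to tag findings from accident reports. The category names follow this section; the lookup helper is illustrative and not part of HFACS itself:

```python
# HFACS levels and categories as described in the section above.
# level_of() is an illustrative helper for tagging report findings;
# it is not part of the HFACS framework itself.

HFACS = {
    "unsafe acts": ["errors", "violations"],
    "preconditions for unsafe acts": [
        "adverse internal states", "poor operator practices"],
    "unsafe supervision": [
        "inadequate supervision", "planned inappropriate operations",
        "failure to correct a known problem", "supervisory violations"],
    "organizational influences": [
        "resource management", "organizational climate",
        "organizational processes"],
}

def level_of(category):
    """Return the HFACS level containing a given category, or None."""
    for level, categories in HFACS.items():
        if category in categories:
            return level
    return None

print(level_of("supervisory violations"))
```

Structuring the taxonomy this way mirrors how HFACS is used in practice: each finding in an accident report is classified at one of the four levels, so that latent organizational failures are counted alongside front-line errors.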

See also

  • Absolute probability judgement
  • ATHEANA (A Technique for Human Event Analysis)
  • Human error assessment and reduction technique (HEART), a technique used in the field of human reliability
  • Influence diagrams approach
  • Latent human error
  • Team error
  • TESEO (Tecnica Empirica Stima Errori Operatori)

Footnotes

1. ^ [https://www.standards.doe.gov/standards-documents/1000/1028-BHdbk-2009-v1/@@images/file DOE-HDBK-1028-2009]
2. ^ Kirwan and Ainsworth, 1992
3. ^ Kirwan, 1994
4. ^ Swain & Guttmann, 1983
5. ^ Simplified Human Error Analysis Code (Wilson, 1993)
6. ^ [https://www.nrc.gov/reading-rm/doc-collections/nuregs/contract/cr6883/ SPAR-H]
7. ^ Gertman et al., 2005
8. ^ Hollnagel, 1993
9. ^ Hollnagel, 1998
10. ^ Shappell and Wiegmann, 2000
11. ^ Wiegmann and Shappell, 2003

References

  • {{cite book|author1=Gertman, D. L. |author2=Blackman, H. S.|year=2001|title=Human reliability and safety analysis data handbook|publisher=Wiley}}
  • {{cite book|author=Gertman, D., Blackman, H., Marble, J., Byers, J. and Smith, C.|year=2005|title=The SPAR-H human reliability analysis method. NUREG/CR-6883. Idaho National Laboratory, prepared for U. S. Nuclear Regulatory Commission}}[https://www.nrc.gov/reading-rm/doc-collections/nuregs/contract/cr6883/]
  • {{cite book|author=M. Cappelli, A.M.Gadomski, M.Sepielli|year=2011|title= Human Factors in Nuclear Power Plant Safety Management: A Socio-Cognitive Modeling Approach using TOGA Meta-Theory. Proceedings of International Congress on Advances in Nuclear Power Plants. Nice (FR),|publisher=SFEN (Société Française d'Energie Nucléaire)}}
  • {{cite book|author=Hollnagel, E.|year=1993|title=Human reliability analysis: Context and control| publisher =Academic Press}}
  • {{cite book|author=Hollnagel, E.|year=1998|title=Cognitive reliability and error analysis method: CREAM|publisher=Elsevier}}
  • {{cite book|author1=Hollnagel, E. |author2=Amalberti, R.|year=2001|title=The Emperor’s New Clothes, or whatever happened to "human error"? Invited keynote presentation at 4th International Workshop on Human Error, Safety and System Development.|publisher=Linköping, June 11–12, 2001}}
  • {{cite book|author=Hollnagel, E., Woods, D. D., and Leveson, N. (Eds.)|year=2006|title=Resilience engineering: Concepts and precepts|publisher=Ashgate}}
  • {{cite book|author=Jones, P. M.|year=1999|title=Human error and its amelioration. In Handbook of Systems Engineering and Management (A. P. Sage and W. B. Rouse, eds.), 687-702|publisher= Wiley}}
  • {{cite book|author=Kirwan, B.|year=1994|title=A Guide to Practical Human Reliability Assessment|publisher=Taylor & Francis}}
  • {{cite book|author=Kirwan, B. and Ainsworth, L. (Eds.)|year=1992|title=A guide to task analysis|publisher=Taylor & Francis}}
  • {{cite book|author=Norman, D.|year=1988|title=The psychology of everyday things|publisher=Basic Books}}
  • {{cite book|author=Reason, J.|year=1990|title=Human error|publisher=Cambridge University Press}}
  • {{cite book|author=Roth, E.|year=1994|title=An empirical investigation of operator performance in cognitive demanding simulated emergencies. NUREG/CR-6208, Westinghouse Science and Technology Center|publisher=Report prepared for Nuclear Regulatory Commission|display-authors=etal}}
  • {{cite book|author=Sage, A. P.|year=1992|title=Systems engineering|publisher=Wiley}}
  • {{cite book|author1=Senders, J. |author2=Moray, N.|year=1991|title=Human error: Cause, prediction, and reduction|publisher=Lawrence Erlbaum Associates}}
  • {{cite book|author1=Shappell, S. |author2=Wiegmann, D.|year=2000|title=The human factors analysis and classification system - HFACS. DOT/FAA/AM-00/7, Office of Aviation Medicine, Federal Aviation Administration, Department of Transportation.}}[https://web.archive.org/web/20061214043156/http://www.nifc.gov/safety_study/accident_invest/humanfactors_class%26anly.pdf]
  • {{cite book|author1=Swain, A. D. |author2=Guttmann, H. E.|year=1983|title=Handbook of human reliability analysis with emphasis on nuclear power plant applications. NUREG/CR-1278|publisher=U. S. Nuclear Regulatory Commission, Washington, D.C.}}
  • {{cite book|author1=Wallace, B. |author2=Ross, A.|year=2006|title=Beyond human error|publisher=CRC Press}}
  • {{cite book|author1=Wiegmann, D. |author2=Shappell, S.|year=2003|title=A human error approach to aviation accident analysis: The human factors analysis and classification system.|publisher=Ashgate}}
  • {{cite book|author=Wilson, J.R.|year=1993|title=SHEAN (Simplified Human Error Analysis code) and automated THERP|publisher=United States Department of Energy Technical Report Number WINCO--11908}}  
  • {{cite book|author=Woods, D. D.|year=1990|title=Modeling and predicting human error. In J. Elkind, S. Card, J. Hochberg, and B. Huey (Eds.), Human performance models for computer-aided engineering (248-274)|publisher=Academic Press}}
  • Federal Aviation Administration. 2009 electronic code of regulations. Retrieved September 25, 2009, from http://www.airweb.faa.gov/Regulatory_and_Guidance_Library

Further reading

  • {{cite book|author=Autrey, T.D.|year=2007|title=Mistake-Proofing Six Sigma: How to Minimize Project Scope and Reduce Human Error|url=http://www.practicingperfectioninstitute.com/reports/sixsigma.aspx|publisher=Practicing Perfection Institute}}
  • {{cite book|author=Davies, J.B., Ross, A., Wallace, B. and Wright, L.|year=2003|title=Safety Management: a Qualitative Systems Approach|publisher=Taylor and Francis}}
  • {{cite book|author=Dekker, S.W.A.|year=2005|title=Ten Questions About Human Error: a new view of human factors and systems safety|url=http://www.leonardo.lth.se/sidney_dekker/books/ten_questions_about_human_error/|publisher=Lawrence Erlbaum Associates}}
  • {{cite book|author=Dekker, S.W.A.|year=2006|title=The Field Guide to Understanding Human Error|url=http://www.leonardo.lth.se/sidney_dekker/books/the_field_guide_to_understanding_human_error/|publisher=Ashgate}}
  • {{cite book|author=Dekker, S.W.A.|year=2007|title=Just Culture: Balancing Safety and Accountability|url=http://www.leonardo.lth.se/sidney_dekker/books/just_culture/|publisher=Ashgate}}
  • {{cite book|author=Dismukes, R. K., Berman, B. A., and Loukopoulos, L. D.|year=2007|title=The limits of expertise: Rethinking pilot error and the causes of airline accidents|publisher=Ashgate}}
  • {{cite book|author=Forester, J., Kolaczkowski, A., Lois, E., and Kelly, D.|year=2006|title=Evaluation of human reliability analysis methods against good practices. NUREG-1842 Final Report|publisher=U. S. Nuclear Regulatory Commission}} [https://www.nrc.gov/reading-rm/doc-collections/nuregs/staff/sr1842/]
  • {{cite book|author=Goodstein, L. P., Andersen, H. B., and Olsen, S. E. (Eds.)|year=1988|title=Tasks, errors, and mental models|publisher=Taylor and Francis}}
  • {{cite book|author1=Grabowski, M. |author2=Roberts, K. H.|year=1996|title={{doi-inline|10.1109/3468.477856|Human and organizational error in large scale systems}}, IEEE Transactions on Systems, Man, and Cybernetics, Volume 26, No. 1, January 1996, 2-16}}
  • {{cite book|author=Greenbaum, J. and Kyng, M. (Eds.)|year=1991|title=Design at work: Cooperative design of computer systems|publisher=Lawrence Erlbaum Associates}}
  • {{cite book|author=Harrison, M.|year=2004|title=Human error analysis and reliability assessment|publisher=Workshop on Human Computer Interaction and Dependability, 46th IFIP Working Group 10.4 Meeting, Siena, Italy, July 3–7, 2004}}  
  • {{cite book|author=Hollnagel, E.|year=1991|title= The phenotype of erroneous actions: Implications for HCI design. In G. W. R. Weir and J. L. Alty (Eds.), Human-computer interaction and complex systems|publisher=Academic Press}}
  • {{cite book|author=Hutchins, E.|year=1995|title=Cognition in the wild|publisher=MIT Press}}
  • {{cite book|author=Kahneman, D., Slovic, P. and Tversky, A. (Eds.)|year=1982|title=Judgment under uncertainty: Heuristics and biases|publisher=Cambridge University Press}}
  • {{cite book|author=Leveson, N.|year=1995|title=Safeware: System safety and computers|publisher=Addison-Wesley}}
  • {{cite book|author=Morgan, G.|year=1986|title=Images of Organization|publisher=Sage}}
  • {{cite book|author=Mura, S. S.|year=1983|title=Licensing violations: Legitimate violations of Grice's conversational principle. In R. Craig and K. Tracy (Eds.), Conversational coherence: Form, structure, and strategy (101-115)|publisher=Sage}}
  • {{cite book|author=Perrow, C.|year=1984|title=Normal accidents: Living with high-risk technologies|publisher=Basic Books}}
  • {{cite book|author=Rasmussen, J.|year=1983|title=Skills, rules, and knowledge: Signals, signs, and symbols and other distinctions in human performance models. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13, 257-267}}
  • {{cite book|author=Rasmussen, J.|year=1986|title=Information processing and human-machine interaction: An approach to cognitive engineering|publisher=Wiley}}
  • {{cite book|author=Silverman, B.|year=1992|title=Critiquing human error: A knowledge-based human-computer collaboration approach|publisher=Academic Press}}
  • {{cite book|author=Swets, J.|year=1996|title=Signal detection theory and ROC analysis in psychology and diagnostics: Collected papers|publisher=Lawrence Erlbaum Associates}}
  • {{cite book|author1=Tversky, A. |author2=Kahneman, D.|year=1974|title= Judgment under uncertainty: Heuristics and biases. Science, 185, 1124-1131}}
  • {{cite book|author=Vaughan, D.|year=1996|title=The Challenger launch decision: Risky technology, culture, and deviance at NASA|publisher=University of Chicago Press}}
  • {{cite book|author=Woods, D. D., Johannesen, L., Cook, R., and Sarter, N.|year=1994|title=Behind human error: Cognitive systems, computers, and hindsight. CSERIAC SOAR Report 94-01|publisher=Crew Systems Ergonomics Information Analysis Center, Wright-Patterson Air Force Base, Ohio}}
  • {{cite book|author=Wu, S., Hrudey, S., French, S., Bedford, T., Soane, E. and Pollard, S.|year=2009|title={{doi-inline|10.1016/j.watres.2009.04.040|A role for human reliability analysis (HRA) in preventing drinking water incidents and securing safe drinking water}}, Water Research, Volume 43, No. 13, 2009, 3227-3238}}
  • CCPS, Guidelines for Preventing Human Error. This book explains both qualitative and quantitative methodologies for predicting human error: a qualitative methodology called SPEAR (Systems for Predicting Human Error and Recovery), and quantitative methodologies including THERP.

External links

Standards and guidance documents

  • IEEE Standard 1082 (1997): IEEE Guide for Incorporating Human Action Reliability Analysis for Nuclear Power Generating Stations
  • [https://www.standards.doe.gov/standards-documents/1000/1028-BHdbk-2009-v1/@@images/file DOE Standard DOE-HDBK-1028-2009 : Human Performance Improvement Handbook]

Tools

  • [https://www.epri.com/#/pages/product/1020436/?lang=en-US EPRI HRA Calculator]
  • [https://web.archive.org/web/20120205054219/http://www.eurocontrol.int/hifa/public/standard_page/Hifa_HifaData_Tools_HumErr.html Eurocontrol Human Error Tools]
  • RiskSpectrum HRA software
  • Simplified Human Error Analysis Code

Research labs

  • [https://archive.is/20121130063656/http://erik.hollnagel.googlepages.com/ Erik Hollnagel] at the Crisis and Risk Research Centre at [https://web.archive.org/web/20081023071732/http://www.mines-paristech.fr/Accueil/ MINES ParisTech]
  • Human Reliability Analysis at the US Sandia National Laboratories
  • [https://web.archive.org/web/20050208200500/http://www.orau.gov/chrs/chrs.htm Center for Human Reliability Studies] at the US Oak Ridge National Laboratory
  • Flight Cognition Laboratory at NASA Ames Research Center
  • [https://web.archive.org/web/20070205134258/http://csel.eng.ohio-state.edu/woods/ David Woods ] at the Cognitive Systems Engineering Laboratory at The Ohio State University
  • Sidney Dekker's Leonardo da Vinci Laboratory for Complexity and Systems Thinking, Lund University, Sweden

Media coverage

  • “How to Avoid Human Error in IT”
  • [https://web.archive.org/web/20050512075435/http://www.iienet.org/magazine/magazinefiles/IENOV2004_outliers_p66.pdf “Human Reliability. We break down just like machines”] Industrial Engineer - November 2004, 36(11): 66

Networking

  • High Reliability Management group at LinkedIn.com
{{Authority control}}{{DEFAULTSORT:Human Reliability}}
