Search for publications in DiVA
1 - 50 of 76
  • 1.
    Benn, J.
    et al.
    Department of Biosurgery and Surgical Technology, Imperial College London, St. Mary's Hospital, Praed Street, Paddington, London W2 1NY, United Kingdom.
    Healey, A.N.
    Department of Biosurgery and Surgical Technology, Imperial College London, St. Mary's Hospital, Praed Street, Paddington, London W2 1NY, United Kingdom.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Improving performance reliability in surgical systems (2008). In: Cognition, Technology & Work, ISSN 1435-5558, E-ISSN 1435-5566, Vol. 10, no 4, p. 323-333. Article in journal (Refereed)
    Abstract [en]

    Health care has evolved rapidly to meet the medical demands of society, but not to meet the demands of consistent operational safety. In other high-risk domains in which the consequences of systems failure are unacceptable, organisational and operational work systems have been developed to deliver consistent, high-quality, failure-free performance. In this paper we review contributions to a special issue of Cognition, Technology and Work on 'Enhancing Surgical Systems'. We consider their implications for improving the reliability of care processes in light of theoretical developments in the area of high-reliability organisations and resilience engineering. Health care must move from reactive safety cultures to be more proactively resilient to the continual threats posed by complexity in clinical care processes and the multi-professional hospital environment. Our analysis emphasises the importance of team working for reliable operational performance. A schematic framework is proposed to illustrate how safety interventions in surgery might cohere within an organisational strategy for achieving high reliability. The implications for continuous quality improvement and effective regulation of system safety over differing time scales and organisational levels are discussed. © 2007 Springer-Verlag London Limited.

  • 2.
    Dekker, Sidney
    et al.
    Linköping University, The Institute of Technology. Linköping University, Department of Mechanical Engineering, Industrial Ergonomics.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Human factors and folk models (2004). In: Cognition, Technology & Work, ISSN 1435-5558, E-ISSN 1435-5566, Vol. 6, p. 79-86. Article in journal (Refereed)
  • 3.
    Dekker, Sidney
    et al.
    Linköping University, The Institute of Technology. Linköping University, Department of Mechanical Engineering, Industrial Ergonomics.
    Ohlsson, Kjell
    Linköping University, The Institute of Technology. Linköping University, Department of Mechanical Engineering, Industrial Ergonomics.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Alm, Håkan
    Work Science, Luleå University of Technology.
    Humans in a complex environment II: Automation, IT and operator work (2003). Book (Other academic)
  • 4.
    Fujita, Yushi
    et al.
    Technova Inc.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Failures without errors: Quantification of context in HRA (2004). In: Reliability Engineering & System Safety, ISSN 0951-8320, E-ISSN 1879-0836, Vol. 83, no 2, p. 145-151. Article in journal (Refereed)
    Abstract [en]

    PSA-cum-human reliability analysis (HRA) has traditionally used individual human actions, hence individual 'human errors', as a meaningful unit of analysis. This is inconsistent with the current understanding of accidents, which points out that the notion of 'human error' is ill defined and that adverse events are more often due to the working conditions than to the people involved. Several HRA approaches, such as ATHEANA and CREAM, have recognised this conflict and proposed ways to deal with it. This paper describes an improvement of the basic screening method in CREAM, whereby a rating of the performance conditions can be used to calculate a Mean Failure Rate directly, without invoking the notion of human error.

  • 5.
    Fujita, Yushi
    et al.
    Technova Inc.
    Hollnagel, Erik
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    From error probabilities to control modes: Quantification of context effects on performance (2002). In: International Workshop on "Building the new HRA": Strengthening the links between experience and HRA, 2002. Conference paper (Other academic)
  • 6.
    Gauthereau, V.
    et al.
    Department of Work Psychology, Université de Liège, Boulevard du Rectorat 5, B-4000 Liège, Belgium.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Planning, control, and adaptation: A case study (2005). In: European Management Journal, ISSN 0263-2373, E-ISSN 1873-5681, Vol. 23, no 1, p. 118-131. Article in journal (Refereed)
    Abstract [en]

    This article relates the findings of an ethnographically informed study conducted at a Swedish nuclear power plant. It describes a set of events while the plant was shut down for a short non-productive outage. In order to better understand how high-reliability organizations successfully manage the conflict between bureaucratic planning and flexibility, we provide a description of simultaneous levels of control that dissolve the conflict. Instead of merely describing actions as planned or improvised, the present work illustrates that no clear-cut distinction can be drawn between what is improvisation and what is not. Understanding planning as a control activity as well enables us to grasp the different values of the canonical activity of planning. © 2005 Elsevier Ltd. All rights reserved.

  • 7.
    Gauthereau, Vincent
    et al.
    Linköping University, The Institute of Technology. Linköping University, Department of Mechanical Engineering.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Building expectations and the management of surprises: Studies of safety-train outages at a Swedish nuclear power plant (2003). In: European Conference on Cognitive Science Approaches to Process Control, 2003. Conference paper (Refereed)
  • 8.
    Gauthereau, Vincent
    et al.
    Linköping University, Department of Mechanical Engineering, Quality Technology and Management. Linköping University, The Institute of Technology.
    Hollnagel, Erik
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    Organisational Improvisation: A Field Study at a Swedish NPP during a Productive Outage (2002). In: European Annual Conference on Human Decision Making and Control, 2002. Conference paper (Refereed)
  • 9.
    Gawinowski, Gilles
    et al.
    Eurocontrol, Bretigny, F.
    Averty, Philippe
    DTI/SDER, Toulouse, F.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Johansson, Björn
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Wise, John A.
    Honeywell International, Phoenix, AZ.
    ERASMUS: A novel human factors approach to air traffic management (2006). In: 27th European Association for Aviation Psychology Jubilee Conference, Potsdam: EAAP, 2006. Conference paper (Refereed)
  • 10.
    Hollnagel, Erik
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    A cognitive systems engineering perspective on automation (2003). In: International Conference on Human Aspects of Advanced Manufacturing: Agility and Hybrid Automation, 2003. Conference paper (Refereed)
  • 11.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    A function-centred approach to joint driver-vehicle system design (2006). In: Cognition, Technology & Work, ISSN 1435-5558, E-ISSN 1435-5566, Vol. 8, no 3, p. 169-173. Article in journal (Refereed)
  • 12.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    A function-centred approach to joint driver-vehicle system design (2004). In: IEEE SMC 2004 International Conference on Systems, Man and Cybernetics, Delft, NL: TU Delft, 2004, p. 2548-. Conference paper (Refereed)
  • 13.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Analysis and prediction of failures in complex systems: Models & methods (2000). In: Lecture notes in control and information sciences, ISSN 0170-8643, E-ISSN 1610-7411, Vol. 253, p. 39-41. Article in journal (Refereed)
  • 14.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Applying AI to models in cognitive ergonomics (2003). In: Handbook of Human Factors & Ergonomics / [ed] Neville Stanton, Tokyo: Asakura Publishing Company, Ltd., 2003, 1, p. 256-271. Chapter in book (Other academic)
    Abstract [en]

    Research suggests that ergonomists tend to restrict themselves to two or three of their favorite methods in the design of systems, despite a multitude of variations in the problems that they face. Human Factors and Ergonomics Methods delivers an authoritative and practical account of methods that incorporate human capabilities and limitations, environmental factors, human-machine interaction, and other factors into system design. The Handbook describes 83 methods in a standardized format, promoting the use of methods that may have formerly been unfamiliar to designers. The handbook comprises six sections, each representing a specialized field of ergonomics with a representative selection of associated methods. The sections highlight facets of human factors and ergonomics in systems analysis, design, and evaluation. Sections I through III address individuals and their interactions with the world. Section IV explores social groupings and their interactions (team methods), and Section V examines the effect of the environment on workers. The final section provides an overview of work systems (macroergonomics) methods. An onion-layer model frames each method, working from the individual, to the team, to the environment, to the work system. Each chapter begins with an introduction written by the chapter's editor, offering a brief overview of the field and a description of the methods covered. The Handbook provides a representative set of contemporary methods that are valuable in ergonomic analyses and evaluations. The layout of each chapter is standardized for ease of use, so you can quickly locate relevant information about each method. Content descriptions are brief, and references are made to other texts, papers, and case studies. Standard descriptions of methods encourage browsing through several potential methods before tackling a problem.

  • 15.
    Hollnagel, Erik
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    Att förstå olyckor: lätt att bli syndabock i "effektiv" organisation [Understanding accidents: easy to become a scapegoat in an "efficient" organisation] (2003). In: Nucleus, ISSN 1104-4578, Vol. 1, p. 12-17. Article in journal (Other academic)
  • 16.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Automation and human work (2004). In: Human factors for engineers / [ed] Carl Sandom and Roger Harvey, London: IEEE, 2004, p. 113-150. Chapter in book (Other academic)
    Abstract [en]

    This book introduces the reader to the subject of human factors and provides practical and pragmatic advice to assist engineers in designing interactive systems that are safer, more secure and easier to use - thereby reducing accidents due to human error, increasing system integrity and enabling more efficient process operations. The book discusses human factors integration methodology and reviews the issues that underpin consideration of key topics such as human error, automation and human reliability assessment. There are also design considerations, including control room and interface design, and acceptance and verification considerations.

  • 17.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Barrier analysis and accident prevention (2003). In: Innovation and consolidation in aviation: selected contributions to the Australian Aviation Psychology Symposium 2000 / [ed] Graham Edkins; Peter Pfister, Aldershot: Ashgate Publishing Limited, 2003, p. 59-74. Chapter in book (Other academic)
    Abstract [en]

    This is the formal refereed proceedings of the fifth Australian Aviation Psychology Symposium. The symposium had a diverse range of contributions and development workshops, bringing together practitioners from aviation psychology and human factors, flight operations management, safety managers, pilots, cabin crew, air traffic controllers, engineering and maintenance personnel, air safety investigators, staff from manufacturers and regulatory bodies and applied aviation industry researchers and academics.

    The volume expands the contribution of aviation psychology and human factors to the aviation industry within the Asia Pacific region, developing the safety, efficiency and viability of the industry. It is a forward-looking work, providing strategies for psychology and human factors to increase the safe and effective functioning of aviation organizations and systems, pertinent to both civil and military operations.

  • 18.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Barriers and accident prevention (2004). Book (Other academic)
    Abstract [en]

    Accidents are preventable, but only if they are correctly described and understood. Since the mid-1980s accidents have come to be seen as the consequence of complex interactions rather than simple threads of causes and effects. Yet progress in accident models has not been matched by advances in methods. The author's work in several fields (aviation, power production, traffic safety, healthcare) made it clear that there is a practical need for constructive methods, and this book presents the experiences and the state of the art. The focus of the book is on accident prevention rather than accident analysis and, unlike other books, it takes a proactive rather than reactive approach. The emphasis on design rather than analysis is a trend also found in other fields. Features of the book include:
      • A classification of barrier functions and barrier systems that will enable the reader to appreciate the diversity of barriers and to make informed decisions for system changes.
      • A perspective on how the understanding of accidents (the accident model) largely determines how the analysis is done and what can be achieved. The book critically assesses three types of accident models (sequential, epidemiological, systemic) and compares their strengths and weaknesses.
      • A specific accident model that captures the full complexity of systemic accidents. One consequence is that accidents can be prevented through a combination of performance monitoring and barrier functions, rather than through the elimination or encapsulation of causes.
      • A clearly described methodology for barrier analysis and accident prevention.
    Written in an accessible style, Barriers and Accident Prevention is designed to provide a stimulating and practical guide for industry professionals familiar with the general ideas of accidents and human error. The book is directed at those involved with accident analysis and system safety, such as managers of safety departments, risk and safety consultants, human factors professionals, and accident investigators. It is applicable to all major application areas such as aviation, ground transportation, maritime, process industries, healthcare and hospitals, communication systems, and service providers.

  • 19.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Computers and covert automation: A cognitive systems engineering view (2004). In: 7th International Conference on Work With Computing Systems, Kuala Lumpur: WWCS, 2004, p. 15-. Conference paper (Refereed)
  • 20.
    Hollnagel, Erik
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    Decisions about "what", and decisions about "how" (2003). In: Human Factors of Decision Making in Complex Systems, 2003, p. 8-10. Conference paper (Refereed)
  • 21.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Developing measurements of driving performance for the effects of active safety systems (2003). In: Triennial Conference, International Ergonomics Association, 2003. Conference paper (Refereed)
  • 22.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Flight decks and free flight: Where are the boundaries? (2004). In: The Flightdeck of the Future: Human Factors in Datalink and Freeflight, Nottingham, UK: University of Nottingham, 2004. Conference paper (Other academic)
  • 23.
    Hollnagel, Erik
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    From cognitive task analysis to cognitive task design (2003). In: Triennial Conference, International Ergonomics Association, 2003. Conference paper (Refereed)
  • 24.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    From HMS to HCI - and back again (2001). In: People in control: Human factors in control room design / [ed] Jan Noyes and Matthew Bransby, London, UK: The Institution of Engineering and Technology, 2001, p. xvii-xx. Chapter in book (Other academic)
    Abstract [en]

    The aim of this book is to provide state-of-the-art information on various aspects of human-machine interaction and human-centred issues encountered in the control room setting. It is illustrated with useful case studies.

  • 25.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    From human factors to cognitive systems engineering: Human-machine interaction in the 21st Century (2001). In: Anzen-no-Tankyu (Researches on safety), Tokyo: ERC Publishing, 2001. Chapter in book (Other academic)
  • 26.
    Hollnagel, Erik
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    From human-centred to function-centred design (2002). In: International Conference on The Sciences of Design: The Scientific Challenge for the 21st Century, 2002. Conference paper (Refereed)
  • 27.
    Hollnagel, Erik
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    From levels of automation to layers of control (2003). In: NATO RTO Human Factors and Medicine Panel Workshop: Uninhabited Military Vehicles - Human Factors of Augmenting the Force, 2003. Conference paper (Other academic)
  • 28.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Handbook of cognitive task design (2003). Book (Other academic)
    Abstract [en]

    Offers the theories, models, and methods related to cognitive task design. This work summarizes the extensive, worldwide experience with cognitive task design since the 1980s. It defines the state of the art and outlines the future of this ever-developing field.

  • 29.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    How to assess the risks of human erroneous actions (2003). In: Safety and reliability: interactions between machines, software and people / [ed] Jörgen Eklund, Jens-Peder Ekros; contributions by Jörgen Eklund, Linköping: Linköpings universitet, 2003, p. 23-36. Chapter in book (Other academic)
  • 30.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Is affective computing an oxymoron? (2003). In: International journal of human-computer studies, ISSN 1071-5819, E-ISSN 1095-9300, Vol. 59, no 1-2, p. 65-70. Article, review/survey (Refereed)
    Abstract [en]

    This review presents an overview of affective computing in relation to human-computer interaction. It deals with the question of whether a computer can have affective states similar to those of humans, and hence whether a computer can also show affect and emotion. In this regard, it also discusses whether communication can be made more effective by imitating its emotional aspects.

  • 31.
    Hollnagel, Erik
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    Joint human-computer system dependability (2002). In: European Conference on Cognitive Ergonomics and Safe, 2002. Conference paper (Refereed)
  • 32.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Looking for errors of omission and commission or The Hunting of the Snark revisited (2000). In: Reliability Engineering & System Safety, ISSN 0951-8320, E-ISSN 1879-0836, Vol. 68, no 2, p. 135-145. Article in journal (Refereed)
    Abstract [en]

    Since the early 1990s, considerable effort has been spent to understand what is meant by an 'error of commission' (EOC), to complement the traditional notion of an 'error of omission' (EOO). This paper argues that the EOO-EOC dyad, as an artefact of the PSA event tree, is insufficient for human reliability analysis (HRA) for several reasons: (1) EOO-EOC fail to distinguish between manifestation and cause; (2) EOO-EOC refer to classes of incorrect actions rather than to specific instances; (3) there is no unique way of classifying an event using EOO-EOC; (4) the set of error modes that cannot reasonably be classified as EOO is too diverse to fit into any single category of its own. Since the use of EOO-EOC leads to serious problems for HRA, an alternative is required. This can be found in the concept of error modes, which has a long history in risk analysis. A specific system for error mode prediction was tested in a simulator experiment. The analysis of the results showed that error modes could be qualitatively predicted with sufficient accuracy (68% correct) to propose this method as a way to determine how operator actions can fail in PSA-cum-HRA. Although this still leaves the thorny issue of quantification, a consistent prediction of error modes provides a better starting point for determining probabilities than the EOO-EOC dyad. It also opens a possibility for quantification methods where the influence of the common performance conditions is prior to and more important than individual failure rates.

  • 33.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Modelling the controller of a process (1999). In: Transactions of the Institute of Measurement and Control, ISSN 0142-3312, E-ISSN 1477-0369, Vol. 21, no 4-5, p. 163-170. Article in journal (Refereed)
    Abstract [en]

    Models of humans (operators, controllers) in human-machine systems have tacitly assumed that humans must have a model of the process in order to control it. Humans have therefore traditionally been described as information processing systems with an internal or mental model of the process as an important component. A more systemic or cybernetic view acknowledges that the human must be a model of the process in order to control it. This suggests a different approach to modelling, which is functional rather than structural, and where the emphasis is on how the joint human-machine system can maintain control of a situation. A specific model, called the Contextual Control Model (COCOM), which is based on the principles of cognitive systems engineering, illustrates the approach. COCOM provides a foundation for analysing controller performance as well as implementing controller needs.

  • 34.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Modelling the orderliness of human action (2000). In: Cognitive engineering in the aviation domain / [ed] Nadine B. Sarter; René Amalberti, Hillsdale, NJ: Erlbaum, 2000, p. -376. Chapter in book (Other academic)
    Abstract [en]

    Although cognitive engineering has gained widespread acceptance as one of the most promising approaches to addressing and preventing difficulties with human-machine coordination and collaboration, it still meets with considerable skepticism and resistance in some of the industries that could benefit from its insights and recommendations. The challenge for cognitive engineers is to better understand the reasons underlying these reservations and to overcome them by demonstrating and communicating more effectively their concepts, approaches, and proposed solutions. To contribute to this goal, the current volume presents concrete examples of cognitive engineering research and design. It is an attempt to complement the already existing excellent literature on cognitive engineering in domains other than aviation and to introduce professionals and students in a variety of domains to this rather young discipline.

  • 35.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Modelos de acidentes e análises de acidentes [Accident models and accident analyses] (2003). In: Caminhos da análise de acidentes do trabalho [Approaches to the analysis of work accidents] / [ed] Ildeberto Muniz de Almeida, Brasília: Ministério do Trabalho e Emprego, 2003, p. 99-105. Chapter in book (Other academic)
  • 36.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Prolegomenon to cognitive task design (2003). In: Handbook of cognitive task design / [ed] Erik Hollnagel, Mahwah, NJ: Erlbaum, 2003, 1, p. -808. Chapter in book (Other academic)
    Abstract [en]

    This Handbook serves as a single source for theories, models, and methods related to cognitive task design. It provides the scientific and theoretical basis required by industrial and academic researchers, as well as the practical and methodological guidance needed by practitioners who face problems of building safe and effective human-technology systems. Fundamental across a wide range of disciplines, from military systems to consumer goods and process industries, cognitive task design covers the whole life-cycle of work from pre-analysis, specification, design, risk assessment, implementation, training, daily operation, fault finding, maintenance, and upgrading. It applies to people, sophisticated machines, and to human-machine ensembles. This comprehensive volume summarizes the extensive, worldwide experience with cognitive task design since the 1980s; it defines the state of the art and outlines the future of this ever-developing field. Aimed at the graduate and postgraduate level, the Handbook of Cognitive Task Design is applicable to courses relating to the design of human-technology systems, interaction design, cognitive engineering, and applied industrial engineering.

  • 37.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Resilience: The challenge of the unstable (2006). In: Resilience engineering: Concepts and precepts / [ed] Erik Hollnagel, David D. Woods, Nancy Leveson, Aldershot, UK: Ashgate, 2006, 1, p. -397. Chapter in book (Other academic)
    Abstract [en]

    For Resilience Engineering, 'failure' is the result of the adaptations necessary to cope with the complexity of the real world, rather than a breakdown or malfunction. The performance of individuals and organizations must continually adjust to current conditions and, because resources and time are finite, such adjustments are always approximate. This definitive new book explores this groundbreaking development in safety and risk management, where 'success' is based on the ability of organizations, groups and individuals to anticipate the changing shape of risk before failures and harm occur. Featuring contributions from many of the world's leading figures in the fields of human factors and safety, "Resilience Engineering" provides provocative insights into system safety as an aggregate of its various components, subsystems, software, organizations, human behaviours, and the way in which they interact. The book provides an introduction to Resilience Engineering of systems, covering both the theoretical and practical aspects. It is written for those responsible for system safety on managerial or operational levels alike, including safety managers and engineers (line and maintenance), security experts, risk and safety consultants, human factors professionals and accident investigators.

  • 38.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Task analysis (2004). In: Berkshire Encyclopedia of human-computer interaction / [ed] William Sims Bainbridge, Great Barrington, MA: Berkshire Publishing Group, 2004, p. 707-711. Chapter in book (Other academic)
    Abstract [en]

    HCI (Human-Computer Interaction) is everywhere. With a team of 200 experts, the "Berkshire Encyclopedia of Human-Computer Interaction" is a comprehensive guide to every aspect of HCI. The work covers the field's history, breakthroughs, current research, and future direction. An ideal reference for students, educators, professionals, and business leaders, the encyclopedia is full of lively sidebars, more than 75 photos, charts, tables and figures, glossaries, and a rich and comprehensive Master Bibliography of HCI.

  • 39.
    Hollnagel, Erik
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    The role of automation in joint cognitive systems (2003). In: Symposium on Automated Systems Based on Human Skill and Knowledge, 2003. Conference paper (Refereed)
  • 40.
    Hollnagel, Erik
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    Thinking about artefacts and work: An introduction to cognitive task design (2003). In: Triennial Conference International Ergonomics Association, X: X, 2003. Conference paper (Refereed)
  • 41.
    Hollnagel, Erik
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    Time and time again (2002). In: Theoretical Issues in Ergonomics Science, ISSN 1463-922X, Vol. 3, no 2, p. 143-158. Article in journal (Refereed)
    Abstract [en]

    In the study of human–machine systems, the need to have a model of the user is by now taken for granted. The model can be used both as support for design and analysis and as a representation of the user that resides somewhere in the machine. When it comes to the practice of modelling, two characteristic approaches can be recognized. The first focuses on the how of modelling, and is concerned mainly with the structure and contents of models. The second focuses on what is being modelled, and is concerned mainly with the functioning or performance of the model. The first approach has dominated human–machine systems research for several decades, and has led to orthodoxy in modelling by which certain structural characteristics are accepted without questioning. This unreflective attitude to modelling has been criticised several times, although with little effect. In taking the second approach and focusing on what should be modelled, two important issues are that human performance varies in level of control, i.e. in terms of how orderly it is, and that thinking and acting take time, and occur in a context where time is limited. Although it is clearly essential that user models can account for these characteristics, very few existing models are capable of doing so because they focus on internal information processing rather than on performance in a dynamic environment. The paper describes a type of functional model, called contextual control models, which shows how it is possible to account for both different control modes and how performance is affected by time. Indeed, control and time are intimately linked and loss of one may lead to a loss of the other. The contextual control model distinguishes among four characteristic control modes (strategic, tactical, opportunistic and scrambled) and two time parameters (time to evaluate, time to select) that are seen relative to the available time. Finally, a number of applications of contextual control models are described.

  • 42.
    Hollnagel, Erik
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    Time to think and time to do? I can fail and so can you! (2003). In: International Australian Aviation Psychology Symposium, X: X, 2003. Conference paper (Refereed)
  • 43.
    Hollnagel, Erik
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    Understanding accidents - from root causes to performance variability (2002). In: Conference on Human Factors and Power Plants, IEEE, 2002. Conference paper (Refereed)
  • 44.
    Hollnagel, Erik
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    Understanding performance deviations (2002). In: International Symposium on Artificial Intelligence, Robotics and Human Centered Technology for Nuclear Applications, 2002. Conference paper (Refereed)
  • 45.
    Hollnagel, Erik
    et al.
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Bye, A.
    Principles for modelling function allocation (2000). In: International Journal of Human-Computer Studies, ISSN 1071-5819, E-ISSN 1095-9300, Vol. 52, no 2, p. 253-265. Article in journal (Refereed)
    Abstract [en]

    Automation is a key element in the safety and reliability of industrial processes. Selecting the right type and level of automation requires careful consideration of how to allocate tasks between operators and automation. This is important so that the joint system, human and machine seen together, performs in the intended manner. The Halden Reactor Project is currently engaged in a project to study this topic, with an emphasis on maximizing the operator's ability to maintain control and handle unexpected events. Functional models can be used to study this in a process control environment, because they explicitly describe the functions that must be provided by the process or the operator. This paper describes how functional modelling of the joint system can be used to provide a basis for how functions should be allocated.

  • 46.
    Hollnagel, Erik
    et al.
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Gauthereau, Vincent
    Linköping University, The Institute of Technology. Linköping University, Department of Mechanical Engineering, Quality Technology and Management.
    Persson, Bodil
    Linköping University, The Institute of Technology. Linköping University, Department of Mechanical Engineering, Quality Technology and Management.
    Operational readiness verification, Phase 3: A Field Study at a Swedish NPP during a Productive Outage (Safety-train Outage) (2004). Report (Other academic)
  • 47.
    Hollnagel, Erik
    et al.
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Goteman, Örjan
    SAS.
    The functional resonance accident model (2004). In: Cognitive Systems Engineering in Process Control, 2004, Sendai, Japan: Tohoku University. Conference paper (Refereed)
  • 48.
    Hollnagel, Erik
    et al.
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Källhammer, Jan-Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Effects of a night vision enhancement system (NVES) on driving: Results from a simulator study (2003). In: International Driving Symposium on Human Factors in Driver Assessment, Training, and Vehicle Design, 2003. Conference paper (Refereed)
  • 49.
    Hollnagel, Erik
    et al.
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    Niwa, Y.
    Institute of Nuclear Safety Systems, 64, Sata, Mihama-cho, Makata-gun, Fukui , Japan.
    Input requirements to a performance monitoring system (2003). In: 10th International Conference on Human-Computer Interaction, X: X, 2003. Conference paper (Refereed)
  • 50.
    Hollnagel, Erik
    et al.
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    Nåbo, Arne
    Saab Automobile AB, Trollhättan.
    Lau, I.
    General Motors Corporation .
    A systemic model for Driver-in-Control (2003). In: International Driving Symposium on Human Factors in Driver Assessment, Training, and Vehicle Design, 2003. Conference paper (Refereed)
    Abstract [en]

    Models of driving have traditionally been couched either in terms of guidance and control or in terms of human factors. There is, however, a need for more powerful models that can match the rapidly growing complexity and sophistication of modern cars. Such models must provide coherent and consistent ways of describing driver performance to help engineers develop and validate technical concepts for semi- and fully automated systems in cars. This paper presents a qualitative model for Driver-in-Control (DiC) based on the principles of cognitive systems engineering. The model describes driving in terms of multiple, simultaneous control loops with the joint driver-vehicle system (JDVS) as a unit. This provides the capability to explain how disturbances may propagate between control levels. The model also enables new functions to be evaluated at the specific level at which they are aimed, rather than by their effects on global driving performance.
