liu.se: Search for publications in DiVA
1 - 50 of 169
  • 1.
    Herrera, Ivonne A.
    et al.
    Norwegian University of Science and Technology (NTNU).
    Woltjer, Rogier
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, Human-Centered systems.
    Comparing a multi-linear (STEP) and systemic (FRAM) method for accident analysis, 2010. In: Reliability Engineering & System Safety, ISSN 0951-8320, Vol. 95, no. 12, p. 1269-1275. Article in journal (Refereed)
    Abstract [en]

    Accident models and analysis methods affect what accident investigators look for, which contributory factors are found, and which recommendations are issued. This paper contrasts the Sequentially Timed Events Plotting (STEP) method and the Functional Resonance Analysis Method (FRAM) for accident analysis and modelling. The main issues addressed are the comparison of the established multi-linear method STEP with the new systemic method FRAM, and the new insights the latter provides for accident analysis. Since STEP and FRAM are based on different understandings of the nature of accidents, the comparison of the methods focuses on what we can learn from both methods, and how, when, and why to apply them. The main finding is that STEP helps to illustrate what happened, involving which actors at what time, whereas FRAM illustrates the dynamic interactions within socio-technical systems and lets the analyst understand the how and why by describing non-linear dependencies, performance conditions, variability, and their resonance across functions.
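    The FRAM idea summarised in the abstract can be illustrated with a minimal sketch: a model is a set of functions, each described by six aspects, and potential couplings arise where one function's Output matches an aspect of another function, which is where performance variability may resonate. The function names and the string-matching coupling rule below are hypothetical illustrations, not part of the published method.

```python
# Minimal sketch of a FRAM-style model (illustrative only): each function
# is described by six aspects; couplings arise where one function's Output
# feeds a non-output aspect of a downstream function.
from dataclasses import dataclass, field

ASPECTS = ("input", "output", "precondition", "resource", "time", "control")

@dataclass
class Function:
    name: str
    aspects: dict = field(default_factory=dict)  # aspect -> description

def couplings(functions):
    """Find potential couplings: an Output of one function matching a
    non-output aspect of another (where variability may resonate)."""
    links = []
    for src in functions:
        out = src.aspects.get("output")
        if out is None:
            continue
        for dst in functions:
            if dst is src:
                continue
            for asp in ASPECTS:
                if asp != "output" and dst.aspects.get(asp) == out:
                    links.append((src.name, asp, dst.name))
    return links

# Hypothetical example functions from an emergency-response setting.
f1 = Function("detect fire", {"output": "fire alarm"})
f2 = Function("dispatch units", {"input": "fire alarm", "output": "units en route"})
print(couplings([f1, f2]))
```

    In a full FRAM analysis the couplings are instantiated for a concrete event and each function's variability is characterised; the sketch only shows the structural part of the model.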

  • 2.
    Ahrenberg, Lars
    et al.
    Linköping University, Department of Computer and Information Science, NLPLAB - Natural Language Processing Laboratory. Linköping University, The Institute of Technology.
    Dahlbäck, Nils
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    Thureé, Åke
    Linköping University.
    Jönsson, Arne
    Linköping University, Department of Computer and Information Science, NLPLAB - Natural Language Processing Laboratory. Linköping University, The Institute of Technology.
    Customizing Interaction for Natural Language Interfaces, 1996. Report (Other academic)
    Abstract [en]

    Habitability and robustness have been noted as important qualities of natural-language interfaces. In this paper we discuss how these requirements can be met, in particular as regards the system's ability to support a coherent and smooth dialogue. The discussion is based on current work on customizing a dialogue system for three different applications. We adopt a sublanguage approach to the problem and propose a method for customization combining bottom-up use of empirical data with a global pragmatic analysis of a given application. Finally, we suggest three design principles that have emerged from our work, called the sublanguage principle, the asymmetry principle and the quantity principle.

    Download full text (pdf)
    http://www.ep.liu.se/ea/cis/1996/001/cis96001.pdf
  • 3.
    Alm, Torbjörn
    et al.
    Linköping University, Department of Management and Engineering, Industrial ergonomics. Linköping University, The Institute of Technology.
    Kovordányi, Rita
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    Ohlsson, Kjell
    Linköping University, Department of Management and Engineering, Industrial ergonomics. Linköping University, The Institute of Technology.
    Continuous versus Situation-dependent Night Vision Presentation in Automotive Applications, 2006. Conference paper (Refereed)
    Abstract [en]

    As the number of advanced driver assistance systems in modern cars increases, the question of possible negative behavioral adaptation arises. We have investigated this phenomenon for night vision systems in a driving simulator. One common opinion is that there is a risk that drivers will use the enhanced visual conditions that come with these systems to increase speed during nighttime driving, thereby eliminating the safety margins the system was designed to provide. In our study, two system approaches were compared: one with continuous presentation and one with presentation only when dangerous objects were detected by the system. The latter approach was meant to minimize the risk of negative adaptation, which was partly confirmed in the study. Moreover, the results showed better and more consistent driver performance with the situation-dependent system, and all subjects preferred this approach from a workload perspective.

  • 4.
    Alm, Torbjörn
    et al.
    Linköping University, The Institute of Technology. Linköping University, Department of Mechanical Engineering, Industrial Ergonomics.
    Ohlsson, Kjell
    Linköping University, The Institute of Technology. Linköping University, Department of Mechanical Engineering, Industrial Ergonomics.
    Kovordanyi, Rita
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Glass Cockpit Simulators - Tools for IT-based Car Systems Design and Evaluation, 2005. Conference paper (Refereed)
  • 5.
    Aminoff, Hedviq
    et al.
    Infocentret Ankaret.
    Johansson, Björn
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Trnka, Jiri
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, GIS - Geographical Information Science Group.
    Understanding Coordination in Emergency Response, 2007. In: European Annual Conference on Human Decision-Making and Manual Control, 2007, Lyngby, Denmark: Technical University of Denmark. Conference paper (Other academic)
    Abstract [en]

    This paper describes and discusses the analysis of an emergency management exercise. In the exercise scenario, different emergency management organizations jointly try to cope with a forest fire and related incidents. The Extended Control Model is utilized to examine the establishment of an emergent emergency response organization. Ambiguity in how functions are to be handled in a large event, indicating vulnerabilities in the face of larger crises; functions moving across roles during the evolving event; and recognizable phases of a response are uncovered. This is assessed by utilizing episodic analysis of the communication between different functions and roles in the participating emergency management organizations. The results indicate requirements for future information and communication technologies, and occurrences that can be explored in future studies.

  • 6.
    Arvidsson, Fredrik
    et al.
    Linköping University, Department of Computer and Information Science, ASLAB - Application Systems Laboratory. Linköping University, The Institute of Technology.
    Ihlström, Carina
    Högskolan Halmstad.
    Lundberg, Jonas
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    Visions of Future News - Consensus or Conflict?, 2002. In: Information Systems Research Seminar in Scandinavia, 2002. Conference paper (Refereed)
    Abstract [en]

    The move from print to multimedia will cause changes not only to the form of the news service but also to the processes involved in the news organizations. The cooperative scenario-building technique is used on a number of groups (end-users, management, and media professionals) to envision the news services of the future. We take the perspectives of consensus and conflict to illustrate the identified visions. Firstly, we illuminate conflicts and consensus between the groups regarding their visions and future use scenarios. Secondly, we show the implications of using the cooperative scenario-building technique in relation to the consensus and conflict perspectives in cooperative design. We conclude that both consensus and conflicts could be found in the scenarios described in the paper and that the cooperative technique was suitable in this context.

  • 7.
    Arvola, Mattias
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    The Interaction Character of Computers in Co-located Collaboration, 2003. In: People and Computers XVII: Proceedings of HCI 2003: Designing for Society / [ed] O'Neill, E., Palanque, P., Johnson, P., London: Springer, 2003. Conference paper (Refereed)
    Abstract [en]

    An INTERACTION CHARACTER refers to a coherent set of qualities of the actions that an application mediates. Examples of such characters include the ‘computer as a tool’ and the ‘computer as a medium’. This paper investigates INTERACTION CHARACTERS of applications used in co-located collaboration. Three qualitative cases have been investigated: consultation at banks, interaction design studio work, and interactive television usage. Interviews, observations, and workshops, as well as prototype design and testing, were conducted as part of the case studies. The results show that the INTERACTION CHARACTER may change swiftly in the middle of usage, which means that people are using the systems quite differently from one moment to the next. One way to increase the flexibility of a system is to facilitate those shifts between different INTERACTION CHARACTERS, by for instance letting people use the system as a tool one minute, and as a medium or a resource the next.

  • 8.
    Benn, J.
    et al.
    Department of Biosurgery and Surgical Technology, Imperial College London, St. Mary's Hospital, Praed Street, Paddington, London W2 1NY, United Kingdom.
    Healey, A.N.
    Department of Biosurgery and Surgical Technology, Imperial College London, St. Mary's Hospital, Praed Street, Paddington, London W2 1NY, United Kingdom.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Improving performance reliability in surgical systems, 2008. In: Cognition, Technology & Work, ISSN 1435-5558, E-ISSN 1435-5566, Vol. 10, no. 4, p. 323-333. Article in journal (Refereed)
    Abstract [en]

    Health care has evolved rapidly to meet the medical demands of society, but not to meet the demands of consistent operational safety. In other high-risk domains in which the consequences of systems failure are unacceptable, organisational and operational work systems have been developed to deliver consistent, high-quality, failure-free performance. In this paper we review contributions to a special issue of Cognition, Technology & Work on 'Enhancing Surgical Systems'. We consider their implications for improving the reliability of care processes in light of theoretical developments in the area of high-reliability organisations and resilience engineering. Health care must move from reactive safety cultures toward proactive resilience to the continual threats posed by complexity in clinical care processes and the multi-professional hospital environment. Our analysis emphasises the importance of team working for reliable operational performance. A schematic framework is proposed to illustrate how safety interventions in surgery might cohere within an organisational strategy for achieving high reliability. The implications for continuous quality improvement and effective regulation of system safety over differing time scales and organisational levels are discussed. © 2007 Springer-Verlag London Limited.

  • 9.
    Blomquist, Åsa
    et al.
    Swedish National Tax Board, Solna.
    Arvola, Mattias
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    Personas in Action: Ethnography in an Interaction Design Team, 2002. In: The Second Nordic Conference on Human-Computer Interaction / [ed] Bertelsen, Olav W., New York: ACM, 2002, p. 197-200. Conference paper (Refereed)
    Abstract [en]

    Alan Cooper's view on interaction design is both appealing and provoking since it avoids the problems of involving users by simply excluding them. The users are instead represented by an archetype of a user, called a persona. This paper reports a twelve-week participant observation in an interaction design team with the purpose of learning what really goes on in a design team when they implement personas in their process. On the surface it seemed like they used personas, but our analysis shows how they had difficulties in using them and encountered problems when trying to imagine the user. We furthermore describe and discuss how the design team tried to involve users in order to compensate for these problems. It is concluded that it is not enough for the design team, and particularly not for the interaction designers, to have the know-how of using the method. They also have to integrate it with existing knowledge and practices in order to feel at home with it and use it efficiently.

  • 10.
    Burman, Linn
    et al.
    Skogforsk, Sweden.
    Dahlbäck, Nils
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, Faculty of Arts and Sciences.
    Knowledge Carrying and Knowledge Emphasizing Animations: A Useful Distinction when Developing Educational Software, 2008. In: Proceedings of World Conference on Educational Multimedia, Hypermedia and Telecommunications 2008 / [ed] Joseph Luca & Edgar R. Weippl, Chesapeake, VA: AACE, 2008, p. 1228-1233. Conference paper (Refereed)
    Abstract [en]

    Previous research on the usefulness of animations in educational software has not shown any unequivocal advantage of animations. We claim that these conflicting results at least in part stem from not distinguishing between different educational communicative uses of animations. We suggest that it is necessary to distinguish between knowledge carrying and knowledge emphasizing animations, and suggest guidelines for when animations can and cannot be used in educational software. We illustrate our analysis with examples from learners’ responses to educational software for the teaching of optics. Users not familiar with the field were found to have difficulties in distinguishing cases where the animation in itself is something to be learned from cases where the purpose of the animation is to draw the users’ attention to something that is to be learned. By adhering to the distinction above, we found no misunderstandings from our users regarding the purpose of the animations.

  • 11.
    Dahlbäck, Nils
    et al.
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, Faculty of Arts and Sciences.
    Wang, QianYing
    Stanford University, Stanford, CA.
    Nass, Clifford
    Stanford University, Stanford, CA.
    Alwin, Jenny
    Linköping University, Department of Social and Welfare Studies, Äldre - vård - civilsamhälle (ÄVC) . Linköping University, Faculty of Health Sciences.
    Similarity is more important than expertise: accent effects in speech interfaces, 2007. In: CHI '07: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, New York, NY, USA: ACM, 2007, p. 1553-1556. Conference paper (Refereed)
    Abstract [en]

    In a balanced between-participants experiment (N = 96), American and Swedish participants listened to tourist information on a website about an American or Swedish city, presented in English with either an American or Swedish accent, and evaluated the speakers' knowledge of the topic, the voice characteristics, and the information characteristics. Users preferred accents similar to their own. Similarity-attraction effects were so powerful that same-accent speakers were viewed as being more knowledgeable than different-accent speakers even when the information would be much better known by the opposite-accent speaker. Implications for similarity-attraction overwhelming expertise are discussed.

  • 12.
    Dekker, Sidney
    et al.
    Linköping University, The Institute of Technology. Linköping University, Department of Mechanical Engineering, Industrial Ergonomics.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Human factors and folk models, 2004. In: Cognition, Technology & Work, ISSN 1435-5558, E-ISSN 1435-5566, Vol. 6, p. 79-86. Article in journal (Refereed)
  • 13.
    Dekker, Sidney
    et al.
    Linköping University, The Institute of Technology. Linköping University, Department of Mechanical Engineering, Industrial Ergonomics.
    Ohlsson, Kjell
    Linköping University, The Institute of Technology. Linköping University, Department of Mechanical Engineering, Industrial Ergonomics.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Alm, Håkan
    Work Science, Luleå University of Technology.
    Humans in a complex environment II: Automation, IT and operator work, 2003. Book (Other academic)
  • 14.
    Fujita, Yushi
    et al.
    Technova Inc.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Failures without errors: Quantification of context in HRA, 2004. In: Reliability Engineering & System Safety, ISSN 0951-8320, E-ISSN 1879-0836, Vol. 83, no. 2, p. 145-151. Article in journal (Refereed)
    Abstract [en]

    PSA-cum-human reliability analysis (HRA) has traditionally used individual human actions, hence individual 'human errors', as a meaningful unit of analysis. This is inconsistent with the current understanding of accidents, which points out that the notion of 'human error' is ill defined and that adverse events are more often due to the working conditions than to people. Several HRA approaches, such as ATHEANA and CREAM, have recognised this conflict and proposed ways to deal with it. This paper describes an improvement of the basic screening method in CREAM, whereby a rating of the performance conditions can be used to calculate a Mean Failure Rate directly without invoking the notion of human error.
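    The screening idea behind CREAM's basic method can be sketched as follows: each common performance condition (CPC) is rated as improving, not affecting, or reducing performance reliability; the counts of reduced and improved ratings select a control mode, each of which carries a failure-probability interval. The intervals below follow CREAM's basic method, but the threshold rule mapping counts to a mode is a simplified illustration, not Hollnagel's exact control-mode diagram.

```python
# Illustrative sketch of CREAM-style basic screening: CPC ratings
# -> control mode -> failure-probability interval. The threshold rule
# is a simplification for illustration.

# Failure-probability intervals per control mode (CREAM basic method).
INTERVALS = {
    "strategic":     (0.5e-5, 1e-2),
    "tactical":      (1e-3, 1e-1),
    "opportunistic": (1e-2, 0.5),
    "scrambled":     (1e-1, 1.0),
}

def control_mode(ratings):
    """Map CPC ratings ('improved' / 'not significant' / 'reduced')
    to a control mode using simple illustrative thresholds."""
    reduced = sum(r == "reduced" for r in ratings)
    improved = sum(r == "improved" for r in ratings)
    if reduced >= 7:
        return "scrambled"
    if reduced >= 4:
        return "opportunistic"
    if reduced <= 1 and improved >= 4:
        return "strategic"
    return "tactical"

# Hypothetical rating of the nine CPCs for one scenario.
ratings = ["reduced", "not significant", "improved", "reduced",
           "not significant", "not significant", "improved",
           "not significant", "not significant"]
mode = control_mode(ratings)
print(mode, INTERVALS[mode])
```

    The point of the paper's improvement is precisely that such a context rating yields a failure rate directly, so no per-action 'human error' probability needs to be postulated.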

  • 15.
    Fujita, Yushi
    et al.
    Technova Inc.
    Hollnagel, Erik
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    From error probabilities to control modes: Quantification of context effects on performance, 2002. In: International Workshop on "Building the new HRA": Strengthening the links between experience and HRA, 2002. Conference paper (Other academic)
  • 16.
    Gauthereau, V.
    et al.
    Department of Work Psychology, Université de Liège, Boulevard du Rectorat 5, B-4000 Liège, Belgium.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Planning, control, and adaptation: A case study, 2005. In: European Management Journal, ISSN 0263-2373, E-ISSN 1873-5681, Vol. 23, no. 1, p. 118-131. Article in journal (Refereed)
    Abstract [en]

    This article relates the findings of an ethnographically informed study conducted at a Swedish nuclear power plant. It describes a set of events while the plant was shut down for a short non-productive outage. In order to better understand how high-reliability organizations successfully manage the conflict between bureaucratic planning and flexibility, we provide a description of simultaneous levels of control that dissolve the conflict. Instead of merely describing actions as planned or improvised, the present work illustrates that no clear-cut distinction can be drawn between what is improvisation and what is not. Understanding planning as a control activity as well enables us to grasp the different values of the canonical activity of planning. © 2005 Elsevier Ltd. All rights reserved.

  • 17.
    Gauthereau, Vincent
    et al.
    Linköping University, The Institute of Technology. Linköping University, Department of Mechanical Engineering.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Building expectations and the management of surprises: Studies of safety-train outages at a Swedish nuclear power plant, 2003. In: European Conference on Cognitive Science Approaches to Process Control, 2003. Conference paper (Refereed)
  • 18.
    Gauthereau, Vincent
    et al.
    Linköping University, Department of Mechanical Engineering, Quality Technology and Management. Linköping University, The Institute of Technology.
    Hollnagel, Erik
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    Organisational Improvisation: A Field Study at a Swedish NPP during a Productive-Outage, 2002. In: European Annual Conference on Human Decision Making and Control, 2002. Conference paper (Refereed)
  • 19.
    Gawinowski, Gilles
    et al.
    Eurocontrol, Brétigny, France.
    Averty, Philippe
    DTI/SDER, Toulouse, France.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Johansson, Björn
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Wise, John A.
    Honeywell International, Phoenix, AZ.
    ERASMUS: A novel human factors approach to air traffic management, 2006. In: 27th European Association for Aviation Psychology Jubilee Conference, Potsdam: EAAP, 2006. Conference paper (Refereed)
  • 20.
    Goteman, Ö.
    et al.
    Flight Operations Standards (STOPS), Scandinavian Airlines, S-195 87 Stockholm, Sweden.
    Smith, Kip
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Dekker, Sidney
    Linköping University, The Institute of Technology. Linköping University, Department of Management and Engineering, Industrial ergonomics .
    HUD with a velocity (flight-path) vector reduces lateral error during landing in restricted visibility, 2007. In: The International Journal of Aviation Psychology, ISSN 1050-8414, E-ISSN 1532-7108, Vol. 17, no. 1, p. 91-108. Article in journal (Refereed)
    Abstract [en]

    The operational community has assumed that using a head-up display (HUD) instead of conventional head-down displays will increase accuracy and safety during approach and landing. The putative mechanism for this increase in safety is that previously demonstrated improvements in lateral and vertical control of the aircraft in flight should carry over to the landing situation. Alternatively, it is possible that, during approach and landing, the HUD might affect the pilot's ability to assimilate outside cues at the decision height, thereby reducing the success ratio for landings using an HUD. This article reports a pair of experiments that test these competing hypotheses. Taking advantage of the opportunity when an air transport operator introduced HUD in an existing aircraft fleet, we were able to use a Boeing 737-700 full-motion simulator flown by commercial airline pilots. We explored the effects of (a) HUD use, (b) ambient visibility, and (c) length of approach lighting on the size and location of the touchdown footprint. We also explored the effects of HUD use on approach success ratio. HUD use reduced the width of the touchdown footprint in all tested visibility and lighting conditions, including visibility below the minimum allowed. HUD use had no effect on the length of the touchdown footprint. We could not detect any decrease in approach success rate for HUD approaches. Based on these empirical data, the minimum visibility for approaches using HUDs could be set lower than for approaches without an HUD. Copyright © 2007, Lawrence Erlbaum Associates, Inc.

  • 21.
    Pettersson, Göran
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Human Factors of Decisionmaking in Complex Systems, 2003. In: Decision making support systems: achievements, trends, and challenges for the new decade / [ed] Manuel Mora, Guisseppi A. Forgionne, Jatinder N.D. Gupta, 2003, p. -418. Chapter in book (Other academic)
    Abstract [en]

    The book presents state-of-the-art knowledge about Decision-Making Support Systems (DMSS). Its main goals are to help diffuse knowledge about effective methods and strategies for successfully designing, developing, implementing, and evaluating decision-making support systems, and to create an awareness among academicians and practitioners of the relevance of decision-making support systems in the current dynamic management environment. Decision-Making Support Systems: Achievements and Challenges for the New Decade is a comprehensive compilation of DMSS thought and vision, dealing with issues such as decision-making concepts in organizations.

  • 22.
    Hancock, Peter A.
    et al.
    Department of Psychology, University of Central Florida.
    Smith, Kip
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    The design of and some results from a distributed air-traffic information display simulator (DATIDS), 2008. In: International Journal of Applied Aviation Studies, ISSN 1546-3214, E-ISSN 1939-0300, Vol. 7, no. 2, p. 232-244. Article in journal (Refereed)
  • 23.
    Hedenskog, Åsa
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    Increasing the automation of radio network control, 2003. Licentiate thesis, monograph (Other academic)
    Abstract [en]

    The efficient utilization of radio frequencies is becoming more important with new technology, new telecom services and a rapidly expanding market. Future systems for radio network management are therefore expected to contain more automation than today's systems.

    This thesis describes a case study performed at a large European network operator. The first purpose of this study was to identify and describe elements in the current environment of telecommunication radio network management, in order to draw conclusions about the impact of a higher degree of automation in future software systems for radio network management.

    The second purpose was to identify specific issues for further analysis and development.

    Based on a case study comprising eight full-day observations and eleven interviews with the primary user category, and their colleagues on other teams, this thesis:

    • Describes the work environment by presenting findings regarding task performance and the use of knowledge, qualities of current tools and the expected qualities of new technology.
    • Based on the empirical findings, it concludes that full automation is not feasible at this time, and that a supervisory control system including both a human operator and a machine is the best solution.
    • Describes the design considerations for such a supervisory control system for this domain.
    • Based on the finding that users allocate function in order to learn about a tool, it introduces the concept of adaptation through praxis as a way of introducing a supervisory control system which includes automation.
    • In conclusion, it discusses research issues for future studies in this area.
  • 24.
    Hedenskog, Åsa
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    Perceive those things which cannot be seen: A cognitive systems engineering perspective on requirements management, 2006. Doctoral thesis, monograph (Other academic)
    Abstract [en]

    Non-functional requirements contribute to the overall quality of software, and should therefore be a part of any development effort. However, in practice they are often considered to be too difficult to handle.

    The purpose of this thesis is to gain understanding of where the nature and origin of these difficulties may lie. The focus is on a specific type of non-functional requirements: usability requirements. The basis for the thesis is two case studies, the results of which are presented herein:

    The first case study describes the work environment of radio network optimizers by presenting needs regarding task performance and the use of knowledge, qualities of current tools, and the expected qualities of new technology. The original purpose of this study was to investigate how a higher level of automation in the software tools used for managing the radio network would impact the optimizers’ work. As a result of the ethnographical method used, the first study revealed that there was a body of user requirements that were not addressed in the tool development.

    This led to the second case study, specifically examining the difficulties of managing usability requirements. The study took place over the course of two years, at a company that is a large supplier of systems for radio network control. The purpose was to seek knowledge about the requirements engineering process at the studied company, in order to better understand the environment, people and tasks involved in controlling this process. The motivation for this was to find an answer to the question of why some requirements are not addressed in the tool development, even though they are important to the tool users. It was also the ambition to identify and describe areas in the requirements engineering process that might be improved. The requirements engineering process was analyzed from a cognitive systems engineering perspective, which is suitable for analysis and design of complex, high variety systems, such as the system that controls the requirements management process.

    The result from the second case study is a description of the difficulties of handling requirements, specifically usability requirements. The impacts of the process, the organization, and the culture are discussed, as is the overall task of controlling the requirements engineering process.

    The study concludes that:

    - The engineering production culture affects how non-functional requirements, especially usability requirements, are addressed in software development.

    - Lack of knowledge of potential problems with usability requirements can be a state maintained by a self-reinforcing process.

    - A discrepancy between where responsibility for managing requirements lies and where resources are located can cause problems where usability requirements are concerned.

    It was also empirically verified that:

    - A cognitive systems engineering approach can be successfully applied to this type of system, and easily incorporates cultural aspects in the analysis.

  • 25.
    Herrera, Ivonne A.
    et al.
    Department of Production and Quality Engineering, Norwegian University of Science and Technology, Trondheim, Norway.
    Woltjer, Rogier
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, Human-Centered systems.
    Comparing a multi-linear (STEP) and systemic (FRAM) method for accident analysis2009In: Safety, Reliability and Risk Analysis: Theory, Methods and Applications. / [ed] Martorell, S., Guedes Soares, C., & Barnett, J., London, UK: Taylor & Francis Group, 2009, p. 19-26Conference paper (Refereed)
    Abstract [en]

    Accident models and analysis methods affect what accident investigators look for, which contributing factors are found, and which recommendations are issued. This paper contrasts the Sequentially Timed Events Plotting (STEP) method and the Functional Resonance Analysis Method (FRAM) for accident analysis and modelling. The main issues addressed in this paper are comparing the established multi-linear method (STEP) with the systemic method (FRAM) and evaluating which new insights the latter provides for accident analysis in comparison to the former. Since STEP and FRAM are based on different understandings of the nature of accidents, the comparison of the methods focuses on what we can learn from both methods, and how, when, and why to apply them. The main finding is that STEP helps to illustrate what happened, whereas FRAM illustrates the dynamic interactions within socio-technical systems and lets the analyst understand the how and why by describing non-linear dependencies, performance conditions, variability, and their resonance across functions.

  • 26.
    Hollnagel, Erik
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    A cognitive systems engineering perspective on automation2003In: International Conference on Human Aspects of Advanced Manufacturing: Agility and Hybrid Automation, X: X , 2003Conference paper (Refereed)
  • 27.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    A function-centred approach to joint driver-vehicle system design2006In: Cognition, Technology & Work, ISSN 1435-5558, E-ISSN 1435-5566, Vol. 8, no 3, p. 169-173Article in journal (Refereed)
  • 28.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    A function-centred approach to joint driver-vehicle system design2004In: IEEE SMC 2004 International Conference on Systems, Man and Cybernetics,2004, Delft, NL: TU Delft , 2004, p. 2548-Conference paper (Refereed)
  • 29.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Analysis and prediction of failures in complex systems: Models & methods2000In: Lecture notes in control and information sciences, ISSN 0170-8643, E-ISSN 1610-7411, Vol. 253, p. 39-41Article in journal (Refereed)
  • 30.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Applying AI to models in cognitive ergonomics2003In: Handbook of Human Factors & Ergonomics / [ed] Neville Stanton, Tokyo: Asakura Publishing Company, Ltd. , 2003, 1, p. 256-271Chapter in book (Other academic)
    Abstract [en]

    Research suggests that ergonomists tend to restrict themselves to two or three of their favorite methods in the design of systems, despite a multitude of variations in the problems that they face. Human Factors and Ergonomics Methods delivers an authoritative and practical account of methods that incorporate human capabilities and limitations, environmental factors, human-machine interaction, and other factors into system design. The Handbook describes 83 methods in a standardized format, promoting the use of methods that may have formerly been unfamiliar to designers.

    The handbook comprises six sections, each representing a specialized field of ergonomics with a representative selection of associated methods. The sections highlight facets of human factors and ergonomics in systems analysis, design, and evaluation. Sections I through III address individuals and their interactions with the world. Section IV explores social groupings and their interactions (team methods), and Section V examines the effect of the environment on workers. The final section provides an overview of work-systems (macroergonomics) methods. An onion-layer model frames each method, working from the individual, to the team, to the environment, to the work system. Each chapter begins with an introduction written by the chapter's editor, offering a brief overview of the field and a description of the methods covered. The Handbook provides a representative set of contemporary methods that are valuable in ergonomic analyses and evaluations.

    The layout of each chapter is standardized for ease of use, so you can quickly locate relevant information about each method. Content descriptions are brief, and references are made to other texts, papers, and case studies. Standard descriptions of methods encourage browsing through several potential methods before tackling a problem.

  • 31.
    Hollnagel, Erik
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    Att förstå olyckor:  lätt att bli syndabock i "effektiv" organisation2003In: Nucleus, ISSN 1104-4578, Vol. 1, p. 12-17Article in journal (Other academic)
  • 32.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Automation and human work2004In: Human factors for engineers / [ed] Carl Sandom and Roger Harvey, London: IEEE , 2004, p. 113-150Chapter in book (Other academic)
    Abstract [en]

    This book introduces the reader to the subject of human factors and provides practical and pragmatic advice to assist engineers in designing interactive systems that are safer, more secure and easier to use - thereby reducing accidents due to human error, increasing system integrity and enabling more efficient process operations. The book discusses human factors integration methodology and reviews the issues that underpin consideration of key topics such as human error, automation and human reliability assessment. There are also design considerations including control room and interface design and acceptance and verification considerations.

  • 33.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Barrier analysis and accident prevention2003In: Innovation and consolidation in aviation: selected contributions to the Australian Aviation Psychology Symposium 2000 / [ed] Graham Edkins; Peter Pfister, Aldershot: Ashgate Publishing Limited , 2003, p. 59-74Chapter in book (Other academic)
    Abstract [en]

    This is the formal refereed proceedings of the fifth Australian Aviation Psychology Symposium. The symposium had a diverse range of contributions and development workshops, bringing together practitioners from aviation psychology and human factors, flight operations management, safety managers, pilots, cabin crew, air traffic controllers, engineering and maintenance personnel, air safety investigators, staff from manufacturers and regulatory bodies and applied aviation industry researchers and academics.

    The volume expands the contribution of aviation psychology and human factors to the aviation industry within the Asia Pacific region, developing the safety, efficiency and viability of the industry. It is a forward-looking work, providing strategies for psychology and human factors to increase the safe and effective functioning of aviation organizations and systems, pertinent to both civil and military operations.

  • 34.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Barriers and accident prevention2004Book (Other academic)
    Abstract [en]

    Accidents are preventable, but only if they are correctly described and understood. Since the mid-1980s accidents have come to be seen as the consequence of complex interactions rather than simple threads of causes and effects. Yet progress in accident models has not been matched by advances in methods. The author's work in several fields (aviation, power production, traffic safety, healthcare) made it clear that there is a practical need for constructive methods, and this book presents the experiences and the state of the art. The focus of the book is on accident prevention rather than accident analysis, and unlike other books it takes a proactive rather than reactive approach. The emphasis on design rather than analysis is a trend also found in other fields. Features of the book include:

    - A classification of barrier functions and barrier systems that will enable the reader to appreciate the diversity of barriers and to make informed decisions for system changes.

    - A perspective on how the understanding of accidents (the accident model) largely determines how the analysis is done and what can be achieved. The book critically assesses three types of accident models (sequential, epidemiological, systemic) and compares their strengths and weaknesses.

    - A specific accident model that captures the full complexity of systemic accidents. One consequence is that accidents can be prevented through a combination of performance monitoring and barrier functions, rather than through the elimination or encapsulation of causes.

    - A clearly described methodology for barrier analysis and accident prevention.

    Written in an accessible style, Barriers and Accident Prevention is designed to provide a stimulating and practical guide for industry professionals familiar with the general ideas of accidents and human error. The book is directed at those involved with accident analysis and system safety, such as managers of safety departments, risk and safety consultants, human factors professionals, and accident investigators. It is applicable to all major application areas such as aviation, ground transportation, maritime, process industries, healthcare and hospitals, communication systems, and service providers.

  • 35.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Computers and covert automation: A cognitive systems engineering view2004In: 7th International Conference on Work With Computing Systems,2004, Kuala Lumpur: WWCS , 2004, p. 15-Conference paper (Refereed)
  • 36.
    Hollnagel, Erik
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    Decisions about "what", and decisions about "how"2003In: Human Factors of Decision Making in Complex Systems, X: X , 2003, p. 8-10Conference paper (Refereed)
  • 37.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Developing measurements of driving performance for the effects of active safety systems.2003In: Triennial Conference International Ergonomics Association,2003, 2003Conference paper (Refereed)
  • 38.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Flight decks and free flight: Where are the boundaries?2004In: The Flightdeck of the Future:Human Factors in Datalink and Freeflight,2004, Nottingham, UK: University of Nottingham , 2004Conference paper (Other academic)
  • 39.
    Hollnagel, Erik
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    From cognitive task analysis to cognitive task design2003In: Triennial Conference International Ergonomics Association, X: X , 2003Conference paper (Refereed)
  • 40.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    From HMS to HCI - and back again2001In: People in control: Human factors in control room design / [ed] Jan Noyes and Matthew Bransby, London, UK: The Institution of Engineering and Technology , 2001, p. xvii-xxChapter in book (Other academic)
    Abstract [en]

    The aim of this book is to provide state-of-the-art information on various aspects of human-machine interaction and human-centred issues encountered in the control room setting. It is illustrated with useful case studies.

  • 41.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    From human factors to cognitive systems engineering: Human-machine interaction in the 21st Century2001In: Anzen-no-Tankyu (Researches on safety), Tokyo: ERC Publishing , 2001Chapter in book (Other academic)
  • 42.
    Hollnagel, Erik
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    From human-centred to function-centred design2002In: International Conference on The Sciences of Design: The Scientific Challenge for the 21st Century, 2002Conference paper (Refereed)
  • 43.
    Hollnagel, Erik
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    From levels of automation to layers of control2003In: NATO RTO Human Factors and Medical Panel Workshop: Uninhabited Military Vehicles - Human Factors Of Augmenting The Force, 2003Conference paper (Other academic)
  • 44.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Handbook of cognitive task design2003Book (Other academic)
    Abstract [en]

    Offers the theories, models, and methods related to cognitive task design. This work summarizes the extensive, worldwide experience with cognitive task design since the 1980s. It defines the state of the art and outlines the future of this ever-developing field.

  • 45.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    How to assess the risks of human erroneous actions2003In: Safety and reliability: interactions between machines, software and people / [ed] Jörgen Eklund, Jens-Peder Ekros ; contributions by Jörgen Eklund, Linköping: Linköpings universitet , 2003, p. 23-36Chapter in book (Other academic)
  • 46.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Is affective computing an oxymoron?2003In: International journal of human-computer studies, ISSN 1071-5819, E-ISSN 1095-9300, Vol. 59, no 1-2, p. 65-70Article, review/survey (Refereed)
    Abstract [en]

    An overview of affective computing with respect to human-computer interaction is presented. The review considers the notion that a computer may have affective states similar to those of humans, and hence may also display affect and emotion. The concept of making communication more effective by imitating its emotional aspects is also discussed.

  • 47.
    Hollnagel, Erik
    Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory. Linköping University, The Institute of Technology.
    Joint human-computer system dependability2002In: European Conference on Cognitive Ergonomics and Safe, 2002Conference paper (Refereed)
  • 48.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Looking for errors of omission and commission or The Hunting of the Snark revisited2000In: Reliability Engineering & System Safety, ISSN 0951-8320, E-ISSN 1879-0836, Vol. 68, no 2, p. 135-145Article in journal (Refereed)
    Abstract [en]

    Since the early 1990s, considerable effort has been spent to understand what is meant by an "error of commission" (EOC), to complement the traditional notion of an "error of omission" (EOO). This paper argues that the EOO-EOC dyad, as an artefact of the PSA event tree, is insufficient for human reliability analysis (HRA) for several reasons: (1) EOO-EOC fail to distinguish between manifestation and cause; (2) EOO-EOC refer to classes of incorrect actions rather than to specific instances; (3) there is no unique way of classifying an event using EOO-EOC; (4) the set of error modes that cannot reasonably be classified as EOO is too diverse to fit into any single category of its own. Since the use of EOO-EOC leads to serious problems for HRA, an alternative is required. This can be found in the concept of error modes, which has a long history in risk analysis. A specific system for error mode prediction was tested in a simulator experiment. The analysis of the results showed that error modes could be qualitatively predicted with sufficient accuracy (68% correct) to propose this method as a way to determine how operator actions can fail in PSA-cum-HRA. Although this still leaves the thorny issue of quantification, a consistent prediction of error modes provides a better starting point for determining probabilities than the EOO-EOC dyad. It also opens a possibility for quantification methods where the influence of the common performance conditions is prior to and more important than individual failure rates.

  • 49.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Modelling the controller of a process1999In: Transactions of the Institute of Measurement and Control, ISSN 0142-3312, E-ISSN 1477-0369, Vol. 21, no 4-5, p. 163-170Article in journal (Refereed)
    Abstract [en]

    Models of humans (operators, controllers) in human-machine systems have tacitly assumed that humans must have a model of the process in order to control it. Humans have therefore traditionally been described as information processing systems with an internal or mental model of the process as an important component. A more systemic or cybernetic view acknowledges that the human must be a model of the process in order to control it. This suggests a different approach to modelling, which is functional rather than structural, and where the emphasis is on how the joint human-machine system can maintain control of a situation. A specific model, called the Contextual Control Model (COCOM), which is based on the principles of cognitive systems engineering, illustrates the approach. COCOM provides a foundation for analysing controller performance as well as implementing controller needs.

  • 50.
    Hollnagel, Erik
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, CSELAB - Cognitive Systems Engineering Laboratory.
    Modelling the orderliness of human action2000In: Cognitive engineering in the aviation domain / [ed] Nadine B Sarter; René Amalberti, Hillsdale, NJ: Erlbaum , 2000, p. -376Chapter in book (Other academic)
    Abstract [en]

    Although cognitive engineering has gained widespread acceptance as one of the most promising approaches to addressing and preventing difficulties with human-machine coordination and collaboration, it still meets with considerable skepticism and resistance in some of the industries that could benefit from its insights and recommendations. The challenge for cognitive engineers is to better understand the reasons underlying these reservations and to overcome them by demonstrating and communicating more effectively their concepts, approaches, and proposed solutions. To contribute to this goal, the current volume presents concrete examples of cognitive engineering research and design. It is an attempt to complement the already existing excellent literature on cognitive engineering in domains other than aviation and to introduce professionals and students in a variety of domains to this rather young discipline.
