Modeling Human–System Interaction: Philosophical and Methodological Considerations, with Examples

About this ebook

This book presents theories and models for examining how humans interact with complex automated systems, drawing on both empirical and theoretical methods.

  • Provides examples of models appropriate to the four stages of human–system interaction
  • Examines in detail the philosophical underpinnings and assumptions of modeling
  • Discusses how a model fits into "doing science" and the considerations in garnering evidence and arriving at beliefs for the modeled phenomena

Modeling Human–System Interaction is a reference for professionals in industry, academia, and government who are researching, designing, and implementing human–technology systems in the transportation, communication, manufacturing, energy, and health care sectors.

Language: English
Publisher: Wiley
Release date: Dec 19, 2016
ISBN: 9781119275282

    Book preview

    Modeling Human–System Interaction - Thomas B. Sheridan

    Table of Contents

    COVER

    TITLE PAGE

    PREFACE

    INTRODUCTION

    1 KNOWLEDGE

    GAINING NEW KNOWLEDGE

    SCIENTIFIC METHOD: WHAT IS IT?

    FURTHER OBSERVATIONS ON THE SCIENTIFIC METHOD

    REASONING LOGICALLY

    PUBLIC (OBJECTIVE) AND PRIVATE (SUBJECTIVE) KNOWLEDGE

    THE ROLE OF DOUBT IN DOING SCIENCE

    EVIDENCE: ITS USE AND AVOIDANCE

    METAPHYSICS AND ITS RELATION TO SCIENCE

    OBJECTIVITY, ADVOCACY, AND BIAS

    ANALOGY AND METAPHOR

    2 WHAT IS A MODEL?

    DEFINING MODEL

    MODEL ATTRIBUTES: A NEW TAXONOMY

    EXAMPLES OF MODELS IN TERMS OF THE ATTRIBUTES

    WHY MAKE THE EFFORT TO MODEL?

    ATTRIBUTE CONSIDERATIONS IN MAKING MODELS USEFUL

    SOCIAL CHOICE

    WHAT MODELS ARE NOT

    3 IMPORTANT DISTINCTIONS IN MODELING

    OBJECTIVE AND SUBJECTIVE MODELS

    SIMPLE AND COMPLEX MODELS

    DESCRIPTIVE AND PRESCRIPTIVE (NORMATIVE) MODELS

    STATIC AND DYNAMIC MODELS

    DETERMINISTIC AND PROBABILISTIC MODELS

    HIERARCHY OF ABSTRACTION

    SOME PHILOSOPHICAL PERSPECTIVES

    4 FORMS OF REPRESENTATION

    VERBAL MODELS

    GRAPHS

    MAPS

    SCHEMATIC DIAGRAMS

    LOGIC DIAGRAMS

    CRISP VERSUS FUZZY LOGIC (SEE ALSO APPENDIX, SECTION MATHEMATICS OF FUZZY LOGIC)

    SYMBOLIC STATEMENTS AND STATISTICAL INFERENCE (SEE ALSO APPENDIX, SECTION MATHEMATICS OF STATISTICAL INFERENCE FROM EVIDENCE)

    5 ACQUIRING INFORMATION

    INFORMATION COMMUNICATION (SEE ALSO APPENDIX, SECTION MATHEMATICS OF INFORMATION COMMUNICATION)

    INFORMATION VALUE (SEE ALSO APPENDIX, SECTION MATHEMATICS OF INFORMATION VALUE)

    LOGARITHMIC‐LIKE PSYCHOPHYSICAL SCALES

    PERCEPTION PROCESS (SEE ALSO APPENDIX, SECTION MATHEMATICS OF THE BRUNSWIK/KIRLIK PERCEPTION MODEL)

    ATTENTION

    VISUAL SAMPLING (SEE ALSO APPENDIX, SECTION MATHEMATICS OF HOW OFTEN TO SAMPLE)

    SIGNAL DETECTION (SEE ALSO APPENDIX, SECTION MATHEMATICS OF SIGNAL DETECTION)

    SITUATION AWARENESS

    MENTAL WORKLOAD (SEE ALSO APPENDIX, SECTION RESEARCH QUESTIONS CONCERNING MENTAL WORKLOAD)

    EXPERIENCING WHAT IS VIRTUAL: NEW DEMANDS FOR HUMAN–SYSTEM MODELING (SEE ALSO APPENDIX, SECTION BEHAVIOR RESEARCH ISSUES IN VIRTUAL REALITY)

    6 ANALYZING THE INFORMATION

    TASK ANALYSIS

    JUDGMENT CALIBRATION

    VALUATION/UTILITY (SEE ALSO APPENDIX, SECTION MATHEMATICS OF HUMAN JUDGMENT OF UTILITY)

    RISK AND RESILIENCE

    TRUST

    7 DECIDING ON ACTION

    WHAT IS ACHIEVABLE

    DECISION UNDER CONDITION OF CERTAINTY (SEE ALSO APPENDIX, SECTION MATHEMATICS OF DECISIONS UNDER CERTAINTY)

    DECISION UNDER CONDITION OF UNCERTAINTY (SEE ALSO APPENDIX, SECTION MATHEMATICS OF DECISIONS UNDER UNCERTAINTY)

    COMPETITIVE DECISIONS: GAME MODELS (SEE ALSO APPENDIX, SECTION MATHEMATICS OF GAME MODELS)

    ORDER OF SUBTASK EXECUTION

    8 IMPLEMENTING AND EVALUATING THE ACTION

    TIME TO MAKE A SELECTION

    TIME TO MAKE AN ACCURATE MOVEMENT

    CONTINUOUS FEEDBACK CONTROL (SEE ALSO APPENDIX, SECTION MATHEMATICS OF CONTINUOUS FEEDBACK CONTROL)

    LOOKING AHEAD (PREVIEW CONTROL) (SEE ALSO APPENDIX, SECTION MATHEMATICS OF PREVIEW CONTROL)

    DELAYED FEEDBACK

    CONTROL BY CONTINUOUSLY UPDATING AN INTERNAL MODEL (SEE ALSO APPENDIX, SECTION STEPPING THROUGH THE KALMAN FILTER SYSTEM)

    EXPECTATION OF TEAM RESPONSE TIME

    HUMAN ERROR

    9 HUMAN–AUTOMATION INTERACTION

    HUMAN–AUTOMATION ALLOCATION

    SUPERVISORY CONTROL

    TRADING AND SHARING

    ADAPTIVE/ADAPTABLE CONTROL

    MODEL‐BASED FAILURE DETECTION

    10 MENTAL MODELS

    WHAT IS A MENTAL MODEL?

    BACKGROUND OF RESEARCH ON MENTAL MODELS

    ACT‐R

    LATTICE CHARACTERIZATION OF A MENTAL MODEL

    NEURONAL PACKET NETWORK AS A MODEL OF UNDERSTANDING

    MODELING OF AIRCRAFT PILOT DECISION‐MAKING UNDER TIME STRESS

    MUTUAL COMPATIBILITY OF MENTAL, DISPLAY, CONTROL, AND COMPUTER MODELS

    11 CAN COGNITIVE ENGINEERING MODELING CONTRIBUTE TO MODELING LARGE‐SCALE SOCIO‐TECHNICAL SYSTEMS?

    BASIC QUESTIONS

    WHAT LARGE‐SCALE SOCIAL SYSTEMS ARE WE TALKING ABOUT?

    WHAT MODELS?

    POTENTIAL OF FEEDBACK CONTROL MODELING OF LARGE‐SCALE SOCIETAL SYSTEMS

    THE STAMP MODEL FOR ASSESSING ERRORS IN LARGE‐SCALE SYSTEMS

    PAST WORLD MODELING EFFORTS

    TOWARD BROADER PARTICIPATION

    APPENDIX

    MATHEMATICS OF FUZZY LOGIC (CHAPTER 4, SECTION CRISP VERSUS FUZZY LOGIC)

    MATHEMATICS OF STATISTICAL INFERENCE FROM EVIDENCE (CHAPTER 4, SECTION SYMBOLIC STATEMENTS AND STATISTICAL INFERENCE)

    MATHEMATICS OF INFORMATION COMMUNICATION (CHAPTER 5, SECTION INFORMATION COMMUNICATION)

    MATHEMATICS OF INFORMATION VALUE (CHAPTER 5, SECTION INFORMATION VALUE)

    MATHEMATICS OF THE BRUNSWIK/KIRLIK PERCEPTION MODEL (CHAPTER 5, SECTION PERCEPTION PROCESS)

    MATHEMATICS OF HOW OFTEN TO SAMPLE (CHAPTER 5, SECTION VISUAL SAMPLING)

    MATHEMATICS OF SIGNAL DETECTION (CHAPTER 5, SECTION SIGNAL DETECTION)

    RESEARCH QUESTIONS CONCERNING MENTAL WORKLOAD (CHAPTER 5, SECTION MENTAL WORKLOAD)

    BEHAVIOR RESEARCH ISSUES IN VIRTUAL REALITY (CHAPTER 5, SECTION EXPERIENCING WHAT IS VIRTUAL: NEW DEMANDS FOR HUMAN–SYSTEM MODELING)

    MATHEMATICS OF HUMAN JUDGMENT OF UTILITY (CHAPTER 6, SECTION VALUATION/UTILITY)

    MATHEMATICS OF DECISIONS UNDER CERTAINTY (CHAPTER 7, SECTION DECISION UNDER CONDITION OF CERTAINTY)

    MATHEMATICS OF DECISIONS UNDER UNCERTAINTY (CHAPTER 7, SECTION DECISION UNDER CONDITION OF UNCERTAINTY)

    MATHEMATICS OF GAME MODELS (CHAPTER 7, SECTION COMPETITIVE DECISIONS: GAME MODELS)

    MATHEMATICS OF CONTINUOUS FEEDBACK CONTROL (CHAPTER 8, SECTION CONTINUOUS FEEDBACK CONTROL)

    MATHEMATICS OF PREVIEW CONTROL (CHAPTER 8, SECTION LOOKING AHEAD (PREVIEW CONTROL))

    STEPPING THROUGH THE KALMAN FILTER SYSTEM (CHAPTER 8, SECTION CONTROL BY CONTINUOUSLY UPDATING AN INTERNAL MODEL)

    REFERENCES

    INDEX

    END USER LICENSE AGREEMENT

    List of Tables

    Chapter 02

    TABLE 2.1 A taxonomy of model attributes

    Chapter 09

    TABLE 9.1 Fitts’ list

    TABLE 9.2 The original levels of automation scale

    List of Illustrations

    Chapter 04

    FIGURE 4.1 Trends in telephone company data (hypothetical).

    FIGURE 4.2 Gaussian probability density function.

    FIGURE 4.3 Hypothetical supply–demand curves.

    FIGURE 4.4 Map of the United States.

    FIGURE 4.5 Rasmussen’s schematic diagram depicting levels of behavior.

    FIGURE 4.6 Wickens’ (1984) model of human multiple resources (modified by author).

    FIGURE 4.7 Forward chaining tree.

    FIGURE 4.8 Backward chaining tree, where AND indicates necessity and OR indicates sufficiency.

    FIGURE 4.9 Kanizsa square illusion.

    Chapter 05

    FIGURE 5.1 The complexity of communication with a person or a machine.

    FIGURE 5.2 Interpretation of Brunswik lens model (after a diagram by Kirlik, 2006).

    FIGURE 5.3 Wickens’ SEEV model of attention.

    FIGURE 5.4 Senders’ model: sampling matches the Nyquist criterion.

    FIGURE 5.5 Properties of mental workload (effects of very low workload not shown).

    FIGURE 5.6 Regions of workload accommodation.

    FIGURE 5.7 Two images of a video showing superposition of computerized truck images on actual driver view in a test drive on a country road. White objects on trees along the roadway are fiducial markers to enable continuous geometric correspondence of the AR image to the real world.

    FIGURE 5.8 Variables contributing to presence in VR.

    FIGURE 5.9 Relationship of VR created by computer and telepresence resulting from high‐quality sensing and display of events at an actual remote location. The dashed line around the remote manipulator arm suggests that the remote arm can be either real or virtual, and that if the visual and/or tactile feedback are good enough, there will be no difference in the human operator’s perception (mental model, shown in the cloud) of the (real or virtual) reality.

    Chapter 06

    FIGURE 6.1 A hypothetical form for performing a task analysis.

    FIGURE 6.2 An example of calibration for a three‐dimensional problem space.

    FIGURE 6.3 Stress–strain analogy to resilience.

    FIGURE 6.4 Variables affecting trust (after Lee and See, 2004).

    Chapter 07

    FIGURE 7.1 Example of determining the space of what is achievable within the space defined by what is aspired to and what is acceptable (in a simple two‐dimensional problem space).

    FIGURE 7.2 Tulga’s task for deciding where to attend and act.

    Chapter 08

    FIGURE 8.1 Fitts’ index of difficulty test.

    FIGURE 8.2 Classical feedback control system.

    FIGURE 8.3 Ferrell (1965) results for time to make accurate positioning movements with delayed feedback.

    FIGURE 8.4 Response times of nuclear plant operator teams to properly respond to a major accident alarm. For the particular mathematical function used (log normal), using specialized graph paper (logarithm of response time on y‐axis, Gaussian percentiles on x‐axis) reduces that function to a straight line. The 95th percentile mark is seen to be roughly 100 s.

    FIGURE 8.5 Reason’s taxonomy of human error.

    FIGURE 8.6 Capture error.

    FIGURE 8.7 The Swiss Cheese model of accident occurrence as a result of penetrating multiple barriers. After Reason (1991).

    Chapter 09

    FIGURE 9.1 Four stages of human operator activity.

    FIGURE 9.2 Supervisory control, as originally proposed for lunar rover operations (Ferrell and Sheridan, 1967).

    FIGURE 9.3 Functions of the supervisor in relation to elements of the local human‐interactive computer (Figure 9.2) and multiple remote task‐interactive computers.

    FIGURE 9.4 Supervisory control in relation to degree of automation and task entropy.

    FIGURE 9.5 Distinctions with and between trading and sharing control.

    FIGURE 9.6 Adaptable control (from Sheridan, 2011).

    FIGURE 9.7 Model‐based failure detection.

    Chapter 10

    FIGURE 10.1 The ACT‐R cognitive architecture (after Byrne et al., 2008).

    FIGURE 10.2 An example of Moray’s 1990 lattice model of the operation of a pump: (a) causality relations and (b) purpose relations.

    FIGURE 10.3 Formation of neuronal packets in Yufik’s model of understanding.

    FIGURE 10.4 Multiple model representations in teleoperation.

    Chapter 11

    FIGURE 11.1 The Leveson STAMP model.

    FIGURE 11.2 An example of system dynamics.

    FIGURE 11.3 Relationships in a policy flight simulator.

    Appendix

    FIGURE A.1 Hypothetical fuzzy membership functions for basketball players.

    FIGURE A.2 Information relationships.

    FIGURE A.3 How often to sample.

    FIGURE A.4 Payoff matrix for signal detection.

    FIGURE A.5 Probability densities for evidence in signal detection.

    FIGURE A.6 Receiver operating characteristic (ROC).

    FIGURE A.7 The definition and experimental elicitation of a person’s utility function.

    FIGURE A.8 Pareto frontier and utility curve intersection determine optimal choice.

    FIGURE A.9 Sample payoff matrix for decisions under probabilistic contingencies.

    FIGURE A.10 Dominating and nondominating strategies (at left) and prisoner’s dilemma (right).

    FIGURE A.11 Dynamic programming model of preview control.

    FIGURE A.12 Kalman model of control.

    STEVENS INSTITUTE SERIES ON COMPLEX SYSTEMS AND ENTERPRISES

    William B. Rouse, Series Editor

    WILLIAM B. ROUSE

    Modeling and Visualization of Complex Systems and Enterprises

    ELISABETH PATÉ‐CORNELL, WILLIAM B. ROUSE, AND CHARLES M. VEST

    Perspectives on Complex Global Challenges: Education, Energy, Healthcare, Security, and Resilience

    WILLIAM B. ROUSE

    Universities as Complex Enterprises: How Academia Works, Why It Works These Ways, and Where the University Enterprise Is Headed

    THOMAS B. SHERIDAN

    Modeling Human–System Interaction: Philosophical and Methodological Considerations, with Examples

    MODELING HUMAN–SYSTEM INTERACTION

    Philosophical and Methodological Considerations, with Examples

    THOMAS B. SHERIDAN


    Copyright © 2017 by John Wiley & Sons, Inc. All rights reserved

    Published by John Wiley & Sons, Inc., Hoboken, New Jersey

    Published simultaneously in Canada

    No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, scanning, or otherwise, except as permitted under Section 107 or 108 of the 1976 United States Copyright Act, without either the prior written permission of the Publisher, or authorization through payment of the appropriate per‐copy fee to the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, (978) 750‐8400, fax (978) 750‐4470, or on the web at www.copyright.com. Requests to the Publisher for permission should be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ 07030, (201) 748‐6011, fax (201) 748‐6008, or online at http://www.wiley.com/go/permissions.

    Limit of Liability/Disclaimer of Warranty: While the publisher and author have used their best efforts in preparing this book, they make no representations or warranties with respect to the accuracy or completeness of the contents of this book and specifically disclaim any implied warranties of merchantability or fitness for a particular purpose. No warranty may be created or extended by sales representatives or written sales materials. The advice and strategies contained herein may not be suitable for your situation. You should consult with a professional where appropriate. Neither the publisher nor author shall be liable for any loss of profit or any other commercial damages, including but not limited to special, incidental, consequential, or other damages.

    For general information on our other products and services or for technical support, please contact our Customer Care Department within the United States at (800) 762‐2974, outside the United States at (317) 572‐3993 or fax (317) 572‐4002.

    Wiley also publishes its books in a variety of electronic formats. Some content that appears in print may not be available in electronic formats. For more information about Wiley products, visit our web site at www.wiley.com.

    Library of Congress Cataloging‐in‐Publication Data:

    Names: Sheridan, Thomas B., author.

    Title: Modeling human‐system interaction : philosophical and methodological considerations, with examples / Thomas B. Sheridan.

    Description: Hoboken, New Jersey : John Wiley & Sons, [2017] | Series: Stevens Institute series on complex systems and enterprises | Includes bibliographical references and index.

    Identifiers: LCCN 2016038455 (print) | LCCN 2016051718 (ebook) | ISBN 9781119275268 (cloth) | ISBN 9781119275299 (pdf) | ISBN 9781119275282 (epub)

    Subjects: LCSH: Human‐computer interaction. | User‐centered system design.

    Classification: LCC QA76.9.H85 S515 2017 (print) | LCC QA76.9.H85 (ebook) | DDC 004.01/9–dc23

    LC record available at https://lccn.loc.gov/2016038455

    Cover Image: Andrey Prokhorov/Gettyimages

    PREFACE

    This book has evolved from a professional lifetime of thinking about models and, more generally, thinking about thinking. I have previously written seven books over a span of 42 years, and they have all talked about models, except for one privately published as a memoir for my family. One even dealt with the concept of God and whether God is amenable to modeling (mostly no). So what is new or different in the present book?

    The book includes quite a bit of the philosophy of science and the scientific method as a precursor to discussing human–system models. Many aspects of modeling are discussed: the purpose and uses of models for doing science and thinking about the world, and examples of different kinds of models in what has come to be called human–system interaction or cognitive engineering. Along with new material, the book also includes many modeling ideas previously discussed by the author. When not otherwise cited, illustrations were drawn by the author for this book, were original works under the author’s copyright, or had previously been declared by the author to be in the public domain prior to publication.

    I gratefully acknowledge contributions to these ideas from many colleagues I have worked with, especially Neville Moray, who has been my friend and invaluable critic over the years, and Bill Rouse, who shepherded the book as Wiley series editor. Modeling contributions of past coauthors Russ Ferrell, Bill Verplank, Gunnar Johannsen, Toshi Inagaki, Raja Parasuraman, Chris Wickens, Peter Hancock, Joachim Meyer, and many other colleagues and former graduate students are gratefully acknowledged.

    Finally, I dedicate this effort to Rachel Sheridan, my inspiration and life partner for 63 years.

    INTRODUCTION

    This is a book about models, scientific models, of the interaction of individual people with technical environments, which has come to be called human–system interaction or cognitive engineering. The latter term emphasizes the role of human intelligence in perceiving, analyzing, deciding, and acting, rather than the biomechanical or energetic interactions with the physical environment.

    Alphonse Chapanis (1917–2002) is widely considered to be one of the founders of the field of human factors, cognitive engineering, or whatever term one wishes to use. He coauthored one of the (if not THE) first textbooks in the field (Chapanis et al., 1949). I had the pleasure of working with him on the original National Research Council committee in our field (nowadays called the Board on Human Systems Integration, originally chaired by Richard Pew). I recall that Chapanis, while a psychologist by training, repeatedly emphasized the point that our field is ultimately applied to designing technology to serve human needs; in other words, it is about engineering. Models are inherent to doing engineering.

    More generally, models are the summaries of ideas we hang on to in order to think, communicate to others, and refine in order to make progress in the world. They are cognitive handles. Models come in two varieties: (1) those couched in language we call connotative (metaphor, myth, and other linguistic forms intended to motivate a person to make his or her own interpretation of meaning based on life experience) and (2) those couched in language we call denotative (where forms of language are explicitly selected to minimize the variability of meaning across peoples and cultures). Concise and explicit verbal statements, graphs, and mathematics are examples of denotative language. There is no doubt that connotative language plays a huge role in life, but science depends on denotative expression and on models couched in denotative language, so that we can agree on what we’re talking about.

    The book focuses on the interaction between humans and systems in the human environment of physical things and other people. The models that are discussed are representations of events that are observable and measurable. In experiments, these necessarily include the causative factors (inputs, independent variables), the properties of the human operator (experimental subject), the assigned task, and the task environment. They also include the effects (outputs, dependent variables), the measures of human response correlated to the inputs.
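    To make this input–output framing concrete, consider the time to make an accurate movement, a topic treated in Chapter 8: Fitts’ law predicts the dependent variable, movement time, from two independent task variables, target distance and target width. The Python sketch below is an illustration added here for concreteness, not an excerpt from the book; the coefficients a and b are hypothetical placeholders that would normally be fitted to experimental data.

        import math

        def fitts_movement_time(distance, width, a=0.10, b=0.15):
            """Predicted movement time (seconds) from Fitts' law: MT = a + b * log2(2D / W).

            distance: movement amplitude D; width: target width W (same units).
            a and b are empirically fitted coefficients; the defaults here are
            illustrative placeholders, not values taken from this book.
            """
            index_of_difficulty = math.log2(2.0 * distance / width)  # task difficulty, in bits
            return a + b * index_of_difficulty

        # Independent variables (inputs): target distance and width.
        # Dependent variable (output): predicted movement time.
        print(f"Predicted movement time: {fitts_movement_time(distance=200, width=20):.2f} s")

    In an experiment of the kind described above, distance and width would be manipulated as inputs and movement time measured as the output against which such a model is checked.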

    Chapters 1–3 of the book are philosophical, and apply to science and scientific models quite generally, models in human–system interaction being no exception. Chapter 1 begins with a discussion of what knowledge is and what the scientific method is, including the philosophical distinction between private (subjective) knowledge and public (objective) knowledge, the importance of doubt, the use and avoidance of evidence, objectivity and advocacy, bias, analogy, and metaphor.

    Chapter 2 defines the meaning of model and offers a six‐factor taxonomy of model attributes. It poses the question of what is to be gained by modeling and the issue of social choice.
