Editorial

Introduction to the Special Issue on “Human Factors in Systems Engineering”

by Michael E. Miller 1,* and Christina F. Rusnock 2,*

1 Department of Systems Engineering and Management, Air Force Institute of Technology, Wright Patterson AFB, OH 45433, USA
2 Air Force Life Cycle Management Center, Wright Patterson AFB, OH 45402, USA
* Authors to whom correspondence should be addressed.
Systems 2020, 8(4), 50; https://doi.org/10.3390/systems8040050
Submission received: 23 November 2020 / Accepted: 24 November 2020 / Published: 1 December 2020
(This article belongs to the Special Issue Human Factors in Systems Engineering)
This paper summarizes the aim and results of this Special Issue. Overall, the papers illustrate that human variability arises as humans adapt to the affordances of the systems with which they interact. The papers in this Special Issue provide methods for understanding this variability and for applying that knowledge during system design, offering more robust means of predicting human influences on system performance and permitting designers to account for these differences during design.
Systems engineering is rapidly adopting model-based methods to support the development of system requirements, design, analysis, verification, and validation throughout the system life cycle [1]. In human factors, modeling is increasingly used to provide early insight into human physical, perceptual, and cognitive performance. These models include static models, such as cognitive maps, analytic hierarchies, and task analyses, as well as dynamic models, such as anthropometric simulation models, computational cognitive human performance models, and agent-based models. The integration of human factors models with model-based systems engineering tools has been explored to aid the traceability of human-based requirements [2,3], to analyze tradeoffs, and to verify that design alternatives can fulfill system requirements [4,5].
An important force driving the increasing use of modeling is the ever-present pressure to develop increasingly complex systems more quickly, which in turn requires rapid systems integration. The complexity of systems that include humans arises from the growing integration of large, geographically separated teams, the incorporation of automation, and the increasing complexity of the underlying hardware and software. This Special Issue focuses on models useful for understanding human influences on the design of man-made systems.
This Special Issue includes six papers, each addressing a different aspect of the human as a component in a complex, adaptive socio-technical system. The papers address the variability introduced by the human in different ways.
The Special Issue begins with a pair of papers that discuss methods for including human variability during system design. The first discusses a method for selecting design parameters that are likely to be sensitive to variability in the target human population, as well as for selecting the population used to evaluate alternative system designs [6]. The second paper argues that not only anthropometric and biomechanical variability, but also interaction variability, should be considered when designing the physical attributes of systems or products. Specifically, it illustrates that mechanical engineering students, even after receiving rudimentary education on these principles, often do not anticipate the same levels of human interaction with a product, and that these differences can lead to suboptimal designs [7].
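The virtual-population idea behind the first paper [6] can be illustrated with a minimal Monte Carlo sketch. All names and population statistics below are made-up placeholders for illustration, not values from the paper:

```python
import random

def accommodation_rate(reach_limit_cm, n=10_000, seed=0):
    """Estimate the share of a virtual user population accommodated by a design.

    Illustrative only: samples forward functional reach from a normal
    distribution (the mean/SD are placeholder values, not data from [6])
    and checks it against a hypothetical control-placement distance.
    """
    rng = random.Random(seed)
    accommodated = 0
    for _ in range(n):
        reach = rng.gauss(71.0, 6.0)   # placeholder population stats (cm)
        if reach >= reach_limit_cm:
            accommodated += 1
    return accommodated / n
```

Sweeping `reach_limit_cm` over candidate design values reveals which parameters the accommodation rate is sensitive to, which is the kind of question the paper's method addresses far more rigorously.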
The third paper explores methods for reducing variability in human responses. Specifically, it demonstrates that information presentation methods can reduce the variability and bias in the value judgements humans provide [8], narrowing the confidence intervals associated with those judgements. Methods such as pairwise value comparisons may therefore be useful for improving multi-attribute decision making. As the sixth paper illustrates, it is often difficult to identify the attributes that influence human decision making. Specifically, that paper explores attributes that influence a software engineer’s willingness to reuse existing code [9], finding that software engineers were less willing to reuse code created by automated software repair tools than code written by other humans. The source of this difference remains unclear, however, as multiple possible explanations are offered.
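To make the pairwise-comparison idea concrete, the sketch below derives attribute weights from a pairwise value-comparison matrix using the row geometric mean, a common approximation to the principal-eigenvector method used in the Analytic Hierarchy Process. This is a generic illustration of the technique, not the specific empirical model of [8]:

```python
import numpy as np

def pairwise_weights(M):
    """Derive normalized attribute weights from a pairwise comparison matrix.

    M[i][j] holds the judged ratio of attribute i's value to attribute j's
    (so M[j][i] = 1 / M[i][j] and the diagonal is 1).  The row geometric
    mean approximates the AHP principal-eigenvector weights.
    """
    M = np.asarray(M, dtype=float)
    g = M.prod(axis=1) ** (1.0 / M.shape[1])  # row geometric means
    return g / g.sum()                        # normalize to sum to 1

# Three attributes: A judged 3x as valuable as B and 5x as valuable as C.
M = [[1,   3,   5],
     [1/3, 1,   2],
     [1/5, 1/2, 1]]
w = pairwise_weights(M)
```

Because each attribute is compared against every other, individual judgement noise tends to average out, which is one reason pairwise elicitation can narrow the confidence intervals around value judgements.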
The final two papers in this Special Issue address variability in execution during operations. The fourth paper demonstrates a novel method of understanding operator information needs and decision making in novel situations through wargaming [10]. This interesting method allows cognitive task analysis to be conducted for “to-be” systems that differ significantly from any extant system; cognitive task analysis is more commonly applied when generating requirements for extant systems [11]. By applying the proposed method, designers can glimpse how humans are likely to adapt their behavior in response to a future technical system. The method thus has the potential to yield relatively robust requirements for a first-generation system before investing in software development, and it may provide a useful bootstrapping method for Agile development of novel systems.
The final paper examines the level of task abstraction for system control, assuming that automation can be designed to take on tasks at increasing levels of abstraction, as in robotics subsumption architectures [12]. In this paper, however, the level of control abstraction is characterized from the human’s viewpoint [13]. At lower levels of abstraction, the human is responsible for continuously controlling the system and must therefore continuously focus attention on that control. At higher levels, the automation provides continuous control and the human need not focus attention on it continuously. Importantly, the paper suggests that the human operator should have the flexibility to select the level of abstraction, adapting the level of system control allocated to the automation and thus the attention and mental resources the human must dedicate to control.
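The notion of operator-selectable control abstraction can be sketched as follows. The level names and the two-valued attention model are hypothetical simplifications for illustration; the paper [13] defines its own characterization:

```python
from enum import IntEnum

class ControlLevel(IntEnum):
    # Hypothetical labels for illustration; [13] defines its own levels.
    MANUAL = 1      # human continuously steers the system
    SETPOINT = 2    # human sets targets; automation tracks them
    TASK = 3        # human assigns discrete tasks
    GOAL = 4        # human states goals; automation plans and executes

class AdaptiveController:
    """Sketch of operator-selectable control abstraction.

    The operator may raise the level to offload continuous control to the
    automation (freeing attention) or lower it to regain direct control.
    """
    def __init__(self, level=ControlLevel.MANUAL):
        self.level = ControlLevel(level)

    def set_level(self, level):
        """Operator adjusts how much control is allocated to automation."""
        self.level = ControlLevel(level)

    def human_attention_demand(self):
        # Higher abstraction -> less continuous attention required of the human
        return "continuous" if self.level == ControlLevel.MANUAL else "intermittent"
```

The key design point is that the level is a runtime parameter under the operator's control, not a fixed allocation chosen at design time.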
As Norman discusses, humans adapt their behavior and their use of systems based upon the affordances the system provides and the demands of the environment [14]. This theme runs throughout the papers in this Special Issue, which demonstrate methods both to gain insight into this adaptive behavior and to apply that knowledge during system design. Importantly, these descriptions of behavior, even when neither mathematical nor prescriptive, provide estimates of future human behavior. They thus offer means to model human interaction with the system under design and to improve the robustness of the deployed system.

Author Contributions

Conceptualization, M.E.M.; writing—original draft preparation, M.E.M.; writing—review and editing, C.F.R. All authors have read and agreed to the published version of the manuscript.

References

  1. International Council on Systems Engineering (INCOSE). A World in Motion: Systems Engineering Vision 2025; INCOSE: San Diego, CA, USA, 2014. Available online: https://www.incose.org/docs/default-source/aboutse/se-vision-2025.pdf?sfvrsn=b69eb4c6_4 (accessed on 23 November 2020).
  2. Miller, M.E.; McGuirl, J.M.; Schneider, M.F.; Ford, T.C. Systems modeling language extension to support modeling of human-agent teams. Syst. Eng. 2020.
  3. Madni, A.M.; Madni, C.C. Architectural Framework for Exploring Adaptive Human-Machine Teaming Options in Simulated Dynamic Environments. Systems 2018, 6, 44.
  4. Watson, M.E.; Rusnock, C.F.; Colombi, J.M.; Miller, M.E. Human-centered design using system modeling language. J. Cogn. Eng. Decis. Mak. 2017, 11, 252–269.
  5. Watson, M.; Rusnock, C.F.; Miller, M.; Colombi, J.M. Informing System Design Using Human Performance Modeling. Syst. Eng. 2017, 20, 173–187.
  6. Knisely, B.M.; Vaughn-Cooke, M. Virtual Modeling of User Populations and Formative Design Parameters. Systems 2020, 8, 35.
  7. Cage, K.; Vaughn-Cooke, M.; Fuge, M. Design and validation of a method to characterize human interaction variability. Systems 2020, 8, 32.
  8. Kristbaum, J.P.; Ciarallo, F.W. Strategic decision facilitation: Supporting critical assumptions of the human in empirical modeling of pairwise value comparisons. Systems 2020, 8, 30.
  9. Alarcon, G.M.; Walter, C.; Gibson, A.M.; Gamble, R.F.; Capiola, A.; Jessup, S.A.; Ryan, T.J. Would you fix this code for me? Effects of repair source and commenting on trust in code repair. Systems 2020, 8, 8.
  10. Dorton, S.L.; Maryeski, L.R.; Ogren, L.; Dykens, I.T.; Main, A. A wargame-augmented knowledge elicitation method for the agile development of novel systems. Systems 2020, 8, 27.
  11. Potter, S.; Gualtieri, J.; Elm, W. Case studies: Applied cognitive work analysis in the design of innovative decision support. In Handbook of Cognitive Task Design; CRC Press: Boca Raton, FL, USA, 2003; pp. 1–35.
  12. Brooks, R.A. A Robust Layered Control System for A Mobile Robot. IEEE J. Robot. Autom. 1986, 2, 14–23.
  13. Johnson, C.; Miller, M.E.; Rusnock, C.F.; Jacques, D.R. Applying Control Abstraction to the Design of Human–Agent Teams. Systems 2020, 8, 10.
  14. Norman, D. The Design of Everyday Things; Basic Books: New York, NY, USA, 2013.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
