Operational Feasibility Analysis of the Multimodal Controller Working Position “TriControl”
Abstract
1. Introduction
2. Background of Multimodal Interfaces, Feasibility Analysis, and the CWP prototype TriControl
2.1. Multimodal HMIs and Their Benefits for ATC
2.2. Validation Methodology for Feasibility Analysis with Usability, Acceptability, and User-Centred Design
2.3. Multimodal CWP TriControl
3. Multimodal CWP Feasibility Analysis
3.1. Evaluation Site and Study Participants’ Characteristics
3.2. Tasks during the Human-in-the-Loop Study for Feasibility Analysis
3.3. System Usability and Feasibility Analysis Questionnaire
4. Results of the Feasibility Study
4.1. Score of System Usability Scale (SUS)
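Section 4.1 reports System Usability Scale results. For readers unfamiliar with the instrument, the standard Brooke (1996) scoring can be sketched as follows; the function name and list layout are illustrative and not taken from the paper:

```python
def sus_score(responses):
    """Standard SUS scoring: 10 items rated 1-5.

    Odd items (positively worded) contribute (response - 1),
    even items (negatively worded) contribute (5 - response);
    the sum is scaled by 2.5 onto a 0-100 range.
    """
    assert len(responses) == 10, "SUS has exactly 10 items"
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

# All-neutral answers (3 on every item) land on the scale midpoint
print(sus_score([3] * 10))  # 50.0
```

A commonly cited benchmark reads scores above 68 as above-average usability.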
4.2. Feasibility Questionnaire Ratings
4.2.1. Ratings on TriControl Concept (T)
4.2.2. Ratings on Command Input
Ratings on Eye-Tracking (E)
Ratings on Clearances (C)
Ratings on Gestures (G)
Ratings on Speech Recognition (S)
Ratings on Input Procedure (I)
4.2.3. Ratings on Radar Screen (R)
4.3. Feasibility Questionnaire Comments of All ATCOs
5. Discussion of TriControl Feasibility
- Were able to perform parallel input with different modalities,
- Hardly experienced any malfunction with the multitouch pad correspondence,
- Did not forget to perform the confirmation gesture after command completion,
- Did not perform wrong gestures,
- Did not experience any troubles with eye-tracking,
- Experienced more reliable speech recognition,
- Did not make other interaction mistakes, such as:
  - Pressing too long for confirmation and thus turning the input into a direct_to command,
  - Forgetting to toggle back from the multitouch device’s graphical user interface mode,
  - Pressing the foot pedal for voice recording during complete command creation.
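The command states mentioned in the questionnaire (inactive, active, receiving, received, accepted; statement T3.2 in Appendix A) together with the mistakes listed above suggest a simple input cycle: gaze selects an aircraft, gesture and speech supply command type and value, and a confirmation gesture issues the command. The sketch below is an illustration only; the event names are hypothetical and do not come from TriControl's implementation:

```python
from enum import Enum, auto

class CommandState(Enum):
    """Command input states as surfaced in the aircraft label (cf. T3.2)."""
    INACTIVE = auto()   # no aircraft selected
    ACTIVE = auto()     # aircraft selected via eye-tracking
    RECEIVING = auto()  # gesture (type) and speech (value) being captured
    RECEIVED = auto()   # complete command awaiting the confirmation gesture
    ACCEPTED = auto()   # confirmation gesture performed; command issued

# Hypothetical happy-path transitions for the input cycle described above
TRANSITIONS = {
    (CommandState.INACTIVE, "gaze_select"): CommandState.ACTIVE,
    (CommandState.ACTIVE, "gesture_and_speech"): CommandState.RECEIVING,
    (CommandState.RECEIVING, "input_complete"): CommandState.RECEIVED,
    (CommandState.RECEIVED, "confirm_gesture"): CommandState.ACCEPTED,
    (CommandState.RECEIVED, "cancel"): CommandState.INACTIVE,
}

def step(state, event):
    # Unknown events leave the state unchanged
    return TRANSITIONS.get((state, event), state)
```

A forgotten confirmation gesture, in this reading, simply leaves the command stuck in RECEIVED.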
6. Summary, Conclusions, and Outlook
7. Patents
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
Appendix A
 |  | N = 14 (All ATCOs) |  |  |  |  | N = 4 (Active APP ATCOs) | 
---|---|---|---|---|---|---|---|---
No. | Statement ¹ | M | SD | n | k | p | M | SD
Controlling | ||||||||
T1.1 | I was able to guide the aircraft to their destination. | 4.8 | 1.1 | 14 | 11 | 0.03 | 5.8 | 0.5 |
T1.2 | I was able to guide the aircraft following the common safety requirements. | 4.5 | 1.2 | 14 | 10 | 0.09 | 5.3 | 1.0 |
T1.3 | I was able to guide the aircraft in an efficient way. | 3.4 | 1.7 | 14 | 8 | 0.40 | 5.3 | 0.5 |
Task Adequacy | ||||||||
T2.1 | The interface sufficiently supported me achieving my best performance. | 3.6 | 1.5 | 14 | 7 | 0.60 | 4.3 | 1.0 |
T2.2 | The parallel command input enabled me to issue commands fast and effectively (type and value). | 3.4 | 2.0 | 14 | 7 | 0.60 | 5.3 | 0.5 |
T2.3 | All in all the command procedure was appropriate for its intended use (input and feedback). | 4.4 | 1.4 | 14 | 11 | 0.03 | 5.0 | 0.8 |
Self-Descriptiveness | ||||||||
T3.1 | I was always aware of the state of use I was currently operating in (monitoring, issuing commands). | 3.9 | 1.1 | 14 | 9 | 0.21 | 5.0 | 0.0 |
T3.2 | I was always aware of the state the command input was in (inactive, active, receiving, received, accepted). | 3.6 | 0.9 | 14 | 8 | 0.40 | 4.0 | 0.8 |
T3.3 | I always knew which actions I was able to execute at any given moment. | 4.4 | 1.1 | 14 | 11 | 0.03 | 5.3 | 0.5 |
T3.4 | I always knew how those actions had to be executed. | 5.1 | 0.6 | 14 | 14 | 0.00 | 5.3 | 0.5 |
Expectation Conformity | ||||||||
T4.1 | I was always able to intuitively interact with the interface the way I needed to. | 3.6 | 1.6 | 14 | 9 | 0.21 | 5.0 | 0.0 |
T4.2 | TriControl matched common conventions of use (content, depictions, specificity of numeric information, etc.). | 3.6 | 1.6 | 14 | 8 | 0.40 | 4.8 | 0.5 |
Controllability | ||||||||
T5.1 | I was able to start the command issuing exactly when I wanted to. | 3.8 | 1.8 | 14 | 9 | 0.21 | 5.0 | 0.8 |
T5.2 | I was able to control the command issuing the way I wanted to (proceed, cancel, or confirm). | 4.1 | 1.5 | 14 | 11 | 0.03 | 5.3 | 0.5 |
T5.3 | I was able to control the pace at which the commands were entered. | 3.6 | 1.7 | 14 | 9 | 0.21 | 4.0 | 0.8 |
Error Tolerance | ||||||||
T6.1 | In case of a mistake, a command could still be issued with little extra effort (time and mental effort). | 3.8 | 1.7 | 13 | 9 | 0.13 | 5.0 | 0.0 |
Suitability for Individualization | ||||||||
T7.1 | I would like to be able to adapt the interface to my personal preferences. | 4.2 | 1.5 | 9 | 7 | 0.09 | 4.0 | 2.0 |
Satisfaction and Acceptability of TriControl | ||||||||
T8.1 | TriControl is useful for managing routine approach air traffic. | 3.6 | 1.6 | 14 | 8 | 0.40 | 4.8 | 1.0 |
T8.2 | Working with TriControl is more effective than working with common interfaces. | 2.8 | 1.4 | 14 | 3 | 0.99 | 3.3 | 0.5 |
T8.3 | TriControl is easy to use. | 4.3 | 1.6 | 14 | 9 | 0.21 | 5.5 | 1.0 |
T8.4 | TriControl is user friendly. | 4.4 | 1.6 | 14 | 10 | 0.09 | 5.8 | 0.5 |
T8.5 | It is easy to learn to use TriControl. | 4.9 | 1.0 | 14 | 13 | 0.00 | 5.8 | 0.5 |
T8.6 | Overall, I am satisfied with TriControl. | 4.2 | 1.4 | 14 | 10 | 0.09 | 5.3 | 0.5 |
T8.7 | I would want to use TriControl for my daily work if I had the option. | 3.3 | 1.8 | 12 | 5 | 0.81 | 4.0 | 2.2 |
T8.8 | I would prefer TriControl over common ATC interfaces. | 3.1 | 1.5 | 14 | 6 | 0.79 | 3.5 | 1.3 |
 |  | N = 14 (All ATCOs) |  |  |  |  | N = 4 (Active APP ATCOs) | 
---|---|---|---|---|---|---|---|---
No. | Statement ¹ | M | SD | n | k | p | M | SD
Aircraft Selection | ||||||||
E1.1 | I was able to select every aircraft I wanted to. | 3.8 | 1.3 | 14 | 10 | 0.09 | 4.5 | 0.6 |
E1.2 | Only little effort was needed to select aircraft. | 4.0 | 1.6 | 13 | 10 | 0.05 | 5.0 | 0.8 |
Satisfaction and Acceptability of the Eye-Tracking Feature | ||||||||
E2.1 | The eye-tracking method is useful for aircraft selection. | 4.1 | 1.4 | 14 | 11 | 0.03 | 5.0 | 0.0 |
E2.2 | The eye-tracking method works more effectively than conventional aircraft selection methods. | 2.9 | 1.4 | 14 | 6 | 0.79 | 3.0 | 1.4 |
E2.3 | The eye-tracking method is easy to use. | 4.1 | 1.5 | 14 | 10 | 0.09 | 5.0 | 1.4 |
E2.4 | The eye-tracking method is user-friendly. | 3.9 | 1.4 | 14 | 10 | 0.09 | 4.8 | 1.3 |
E2.5 | It is easy to learn to use the eye-tracking method. | 4.9 | 1.0 | 14 | 12 | 0.01 | 5.5 | 0.6 |
E2.6 | Overall, I am satisfied with the eye-tracking as a method of aircraft selection. | 3.7 | 1.4 | 14 | 9 | 0.21 | 4.8 | 0.5 |
E2.7 | I would want to use it for my daily work if I had the option. | 3.3 | 1.9 | 14 | 7 | 0.60 | 3.8 | 2.2 |
E2.8 | I would prefer it over conventional input methods. | 3.1 | 1.8 | 14 | 7 | 0.60 | 3.3 | 1.7 |
Issuing Commands | ||||||||
C2.1 | I was able to issue altitude clearance. | 4.9 | 0.9 | 14 | 13 | 0.00 | 5.8 | 0.5 |
C2.2 | I was able to issue speed clearance. | 4.9 | 0.9 | 14 | 13 | 0.00 | 5.8 | 0.5 |
C2.3 | I was able to issue heading clearance. | 4.8 | 1.0 | 14 | 13 | 0.00 | 5.5 | 1.0 |
C2.4 | I was able to command heading to a certain waypoint. | 5.0 | 0.7 | 12 | 12 | 0.00 | 5.7 | 0.8 |
C2.5 | I was able to command hand over to tower. | 4.0 | 1.8 | 14 | 9 | 0.21 | 5.0 | 2.0 |
C2.6 | I was able to identify when I was able to issue commands. | 4.8 | 1.2 | 13 | 12 | 0.00 | 5.8 | 0.6 |
C2.7 | I was able to identify when my commands were being received. | 4.7 | 1.3 | 14 | 12 | 0.01 | 5.8 | 0.5 |
C2.8 | I was able to identify when my commands were being accepted by the system. | 4.8 | 1.1 | 14 | 13 | 0.00 | 5.3 | 1.0 |
C2.9 | I was able to enter command type and command value simultaneously. | 3.9 | 1.5 | 14 | 9 | 0.21 | 4.5 | 1.3 |
Satisfaction and Acceptability of the Gesture-Based Command Type Input | ||||||||
G2.1 | The gesture-based command type input is useful for the input of command types. | 4.6 | 1.2 | 14 | 11 | 0.03 | 5.5 | 0.6 |
G2.2 | The gesture-based command type input is more effective than common approaches. | 3.2 | 1.4 | 14 | 7 | 0.60 | 3.5 | 1.3 |
G2.3 | The gesture-based command type input is easy to use. | 4.4 | 1.4 | 14 | 10 | 0.09 | 5.3 | 0.5 |
G2.4 | The gesture-based command type input method is user friendly. | 4.1 | 1.5 | 14 | 9 | 0.21 | 5.0 | 0.8 |
G2.5 | It is easy to learn the gestures. | 5.1 | 0.5 | 14 | 14 | 0.00 | 5.5 | 0.6 |
G2.6 | Overall, I am satisfied with the gesture-based command type input. | 4.0 | 1.5 | 14 | 9 | 0.21 | 5.3 | 0.5 |
G2.7 | I would want to use it for my daily work if I had the option. | 3.4 | 1.4 | 14 | 7 | 0.60 | 4.0 | 1.2 |
G2.8 | I would prefer it over common methods of command type input. | 3.1 | 1.4 | 14 | 6 | 0.79 | 3.5 | 1.3 |
Satisfaction and Acceptability of the Speech Recognition-Based Command Value Input | ||||||||
S2.1 | Speech recognition is useful for the input of command values. | 4.2 | 1.4 | 14 | 9 | 0.21 | 5.3 | 0.5 |
S2.2 | The speech recognition command value input is more effective than common approaches. | 3.3 | 1.3 | 13 | 6 | 0.71 | 4.3 | 1.2 |
S2.3 | The speech recognition is easy to use. | 4.0 | 1.6 | 13 | 8 | 0.29 | 5.3 | 0.5 |
S2.4 | The speech recognition-based command value input is user friendly. | 4.1 | 1.5 | 14 | 9 | 0.21 | 5.3 | 0.5 |
S2.5 | It is easy to learn to use the speech recognition. | 4.4 | 1.4 | 14 | 10 | 0.09 | 5.3 | 0.5 |
S2.6 | It was easy to get used to only verbalize the command value and not the whole command. | 4.3 | 1.4 | 14 | 11 | 0.03 | 4.8 | 0.5 |
S2.7 | Overall, I am satisfied with the speech recognition-based command value input. | 3.8 | 1.3 | 14 | 7 | 0.60 | 4.8 | 0.5 |
S2.8 | I would want to use it for my daily work if I had the option. | 3.2 | 1.6 | 14 | 6 | 0.79 | 3.8 | 1.5 |
S2.9 | I would prefer it over common methods of command value input. | 3.0 | 1.2 | 14 | 6 | 0.79 | 3.3 | 1.0 |
Satisfaction and Acceptability of the Complete Command Input Procedure | ||||||||
I2.1 | TriControl command input procedure is useful for issuing commands. | 4.3 | 1.3 | 14 | 10 | 0.09 | 4.8 | 0.5 |
I2.2 | The command input procedure is more effective than common approaches for command issuing. | 2.9 | 1.3 | 14 | 5 | 0.91 | 3.0 | 1.4 |
I2.3 | TriControl’s command input procedure is easy to use. | 4.2 | 1.6 | 14 | 10 | 0.09 | 5.0 | 0.8 |
I2.4 | The combination of eye-tracking, gestures, speech recognition and confirmation is user friendly. | 3.8 | 1.7 | 14 | 8 | 0.40 | 4.8 | 1.3 |
I2.5 | It is easy to learn to use the command input procedure. | 4.7 | 1.1 | 14 | 13 | 0.00 | 5.0 | 0.8 |
I2.6 | Overall, I am satisfied with the command input procedure. | 3.9 | 1.4 | 14 | 9 | 0.21 | 5.0 | 0.0 |
I2.7 | I would want to use the command input procedure for my daily work if I had the option. | 3.1 | 1.4 | 14 | 7 | 0.60 | 3.8 | 1.0 |
I2.8 | I would prefer the command input procedure over common methods of command value input. | 2.6 | 1.3 | 14 | 4 | 0.97 | 2.8 | 1.0 |
 |  | N = 14 (All ATCOs) |  |  |  |  | N = 4 (Active APP ATCOs) | 
---|---|---|---|---|---|---|---|---
No. | Statement ¹ | M | SD | n | k | p | M | SD
Aircraft within my sector: Identification | ||||||||
R1.1.1 | I was able to identify every aircraft’s presence. | 4.9 | 1.2 | 14 | 12 | 0.01 | 5.3 | 0.5 |
R1.1.2 | I was able to identify every aircraft’s location. | 5.1 | 0.9 | 14 | 13 | 0.00 | 5.5 | 0.6 |
R1.1.3 | I was able to identify every aircraft’s call sign. | 5.4 | 0.6 | 14 | 14 | 0.00 | 5.8 | 0.5 |
R1.1.4 | I was able to identify every aircraft’s weight class. | 4.0 | 1.8 | 14 | 7 | 0.60 | 3.3 | 1.9 |
Aircraft within my sector: Coordination | ||||||||
R1.2.1 | I was able to obtain information regarding every aircraft’s altitude. | 4.7 | 1.1 | 14 | 12 | 0.01 | 4.5 | 1.7 |
R1.2.2 | I was able to obtain information regarding every aircraft’s cleared altitude. | 4.6 | 1.1 | 14 | 12 | 0.01 | 4.5 | 1.7 |
R1.2.3 | I was able to obtain information regarding every aircraft’s speed. | 4.7 | 1.1 | 14 | 12 | 0.01 | 4.5 | 1.7 |
R1.2.4 | I was able to obtain information regarding every aircraft’s cleared speed. | 4.6 | 1.1 | 14 | 12 | 0.01 | 4.5 | 1.7 |
R1.2.5 | I was able to obtain information regarding every aircraft’s heading. | 4.6 | 1.1 | 14 | 12 | 0.01 | 4.5 | 1.7 |
R1.2.6 | I was able to obtain information regarding every aircraft’s cleared heading. | 4.6 | 1.1 | 14 | 12 | 0.01 | 4.5 | 1.7 |
R1.2.7 | I was able to obtain information regarding every aircraft’s next selected waypoint. | 4.6 | 1.1 | 13 | 11 | 0.01 | 4.3 | 2.1 |
R1.2.8 | I was able to obtain information regarding every aircraft’s distance to another aircraft. | 3.8 | 1.5 | 14 | 9 | 0.21 | 2.8 | 2.2 |
R1.2.9 | I was able to obtain information regarding every aircraft’s sequence number suggested by the AMAN. | 4.3 | 1.3 | 11 | 8 | 0.11 | 5.0 | 0.0 |
R1.2.10 | I was able to obtain information regarding every aircraft’s miscellaneous information (Cleared ILS, Handover to Tower). | 4.2 | 1.4 | 13 | 8 | 0.29 | 4.0 | 1.8 |
Aircraft heading into my sector: Identification | ||||||||
R2.1.1 | I was able to obtain the information that aircraft were heading into my sector. | 4.0 | 1.1 | 11 | 7 | 0.27 | 5.0 | 0.0 |
R2.1.2 | I was able to obtain the information how many aircraft were heading into my sector. | 3.8 | 1.2 | 12 | 6 | 0.61 | 4.3 | 1.2 |
R2.1.3 | I was able to obtain the call sign of every aircraft heading into my sector. | 4.8 | 0.8 | 13 | 12 | 0.00 | 5.3 | 0.5 |
Aircraft heading into my sector: Coordination | ||||||||
R2.2.1 | I was able to obtain every aircraft’s estimated time of arrival (ETA). | 2.5 | 0.5 | 6 | 0 | 0.99 | - | - |
R2.2.2 | I was able to obtain every aircraft’s point of entry. | 2.4 | 1.0 | 7 | 1 | 0.99 | - | - |
Orientation Aids | ||||||||
R3.1 | I was able to obtain the runway location. | 5.3 | 0.6 | 13 | 13 | 0.00 | 5.3 | 0.6 |
R3.2 | I was able to obtain the runway orientation. | 5.4 | 0.5 | 13 | 13 | 0.00 | 5.3 | 0.6 |
R3.3 | I was able to obtain the extended runway centerline. | 5.4 | 0.5 | 14 | 14 | 0.00 | 5.3 | 0.5 |
R3.4 | I was able to obtain the standard arrival routes (STAR). | 5.2 | 0.6 | 11 | 11 | 0.00 | 5.3 | 0.6 |
R3.5 | I was able to obtain the borders of my airspace sector. | 5.1 | 0.8 | 11 | 10 | 0.01 | 5.3 | 0.6 |
R3.6 | I was able to obtain GPS waypoints. | 5.3 | 0.5 | 14 | 14 | 0.00 | 5.3 | 0.5 |
The Centerline Separation Range | ||||||||
R4.1 | I was able to obtain the location of aircraft in final descent. | 4.8 | 1.0 | 13 | 11 | 0.01 | 5.3 | 0.5 |
R4.2 | I was able to obtain the separation between aircraft and neighboring elements (runway, different aircraft). | 4.8 | 1.0 | 13 | 11 | 0.01 | 5.3 | 0.5 |
R4.3 | I was able to obtain the weight class of aircraft in final descent. | 4.3 | 1.6 | 13 | 9 | 0.13 | 3.5 | 2.4 |
Information Design: Clarity | ||||||||
R5.1.1 | I was able to obtain all information quickly. | 4.1 | 1.3 | 14 | 11 | 0.03 | 4.3 | 1.7 |
R5.1.2 | All information is as specific as I need it to be. | 4.0 | 1.1 | 14 | 10 | 0.09 | 4.3 | 1.0 |
Information Design: Discriminability | ||||||||
R5.2.1 | I was able to discriminate between different radar screen elements in general. | 4.7 | 0.9 | 14 | 12 | 0.01 | 5.0 | 0.8 |
Information Design: Discriminability—Aircraft | ||||||||
R5.2.1.1 | I was able to easily discriminate between different aircraft within my sector. | 4.8 | 1.1 | 14 | 13 | 0.00 | 5.5 | 0.6 |
R5.2.1.2 | I was able to easily discriminate between different aircraft heading into my sector. | 4.6 | 1.1 | 14 | 12 | 0.01 | 5.3 | 1.0 |
R5.2.1.3 | I was able to easily discriminate between different information within the label. | 4.5 | 1.2 | 14 | 11 | 0.03 | 5.3 | 1.0 |
R5.2.1.4 | I was able to easily discriminate between different command states within the aircraft label (Inactive, active, received, confirmed). | 4.1 | 1.5 | 14 | 9 | 0.21 | 5.0 | 0.8 |
R5.2.1.5 | I was able to easily discriminate between different indicated weight classes. | 4.1 | 1.2 | 13 | 9 | 0.13 | 3.8 | 1.7 |
R5.2.1.6 | I was able to easily discriminate between different Arrival Manager order suggestions. | 3.8 | 1.5 | 10 | 7 | 0.17 | 5.0 | 1.4 |
R5.2.1.7 | I was able to easily discriminate between different heading directions. | 3.9 | 1.4 | 14 | 10 | 0.09 | 4.3 | 1.7 |
Information Design: Discriminability—Orientation Aids and Centerline Separation Range (CSR) | ||||||||
R5.2.2.1 | I was able to easily discriminate between different categories of orientation aids in general. | 4.7 | 0.8 | 11 | 10 | 0.01 | 4.0 | 1.4 |
R5.2.2.2 | I was able to easily discriminate between different GPS waypoints. | 5.0 | 0.8 | 11 | 11 | 0.00 | 5.3 | 0.8 |
R5.2.2.3 | I was able to easily discriminate between different runways. | 5.2 | 0.6 | 10 | 10 | 0.00 | 5.3 | 0.8 |
R5.2.2.4 | I was able to easily discriminate between different aircraft on the CSR. | 5.3 | 0.7 | 8 | 8 | 0.00 | 5.5 | 0.7 |
R5.2.2.5 | I was able to easily discriminate between different distances between aircraft on the CSR. | 5.0 | 0.9 | 8 | 7 | 0.04 | 5.3 | 0.6 |
Information Design: Consistency | ||||||||
R5.3.1 | The format of the information given was consistent with what I expected it to be. | 4.4 | 1.0 | 14 | 13 | 0.00 | 4.8 | 1.0 |
Information Design: Compactness | ||||||||
R5.4.1 | I obtained all the information I needed to monitor the area effectively. | 4.0 | 1.5 | 14 | 10 | 0.09 | 5.0 | 0.8 |
R5.4.2 | The radar screen didn’t present any unnecessary information. | 4.5 | 0.9 | 14 | 12 | 0.01 | 4.3 | 1.0 |
Information Design: Detectability | ||||||||
R5.5.1 | I was able to direct my attention towards the currently necessary information. | 3.9 | 1.4 | 14 | 10 | 0.09 | 4.8 | 0.5 |
R5.5.2 | The radar screen didn’t divert my attention towards currently unnecessary information. | 4.4 | 1.2 | 14 | 11 | 0.03 | 5.3 | 1.0 |
Information Design: Readability | ||||||||
R5.6.1 | I was able to easily read alphanumeric information concerning the aircraft. | 4.9 | 1.0 | 14 | 13 | 0.00 | 5.5 | 0.6 |
R5.6.2 | I was able to easily read alphanumeric information concerning the orientation aids. | 5.1 | 0.7 | 14 | 13 | 0.00 | 5.5 | 0.6 |
R5.6.3 | I was able to easily read alphanumeric information within the CSR. | 5.2 | 0.6 | 11 | 11 | 0.00 | 5.5 | 0.7 |
Information Design: Comprehensibility of coded meaning | ||||||||
R5.7.1 | I was able to easily understand the coded information in general. | 4.5 | 1.1 | 13 | 11 | 0.01 | 5.3 | 0.6 |
R5.7.2 | I perceived the used coding of information as unambiguous. | 3.9 | 1.4 | 12 | 8 | 0.19 | 4.0 | 1.7 |
R5.7.3 | I was able to easily interpret all used codes. | 4.1 | 1.3 | 13 | 9 | 0.13 | 5.3 | 0.6 |
R5.7.4 | I found it easy to deduce the coded meaning of the given information. | 4.2 | 1.3 | 12 | 9 | 0.07 | 5.3 | 0.6 |
Satisfaction and Acceptability of the Radar Screen | ||||||||
R6.1 | The information design used in the radar screen is useful for sector monitoring. | 4.1 | 0.8 | 14 | 12 | 0.01 | 4.0 | 0.8 |
R6.2 | The radar screen depicts information more effectively than conventional models. | 2.8 | 1.2 | 13 | 3 | 0.99 | 3.0 | 1.4 |
R6.3 | The radar screen is easy to use for monitoring. | 4.1 | 0.9 | 14 | 11 | 0.03 | 4.3 | 1.0 |
R6.4 | The radar screen design is user friendly. | 4.2 | 1.0 | 13 | 11 | 0.01 | 4.5 | 1.0 |
R6.5 | It was easy to learn to use the radar screen. | 4.4 | 1.0 | 14 | 13 | 0.00 | 4.5 | 1.7 |
R6.6 | Overall, I am satisfied with the radar screen information design. | 3.9 | 1.2 | 14 | 10 | 0.09 | 3.8 | 2.1 |
R6.7 | I would want to use it for my daily work if I had the option. | 3.0 | 1.3 | 13 | 6 | 0.71 | 3.0 | 1.4 |
R6.8 | I would prefer it over conventional radar screen designs. | 2.7 | 1.2 | 13 | 4 | 0.95 | 2.8 | 1.0 |
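In the tables above, k appears to count the agreeing ratings among the n responses per item, and the tabulated p values are consistent with a one-sided binomial (sign) test against a chance probability of 0.5. A minimal check of that reading (the function is illustrative, not from the paper):

```python
from math import comb

def binom_p(k: int, n: int) -> float:
    """P(X >= k) for X ~ Binomial(n, 0.5): one-sided sign-test p-value."""
    return sum(comb(n, i) for i in range(k, n + 1)) / 2 ** n

# T1.1: k = 11 of n = 14 reproduces the tabulated p
print(round(binom_p(11, 14), 2))  # 0.03
# T8.2: k = 3 of n = 14 likewise
print(round(binom_p(3, 14), 2))  # 0.99
```

Under this reading, p ≤ 0.05 marks items on which significantly more than half of the ATCOs agreed.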
References
- Quek, F.; McNeill, D.; Bryll, R.; Kirbas, C.; Arslan, H.; McCullough, K.E.; Furuyama, N.; Ansari, R. Gesture, speech, and gaze cues for discourse segmentation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2000 (Cat. No.PR00662), Hilton Head Island, SC, USA, 15 June 2000; Volume 2, pp. 247–254. [Google Scholar]
- Oviatt, S.L. Multimodal interfaces. In The Human-Computer Interaction Handbook: Fundamentals, Evolving Technologies and Emerging Applications; CRC Press: Boca Raton, FL, USA, 2003; pp. 286–304. [Google Scholar]
- Oviatt, S.L. Advances in Robust Multimodal Interface Design. IEEE Comput. Graph. Appl. 2003, 23, 62–68. [Google Scholar] [CrossRef]
- Koons, D.B.; Sparrell, C.J.; Thorisson, K.R. Integrating simultaneous input from speech, gaze, and hand gestures. In Intelligent Multimedia Interfaces; Maybury, M.T., Ed.; American Association for Artificial Intelligence: Menlo Park, CA, USA, 1993; pp. 257–276. [Google Scholar]
- Uebbing-Rumke, M.; Gürlük, H.; Jauer, M.-L.; Hagemann, K.; Udovic, A. Usability evaluation of multi-touch displays for TMA controller working positions. In Proceedings of the 4th SESAR Innovation Days, Madrid, Spain, 25–27 November 2014. [Google Scholar]
- Gürlük, H.; Helmke, H.; Wies, M.; Ehr, H.; Kleinert, M.; Mühlhausen, T.; Muth, K.; Ohneiser, O. Assistant Based Speech Recognition—Another Pair of Eyes for the Arrival Manager. In Proceedings of the 34th Digital Avionics Systems Conference (DASC), Prague, Czech Republic, 13–17 September 2015. [Google Scholar]
- Helmke, H.; Ohneiser, O.; Mühlhausen, T.; Wies, M. Reducing Controller Workload with Automatic Speech Recognition. In Proceedings of the 35th Digital Avionics Systems Conference (DASC), Sacramento, CA, USA, 25–29 September 2016. [Google Scholar]
- Möhlenbrink, C.; Papenfuß, A. Eye-data metrics to characterize tower controllers’ visual attention in a multiple remote tower exercise. In Proceedings of the ICRAT, Istanbul, Turkey, 26–30 May 2014. [Google Scholar]
- Ohneiser, O.; Jauer, M.-L.; Gürlük, H.; Uebbing-Rumke, M. TriControl—A Multimodal Air Traffic Controller Working Position. In Proceedings of the 6th SESAR Innovation Days, Delft, The Netherlands, 8–10 November 2016. [Google Scholar]
- Ohneiser, O.; Jauer, M.-L.; Rein, J.R.; Wallace, M. Faster Command Input Using the Multimodal Controller Working Position “TriControl”. Aerospace 2018, 5, 54. [Google Scholar] [CrossRef] [Green Version]
- Tiewtrakul, T.; Fletcher, S.R. The challenge of regional accents for aviation English language proficiency standards: A study of difficulties in understanding in air traffic control-pilot communications. Ergonomics 2010, 2, 229–239. [Google Scholar] [CrossRef] [PubMed]
- ICAO. The Second Meeting of the Regional Airspace Safety Monitoring Advisory Group (RASMAG/2). 2004. Available online: https://www.icao.int/Meetings/AMC/MA/2004/RASMAG2/ip03.pdf (accessed on 14 January 2020).
- Chatty, S.; Lecoanet, P. Pen Computing for Air Traffic Control. In Proceedings of the CHI’96: SIGCHI Conference on Human Factors in Computing Systems, Vancouver, BC, Canada, 13–18 April 1996; pp. 87–94. [Google Scholar]
- Schmugler, A. Feasibility Analysis of the Multimodal Air Traffic Controller Working Position Prototype “TriControl”. Master’s Thesis, Technische Universität Dresden, Dresden, Germany, 2018. [Google Scholar]
- Czaja, S.J.; Nair, S.N. Human Factors Engineering and Systems Design. In Handbook of Human Factors and Ergonomics; Salvendy, G., Ed.; John Wiley & Sons: Hoboken, NJ, USA, 2006. [Google Scholar] [CrossRef]
- Bernsen, N. Multimodality Theory. In Multimodal User Interfaces. Signals and Communication Technologies; Tzovaras, D., Ed.; Springer: Berlin/Heidelberg, Germany, 2008. [Google Scholar] [CrossRef]
- Adkar, P. Unimodal and Multimodal Human Computer Interaction: A Modern Overview. Int. J. Comput. Sci. Inf. Eng. Technol. 2013, 2, 8–15. [Google Scholar]
- Norman, D.A. Design Rules Based on Analysis of Human Error. Commun. ACM 1983, 4, 254–258. [Google Scholar] [CrossRef]
- Nachreiner, F.; Nickel, P.; Meyer, I. Human factors in process control systems: The design of human-machine interfaces. Saf. Sci. 2006, 44, 5–26. [Google Scholar] [CrossRef]
- Sheridan, T.B. Humans and Automation. System Design and Research Issues. In Wiley Series in System Engineering and Management: HFES Issues in Human Factors and Ergonomics Series; Human Factors and Ergonomics Society, Ed.; John Wiley & Sons Inc.: Hoboken, NJ, USA, 2002; Volume 3. [Google Scholar]
- EUROCONTROL. Integrated Task and Job Analysis of Air Traffic Controllers—Phase 2: Task Analysis of En-route Controllers; EUROCONTROL: Brussels, Belgium, 1999. [Google Scholar]
- EUROCONTROL. Integrated Task and Job Analysis of Air Traffic Controllers—Phase 3: Baseline Reference of Air Traffic Controller Task and Cognitive Processes in the ECAC Area; EUROCONTROL: Brussels, Belgium, 2000. [Google Scholar]
- Cardosi, K.M.; Brett, B.; Han, S. An Analysis of TRACON (Terminal Radar Approach Control) Controller-Pilot Voice Communications; DOT/FAA/AR-96/66; DOT FAA: Washington, DC, USA, 1996.
- Proctor, R.W.; Vu, K.-P.L. Human Information Processing: An Overview for Human-Computer Interaction. In Human Computer Interaction Fundamentals; Sears, A., Jacko, J.A., Eds.; CRC Press: Boca Raton, FL, USA, 2009; pp. 19–38. [Google Scholar]
- Oviatt, S.L. Human-centered design meets cognitive load theory: Designing interfaces that help people think. In Proceedings of the 14th Annual ACM international Conference on Multimedia, New York, NY, USA, 23–27 October 2006; pp. 871–880. [Google Scholar]
- Baddeley, A.D. Working Memory. Science 1992, 255, 556–559. [Google Scholar] [CrossRef]
- Bolt, R.A. Put-that-there: Voice and gesture at the graphics interface. Comput. Graph. 1980, 3, 262–270. [Google Scholar] [CrossRef]
- Nigay, L.; Coutaz, J. A Design Space for Multimodal Systems: Concurrent Processing and Data Fusion. In Proceedings of the INTERCHI’93 Conference on Human Factors in Computing Systems, Amsterdam, The Netherlands, 24–29 April 1993; pp. 172–178. [Google Scholar]
- Bourguet, M.L. Designing and Prototyping Multimodal Commands. In Proceedings of the Human-Computer Interaction INTERACT’03, Zurich, Switzerland, 1–5 September 2003; pp. 717–720. [Google Scholar]
- Oviatt, S.L. Breaking the Robustness Barrier: Recent Progress on the Design of Robust Multimodal Systems. Adv. Comput. 2002, 56, 305–341. [Google Scholar]
- Manawadu, E.U.; Kamezaki, M.; Ishikawa, M.; Kawano, T.; Sugano, S. A Multimodal Human-Machine Interface Enabling Situation-Adaptive Control Inputs for Highly Automated Vehicles. In Proceedings of the IEEE Intelligent Vehicles Symposium (IV), Los Angeles, CA, USA, 11–14 June 2017; pp. 1195–1200. [Google Scholar]
- Pentland, A. Perceptual Intelligence. Commun. ACM 2000, 4, 35–44. [Google Scholar] [CrossRef]
- Seifert, K. Evaluation of Multimodal Computer Systems in Early Development Phases, Original German Title: Evaluation Multimodaler Computer-Systeme in Frühen Entwicklungsphasen. Ph.D. Thesis, Technische Universität Berlin, Berlin, Germany, 2002. [Google Scholar] [CrossRef]
- Oviatt, S.L. Multimodal interactive maps: Designing for human performance. Hum. Comput. Interact. 1997, 12, 93–129. [Google Scholar]
- Cohen, P.R.; McGee, D.R. Tangible multimodal interfaces for safety-critical applications. Commun. ACM 2004, 1, 1–46. [Google Scholar] [CrossRef]
- den Os, E.; Boves, L. User behaviour in multimodal interaction. In Proceedings of the HCI International, Las Vegas, NV, USA, 22–27 July 2005; Available online: http://lands.let.ru.nl/literature/boves.2005.2.pdf (accessed on 14 January 2020).
- Shi, Y.; Taib, R.; Ruiz, N.; Choi, E.; Chen, F. Multimodal Human-Machine Interface and User Cognitive Load Measurement. Proc. Int. Fed. Autom. Control 2007, 40, 200–205. [Google Scholar] [CrossRef]
- Oviatt, S. User-centered modeling for spoken language and multimodal interfaces. IEEE Multimed. 1996, 4, 26–35. [Google Scholar] [CrossRef]
- Oviatt, S.L. Mutual disambiguation of recognition errors in a multimodal architecture. In Proceedings of the SIGCHI conference on Human Factors in Computing Systems, Pittsburgh, PA, USA, 15–20 May 1999; pp. 576–583. [Google Scholar]
- Oviatt, S.L. Ten myths of multimodal interaction. Commun. ACM 1999, 11, 74–81. [Google Scholar] [CrossRef]
- Oviatt, S.L.; Coulston, R.; Lunsford, R. When do we interact multimodally? Cognitive load and multimodal communication patterns. In Proceedings of the 6th International Conference on Multimodal interfaces, State College, PA, USA, 13–15 October 2004; pp. 129–136. [Google Scholar]
- Oviatt, S.L.; Coulston, R.; Tomko, S.; Xiao, B.; Lunsford, R.; Wesson, M.; Carmichael, L. Toward a theory of organized multimodal integration patterns during human-computer interaction. In Proceedings of the ICMI 5th International Conference on Multimodal Interfaces, Vancouver, BC, Canada, 5–7 November 2003; pp. 44–51. [Google Scholar]
- Marusich, L.R.; Bakdash, J.Z.; Onal, E.; Yu, M.S.; Schaffer, J.; O’Donovan, J.; Höllerer, T.; Buchler, N.; Gonzalez, C. Effects of information availability on command-and-control decision making performance, trust, and situation awareness. Hum. Factors 2016, 2, 301–321. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Connolly, D.W. Voice Data Entry in Air Traffic Control; Report N93-72621; National Aviation Facilities Experimental Center: Atlantic City, NJ, USA, 1977. [Google Scholar]
- ICAO. ATM (Air Traffic Management): Procedures for Air Navigation Services; DOC 4444 ATM/501; International Civil Aviation Organization (ICAO): Montréal, QC, Canada, 2007. [Google Scholar]
- Helmke, H.; Oualil, Y.; Schulder, M. Quantifying the Benefits of Speech Recognition for an Air Traffic Management Application. Konferenz Elektronische Sprachsignalverarbeitung. 2017, pp. 114–121. Available online: http://essv2017.coli.uni-saarland.de/pdfs/Helmke.pdf (accessed on 14 January 2020).
- Helmke, H.; Slotty, M.; Poiger, M.; Herrer, D.F.; Ohneiser, O.; Vink, N.; Cerna, A.; Hartikainen, P.; Josefsson, B.; Langr, D.; et al. Ontology for transcription of ATC speech commands of SESAR 2020 solution PJ.16-04. In Proceedings of the IEEE/AIAA 37th Digital Avionics Systems Conference (DASC), London, UK, 23–27 September 2018. [Google Scholar]
- Cordero, J.M.; Dorado, M.; de Pablo, J.M. Automated speech recognition in ATC environment. In Proceedings of the 2nd International Conference on Application and Theory of Automation in Command and Control Systems, London, UK, 29–31 May 2012; pp. 46–53. [Google Scholar]
- Chen, S.; Kopald, H.D.; Elessawy, A.; Levonian, Z.; Tarakan, R.M. Speech inputs to surface safety logic systems. In Proceedings of the IEEE/AIAA 34th Digital Avionics Systems Conference (DASC), Prague, Czech Republic, 13–17 September 2015. [Google Scholar]
- Chen, S.; Kopald, H.D.; Chong, R.; Wei, Y.; Levonian, Z. Read back error detection using automatic speech recognition. In Proceedings of the 12th USA/Europe Air Traffic Management Research and Development Seminar (ATM2017), Seattle, WA, USA, 26–30 June 2017. [Google Scholar]
- Updegrove, J.A.; Jafer, S. Optimization of Air Traffic Control Training at the Federal Aviation Administration Academy. Aerospace 2017, 4, 50. [Google Scholar] [CrossRef] [Green Version]
- Helmke, H.; Ohneiser, O.; Buxbaum, J.; Kern, C. Increasing ATM Efficiency with Assistant Based Speech Recognition. In Proceedings of the 12th USA/Europe Air Traffic Management Research and Development Seminar (ATM2017), Seattle, WA, USA, 26–30 June 2017. [Google Scholar]
- Helmke, H.; Rataj, J.; Mühlhausen, T.; Ohneiser, O.; Ehr, H.; Kleinert, M.; Oualil, Y.; Schulder, M. Assistant-Based Speech Recognition for ATM Applications. In Proceedings of the 11th USA/Europe Air Traffic Management Research and Development Seminar (ATM2015), Lisbon, Portugal, 23–26 June 2015. [Google Scholar]
- Traoré, M.; Hurter, C. Exploratory study with eye tracking devices to build interactive systems for air traffic controllers. In Proceedings of the International Conference on Human-Computer Interaction in Aerospace (HCI-Aero’16), Paris, France, 14–16 September 2016; ACM: New York, NY, USA, 2016. [Google Scholar]
- Merchant, S.; Schnell, T. Applying Eye Tracking as an Alternative Approach for Activation of Controls and Functions in Aircraft. In Proceedings of the 19th Digital Avionics Systems Conference (DASC), Philadelphia, PA, USA, 7–13 October 2000. [Google Scholar]
- Hurter, C.; Lesbordes, R.; Letondal, C.; Vinot, J.L.; Conversy, S. Strip’TIC: Exploring augmented paper strips for air traffic controllers. In Proceedings of the International Working Conference on Advanced Visual Interfaces, Capri Island, Italy, 22–26 May 2012; ACM: New York, NY, USA, 2012; pp. 225–232. [Google Scholar]
- Alonso, R.; Causse, M.; Vachon, F.; Parise, R.; Dehais, F.; Terrier, P. Evaluation of head-free eye tracking as an input device for air traffic control. Ergonomics 2013, 2, 246–255. [Google Scholar] [CrossRef] [Green Version]
- Westerman, W.C. Hand Tracking, Finger Identification, and Chordic Manipulation on a Multi-Touch Surface. Ph.D. Thesis, University of Delaware, Newark, DE, USA, 1999. Available online: https://resenv.media.mit.edu/classarchive/MAS965/readings/Fingerwork.pdf (accessed on 14 January 2020).
- Seelmann, P.-E. Evaluation of an eye tracking and multi-touch based operational concept for a future multimodal approach controller working position, original German title: Evaluierung eines Eyetracking und Multi-Touch basierten Bedienkonzeptes für einen zukünftigen multimodalen Anfluglotsenarbeitsplatz. Bachelor’s Thesis, Technische Universität Braunschweig, Braunschweig, Germany, 2015. [Google Scholar]
- Jauer, M.-L. Multimodal Controller Working Position, Integration of Automatic Speech Recognition and Multi-Touch Technology, original German title: Multimodaler Fluglotsenarbeitsplatz, Integration von automatischer Spracherkennung und Multi-Touch-Technologie. Bachelor’s Thesis, Technische Universität Braunschweig, Braunschweig, Germany, 2014. [Google Scholar]
- Prakash, A.; Swathi, R.; Kumar, S.; Ashwin, T.S.; Reddy, G.R.M. Kinect Based Real Time Gesture Recognition Tool for Air Marshallers and Traffic Policemen. In Proceedings of the 2016 IEEE 8th International Conference on Technology for Education (T4E), Mumbai, India, 2–4 December 2016; pp. 34–37. [Google Scholar]
- Singh, M.; Mandal, M.; Basu, A. Visual gesture recognition for ground air traffic control using the Radon transform. In Proceedings of the 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, Edmonton, AB, Canada, 2–6 August 2005; pp. 2586–2591. [Google Scholar]
- Savery, C.; Hurter, C.; Lesbordes, R.; Cordeil, M.; Graham, T.C.N. When Paper Meets Multi-touch: A Study of Multi-modal Interactions in Air Traffic Control. In Human-Computer Interaction—INTERACT 2013; Kotzé, P., Marsden, G., Lindgaard, G., Wesson, J., Winckler, M., Eds.; Lecture Notes in Computer Science; Springer: Berlin/Heidelberg, Germany, 2013; Volume 8119, pp. 196–213. [Google Scholar]
- Mertz, C.; Chatty, S.; Vinot, J.-L. Pushing the limits of ATC user interface design beyond S&M interaction: The DigiStrips experience. In Proceedings of the 3rd USA/Europe Air Traffic Management Research and Development Seminar (ATM2000), Naples, Italy, 3–6 June 2000. [Google Scholar]
- EUROCONTROL. E-OCVM Version 3.0 Volume I—European Operational Concept Validation Methodology; EUROCONTROL: Brussels, Belgium, 2010. [Google Scholar]
- NASA. Technology Readiness Level Definitions. n.d. Available online: https://www.nasa.gov/pdf/458490main_TRL_Definitions.pdf (accessed on 14 January 2020).
- SESAR Joint Undertaking. Introduction to the SESAR 2020 Programme Execution. 2015. Available online: https://ec.europa.eu/research/participants/data/ref/h2020/other/guides_for_applicants/jtis/h2020-pr-exec-intro-er-sesar-ju_en.pdf (accessed on 14 January 2020).
- Nielsen, J. Usability Engineering; Academic Press: Boston, MA, USA, 1993. [Google Scholar]
- DIN EN ISO 9241-11:2016. Ergonomics of Human-System-Interaction—Part 11: Usability: Definitions and Concepts; ISO: Geneva, Switzerland, 2017. [Google Scholar]
- Chen, Y.-H.; Germain, C.A.; Rorissa, A. An Analysis of Formally Published Usability and Web Usability Definitions. Proc. Am. Soc. Inf. Sci. Technol. 2009, 46, 1–18. [Google Scholar] [CrossRef]
- Shackel, B. The concept of usability. In Visual Display Terminals: Usability Issues and Health Concerns; Bennett, J.L.; Carver, D.C.; Sandelin, J.S.; Smith, M., Eds.; Prentice-Hall: Englewood Cliffs, NJ, USA, 1984; pp. 45–88. [Google Scholar]
- Shackel, B. Usability—Context, Framework, Definition, Design and Evaluation. In Human Factors for Informatics Usability; Shackel, B., Richardson, S., Eds.; Cambridge University Press: Cambridge, UK, 1991; pp. 21–38. [Google Scholar]
- Maguire, M. Methods to support human-centred design. Int. J. Hum.-Comput. Stud. 2001, 55, 587–634. [Google Scholar] [CrossRef]
- Weinschenk, S. Usability: A Business Case, Human Factors International. White Paper. 2005. Available online: https://humanfactors.com/downloads/whitepapers/business-case.pdf (accessed on 14 January 2020).
- Seebode, J.; Schaffer, S.; Wechsung, I.; Metze, F. Influence of training on direct and indirect measures for the evaluation of multimodal systems. In Proceedings of the Tenth Annual Conference of the International Speech Communication Association (INTERSPEECH2009), Brighton, UK, 6–10 September 2009. [Google Scholar]
- Nielsen, J.; Levy, J. Measuring usability: Preference vs. performance. Commun. ACM 1994, 4, 66–75. [Google Scholar] [CrossRef]
- Xu, Y.; Mease, D. Evaluating web search using task completion time. In Proceedings of the 32nd international ACM SIGIR Conference on Research and Development in Information Retrieval (SIGIR’09), Boston, MA, USA, 19–23 July 2009; ACM: New York, NY, USA, 2009; pp. 676–677. [Google Scholar]
- Wechsung, I. What Are Multimodal Systems? Why Do They Need Evaluation? Theoretical Background. In An Evaluation Framework for Multimodal Interaction; T-Labs Series in Telecommunication Services; Springer: Cham, Switzerland, 2014; pp. 7–22. [Google Scholar] [CrossRef]
- Landauer, T.K. Research methods in human-computer interaction. In Handbook of Human-Computer Interaction; Elsevier: Amsterdam, The Netherlands, 1988; pp. 905–928. [Google Scholar]
- Virzi, R.A. Refining the Test Phase of Usability Evaluation: How Many Subjects is Enough? Hum. Factors 1992, 4, 457–468. [Google Scholar] [CrossRef]
- Nielsen, J. Why You Only Need to Test with 5 Users. 2000. Available online: https://www.nngroup.com/articles/why-you-only-need-to-test-with-5-users/ (accessed on 14 January 2020).
- Schmettow, M. Sample size in usability studies. Commun. ACM 2012, 4, 64–70. [Google Scholar] [CrossRef]
- Ajzen, I.; Fishbein, M. Understanding Attitudes and Predicting Social Behavior; Prentice-Hall: Englewood Cliffs, NJ, USA, 1980. [Google Scholar]
- Davis, F.D. Perceived Usefulness, Perceived Ease of Use, and User Acceptance of Information Technology. MIS Q. 1989, 3, 319–340. [Google Scholar] [CrossRef] [Green Version]
- Davis, F.D.; Bagozzi, R.P.; Warshaw, P.R. User Acceptance of Computer Technology: A Comparison of Two Theoretical Models. Manag. Sci. 1989, 8, 982–1003. [Google Scholar] [CrossRef] [Green Version]
- Davis, F.D.; Venkatesh, V. A critical assessment of potential measurement biases in the technology acceptance model: Three experiments. Int. J. Hum.-Comput. Stud. 1996, 45, 19–45. [Google Scholar] [CrossRef] [Green Version]
- Yousafzai, S.Y.; Foxall, G.R.; Pallister, J.G. Technology acceptance: A meta-analysis of the TAM: Part 1. J. Model. Manag. 2007, 3, 251–280. [Google Scholar] [CrossRef]
- Kim, H.; Kankanhalli, A. Investigating User Resistance to Information Systems Implementation: A Status Quo Bias Perspective. MIS Q. 2009, 3, 567–582. [Google Scholar] [CrossRef] [Green Version]
- Markus, M.L. Power, politics, and MIS implementation. Commun. ACM 1983, 6, 430–444. [Google Scholar] [CrossRef]
- Likert, R. A Technique for the Measurement of Attitudes. Arch. Psychol. 1932, 140, 5–55. [Google Scholar]
- Davis, F.D. A Technology Acceptance Model for Empirically Testing New End-User Information Systems: Theory and Results. Ph.D. Thesis, Massachusetts Institute of Technology, Cambridge, MA, USA, 1985. Available online: https://dspace.mit.edu/bitstream/handle/1721.1/15192/14927137-MIT.pdf (accessed on 14 January 2020).
- Doll, W.J.; Hendrickson, A.; Deng, X. Using Davis’s perceived usefulness and ease-of-use instrument for decision making: A confirmatory and multi-group invariance analysis. Decis. Sci. 1998, 4, 839–869. [Google Scholar] [CrossRef]
- Jackson, T.F. System User Acceptance Thru System User Participation. In Proceedings of the Annual Symposium on Computer Application in Medical Care; American Medical Informatics Association: Bethesda, MD, USA, 1980; Volume 3, pp. 1715–1721. [Google Scholar]
- Lin, W.T.; Shao, B.B.M. The relationship between user participation and system success: A simultaneous contingency approach. Inf. Manag. 2000, 27, 283–295. [Google Scholar] [CrossRef]
- Luna, D.R.; Lede, D.A.R.; Otero, C.M.; Risk, M.R.; de Quirós, F.G.B. User-centered design improves the usability of drug-drug interaction alerts: Experimental comparison of interfaces. J. Biomed. Inform. 2017, 66, 204–213. [Google Scholar] [CrossRef]
- Kujala, S. User involvement: A review of the benefits and challenges. Behav. Inf. Technol. 2003, 1, 1–16. [Google Scholar] [CrossRef]
- König, C.; Hofmann, T.; Bruder, R. Application of the user-centred design process according ISO 9241-210 in air traffic control. Work 2012, 41, 167–174. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- DLR Institute of Flight Guidance. TriControl—Multimodal ATC Interaction. 2016. Available online: http://www.dlr.de/fl/Portaldata/14/Resources/dokumente/veroeffentlichungen/TriControl_web.pdf (accessed on 14 January 2020).
- Ohneiser, O. RadarVision—Manual for Controllers, Original German Title: RadarVision—Benutzerhandbuch für Lotsen; Internal Report 112-2010/54; German Aerospace Center, Institute of Flight Guidance: Braunschweig, Germany, 2010. [Google Scholar]
- Brooke, J. SUS: A “quick and dirty” usability scale. In Usability Evaluation in Industry; Jordan, P.W., Thomas, B., Weerdmeester, B.A., McClelland, I.L., Eds.; Taylor & Francis: London, UK, 1996; pp. 189–194. [Google Scholar]
- Brooke, J. SUS: A Retrospective. J. Usability Stud. 2013, 2, 29–40. [Google Scholar]
- Sauro, J.; Lewis, J.R. When Designing Usability Questionnaires, Does It Hurt to Be Positive? In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Vancouver, BC, Canada, 7–12 May 2011; pp. 2215–2224. [Google Scholar]
- Bangor, A.; Kortum, P.T.; Miller, J.T. An Empirical Evaluation of the System Usability Scale. Int. J. Hum.-Comput. Interact. 2008, 24, 574–594. [Google Scholar] [CrossRef]
- Bangor, A.; Kortum, P.T.; Miller, J.T. Determining What Individual SUS Scores Mean: Adding an Adjective Rating Scale. J. Usability Stud. 2009, 4, 114–123. [Google Scholar]
- Brinkman, W.-P.; Haakma, R.; Bouwhuis, D.G. Theoretical foundation and validity of a component-based usability questionnaire. Behav. Inf. Technol. 2009, 28, 121–137. [Google Scholar] [CrossRef]
- Jakobi, J. Prague A-SMGCS Test Report. 2010. Available online: http://emma2.dlr.de/maindoc/2-D631_PRG-TR_V1.0.pdf (accessed on 14 January 2020).
- Bishop, P.A.; Herron, R.L. Use and Misuse of the Likert Item Responses and Other Ordinal Measures. Int. J. Exerc. Sci. 2015, 3, 297–302. [Google Scholar]
- Burnard, P.; Gill, P.; Stewart, K.; Treasure, E.; Chadwick, B. Analysing and presenting qualitative data. Br. Dent. J. 2008, 8, 429–432. [Google Scholar] [CrossRef] [PubMed]
- Nørgaard, M.; Hornbæk, K. What do usability evaluators do in practice? An explorative study of think-aloud testing. In Proceedings of the 6th conference on Designing Interactive systems, University Park, PA, USA, 26–28 June 2006; ACM: New York, NY, USA, 2006; pp. 209–218. [Google Scholar]
- Battleson, B.; Booth, A.; Weintrop, J. Usability Testing of an Academic Library Web Site: A Case Study. J. Acad. Librariansh. 2001, 3, 188–198. [Google Scholar] [CrossRef]
No. | System Usability Scale items (“I …”) ¹ | M (N = 14, all ATCOs) | SD | M (N = 4, active APP ATCOs) | SD
---|---|---|---|---|---
S01 | think that I would like to use the system frequently. | 2.1 | 1.5 | 3.5 | 0.6
S02 | found the system unnecessarily complex. ² | 2.6 | 1.2 | 3.3 | 0.5
S03 | thought the system was easy to use. | 2.5 | 1.2 | 3.3 | 0.5
S04 | think that I would need the support of a technical person to be able to use the system. ² | 2.7 | 1.3 | 3.3 | 1.0
S05 | found the various functions in the system were well integrated. | 2.3 | 1.1 | 3.0 | 0.8
S06 | thought there was too much inconsistency in the system. ² | 2.4 | 1.2 | 3.5 | 0.6
S07 | would imagine that most people would learn to use the system very quickly. | 2.2 | 1.1 | 2.5 | 1.0
S08 | found the system very cumbersome to use. ² | 2.5 | 1.6 | 4.0 | 0.0
S09 | felt very confident using the system. | 2.1 | 1.1 | 3.0 | 0.0
S10 | needed to learn a lot of things before I could get going with the system. ² | 2.9 | 1.1 | 2.5 | 1.7
 | Total SUS score | 60.9 | 21.9 | 79.4 | 9.7
S11 | found that TriControl multitouch gestures for command selection are intuitive and easy to learn. | 2.8 | 1.2 | 3.5 | 0.6
S12 | think that the use of the eye-tracking feature for selecting aircraft is disturbing. ² | 2.3 | 1.4 | 2.5 | 1.0
S13 | think that automatic speech recognition is a good way to enter values. | 2.2 | 1.4 | 2.8 | 1.5
S14 | found the use of multiple modalities (eye gaze, gestures, speech) too demanding. ² | 2.6 | 1.2 | 3.0 | 1.2
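The per-item means above are already expressed as score contributions on the 0–4 scale used by standard SUS scoring (Brooke, 1996), which is why the S01–S10 means sum to roughly 24.3 and scale to the reported total of about 61. As a minimal sketch of that standard scoring procedure (not code from the study itself), applied to one participant's raw 1–5 responses:

```python
def sus_score(responses):
    """Compute the System Usability Scale score from ten item
    responses on a 1-5 Likert scale (Brooke, 1996).

    Odd-numbered items (S01, S03, ...) are positively worded:
    contribution = response - 1. Even-numbered items are negatively
    worded: contribution = 5 - response. The summed contributions
    (range 0-40) are multiplied by 2.5 to yield a 0-100 score.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # index 0 corresponds to item S01
        for i, r in enumerate(responses)
    ]
    return sum(contributions) * 2.5

# A neutral rating of 3 on every item yields the midpoint score of 50.
print(sus_score([3] * 10))  # 50.0
```

Note that items S11–S14 are TriControl-specific additions and are excluded from the SUS total, which is computed over S01–S10 only.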
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Ohneiser, O.; Biella, M.; Schmugler, A.; Wallace, M. Operational Feasibility Analysis of the Multimodal Controller Working Position “TriControl”. Aerospace 2020, 7, 15. https://doi.org/10.3390/aerospace7020015