Article

New Approaches for the Use of Extended Mock-Ups for the Development of Air Traffic Controller Working Positions

by Lennard Nöhren 1,*, Lukas Tyburzy 1, Marco-Michael Temme 1, Kathleen Muth 1, Thomas Hofmann 2, Deike Heßler 2, Felix Tenberg 2, Eilert Viet 3 and Michael Wimmer 3
1 Institute of Flight Guidance, German Aerospace Center (DLR), 38108 Braunschweig, Germany
2 Faculty of Engineering and Computer Science, University of Applied Sciences Osnabrück, 49076 Osnabrück, Germany
3 Frequentis Orthogon, 28207 Bremen, Germany
* Author to whom correspondence should be addressed.
Aerospace 2025, 12(2), 114; https://doi.org/10.3390/aerospace12020114
Submission received: 13 November 2024 / Revised: 26 January 2025 / Accepted: 29 January 2025 / Published: 31 January 2025
(This article belongs to the Special Issue Future Airspace and Air Traffic Management Design)

Abstract

Today, integrating new functions into air traffic controller working positions or developing completely new displays is a time-consuming and expensive process. The users are often only included during the concept phase and after the main development phase is completed. Therefore, they do not have the chance to influence the design and development process by giving structured feedback. Any subsequent changes to the system after completing the main development phase are expensive and slow. This paper proposes a new approach that integrates designers and users more tightly into the development process for digital air traffic control systems. By creating and reviewing realistic mock-ups in small iterative steps, the look and feel of future support functions can be validated in advance of the actual implementation and easily adapted if changes are requested. As a case study, we performed a series of steps to evaluate the new workflow, including idea development, design, validation, and implementation into the target system. In a validation campaign with air traffic controllers, the developed design and functionalities received very positive feedback, and the new workflow was successfully applied and evaluated.

1. Introduction

The integration of extended functionalities and displays into air traffic controller working positions (CWP) represents a time-consuming and cost-intensive process [1]. This applies in particular if air navigation service providers (ANSP) are exposed to increased cost pressure as a result of privatization [2]. Air traffic controllers as end users are often only involved during the concept phase and later as testers after the main development phase has been completed. Therefore, they have little opportunity to influence the embodiment process through structured feedback [3]. Once the main development phase has been completed, elementary changes to the system are expensive and lengthy [4]. There is a risk that useful and necessary changes will only be implemented late or not at all.
By creating and using realistic and interactive mock-ups with basic functionalities, the appearance and handling of future support functions should be validated in small iterative steps prior to actual implementation. They can then be easily adapted in the event of changes. This process is not new in industry, but it is not yet used in some safety-critical areas, such as air traffic control [5]. Our approach shows that the use of mock-ups is also safe for ANSP applications and should be considered in this domain in order to develop software faster and more efficiently to bring it into operational use.
This article presents an approach implemented in the Envision project (Erprobungsplattform für Nutzerzentrierte Visualisierungs- und Interaktionskonzepte—Experimental platform for user-centered visualization and interaction concepts), in which developers, designers, and controllers are more closely involved in the development process for the digitization of the air traffic management process [6]. With the aim of improving the development workflow, a complete development cycle was run through for a CWP prototype, followed by a selection of current and conceptual functional enhancements, in order to evaluate both the path towards this aim and the achievement of the objective. Iterative evaluations and further developments were accompanied by frequent workshops with air traffic controllers (ATCO), who were regularly involved in the design process as test persons, both at the German Aerospace Center (DLR) laboratories and online.
The focus of this paper is on the development workflow of radar displays (primary displays), which are used in area control centres today and will become even more important in the future when more advanced controller support software is used. These displays provide controllers with continuous guidance on the organisation of air traffic as well as additional information. As new support functions are developed, they must be seamlessly integrated into the display and, of course, into the development workflow. At the same time, it is important to avoid overloading the display; otherwise, there is a risk that important information will be overlooked, leading to a loss of situational awareness [7].

1.1. Controller Working Position

For visual monitoring, centre controllers have radar equipment at their disposal. Depending on the system used, it projects an overview of the traffic situation in the airspace onto a display every five seconds or more frequently. The pure position data are supplemented and processed by additional information from the on-board Automatic Dependent Surveillance—Broadcast (ADS-B) transponders, Controller Pilot Data Link Communication (CPDLC) information and flight plan data, as well as information from other control centres. In addition to current flight situation data, this may also include current climb or descent rates and actual clearances.
Most workplaces in air traffic control centres in Europe are team workstations where executives and planners organise air traffic safely and efficiently in their assigned sector together. The executive with direct radio contact to the flight crew is dependent on ad hoc support from the systems, while the planner, acting as a coordinator between their own and neighbouring sectors, tends to benefit more from tactical to pre-tactical tools. The software they currently use is developed by commercial companies and adapted to the requirements of the respective centre in the various countries. In addition to the pure display of information, this includes, for example, support systems for Medium-term Conflict Detection (MTCD) [8] and Short-term Conflict Alert (STCA) [9]. Air traffic controller support systems available today include Arrival Manager (AMAN), Terminal Sequencing and Spacing (TSAS), Departure Manager (DMAN) and Advanced Surface Movement Guidance and Control Systems (A-SMGCS).
These support systems will only reach their full potential when they are coupled with each other and thus enable global optimization across the entire air traffic management system [10]. The operation of these additional systems poses a particular challenge for controllers, as they must always be operated using mouse, pen, and keyboard for data exchange and functional interactions. Many aircraft-specific tools like distance measuring are shown and operated on the same primary display. At the same time, in order to prevent controllers from overlooking aberrant flight manoeuvres or impending conflicts, additional data presentations and embedded or superimposed data frames shall not obscure any traffic-relevant views. This applies in particular to meteorological displays of severe weather situations in the CWP [11]. Another challenge is the requirement to provide only information that is relevant to controllers at that particular moment.
DLR’s participation in numerous national and international projects with controllers and ANSPs has shown that until now, the users have been neglected at the start of the development process for new air traffic controller working positions. User involvement is particularly important when developing new, future-oriented HMIs in order to implement useful improvements [12]. Nevertheless, establishing a thoughtfully structured workflow for developing the new interface in the project is essential, as effective design with a carefully crafted user interface [13] helps in managing the complexity of a system [7]. The problem could be exacerbated by the introduction of single controller operations (SCO) [14] or flight-centric procedures [15]. These will require completely new working environments for air traffic controllers.

1.2. Software Development in the ATC-Domain

All of the above criteria and constraints mean that the development and implementation of new support systems for centre controller workstations is very complex, time-consuming and safety-critical. Every software change has to be thoroughly analysed in terms of its usefulness for the controllers and its safety for flight operations. From the developer’s point of view, it is necessary to test each system, information display, and new control function with the controllers and to obtain feedback on how much the function meets the user’s criteria. As controller support functions and their specific displays can currently only be tested in realistic Human-in-the-Loop simulations and real operation after their complex implementation in a controller working position, this inevitably leads to systems and displays having to be adapted and optimised step by step in several iterations with a high expenditure of time.
In order to simplify and accelerate this adaptation process, it is necessary to test and optimise new concepts before starting the software development. The use of a mock-up system should be established for fundamental innovations that cannot be based on an existing system to reduce development cost [16]. It should have the same look and feel as the final system, but should be quick and easy to adapt and change. The actual functionalities are not implemented in the system until the user interface has been evaluated with the controllers and all displays have been harmonised with them. In this paper, we propose a novel CWP development process for the ATC-domain that includes a user-centred, iterative HMI design phase using design mock-ups that precedes the complex software implementation process.

2. CWP Development Workflow

To improve ATC software development, a new workflow is proposed in this work. This chapter will begin by describing the typical project execution currently employed in the ATC domain. Afterwards, it will present alternative workflow models used in other fields. Problems that can occur when applying these workflows to ATC software development are outlined. Finally, a new and improved approach that is tailored to the special conditions in the ATC domain is presented.

2.1. Current Project Development Workflow

This section reflects the experience of Frequentis Orthogon from 20 years of project execution with worldwide customers in the ATC domain. An ATC project, in most cases, starts with a tender specification provided by a customer directly to a system supplier or as a public procurement call (see e.g., [17]). The tender specification already contains requirements for an HMI. These requirements often describe existing systems with some extensions and/or modifications. The overall tender description is technical and not user-oriented. The tender is answered by the industry by providing an offer containing product requirements, screenshots of existing solutions and sketches of HMIs to be developed. After being awarded the contract, the solution provider starts a specification phase together with the customer. The project management of the solution provider and the customer share the goal of defining the HMI specification as quickly as possible. The specification phase typically includes an HMI workshop where customer and solution provider discuss the existing HMI and the solution provider presents design ideas for new HMI elements. Both parties share the problem of limited resources. Because of this, on the customer side very few controllers participate in the workshop; often not even all the different user groups of tower, approach, and sector controllers are involved. During the HMI workshop, design ideas are illustrated by the system supplier in the form of wire frames or drawings but in most cases without an HMI prototype. The controllers decide on the solutions presented and may raise concerns, but the users are not encouraged to introduce their own ideas. Action items are collected and documented accordingly to track the open discussion points. In the following project period the action items are consolidated, e.g., by providing updates of the wire frames. Once both parties consider the HMI specification mature, the HMI developers start working on the new or changed HMI features. The customer may receive a limited number of early drops of the software to be able to check the development before the first acceptance test. The software is checked by one or all workshop participants who are familiar with the new features, and feedback is returned to the solution provider. Figure 1 shows an overview of the currently used workflow.
The overall process is working well, but it is not optimal due to the following deficits:
  • The process is not user-oriented. The tender does not include work flows or use cases because the tender serves the purpose of formally describing a system as a basis for a commercial offer. The same principle applies to the specification phase of the supplier which aims at reaching a solution agreement as quickly and as inexpensively as possible. Overall, the controllers, as users of the system, play only a subordinate role in the workflow.
  • The decision-making process during one or two days of HMI workshops is insufficient, often rushing towards a consensus without allowing controllers sufficient time to weigh the advantages and disadvantages of new workflows presented.
  • Technical abstractions are challenging for controllers who typically do not have a software development background, making wire frames, drawings, use cases, and requirements too abstract to engage with effectively.
Due to the missing user orientation, the specified HMI workflow may not fully meet controller needs when the HMI development is started. This is a problem for the solution provider, as significant errors must be detected in the early stages of design to prevent costly re-designs. It is a problem for the customer, as minor design changes at a later project stage are discouraged when they require changes to formal specifications and fixed agreements [18].

2.2. Agile Approach

The deficits described in the previous section are typical for a waterfall model. The workflow could be improved by using an agile project approach, which is common in other sectors [19]. An agile project focuses on iterative development, customer collaboration, cross-functional teams, adaptability, and continuous improvement. Projects are divided into small iterations, each resulting in a working product increment (Figure 2). Continuous feedback from customers ensures the product meets their needs [20].
Translated to ATC HMI development, this approach involves HMI designers and the customer's controllers as part of the team. However, the ATC industry faces unique challenges in directly implementing agile methodologies. These include the following:
  • Conducting repeated HMI workshops with a larger group of controllers is impractical due to financial, temporal, and resource constraints. Neither the supplier nor the customers are able to support multiple, extensive workshops.
  • Creating multiple HMI prototypes is prohibitively expensive. The presentation of HMI prototypes requires either face-to-face interactions between controllers and developers or the ability for controllers to independently operate prototype software. Given that the supplier’s clientele is globally dispersed, frequent travel for HMI feature development is unfeasible. Additionally, ATC security policies typically prevent controllers from installing software on their systems.
  • ATC tender specifications often clash with agile methods due to the fixed formal specifications, pre-defined timelines, fixed prices, and missing customer commitment.
  • There is insufficient continuous work to justify a full-time HMI designer position within the scope of a project. Moreover, a designer’s creativity may stagnate over time without fresh challenges. On the other hand, a good HMI designer knows the customer domain, and the ATC user domain requires very specific know-how.
These issues highlight the complexities of optimizing HMI development within the ATC industry’s constraints.

2.3. User-Centred Design

Since the requirements within the process of creating software for air traffic control change frequently [21], it makes sense to choose a workflow that remains flexible during the course of the project and allows for adjustments at any time. A frequently referenced design process in industrial design is “user-centred design”. This is based on DIN EN ISO 9241-210 [22], which addresses the user-centred design of products and software. The model, which can be adapted to the individual processes, provides six approaches for adapting a product to the user. This adaptation can lead to increasing user productivity, reducing the amount of training required, increasing usability and accessibility, improving the user experience and reducing stress. These approaches include understanding the user, tasks, and working environment, involving the user throughout the entire development and design process, continuously evaluating and adapting design solutions, providing for iterations, taking the user experience into account, and linking and simplifying interdisciplinary skills within the design team. In particular, the provision of iterations is repeatedly emphasised. The focus here is on the principles of usability in accordance with DIN EN ISO 9241-11 [23]: effectiveness, efficiency, and satisfaction. As depicted in Figure 3, user-centred design, just like the agile approach, is also based on an iterative procedure.
In the air traffic control context, it has been found that it is often difficult for users to communicate their mental model [24]. The way in which the user can be integrated into the project varies greatly depending on the use case and therefore there is no standard solution that explains how users can best be integrated. The iterative approach and the prioritization of the user often prove to be a challenge in practice. Every now and then, a clear step forward must be taken instead of another iteration loop in order to be able to implement clear results and keep development costs low [25]. Furthermore, the user cannot be constantly involved in the process. As almost all international air navigation service providers (ANSP) are currently suffering from a shortage of controllers and trainee controllers, ATCOs are not available in large numbers and are not constantly available for scientific research or extensive ANSP-internal developments [26,27]. As a result, the integration of the user must be well-prepared and thus quickly and easily accessible.

2.4. The New Workflow

As described in Section 2.2 and Section 2.3, there are challenges in applying established agile and user-centered design models to the air traffic control (ATC) domain. To address these issues, the project developed a process that integrates both iterative and user-centered approaches and embeds the combined result in a linear model, similar to the one described in Section 2.1. While such mixed processes are common in software development, the iterative approach is still relatively uncommon in ATC software implementation. Figure 4 illustrates the new workflow with an iterative design loop based on design mock-ups followed by a linear implementation and acceptance phase.
It is essential not to adhere too rigidly to ISO 9241-210 [22], but to adopt an even more iterative and recursive procedure in the design process [25], which reflects an agile approach [24]. By expanding the team into an interdisciplinary group of developers, designers, SMEs and air traffic controllers, more diverse expertise is achieved. This broadly diversified consortium improves the communication and the coordination within the team [12]. Interdisciplinary workshops [28] are used to determine the expectations and perspectives of stakeholders [13], adjust the existing workflow with the project, and develop a template draft for a modified workflow. In dealing with users, workshops have also proven to be a productive tool in the past [28]. To understand the user, the status quo must first be explored [24]. The basis for this is a respectful and friendly approach to the user [29] to be able to empathise with them and make design decisions from their perspective [12]. Interactive design mock-ups are used to harmonise the ideas of the team with the mental model of the users and within the consortium. They allow an idea of the features and elements of the software to be obtained without investing a great deal of development effort [30]. Interactive design mock-ups include selected interactions that simulate planned software features, efficiently conveying the designed elements. When the process identifies areas for improvement or potential adaptations, the software is designed to allow future modifications, ensuring its long-term viability. Once users positively evaluate the design and feature ideas, these are transferred to the actual application through development. If users are unhappy with any of the elements or areas of the software, an iteration loop is integrated. The development team is responsible for creating all the technical interfaces and making the system functional, while the design team ensures usability and efficiency [13].
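To make the control flow of this process concrete, the following minimal Python sketch outlines the iterative design loop followed by the linear implementation hand-over described above. All function and type names are hypothetical placeholders for activities that are, in reality, carried out in workshops by the interdisciplinary team; the sketch is illustrative only.

```python
from dataclasses import dataclass, field


@dataclass
class Mockup:
    """Hypothetical stand-in for an interactive design mock-up (e.g., built in Figma)."""
    feature: str
    revision: int = 0
    open_issues: list = field(default_factory=list)


def evaluate_with_users(mockup: Mockup) -> list:
    """Placeholder for a review workshop with ATCOs; returns the issues they raise."""
    return []  # an empty list means the users accept the design


def revise(mockup: Mockup, issues: list) -> Mockup:
    """Placeholder for the design team reworking the mock-up based on feedback."""
    return Mockup(mockup.feature, mockup.revision + 1)


def implement(mockup: Mockup) -> str:
    """Placeholder for the linear implementation and acceptance phase."""
    return f"{mockup.feature} implemented after {mockup.revision} design iterations"


def develop_feature(feature: str, max_iterations: int = 5) -> str:
    mockup = Mockup(feature)
    for _ in range(max_iterations):           # iterative design loop on mock-ups only
        issues = evaluate_with_users(mockup)  # workshop with controllers and SMEs
        if not issues:
            break                             # design accepted by the users
        mockup = revise(mockup, issues)
    return implement(mockup)                  # hand-over to the target system


print(develop_feature("track label interactions"))
```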

3. Applying the New Workflow: A Case Study

To evaluate whether the new workflow proposed in Section 2.4 can be applied to the development of ATC software, a case study was performed and feedback from actual users (i.e., air traffic controllers) was collected. This chapter describes the different steps that were performed. Initially, the various stakeholders of the software development process held small workshops on general concepts and ideas. Based on the results, design proposals for an example en-route CWP were created and implemented as mock-ups. Afterwards, a validation campaign was performed in which user feedback on the mock-ups and the proposed workflow was collected in two steps: First, a main workshop was conducted in person at the DLR, and in a second step, a remote workshop was performed as a follow-up to check whether this is also a feasible way to perform user workshops. For both of these workshops, the design mock-ups were presented to the users and questionnaires were provided, focusing on the one hand on the contents of the mock-ups and on the other hand on the proposed workflow and the tools used. This way, the workflow could be evaluated twofold: directly by subjective user feedback and indirectly by checking whether users can evaluate a newly designed system based entirely on mock-ups.

3.1. Initial Phase and First Workshops

At the beginning of the project, questionnaires were distributed to all stakeholders in the project, in which questions were asked about the status quo and how people understood their roles in the project. Three interdisciplinary workshops were then organised. These were used to develop the process already described in Section 2.4 and to coordinate it with the consortium. Air traffic controllers, developers, designers, and subject matter experts (SMEs) were involved in the workshops. The aim of the workshops was to develop the stakeholders’ requirements and mental models [31] for the CWP, to define the exact workflows, and to establish communication channels, tools, and responsibilities. Regarding the design, the feedback from the ATCOs was particularly important; with regard to the development and workflows in the project, the requirements and ideas of the people involved in the development process were essential. The workflow was enriched by many meetings and workshops, which were held both remotely and on-site. The focus here was often on brainstorming and coordination within the team as well as with the user. The software packages Adobe XD and later Figma were used to create the prototypes, to which both the designers and the developers had access. The results agreed upon by the team and the user were then transferred to the actual software in order to keep the development effort as low as possible. With regard to the visual design, the design team used schematic illustrations of various concepts. Surveys and discussions were held with controllers from DLR and MUAC. They evaluated which representations are useful for the software and which design features have disadvantages. As such surveys do not yield generally valid statements and are always based on personal preferences, the controllers’ opinions were treated as a tendency and compared with other results, interviews, and the research literature.

3.2. Main Workshop

To evaluate the design proposals that were developed as described in the previous section, some examples were prepared as high-fidelity prototypes of a controller working position with some basic interactions for an evaluation workshop. Six test persons were invited for the evaluation. Three of them were active air traffic controllers from Germany and three were retired ATCOs (one German, two British). They had different levels of experience as air traffic controllers, as some of them had just finished their training and some had many years of experience. The workshop was split into two parts and each test person performed the workshop individually. In the first part, mock-ups were displayed on a demo working position (Figure 5) and the controllers were guided through them by the DLR development team. The individuals had up to five minutes to interact with each mock-up and ask questions. Afterwards, they received a questionnaire for the specific mock-up to give general feedback on the usability and quality of the features on the one hand and to answer mock-up-specific questions on the other.
Mock-ups were shown for five different topics:
  • Air situation display design
  • Track label design
  • Track label interactions
  • Layout
  • Advisories and conflicts
The mock-ups were a mixture of static images and interactive demos. Figure 6 depicts one of the mock-ups that was used to show the overall design of the air situation display. It contains a hover functionality that expands the track label of a flight for a more realistic feel.
After all mock-ups were examined by the users and the corresponding questionnaires were completed, the second part of the workshop began. In this part the controllers received two additional questionnaires in which they could evaluate the mock-up tool Figma and the new workflow. These questionnaires did not refer to the actual designs shown in the previous step, but only to the general idea of using design mock-ups to include users at an early stage and throughout the whole development process. At the end, there was some additional time for an open discussion and questions regarding design and workflow.
In the following weeks after completing the workshop, the feedback was evaluated and some of the recommendations were applied to new versions of mock-ups and considered during the following implementation of software prototypes.

3.3. Follow-Up Workshop

Hosting a workshop as described above comes at a high cost for multiple reasons: each participant needs to travel to the location, several supervisors are needed to help and guide the test persons and to take notes, and lastly it takes a lot of time, as only one test person can perform the workshop at a time. For these reasons, the cost of a workshop rises drastically with the number of participants. However, for a robust evaluation, it is necessary to involve as many participants as possible, and as described in Section 2.4, these iterative review workshops must be performed regularly. Therefore, a way to involve more participants regularly in such review workshops without increasing the cost even further was explored by performing a follow-up workshop remotely.
It is possible to simply send the workshop material (Figma mock-ups and online questionnaires) to the participants and let them perform the workshop on their own and in their own time. Nevertheless, there are some possible negative aspects to this process. If the mock-ups are not properly explained, the participants might not understand everything, and questions cannot be clarified as easily as in person. This can be somewhat alleviated by adding detailed explanations to the mock-ups. Another possible problem is that the mock-ups might not look as intended because of the different hardware conditions of the participants.
In order to test the feasibility of the remote workshop and to continue checking the usability of the proposed workflow, the attendees of the initial workshop were invited to participate in a remote follow-up workshop. In spite of the time constraints of ATCOs, half of the participants (i.e., three) agreed, and the remote workshop took place a few weeks after the main workshop. Organizing another in-person workshop in this short time frame would not have been possible. For the follow-up workshop, some new Figma mock-ups were created containing weather widgets and incorporating some of the feedback from the main workshop. Online questionnaires were created using the web-tool Lime Survey. The questionnaires were structured the same as in the main workshop (first part with one questionnaire per mock-up topic, second part in which the general workflow and the mock-up tool were targeted). The questions of the second part were kept exactly the same as in the initial workshop to enable an evaluation of how the opinion of the workflow and mock-up changed after the remote workshop. Since the users were to perform the workshop alone and without the guidance of any developers or designers, extensive descriptions of the mock-ups and possible interactions were created and combined with the questionnaires to guide the users through the evaluation process. The mock-ups of the follow-up workshop covered the following topics:
  • Inclusion of previous feedback
  • Weather in radar screen (three variants)
  • Weather advisories (two variants)
  • Weather in sidebar (two variants)
  • Permanent wind information (two variants)
Figure 7 shows a weather advisory mock-up that was used in the remote workshop. It displays a proposed route change to avoid a dangerous weather area and contains interactions to accept or decline the proposed detour route.

3.4. Questionnaires

This section presents the questionnaires that were used to evaluate the features, the evaluation workflow as well as the overall quality of the mock-ups and their suitability to evaluate CWP features.

3.4.1. User Experience Questionnaire

The User Experience Questionnaire (UEQ) is a widely used instrument for assessing the user experience of interactive products and systems [32]. The UEQ offers a structured approach to evaluate various aspects of the user experience, encompassing pragmatic and hedonic qualities. Pragmatic quality refers to how well a product meets its intended purpose, i.e., its effectiveness, efficiency, and usefulness, whereas hedonic quality pertains to how enjoyable or pleasurable the user experience is. The short-form version of the questionnaire (UEQ(S) [33]) further minimizes respondent burden and the length of the questionnaire while still providing robust insights into the user experience. Using this standardized questionnaire, a comparative evaluation is possible across different iterations of the design.
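As an illustration of how UEQ(S) responses can be turned into scale scores, the following sketch follows the published UEQ-S scoring scheme (eight items on a 7-point scale, shifted to the −3…+3 range, with items 1–4 forming the pragmatic and items 5–8 the hedonic scale). It assumes the responses have already been recoded so that higher values correspond to the positive pole of each item pair; the example values are hypothetical.

```python
import statistics


def ueqs_scores(item_values):
    """Compute pragmatic, hedonic, and overall UEQ-S scores for one participant.

    Assumes eight item responses on the 7-point scale (1..7), already recoded so
    that higher values mean a more positive impression.
    """
    assert len(item_values) == 8, "UEQ-S has eight items"
    shifted = [v - 4 for v in item_values]    # map 1..7 onto -3..+3
    pragmatic = statistics.mean(shifted[:4])  # items 1-4: pragmatic quality
    hedonic = statistics.mean(shifted[4:])    # items 5-8: hedonic quality
    overall = statistics.mean(shifted)        # overall user experience score
    return pragmatic, hedonic, overall


# Hypothetical example response of one participant
print(ueqs_scores([6, 6, 5, 7, 5, 6, 6, 5]))  # -> (2.0, 1.5, 1.75)
```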

3.4.2. Feature Questionnaire

In addition to the standardised UEQ scales, custom questions tailored to the specific context of ATC operations and the specific feature are incorporated into the questionnaire as open-ended questions. These questions delve deeper into domain-specific concerns, such as task relevance, information clarity, and system integration, augmenting the UEQ’s evaluative scope with targeted inquiries for the individual features.
Using the UEQ(S) questionnaire in combination with open-ended questions to facilitate discussion in our iterative design workflow, our aim is to systematically assess the impact of proposed features on the user experience of ATC working positions. The complete questionnaire not only enables quantitative measurement of user perceptions but also facilitates qualitative insights through open-ended responses, enriching our understanding of user needs and preferences.

3.4.3. Workflow Questionnaire

After the users had completed the questionnaires for each feature, they were asked to evaluate the entire workflow of using design mock-ups and questionnaires for feature evaluation by stating their agreement to six statements about the evaluation process on a scale from four (Completely Agree) to zero (Completely Disagree).
The statements about the overall evaluation workflow were as follows:
(A)
I can very well imagine being involved in the design process on a regular basis through the design evaluation process.
(B)
I find the evaluation process to be unnecessarily complex.
(C)
I can imagine most ATCOs understanding the evaluation process quickly.
(D)
I had enough information available to participate in the design evaluation process.
(E)
The mock-ups had enough functionality/contained enough information.
(F)
I think it is important that ATCOs are involved early in the design process.

3.4.4. Mock-Up Tool Questionnaire

The last questionnaire was used to evaluate the overall quality and usability of the mock-ups. Six additional statements were given to the users, who were again asked to state their agreement or disagreement.
The statements about using the mock-ups were as follows:
(A)
I find the mock-ups easy to use.
(B)
I think I would need technical support to use the mock-ups.
(C)
I imagine that most people will learn to master the mock-ups quickly.
(D)
I find the mock-ups very cumbersome to use.
(E)
I felt very confident using the mock-ups.
(F)
I had to learn a lot of things before I could work with the mock-ups.

4. Validation Results

This chapter summarises the results of the individual workshops with ATCOs and SMEs with regard to the workflow. The results of the UEQ(S) and open discussions are not analysed in detail in this work, as they are not relevant for the workflow evaluation. They are only presented briefly to evaluate whether the participants were able to properly evaluate the features based on the mock-ups.

4.1. Main Workshop Results

The results of the in-person main workshop will be presented first. They are divided into the User Experience Questionnaire (Short) Results, the summary of the Workflow Evaluation Questionnaire, the Mock-up Tool Evaluation Results and some Qualitative Results.

4.1.1. User Experience Questionnaire (Short) Results

With the evaluation of the different scales in the UEQ(S), a score for pragmatic quality, hedonic quality, and overall quality was calculated for each presented feature. The presented features included the air situation display, track label design, track label interaction, the layout with floating windows, the layout with the attached sidebar, and the various conflict advisories. Figure 8 shows the overall scores for each feature. The scores can be interpreted as follows: maximum and minimum values are 3 and −3, but values above 2 and below −2 are already extreme results that are only achieved rarely. Any value above 1.6 can be seen as an excellent result. Results between 0.8 and 1.6 are above average. Values between −0.8 and 0.8 can be seen as a neutral evaluation. Anything below that is a negative result [32].
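The interpretation bands quoted above can be expressed as a simple mapping. The following is only a convenience sketch of the thresholds from [32]; assigning the ambiguous boundary values at ±0.8 to the more favourable band is an assumption.

```python
def interpret_ueqs_score(score: float) -> str:
    """Map a UEQ(S) scale score (-3..+3) to the interpretation bands quoted above."""
    if score > 1.6:
        return "excellent"
    if score >= 0.8:     # 0.8 .. 1.6
        return "above average"
    if score >= -0.8:    # -0.8 .. 0.8
        return "neutral"
    return "negative"


print(interpret_ueqs_score(1.75))  # -> excellent
```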
All participants completed the UEQ(S), and the mock-ups presented for each feature were rated with an overall score above 1. This shows that the user experience was generally perceived very positively. Nevertheless, the differences in the scores could be analysed in more detail to determine which features could still be improved, enabling the iterative process of the proposed workflow.

4.1.2. Workflow Evaluation Questionnaire

Participants were given six specific statements about the workflow and about their view on contributing to the design of the CWP (see Section 3.4.3).
The statements were rated on a scale from zero (Completely Disagree) to four (Completely Agree), and the responses of the six participants were analysed. The results of this questionnaire can be found in Figure 9. A standard box plot is used, displaying the mean, the standard deviation, and outliers (o symbol).
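For reference, the following sketch shows how such Likert-style ratings can be summarised with the statistics reported below (mean and standard deviation per statement) and visualised as a box plot. The rating values are hypothetical placeholders, not the study data shown in Figure 9, and the use of the population standard deviation is an assumption.

```python
import statistics
import matplotlib.pyplot as plt

# Hypothetical ratings (0 = Completely Disagree .. 4 = Completely Agree)
# from six participants for statements A-F; not the actual study data.
ratings = {
    "A": [4, 4, 3, 4, 3, 3],
    "B": [1, 0, 1, 0, 1, 1],
    "C": [3, 3, 1, 4, 2, 3],
    "D": [3, 3, 2, 3, 3, 3],
    "E": [4, 3, 2, 4, 3, 3],
    "F": [4, 4, 4, 4, 4, 2],
}

for statement, values in ratings.items():
    mean = statistics.mean(values)
    stdev = statistics.pstdev(values)  # population standard deviation (assumption)
    print(f"Statement {statement}: mean = {mean:.2f}, stddev = {stdev:.2f}")

# Box plot of the agreement ratings per statement
plt.boxplot(list(ratings.values()))
plt.xticks(range(1, len(ratings) + 1), ratings.keys())
plt.ylabel("Agreement (0 = Completely Disagree, 4 = Completely Agree)")
plt.show()
```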
Participants expressed a strong willingness to be regularly involved in the design process (statement A), with a high mean score of 3.5 and a relatively low standard deviation of 0.76, indicating general agreement and consistency in their responses. Conversely, the complexity of the evaluation process (statement B) was rated with a low mean score of 0.67 and a standard deviation of 0.75, suggesting that most participants did not find the process of evaluating the mock-ups to be overly complex.
Regarding the understanding of the evaluation process (statement C), participants gave an average rating of 2.67 (stddev 1.25), indicating some variability in their perceptions. The availability of sufficient information (statement D) for participation was rated positively, with a mean of 2.83 and a standard deviation of 0.69, reflecting a general consensus that the information provided was adequate.
The functionality and informativeness of the mock-ups (statement E) received a favorable mean score of 3.17, with a standard deviation of 0.90, showing that participants generally found the mock-ups to be sufficiently detailed. Lastly, the importance of involving ATCOs early in the design process (statement F) was highly rated (mean of 3.5, stddev 1.12), underscoring the perceived value of early ATCO engagement.
In summary, the workshop questionnaire results highlight a positive outlook towards regular involvement in the design process and the adequacy of provided information and mock-ups.

4.1.3. Mock-Up Tool Evaluation Results

In the same way that the workflow itself was evaluated, the adequacy of the mock-ups was assessed by asking the users to rate their agreement with the statements (see Section 3.4.4). The results are presented in Figure 10.
Participants strongly agree (mean 3.83, stddev 0.37) that they find the mock-ups easy to use (statement A). This suggests a high level of satisfaction with the usability of the mock-ups. There is a moderate level of agreement with statements C and E (means = 3.33 and 3.17) that people will learn to master the mock-ups quickly and that participants felt confident using them. However, there is some variation in opinions regarding these statements, with standard deviations of 0.75 and 0.69, respectively. Most participants disagree (mean = 1.0, stddev 1.0) with statement B that they would need technical support to use the mock-ups. They also mostly disagree with statement D that the mock-ups are cumbersome to use (mean = 1.00, stddev 1.15). This might indicate that most participants feel comfortable using the mock-ups without assistance. This is confirmed by the strong disagreement with statement F (mean = 0.17, stddev 0.37) that a lot of things had to be learned to use the design mock-ups.
Overall, the participants generally agree that the design mock-ups are easy to use and can be mastered quickly. However, there is some variation in opinions on the need for technical support and confidence in using the mock-ups.

4.1.4. Qualitative Results

At the beginning of each test series, the participants were asked whether they had ever taken part in a similar iterative development process for a CWP or whether they could remember a similar process having taken place in their ANSP environment. This question was answered negatively by all participants.
After each test run, an open briefing was held to allow the ATCOs involved to ask questions, discuss comments, and make their own suggestions about the workstation and information display. These were not analysed quantitatively, but they show a clear trend as to which topics controllers are currently concerned about in their working environment and the typical workflow. A total of 152 comments and suggestions were noted, which could be divided into separate categories such as track labels, colours, or weather presentation.
This extensive amount of feedback shows that the mock-ups were sufficient to let the participants examine the presented features and to start discussions about possible improvements for the next iteration steps.

4.2. Follow-Up Workshop Results

The results of the remote follow-up workshop are briefly presented here and compared to the results of the main workshop to evaluate if such a remote workshop is a viable alternative and to outline some challenges.

4.2.1. Follow-Up Workshop User Experience Questionnaire (Short) Results

As before, the UEQ(S) questionnaires are evaluated first and the condensed results are displayed in Figure 11.
The results in the remote workshop are much more mixed than in the initial workshop. Some of the mock-ups were evaluated extremely well, while others were rated more critically. Nevertheless, these results give valuable insights into the user experience of the different features, enabling improvements in future development iterations.

4.2.2. Follow-Up Workshop Workflow Evaluation Questionnaire

For the workflow evaluation, the same statements were used in the remote workshop as in the initial workshop (see Section 3.4.3). This way the results can directly be compared. Figure 12 shows the evaluation of the workflow statements of the remote workshop.
The participants rated the statements in the follow-up workshop even more positively than in the initial workshop. Statements A, C, D and F all received the maximum rating from all participants. As one participant wished for more information for the mock-up, statement E received a score of 3.33, which is still a very high value and higher than in the initial workshop. Statement B, which was about the complexity of the workshop, only received a value of 0.33 with a standard deviation of 0.67, showing that all participants were easily able to conduct the workshop on their own.
A possible explanation for why the results in the follow-up workshop are even better might be that the participants had now conducted such a workshop for the second time and therefore became more familiar with the tools used and the process.

4.2.3. Follow-Up Workshop Qualitative Results

The participants of the remote workshop also had the opportunity to answer open questions to give more detailed feedback. Generally, all participants were able to successfully complete the workshop within the planned time of one hour and agreed that it is not too difficult to perform it on their own. The mock-ups were understandable thanks to the descriptions that were provided. There were few to no technical difficulties with the software that was used for the workshop, but some of the participants noted some difficulties due to the size of their computer screens.
Each participant stated that they would gladly take part in future workshops in this form, which shows that these types of workshops are engaging and interesting for the participants.
In total, the participants gave 69 answers to the open questions. Comparing this number of answers from the three participants (23 per participant) with the 152 answers from the six participants in the main workshop (roughly 25 per participant), it can be seen that the remote workshop was almost as effective at gathering open feedback as the in-person workshop. However, many of the text responses in the remote workshop were only short, direct answers, while most of the responses in the in-person workshop stemmed from discussions and can therefore be seen as higher-quality answers.

5. Discussion

The main workshop and workflow show the benefits of a proven concept, while remote workshops bring new opportunities with some challenges. Both types of workshops and the new workflow are discussed in the following sections.

5.1. Main Workshop and Workflow

The main workshop delivered very positive results for the evaluation of the proposed workflow. All participants were able to properly evaluate each mock-up and gave a lot of additional feedback in the open questions, showing that the concept of the evaluation using mock-ups works well. The direct questions for the main workshop were also answered very positively, showing that the participants agree with the idea of including ATCOs in the development process in the manner presented here. However, the relatively low number of participants, especially in the remote and therefore completely digital follow-up workshop, does not allow for statistically significant statements to be made about the degree of approval for the early involvement of controllers in the software development process. Nevertheless, the evaluation was a successful case study and a proof-of-concept that showed how the workflow could be executed. The designs developed in this work were used to show the workflow in action. The complete iterative methodology described in Section 2.4 was performed in a short time frame of a few months within the research project. This very fast progress, combined with the overwhelmingly positive direct feedback from the users and the indirect results from the large amount of feedback on the mock-ups, indicates that the presented workflow can indeed be utilized to improve and speed up the software development process in the ATC domain in the same way as in many other domains, where similar workflows are already standard procedures.
The standardised UEQ(S) evaluation of every mock-up also proved to be a good method to evaluate these types of mock-ups. All the participants were able to quickly understand the questionnaires and answer them without much need for assistance. The results collected for the mock-ups were generally also very positive in the main workshop. Most participants were impressed by the design of the mock-ups and would like to see similar designs in the software that they use on a daily basis. Additionally, the participants gave constructive feedback for possible improvements in the open questions on the mock-ups. Nevertheless, the questionnaires used need to undergo validity and reliability testing to ensure the reliability of the results.
In other software development domains, similar workflows have been successfully applied for a long time. Agile and user-centred software development are not new concepts [19,22]. However, in the domain of ATC software development they are, at least in Europe, not yet widely used, based on interviews with German ATCOs, subject matter experts, and ATC software developers. Therefore, introducing these concepts into this domain, adapting them to the specific constraints, and showing that they are feasible and advantageous to use would improve future software development processes for ATC software if they are applied by the industry.

5.2. Remote Workshop

The remote workshop, as described in Section 3.3, was successfully completed by every participant without any further assistance. All questionnaires were filled in by each participant and a lot of further feedback was given for the open questions. The number of answers to the open questions per participant in the remote workshop was very similar to the number of answers per participant in the main workshop. This shows that such a type of workshop is feasible after an initial introduction to the topic. Therefore, it alleviates the challenge of the low availability of air traffic controllers, which is one of the biggest difficulties with classical agile or user-centred approaches when applied to the ATC domain, as explained in Section 2.
The results of the UEQ(S) evaluation were much more mixed than in the main workshop. This can be explained by the reduced preparation time of the follow-up workshop and the lower number of participants. The ideas and designs, based on which the mock-ups of the main workshop were created, were very refined and some of them had already gone through earlier evaluation iterations (see Section 3.1). The mock-ups of the follow-up workshop, on the other hand, were created in a short time frame after the initial workshop based on the feedback collected there. They were therefore not as refined, and various different ideas were tried out. Some of these ideas received very negative feedback in the UEQ(S) questionnaires. This negative feedback is not a problem, but actually just as helpful as positive feedback. It shows that the idea shown in this specific mock-up is not accepted by ATCOs and should therefore not be pursued further. The negatively evaluated designs can now be filtered out at an early stage and no additional development time will be spent on these ideas. Performing this workshop early in the development cycle saved a lot of time that would otherwise have been needed to create more sophisticated prototypes of these features. These results again show the advantages of the improved development workflow by filtering features in very early development stages, at only a low cost.
However, some challenges were identified when the follow-up workshop was conducted remotely. Firstly, it is necessary that every mock-up contained in the workshop is explained extensively because the participants might otherwise have difficulties understanding them or focus on the wrong parts of the mock-ups. Secondly, it is not possible to ensure that all participants have the same conditions while performing the workshop. Two participants in our remote workshop pointed out that their computer monitors were too small, and therefore the mock-ups were harder to assess. This means the results of such a remote workshop either need to be evaluated more carefully with these possible problems in mind or other measures need to be taken to ensure the correct execution of the workshop. A possible idea, which was also supported by one of the participants of the remote workshop, would be to make a computer available at a location that is accessible for air traffic controllers (e.g., a break room in an ATC centre). This would ensure a sufficient hardware setup and controlled environment for every participant to conduct the design evaluation remotely.

6. Summary and Outlook

In this work, a new workflow for the development of air traffic controller working positions was envisioned. The workflow resembles agile software development with a high focus on the involvement of users and different stakeholders in the development process. The workflow was tested in a research case study by developing designs, mock-ups, and prototypes of an en-route controller working position. The designs and the workflow were evaluated using mock-ups in two workshops with air traffic controllers, who provided very positive feedback on both counts. With these workshops, it was possible to evaluate the new workflow in two ways. Firstly, the workflow was applied to real-world examples, and all the required steps of one design iteration were performed, starting with the design phase, followed by the collection of feedback from users, and lastly the incorporation of feedback after the workshops. This way, the usability of the workflow was evaluated first-hand by the designers and developers. Secondly, the workflow was also evaluated directly by active air traffic controllers, who as users represent the most important stakeholder group.
In particular, when describing the new approach in Section 2.4, many sources were referenced that explore the user-centred approach in the air traffic control context. Notably, the work of König et al. has already described recursive processes and greater user participation in development as a productive approach in the presentation of weather phenomena [28] and the design of tower workstations [30]. The task now is to implement the results in actual development and application in the future. Challenges in the application of the principles from DIN EN ISO 9241-210 [22] were also taken from previous projects [21,24,25]. Analysing this research made it possible to detect problems at an early stage and adapt the process. This resulted in the iterative process loops, which were integrated into a linear process chain so that requirements and formalities from the safety-critical and highly work-intensive development of air traffic control interfaces could be met while still allowing iterative work steps, at least in phases.
The next necessary step is to apply the new workflow in a real-world setting when developing a CWP for a customer, rather than just using it in a research environment. The project partner Frequentis Orthogon is already starting to adapt its internal processes according to Section 2.4 and has successfully started to incorporate the design developed in this project into its latest products. To push the ATC industry as a whole to modernize its software development processes and reap the described benefits, it is important that more commercial companies and especially ANSPs start accepting and applying a workflow as described in this work.
Some questions that were beyond the scope of a research project still remain open: will customers accept a design as part of a formal specification? If so, where should the boundary between design and formal requirements be drawn? How can it be verified if a developer correctly implemented a design, and who should perform this check: quality assurance, UX experts, or both? What is the best abstraction for which problem? These remaining questions could be answered if the workflow and design were used in a larger project with an ANSP as a client. Furthermore, it would be possible to carry out more extensive evaluations with a larger number of participants in order to obtain more statistically significant results.

Author Contributions

Conceptualization, M.-M.T. and D.H.; Data curation, L.T.; Formal analysis, L.N. and L.T.; Funding acquisition, T.H. and E.V.; Investigation, L.N., L.T., M.-M.T. and K.M.; Methodology, D.H., F.T. and E.V.; Project administration, E.V.; Resources, L.N.; Software, L.N., L.T., K.M. and M.W.; Supervision, L.N., D.H. and E.V.; Validation, L.N., L.T. and M.-M.T.; Visualization, D.H. and F.T.; Writing—original draft, L.N., L.T., M.-M.T., D.H. and E.V.; Writing—review and editing, T.H., F.T. and M.W. All authors have read and agreed to the published version of the manuscript.

Funding

This work took place within the framework of the Envision project, funded by the German Federal Ministry for Economic Affairs and Climate Action (BMWK).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The raw data supporting the conclusions of this article will be made available by the authors on request.

Conflicts of Interest

Authors Eilert Viet and Michael Wimmer were employed by the company Frequentis Orthogon. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
ADS-B: Automatic Dependent Surveillance—Broadcast
AMAN: Arrival Manager
ANSP: Air Navigation Service Provider
A-SMGCS: Advanced Surface Movement Guidance and Control System
ATC: Air Traffic Control
ATCO: Air Traffic Controller
CPDLC: Controller Pilot Data Link Communication
CWP: Controller Working Position
DLR: Deutsches Zentrum für Luft- und Raumfahrt
DMAN: Departure Manager
HMI: Human Machine Interface
ISO: International Organization for Standardization
MTCD: Medium-term Conflict Detection
MUAC: Maastricht Upper Area Control
SME: Subject Matter Expert
STCA: Short-term Conflict Alert
TSAS: Terminal Sequencing and Spacing
UEQ(S): User Experience Questionnaire (Short)
UX: User Experience

References

  1. Wang, J.J.; Datta, K.; Landi, M.R. A Life-Cycle Cost Estimating Methodology for NASA-Developed Air Traffic Control Decision Support Tools. In Proceedings of the International Society of Parametric Analysis Conference, 21–24 May 2002; NASA/CR-2002-211395; NASA AMES Research Center: Moffett Field, CA, USA, 2002. [Google Scholar]
  2. Majumdar, A. Commercializing and restructuring air traffic control: A review of the experience and issues involved. J. Air Transp. Manag. 1995, 2, 111–122. [Google Scholar] [CrossRef]
  3. Gotel, O.; Finkelstein, C. An analysis of the requirements traceability problem. In Proceedings of the IEEE International Conference on Requirements Engineering, Colorado Springs, CO, USA, 18–22 April 1994; pp. 94–101. [Google Scholar] [CrossRef]
  4. Gordieiev, O.; Gordieieva, D.; Rainer, A.; Pishchukhina, O. Relationship between factors influencing the software development process and software defects. In Proceedings of the 13th IEEE International Conference on Dependable Systems, Services and Technologies (DESSERT’2023), Athens, Greece, 13–15 October 2023. [Google Scholar] [CrossRef]
  5. Rivero, J.M.; Grigera, J.; Distante, D.; Montero, F.; Rossi, G. DataMock: An Agile Approach for Building Data Models from User Interface Mockups. Softw. Syst. Model. 2019, 18, 663–690. [Google Scholar] [CrossRef]
  6. Temme, M.M.; Tyburzy, L.; Nöhren, L.; Muth, K.; Heßler, D.; Tenberg, F.; Viet, E.; Wimmer, M. Anwendung funktionaler Mockups bei der Entwicklung von Lotsenarbeitsplätzen. In Proceedings of the Deutscher Luft- und Raumfahrtkongress (DLRK2024), Hamburg, Germany, 30 September–2 October 2024. [Google Scholar] [CrossRef]
  7. Norman, D.A. Living with Complexity; MIT Press: Cambridge, MA, USA, 2011. [Google Scholar]
  8. Morton, S. Specification for Medium-Term Conflict Detection; SPEC-0139; European Organisation for the Safety of Air Navigation (EUROCONTROL): Brussels, Belgium, 2017. [Google Scholar]
  9. Drozdowski, S. Guidelines for Short Term Conflict Alert—Part I—Concept and Requirements; EUROCONTROL-GUID-159; European Organisation for the Safety of Air Navigation (EUROCONTROL): Brussels, Belgium, 2017. [Google Scholar]
  10. Phojanamongkolkij, N.; Okuniek, N.; Lohr, G.W.; Schaper, M.; Christoffels, L.; Latorella, K.A. Functional Analysis for an Integrated Capability of Arrival/Departure/Surface Management with Tactical Runway Management; NASA report No NASA/TM–2014-218553; NASA Langley Research Center: Hampton, VA, USA, 2014. [Google Scholar]
  11. Temme, M.M.; Gluchshenko, O.; Nöhren, L.; Kleinert, M.; Ohneiser, O.; Muth, K.; Ehr, H.; Groß, N.; Temme, A.; Lagasio, M.; et al. Innovative Integration of Severe Weather Forecasts into an Extended Arrival Manager. Aerospace 2023, 10, 210. [Google Scholar] [CrossRef]
  12. Moser, C. User Experience Design—Mit Erlebniszentrierter Softwareentwicklung zu Produkten, die Begeistern; Springer: Berlin/Heidelberg, Germany, 2012. [Google Scholar]
  13. Kurz, M.; Zebner, F. Zum Verhältnis von Design & Technik. In Design, Anfang des 21. Jh.—Diskurse und Perspektiven; Eisele, P., Bürdek, B.E., Eds.; Avedition: Ludwigsburg, Germany, 2011; pp. 176–185. [Google Scholar]
  14. Hunger, R.; Christoffels, L.; Friedrich, M.; Jameel, M.; Pick, A.; Gerdes, I.; von der Nahmer, P.M.; Sobotzki, F. Lesson Learned: Design and Perception of Single Controller Operations Support Tools. In Engineering Psychology and Cognitive Ergonomics; HCII 2024; Lecture Notes in Computer Science; Harris, D., Li, W.C., Eds.; Springer: Cham, Switzerland, 2024; Volume 14693, pp. 15–33. [Google Scholar]
  15. Gerdes, I.; Temme, A.; Schultz, M. Dynamic airspace sectorisation for flight-centric operations. Transp. Res. Part C Emerg. Technol. 2018, 95, 460–480. [Google Scholar] [CrossRef]
  16. Uebbing-Rumke, M.; Gürlük, H.; Jauer, M.L.; Hagemann, K.; Udovic, A. Usability Evaluation of Multi-Touch-Displays for TMA Controller Working Positions. 2014. Available online: https://api.semanticscholar.org/CorpusID:56047957 (accessed on 28 January 2025).
  17. EUROCONTROL. Procurement. Available online: https://www.eurocontrol.int/procurement (accessed on 12 November 2024).
  18. Alagar, V.S.; Periyasamy, K. Specification of Software Systems; Springer: London, UK, 2011. [Google Scholar] [CrossRef]
  19. Beck, K.; Grenning, J.; Martin, R.C.; Beedle, M.; Highsmith, J.; Mellor, S.; van Bennekum, A.; Hunt, A.; Schwaber, K.; Cockburn, A.; et al. Manifesto for Agile Software Development. Available online: https://agilemanifesto.org (accessed on 12 November 2024).
  20. Koch, A. Agile Software Development; Artech: Morristown, NJ, USA, 2004. [Google Scholar]
  21. König, C.; Hofmann, T.; Bruder, R. Application of the User-Centered Design Process According to ISO 9241-210 in Air Traffic Management; International Ergonomics Association: Recife, Brazil, 2012. [Google Scholar]
  22. DIN EN ISO 9241-210; Human-Centered Design Process for Interactive Systems. ISO: Geneva, Switzerland, 2008.
  23. DIN EN ISO 9241-11; Ergonomie der Mensch-System-Interaktion—Teil 11: Gebrauchstauglichkeit: Begriffe und Konzepte. DIN Deutsches Institut für Normung e.V.: Berlin, Germany, 2008.
  24. Hofmann, T.; Syndicus, M.; Bergner, J.; Bruder, R. Air Traffic Control HMI—Herausforderungen der Integration Mentaler Modelle Spezieller Nutzer in den Designprozess; GfA Frühjahrskongress: Dresden, Germany, 2019. [Google Scholar]
  25. Knothe, S.; Hofmann, T.; Bleßmann, C. Theory and Practice in UX Design—Identification of Discrepancies in the Development of User-Oriented HMI; HCI: Copenhagen, Denmark, 2021. [Google Scholar]
  26. Wallace, G. FAA Still Short About 3000 Air Traffic Controllers, New Federal Numbers Show; CNN: Atlanta, GA, USA, 2024. [Google Scholar]
  27. IFATCA. Staff Shortage Survey—EUR Region. International Federation of Air Traffic Controllers’ Associations (IFATCA), Montreal, Quebec, Canada. 2024. Available online: https://ifatca.org/staff-shortage-survey-eur-region/ (accessed on 12 November 2024).
  28. König, C.; Hofmann, T.; Röbig, A.; Bergner, J. Fluglotsen-Arbeitsplätze der Zukunft, Neue Arbeits- und Lebenswelten Gestalten; Frühjahrskongress der GfA; GfA-Press: Darmstadt, Germany, 2010; Volume 56. [Google Scholar]
  29. Hofmann, T.; Heßler, D.; Knothe, S.; Lampe, A. New Scientific Methods and Old School Models in Ergonomic System Development. In Ergonomic Insights; Pazell, S., Karanikas, N., Eds.; CRC Press: Boca Raton, FL, USA, 2023; pp. 227–242. [Google Scholar]
  30. König, C.; Hofmann, T.; Bergner, J. Einsatz von Beobachtungsinterviews bei der Entwicklung von Interfaces für Tower Fluglotsen. Der Mensch im Mittelpunkt technischer Systeme. In Proceedings of the 8th Berliner Werkstatt Mensch-Maschine-Systeme, Berlin, Germany, 7–9 October 2009. [Google Scholar]
  31. Nielsen, J. Mental Models. 2010. Available online: https://www.nngroup.com/articles/mental-models (accessed on 28 January 2025).
  32. Schrepp, M.; Hinderks, A.; Thomaschewski, J. Applying the User Experience Questionnaire (UEQ) in Different Evaluation Scenarios. In Design, User Experience, and Usability. Theories, Methods, and Tools for Designing the User Experience, Proceedings of the Third International Conference, DUXU 2014, Held as Part of HCI International 2014, Heraklion, Crete, Greece, 22–27 June 2014, Proceedings, Part I 3; Springer International Publishing: New York, NY, USA, 2014; Volume 8517, pp. 383–392. [Google Scholar] [CrossRef]
  33. Schrepp, M.; Hinderks, A.; Thomaschewski, J. Design and Evaluation of a Short Version of the User Experience Questionnaire (UEQ-S). Int. J. Interact. Multimed. Artif. Intell. 2017, 4, 103. [Google Scholar] [CrossRef]
Figure 1. Current project development workflow.
Figure 2. Standard agile approach according to [20].
Figure 3. The user-centred design process according to ISO 9241-210 [22].
Figure 4. Process practiced in Envision.
Figure 5. The demo working position used to carry out the main workshop [6].
Figure 6. The air situation display design mock-up used in the main workshop.
Figure 7. One of the weather mock-ups used in the remote workshop.
Figure 8. Overall quality scores of the individual mock-up features.
Figure 9. Evaluation of the workflow statements.
Figure 10. Evaluation of the design mock-up statements.
Figure 11. Overall quality scores of the individual mock-up features from the follow-up workshop.
Figure 12. Evaluation of the workflow statements from the follow-up workshop.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
