Article

Identifying Human Factor Causes of Remotely Piloted Aircraft System Safety Occurrences in Australia

1 School of Engineering and Information Technology, UNSW, Canberra, ACT 2600, Australia
2 School of Science, Edith Cowan University, Joondalup, WA 6027, Australia
3 Capability Systems Centre, UNSW, Canberra, ACT 2612, Australia
4 School of Science, UNSW, Canberra, ACT 2600, Australia
* Author to whom correspondence should be addressed.
Aerospace 2025, 12(3), 206; https://doi.org/10.3390/aerospace12030206
Submission received: 16 December 2024 / Revised: 28 January 2025 / Accepted: 28 February 2025 / Published: 28 February 2025

Abstract

Remotely piloted aircraft are a fast-emerging sector of the aviation industry. Although technical failures have been the largest cause of accident occurrences for Remotely Piloted Aircraft Systems (RPASs), if RPASs follow the path of conventionally crewed aviation, Human Factors (HFs) will increasingly contribute to accidents as the technology of RPASs improves. By examining an RPAS accident database from 2008–2019 for HF-caused accidents and coding them to the Human Factors Analysis and Classification System (HFACS) taxonomy, an exploration of RPAS HFs is carried out and the predominant HF issues for RPAS pilots are identified. The majority of HF accidents were coded to the Unsafe Acts level of the HFACS. Skill errors, depth perception and environmental issues were the largest contributors to HF RPAS safety occurrences. A comparison with other sectors of aviation is also made, in which perception issues were found to be a greater contributor to occurrences for RPAS pilots than for other sectors of aviation. Developing appropriate training programs that produce skilled RPAS operators with good depth perception can contribute to a reduction in RPAS accident rates. The importance of reporting RPAS incidents is also discussed.

1. Introduction

Uncrewed aircraft began as a military option but have, in a short time, been taken up as a useful tool in many civilian applications. Those applications are becoming very diverse, and uncrewed aircraft will continue to be used in ever more industries as new technologies are developed. It is predicted that non-military users of RPA will make up nearly 30% of drone activity [1]. Remotely Piloted Aircraft Systems (RPASs) are a sub-set of unmanned aircraft systems. An RPAS consists primarily of three components: the remotely piloted aircraft (RPA), of which there is a wide range of designs, the control station (remote pilot station) and the command-and-control (C2) link. CASA [2] in Australia differentiates between RPA and other unmanned aircraft: the RPA category covers remotely piloted aircraft but not kites, balloons or model aircraft.
Within the United States there are more than 1.6 million registered unmanned aircraft with over 10% of the registrations being for aircraft used for commercial operations [3]. The rate of growth of the civil RPA sector over the next decade is forecast to be greater than that of the military sector with an annual growth rate of over 12%. The revenues for the civil sector up to 2028 are projected to be more than USD 88 billion [1]. This increased activity will require a large increase in the number of pilots required to fly the burgeoning fleet.
With the ever-increasing range of Remotely Piloted Aircraft System (RPAS) operations and flight hours there has been an attendant increase in incidents and accidents involving RPAs. While in modern conventionally crewed flight the most dominant factor in accident and incident causation has been HFs, recent research analyzing RPAS accidents has indicated that mechanical and equipment issues are the most dominant factor in RPAS accident causation [4,5]. With RPASs being an emergent technology, it is not surprising that technical issues are a large contributor to RPAS accidents. However, Cooke and Pedersen [6] note that, despite the importance of human factors in unmanned flight, there is a surprising lack of attention given to HFs in this sphere of aviation. Elbanhawi et al. [7] expressed surprise at the prevalence of technical issues over human factors in Micro Air Vehicle (MAV) operations. It was expected that, owing to the challenging physical settings within which MAVs operate, along with human limitations, there would be a greater HF influence on the causes of accidents.
RPAS technical issues, currently the leading cause of accidents, need attention and solutions. However, RPAS HFs should also be given attention if the lessons learnt in conventionally crewed flight are to be extrapolated to the RPAS sphere. At the beginning of conventionally crewed powered flight, 80% of accidents were caused by mechanical or technical issues. This was not surprising, as the aviation industry was in its infancy, the early aircraft were fragile and underpowered and technological advancements were slow to arrive. Even after the developments of World War II, there were major technical failures such as those of the de Havilland Comet. These technical causes of accidents receded during the last decades of the twentieth century as improved technology made aircraft progressively safer and HFs became the main cause, accounting for 60–80% of accidents [8].
As the technology of aircraft improved, allied with the strengthening of regulations devoted to technological improvements, accident rates, while trending downwards, did not fall to zero. Pilot error was increasingly identified as the cause of aviation accidents [9]. To meet this challenge of human failures, the ICAO, in 1994, issued its Human Factors Digest No. 7 [10] to address the need for aviation to investigate HFs in accident causation.
Despite the human operator not being co-located with the aircraft during RPAS operations, it has been found that, just as in conventionally crewed flight, uncrewed flight is also negatively influenced by HFs. The remote pilot does not receive tactile feedback from the aircraft, and the viewing of the visual field and the relationship of the RPA to items within that field are uniquely different for the remote pilot. Whilst Wild et al. [4] identified technology issues as a major contributor to RPAS incidents and accidents, further research by Wild et al. [5] identified the growing importance of human factors in RPAS accidents, with HFs identified as the second largest contributor. There has also been both a lag in relevant regulations for RPASs and inconsistent national development and application of RPAS rules and regulations [11]. This has applied to regulations for both the technical development of RPAS hardware and the licensing of people who operate RPASs.

2. Literature Review

Human factors is an encompassing label covering humans, their behaviors and their relations with other humans, as well as with the machinery they are required to operate (ergonomics). It has been part of the aviation system since the beginning of powered flight, as early aviators were concerned about themselves and their passengers. Early examples of this were the protection of the pilot from the physical elements. Post World War II, human factors was a term used to explain accident causation by viewing the individual human as the cause of the accident. The label applied was “pilot error”. This resulted in a reliance on individual performance and brilliance to overcome adverse situations. Within this viewpoint, accident investigators sought evidence of poorly performing individuals who made bad decisions or behaved incorrectly to explain why the accident happened [12].
The understanding of HFs changed from the beginning of the 1980s. Paries and Amalberti [13] describe this as a shift from pilot error to a view in which poor resource management, including poor teamwork by crews, caused accidents and incidents. Reason [14] describes this as a widening of the search from the narrowness of blaming the pilot to looking for other factors leading to major accidents, including the structures supporting the individual in their operation as being of importance for safe outcomes. The search for factors that influenced human performance, and thus aviation safety, led to an understanding that the causes of accidents and incidents can arise from within the organization: the culture of both the organization and the individual contributes to an accident occurring. The model that became well known for illustrating the importance of organizational and other non-operational issues in accident causation was developed by Reason [15]. It evolved to become known as the “Swiss cheese” model (SCM) [16].

2.1. Swiss Cheese Model

The SCM is “about how unsafe acts and latent conditions combine to breach barriers and safeguards” [17] (p. 94). It distinguishes between active errors, “whose effects are felt almost immediately” [15] (p. 173), and latent conditions, “whose adverse consequences may lie dormant within the system for a long time, only becoming evident when they combine with other factors to breach the system’s defences” [15] (p. 173). The active errors were likely to arise from the sharp-end operators such as the pilot whilst latent conditions arise from within the managerial sphere and decisions made by people distant in time and place from any accident.
Reason’s [15] first description and illustration of the “Swiss cheese” model had five “planes”, two describing active failures and three describing latent failures. There are holes in each of these planes, and when the holes line up an accident can occur. Over the following decade, there were various developments and evolutions of the SCM [16]. From these developments there is little doubt that Reason’s SCM has been very influential within aviation, though not exclusively so. The resulting examination of systemic error and organizational factors involved in aviation accidents developed momentum through the 1990s. National regulatory bodies as well as international bodies accepted the need to examine organizations and their systems when looking at causes of accidents. The use of Reason’s SCM has become so embedded within the aviation industry that the ICAO is fulsome in its praise of the model in its human factors accident investigation manual [18].

2.2. Human Factors Analysis Classification System

Despite the widespread acceptance of Reason’s SCM, there were criticisms [19]. Wiegmann and Shappell [18] identified limitations of the model, including a lack of specificity in identifying the latent conditions that cause the “holes” in each “slice”, or layer of safety, within the productive system. Further, the SCM was seen to be a descriptive model designed more for academic study than an analytical model that active accident investigators can use to investigate accidents.
Building on Reason’s SCM but wanting to overcome the identified criticisms, especially regarding the lack of specificity, Wiegmann and Shappell [18] developed the Human Factors Analysis and Classification System (HFACS) taxonomy to examine human error in accidents. This taxonomy is developed within the framework of a systems understanding of accident causation, in which the roles of active errors and latent conditions, and their contribution to the cause of an accident, are examined [20].
The basis for the development of the HFACS was the examination of accident reports within the US Navy and Marine Corps [21]. The original classification framework, labeled the Taxonomy of Unsafe Operations, was based on Reason’s SCM and had an academic focus rather than being intended for use by investigators [18]. Testing of this framework by military personnel led to modifications. Continued testing with civilian operators, who identified military-specific terms they did not understand, led to further modifications and the formation of the HFACS. This development and testing of the HFACS was conducted over a five-year period [18]. The taxonomy has since been successfully used in examining not just military accidents but also civilian accidents [21,22]. It has also been used in other transport modes, including rail, as well as in aviation maintenance [20].
Gaur [23] also demonstrated that the taxonomy crossed cultural boundaries when used to classify HF issues in accidents investigated by Indian authorities. The ATSB [24] also successfully used the HFACS to examine and understand the HF errors leading to accidents in Australian aviation operations. The ubiquity of the HFACS taxonomy was confirmed in a review of accident analysis models, which found widespread use of the taxonomy not just in aviation and other transport modes but also across a range of safety-dependent industries such as mining and nuclear power [25]. Development of the taxonomy has not stopped with its initial release, with different versions being released. These versions can be specifically tailored to the unique operations of an organization (e.g., Olsen and Williamson [26] examined the Australian Defence Force version of the taxonomy, the HFACS-ADF). The most recent version is HFACS 8.0, which the U.S. Army has in the last 12 months integrated into its Army Safety Management Information System [27].
The HFACS contains four levels which equate to the four levels within Reason’s model. Each of these four levels is broken down into causal categories. The four levels of the HFACS are 1. Unsafe Acts of Operators, 2. Preconditions for Unsafe Acts, 3. Unsafe Supervision, 4. Organizational Influences [18] (p. 50).
The first level of Unsafe Acts is broken down into Errors—further broken down into skill errors, decision errors and perceptual errors—and Violations which are broken down into routine and exceptional violations. Wiegmann and Shappell [18] describe this category of unsafe acts as dominating accident databases because of our fallible human nature. The second level is the Preconditions for the Unsafe Acts that affect the performance of the operator. This level examines the condition of the operator, personnel factors and environmental factors that may have led to the unsafe actions of the operator.
The third level is Unsafe Supervision. This level is closely related to Reason’s latent conditions arising from the decisions and actions of managers—or supervisors in the nomenclature of Wiegmann and Shappell. Within this level, there are four divisions, inadequate supervision, planned inappropriate operations, failure to correct a known problem and supervisory violations [18] (p. 63).
The fourth level is the influence of the organization within which the operator performs his or her tasks. This influence is exerted through the culture of the organization, defined as the “unofficial or unspoken rules, values, attitudes, beliefs and customs of an organization” [18] (p. 67). This culture will affect the decision making and resource allocation—both hardware and human—provided by the organization to the operators.
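The four levels and causal categories described above can be sketched as a simple lookup structure of the kind an analyst might use when coding occurrences. This is an illustrative sketch only: the nesting is an assumption for coding purposes, and the Organizational Influences category names are drawn from the wider HFACS literature rather than quoted in this section.

```python
# Illustrative sketch of the HFACS levels and causal categories
# described above (after Wiegmann and Shappell). The dictionary
# layout is an assumption, not an official encoding of the taxonomy.
HFACS = {
    "Unsafe Acts": {
        "Errors": ["Skill-based errors", "Decision errors", "Perceptual errors"],
        "Violations": ["Routine violations", "Exceptional violations"],
    },
    "Preconditions for Unsafe Acts": {
        "Condition of the operator": [],
        "Personnel factors": [],
        "Environmental factors": ["Physical environment", "Technical environment"],
    },
    "Unsafe Supervision": {
        "Categories": ["Inadequate supervision",
                       "Planned inappropriate operations",
                       "Failure to correct a known problem",
                       "Supervisory violations"],
    },
    # Category names below follow the standard HFACS literature,
    # not wording quoted in this paper.
    "Organizational Influences": {
        "Categories": ["Resource management", "Organizational climate",
                       "Organizational process"],
    },
}

def is_valid_code(level, group, category):
    """Check that a proposed (level, group, category) coding exists."""
    return category in HFACS.get(level, {}).get(group, [])
```

Coding an occurrence then reduces to validating a triple, e.g. `is_valid_code("Unsafe Acts", "Errors", "Skill-based errors")` holds, which mirrors the two-pass coding (level, then sub-level) described in the Methods section.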

2.3. Human Factors and RPASs

While there have been many studies on RPASs, they have primarily ignored the human side and emphasized the technological side of the RPAS equation. Herz [28] believed that “many strategists tend to underplay the role of human factors and oversell the technology to benefit their company’s product” (p. 43). Stark et al. [29] note that, while there have been many human-based articles within the RPA domain, most are narrowly focused on robotics and human interaction with automation. More recently, there have been an increasing number of studies examining RPAS HFs. Hobbs [30] identifies two major areas of human factors that influence RPAS operations: the reduction of sensory cues available to the pilot, and the control of the aircraft via a radio link between the ground station and the aircraft. Özyörük [3] identified 69 publications on RPAS human factors arising from a search covering the years 1945 to 2020 and found that the areas covered could be grouped into four main groups: Familiarization of Human Factors (49%), Design and Ergonomics (26%), Crew members (16%) and Operational Issues (8.6%). Working with the Royal Canadian Air Force, Arrabito et al. [31] identified eight human factors issues that could compromise operations: “interface, decision making, skills and knowledge, situation awareness, teamwork, documentation, mission preparation and organization” (p. 5). Özyörük [3] comments “that in the field of unmanned aerial vehicles, human factors studies are not yet sufficient in terms of number and depth” (p. 75).
Along with the paucity of HF studies in RPAS operations, much of the available material draws on military forces and their use of large RPAs flying Beyond Visual Line of Sight (BVLOS) operations [32]. An examination of HF mishaps in military RPAS operations using the HFACS found that Levels 1 and 2 were the predominant causation levels for the occurrences [33].
As the increasing civilian use of RPASs often differs from military flying, there is a need to explore the HF issues that lead to less-than-desired RPAS outcomes. This study seeks to identify the major HF issues confronting RPAS operators through the examination of incidents and accidents contained in an accident database. Learning from incidents (LFI) is a process that aims to prevent further incidents and accidents and enhance the safety of operations within organizations and industries [34]. By identifying HF-caused incidents and accidents and then classifying the causal factors of these occurrences using the Human Factors Analysis and Classification System (HFACS), it is hoped that the predominant HF issues associated with RPAS operations can be identified. Knowledge of these less-than-optimal HF outcomes can in turn influence the training of future RPA license holders to increase safe flight outcomes.

3. Methods

A post-incident analysis methodology was used by Wild et al. [4] as an effective means of identifying factors that could lead to improved safety. Drupsteen and Guldenmund [34] describe the process an organization can undertake in learning from incidents (LFI) as “detecting events, by reflecting on them, by learning lessons from them and by putting these lessons into practice to prevent future incidents” (p. 81). Although this process has been described as tombstone safety [35], it has a long history in aviation studies. Flanagan [36] sought to understand the critical requirements of the job of an airline pilot using data that included “critical pilot behaviors reported in accident records, and critical incidents reported anonymously in interviews by the pilots themselves” (p. 330). Once the data were gathered, they were analyzed and classified into the critical requirements of the airline pilot. It is the learning from the commonalities of the incidents that provides the patterns that lead to understanding and to the further development of relevant and critical tasks for a job [39]. The use of occurrence reports for LFI and the enhancement of safety outcomes emphasizes the importance of all participants in aviation operations, including RPAS operators, reporting incidents and accidents [37,38].
For this study, the critical incident reports came from the database of RPAS incidents and accidents of the Australian Transport Safety Bureau (ATSB), the Australian government’s independent agency charged with investigating transport incidents and accidents, including those involving aviation occurrences. The ATSB reports are publicly available.
Reports that had HF issues as the seminal cause of the incidents and accidents were examined. The reporting period spanned eleven years, from 28 May 2008 to 31 December 2019. The database contained 290 occurrence reports concerning RPAS operations, although only 10 of the reports (3.4%) had been investigated; the remainder were pilot self-reports.
Information that was provided in the ATSB reports included the location of the occurrence, aircraft manufacturer, operation type, airspace type and class and a report of the occurrence. All reports were assessed for HF causation and, if there was an identified HF cause of the incident or accident, it was then codified using Wiegmann and Shappell’s 2003 version of the HFACS taxonomy [18]. Of the 290 reports in the database there were found to be 34.5% or 100 reports with HFs as a leading cause. After coding to one of the four levels of the HFACS, a second more fine-grained analysis utilizing sub-levels [18] within each of the main levels of the HFACS was carried out.
After identifying the frequency of occurrences within the HFACS, a further analysis of the occurrences was made with a Pareto analysis. This provides an 80/20 analysis whereby roughly 80% of the RPAS incidents and occurrences can be attributed to around 20% of the HFACS levels and sub-levels. This allows the identification of areas that, when afforded priority, will lead to the largest improvements in RPAS safety outcomes.
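The Pareto step can be sketched as follows. The counts used here are hypothetical placeholders, not the study's Table 1 frequencies; the procedure (rank categories by frequency, then take the smallest set covering 80% of occurrences) is the point of the sketch.

```python
# Minimal Pareto (80/20) analysis sketch: rank HFACS categories by
# frequency and return the smallest leading set covering the threshold.
# The counts below are hypothetical placeholders, not the study's data.
counts = {
    "Skill-based errors": 44,
    "Physical environment": 24,
    "Perceptual errors": 15,
    "Decision errors": 9,
    "Violations": 6,
    "Technical environment": 2,
}

def pareto_cut(counts, threshold=0.8):
    """Return categories, largest first, until the cumulative share
    of occurrences reaches the threshold (default 80%)."""
    total = sum(counts.values())
    ranked = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
    vital, cumulative = [], 0
    for name, n in ranked:
        vital.append(name)
        cumulative += n
        if cumulative / total >= threshold:
            break
    return vital

print(pareto_cut(counts))
# → ['Skill-based errors', 'Physical environment', 'Perceptual errors']
```

With these placeholder counts, three of the six categories account for over 80% of occurrences, which is the shape of result the study reports for skill, environment and perception.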
Coding was performed by the first author of the study. His experience includes teaching human factors and accident causation over a nearly 25-year career in universities in multiple countries along with previous experience with coding accidents using a taxonomy. The quantitative data analysis including the Pareto analysis was conducted using Microsoft Excel.
To highlight the unique issues facing RPAS HFs, the results from the RPAS HFACS Level 1—Unsafe Acts—findings were compared to those Unsafe Acts for Australian powered aircraft as published by the ATSB [24] and Australian general aviation as published by Lenne et al. [20].

4. Results

The database of Australian RPA incidents and accidents was examined for HF causes of RPAS incidents and accidents and coded in the HFACS taxonomy. Of the 290 reports in the database, 100 reports (34.5%) were found to have human factors as a leading cause when codified against the different levels of the HFACS. The frequency of occurrences for HFACS levels and sub-levels is indicated in Table 1.
All the RPAS occurrences that were found to have an HF causation were coded in the lower two levels of the HFACS, Unsafe Acts and Preconditions. These lower two levels of the taxonomy relate to the direct actions of the operator and the conditions surrounding the operation. There were no coded occurrences in the two higher levels that relate to the organizational issues that surround operations.
A chi-squared test, using the data presented in Figure 1, for HFACS Levels 1 and 2 relative to a uniform expected distribution gives X2 (5) = 55.19, p < 0.001. As such, the null hypothesis can be rejected and the alternative hypothesis that the failure sources are not equal can be accepted. The significant contributions to this are the much greater than expected skill-based errors and the below-expectation violations and technical environment.
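The goodness-of-fit computation can be reproduced in outline as follows. The six category counts below are illustrative placeholders (the study's exact Figure 1 frequencies are not fully recoverable here), so the statistic differs from the reported 55.19; the procedure, the df = 5 setup and the comparison against the 0.05 critical value are the same.

```python
# Chi-squared goodness-of-fit test against a uniform expected
# distribution over the six HFACS Level 1 and 2 categories.
# Counts are illustrative placeholders, not the study's Figure 1 data.
observed = {
    "Skill-based errors": 37,
    "Perceptual errors": 22,
    "Decision errors": 9,
    "Violations": 7,
    "Physical environment": 24,
    "Technical environment": 1,
}

def chi_squared_uniform(observed):
    """X2 statistic against equal expected counts in every category."""
    n = sum(observed.values())
    expected = n / len(observed)  # uniform expectation
    return sum((o - expected) ** 2 / expected for o in observed.values())

x2 = chi_squared_uniform(observed)
# The 0.05 critical value for df = 5 is about 11.07; a statistic this
# far above it rejects the hypothesis of equal failure sources.
print(round(x2, 2))  # → 53.6
```

As in the paper, the largest contributions to the statistic come from the over-represented skill-based errors and the under-represented technical environment category.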
A Pareto analysis of the HFACS coded occurrences in Levels 1 and 2 of the taxonomy, shown in Figure 2, indicates that skill levels of RPAS operators should be prioritized for lowering the rates of occurrences. The physical environment in which the RPAS operation takes place is also a leading cause of trouble for RPAS operators.
A fine-grained examination was conducted on Level 1 Unsafe Acts only using a Pareto analysis, shown in Figure 3. This confirmed the predominance of operator skill level followed by perceptual errors as leading causes of RPAS incidents.
In their formulation of the HFACS, Wiegmann and Shappell [18] describe possible causes of skill-based errors. These were used to try to further understand the underlying causes of skill-based errors. The occurrences originally coded to skill-based errors were further coded according to this list of possible causes of skill-based errors, shown in Table 2.
The greatest number of skill-based errors arose from the poor technique of the operator when flying the remotely piloted aircraft. The second most frequent cause of skill-based errors was poor airmanship, which is closely aligned with technique. An example of a skill error is seen in ATSB reference 201407488: “While conducting aerial survey operations, the remotely piloted aircraft (RPA) over-banked during a turn”.
The Pareto analysis of causes of skill-based errors, shown in Figure 4, indicates these two underlying causes—poor technique and poor airmanship—are the two causes that should be focused on to reduce the largest number of occurrences.
The second largest cause of unsafe acts was perceptual error which accounted for 22 of the unsafe acts, shown in Table 1. These perceptual errors were almost solely the misjudging of distance (n = 21) with one report describing the drone pilot experiencing spatial disorientation. The misjudging of distance arises from a lack of depth perception. An example of this type of error is seen in ATSB reference 201808386, “While manoeuvring, the propeller struck a tree and the remotely piloted aircraft collided with terrain”.
The third largest cause of unsafe acts was decision error. While decision errors have been seen to have a large impact on incidents and accidents in conventionally crewed flight, the number for RPAS operations is still small, with only 9% of HF-caused occurrences coded to decision errors, as shown in Table 1.
Using Wiegmann and Shappell’s [18] possible causes of decision errors, the two largest causes of these decision errors are not following correct procedures and the lack of systems knowledge, shown in Table 3. An incorrect RPAS procedure was illustrated in an incident during a flight off a beach in NSW, Australia. During the planning for the flight, co-ordinates for a northern hemisphere location were erroneously selected. During the flight there was a breakdown in the C2 link, leading to the aircraft automatically flying to the incorrect location. The aircraft was lost [40].
Within the second level of the HFACS, the preconditions for unsafe acts include environmental factors. While only one report described the technical environment as leading to an unsafe act, the physical environment, in the form of wind (n = 9) and bird strikes (n = 15), contributed to 24 occurrences that led to an unsafe act.
A comparison of the unsafe acts between the RPAS sector and conventionally crewed aircraft was made. The conventionally crewed sector was broken down to all powered aircraft from an analysis conducted by [24] and specifically general aviation activity [20], shown in Table 4.
Skill-based errors were the single largest cause of incidents and accidents for all three sectors of the aviation industry. General aviation had the largest number of accidents attributable to skill errors, whilst RPASs had the smallest. However, the gap between RPASs and all powered aircraft was relatively small compared to the large gap between both of these groups and the higher total for general aviation, as seen in Figure 5.
Decision errors were similar for both segments of conventionally crewed activity and these were more common than occurrences caused by decision errors for RPAS operations. This is not surprising as the influence of decision-making factors in accidents has long been recognized in conventionally crewed flight, e.g., O’Hare et al. [8] identified the importance and influence of decisional factors in fatal aviation accidents in New Zealand.
Perceptual errors as a cause of occurrences were higher for RPASs than conventionally crewed aircraft. With perceptual errors being the second largest cause of skill-based errors, this finding further indicates the difficulties of depth perception and related obstacle avoidance for pilots not co-located with the aircraft. That all powered aircraft had lower perceptual errors than general aviation aircraft and RPAS operations could indicate that piloting experience levels have an influence on accidents caused by perceptual errors. With RPAS operations still a relatively new segment of the aviation sector, many drone operators may have yet to build substantial experience levels from which they can draw upon to avoid the occurrences arising from perceptual errors.
A chi-squared test for the data presented in Figure 5 (Table 4), comparing RPASs to the expected distribution of powered aircraft, gives X2 (3) = 105.7, p < 0.001; hence, the null hypothesis can be rejected and the alternative hypothesis that the distribution of unsafe acts for RPASs differs from that of all powered aircraft can be accepted. Specifically, decision errors are significantly less common while perceptual errors are significantly more common, with skill errors and violations being similar. Noting that, for powered aircraft, the percentage of perceptual errors is 6.1%, if RPASs were similar we should expect 3.5 cases out of 75; we observe 22 cases, 6.3 times greater.

5. Discussion

From the ATSB database of reports and investigations of civilian RPAS incidents and accidents across an eleven-year period from 28 May 2008 to 31 December 2019, most occurrences arose from technical issues. However, slightly more than a third of the occurrences (34.5%) were identified as having HF causes. If RPAS operations are to follow a similar historical trajectory to conventionally crewed operations, the number of HF accidents for RPAS operations will grow.
The occurrences identified as having an HF cause were codified using the HFACS, an established and well-used taxonomy that has been shown to have validity across a wide range of aviation sectors, other industries and different cultures. Most of the occurrences with HF causes (75%) were coded to Unsafe Acts, the lowest level of the taxonomy, which relates to the actions of the individual pilots. A similarly large share of occurrences attributable to individual error was identified in a review of U.S. Army UAS accidents, where 93% of accidents arose from individual errors [39].
The remaining 25% of the HF occurrences were coded to the physical and technical environments in which the RPAS flight took place, which come from the second level of the HFACS [41].
There were no HF-aligned occurrences from this study of civilian RPAS operations coded to the upper two levels of the taxonomy. These levels identify the causation antecedents of accidents as being in the areas of managerial oversight, planning and organizational influences such as allocation of resources and operating culture. For military RPAS accidents, however, it has been identified that the leading cause of accidents originated in these higher levels of the taxonomy [33]. These researchers were able to establish links between unsafe acts at Level 1 with higher levels of the HFACS. Having a relationship between both the individual operator and the organizational and managerial spheres is the intention of the SCM, from which the HFACS has been developed.
That there was no such relationship for the studied civilian RPAS occurrences is not surprising as RPAS operations are an immature sector of the aviation industry and still are dominated by small companies using an owner-operator model. CASA lists over 2500 companies and individuals holding a Remotely Piloted Aircraft Operator’s Certificate (ReOC), the organizational certificate required for the commercial operation of RPASs in Australia. This is in comparison to 15 holders of an Unmanned Aircraft System Operator Certificate (UOC)—the forerunner of the ReOC—in 2012 [42].
Further, there is a large difference between the organizational structures surrounding civilian and military flying operations. In large organizations such as the different arms of the military, strong organizational structures guide operations. In the civilian RPAS sector, the search for latent conditions contributing to accidents may not yet be beneficial for ongoing safety enhancements.
The Pareto analysis of the first and second levels of the HFACS indicates that the areas that should be attended to for improved safety outcomes are the skill levels of the pilots, the environment in which they fly and depth perception issues. These three areas of skill, environment and perception can indicate that the current training systems are not fully preparing pilots for the demands of the tasks they conduct. Although small in overall number, decision errors caused by poor procedures and a lack of knowledge further confirm that RPAS pilots may not be fully and adequately prepared for operational flying.
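For readers wishing to reproduce the ranking behind this Pareto analysis, the computation can be sketched in a few lines of Python. The counts are taken from Table 1; the script is an illustrative sketch, not the analysis code used in the study.

```python
# Pareto analysis of HFACS Level 1 and 2 categories (counts from Table 1).
counts = {
    "Skill-Based Error": 41,
    "Physical Environment": 24,
    "Perceptual Error": 22,
    "Decision Error": 9,
    "Violation": 3,
    "Technical Environment": 1,
}

total = sum(counts.values())  # 100 HF-attributed occurrences
cumulative = 0.0
pareto = []
# Sort categories from most to least frequent and accumulate their shares.
for category, n in sorted(counts.items(), key=lambda kv: kv[1], reverse=True):
    share = 100 * n / total
    cumulative += share
    pareto.append((category, n, round(share, 1), round(cumulative, 1)))

for row in pareto:
    print(row)
```

Running this shows that skill, environment and perception together account for 87% of the HF-attributed occurrences, which is why these three areas dominate the discussion above.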
The number of accidents attributable to skill level, or the lack of it, across most sectors of the aviation industry, including RPASs, general aviation and all powered aircraft, is worryingly high. Whilst RPASs have the lowest proportion of skill error accidents amongst the three groupings, it remains at a level that should be considered unacceptable, and efforts should be made to lower this cause of safety occurrences. The breakdown of the causes of the skill errors indicates that poor technique and handling skills are by far the biggest cause of unsafe acts by RPAS pilots, with nearly two-thirds of these unsafe acts caused by a lack of handling ability. This reflects the training environment and the preparedness of student RPAS pilots for the operational challenges awaiting them. As the database covers the earliest days of drone operations, the lack of flying skills could have arisen from an earlier "park-flying" mentality in RPAS flying. The accessibility and affordability of drones made purchasing one easy and simple [42]. The ab initio RPAS pilot could take the newly acquired drone to the local park and engage in self-instruction, resulting in drone pilots who were less than optimally prepared for operations. As Bartsch [42] comments, there was a lack of civilian operating experience that could be drawn upon to shape RPAS pilot qualifications.
The influence of perceptual errors on RPAS operations is seen in these accidents, being the second largest cause of unsafe acts. The high number of depth perception errors indicates the difficulty of flying an aircraft when not co-located with it. A comparison with the causes of accidents for powered aircraft, and specifically general aviation aircraft, confirms the perceptual difficulties pilots face when not co-located with their aircraft: nearly 30% of RPAS occurrences were linked to perceptual issues, compared with 16% for GA aircraft and 6% for all powered aircraft. Judging depth is a challenging task that becomes progressively more difficult the further the aircraft is from the pilot and in cluttered flying environments, such as operating in and around buildings. If the full potential of remotely piloted aircraft is to be realized, aircraft will be flown at large distances from the pilot in VLOS operations and even further in EVLOS and BVLOS operations. For these operations, judging distance from obstacles becomes more difficult for the pilot.
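As a rough illustration of the uncertainty around the RPAS perceptual-error proportion (22 of the 75 RPAS unsafe acts in Table 4), a Wilson score confidence interval can be computed. This interval is not reported in the study; it is an illustrative sketch only, using the standard Wilson formula.

```python
import math

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Perceptual errors among the 75 RPAS unsafe acts (Table 4).
low, high = wilson_interval(22, 75)
print(f"RPAS perceptual-error share: {22/75:.1%} (95% CI {low:.1%} to {high:.1%})")
```

Even at the lower end of this interval (about 20%), the RPAS perceptual-error share sits above the 16% reported for GA aircraft, which is consistent with the comparison drawn above.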
The large impact of skill error and perception-caused accidents indicates that the training and testing regimes for ab initio RPAS pilots warrant continuing examination to ensure graduating RPAS operators are prepared for the ever-increasing operational challenges they will face. With the maturing of the RPAS sector of Australian aviation, CASA has introduced a comprehensive RPAS training syllabus with both experiential and practical competence requirements. The findings of this LFI study suggest that continued emphasis on the development of practical flying skills will assist in improving RPAS safety outcomes. Additional priority given to depth perception training would also be warranted to develop safe RPA flying practices.

6. Reporting Systems

Developing a culture of reporting is an important step for the future safety of RPAS operations. Sieberichs and Kluge [43] identified that too few incidents are reported in the aviation sphere. Barriers to reporting include personal motivation, a lack of recognition of the benefits of LFI and potential negative consequences for the reporting person [44]. The type of incident [43] and its perceived inconsequential nature [45] also influence whether a pilot will report it. Within Australian conventionally crewed aviation, it was found that nearly one-third of responding commercial airline pilots did not report safety occurrences to their airline's voluntary reporting system. The reasons provided were fear of the report being used against the pilot by either the company or the regulator, and the effort required to make a report outweighing the perceived benefits. Underreporting the occurrence, that is, providing only partial or minimal information, was also an issue raised in the responses from the commercial pilots [37]. Walton and Henderson [38] identified that over 85% of RPAS incidents in New Zealand between 2015 and 2022 were not reported through any reporting system. The largest reason for non-reporting was the RPAS operator not considering the incident serious enough to warrant a report. A lack of regulatory requirements was also identified as contributing to poor reporting. The researchers identified education regarding the importance of reporting as a way forward to improving low RPAS reporting rates.
This study utilized a postincident analysis methodology with the goal of LFI. For this approach to succeed, incidents must be reported by RPAS operators. Of the 290 reports contained in the ATSB database, the vast majority (96.6%) are self-reports from the RPAS pilots involved in the occurrence. Up until the end of 2019, only 10 of the reports arose from an independent investigation carried out by the investigatory authority in Australia. Self-reporting has its limitations: it is a natural response for people to avoid being overly damning of their own actions and to believe that those actions could not have led to an incident or accident. A previous study examining causes of accidents within New Zealand aviation found limitations in pilot self-reports [46]. As Zotov [46] comments:
In some cases, the pilots may have been genuine in their belief of what happened, but (whether from incomplete perception or lack of knowledge) that belief was at variance with the facts found by the official investigators. In other cases, the investigators openly expressed their disbelief in the pilots’ veracity.
(p. 73)
The value of using confidential incident reports might be reduced by the limitations of self-reporting by the pilots involved. As the RPAS sector of aviation grows and matures, it is hoped that funding will be made available for investigatory bodies to complete independent investigations of not only accidents but also seemingly minor incidents, enhancing the lessons that can be taken from these unfortunate occurrences. This, however, should not deter RPAS operators from self-reporting safety occurrences; continuing to report RPAS incidents and accidents should remain a priority for RPAS operators.

7. Conclusions

RPASs are the newest arrivals in the aviation industry, and the sector's leading causes of incidents and accidents show similarities to conventionally crewed aviation. From the beginnings of the history of flight, technical issues were the largest cause of accidents, and so it is proving to be with RPASs. As aviation matured, the leading causes of accidents changed, and HFs are now understood to play a large part in accidents. While this swing has not yet happened in the RPAS sector, it can be expected to. Despite the pilot being remotely located away from the aircraft, a human is still involved in both the development of the technology and the planning and flying of RPAS missions. This study is a first step in understanding which human factors are at play in the causation of RPAS incidents and accidents. Using a taxonomy well utilized in aviation, for which it was developed, and in other industries, it was found that the flying skill of the remote pilot contributes to many occurrences. This is not unique to the RPAS sector, as skill is also a large contributor to accidents across all powered-flight segments of the aviation industry, including general aviation. A further challenge for RPAS pilots, though not solely limited to them, is the perceptual difficulty of flying an aircraft at a distance and operating close to obstacles without colliding with them.

Author Contributions

Conceptualization, J.M. and G.W.; methodology, J.M. and G.W.; validation, J.M., G.W., K.J. and S.R.; formal analysis, J.M. and G.W.; investigation, J.M. and G.W.; resources, J.M. and G.W.; data curation, J.M. and G.W.; writing—original draft preparation, J.M.; writing—review and editing, J.M., G.W., K.J. and S.R.; visualization, J.M. and G.W.; supervision, G.W., K.J. and S.R.; project administration, G.W. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by an Australian Government Research Training Programme (RTP) Scholarship.

Data Availability Statement

The data analyzed were from ATSB and are available under a Creative Commons Attribution 3.0 Australia license. Upon request to the authors, the ATSB database of RPAS reports can be made available.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Macro Source Media. The Global Unmanned Aerial Vehicles (UAV) Market Report; Macro Source Media: Savannah, GA, USA, 2020; ISSN 2206-4052.
  2. Civil Aviation Safety Authority. Remotely Piloted Aircraft Systems—Licensing and Operations. In Advisory Circular AC 101-01 v6.0; Civil Aviation Safety Authority: Canberra, Australia, 2024.
  3. Özyoruk, H.E. Systematic Analysis and Classification of the Literature Regarding the Impact of Human Factors on Unmanned Aerial Vehicles (UAV). J. Aviat. 2020, 4, 71–81.
  4. Wild, G.; Murray, J.; Baxter, G. Exploring Civil Drone Accidents and Incidents to Help Prevent Potential Air Disasters. Aerospace 2016, 3, 22.
  5. Wild, G.; Gavin, K.; Murray, J.; Silva, J.; Baxter, G. A post-accident analysis of civil remotely-piloted aircraft system accidents and incidents. J. Aerosp. Technol. Manag. 2017, 9, 157–168.
  6. Cooke, N.J.; Pedersen, H.K. Unmanned Aerial Vehicles. In Handbook of Aviation Human Factors; Wise, J.A., Hopkin, V.D., Garland, D.J., Eds.; CRC Press: Boca Raton, FL, USA, 2009.
  7. Elbanhawi, M.; Mohamed, A.; Clothier, R.; Palmer, J.L.; Simic, M.; Watkins, S. Enabling technologies for autonomous MAV operations. Prog. Aerosp. Sci. 2017, 91, 27–52.
  8. O’Hare, D.; Wiggins, M.; Batt, R.; Morrison, D. Cognitive failure analysis for aircraft accident investigation. Ergonomics 1994, 37, 1855–1869.
  9. Helmreich, R.L.; Foushee, H.C. Why Crew Resource Management? Empirical and Theoretical Bases of Human Factors Training in Aviation. In Cockpit Resource Management; Wiener, E.L., Kanki, B.G., Helmreich, R.L., Eds.; Academic Press: Cambridge, MA, USA, 1993; pp. 3–45.
  10. ICAO. Investigation of Human Factors in Accidents and Incidents. In Human Factors Digest 7, Circular 240-AN/144; ICAO: Montreal, QC, Canada, 1994.
  11. MacPherson, E. Is the World Ready for Drones? Air Space Law 2018, 43, 149–178.
  12. Dekker, S. The Field Guide to Understanding “Human Error”; Ashgate: Farnham, UK, 2014.
  13. Paries, J.; Amalberti, R. Aviation Safety Paradigms and Training Implications. In Cognitive Engineering in the Aviation Domain; Sarter, N.B., Amalberti, R., Eds.; CRC Press: Boca Raton, FL, USA, 2000.
  14. Reason, J. Managing the Risks of Organizational Accidents; Routledge: London, UK, 1997.
  15. Reason, J. Human Error; Cambridge University Press: Cambridge, UK, 1990.
  16. Eurocontrol. Revisiting the Swiss Cheese Model of Accidents. In EEC Note 13/06; Eurocontrol: Brussels, Belgium, 2024.
  17. Reason, J. The Human Contribution: Unsafe Acts, Accidents and Heroic Recoveries; Ashgate: Farnham, UK, 2008.
  18. Wiegmann, D.A.; Shappell, S.A. A Human Error Approach to Aviation Accident Analysis: The Human Factors Analysis and Classification System; Ashgate Publishing: Farnham, UK, 2003.
  19. Larouzee, J.; Le Coze, J.C. Good and bad reasons: The Swiss cheese model and its critics. Saf. Sci. 2020, 126, 104660.
  20. Lenné, M.G.; Ashby, K.; Fitzharris, M. Analysis of general aviation crashes in Australia using the Human Factors Analysis and Classification System. Int. J. Aviat. Psychol. 2008, 18, 340–352.
  21. Wiegmann, D.; Faaborg, T.; Boquet, A.; Detwiler, C.; Holcomb, K.; Shappell, S. Human Error and General Aviation Accidents: A Comprehensive, Fine-Grained Analysis Using HFACS. In Technical Report No. DOT/FAA/AM-05/24; Office of Aerospace Medicine: Washington, DC, USA, 2005.
  22. Shappell, S.; Detwiler, C.; Holcomb, K.; Hackworth, C.; Boquet, A.; Wiegmann, D. Human Error and Commercial Aviation Accidents: A Fine-Grained Analysis Using HFACS. In Technical Report No. DOT/FAA/AM-06/18; Office of Aerospace Medicine: Washington, DC, USA, 2006.
  23. Gaur, D. Human factors analysis and classification system applied to civil aircraft accidents in India. Aviat. Space Environ. Med. 2005, 76, 501–505.
  24. Australian Transport Safety Bureau. Human Factors Analysis of Australian Aviation Accidents and Comparison with the United States. In Aviation Research and Analysis Report B2004/0321; Australian Transport Safety Bureau: Canberra, Australia, 2007.
  25. Hulme, A.; Stanton, N.A.; Walker, G.H.; Waterson, P.; Salmon, P.M. Accident Analysis in Practice: A Review of Human Factors Analysis and Classification System (HFACS) Applications in the Peer Reviewed Academic Literature. Proc. Hum. Factors Ergon. Soc. Annu. Meet. 2019, 63, 1849–1853.
  26. Olsen, N.; Williamson, A. Application of classification principles to improve the reliability of incident classification systems: A test case using HFACS-ADF. Appl. Ergon. 2017, 63, 31–40.
  27. Acord, C. ASMIS Gains Powerful New Tool–HFACS 8.0. U.S. Army. 2024. Available online: https://www.army.mil/article/273512/asmis_gains_powerful_new_tool_hfacs_8_0 (accessed on 28 January 2025).
  28. Herz, R.P. Assessing the Influence of Human Factors and Experiences on Predator Mishaps. Ph.D. Thesis, Northcentral University, San Diego, CA, USA, 2008.
  29. Stark, B.; Coopmans, C.; Chen, Y. Concept of Operations for Personal Remote Sensing Unmanned Aerial Systems. J. Intell. Robot. Syst. 2012, 69, 5–20.
  30. Hobbs, A. Remotely Piloted Aircraft. In Handbook of Human Factors in Air Transportation Systems; Landry, S.J., Ed.; CRC Press: Boca Raton, FL, USA, 2018.
  31. Arrabito, G.; Hou, M.; Banbury, S.; Martin, B.; Ahmad, F.; Fang, S. A review of human factors research performed from 2014 to 2017 in support of the Royal Canadian Air Force remotely piloted aircraft system project. J. Unmanned Veh. Syst. 2020, 9, 1–20.
  32. Renshaw, P.; Wiggins, M. The sensitivity of cue utilization when learning to operate a visual line of sight multi-rotor remotely piloted aircraft. Int. J. Ind. Ergon. 2020, 77, 102953.
  33. Tvaryanas, A.P.; Thompson, W.T.; Constable, S.H. Human factors in remotely piloted aircraft operations: HFACS analysis of 221 mishaps over 10 years. Aviat. Space Environ. Med. 2006, 77, 724–732.
  34. Drupsteen, L.; Guldenmund, F.W. What is learning? A review of the safety literature to define learning from incidents, accidents and disasters. J. Contingencies Crisis Manag. 2014, 22, 81–96.
  35. Bartsch, R.; Coyne, J.; Gray, K. Drones in Society: Exploring the Strange New World of Unmanned Aircraft; Routledge: London, UK, 2017.
  36. Flanagan, J.C. The critical incident technique. Psychol. Bull. 1954, 51, 327–358.
  37. McMurtrie, K.; Molesworth, B. Australian Flight Crews’ Trust in Voluntary Reporting Systems and Just Culture Policies. Aviat. Psychol. Appl. Hum. Factors 2018, 8, 11–21.
  38. Walton, C.; Henderson, I. Safety occurrence reporting amongst New Zealand uncrewed aircraft users. Eng 2023, 4, 236–258.
  39. Kain, D.L. Owning Significance: The Critical Incident Technique in Research. In Foundations for Research: Methods of Inquiry in Education and the Social Sciences; de Marrais, K., Lapan, S.D., Eds.; Lawrence Erlbaum: Mahwah, NJ, USA, 2003; pp. 69–85.
  40. Australian Transport Safety Bureau. Loss of Control Involving Remotely Piloted Aircraft Pulse Aerospace Vapor 55: AO-2016-128; Australian Transport Safety Bureau: Canberra, Australia, 2017.
  41. Feltman, K.A.; Curry, I.P.; Kelley, A.M. A Review of US Army Unmanned Aerial Systems Accidents. Aviat. Psychol. Appl. Hum. Factors 2020, 10, 24–28.
  42. Bartsch, R. Unmanned and Uncontrolled: The Commingling Theory and the Legality of Unmanned Aircraft System Operations. Master’s Thesis, University of Sydney, Sydney, Australia, 2016.
  43. Sieberichs, S.; Kluge, A. Why learning opportunities from aviation incidents are lacking. Aviat. Psychol. Appl. Hum. Factors 2021, 11, 33–47.
  44. Margaryan, A.; Littlejohn, A.; Stanton, N.A. Research and development agenda for Learning from Incidents. Saf. Sci. 2017, 99, 5–13.
  45. de Boer, R.J.; Hurts, K. Automation surprise: Results of a field survey of Dutch pilots. Aviat. Psychol. Appl. Hum. Factors 2017, 7, 28–41.
  46. Zotov, D. Pilot Error: Cognitive Failure Analysis. Master’s Thesis, Massey University, Palmerston North, New Zealand, 1997.
Figure 1. Chi-squared test for HFACS analysis.
Figure 2. Pareto Analysis for HFACS Levels 1 and 2.
Figure 3. Pareto Analysis for HFACS Level 1.
Figure 4. Pareto Analysis of Causes of Skill-based errors.
Figure 5. Comparison of Unsafe Acts between sectors of Aviation.
Table 1. Frequency of Human Factor Accidents in HFACS taxonomy.

HFACS Level                    HFACS Sub-Level          Count (n)   % of All
1. Unsafe Acts                 Skill-Based Error        41          14.1
                               Decision Error           9           3.1
                               Perceptual Error         22          7.5
                               Violation                3           1.0
2. Preconditions               Physical Environment     24          8.3
                               Technical Environment    1           0.3
3. Unsafe Supervision                                   0
4. Organizational Influences                            0
Total                                                   100
Table 2. Causes of Skill-Based errors.

Skill-Based Errors            Count (n)   %
Breakdown in visual scan      2           4.8
Omitted checklist item        3           7.3
Overreliance on automation    4           9.8
Poor airmanship               5           12.2
Poor technique                26          63.4
Inadvertent use of controls   1           2.4
Total                         41
Table 3. Causes of Decision Errors.

Decision Errors                            Count (n)   %
Inadequate system knowledge                3           33.3
Inappropriate procedure                    4           44.4
Exceeded ability in the ambient conditions 1           11.1
Inadequate preflight                       1           11.1
Total                                      9
Table 4. Comparison between sectors of the aviation industry.

                   RPAS               GA Aircraft        Powered Aircraft
                   Count (n)   %      Count (n)   %      Count (n)   %
Skill error        41          54.6   103         60.9   1180        84
Decision error     9           12     60          35.5   464         33
Perceptual error   22          29.3   27          16     85          6.1
Violation          3           4      27          16     108         7.7
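Figure 1 refers to a chi-squared test for the HFACS analysis. The exact test the authors ran is not reproduced here, but a generic Pearson chi-squared computation over the RPAS and GA unsafe-act counts from Table 4 can be sketched as follows. Treating the column sums as the group totals is an assumption for illustration only, since occurrences can have multiple coded causes.

```python
# 4x2 contingency table of unsafe-act counts (RPAS vs. GA, from Table 4).
rpas = [41, 9, 22, 3]    # skill, decision, perceptual, violation
ga = [103, 60, 27, 27]

grand_total = sum(rpas) + sum(ga)
col_totals = [sum(rpas), sum(ga)]
row_totals = [r + g for r, g in zip(rpas, ga)]

# Pearson chi-squared statistic: sum of (observed - expected)^2 / expected.
chi_sq = 0.0
for i, row_total in enumerate(row_totals):
    for j, observed in enumerate([rpas[i], ga[i]]):
        expected = row_total * col_totals[j] / grand_total
        chi_sq += (observed - expected) ** 2 / expected

dof = (len(rpas) - 1) * (2 - 1)  # (rows - 1) * (cols - 1)
print(f"chi-squared = {chi_sq:.2f}, dof = {dof}")
```

Under these assumptions the statistic comfortably exceeds the 0.05 critical value for 3 degrees of freedom (7.81), consistent with the distribution of unsafe-act types differing between the RPAS and GA sectors.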

Share and Cite

MDPI and ACS Style

Murray, J.; Richardson, S.; Joiner, K.; Wild, G. Identifying Human Factor Causes of Remotely Piloted Aircraft System Safety Occurrences in Australia. Aerospace 2025, 12, 206. https://doi.org/10.3390/aerospace12030206