Article

UXO-AID: A New UXO Classification Application Based on Augmented Reality to Assist Deminers

Department of Computer Science, College of Computer Science and Mathematics, Tikrit University, Tikrit 34001, Iraq
*
Author to whom correspondence should be addressed.
Computers 2022, 11(8), 124; https://doi.org/10.3390/computers11080124
Submission received: 28 June 2022 / Revised: 8 August 2022 / Accepted: 16 August 2022 / Published: 19 August 2022
(This article belongs to the Special Issue Advances in Augmented and Mixed Reality to the Industry 4.0)

Abstract:
Unexploded ordnance (UXO) is a worldwide problem and a long-term hazard because of its ability to harm humanity by remaining active and destructive decades after a conflict has concluded. In addition, the current UXO clearance methods mainly involve manual clearance and depend on the deminer’s experience. However, this approach has a high misclassification rate, which increases the likelihood of an explosion ending the deminer’s life. This study proposes a new approach to identifying UXO based on augmented reality technology. The methodology is presented in two phases. Firstly, a new dataset of UXO samples is created by printing 3D samples and building a 3D model of the object data file with accurate data for the 3D printed samples. Secondly, the development of the UXO-AID mobile application prototype, which is based on augmented reality technology, is provided. The proposed prototype was evaluated and tested with different methods. The prototype’s performance was measured at different light intensities and distances. The testing results revealed that the application could perform successfully in excellent and moderate lighting at distances of 10 to 30 cm. As for recognition accuracy, the overall recognition success rate reached 82.5%, as the disparity in the number of features of each object affected the accuracy of object recognition. Additionally, the application’s ability to support deminers was assessed through a usability questionnaire submitted by 20 deminers. The questionnaire was based on three factors: satisfaction, effectiveness, and efficiency. The proposed UXO-AID mobile application prototype supports deminers in classifying UXO accurately and in real time, reducing the cognitive load of complex tasks. UXO-AID is simple to use, requires no prior training, and takes advantage of the wide availability of mobile devices.

1. Introduction

Unexploded ordnance (UXO) is a global problem and an ongoing threat due to the possibility of its remaining active and potentially explosive even decades after a conflict has ended, as reported by L. Safatly et al. [1]. The most significant problem posed by UXO is that it endangers civilian lives. Furthermore, the risk of explosives impacts the lives of troops and explosives specialists, as well as the country’s development. Unexploded ordnance accumulates in ongoing wars, such as the Russian–Ukrainian conflict: more than 54,000 unexploded ordnance items were located and destroyed in the first month of the Ukraine–Russia crisis [2], Ukraine being one of the countries most heavily contaminated by unexploded ordnance. The Landmine and Cluster Munition Monitor published its 23rd annual report in 2021 [3], which stated that 2020 was the sixth consecutive year since 2015 to record a high number of casualties, owing to increased conflict and unexploded ordnance contamination. The report also stated that 80% of the casualties were civilians (4437), with children constituting at least half of the civilian casualties (1872), while the remaining 20% were military casualties (1105). Furthermore, as stated by the Geneva International Centre for Humanitarian Demining (GICHD), over 60 countries are contaminated with UXO [4].
Demining or explosive removal is the sole option for eliminating the risks of unexploded ordnance, despite the danger, time, and cost involved. Demining approaches, in general, are classified into three types: mechanical clearance, robotic clearance, and manual clearance, according to R. Achkar et al. [5]. Animal and machine clearance methods have been increasingly used in demining operations. However, most UXO and ERW are still removed using the manual clearance method, as reported by M. A. V. Habib [6]. Therefore, it is impossible to dispense with the intervention of specialist human operators in the removal of mines and explosives, despite the threat of injury or loss of life. According to reports [3] from 2017 to 2020, over 250 human operators were killed or injured, although the number may be higher, since the reports’ coverage was limited to specific regions.
Recently, researchers have developed artificial intelligence-based strategies to assist specialists and human operators in detecting explosives. K. Tbarki et al. [7] used one-class classification to detect and locate whether a buried object was UXO or clutter. The GPR data were used as input to the classifier to classify whether the detected object was a UXO. The authors evaluated the proposed method by conducting a comparison study with other methods. Similarly, K. Tbarki et al. [8] used a support vector machine (SVM) for landmine detection. The GPR data were utilised as input to the SVM. The authors measured the performance of their proposed method with various metrics, such as receiver operating characteristic (ROC) curves and running time. The results indicated that the method was successful in landmine detection. The online dictionary learning technique was developed by F. Giovanneschi et al. [9] to form the received GPR data into sparse representations (SR) to improve the feature extraction process for successful landmine detection. This method takes advantage of the fact that much of the training data are likely correlated. The authors conducted a comparison study with three online algorithms to evaluate the proposed method. F. Lombardi et al. [10] presented adaptable demining equipment based on GPR sensors and employed a convolutional neural network (CNN) to process the GPR data to detect buried UXO. The results of the experiments showed excellent accuracy in distinguishing between UXO and clutter objects.
Metal Mapper is another sensor used to detect buried objects. J. B. Sigman et al. [11] developed an automatic detection approach based on the Metal Mapper sensor coupled with the supervised learning technique of a naïve Bayes classifier. The proposed method can automatically detect UXO without requiring user intervention, reducing cost and time. Furthermore, the minimum connected component (MCC) approach based on the mathematical concept of graph theory was presented in V. Ramasamy et al. [12] to detect buried UXO. The method explored the 2D image output from the GPR sensor. The proposed method demonstrated its effectiveness in feature extraction on the training datasets. Another form of hardware used is a metal-detector handheld device designed to identify suspicious objects containing metallic components. L. Safatly et al. [1] proposed a landmine recognition and classification method. A robotic system with a metal detector was designed to build the dataset. The authors used several machine learning algorithms, such as boosting, bagging, and CNN, to evaluate the system’s precision in discriminating between UXO and clutter objects. Computer vision was also explored to detect and classify landmines. R. Achkar et al. [5,13] proposed a robot to detect landmines and identify the type and model by employing a neural network. A. Lebbad et al. [14] suggested a system based on computer vision focused on landmine classification issues by developing an image-based technique that utilised neural networks trained on a self-built and limited dataset.
Most researchers focused on detecting the explosives or distinguishing between UXO and clutter, ignoring the identification of UXO and its properties. Identifying UXO is critical to assisting operators in the minefield in avoiding mistakes and thus saving their lives by providing valuable information regarding the recognised object. Therefore, developing an application capable of identifying explosives and providing the minefield operator with information about the recognised object is required.
Augmented reality (AR) is a promising technology for supporting users in performing complex tasks and activities due to its ability to incorporate digital information with users’ real-world perceptions, as shown by E. Marino et al. [15]. AR is becoming increasingly widespread in supporting operators in the workplace by reducing human mistakes and reducing reliance on operator memory, according to D. Ariansyah et al. [16]. AR applications have been developed and effectively deployed in a variety of fields, such as education, as in M. N. I. Opu et al. [17], heritage visualisation, as in G. Trichopoulos et al. [18], and training, as in H. Xue et al. [19].
Various AR applications were proven to be effective, valid, practical, and reliable approaches to risk identification, safety training, and inspection in the study by X. Li et al. [20]. AR can recognise unsafe settings and produce potential scenarios and visualisations using conventional safety training methods, as shown by K. Kim et al. [21]. AR is also employed as a platform for presenting immersive visualisations of fall-related risks on construction sites, as reported by R. Eiris Pereira et al. [22]. Furthermore, the capabilities of AR in presenting various visual information in real time have proven to be an advantage in emergency management and a better alternative than traditional methods, such as maps, according to Y. Zhu et al. [23].
In the literature, two studies based on AR are presented with relevance to the UXO field. T. Maurer et al. [24] developed a prototype that integrates AR with embedded training abilities into handheld detectors, improving the training process by enhancing the operator’s visualisation with AR in order to examine the locations of a previously scanned area. Golden West Design Lab has implemented a marker-based AR system (AROLS) in Vietnam to train local minefield operators. A set of markers was designed to start the AR process, as reported by A. D. J. T. J. o. C. W. D. Tan [25]. Both studies provided an AR application for training minefield operators that was only intended to be used indoors. Furthermore, these applications were unable to identify UXO or offer relevant information, and therefore did not support UXO deminers during the clearance process in real time. To the best of the authors’ knowledge, there is no application designed to identify UXO types using the AR technique.
This research proposes a new augmented-reality-based approach for identifying UXO types in real-time, enhancing field operators’ productivity, and assisting in the disposal process. The main contributions of this paper are as follows.
  • The proposed application presents a unique, innovative, and inexpensive UXO classification method through AR technology.
  • The proposed application provides information in real time related to the detected object.
  • It can reduce the risk imposed on deminers during UXO clearance operations by displaying visual information illustrating the type and the components of the UXO.
  • An evaluation study in a different setting and a questionnaire were conducted to measure the performance and usability of our proposed application.
This paper is organised as follows: Section 2 defines the research background. Section 3 presents a detailed description of the methodology. The experimental process is described in Section 4. Section 5 defines the evaluation of the application through a usability test. Finally, the conclusion and future work are presented in the final section.

2. Research Background

This section discusses UXO activities such as risk management, clearance methods, and UXO types. Finally, we review the AR-based assistance applications with an object recognition approach.

2.1. UXO Risk Assessment and Clearance Methods

UXO risk is of two main kinds. The first risk is related to the hazard of explosion: humans are injured, dismembered, or killed when exposed to explosive UXO. The second risk is the damage caused to the soil and the environment due to the leakage of chemical material into soil and water. Furthermore, the types of UXO located at a site differ greatly based on the kinds of wars, conflicts, and military training that happened at the location; UXO can range from small ordnance to extremely large items. Furthermore, every UXO contains a different charge of explosive material. Before any operations occur, every governmental and humanitarian organisation assesses the possibility of the existence of UXO. UXO risk assessment can be defined as analysing and evaluating the likelihood of detecting UXO, as asserted by J. MacDonald et al. [26].
The UXO risk assessment operations consist of four main stages (see Figure 1): preliminary assessment, comprehensive assessment, UXO mitigation plan, and UXO collection followed by detonation, with each stage depending on the outcome of the previous one. In the preliminary assessment stage, a simple scanning inspection is conducted by checking whether there is a history of military activity, such as training and weapons testing at the site, or whether the site was part of a war conflict. Finally, an inspection is carried out regarding any reports of UXO detection at that site.
The outcome of this stage is to determine whether there is a need for a comprehensive assessment. If a thorough evaluation is required, a detailed scanning and in-depth groundwork are conducted to determine if there is a high potential risk of UXO on the site. When the outcome of this stage indicates that the site is probably contaminated, a UXO clearance plan is devised that includes UXO detection and classification. After detecting and classifying the UXO, the deminers collect and transfer the various UXO to a secure and isolated place. The authorities set a date to destroy the collected UXO and announce the date in a public announcement to protect public safety and avoid possible accidental explosions.
According to H. Kasban et al. [27], UXO detection has different methods, including electromagnetic, mechanical, and biological detection. Metal detectors, GPR sensors, and IR techniques are electromagnetic detecting methods. Mechanical detection methods include using mechanical equipment, such as a vehicle, that moves the UXO from its location, which causes the explosion of UXO. Biological detection approaches include dogs, bees, rodents, bacteria, and plants. The effectiveness of each method is measured according to the nature of the contaminated soil and the features of the UXO.

2.2. AR with Object Recognition

Augmented reality (AR) and object recognition are among the most advanced technologies that have seized the attention of researchers, and the integration of these two technologies enables a unique approach to solving diverse problems. Driven by Industry 4.0, manufacturing and operator assistance in performing tasks and training have benefited most from this integration: AR is an appropriate method for assisting operators during the performance of complex activities by offering guidance that makes their performance more efficient and helps reduce human errors caused by distraction or inexperience. In such systems, object recognition has been accomplished using Mask R-CNN, as reported by K.-B. Park et al. [28], YOLO, as reported by H. Bahri et al. [29], and R-CNN, as reported by Z.-H. Lai et al. [30].
L. Zheng et al. [31] introduced an AR system to support cable assembling and inspection operations. The system used deep learning to detect and identify cable brackets and CNN to capture the labels on the cables simultaneously; AR presents visual guidance in real time. Moreover, AR combined with object recognition are evolving as a favoured approach for training and technical support because of its capability to provide an intuitive method to convey detailed information, as shown by C. Piciarelli et al. [32] and B. Zhou et al. [33].
Another domain that utilises AR with object recognition is that of driving assistance systems. In R. Anderson et al. [34], YOLO and Viola–Jones were the two detection methods used to detect roads and different obstacles, while the AR component displayed holograms to increase the driver’s awareness, hence improving driver safety and helping to reduce fatal accidents. Similarly, L. Abdi et al. [35] presented an in-vehicle assistance system that consists of three parts: a wireless network, AR virtual information projected on the car windshield, and a CNN for object detection.
In the learning and education domain, the integration of AR and object recognition is a promising medium that allows students to grasp information quickly. Moreover, the learning process itself becomes more engaging and pleasant. B. Huynh et al. [36] presented an in-situ language learning framework. The proposed framework used SSD as an object recognition method and an AR component that attached a virtual label in a different language to the detected object. In E. F. Rivera et al. [37], an AR application was developed as a learning and training tool for studying automotive engineering. The objective was to demonstrate to the students the various driving conditions and the contrasting functions and statuses of the power-splitting device. The application was designed with the Vuforia SDK object scanner and CAD software to create 3D models.

3. Methodology

This study aimed to design and develop an augmented reality (AR) application to support deminers by classifying the type of UXO and providing virtual information to assist the deminers in making the appropriate decisions, thus reducing the risks.
The application identifies UXO and provides contextual information to the operators. This information can take the form of different mediums, such as text, images, 3D models, animation, and videos. To design and build the proposed application for aiding the demining operation, we needed to collect all the information related to the deminer workflow and the procedure used in classifying UXO. Hence, we conducted unstructured interviews with deminers to understand the limitations and problems encountered during their operations and to collect information. In addition, we used their suggestions as a guideline to build the application. As for the technological aspect, we chose AR techniques to develop the virtual assistance content. AR technology met all the requirements for showing all the required information effectively and in real time. With the benefit of AR techniques, the deminers viewed the virtual information in the real world. Finally, we evaluated the application through a qualitative-analysis-based questionnaire presented to 20 minefield operators to measure the usability of the proposed prototype. Figure 2 depicts the methodology used in this study.

3.1. Data Gathering

Data gathering involved conducting interviews to collect valuable information associated with the study. The interview procedure is an appropriate and productive method for obtaining knowledge and gathering information on a specific issue. Therefore, we interviewed experts in UXO detection and clearance. Consequently, an unstructured interview was conducted with two experts working at the Explosive Control Directorate (EOD)—Salaheddin department. They shared their knowledge and experience of UXO clearance and classification and emphasised the significance of various aspects of the UXO procedures. In addition, they provided their thoughts and suggestions to help decide on the most suitable approach to designing the system and the appropriate aid required to support their operations in the field. According to the experts’ feedback, the procedure of identifying and classifying UXO is outlined in a paper manual comprising simple illustrations and labels assigned to different components of UXO. After completing the interview, photos of UXO samples were acquired (see Figure 3). However, working with actual samples of UXO was not possible because the detonator triggers were still active, which imposed safety issues.

3.2. Dataset Collection

Datasets are fundamental to fostering AI development, giving results scope, robustness, and confidence. This study’s dataset collection process included two phases: 3D printing of samples and building a 3D mobile-based model.

3.2.1. 3D Printing Sample

Considering the danger of working with active UXO and the authors’ lack of experience in manually handling UXO, 3D printing technology was chosen to create the UXO samples. 3D printing has evolved swiftly in recent years, making it possible for ideas to transfer to reality quickly. Furthermore, light and elastic printing materials, such as ABS or PLA, are inexpensive and contribute to reducing production costs. For this purpose, the Creality 3D CR-10S Pro V2 printer with FDM 3D printing technology using polylactic acid (PLA) material was used to print the UXO samples. Four types of UXO were chosen as the reference objects, namely, PMN-2, VS-MK, VS-50, and 45 mm mortars, as shown in Figure 4. We chose these UXO samples based on the advice of the deminers we met during the data gathering process, since they are among the most prevalent UXO found in minefields. In addition, the four types selected differed in size and shape, helping in the evaluation of the proposed prototype’s performance. Figure 5 illustrates the samples of unexploded ordnance chosen in this study.
Three-dimensional printing processes involve two stages, namely modelling and printing. In the modelling phase, we designed the UXO model utilising a computer-aided design (CAD) software package. The object model was saved in stereolithography (STL) format when finished. The item was built in the second phase (i.e., the printing phase). To begin printing the specified UXO, the STL file was uploaded to the 3D printing machine, which uses the instructions in the file to determine where and how the material is deposited.

3.2.2. 3D Mobile-Based Model

After creating the UXO samples, the dataset can be built using the Android application Vuforia Object Scanner, which operates as a scanner by using the mobile camera to make a 3D model. The scanning process creates the object data (OD) file that includes accurate data for targeting objects in the Target Manager. A series of prerequisites must be completed for a successful scanning operation, such as acquiring and printing the Object Scanning Target from the Vuforia website. Furthermore, the environment must be free from background noise, with moderate and uniform lighting. Therefore, a grey background was used to minimise the noise. In addition, several scanning processes were conducted to obtain the best result. Once the scanning process was completed, the OD file was saved for future import into the Unity software. Figure 5 shows the scanning process.

3.3. UXO-AID Overview

The core of the presented mobile application is illustrated in Figure 6. The application layout is simple, providing a user-friendly interface that allows quick perception. Firstly, feature extraction is applied to the object of interest from diverse viewpoints and distances, and the result is saved to the application’s database as a 3D target model. UXO-AID then acquires a frame from the mobile phone camera, extracts the features, and performs a matching process between the extracted features and the features of the 3D model saved in the database. In the event of a successful matching process, the object is successfully detected and recognised. Then the virtual graphics and text are displayed. The virtual content guides the minefield operator to the location of the detonating trigger in the UXO. Moreover, additional information is provided about the model, the type, and how the UXO is actuated.
In addition, the operational flow of the UXO recognition process using the smartphone’s camera and the sequence of actions taken by the user for successful recognition are shown in Figure 7. The steps illustrated in the flowchart are:
  • Open camera: This is the first step, where the deminer opens the mobile phone’s camera.
  • Scan object: After opening the camera, the deminer must point the camera to identify the UXO type.
  • Object detection: The camera recognises the UXO’s appearance, which matches the target model already stored in the database. If the detected object is identified, the operation proceeds to the next step; if not, the operation returns to the scanning stage.
  • Render virtual information: Once the UXO type is recognised, the deminer sees all the relevant information of the UXO type.
  • Screenshot: After viewing the virtual information, the deminer can take a screenshot of the augmented view, which is saved on the mobile phone’s memory.
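The detect-or-rescan loop in the flowchart above can be sketched as follows. This is an illustrative sketch only: the actual application is built in Unity/C# with Vuforia, and the feature sets and the `min_matches` threshold used here are assumptions for demonstration.

```python
# Illustrative sketch of the UXO-AID recognition loop (not the app's
# actual Unity/C# code). Features are modelled as plain sets; the
# min_matches threshold is an assumed value, not one from the paper.

def recognise(frame_features, target_models, min_matches=20):
    """Return the name of the first stored 3D target whose features
    overlap the camera frame's features strongly enough, else None."""
    for name, model_features in target_models.items():
        if len(frame_features & model_features) >= min_matches:
            return name
    return None

def scan_loop(frames, target_models):
    """Mimic the flowchart: scan frames until an object is detected,
    then 'render' its virtual information; otherwise keep scanning."""
    for frame in frames:
        match = recognise(frame, target_models)
        if match is None:
            continue  # detection failed: return to the scanning stage
        return f"Render info for {match}"  # virtual overlay step
    return "No UXO recognised"
```

In the real application, the matching step is performed by the Vuforia engine against the imported OD file rather than by set intersection.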

3.4. Software Tools

3.4.1. Vuforia SDK

Vuforia contains a set of diverse functionalities that support several types of recognition, such as image, text, and object recognition. Moreover, it is a widely used platform for creating augmented reality applications due to its compatibility with many mobile devices, such as tablets and mobile phones, and it supports Android, iOS, and UWP. Vuforia can recognise 3D objects in real time using computer vision technology, and allows for direct interaction between the user and the real world. The mobile screen is a portal that combines the real-world scene with virtual content. In addition, it provides recognition services in different modes, both online and offline. Cloud recognition works online, allowing developers to store and handle targets online; the supported target types include 2D images and planar targets. Meanwhile, the offline mode enables developers to recognise 2D images, planar targets, and 3D objects, and hosts the different targets locally on the device, which removes the need for an internet service to use the application.
Furthermore, Vuforia provides the natural feature tracking (NFT) technique. NFT is a model-based or image-based approach that recognises and tracks natural features previously extracted from the target object, the goal being to facilitate feature pattern detection. The NFT method employed by the Vuforia SDK is SIFT (scale-invariant feature transform), which detects the feature points of the object and calculates the object’s scale by mapping the coordinate values.
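SIFT-style pipelines typically keep only distinctive feature matches using Lowe's ratio test: the nearest candidate descriptor is accepted only if it is clearly closer than the second nearest. The sketch below is a minimal illustration of that test; plain 2-D tuples stand in for real 128-D SIFT descriptors, and the 0.75 ratio is a commonly used default rather than a value from this paper.

```python
import math

# Minimal sketch of descriptor matching with Lowe's ratio test, as used
# in SIFT-style feature matching. Descriptors here are short float
# tuples for illustration; real SIFT descriptors are 128-dimensional.

def euclidean(a, b):
    """Euclidean distance between two equal-length descriptors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def ratio_test_match(query, candidates, ratio=0.75):
    """Return the index of the best-matching candidate, or None if the
    best match is not clearly better than the second best."""
    scored = sorted((euclidean(query, c), i) for i, c in enumerate(candidates))
    if not scored:
        return None
    if len(scored) == 1:
        return scored[0][1]
    (best, idx), (second, _) = scored[0], scored[1]
    return idx if best < ratio * second else None
```

An unambiguous match (one candidate much closer than the rest) is accepted; two near-equidistant candidates are rejected as ambiguous, which is what makes the test robust against repetitive or low-texture surfaces.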

3.4.2. Unity 3D Engine

Unity is a cross-platform 2D–3D game engine that supports 2D and 3D graphics and C# scripting, in addition to animations designed by the Unity 3D engine. It is a powerful engine used to build AR and VR applications, providing a basic level of human-machine interaction using AR development tools. Furthermore, Unity is compatible with Vuforia SDK, allowing for the building of AR applications capable of recognising and tracking 3D objects.

3.5. UXO-AID Implementation

For the implementation of the application, Android was selected as the target mobile operating system (OS), since it is considered one of the most commonly used OSs in the world. As this research aimed to use AR technology to assist the deminers during their tasks, adaptation to additional platforms was unnecessary. Then a comparison of the current AR SDKs was conducted to choose a suitable tool. Different AR SDKs and libraries are used for creating AR applications, such as Vuforia, ARCore, ARToolKit, EasyAR, Wikitude, and Kudan. However, when comparing these AR SDKs, Vuforia was found to offer better 3D object recognition and tracking. As a result, Vuforia was the suitable SDK for implementing the application. The prototype was created using Unity (version 2019.4.35f1 (64-bit)) with a Vuforia plug-in. After selecting the Vuforia SDK, the UXO model dataset was created and imported, as mentioned in Section 3.2. The main criterion for successful object recognition is the acquisition of as many feature points as possible to enable the Vuforia engine to recognise the object easily. Then the application user interface was created using Unity’s GameObject-based interface design tool and immediate mode GUI (IMGUI) coding API. The GameObject-based feature allows the addition of various user interface components, such as canvases, texts, and buttons, simply by placing them on the screen. The application has two different user interfaces: the home interface and AR mode. The home interface is displayed when the user starts the application. Figure 8a shows the home interface with two buttons: Open Camera and Exit. Figure 8b shows the second interface, AR mode, which opens the camera view of the mobile’s camera, allowing the users to scan the target object. In addition, this interface has Screenshot, Back, and Exit buttons.
After completing the design of the application user interfaces, scripting was added to expand the application’s features and enable the required capabilities that allow the application to be dynamic. Therefore, three C# scripts were written. The first script is linked with a button to screenshot the displayed virtual content associated with the UXO; an open-source package (UnityNativeGallery) was imported to use the feature of saving the screenshot on the mobile gallery. The second script is an autofocus script associated with the AR camera. Lastly, the third script is linked with the Exit buttons to close the application and the Back button to return to the main menu.

3.6. Evaluation of UXO-AID

This section describes the assessment of the performance of the proposed prototype. To accomplish this assessment, firstly, a preliminary experiment was conducted to determine the application’s applicability. The testing process focused on distance in centimetres, lighting, and object recognition accuracy. The experiment aimed to determine whether the application could run correctly when the object was scanned from different distances under both good and low lighting conditions. Furthermore, to measure the frequency of recognition, each UXO was scanned individually 20 times using a mobile phone camera. Figure 9 shows the procedure used to measure the object recognition accuracy. Next, the accuracy was computed using Equation (1), while the error rate was calculated using Equation (2), as listed below:
Accuracy (%) = (UXO recognised / no. of attempts) × 100        (1)
Error rate (%) = (UXO not recognised / no. of attempts) × 100        (2)
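Equations (1) and (2) can be expressed directly in code. As a worked check, the overall figure of 82.5% over 4 objects × 20 attempts corresponds to 66 successful recognitions out of 80 (this count is inferred arithmetically from the reported percentage, not stated per object in the text):

```python
# Equations (1) and (2) from the evaluation procedure: each UXO sample
# was scanned 20 times, so attempts = 20 per object (80 in total).

def accuracy_pct(recognised, attempts):
    """Equation (1): share of scanning attempts that recognised the UXO."""
    return recognised / attempts * 100

def error_pct(not_recognised, attempts):
    """Equation (2): share of scanning attempts that failed."""
    return not_recognised / attempts * 100
```

For example, `accuracy_pct(66, 80)` reproduces the overall 82.5% recognition rate, and `error_pct(14, 80)` gives the complementary 17.5% error rate.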
Secondly, usability testing was performed to determine the level of applicability and usability of the AR application for participants, to examine whether the AR application could support the deminer operations, and to assess to what extent the displayed virtual information regarding the recognised UXO assisted the operator. A questionnaire based on usability was completed and submitted by 20 deminers. According to N. Bevan [38], usability is defined as how a user may utilise software to accomplish an objective with satisfaction, efficiency, and effectiveness in a specific usage context. Therefore, the questionnaire was broken down into three significant factors to be considered in assessing the AR application: satisfaction, effectiveness, and efficiency, as described in Table 1. Each factor was measured using items derived from different studies [39,40,41]. The assessed items for each factor are also detailed in Table 1. After collecting the responses from the deminers, descriptive statistics were applied to analyse the questionnaire; hence, the maximum (max), minimum (min), mean, standard deviation (std), tabled T, and calculated T were computed. Figure 10 demonstrates the evaluation techniques used in this study.
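The descriptive statistics named above can be sketched as follows, with the "calculated T" taken as a one-sample t statistic against a hypothesised neutral mean (an assumption about the analysis; the scores below are invented placeholders, not the deminers' actual responses):

```python
import math
import statistics

# Sketch of the questionnaire analysis: max, min, mean, sample standard
# deviation, and a one-sample t statistic ("calculated T") against a
# hypothesised neutral mean mu0. mu0 = 3.0 assumes a 5-point scale.

def describe(scores, mu0=3.0):
    """Return the descriptive statistics used to analyse one factor."""
    n = len(scores)
    mean = statistics.mean(scores)
    std = statistics.stdev(scores)            # sample standard deviation
    t = (mean - mu0) / (std / math.sqrt(n))   # calculated T
    return {"max": max(scores), "min": min(scores),
            "mean": mean, "std": std, "t": t}
```

The calculated T is then compared against the tabled (critical) T to judge whether the factor's mean response differs significantly from the neutral midpoint.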

4. Experimental Results

4.1. Preliminary Tests

A series of preliminary experiments was carried out to measure the application's functionality and to assess its applicability regarding distance and lighting settings. Table 2 depicts the testing results from six different distance points with varying light settings. The results indicate that the objects were easily recognised from all the distance points except at about 40 cm. In addition, scanning at a light intensity of less than 12 LUX made UXO objects undetectable and unrecognisable. Figure 11 shows the result of testing using the VS-50 sample. Figure 11a displays the camera view in landscape mode, with the VS-50 recognised at a distance of 20 cm; Figure 11b displays the landscape camera view with the VS-50 identified at a distance of 30 cm. The images of Figure 11c,d display the camera view in portrait mode, with the VS-50 recognised at distances of 30 and 35 cm, respectively.
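The feasibility pattern observed in Table 2 can be summarised as a lookup sketch. Note that the LUX boundaries between the measured rows are interpolated assumptions for illustration, not values measured in the study:

```python
def max_recognition_distance_cm(lux: float) -> int:
    """Approximate furthest distance (cm) at which a UXO was recognised,
    following the pattern in Table 2. Boundaries between the measured LUX
    rows (12, 35, 177, 431, 843 LUX) are assumed, not measured."""
    if lux < 12:      # below 12 LUX nothing was recognised
        return 0
    if lux < 177:     # 35 LUX row: recognised up to 20 cm
        return 20
    if lux < 843:     # 177-431 LUX rows: recognised up to 25 cm
        return 25
    return 30         # 843 LUX and brighter: recognised up to 30 cm

def recognisable(lux: float, distance_cm: float) -> bool:
    """Predict whether a scan at the given light intensity and distance succeeds."""
    return 0 < distance_cm <= max_recognition_distance_cm(lux)
```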
As for object recognition accuracy, it was found that a small number of feature points caused the recognition process to fail. In the case of the VS-50 and VS-MK mines, the recognition rate was above 90% because these two types had a sufficient number of feature points. On the other hand, the recognition rate decreased for the PMN-2 and 45 mm mortars due to insufficient feature points. Furthermore, the cylindrical shape of the 45 mm mortars was another factor that caused the recognition process to fail. Finally, scanning the UXO in scenarios containing light reflection and shadow also caused recognition failures. Overall, the total accuracy of UXO recognition reached 82.5%. Table 3 shows the details of the testing results.
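Applying Equations (1) and (2) to the scan counts reported in Table 3 reproduces the per-model and overall rates (a verification sketch, not part of the application itself):

```python
# Scan counts reported in Table 3: each model was scanned 20 times
recognised = {"VS-50": 20, "PMN-2": 15, "VS-MK": 19, "45 mm mortars": 12}
ATTEMPTS = 20

per_model = {model: count * 100 / ATTEMPTS for model, count in recognised.items()}
overall = sum(recognised.values()) * 100 / (ATTEMPTS * len(recognised))

print(per_model)  # {'VS-50': 100.0, 'PMN-2': 75.0, 'VS-MK': 95.0, '45 mm mortars': 60.0}
print(overall)    # 82.5
```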

4.2. Usability Testing

The complete usability testing consisted of two parts; in the first part, the participants were asked to fill out a pre-questionnaire based on their information. The purpose of the pre-questionnaire was to collect relevant information from the participants. A group of 20 deminers (male 70% and female 30%) tested and evaluated the application. Table 4 outlines the participants’ gathered demographic information. The pre-questionnaire focused on aspects concerning years of experience, estimation of the time taken to classify the UXO according to participants’ answers, and whether the participants used any system to help them in the UXO classification process. Lastly, the participants’ usage frequency of mobile devices was also considered.
The analysis of the pre-questionnaire shows that none of the participants used any software during their operations. Furthermore, most participants had a university education (70%), and the rest had an institutional education (30%). Regarding experience, four participants (20%) had 1–5 years' experience, seven (35%) reported 5–10 years' experience, and another four (20%) had 10–20 years' experience, while five participants (25%) had more than 20 years of experience. Out of 20 participants, nine (45%) reported they could classify a UXO within 2 h, and five (25%) said they could do so within one hour. As for mobile usage skills, 16 participants (80%) reported that their skills in utilising mobile applications were excellent, whereas four (20%) stated that their skills were good. Figure 12 illustrates the distribution of years of experience and the time required to classify the UXO.
In the second part of the evaluation, each participant who used the application answered the questionnaire. The participants were requested to convey their agreement level. The responses were rated on a five-point Likert scale, from strongly disagree (1) to strongly agree (5).
The result for each factor is discussed as follows.
  • Satisfaction
This factor evaluated the level of user acceptance, impact, and ease of use of the application; five criteria were considered to measure it, as illustrated in Table 5. The majority of participants stated that the AR application was practical and could help the deminers complete their tasks. Furthermore, many participants agreed or strongly agreed that the application was easy to use and required no training, primarily due to the simplicity of the user interface and ease of navigation. This also influenced many participants to agree that the application would be used frequently, although another group chose to remain neutral, as they needed to use the application further before deciding to use it more regularly. In addition, many participants agreed that the application would increase their productivity in completing their tasks, while a small percentage disagreed. Table 6 shows the factor's overall statistical results, which imply that positive outcomes were obtained for the satisfaction factor.
  • Effectiveness
Effectiveness was another significant factor considered in the assessment, with four criteria. The results in Table 7 indicate that most of the participants agreed that the application responded in real time and that they did not have to wait for the virtual information to be displayed, although a small group of participants remained neutral. Furthermore, most participants agreed or strongly agreed that the application operated without noticeable issues. Still, a few participants stated that the application encountered some problems during execution; this was attributed to older mobile phones with low-resolution cameras. The participants showed good agreement that the virtual information was displayed appropriately with regard to position and orientation.
Finally, a significant group of participants answered neutrally to the last statement; they proposed including a dynamic 3D model alongside the displayed information for better assistance in completing the tasks. In general, the positive effectiveness results imply that end-users correctly and properly viewed the application outputs. Overall statistical results on effectiveness are shown in Table 8.
  • Efficiency
The outcomes for this factor are presented in Table 9. Most participants agreed that the application would reduce the time needed to classify the UXO. Moreover, most participants strongly agreed with statement two, that the application decreased the danger of explosion during the deminers' clearance operations. A few participants disagreed that the application helped reduce the mental load on the deminer; however, the majority agreed with the statement and indicated that the application reduced the cognitive effort of classifying the UXO. A group of participants remained neutral with regard to suggestions to use smart glasses, which would allow the deminer to operate with both hands, and to incorporate remote collaboration in real time. Table 10 depicts the overall statistical analysis for efficiency.
The mean distribution of the three factors is shown in Figure 13, which demonstrates the positive results that were collected and indicates that each factor’s mean value was higher than 3.5, verifying a high approval level.
In addition, Figure 14 presents an overview of the mean of each measured item. From the chart, it is observed that participants agreed with the items Q1, Q2, Q10, and Q11, which signifies that the application is practical, easy to use, and reduces classification time and risks imposed on the deminers. Meanwhile, it is noted that participants agreed less with Q13, in which suggestions were offered to extend the capabilities of the proposed application. In general, end-users responded positively to the application of AR in aiding deminers through the improvement of UXO classification and minimisation of the hazards imposed on minefield deminers.

5. Limitations

Several limitations of the proposed mobile application must be addressed. Due to the infeasibility of building a UXO dataset, this application uses Vuforia object recognition for detecting and recognising UXO, which is not ideal for large datasets. In addition, the scanned object must have sufficient feature points; otherwise, the recognition will not be successful. Deep learning methods could be introduced in the future with a UXO dataset. In addition, the simplicity of the printed UXO models must be regarded as a limitation of this study. More studies are required to test the application using real UXO in different contexts, which can be viewed as more challenging and riskier. Moreover, the small number of participants is also considered a limitation in the study. Thus, additional studies and testing on a larger number of participants must be conducted to explore various aspects of the study.

6. Conclusions and Future Work

This study introduced an augmented reality (AR) mobile application designed to assist minefield operators in recognising and classifying UXO, providing information on the classified UXO and how to handle it. The AR application works on-site through a mobile device and was designed and implemented in Unity 3D with the Vuforia SDK. In addition, a 2D image visualisation is rendered that shows the components constructing the UXO in a disassembled fashion. As working with real and active UXO samples was unfeasible, 3D printing technology was employed to print replicas of four UXO samples, namely the VS-50, PMN-2, VS-MK, and 45 mm mortars, to build the database. These four UXO samples were selected based on their shapes and widespread use.
The application was tested to measure and ensure the soundness of its performance. A series of preliminary tests was performed to evaluate the application's functionality according to the following aspects: accuracy, lighting, and distance. The testing results revealed that the application could perform successfully in excellent and moderate lighting at a distance of 10 to 35 cm. However, the AR application could not function in insufficient lighting. As for the recognition accuracy, each UXO sample was scanned 20 times and the total recognition accuracy was calculated. The recognition rates of the VS-50 and VS-MK were 100% and 95%, respectively, while the recognition rates of the PMN-2 and 45 mm mortars were 75% and 60%, respectively. The overall recognition success rate reached 82.5%. The differences in the recognition rates were due to the disparity in the number of feature points of each object, which affected the accuracy of object recognition. Furthermore, usability testing based on the ISO 9241-11 standard was employed to evaluate the AR application. A questionnaire of 13 questions was completed and submitted by 20 deminers; it was based on three elements: satisfaction, effectiveness, and efficiency. The results showed that the application reduced the time required to classify the unexploded ordnance object and reduced the hazards imposed on the deminer during the demining process. Based on the survey results, we can report that AR technology is an excellent medium to aid minefield operators during the demining or recognition process, helping to reduce cognitive load and minimise hazardous human errors that can put the operators in a life-threatening situation. Thus, the proposed AR application has established its capability to help deminers complete complex tasks.
Future work will explore the prospects of implementing advanced computer vision algorithms, such as deep learning, to improve object recognition and test the application in real scenarios. In addition, creating a complete UXO dataset is mandatory to increase the number of UXO objects to which such technology can be applied. In addition, utilising the Microsoft HoloLens as a display device is suggested to make the AR experience more comfortable.

Author Contributions

Conceptualisation, Q.A.H., H.A.H. and M.A.A.; methodology, M.A.A. and H.A.H.; software, Q.A.H., R.D.I. and M.B.O.; investigation, H.A.H., M.A.A., Q.A.H., M.M.S., R.D.I. and M.B.O.; data curation, H.A.H. and M.A.A.; writing—original draft preparation, Q.A.H., H.A.H., M.A.A., R.D.I. and M.B.O.; writing—review and editing, H.A.H. and M.A.A.; supervision, M.A.A. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

All data were presented in the main text.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Safatly, L.; Baydoun, M.; Alipour, M.; Al-Takach, A.; Atab, K.; Al-Husseini, M.; El-Hajj, A.; Ghaziri, H. Detection and classification of landmines using machine learning applied to metal detector data. J. Exp. Theor. Artif. Intell. 2020, 33, 203–226. [Google Scholar] [CrossRef]
  2. Khurshudyan, I.; Bearak, M. Clearing the Deadly Litter of Unexploded Russian Bombs in Ukraine. Available online: https://www.washingtonpost.com/world/2022/04/15/ukraine-clearing-unexploded-russian-bombs-missiles/ (accessed on 15 April 2022).
  3. Landmine Monitor. 2021. Available online: http://the-monitor.org/en-gb/our-research/landmine-monitor.aspx (accessed on 12 June 2022).
  4. GICHD. What Risks Do We Face? 2022. Available online: https://www.gichd.org/en/explosive-ordnance/ (accessed on 29 May 2022).
  5. Achkar, R.; Owayjan, M.; Mrad, C. Landmine detection and classification using MLP. In Proceedings of the 2011 Third International Conference on Computational Intelligence, Modelling & Simulation, Langkawi, Malaysia, 20–22 September 2011; pp. 1–6. [Google Scholar]
  6. Habib, M.K. Mine clearance techniques and technologies for effective humanitarian demining. J. Mine Action 2002, 6, 17. [Google Scholar]
  7. Tbarki, K.; Ben Said, S.; Ksantini, R.; Lachiri, Z. Covariance-guided landmine detection and discrimination using ground-penetrating radar data. Int. J. Remote Sens. 2017, 39, 289–314. [Google Scholar] [CrossRef]
  8. Tbarki, K.; Ben Said, S.; Ksantini, R.; Lachiri, Z. RBF kernel based SVM classification for landmine detection and discrimination. In Proceedings of the 2016 International Image Processing, Applications and Systems (IPAS), Hammamet, Tunisia, 5–7 November 2016; pp. 1–6. [Google Scholar]
  9. Giovanneschi, F.; Mishra, K.V.; Gonzalez-Huici, M.A.; Eldar, Y.C.; Ender, J.H.G. Dictionary Learning for Adaptive GPR Landmine Classification. IEEE Trans. Geosci. Remote Sens. 2019, 57, 10036–10055. [Google Scholar] [CrossRef]
  10. Lombardi, F.; Lualdi, M.; Picetti, F.; Bestagini, P.; Janszen, G.; Di Landro, L.A. Ballistic Ground Penetrating Radar Equipment for Blast-Exposed Security Applications. Remote Sens. 2020, 12, 717. [Google Scholar] [CrossRef]
  11. Sigman, J.B.; O’Neill, K.; Barrowes, B.; Wang, Y.; Shubitidze, F. Automatic classification of unexploded ordnance applied to live sites for MetalMapper sensor. In Proceedings of the Detection and Sensing of Mines, Explosive Objects, and Obscured Targets XIX, Baltimore, MD, USA, 5–7 May 2014; Volume 9072, pp. 95–101. [Google Scholar]
  12. Ramasamy, V.; Nandagopal, D.; Tran, M.; Abeynayake, C. A Novel Feature Extraction Algorithm for IED Detection from 2-D Images using Minimum Connected Components. Procedia Comput. Sci. 2017, 114, 507–514. [Google Scholar] [CrossRef]
  13. Alrubayi, A.H.; Mohamed, A.A. A pattern recognition model for static gestures in malaysian sign language based on machine learning techniques. Comput. Electr. Eng. 2021, 95, 107383. [Google Scholar] [CrossRef]
  14. Lebbad, A.; Clayton, G.; Nataraj, C. Classification of UXO Using Convolutional Networks Trained on a Limited Dataset. In Proceedings of the 2017 16th IEEE International Conference on Machine Learning and Applications (ICMLA), Cancun, Mexico, 18–21 December 2017; pp. 1098–1101. [Google Scholar]
  15. Marino, E.; Barbieri, L.; Colacino, B.; Fleri, A.K.; Bruno, F. An Augmented Reality inspection tool to support workers in Industry 4.0 environments. Comput. Ind. 2021, 127, 103412. [Google Scholar] [CrossRef]
  16. Ariansyah, D.; Erkoyuncu, J.A.; Eimontaite, I.; Johnson, T.; Oostveen, A.-M.; Fletcher, S.; Sharples, S. A head mounted augmented reality design practice for maintenance assembly: Toward meeting perceptual and cognitive needs of AR users. Appl. Ergon. 2021, 98, 103597. [Google Scholar] [CrossRef]
  17. Opu, N.I.; Islam, R.; Kabir, M.A.; Hossain, S.; Islam, M.M. Learn2Write: Augmented Reality and Machine Learning-Based Mobile App to Learn Writing. Computers 2021, 11, 4. [Google Scholar] [CrossRef]
  18. Trichopoulos, G.; Aliprantis, J.; Konstantakis, M.; Michalakis, K.; Caridakis, G. Tangible and Personalized DS Application Approach in Cultural Heritage: The CHATS Project. Computers 2022, 11, 19. [Google Scholar] [CrossRef]
  19. Xue, H.; Sharma, P.; Wild, F. User Satisfaction in Augmented Reality-Based Training Using Microsoft HoloLens. Computers 2019, 8, 9. [Google Scholar] [CrossRef]
  20. Li, X.; Yi, W.; Chi, H.-L.; Wang, X.; Chan, A.P. A critical review of virtual and augmented reality (VR/AR) applications in construction safety. Autom. Constr. 2018, 86, 150–162. [Google Scholar] [CrossRef]
  21. Kim, K.; Alshair, M.; Holtkamp, B.; Yun, C. Using Immersive Augmented Reality to Assess the Effectiveness of Construction Safety Training. J. Constr. Eng. Proj. Manag. 2019, 9, 16–33. [Google Scholar]
  22. Pereira, R.E.; Moore, H.F.; Gheisari, M.; Esmaeili, B. Development and Usability Testing of a Panoramic Augmented Reality Environment for Fall Hazard Safety Training. In Advances in Informatics and Computing in Civil and Construction Engineering; Springer: Cham, Switzerland, 2018; pp. 271–279. [Google Scholar]
  23. Zhu, Y.; Li, N. Virtual and augmented reality technologies for emergency management in the built environments: A state-of-the-art review. J. Saf. Sci. Resil. 2020, 2, 1–10. [Google Scholar] [CrossRef]
  24. Maurer, T.; Cook, K.; Graybeal, J. Counter-mine augmented reality training system (CMARTS). In Proceedings of the Detection and Sensing of Mines, Explosive Objects, and Obscured Targets XXIV, Baltimore, MD, USA, 15–17 April 2019; Volume 11012, p. 1101210. [Google Scholar]
  25. Tan, A.D. Augmented and Virtual Reality for HMA EOD Training. J. Conv. Weapons Destr. 2020, 23, 4. [Google Scholar]
  26. Gibson, J.M.; Knopman, D.; Lockwood, J.R.; Cecchine, G.; Willis, H.H. Unexploded Ordnance: A Critical Review of Risk Assessment Methods; Defense Technical Information Center: Fort Belvoir, VA, USA, 2004.
  27. Kasban, H.; Zahran, O.; El-Kordy, M.; Elaraby, S.; Abd El-samie, F. Landmines detection technologies: A comparative study. Egypt. J. Environ. Chang. 2009, 1, 54–59. [Google Scholar]
  28. Park, K.-B.; Kim, M.; Choi, S.H.; Lee, J.Y. Deep learning-based smart task assistance in wearable augmented reality. Robot. Comput. Manuf. 2019, 63, 101887. [Google Scholar] [CrossRef]
  29. Bahri, H.; Krčmařík, D.; Kočí, J. Accurate Object Detection System on HoloLens Using YOLO Algorithm. In Proceedings of the 2019 International Conference on Control, Artificial Intelligence, Robotics & Optimization (ICCAIRO), Majorca Island, Spain, 3–5 May 2019; pp. 219–224. [Google Scholar]
  30. Lai, Z.-H.; Tao, W.; Leu, M.C.; Yin, Z. Smart augmented reality instructional system for mechanical assembly towards worker-centered intelligent manufacturing. J. Manuf. Syst. 2020, 55, 69–81. [Google Scholar] [CrossRef]
  31. Zheng, L.; Liu, X.; An, Z.; Li, S.; Zhang, R. A smart assistance system for cable assembly by combining wearable augmented reality with portable visual inspection. Virtual Real. Intell. Hardw. 2020, 2, 12–27. [Google Scholar] [CrossRef]
  32. Piciarelli, C.; Vernier, M.; Zanier, M.; Foresti, G.L. An augmented reality system for technical staff training. In Proceedings of the 2018 IEEE 16th International Conference on Industrial Informatics (INDIN), Porto, Portugal, 18–20 July 2018; pp. 899–904. [Google Scholar]
  33. Zhou, B.; Guven, S. Fine-Grained Visual Recognition in Mobile Augmented Reality for Technical Support. IEEE Trans. Vis. Comput. Graph. 2020, 26, 3514–3523. [Google Scholar] [CrossRef] [PubMed]
  34. Anderson, R.; Toledo, J.; ElAarag, H. Feasibility Study on the Utilization of Microsoft HoloLens to Increase Driving Conditions Awareness. In Proceedings of the 2019 SoutheastCon, Huntsville, AL, USA, 11–14 April 2019; pp. 1–8. [Google Scholar]
  35. Abdi, L.; Takrouni, W.; Meddeb, A. In-vehicle cooperative driver information systems. In Proceedings of the 2017 13th International Wireless Communications and Mobile Computing Conference (IWCMC), Valencia, Spain, 26–30 June 2017; pp. 396–401. [Google Scholar]
  36. Huynh, B.; Orlosky, J.; Höllerer, T. In-situ labeling for augmented reality language learning. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 1606–1611. [Google Scholar]
  37. Rivera, E.F.; Pilco, M.V.; Espinoza, P.S.; Morales, E.E.; Ortiz, J.S. Training System for Hybrid Vehicles Through Augmented Reality. In Proceedings of the 2020 15th Iberian Conference on Information Systems and Technologies (CISTI), Sevilla, Spain, 24–27 June 2020; pp. 1–6. [Google Scholar]
  38. Bevan, N. Extending Quality in Use to Provide a Framework for Usability Measurement. In Proceedings of the International Conference on Human Centered Design, San Diego, CA, USA, 19–24 July 2009; pp. 13–22. [Google Scholar]
  39. Ouali, I.; Sassi, M.S.H.; Halima, M.B.; Ali, W. A New Architecture based AR for Detection and Recognition of Objects and Text to Enhance Navigation of Visually Impaired People. Procedia Comput. Sci. 2020, 176, 602–611. [Google Scholar] [CrossRef]
  40. Ahmed, M.A.; Zaidan, B.B.; Zaidan, A.A.; Alamoodi, A.H.; Albahri, O.S.; Al-Qaysi, Z.T.; Albahri, A.S.; Salih, M.M. Real-time sign language framework based on wearable device: Analysis of MSL, DataGlove, and gesture recognition. Soft Comput. 2021, 25, 11101–11122. [Google Scholar] [CrossRef]
  41. Blanco-Pons, S.; Carrión-Ruiz, B.; Lerma, J.L.; Villaverde, V. Design and implementation of an augmented reality application for rock art visualization in Cova dels Cavalls (Spain). J. Cult. Herit. 2019, 39, 177–185. [Google Scholar] [CrossRef]
  42. Bevan, N.; Carter, J.; Harker, S. ISO 9241-11 revised: What have we learnt about usability since 1998? In Proceedings of the International Conference on Human-Computer Interaction, Bamberg, Germany, 14–18 September 2015; pp. 143–151. [Google Scholar]
Figure 1. UXO risk-assessment stages.
Figure 2. The methodology of the study.
Figure 3. UXO sample.
Figure 4. UXO samples: (a) PMN-2 soviet mine; (b) VS-MK soviet mine; (c) VS-50 Italian mine; (d) 45 mm mortars.
Figure 5. UXO 3D scan.
Figure 6. Application overview.
Figure 7. Workflow of the object recognition.
Figure 8. Application user interfaces: (a) home interface; (b) AR mode interface.
Figure 9. Flowchart of UXO testing.
Figure 10. Testing and evaluation methods used in the study.
Figure 11. VS-50 recognition results in a different setting: (a,b) show the camera view in landscape mode with distances 20 and 30 cm; (c,d) show the camera view in portrait mode with distances of 30 and 35 cm.
Figure 12. Distribution of pre-questionnaire responses regarding (a) years of experience; (b) the time required to classify UXO.
Figure 13. Mean distribution of each factor.
Figure 14. Mean of each measured item.
Table 1. Implemented factors with their associated measured item.
Factor: Satisfaction
Definition: Satisfaction is influenced by likeability, functional appropriateness, and simplicity of use [38].
Measured items:
  • The AR application is useful
  • The user interface is simple to navigate and easy to learn
  • The app will be used frequently
  • In general, it is easy to use the app
  • I believe I could become productive quickly using this system
Factor: Effectiveness
Definition: Effectiveness means an objective can be measured in accuracy, completeness, and output precision [42].
Measured items:
  • The AR application responds in real time
  • The AR application does not have any issues at the time of execution
  • The digital information is properly presented
  • The virtual information was effective in helping me complete the tasks
Factor: Efficiency
Definition: Efficiency concerns the resources utilised, such as the time to finish a particular task, human labour, and cost [42].
Measured items:
  • The classification operation time decreased through using the application
  • The application helps reduce the hazard imposed by the clearance operation
  • The application helps reduce the mental demand to identify the UXO
  • This application has all the functions and capabilities I expected it to have
Table 2. Various distance and lighting test samples results.
Lighting Intensity    Distance (cm):  10    15    20    25    30    40
2445 LUX                              YES   YES   YES   YES   YES   NO
2197 LUX                              YES   YES   YES   YES   YES   NO
1328 LUX                              YES   YES   YES   YES   YES   NO
843 LUX                               YES   YES   YES   YES   YES   NO
431 LUX                               YES   YES   YES   YES   NO    NO
380 LUX                               YES   YES   YES   YES   NO    NO
177 LUX                               YES   YES   YES   YES   NO    NO
35 LUX                                YES   YES   YES   NO    NO    NO
Less than 12 LUX                      NO    NO    NO    NO    NO    NO
Table 3. Performance of UXO-AID.
UXO Model         Recognised   Not Recognised   Accuracy
VS-50             20           0                100%
PMN-2             15           5                75%
VS-MK             19           1                95%
45 mm mortars     12           8                60%
Overall accuracy                                82.5%
Table 4. Demographics of the participants.
Variables                                        Value                    Frequency   Percentage %
Gender                                           Male                     14          70%
                                                 Female                   6           30%
Education level                                  University               16          70%
                                                 Institution              4           30%
                                                 High school              0           0%
Years' experience                                1–5                      4           20%
                                                 5–10                     7           35%
                                                 15–20                    4           20%
                                                 More than 20             5           25%
Number of training courses                       One course               4           20%
                                                 Two courses              4           20%
                                                 Three courses            4           20%
                                                 Four courses             5           25%
                                                 More than four courses   3           15%
Time to classify UXO                             1 h                      5           25%
                                                 2 h                      9           45%
                                                 3 h                      4           20%
                                                 More than 3 h            2           10%
Skills in using mobile apps                      Excellent                16          80%
                                                 Good                     4           20%
                                                 Medium                   0           0%
                                                 Low                      0           0%
Did you use any system to assist in the tasks    No                       20          100%
                                                 Yes                      0           0%
Table 5. Descriptive statistics for the Satisfaction factor.
Item No.   Strongly Agree   Agree   Neutral   Disagree   Strongly Disagree   Max   Min   Mean
1          45%              40%     10%       5%         -                   5     2     4.25
2          55%              40%     -         5%         -                   5     2     4.45
3          5%               65%     25%       5%         -                   5     2     3.7
4          45%              50%     -         -          -                   5     1     4.3
5          15%              65%     20%       10%        -                   5     2     3.75
Table 6. Overall statistical analysis for the Satisfaction factor.
Mean    Std. Deviation   Calculated Mean   Calculated T   Tabled T
20.45   3.06             15                7.94           2.08
Table 7. Descriptive statistics for the Effectiveness factor.
Item No.   Strongly Agree   Agree   Neutral   Disagree   Strongly Disagree   Max   Min   Mean
1          35%              50%     15%       -          -                   5     3     4.2
2          20%              45%     30%       5%         -                   5     2     3.8
3          25%              50%     25%       -          -                   5     3     4
4          20%              35%     45%       -          -                   5     3     3.75
Table 8. Overall statistical analysis for the effectiveness factor.
Mean    Std. Deviation   Calculated Mean   Calculated T   Tabled T
20.45   3.06             15                7.94           2.08
Table 9. Descriptive statistics for the Efficiency factor.
Item No.   Strongly Agree   Agree   Neutral   Disagree   Strongly Disagree   Max   Min   Mean
1          50%              40%     5%        5%         -                   5     2     4.35
2          50%              45%     -         5%         -                   5     2     4.4
3          30%              65%     -         -          5%                  5     1     4.15
4          -                40%     45%       10%        5%                  4     1     3.2
Table 10. Overall statistical analysis for the Efficiency factor.
Mean   Std. Deviation   Calculated Mean   Calculated T   Tabled T
16.1   2.79             12                6.574          2.08
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
