Review

The Future of Mine Safety: A Comprehensive Review of Anti-Collision Systems Based on Computer Vision in Underground Mines

1 Alqualsadi (Digital Innovation on Enterprise Architectures) Research Team & IRDA (Information Retrieval and Data Analytics) Research Team, Rabat IT Center, ENSIAS, Mohammed V University, Rabat 10112, Morocco
2 MAScIR (Moroccan Foundation for Advanced Science, Innovation and Research), Rabat 10112, Morocco
3 Reminex (Research & Development, Engineering and Project Delivery Arm), Managem, Casablanca 20250, Morocco
4 Faculté des Sciences Semlalia de Marrakech (FSSM), Cadi Ayyad University, Marrakech 40000, Morocco
* Author to whom correspondence should be addressed.
Sensors 2023, 23(9), 4294; https://doi.org/10.3390/s23094294
Submission received: 8 March 2023 / Revised: 17 April 2023 / Accepted: 20 April 2023 / Published: 26 April 2023
(This article belongs to the Section Sensing and Imaging)

Abstract:
Underground mining operations present critical safety hazards due to limited visibility and blind areas, which can lead to collisions between mobile machines and vehicles or persons, causing accidents and fatalities. This paper aims to survey the existing literature on anti-collision systems based on computer vision for pedestrian detection in underground mines, categorize them based on the types of sensors used, and evaluate their effectiveness in deep underground environments. A systematic review of the literature was conducted following the PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) guidelines to identify relevant research work on anti-collision systems for underground mining. The selected studies were analyzed and categorized based on the types of sensors used and their advantages and limitations in deep underground environments. This study provides an overview of the anti-collision systems used in underground mining, including cameras and lidar sensors, and their effectiveness in detecting pedestrians in deep underground environments. Anti-collision systems based on computer vision are effective in reducing accidents and fatalities in underground mining operations. However, their performance is influenced by factors, such as lighting conditions, sensor placement, and sensor range. The findings of this study have significant implications for the mining industry and could help improve safety in underground mining operations. This review and analysis of existing anti-collision systems can guide mining companies in selecting the most suitable system for their specific needs, ultimately reducing the risk of accidents and fatalities.

1. Introduction

Despite the technological advancements [1,2,3] of the modern world, mining is still one of the industries with the highest rates of accidents and occupational illnesses [4,5,6,7,8]. The number of accidents caused by mining equipment has risen in recent years [9]. Many workers appear unable to recognize hazards or to interpret risk situations and translate them into appropriate actions [10,11].
However, with the growth of Industry 4.0 in recent years, there has been a significant decrease in accidents and injuries, making mining a much safer industry than before. The latest technological revolution, known as Mining 4.0 [12,13,14], has played a crucial role in this transformation, bringing about a paradigm shift in the way mining operations are carried out. Staff and machines are all safer as a result of this evolution [15]. Moreover, in Industry 5.0 [16], cobots will form the next generation of industrial robots: autonomous, yet increasingly sensitive to human activities and ready to interact with humans.
One area where technological advancements have had a significant impact is the field of artificial intelligence (AI). AI can be applied throughout the entire mining life cycle, from exploration to reclamation and closure, in stages such as exploration, planning, mobile machine operations, drilling and blasting, and ore beneficiation [17,18]. In recent years, AI has significantly advanced the automation of machine and vehicle operation [17,18].
Perhaps one of the most promising applications of AI in mining is in the development of anti-collision systems based on computer vision. These systems use advanced sensors, such as cameras and LIDAR [19,20], along with sophisticated algorithms and deep learning techniques to detect and track the movement of people and objects in underground mines [21,22,23,24,25,26,27,28,29,30,31,32]. This technology can help prevent accidents and injuries by providing real-time alerts and warning signals to operators and workers [22].
Another technological advancement that has gained traction in the mining industry is the use of virtual and augmented reality (VR/AR) technologies [33,34]. VR/AR can provide immersive training experiences for workers, allowing them to practice their skills in a safe and controlled environment [35,36]. It can also be used to simulate emergency situations and prepare workers for different scenarios, improving their response time and overall safety [37]. In addition, VR/AR can be used for remote monitoring and maintenance of equipment, reducing the need for workers to physically inspect machinery and thus minimizing the risk of accidents. All these interactions between the mining industry and these areas mentioned above fall into the category of disruptive mining technologies [38] as shown in Figure 1.

1.1. Safety Challenges in Underground Mines

Falls, falling bricks and rocks, loading and shipping procedures, and electrical faults are also common causes of injuries. Additionally, engine-powered machinery has a considerable effect on accident rates [39,40], especially as the need for more complicated and sophisticated devices grows, requiring a higher degree of expertise to operate [41].
Mine construction, equipment specification, and human factors all affect the risks associated with heavy machinery [42]. In these collisions, staff become trapped in rolling machines, are caught by moving machine parts, or are run over by transport vehicles (Figure 2). Keeping workers safe when they are around machines is a difficult task. Allowing workers to work near moving equipment is common practice across much of the industry. In underground mines and smaller surface operations, for example, personnel and massive mobile equipment are often in close proximity.
Consequently, the goal of this study is to highlight accidents involving machinery so that their causes can be better understood, with the aim of enhancing the safety of pedestrians, machines, and mine infrastructure, including road and ramp design.

1.2. Taxonomy of Accidents in Underground Mines

The studies examined identify multiple categories of equipment involved in accidents, such as haulage trucks, front-end loaders, non-powered hand tools, dumpers, conveyors, continuous miners, forklifts, longwalls, dozers, LHDs, jack leg drills, and shuttle cars. Accidents caused by these and other machines and materials discussed in the following sections are classified according to the Mine Safety and Health Administration (MSHA) [43] and many studies in Figure 3. Because some of the terms are similar or even identical to the recorded problem, we opted to diagram all of the terms used in each sample rather than clustering them in Figure 3 below.
Below, we clarify these types of accidents:
  • Electrical: caused by electric current.
  • Explosions: involving the detonation of manufactured explosives, Airdox, or Cardox and can result in flying debris, concussive forces, fumes, or the ignition or explosion of gas or dust. This category also includes accidents involving exploding gasoline vapors, space heaters, or furnaces.
  • Fire: including unplanned fires that are not extinguished within 10 min in underground mines or 30 min in surface mines and surface areas of underground mines.
  • Materials: involving lifting, pulling, pushing, or shoveling material and are caused by handling the material itself. The material may be in bags or boxes, or it may be loose, such as sand, coal, rock, or timber.
  • Machinery: caused by the action or motion of machinery or by the failure of component parts. This category includes all haulage machines, such as dozers, haul trucks, front-end loaders, load–haul–dumps, dumpers, and excavators. Accidents caused by an energized or moving unit or failure of component parts, as well as collisions between machines and objects or workers, are also included.
  • Block Fall: caused by falling material and the result of improper blocking of equipment during repair or inspection. In this case, the accident should be classified as the type of equipment most directly responsible for the resulting accident.
For a better understanding of equipment-related accidents, reports from the Mine Safety and Health Administration (MSHA) and the Current Population Survey (CPS) from 1995 to 2004 were examined and analyzed [39,44,45]. Table 1 summarizes the rate of fatal accidents caused by mobile and haulage equipment in several countries.
During 1995–2005, Kecojevic et al. (2007) [39,45] examined all fatalities related to equipment at underground and surface mining operations in the United States. According to their findings, haul trucks were responsible for the most fatalities (22.3%), followed by belt conveyors (9.3%).
MSHA accident data from 1995 to 2006 were used by Kecojevic, Komljenovic et al. (2008) [39,45] to measure risk for loaders and dozers. The most common dangers for loaders, according to this report, were failures to obey proper maintenance procedures and system part failures. Failure to recognize hazardous site conditions was the most common threat for dozers.
Mitchell, Driscoll, and Harrison [46] performed a systematic study of more than one hundred deaths that happened in mines all around Australia between 1982 and 1984. They found that 34% of accidents were caused by falling objects, 29% were vehicle accidents, and being hit by a vehicle or equipment part was the most frequent cause of injuries (18%).
A study conducted by Dhillon in 2009 [47] found that fatalities at quarries in the United Kingdom have been increasing. According to this study, between 1983 and 1993, 41% of all fatalities in these quarries were caused by vehicle-related incidents, such as collisions, traveling on the edge, or rollovers.
According to a study by Ural and Demirkol (2008) [48], Turkey had one of the highest fatal accident rates in the mining industry among major mineral-producing countries in 2004, with a total of 68 fatalities. The most frequent types of fatal accidents in surface mines were blasting operations (18%), controlled haulage (16%), ground collapse (14%), and machinery (12%).
A study by Eric Stemn [49] about mines in Ghana found that the top five tasks associated with injuries in underground mining were drilling (20%), charging (10%), walking (8.3%), barring/scaling (5%), and changing/adjusting (5%), which accounted for 48.3% of all underground mining injuries. The remaining injuries were caused by component/parts (17.9%), drill rigs (12.8%), rock drills/borers (12.8%), and other earth-moving equipment (12.8%).

1.3. Typology of Underground Mining Machinery and Its Associated Injuries

Numerous surveys [39,41,44] show that mining equipment is the top contributor to injuries in the mining industry. Moreover, several other variables, such as work-speed, demand, and load, have an impact and can lead to injuries.
In surface mining, one study showed that out-of-control vehicles were the leading cause of equipment-related deaths [45]. Braking system failures have also been mentioned. Modifications to machinery and the use of equipment without handles suited to a given function were identified as accident causes by Bonsu et al. [39,41,44].
According to the study by Eric Stemn [49], the top reasons for fatal surface mining accidents in Ghana were lack of preparation (37%), failure to use seat belts (31%), poor communication (19%), and lack of control over haul roads (13%). For injuries, collisions with other vehicles or pedestrians, rollovers, and contact with utility poles can be considered major causes.
Numerous recorded accidents involve heavy transport machinery, such as haul trucks [44,50,51] and dumpers [52,53], with jack leg drills having the highest accident rate [54]. In general, mobile equipment [45] often poses a visibility problem for workers, which may be due to the size and design of the equipment. Injuries often occur while handling mining supplies, such as those used for bolting tasks, where a worker comes into direct contact with or is trapped by the machinery [55]. This type of incident accounts for 54% of all deaths. “Non-powered hand tools” were reported as the safest, causing only nonfatal injuries, while the use of off-road and underground machinery was the main cause of fatalities among all causes [44].
We will give more details about types of accidents caused by haulage equipment and machines in underground mining [42,44,51,52,56,57,58].
Figure 4 shows different types of mining machinery used either in surface mining or in underground mining. However, in this study we focus on underground mining machinery because we use such machinery in mining Moroccan ores, such as cobalt, copper, silver, and gold. Dumpers and load haul dump (LHD) vehicles, which serve as the primary means of transporting material within the mine, are utilized in underground mining. So, our study concentrates on the types of accidents this machinery causes.
Table 2 shows the most frequently recorded injuries involving the following pieces of equipment in underground mines: dumpers and load–haul–dump (LHD) vehicles.
Table 3 shows that dumper reversals are responsible for 21% of dumper accidents, whereas front run-overs and collisions are responsible for 29% and 18% of dumper incidents, respectively.
Table 4 shows the frequency of injuries linked to Load–Haul–Dump (LHD) equipment for each combination of activity and mechanism.
When we analyze these statistics, it becomes clear that injuries are more common among LHD drivers and that the most common injury mechanisms are collisions, which are connected with bumpy roads. The next most prevalent action at the time of injury was gaining entrance to, and especially egress from, LHDs, with falls from the machine being a major injury mechanism.
A simplified classification of the causes of injuries is introduced in Figure 5: human mistakes, maintenance issues, and environmental factors. Generally, the authors in [42,44,51,52,56,57,58] concentrate on human error, recommending more training for miners to resolve the problem. Two major problems in the repair process can be noted: inadequate maintenance and the malfunction of mechanical components. These papers also emphasize poor road construction, lack of visibility, and inadequate signaling as causal factors in accidents.
This study conducts a comprehensive review of anti-collision systems for detecting pedestrians in underground mines based on computer vision and artificial intelligence. The article covers an extensive range of sensors and algorithms utilized in these systems, including deep learning techniques, cameras, and LIDAR. Moreover, various data collection methods employed in training these systems are discussed. Additionally, this study evaluates different types of anti-collision solutions, such as collision avoidance, rescue operations, mapping, and navigation, along with presenting some of the industrial solutions implemented in the mining industry. Finally, we offer unique perspectives and insights on the current state of the art in the field, addressing the challenges and opportunities for future research and development in the mining industry.

2. Materials and Methods

The Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) [59] is a set of evidence-based guidelines for reporting systematic reviews and meta-analyses. The original goal of PRISMA was to report on reviews that evaluate the effects of interventions, but it can also be helpful for other types of systematic reviews. Other fields of knowledge quickly adopted this technique since it incorporates the criteria of scientific methodology, such as objectivity and repeatability, into the literature review process. In 2015, the authors of the PRISMA Statement made substantial modifications to PRISMA [60] to emphasize the importance of the methodological component (PRISMA-P).
The most prominent journals and databases within the interdisciplinary and engineering disciplines, such as Scopus, Google Scholar, ScienceDirect, MDPI, and the Directory of Open Access Journals, were used as sources of information for this study. Throughout this research, the keywords “pedestrian”, “human”, “detection”, “anti-collision”, and “systems” were established a priori and were combined with “mine” and “underground” using the Boolean operator “AND”.
During the early stages, the primary goal of this process was to identify and select the most relevant information (based on specified research standards). In the second phase, the snowballing approach [61] was used to include additional literature and extend the publication period. During the eligibility phase, each record was evaluated against a set of inclusion criteria to determine whether it should be included in the research: articles were required to provide data for a specific time frame and include analysis of accidents and descriptions of equipment. Any articles that failed to meet these requirements were omitted from the investigation.
Following the PRISMA flow diagram shown in Figure 6, we tracked 2000 articles through the databases cited above. After applying the exclusion criteria (see Table 5), we eliminated all articles not published between 1998 and 2022 and all articles not written in English, leaving 1868 articles. Next, we removed 165 duplicate papers. We rejected 1289 articles by analyzing titles, 247 articles by analyzing abstracts, and 45 articles because they were not available. Finally, among the remaining 95 articles, we excluded 55 articles that concern surface mines, open-pit mines, and tunnels.

3. Proposed Solutions for Pedestrian Detection in Underground Mines

3.1. The Application of Computer Vision in Underground Mining Operations

Despite improvements in sensor technology, the use of many types of sensors remains limited in underground mines. This is due to confined spaces, reduced visibility, lack of illumination, dust, high temperatures, and the lack of wireless communication systems.
The use of computer vision in underground mines has many applications. These applications include mapping and navigation inside underground mines, anti-collision systems for pedestrian safety, and underground mine rescue operations (Figure 7).

3.2. Sensory Part

Unfortunately, environmental conditions (humidity, high temperature, pressure, dust, and lack of visibility) prevent the use of many types of sensors. Nevertheless, a multitude of devices are available that can identify people and other vehicles near mining vehicles, although each of these systems has particular restrictions. GPS (Global Positioning System) receivers are frequently employed in surface operations; because of their reliance on satellite signals, however, this technology cannot be used in an underground environment [19,20,63]. Ultra-wide band (UWB) radar detects obstacles by sending out electromagnetic waves in the radio spectrum; pulse radar and continuous-wave (Doppler) radar are the two forms of radar. Among their limitations, they raise false alarms and cannot distinguish pedestrians from objects [19,20,63]. RFID systems use radio waves to identify persons and items: workers wear tags with identification codes, which are detected by a reader when they enter a danger zone, triggering an alarm that warns the vehicle operator both audibly and visually. Among the inconveniences of RFID systems, they do not provide the precise location of employees, every vehicle and person must be equipped with tags, and wall roughness causes signal attenuation in the UHF band [19,20,63]. Ultrasonic sensors use the speed of sound to calculate the distance to an object that reflects an emitted sound wave. Several limitations restrict their use in underground mines: temperature, humidity, and angle of incidence affect the speed of sound, which in turn influences the accuracy of the measured distances [19,20,63]. LIDAR instead emits pulses of infrared light and measures the time it takes for those pulses to return after reflecting off surrounding objects. A LIDAR sensor can accurately measure the distance to objects in a scanning area up to 50 meters away; it is fast and gives an exact depth measurement [19,20,63].
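To make the ranging principle behind ultrasonic and LIDAR sensors concrete, the following minimal Python sketch (our own illustration, not taken from any of the cited systems; the echo times are hypothetical) converts a measured round-trip time into a distance. It also shows why the temperature dependence of the speed of sound degrades ultrasonic accuracy, whereas the speed of light used by LIDAR is effectively constant.

# Minimal time-of-flight ranging sketch (illustrative only).
# distance = propagation_speed * round_trip_time / 2

SPEED_OF_LIGHT = 299_792_458.0  # m/s, used by LIDAR

def speed_of_sound(temp_celsius: float) -> float:
    """Approximate speed of sound in air (m/s); it drifts with temperature."""
    return 331.3 + 0.606 * temp_celsius

def tof_distance(round_trip_s: float, speed: float) -> float:
    """Convert a round-trip time of flight into a one-way distance (m)."""
    return speed * round_trip_s / 2.0

if __name__ == "__main__":
    echo_time = 0.030  # hypothetical 30 ms ultrasonic echo
    # The same echo time yields different distances at 10 and 40 degrees C,
    # which is one reason ultrasonic ranging is unreliable underground.
    print(tof_distance(echo_time, speed_of_sound(10.0)))   # ~5.06 m
    print(tof_distance(echo_time, speed_of_sound(40.0)))   # ~5.33 m
    # LIDAR: a 300 ns round trip corresponds to roughly 45 m.
    print(tof_distance(300e-9, SPEED_OF_LIGHT))            # ~45.0 m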
On the other hand, the parts that interest us most in this review are the solutions based on computer vision and artificial intelligence.
First, we will describe the types of cameras used to solve these problems. Cameras are employed in a variety of industries to allow the driver to navigate a blind spot region in real time through a display in the driver’s cab.

3.2.1. RGB Camera

An RGB (red, green, blue) camera can be employed for multiple purposes, including surveying and mapping, stockpile volume calculations, traffic and security surveillance, inspection, and more. An RGB camera (Figure 8a) is a sensor device that captures color images as well as a per-pixel depth map. In low-light environments, it needs an LED spotlight (Figure 8b). To provide depth estimates at a large number of pixels, such cameras employ either active stereo or time-of-flight sensing technology. The camera must be carefully chosen, taking into account the drone's energy consumption; for fixed-wing drones, a small camera is preferred because bulky devices cannot be carried [19,20,63].

3.2.2. Infrared Camera

A thermal camera (Figure 9) is a type of camera that uses heat-sensing technology to detect and display the temperature of objects in its field of view. Thermal cameras work by detecting the infrared (IR) radiation emitted by objects and converting that information into an image that can be displayed on a screen. This allows thermal cameras to “see” through darkness, fog, and other forms of visual obstruction, as well as detect hot and cold spots that may not be visible to the naked eye. There are several applications for thermal cameras, including security, search and rescue, military use, industrial maintenance, and building energy efficiency. They can be handheld, mounted on a vehicle or aircraft, or integrated into a fixed surveillance system. Thermal cameras can be useful in a number of situations, including detecting people or animals in the dark, finding hot or cold spots in a building or electrical system, and identifying sources of heat loss or gain in a building envelope. They can also be used to detect fires, locate sources of heat or flames, and monitor the temperature of industrial equipment or processes [19,20,63].
Figure 9. Thermal camera [66].
Table 6. Thermal infrared range [67].
Name | Wavelength
NIR (Near infrared) | 0.75–1.4 µm
SWIR (Short wavelength infrared) | 1.4–3 µm
MWIR (Medium wavelength infrared) | 3–8 µm
LWIR (Long wavelength infrared) | 8–15 µm
FIR (Far infrared) | 15–1000 µm
As shown in Figure 10 and detailed in Table 6, the infrared (IR) band of the electromagnetic spectrum is large and has a variety of sub-bands [68], each having unique properties and applications. Commercial day/night surveillance systems and remote controls are the most typical uses of near IR, as it is simple to produce and detect yet invisible and unobtrusive. Military night vision equipment generally uses short-wave infrared radiation. Mid-wave IR is utilized for astronomy, monitoring furnaces, and photographing hot objects. Long-wave IR corresponds to the wavelength of IR radiation emitted by objects that are colder than a few hundred degrees Celsius and is therefore used mostly for thermal imaging.

3.2.3. Stereoscopic Camera

A camera capable of capturing three-dimensional pictures is referred to as a stereoscopic camera (Figure 11); it utilizes a combination of at least two lenses, much like the human visual system. Stereo cameras use their distinct image sensors to create three-dimensional pictures and offer the benefits of high precision and high-quality resolution in a controlled environment. Under smoke, fog, or dust, however, they perform poorly because light waves are distorted in these conditions [19,20].
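As a minimal illustration of how a stereo pair yields depth (our own sketch, not a specific system from the literature; the focal length, baseline, and disparity values are hypothetical), the depth Z of a point follows from the focal length f, the baseline B between the two lenses, and the disparity d between its positions in the left and right images, Z = f * B / d:

import numpy as np

def disparity_to_depth(disparity_px, focal_px, baseline_m):
    """Convert a disparity map (pixels) to depth (meters): Z = f * B / d.
    Invalid (zero or negative) disparities are mapped to infinity."""
    disparity_px = np.asarray(disparity_px, dtype=np.float64)
    depth = np.full_like(disparity_px, np.inf)
    valid = disparity_px > 0
    depth[valid] = focal_px * baseline_m / disparity_px[valid]
    return depth

# Example: a 700-pixel focal length, 12 cm baseline, and 20-pixel disparity
# give a depth of about 4.2 m; smaller disparities mean farther objects.
print(disparity_to_depth([[20.0, 5.0]], focal_px=700.0, baseline_m=0.12))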

3.2.4. LIDAR (Light Detection and Ranging)

Similar to radar, LIDAR (Figure 12) uses pulses of infrared light rather than radio waves to conduct remote sensing and range operations. After the pulses bounce off surrounding objects, the return time is measured. Knowing the speed of light, the LIDAR sensor can accurately calculate the distance to each object from the time between the emission of the laser pulse and the return pulse. Every second, LIDAR takes millions of precise distance measurement points from which a 3D matrix of its surroundings can be produced. This detailed mapping can provide information on the position, shape, and behavior of objects.
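The 3D point cloud mentioned above is obtained by combining each measured range with the known firing angles of the laser beam. The short sketch below is our own illustration of this polar-to-Cartesian conversion, with hypothetical range and angle values; it is not the firmware of any particular LIDAR.

import numpy as np

def ranges_to_points(ranges_m, azimuth_rad, elevation_rad):
    """Convert LIDAR range measurements plus beam angles into 3D points.

    ranges_m, azimuth_rad, elevation_rad: 1D arrays of equal length.
    Returns an (N, 3) array of x, y, z coordinates in the sensor frame.
    """
    r = np.asarray(ranges_m)
    az = np.asarray(azimuth_rad)
    el = np.asarray(elevation_rad)
    x = r * np.cos(el) * np.cos(az)
    y = r * np.cos(el) * np.sin(az)
    z = r * np.sin(el)
    return np.stack([x, y, z], axis=1)

# Hypothetical example: three returns at 10 m, 12 m, and 8 m.
pts = ranges_to_points([10.0, 12.0, 8.0],
                       np.radians([0.0, 45.0, 90.0]),
                       np.radians([0.0, 2.0, -2.0]))
print(pts)  # one x, y, z row per return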

3.3. Data Collection Method

We talked about the sensory part in the previous subsection. Now, we will discuss the data collection method. Researchers have conducted various experiments using four hardware systems: aerial vehicles (drones), unmanned ground vehicles (UGVs), mobile machines, such as load–haul–dump (LHD) vehicles, and fixed cameras such as surveillance cameras.

3.3.1. Drones

Drones, or unmanned aerial vehicles (UAVs) (Figure 13), can also be used in underground mines as a way to improve safety, efficiency, and productivity. These drones are typically equipped with sensors and cameras and are controlled remotely by a human operator or through autonomous programming. There are various uses for drones in underground mines, including surveying, mapping, monitoring, inspection, hazardous tasks, and search and rescue [19,72]. Overall, the use of drones in underground mines can help improve safety and efficiency by reducing the need for human workers to perform tasks in potentially hazardous environments.
The purpose of this research [23] is to present a convolutional neural network (CNN) method that allows low-cost micro aerial vehicle (MAV) platforms to autonomously navigate through a dark underground mine. The proposed CNN component uses the picture stream from the onboard camera to generate online heading rate commands for the MAV, allowing the platform to follow the tunnel axis without collisions. The way the dataset used to train the CNN was produced is a novel aspect of the presented approach.
In order to deliver useful data in textureless [24], dark, and obscurant–filled conditions, the combination of optical and thermal cameras, LIDAR, and inertial measurement unit cues is examined. In northern Nevada, the integrated system is being evaluated in field experiments inside underground metal mines.
In this paper [73], E. Jones et al. describe the deployment of a Hovermap autonomous flight system in underground and GPS-denied areas during the development of the system. A drone equipped with a LIDAR sensor was utilized in the development of this system. The primary goal of the case studies was to increase safety by better understanding the rock mass behavior and failure mechanisms frequently observed in deep and high-stress mining situations and by incorporating these insights into the design process.
The paper [74] analyzes the potential for creating a completely autonomous, enclosed, drone-based remote monitoring system to survey inaccessible zones in underground mines. Flying a drone underground is highly challenging because of the tight quarters, poor visibility, air velocity, dense concentrations of dust, absence of a wireless connection system, and other factors. Furthermore, it is very hard for a drone operator to access hazardous areas in underground mines. This study proposes solutions to these problems by suggesting the construction of an autonomous spherical micro-drone for monitoring underground mine safety.
In this study [75], only a UAV with effective onboard lighting for filming is able to remotely visit regions of the Cuiabá mine, which uses the sublevel mining method. Additionally, the equipment must be able to fly without GPS, at varied low speeds to absorb minor bumps from the rock wall and other support structures, and often without the pilot's visual contact. The UAV controls must also be simple enough for miners with no aeronautical experience or understanding to use. For this objective, a hybrid device was created that brings together a helium-filled balloon and quadcopter propellers. It is equipped with remote control capabilities, rechargeable batteries, powerful LED lighting, image-stabilized cameras, and radio frequency transmitters for both control and visualization of images. To prevent issues specific to mine operations, the device has been customized with a variety of additional solutions. As a result of the insights provided by the UAV's imagery, the local team changed projects, rethought pillar sizes, discovered ore locked inside the stopes, and identified other critical circumstances, increasing safety and output.

3.3.2. Unmanned Ground Vehicle

Unmanned ground vehicles (UGVs) are becoming increasingly popular in underground mines as a way to improve safety (Figure 14), efficiency, and productivity. These vehicles are equipped with sensors and cameras and are typically controlled remotely by a human operator or through autonomous programming. Some potential uses for UGVs in underground mines include [76,77] surveying and mapping, material handling, and hazardous tasks. Overall, the use of UGVs in underground mines can help to improve safety and efficiency by reducing the need for human workers to perform tasks in potentially hazardous environments.
The authors of [22] presented many scenarios involving underground mine rescue. This paper presents the outcome of testing a UGV robotic system equipped with a sensory system and an image processing module. The module is based on an adaptation of the you only look once (YOLO) and histogram of oriented gradients (HOG) algorithms.
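As an illustration of the HOG component of such an image processing module, OpenCV ships a HOG descriptor with a pretrained people detector. The short Python sketch below is our own example rather than the exact pipeline of [22]; the input file name is hypothetical.

import cv2

# Minimal HOG-based pedestrian detection on one frame (illustrative only).
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

frame = cv2.imread("mine_frame.jpg")  # hypothetical input image
boxes, weights = hog.detectMultiScale(frame,
                                      winStride=(8, 8),
                                      scale=1.05)
for (x, y, w, h) in boxes:
    # Draw a green rectangle around each detected person.
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
cv2.imwrite("mine_frame_detections.jpg", frame)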
For belt conveyor maintenance, the authors of [78] recommend using an unmanned ground vehicle (UGV) equipped with both RGB and thermal cameras and a localization system.
In this study [25], the efficacy of a system for identifying road signs using machine vision and an autonomous vehicle for driving in underground mines was evaluated using a combination of data from cameras and LIDAR sensors.
In the field of mine safety, coal mine rescue robots with binocular vision have been tested. This work [76] comprehensively introduces the scientific state of stereo vision matching for binocular vision and camera calibration.
In this investigation [26], the accuracy of three techniques for determining the location of an autonomous driving robot in underground mines was compared. These techniques were an inertial measurement unit combined with encoder sensors (IMU + encoder), light detection and ranging with encoder sensors (LIDAR + encoder), and a combination of IMU and LIDAR with encoder sensors (IMU + LIDAR + encoder). As the robot moved, its position was calculated using each of these techniques, and the precision of the results was assessed by comparing the estimated location with the robot’s actual location.
This research [28] analyses the specifications and suggests a robot design capable of autonomous underground navigation and object manipulation. A robust four-wheeled platform with weather-resistant electric motors serves as the robot’s base. The kit comes with illumination, color and depth cameras, laser scanners, an inertial measurement unit, and a robotic arm. Two experiments were conducted to evaluate self-navigation and mapping.
Geothermal heat causes hot walls in deep gold mines, and thermal imaging becomes a viable alternative for structural imaging. The authors in [77] use the combination of temperature and 3D data to allow for the calculation of a risk measure, which may be used to identify possible danger zones. The miner’s interpretation of risk data might be represented in a variety of ways. Finally, a distance sensor and a thermal camera are utilized to detect and monitor pedestrians in order to forecast probable accidents.
In this research [79], the authors propose PftNet, a novel parallel feature transfer network-based detector that achieves the efficiency of one-stage approaches [80,81,82,83] while maintaining the accuracy of two-stage methods [84,85,86]. The pedestrian identification module and the pedestrian location module are two interrelated components that make up PftNet. The former coarsely adjusts the anchor boxes' locations and sizes, filters out negative anchor boxes, and improves the initialization of the regression.
The paper [87] presents an improved SLAM algorithm, called LeGO-LOAM-SM, which is designed to construct more precise point cloud maps using LIDAR sensors and improve position estimation in underground coal mines. The proposed algorithm is evaluated with experimental data, showing better map clarity, trajectory smoothness, and accuracy. Moreover, the study explores a two-dimensional occupancy grid map, which effectively filters outlier noise, and achieves high mapping accuracy of 0.01 meters while minimizing storage space requirements.
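To illustrate the general idea of a two-dimensional occupancy grid map (this is our own minimal sketch, not the LeGO-LOAM-SM implementation, and the scan points, resolution, and hit threshold are hypothetical), 2D LIDAR hit points can be binned into grid cells at a chosen resolution; cells that accumulate enough hits are marked occupied, which also filters isolated outlier returns.

import numpy as np

def occupancy_grid(points_xy, resolution=0.05, size_m=20.0, min_hits=3):
    """Build a naive 2D occupancy grid from LIDAR hit points.

    points_xy: (N, 2) array of x, y hits in meters, sensor at the center.
    resolution: cell size in meters; size_m: grid side length in meters.
    Cells with fewer than min_hits returns stay free, filtering outliers.
    """
    n_cells = int(size_m / resolution)
    counts = np.zeros((n_cells, n_cells), dtype=int)
    idx = np.floor((np.asarray(points_xy) + size_m / 2) / resolution).astype(int)
    idx = idx[(idx >= 0).all(axis=1) & (idx < n_cells).all(axis=1)]
    for i, j in idx:
        counts[j, i] += 1           # row = y index, column = x index
    return counts >= min_hits       # boolean occupancy map

# Hypothetical scan: a cluster of wall hits plus one spurious return.
scan = np.array([[2.0, 1.0]] * 5 + [[7.3, -4.1]])
grid = occupancy_grid(scan)
print(grid.sum(), "occupied cells")  # the lone outlier is filtered out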

3.3.3. Mobile Machines

A load–haul–dump (LHD) [88,89] is a type of underground mining vehicle that is used to load, haul, and dump material within an underground mine (Figure 15). These vehicles are designed to operate in tight spaces, such as mine tunnels and chambers. LHDs are typically used to transport materials, such as ore, waste rock, and coal, within the mine. They are equipped with a bucket or scoop on the front of the vehicle, which is used to load material into the vehicle. The material is then transported to a designated dumping location, where it is unloaded using the bucket or a conveyor belt. LHDs are commonly used in underground mines to improve efficiency and productivity. They can reduce the need for human workers to manually load and transport materials and can operate around the clock if necessary. They are also capable of operating in a variety of underground environments, including mines with low headroom or limited access.
The proposed collision detection system [62] combines a three-dimensional (3D) sensor with a thermal infrared camera for human identification and tracking. A distance sensor provides depth information in addition to a thermal camera, allowing the vehicle and pedestrian velocities to be estimated.
In a simulated environment, the control of an underground loader was analyzed using reinforcement learning and a multi-agent deep neural network approach [27]. At the initiation of each loading cycle, an agent was responsible for selecting the excavation position from a visual representation captured by a depth camera of a pile of fragmented rock. In order to fill the bucket at the prescribed loading location without colliding, getting stuck, or losing ground traction, a second agent was in charge of maintaining vehicle control.
The authors of this work [90] provide several new vision-based algorithms for visual place recognition on underground mining vehicles that outperform existing technologies while requiring only a camera input. They describe a Shannon entropy-based saliency generation strategy that improves single-image-based place recognition by selectively processing image regions. The authors then add a learning-based approach that uses support vector machines (SVMs) to filter problematic images from both the reference map databases and live processing.
This research [91] describes a human detection strategy for preventing accidents in mines that are dark or dimly lit. This technology employs a mix of infrared sensors and distance estimation through sound or light. When there is no light, the approach is used to ensure both human recognition and precise distance measurement. The outcomes of the experiments are shown, demonstrating the strategy’s efficacy and efficiency.
For monitoring the safety and real-time location of underground miners, the enhanced YOLOv4 model reaches 96.25% and 48.2 fps in global AP (average precision) and detection speed, respectively, according to experimental results in this study [29]. The enhanced YOLOv4 model also has the benefits of excellent robustness and generalization capacity, which make it ideal for detecting individuals underground and offering a solid assurance for the management and monitoring of mine workers’ safety.
Detecting obstacles in front of unmanned electric locomotives while they operate is one of the most critical technologies for safe operation. The authors of [30] proposed an improved YOLOv3 (YOLOv3-4L) algorithm for intelligent obstacle detection. The unsafe region for driving an electric locomotive can be found by locating the track and extending it outward by a specified distance. Obstacles are found using the enhanced YOLOv3 algorithm, and only those whose types and locations coincide with risky zones are output. The experimental findings demonstrate that conventional computer vision approaches, such as perspective transformation, sliding windows, and least-squares cubic polynomials, can detect both straight and curved tracks, making up for the Hough transform's limitations in this regard. The YOLOv3-4L algorithm improves the MAP (mean average precision) by 5.1% and the detection speed by 7 fps when compared to the original YOLOv3 method. Thanks to its high detection accuracy and speed, the YOLOv3-4L detection model can match real-world working conditions and serve as a technical guide for autonomous electric locomotive operation in underground coal mines.
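A minimal example of the least-squares cubic-polynomial step mentioned above is given below. It is our own sketch under the assumption that candidate track pixels have already been extracted (e.g., by the sliding-window step); the pixel coordinates and lateral margin are hypothetical, not values from [30].

import numpy as np

# Hypothetical track pixel coordinates extracted by sliding windows
# (y grows towards the bottom of the image, x is the lateral position).
ys = np.array([100, 150, 200, 250, 300, 350, 400], dtype=float)
xs = np.array([320, 322, 327, 335, 346, 360, 377], dtype=float)

# Least-squares cubic fit: x = a*y**3 + b*y**2 + c*y + d
coeffs = np.polyfit(ys, xs, deg=3)
track = np.poly1d(coeffs)

# Sample the fitted centerline and offset it by a fixed margin (in pixels)
# on each side to delimit the unsafe region in front of the locomotive.
y_dense = np.linspace(ys.min(), ys.max(), 50)
center = track(y_dense)
margin_px = 40
left_bound, right_bound = center - margin_px, center + margin_px
print(center[:3], left_bound[0], right_bound[0])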
The purpose of this study [92] is to investigate the analysis of moving targets in coal mines using computer vision processing and to explore image processing techniques for target monitoring. The paper presents the concept of image enhancement and its related algorithms, with experimental results showing that the proposed algorithm outperforms the histogram equalization and dark primary color prior dehazing algorithms by 56.60% and 68.26%, respectively. The study also provides examples of image enhancement in coal mines and concludes that the proposed algorithm yields superior results.
In response to the need for real-time detection of obstacles in front of underground track mine carts, a new system [32] was proposed that utilizes both camera and LIDAR information. The system employs a custom point cloud clustering algorithm specifically designed for the challenging mine environment to extract obstacle information and then leverages the YOLOv5 algorithm to identify obstacles in the resulting images.
Imam et al. [93] propose a new anti-collision system for pedestrian detection in underground mines based on RGB images collected in five different mines. They used Yolov5s, which reached 75% precision and 71% MAP.
The paper [94] introduces a new method for solving the problem of local path planning in reactive navigation of underground vehicles. The proposed approach uses 2D LIDAR to extract the centerline of the current laneway from the skeletons obtained from the LIDAR data. The centerline is then smoothed to produce the current planned local path in underground mining scenarios.

3.3.4. Surveillance Cameras

Surveillance cameras can be used in underground mines to monitor and assess the condition of equipment, infrastructure, and the environment within the mine (Figure 16). These cameras can be placed in strategic locations throughout the mine and can be used to transmit real-time video footage to a central monitoring station. There are several potential benefits to using surveillance cameras in underground mines: improved safety, enhanced productivity, and improved communication. Overall, the use of surveillance cameras in underground mines can help to improve safety and efficiency by providing mine operators with a better understanding of conditions within the mine.
The authors of [21] integrate YOLOv2 with the FCN skip structure to increase the accuracy of identifying pedestrians in groups in underground coal mines. In this approach, they offer YWSSv1 and YWSSv2, two modified variants of YOLOv2.
The problem of numerous noise sources in underground mine videos is addressed in this work [96], which proposes an improved K-SVD object identification method.
In this research, Limei Cai et al. [97] presented a method for locating miners by detecting their helmets. Background subtraction, denoising, erosion, region growing, circle detection, and tracking are among the processing steps. Real underground coal mine footage was used to test the method.
In this research [95], the authors present a human tracking approach for underground coal mines using scene information fusion, with the goal of improving tracking accuracy and resilience in the face of difficult elements (e.g., low illumination and low intensity contrast, bright spots, and shadow disturbance).
Zongwen Bai et al. [98] present an efficient hybrid image feature detection approach for seamless stitching of industrial coal mine surveillance footage taken in real-world scenarios using multiple cameras. This approach utilizes valuable information from several cameras to achieve fast video stitching.
In this paper [31], an improved mine pedestrian detection algorithm, yolov4-tiny-SPP, based on yolov4-tiny was proposed. The algorithm adds an SPP (spatial pyramid pooling) module after the darknet backbone layer. It addresses the occlusion of pedestrians in mines and reached 91.98% MAP, which was 2.32% higher than the original algorithm.
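For readers unfamiliar with SPP, the sketch below is a generic PyTorch illustration (not the exact yolov4-tiny-SPP code, and the kernel sizes and feature-map shape are assumptions) of the typical block: the same feature map is max-pooled with several kernel sizes at stride 1 and the results are concatenated, enlarging the receptive field without changing the spatial resolution.

import torch
import torch.nn as nn

class SPPBlock(nn.Module):
    """Spatial pyramid pooling block as commonly inserted after a backbone."""
    def __init__(self, kernel_sizes=(5, 9, 13)):
        super().__init__()
        # Stride 1 and "same" padding keep the spatial size unchanged.
        self.pools = nn.ModuleList(
            [nn.MaxPool2d(kernel_size=k, stride=1, padding=k // 2)
             for k in kernel_sizes]
        )

    def forward(self, x):
        # Concatenate the original map with its multi-scale pooled versions.
        return torch.cat([x] + [pool(x) for pool in self.pools], dim=1)

# A 256-channel, 13x13 feature map becomes 1024 channels, still 13x13.
features = torch.randn(1, 256, 13, 13)
print(SPPBlock()(features).shape)  # torch.Size([1, 1024, 13, 13])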
A low-cost battery-free localization scheme called MineBL was proposed [99] for accurate positioning in underground coal mines. The approach utilizes reflective balls as position nodes and applies a series of algorithms based on depth images to achieve target localization. Moreover, an optimized weighted centroid location algorithm based on multilateral location errors was developed to minimize localization errors in underground scenarios. The experimental results demonstrate that MineBL performs well with localization errors below 30 cm in 95% of cases.
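As a rough illustration of the weighted centroid idea behind such schemes (our own generic sketch, not the optimized MineBL algorithm; the anchor positions and range values are hypothetical), the target position can be estimated as an average of known anchor positions weighted by inverse estimated distance, so that nearer anchors dominate.

import numpy as np

def weighted_centroid(anchors_xy, distances_m, eps=1e-6):
    """Estimate a 2D position from anchor positions and estimated ranges.

    anchors_xy: (N, 2) known positions of reference nodes (meters).
    distances_m: (N,) estimated distances from the target to each anchor.
    Weights are inverse distances, so closer anchors count more.
    """
    anchors = np.asarray(anchors_xy, dtype=float)
    w = 1.0 / (np.asarray(distances_m, dtype=float) + eps)
    return (anchors * w[:, None]).sum(axis=0) / w.sum()

# Hypothetical example: three reflective-ball anchors in a roadway.
anchors = [[0.0, 0.0], [4.0, 0.0], [2.0, 3.0]]
ranges = [2.2, 2.3, 1.1]            # e.g., derived from a depth image
print(weighted_centroid(anchors, ranges))  # estimate near (2.0, 1.5)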

3.4. Algorithmic Part

Having discussed the sensory systems and the data collection methods in the previous subsections, we now turn to the algorithms (Table 7) used in prior research applying computer vision in underground mines, whether for pedestrian detection, navigation, or mapping in anti-collision systems.
Table 7 lists these algorithms together with their types, the data they use, and their objectives. Additionally, Table 8 shows the AP that these algorithms reached; Yolov4 has the highest precision relative to the number of frames per second.
On the other hand, underground mines present very severe environmental conditions, such as dust, dew drops, lack of visibility, and high humidity. The only environment with conditions close to those of an underground mine is road scenes at night or in severe weather, such as fog and rain. We therefore also summarize research on this related problem (see Table 9).
As described in [102], the efficiency of existing pedestrian detection algorithms is severely limited by the decreased visibility and fuzzy outlines of pedestrians in pictures taken during foggy weather. Three innovative deep learning techniques based on Yolo are presented by the authors; their network was made more efficient by using depthwise separable convolution and linear bottleneck techniques to reduce the computational cost and number of parameters. In [103], the authors presented a 16-bit thermal dataset collected during severe weather conditions in the four largest European Union countries; they evaluated 16-bit-depth variants of the YOLOv3 DNN-based detector, reaching a MAP of up to 89.1%. Moreover, the experimental findings in [104] demonstrate that CNN-based detectors achieved high performance on FIR images at night owing to the large amount and diversity of training data. The authors of [105] propose a deep-learning-based data augmentation technique and evaluate the six most accurate and fastest detectors (TinyV3, TinyL3, YOLOv3, YOLOv4, ResNet50, and ResNext50) on far-infrared images, collected in good weather conditions, that were enriched with distortions similar to those caused by bad weather. Multispectral color–thermal image pairs were shown in [106] to be more effective than a single color channel for pedestrian identification, particularly under difficult lighting conditions. The authors found that the confidence in detecting pedestrians in color or thermal pictures is proportional to the lighting conditions; with this in mind, they propose an illumination-aware Faster R-CNN (IAF R-CNN), introducing an illumination-aware network that provides an illumination measure of the input picture. The study [107] uses an image feature extractor based on histograms of oriented gradients followed by a support vector machine (SVM) classifier to evaluate pedestrian recognition in low-resolution night-vision infrared images.
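To show why depthwise separable convolution reduces computation (a generic PyTorch sketch of the technique, not the exact network of [102]; the channel counts are assumptions), a standard k-by-k convolution is split into a per-channel depthwise convolution followed by a 1x1 pointwise convolution, cutting the parameter count roughly by a factor of k squared for wide layers.

import torch.nn as nn

def depthwise_separable(in_ch, out_ch, k=3):
    """Depthwise (per-channel) convolution followed by a 1x1 pointwise one."""
    return nn.Sequential(
        nn.Conv2d(in_ch, in_ch, kernel_size=k, padding=k // 2, groups=in_ch),
        nn.Conv2d(in_ch, out_ch, kernel_size=1),
    )

def param_count(module):
    return sum(p.numel() for p in module.parameters())

standard = nn.Conv2d(128, 128, kernel_size=3, padding=1)
separable = depthwise_separable(128, 128, k=3)
# Roughly 147k parameters versus about 18k for the separable version.
print(param_count(standard), param_count(separable))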
Instead of employing a strict intensity threshold, the authors in [108] created areas of interest by merging picture segments and exploiting infrared images’ low-frequency characteristics. It helped to make the technique less reliant on light and more adaptable.
A technique for identifying and tracking pedestrians using a single digital infrared camera was demonstrated in [109]. In this example, an assessment was made to distinguish between objects identified as hot spots and humans through the use of a support vector machine (SVM) algorithm.
Ref. [110] contains an example of a detection system’s module for pedestrian recognition, as well as some ideas for possible enhancements to evaluate human presence in both nocturnal and daytime situations.
The authors of [111] emphasized the need to integrate high-resolution imaging with range data to improve detection system performance.
The authors demonstrated a way to extract form characteristics from segmented stereo vision images in a subsequent paper devoted to a system for detecting, localizing, and tracking individuals for military UGVs [112].
An adaptive Boolean-map-based saliency model was then included to improve thermal image recognition of people [113], which allowed for better differentiating a person from the background.
In a recent paper, the authors of [114] demonstrated that an improved pedestrian identification technique based on a convolutional neural network is accurate in diverse weather situations when compared to five other standard methods. Below, we cite the research already conducted to solve this problem using computer vision.
Ref. [115] provided another example of an advanced rapid detection system that may be used to identify people (among 9000 item types).
In [116], both a thermal and an RGB camera were used for referencing infrared images.
As shown in Table 9, we list these algorithms with their types, the data they use, and their objectives. Additionally, Table 10 shows the AP reached by all these algorithms: Yolov5s had the highest AP, and Yolov4 reached the highest MAP.
According to Table 11, the different solutions use an RGB camera and a thermal camera for pedestrian detection and a stereo camera and LIDAR for depth estimation. Algorithms used for pedestrian detection are Yolo and its variants (v3, v4, 9000, and tiny), Fast R-CNN, HOG, and SVM. In addition, stereo matching algorithms are used for depth calculation. The final result is pedestrian detection combined with pedestrian distance prediction; a minimal sketch of such a pipeline is given after Table 11 below.
Table 9. The most commonly used algorithms in road scenes under severe weather conditions with their purpose.
Algorithm | Data Type
Yolo (v1, v2, v3, v4 and v5) [102,103,105,113,117] | RGB images [102], thermal images [103,105,113,117]
Tiny (v3, L3, v2) [103,105,113] | Thermal images [103,105,113]
Fast R-CNN [104,106] | RGB images [106], thermal images [104,106]
HOG [107,110,113] | RGB images [110], thermal images [107,110,113]
SVM [107,108,109,110,113,118] | RGB images [110,118], thermal images [107,108,109,110,113]
ROI [111,112] | Stereoscopic images [111,112]
Purpose (shared by all rows): pedestrian detection [102,103,104,105,106,107,108,109,110,111,112,113,114,117,118]; detection of falling elderly people [118]
Table 10. The most used algorithms with the accuracy they have reached.
Reference | Algorithm | AP (%) | MAP (%) | fps
[102] | Yolov3 | 78 | N/A | 22.2
[102] | VggPrioriBoxes-Yolo | 80.5 | N/A | 81.7
[102] | MNPrioriBoxes-Yolo | 80.5 | N/A | 151.9
[103] | Yolov3 | N/A | 80.5 | N/A
[103] | TINYv3 | N/A | 66.3 | N/A
[105] | TINYv3 | N/A | 73.26 | 55.57
[105] | TINYL3 | N/A | 80.14 | 43.01
[105] | Yolov3 | N/A | 80.48 | 17.88
[105] | Yolov4 | N/A | 86.05 | 15.97
[105] | ResNet50 | N/A | 81 | 19.82
[105] | ResNet50 | N/A | 77.07 | 17.7
[113] | Tiny Yolov2 with ABMS | 87.12 | N/A | 62.8
[113] | Tiny Yolov2 with BMS | 85.37 | N/A | 62.8
[113] | Tiny Yolov2 without preprocessing | 78.4 | N/A | 63.8
[117] | Yolov5s | 90 | N/A | N/A
Table 11. Summary of solutions.
Purpose | Sensors | Algorithms
Pedestrian detection | RGB camera; thermal camera | Yolo; Fast R-CNN; HOG
Depth estimation | Stereoscopic camera; LIDAR sensor | Stereo matching; point cloud image
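To make the sensor-plus-algorithm pairing in Table 11 concrete, the sketch below (our own illustration, using hypothetical detection boxes and a synthetic depth map standing in for stereo matching or a projected LIDAR point cloud) estimates the distance to each detected pedestrian as the median depth inside its bounding box and raises a warning below a chosen threshold.

import numpy as np

def pedestrian_distances(boxes, depth_map, warn_below_m=5.0):
    """Combine detections with a depth map to estimate pedestrian distances.

    boxes: list of (x1, y1, x2, y2) pixel boxes from a detector (e.g., Yolo).
    depth_map: 2D array of per-pixel depth in meters (stereo or LIDAR based).
    Returns (distance, warning_flag) per box; the median is robust to holes.
    """
    results = []
    for x1, y1, x2, y2 in boxes:
        patch = depth_map[y1:y2, x1:x2]
        valid = patch[np.isfinite(patch) & (patch > 0)]
        dist = float(np.median(valid)) if valid.size else float("inf")
        results.append((dist, dist < warn_below_m))
    return results

# Hypothetical frame: one detection on a synthetic 480x640 depth map.
depth = np.full((480, 640), 12.0)
depth[200:400, 300:380] = 3.5                 # a person about 3.5 m away
print(pedestrian_distances([(300, 200, 380, 400)], depth))  # [(3.5, True)]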

3.5. Industrial Solutions

Over the past several years, many businesses have made investments in collision avoidance systems (CAS; known also as proximity detection systems) in an effort to decrease the frequency of accidents involving injuries and fatalities.
The Earth Moving Equipment Safety Round Table (EMESRT) is one organization that focuses on vehicle interaction and has done so for over ten years. According to the EMESRT, approximately 30–40% of mining industry deaths in underground operations each year are due to failures of vehicle interaction controls, with about half of these involving pedestrians [119].
EMESRT carried out a project that initially focused on awareness, advice, and intervention (PDS/CAS) technologies. Within a nine-layer model of control effectiveness (see Figure 17), the project was later expanded to include both design and operational controls. Level 9 (L9) PDS solutions introduce technologies that automatically intervene and take control of the machine in order to prevent or mitigate unsafe interactions.
In 2015, a report from the EMESRT highlighted some solutions for underground vehicles. The report noted that underground machines have struggled to achieve interoperability: the development of proximity detection systems linked to the machine's interface has been customized for each individual machine. This creates challenges for operating mines, as there will never be just one brand or type of equipment. Therefore, interoperability between the various levels of sensing and intelligence rules and the various vehicles is crucial for achieving a comprehensive solution for the mining industry.
Several companies offer a wide range of advanced collision avoidance systems that can intervene in the maneuvering of machinery by braking or decreasing speed in order to prevent collisions or run-overs (Level 9 Intervention Control, EMESRT) (see Figure 18). These solutions are shared with the international mining community so that it can gain a clearer view of how they work.
Finally, Table 12 provides a brief summary of most CAS products applied in the mining industry available on the market. Operator fatigue can also lead to collision accidents. Table 13 summarizes some of the industrial solutions that alert operators during their work.

4. Discussion

4.1. General Discussion

Underground mine workers usually face significant problems, such as weak lighting, heat and humidity, polluted air, and cramped spaces. All these environmental limitations prevent several types of sensors, such as GPS, RADARs, ultrasonic sensors, and RFID systems, from working in such places. For this reason, researchers have migrated to solutions based on computer vision using RGB, thermal infrared, and stereoscopic cameras as well as LIDARs. The aim of this study has been to present and explain the technologies and techniques used in anti-collision systems based on computer vision for underground environments and to survey the proposed solutions for pedestrian detection in underground mines.
Finally, to obtain good results for pedestrian detection in deep underground mines, the hybridization of sensors is the better solution, for example the combinations below:
  • RGB Cameras + Thermal Cameras + Stereoscopic Cameras;
  • Thermal Cameras + Stereoscopic Cameras;
  • RGB Cameras + Lidar Sensor;
  • Thermal Cameras + Lidar Sensor.
The objective of this hybridization is that each device compensates for the deficiencies of the others.
For the algorithmic part, Yolov3, v4, and v5 reach the highest precision and efficiency in terms of fps. In addition, they have high precision while using different types of images:
  • RGB images;
  • NIR images;
  • LWIR images;
  • FIR images.

4.2. Improving Mine Safety Based on Computer Vision

In this section, we propose several ideas to improve the performance and accuracy of existing collision avoidance systems using computer vision techniques. Connected mines [12,13,14,121,122] can be equipped with smart sensors that monitor the environment and send real-time alerts to miners in case of danger. Cloud computing [123,124] can be used to store and analyze the data collected by sensors, allowing anti-collision systems to benefit from additional computing power to improve accuracy and responsiveness. A multi-sensor system [22,62,125] uses a combination of sensors, such as cameras, radar, LIDAR, and proximity sensors, to collect information about the environment around mobile mining machines, allowing miners to more easily detect potential hazards. 3D mapping [87,94,125] can be used to create accurate models of the underground environment, improving obstacle and pedestrian detection. Smart robots [22,78,126] can be used to inspect dangerous or inaccessible areas, reducing the risk to miners.
The use of augmented reality [33,34,127] can help miners visualize potential obstacles and hazards, improving their real-time decision making. Finally, improved lighting conditions can help increase the accuracy of computer-vision-based anti-collision systems by providing better visibility in underground areas.

5. Limitations

Despite the efforts made to carry out this work, the comprehensiveness of this study cannot be guaranteed in a rather vast and evolving field.

6. Conclusions

6.1. General Conclusions

In conclusion, this review paper has analyzed the types of anti-collision systems based on computer vision for pedestrian detection in underground mines. The results of this study show that anti-collision systems based on computer vision have the potential to greatly improve safety in underground mines, particularly for pedestrian detection. The different types of sensors used to collect data and prevent collisions between miners and mobile machines, such as RGB cameras, stereoscopic cameras, infrared cameras, and LIDAR sensors, as well as their installation on UGVs, mobile machines, and surveillance cameras, were thoroughly discussed. Additionally, this paper has cited some industrial solutions based on computer vision that have been implemented in underground mines to enhance safety.

6.2. Practical Application

Furthermore, in terms of practical applications, this paper has discussed the need for continuous innovation and development of new systems to keep miners safe and has suggested that top management's engagement can help identify key points that may become bottlenecks in any increase in output. The main focus of future work should be improving the protection of machine components and proximity detection systems to avoid collisions between machines and miners or between machines.

6.3. Current and Future Trends

As the world moves towards Industry 4.0 and 5.0 with greater emphasis on human–machine interaction, it is important to continue to develop and innovate new systems, methods, and solutions for anti-collision systems in underground mining. The concept of autonomous mining is spreading quickly in various applications, and future research should examine how anti-collision systems can be integrated with autonomous mining to minimize accidents and achieve a digital mine that prioritizes safety.

Author Contributions

K.B. and Y.T.: proposed the idea and reviewed the first draft. M.I.: wrote the first version of the manuscript and performed the editing. E.M.R., Y.A., I.B. and E.h.A.: final review. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Center for Scientific and Technical Research of Morocco (CNRST), Moroccan Foundation for Advanced Science, Innovation and Research and Reminex (Research Development, Engineering and Project Delivery arm), Managem Group.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The “Smart Connected Mine” project, which has received support from the Moroccan Ministry of Higher Education, Scientific Research and Innovation, the Digital Development Agency (DDA), and the National Center for Scientific and Technical Research of Morocco (CNRST) through the Al-Khawarizmi program, is the context for this research. The work presented in this article is part of a collaborative effort among various partners, including MASCIR (Moroccan Foundation for Advanced Science, Innovation and Research), REMINEX Engineering, R&D and Project Management subsidiary of MANAGEM group, ENSIAS (National School of Computer Science and Systems Analysis), UCA (Cadi Ayyad University), and ENSMR (National School of Mines of Rabat). We express our gratitude to MANAGEM Group and its subsidiary CMG for granting us permission to conduct on-site research and data collection as an industrial partner in this project.

Conflicts of Interest

The authors have no conflicts of interest to declare. The funders did not play a role in designing the study, collecting, analyzing, or interpreting data, writing the manuscript, or making the decision to publish the findings.

Abbreviations

The following abbreviations are used in this manuscript:
MSHA: Mine Safety and Health Administration
CPS: Current Population Survey
LHD: Load–Haul–Dump
PRISMA: Preferred Reporting Items for Systematic Reviews and Meta-Analyses
GPS: Global Positioning System
UWB: Ultra-Wideband Radar
RGB: Red Green Blue
2D: Two-dimensional
3D: Three-dimensional
LIDAR: Light Detection and Ranging
RFID: Radio Frequency Identification
RADAR: Radio Detection and Ranging
IR: Infrared
NIR: Near Infrared
SWIR: Short Wavelength Infrared
MWIR: Medium Wavelength Infrared
LWIR: Long Wavelength Infrared
FIR: Far Infrared
UGV: Unmanned Ground Vehicle
UAV: Unmanned Aerial Vehicle
YOLO: You Only Look Once
HOG: Histogram of Oriented Gradients
SVM: Support Vector Machine
AP: Average Precision
MAP: Mean Average Precision
fps: Frames per second
CAS: Collision-avoidance systems
EMESRT: Earth Moving Equipment Safety Round Table

References

  1. Kusi-Sarpong, S.; Bai, C.; Sarkis, J.; Wang, X. Green supply chain practices evaluation in the mining industry using a joint rough sets and fuzzy TOPSIS methodology. Resour. Policy 2015, 46, 86–100. [Google Scholar] [CrossRef]
  2. Holmberg, K.; Kivikytö-Reponen, P.; Härkisaari, P.; Valtonen, K.; Erdemir, A. Global energy consumption due to friction and wear in the mining industry. Tribol. Int. 2017, 115, 116–139. [Google Scholar] [CrossRef]
  3. Ranängen, H.; Lindman, Å. A path towards sustainability for the Nordic mining industry. J. Clean. Prod. 2017, 151, 43–52. [Google Scholar] [CrossRef]
  4. Sanmiquel, L.; Rossell, J.M.; Vintró, C. Study of Spanish mining accidents using data mining techniques. Saf. Sci. 2015, 75, 49–55. [Google Scholar] [CrossRef]
  5. Permana, H. Risk assessment as a strategy to prevent mine accidents in Indonesian mining. Rev. Min. 2010, 4, 43–50. [Google Scholar]
  6. Kizil, M. Virtual Reality Applications in the Australian Minerals Industry; South African Institute of Mining and Metallurgy: Johannesburg, South Africa, 2003; pp. 569–574. [Google Scholar]
  7. Kazan, E.; Usmen, M.A. Worker safety and injury severity analysis of earthmoving equipment accidents. J. Saf. Res. 2018, 65, 73–81. [Google Scholar] [CrossRef] [PubMed]
  8. Squelch, A.P. Virtual reality for mine safety training in South Africa. J. S. Afr. Inst. Min. Metall. 2001, 101, 209–216. [Google Scholar]
  9. Kia, K.; Fitch, S.M.; Newsom, S.A.; Kim, J.H. Effect of whole-body vibration exposures on physiological stresses: Mining heavy equipment applications. Appl. Ergon. 2020, 85, 103065. [Google Scholar] [CrossRef]
  10. Domingues, M.S.Q.; Baptista, A.L.F.; Diogo, M.T. Engineering complex systems applied to risk management in the mining industry. Int. J. Min. Sci. Technol. 2017, 27, 611–616. [Google Scholar] [CrossRef]
  11. Tichon, J.; Burgess-Limerick, R. A review of virtual reality as a medium for safety related training in mining. J. Health Saf. Res. Pract. 2011, 3, 33–40. [Google Scholar]
  12. Lööw, J.; Abrahamsson, L.; Johansson, J. Mining 4.0—the Impact of New Technology from a Work Place Perspective. Mining Metall. Explor. 2019, 36, 701–707. [Google Scholar] [CrossRef]
  13. Sishi, M.; Telukdarie, A. Implementation of Industry 4.0 technologies in the mining industry—A case study. Int. J. Min. Miner. Eng. 2020, 11, 1–22. [Google Scholar] [CrossRef]
  14. Sánchez, F.; Hartlieb, P. Innovation in the Mining Industry: Technological Trends and a Case Study of the Challenges of Disruptive Innovation. Mining Met. Explor. 2020, 37, 1385–1399. [Google Scholar] [CrossRef]
  15. Dong, L.; Sun, D.; Han, G.; Li, X.; Hu, Q.; Shu, L. Velocity-free localization of autonomous driverless vehicles in underground intelligent mines. IEEE Trans. Veh. Technol. 2020, 69, 9292–9303. [Google Scholar] [CrossRef]
  16. Baïna, K. Society/Industry 5.0—Paradigm Shift Accelerated by COVID-19 Pandemic, beyond Technological Economy and Society. In Proceedings of the BML’21: 2nd International Conference on Big Data, Modelling and Machine Learning, Kenitra, Morocco, 15–16 July 2021. [Google Scholar]
  17. Ali, D.; Frimpong, S. Artificial intelligence, machine learning and process automation: Existing knowledge frontier and way forward for mining sector. Artif. Intell. Rev. 2020, 53, 6025–6042. [Google Scholar] [CrossRef]
  18. Zeshan, H.; Siau, K.; Nah, F. Artificial Intelligence, Machine Learning, and Autonomous Technologies in Mining Industry. J. Database Manag. 2019, 30, 67–79. [Google Scholar] [CrossRef]
  19. Shahmoradi, J.; Talebi, E.; Roghanchi, P.; Hassanalian, M. A Comprehensive Review of Applications of Drone Technology in the Mining Industry. Drones 2020, 4, 34. [Google Scholar] [CrossRef]
  20. Patrucco, M.; Pira, E.; Pentimalli, S.; Nebbia, R.; Sorlini, A. Anti-Collision Systems in Tunneling to Improve Effectiveness and Safety in a System-Quality Approach: A Review of the State of the Art. Infrastructures 2021, 6, 42. [Google Scholar] [CrossRef]
  21. Wang, L.; Li, W.; Zhang, Y.; Wei, C. Pedestrian Detection Based on YOLOv2 with Skip Structure in Underground Coal Mine; IEEE: Piscataway, NJ, USA, 2017. [Google Scholar]
  22. Szrek, J.; Zimroz, R.; Wodecki, J.; Michalak, A.; Góralczyk, M.; Worsa-Kozak, M. Application of the Infrared Thermography and Unmanned Ground Vehicle for Rescue Action Support in Underground Mine—The AMICOS Project. Remote Sens. 2021, 13, 69. [Google Scholar] [CrossRef]
  23. Mansouri, S.S.; Karvelis, P.; Kanellakis, C.; Kominiak, D.; Nikolakopoulos, G. Vision-Based MAV Navigation in Underground Mine Using Convolutional Neural Network; IEEE: Piscataway, NJ, USA, 2019. [Google Scholar]
  24. Papachristos, C.; Khattak, S.; Mascarich, F. Autonomous Navigation and Mapping in Underground Mines Using Aerial Robots; IEEE: Piscataway, NJ, USA, 2019. [Google Scholar]
  25. Kim, H.; Choi, Y. Autonomous Driving Robot That Drives and Returns along a Planned Route in Underground Mines by Recognizing Road Signs. Appl. Sci. 2021, 11, 10235. [Google Scholar] [CrossRef]
  26. Kim, H.; Choi, Y. Comparison of Three Location Estimation Methods of an Autonomous Driving Robot for Underground Mines. Appl. Sci. 2020, 10, 4831. [Google Scholar] [CrossRef]
  27. Backman, S.; Lindmark, D.; Bodin, K.; Servin, M.; Mörk, J.; Löfgren, L. Continuous Control of an Underground Loader Using Deep Reinforcement Learning. Machines 2021, 9, 216. [Google Scholar] [CrossRef]
  28. Losch, R.; Grehl, S.; Donner, M.; Buhl, C.; Jung, B. Design of an Autonomous Robot for Mapping, Navigation, and Manipulation in Underground Mines. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018. [Google Scholar]
  29. Li, X.; Wang, S.; Liu, B.; Chen, W.; Fan, W.; Tian, Z. Improved YOLOv4 network using infrared images for personnel detection in coal mines. J. Electron. Imaging 2022, 31, 013017. [Google Scholar] [CrossRef]
  30. Wang, W.; Wang, S.; Guo, Y.; Zhao, Y. Obstacle detection method of unmanned electric locomotive in coal mine based on YOLOv3-4L. J. Electron. Imaging 2022, 31, 023032. [Google Scholar] [CrossRef]
  31. Fengbo, W.; Liu, W.; Wang, S.; Zhang, G. Improved Mine Pedestrian Detection Algorithm Based on YOLOv4-Tiny. In Proceedings of the Third International Symposium on Computer Engineering and Intelligent Communications (ISCEIC 2022), Xi’an, China, 16–18 September 2023. [Google Scholar] [CrossRef]
  32. Biao, L.; Tian, B.; Qiao, J. Mine Track Obstacle Detection Method Based on Information Fusion. J. Phys. Conf. Ser. 2022, 2229, 012023. [Google Scholar] [CrossRef]
  33. Zhang, H. Head-mounted display-based intuitive virtual reality training system for the mining industry. Int. J. Min. Sci. Technol. 2017, 27, 717–722. [Google Scholar] [CrossRef]
  34. Guo, H.; Yu, Y.; Skitmore, M. Visualization technology-based construction safety management: A review. Autom. Constr. 2017, 73, 135–144. [Google Scholar] [CrossRef]
  35. Melendez Guevara, J.A.; Torres Guerrero, F. Evaluation the role of virtual reality as a safety training tool in the context NOM-026- STPS-2008 BT. In Smart Technology. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering; Torres Guerrero, F., Lozoya-Santos, J., Gonzalez Mendivil, E., Neira-Tovar, L., Ramírez Flores, P.G., Martin-Gutierrez, J., Eds.; Springer International Publishing: Cham, Switzerland, 2018; Volume 213, pp. 3–9. [Google Scholar]
  36. Mallett, L.; Unger, R. Virtual reality in mine training. Mining Metall. Explor. 2007, 2, 1–4. [Google Scholar]
  37. Laciok, V.; Bernatik, A.; Lesnak, M. Experimental implementation of new technology into the area of teaching occupational safety for Industry 4.0. Int. J. Saf. Secur. Eng. 2020, 10, 403–407. [Google Scholar] [CrossRef]
  38. Qassimi, S.; Abdelwahed, E.H. Disruptive Innovation in Mining Industry 4.0. In Distributed Sensing and Intelligent Systems. Studies in Distributed Intelligence; Elhoseny, M., Yuan, X., Krit, S., Eds.; Springer: Cham, Switzerland, 2022. [Google Scholar] [CrossRef]
  39. Groves, W.A.; Kecojevic, V.J.; Komljenovic, D. Analysis of fatalities and injuries involving mining equipment. J. Saf. Res. 2007, 38, 461–470. [Google Scholar] [CrossRef]
  40. Boniface, R.; Museru, L.; Munthali, V.; Lett, R. Occupational injuries and fatalities in a tanzanite mine: Need to improve workers safety in Tanzania. Pan Afr. Med. J. 2013, 16, 120. [Google Scholar] [CrossRef] [PubMed]
  41. Amponsah-Tawiah, K.; Jain, A.; Leka, S.; Hollis, D.; Cox, T. Examining psychosocial and physical hazards in the Ghanaian mining industry and their implications for employees’ safety experience. J. Saf. Res. 2013, 45, 75–84. [Google Scholar] [CrossRef] [PubMed]
  42. Kumar, R.; Ghosh, A.K. The accident analysis of mobile mine machinery in Indian opencast coal mines. Int. J. Inj. Contr. Saf. Promot. 2014, 21, 54–60. [Google Scholar] [CrossRef]
  43. Mine Safety and Health Administration (MSHA). Available online: https://www.msha.gov/ (accessed on 8 March 2023).
  44. Ruff, T.; Coleman, P.; Martini, L. Machine-related injuries in the US mining industry and priorities for safety research. Int. J. Inj. Control. Saf. Promot. 2010, 18, 11–20. [Google Scholar] [CrossRef]
  45. Kecojevic, V.; Komljenovic, D.; Groves, W.; Radomsky, M. An analysis of equipment-related fatal accidents in U.S. mining operations: 1995–2005. Saf. Sci. 2007, 45, 864–874. [Google Scholar] [CrossRef]
  46. Mitchell, R.J.; Driscoll, T.R.; Harrison, J.E. Traumatic work-related fatalities involving mining in Australia. Saf. Sci. 1998, 29, 107–123. [Google Scholar] [CrossRef]
  47. Dhillon, B.S. Mining equipment safety: A review, analysis methods and improvement strategies. Int. J. Mining Reclam. Environ. 2009, 23, 168–179. [Google Scholar] [CrossRef]
  48. Ural, S.; Demirkol, S. Evaluation of occupational safety and health in surface mines. Saf. Sci. 2008, 46, 1016–1024. [Google Scholar] [CrossRef]
  49. Stemn, E. Analysis of Injuries in the Ghanaian Mining Industry and Priority Areas for Research. Saf. Health Work. 2019, 10, 151–165. [Google Scholar] [CrossRef]
  50. Bonsu, J.; Van Dyk, W.; Franzidis, J.P.; Petersen, F.; Isafiade, A. A systemic study of mining accident causality: An analysis of 91 mining accidents from a platinum mine in South Africa. J. S. Afr. Inst. Min. Metall. 2017, 117, 59–66. [Google Scholar] [CrossRef]
  51. Kecojevic, V.; Radomsky, M. The causes and control of loader- and truck-related fatalities in surface mining operations. Int. J. Inj. Contr. Saf. Promot. 2004, 11, 239–251. [Google Scholar] [CrossRef] [PubMed]
  52. Md-Nor, Z.; Kecojevic, V.; Komljenovic, D.; Groves, W. Risk assessment for loader- and dozer-related fatal incidents in US mining. Int. J. Inj. Contr. Saf. Promot. 2008, 15, 65–75. [Google Scholar] [CrossRef] [PubMed]
  53. Dindarloo, S.R.; Pollard, J.P.; Siami-Irdemoos, E. Off-road truck-related accidents in US mines. J. Saf. Res. 2016, 58, 79–87. [Google Scholar] [CrossRef] [PubMed]
  54. Burgess-Limerick, R. Injuries associated with underground coal mining equipment in Australia. Ergon. Open J. 2011, 4, 62–73. [Google Scholar] [CrossRef]
  55. Nasarwanji, M.F.; Pollard, J.; Porter, W. An analysis of injuries to front-end loader operators during ingress and egress. Int. J. Ind. Ergon. 2017, 65, 84–92. [Google Scholar] [CrossRef]
  56. Duarte, J.; Torres Marques, A.; Santos Baptista, J. Occupational Accidents Related to Heavy Machinery: A Systematic Review. Safety 2021, 7, 21. [Google Scholar] [CrossRef]
  57. Dash, A.K.; Bhattcharjee, R.M.; Paul, P.S.; Tikader, M. Study and analysis of accidents due to wheeled trackless transportation machinery in Indian coal mines: Identification of gap in current investigation system. Procedia Earth Planet. Sci. 2015, 11, 539–547. [Google Scholar] [CrossRef]
  58. Zhang, M.; Kecojevic, V.; Komljenovic, D. Investigation of haul truck-related fatal accidents in surface mining using fault tree analysis. Saf. Sci. 2014, 65, 106–117. [Google Scholar] [CrossRef]
  59. Moher, D.; Liberati, A.; Tetzlaff, J.; Altman, D.G.; Antes, G.; Atkins, D.; Barbour, V.; Barrowman, N.; Berlin, J.; et al. Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Med. 2009, 6. [Google Scholar] [CrossRef]
  60. Shamseer, L.; Moher, D.; Clarke, M.; Ghersi, D.; Liberati, A.; Petticrew, M.; Shekelle, P.; Stewart, L.A.; Altman, D.G.; Booth, A.; et al. Preferred reporting items for systematic review and meta-analysis protocols (PRISMA-P) 2015: Elaboration and explanation. BMJ 2015, 349, 1–25. [Google Scholar] [CrossRef]
  61. Wohlin, C. Guidelines for snowballing in systematic literature studies and a replication in software engineering. In Proceedings of the 18th International Conference on Evaluation and Assessment in Software Engineering—EASE’14, London, UK, 12–14 May 2014; ACM: New York, NY, USA, 2014; pp. 1–10. [Google Scholar]
  62. Dickens, J.S.; van Wyk, M.A.; Green, J.J. Pedestrian Detection for Underground Mine Vehicles Using Thermal Images; IEEE: Piscataway, NJ, USA, 2011. [Google Scholar]
  63. Dickens, J.; Green, J.; Teleka, R.; Sabatta, D. A Global Survey of Systems and Technologies Suitable for Vehicle to Person Collision Avoidance in Underground Rail-Bound Operations; Mine Health and Safety Council: Woodmead, South Africa, 2014. [Google Scholar]
  64. BlackFlyS USB 3.0, Flir. Available online: https://www.flir.com/products/blackfly-s-usb3/ (accessed on 8 March 2023).
  65. Slim Led Light. Available online: https://www.osculati.com/en/ (accessed on 8 March 2023).
  66. Tau 2, Flir. Available online: https://www.flir.fr/products/tau-2/ (accessed on 8 March 2023).
  67. Thermal Range. Available online: https://www.ametek-land.fr/pressreleases/blog/2021/june/thermalinfraredrangeblog (accessed on 8 March 2023).
  68. Dickens, J. Thermal Imagery to Prevent Pedestrian-Vehicle Collisions in Mines; CSIR Centre for Mining Innovation. Automation & Controls, Vector. 2013. Available online: https://www.academia.edu/96375466/Thermal_imagery_to_prevent_pedestrian_vehicle_collisions_in_mines (accessed on 24 April 2023).
  69. Zed 2i, Stereolabs. Available online: https://www.stereolabs.com/zed-2i/ (accessed on 8 March 2023).
  70. Lidar Sensor 2d, SICK LMS 111. Available online: https://www.sick.com (accessed on 8 March 2023).
  71. Lidar Sensor 3d, Velodyne Puck VLP-16. Available online: https://velodynelidar.com (accessed on 8 March 2023).
  72. Ren, H.; Zhao, Y.; Xiao, W.; Hu, Z. A review of UAV monitoring in mining areas: Current status and future perspectives. Int. J. Coal Sci. Technol. 2019, 6, 320–333. [Google Scholar] [CrossRef]
  73. Jones, E.; Sofonia, J.; Canales, C.; Hrabar, S.; Kendoul, F. Advances and applications for automated drones in underground mining operations. In Proceedings of the Deep Mining 2019: Ninth International Conference on Deep and High Stress Mining, The Southern African Institute of Mining and Metallurgy, Johannesburg, South Africa, 24–25 June 2019; pp. 323–334. [Google Scholar] [CrossRef]
  74. Shahmoradi, J.; Mirzaeinia, A.; Roghanchi, P.; Hassanalian, M. Monitoring of Inaccessible Areas in GPS-Denied Underground Mines using a Fully Autonomous Encased Safety Inspection Drone. In Proceedings of the AIAA Scitech 2020 Forum, Orlando, FL, USA, 6–10 January 2020. [Google Scholar] [CrossRef]
  75. Freire, G.R.; Cota, R.F. Capture of images in inaccessible areas in an underground mine using an unmanned aerial vehicle. In Proceedings of the UMT 2017: Proceedings of the First International Conference on Underground Mining Technology, Australian Centre for Geomechanics, Perth, Australia, 11–13 October 2017. [CrossRef]
  76. Zhai, G.; Zhang, W.; Hu, W.; Ji, A.Z. Coal Mine Rescue Robots Based on Binocular Vision: A Review of the State of the Art. IEEE Access 2020, 8, 130561–130575. [Google Scholar] [CrossRef]
  77. Green, J.J.; Coetzee, S. Novel Sensors for Underground Robotics; IEEE: Piscataway, NJ, USA, 2012. [Google Scholar] [CrossRef]
  78. Szrek, J.; Wodecki, J.; Blazej, R.; Zimroz, R. An Inspection Robot for Belt Conveyor Maintenance in Underground Mine—Infrared Thermography for Overheated Idlers Detection. Appl. Sci. 2020, 10, 4984. [Google Scholar] [CrossRef]
  79. Wei, X.; Zhang, H.; Liu, S.; Lu, Y. Pedestrian Detection in Underground Mines via Parallel Feature Transfer Network; Elsevier Ltd.: Amsterdam, The Netherlands, 2020. [Google Scholar] [CrossRef]
  80. Liu, W.; Anguelov, D.; Erhan, D.; Szegedy, C.; Reed, S.; Fu, C.-Y.; Berg, A.C. SSD: Single Shot MultiBox Detector. In Proceedings of the Computer Vision–ECCV 2016: 14th European Conference, Amsterdam, The Netherlands, 11–14 October 2016; pp. 21–37. [Google Scholar] [CrossRef]
  81. Redmon, J.; Farhadi, A. YOLO9000: Better, Faster, Stronger. arXiv 2016, arXiv:1612.08242. [Google Scholar]
  82. Redmon, J.; Farhadi, A. YOLOv3: An Incremental Improvement. arXiv 2018, arXiv:1804.02767. [Google Scholar]
  83. Bochkovskiy, A.; Wang, C.-Y.; Liao, H.-Y.M. YOLOv4: Optimal Speed and Accuracy of Object Detection. arXiv 2020, arXiv:2004.10934. [Google Scholar]
  84. Girshick, R.; Donahue, J.; Darrell, T.; Malik, J. Rich Feature Hierarchies for Accurate Object Detection and Semantic Segmentation. arXiv 2014, arXiv:1311.2524. [Google Scholar]
  85. He, K.; Gkioxari, G.; Dollár, P.; Girshick, R. Mask R-CNN. arXiv 2018, arXiv:1703.06870. [Google Scholar]
  86. Ren, S.; He, K.; Girshick, R.; Sun, J. Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks. arXiv 2016, arXiv:1506.01497. [Google Scholar]
  87. Xue, G.; Li, R.; Liu, S.; Wei, J. Research on Underground Coal Mine Map Construction Method Based on LeGO-LOAM Improved Algorithm. Energies 2022, 15, 6256. [Google Scholar] [CrossRef]
  88. Load Haul Dump, Epiroc. Available online: https://www.epiroc.com/en-za/products/loaders-and-trucks (accessed on 8 March 2023).
  89. Load Haul Dump, Caterpillar. Available online: https://www.cat.com/en_US/products/new/equipment/underground-hard-rock/underground-mining-load-haul-dump-lhd-loaders.html (accessed on 8 March 2023).
  90. Zeng, F.; Jacobson, A.; Smith, D.; Boswell, N.; Peynot, T.; Milford, M. Enhancing Underground Visual Place Recognition with Shannon Entropy Saliency. In Proceedings of the Australasian Conference on Robotics and Automation (ACRA 2017), Sydney, Australia, 11–13 December 2017. [Google Scholar]
  91. Kazi, N.; Parasar, D. Human Identification Using Thermal Sensing Inside Mines; IEEE Xplore Part Number: CFP21K74-ART; IEEE: Piscataway, NJ, USA, 2021; ISBN 978-0-7381-1327-2. [Google Scholar]
  92. Pengfei, X.; Zhou, Z.; Geng, Z. Safety Monitoring Method of Moving Target in Underground Coal Mine Based on Computer Vision Processing. Sci. Rep. 2022, 12, 17899. [Google Scholar] [CrossRef]
  93. Imam, M.; Baïna, K.; Tabii, Y.; Benzakour, I.; Adlaoui, Y.; Ressami, E.M.; Abdelwahed, E.H. Anti-Collision System for Accident Prevention in Underground Mines using Computer Vision. In Proceedings of the 6th International Conference on Advances in Artificial Intelligence (ICAAI ’22), Birmingham, UK, 21–23 October 2023; Association for Computing Machinery: New York, NY, USA, 2023; pp. 94–101. [Google Scholar] [CrossRef]
  94. Yuanjian, J.; Peng, P.; Wang, L.; Wang, J.; Wu, J.; Liu, Y. LIDAR-Based Local Path Planning Method for Reactive Navigation in Underground Mines. Remote Sens. 2023, 15, 309. [Google Scholar] [CrossRef]
  95. Zhou, X.; Chen, K.; Zhou, Q. Human Tracking by Employing the Scene Information in Underground Coal Mines; IEEE: Piscataway, NJ, USA, 2017. [Google Scholar] [CrossRef]
  96. Lu, Y.; Huang, Q.X. Object Detection Algorithm in Underground Mine Based on Sparse Representation. Int. J. Adv. Technol. 2018, 9, 210. [Google Scholar] [CrossRef]
  97. Limei, C.; Jiansheng, Q. A Method for Detecting Miners in Underground Coal Mine Videos; IEEE: Piscataway, NJ, USA, 2009. [Google Scholar] [CrossRef]
  98. Bai, Z.; Li, Y.; Chen, X.; Yi, T.; Wei, W.; Wozniak, M.; Damasevicius, R. Real-Time Video Stitching for Mine Surveillance Using a Hybrid Image Registration Method. Electronics 2020, 9, 1336. [Google Scholar] [CrossRef]
  99. Qu, S.; Bao, Z.; Yin, Y.; Yang, X. MineBL: A Battery-Free Localization Scheme with Binocular Camera for Coal Mine. Sensors 2022, 22, 6511. [Google Scholar] [CrossRef] [PubMed]
  100. Dickens, J.S.; Green, J.J.; van Wyk, M.A. Human Detection For Underground Autonomous Mine Vehicles Using Thermal Imaging. In Proceedings of the 26th International Conference on CAD/CAM, Robotics & Factories of the Future, Kuala Lumpur, Malaysia, 26–28 July 2011. [Google Scholar]
  101. Dickens, J.S.; Green, J.J. Segmentation techniques for extracting humans from thermal images. In Proceedings of the 4th Robotics & Mechatronics Conference of South Africa, Pretoria, South Africa, 23–25 November 2011. [Google Scholar]
  102. Li, G.; Yang, Y.; Qu, X. Deep Learning Approaches on Pedestrian Detection in Hazy Weather. IEEE Trans. Ind. Electron. 2019, 67, 8889–8899. [Google Scholar] [CrossRef]
  103. Tumas, P.; Nowosielski, A.; Serackis, A. Pedestrian detection in severe weather conditions. IEEE Access 2020, 8, 62775–62784. [Google Scholar] [CrossRef]
  104. Xu, Z.; Zhuang, J.; Liu, Q.; Zhou, J.; Peng, S. Benchmarking a large-scale FIR dataset for on-road pedestrian detection. Infrared Phys. Technol. 2019, 96, 199–208. [Google Scholar] [CrossRef]
  105. Tumas, P.; Serackis, A.; Nowosielski, A. Augmentation of Severe Weather Impact to Far-Infrared Sensor Images to Improve Pedestrian Detection System. Electronics 2021, 10, 934. [Google Scholar] [CrossRef]
  106. Li, C.; Song, D.; Tong, R.; Tang, M. Illumination-aware faster R-CNN for robust multi-spectral pedestrian detection. Pattern Recognition 2019, 85, 161–171. [Google Scholar] [CrossRef]
  107. Pawlowski, P.; Dabrowski, A.; Piniarski, K. Pedestrian detection in low resolution night vision images. In 2015 Signal Processing: Algorithms, Architectures, Arrangements, and Applications (SPA); IEEE: Piscataway, NJ, USA, 2015; p. 7365157. [Google Scholar] [CrossRef]
  108. Kim, D.; Lee, K. Segment-based region of interest generation for pedestrian detection in far-infrared images. Infrared Phys. Technol. 2013, 61, 120–128. [Google Scholar] [CrossRef]
  109. Xu, F.; Liu, X.; Fujimura, K. Pedestrian detection and tracking with night vision. IEEE Trans. Intell. Transp. Syst. 2005, 6, 63–71. [Google Scholar] [CrossRef]
  110. Bertozzi, M.; Broggi, A.; Del Rose, M.; Felisa, M.; Rakotomamonjy, A.; Suard, F. A pedestrian detector using histograms of oriented gradients and a support vector machine classifier. In Proceedings of the 2007 IEEE Intelligent Transportation Systems Conference, Seattle, WA, USA, 30 September–3 October 2007; pp. 143–148. [Google Scholar]
  111. Bajracharya, M.; Moghaddam, B.; Howard, A.; Matthies, L.H. Detecting personnel around UGVs using stereo vision. In Proceedings of the Unmanned Systems Technology X. International Society for Optics and Photonics, Orlando, FL, USA, 16 April 2008; Volume 6962, p. 696202. [Google Scholar]
  112. Bajracharya, M.; Moghaddam, B.; Howard, A.; Brennan, S.; Matthies, L.H. A fast stereo-based system for detecting and tracking pedestrians from a moving vehicle. Int. J. Robot. Res. 2009, 28, 1466–1485. [Google Scholar] [CrossRef]
  113. Heo, D.; Lee, E.; Ko, B.C. Pedestrian detection at night using deep neural networks and saliency maps. Electron. Imaging 2018, 2018, 060403-1. [Google Scholar]
  114. Chen, Y.; Shin, H. Pedestrian detection at night in infrared images using an attention-guided encoder-decoder convolutional neural network. Appl. Sci. 2020, 10, 809. [Google Scholar] [CrossRef]
  115. Redmon, J.; Farhadi, A. YOLO9000: Better, faster, stronger. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 7263–7271. [Google Scholar]
  116. Available online: https://www.flir.eu/oem/adas/adas-dataset-form/ (accessed on 8 March 2023).
  117. Rivera Velazquez, J.M.; Khoudour, L.; Saint Pierre, G.; Duthon, P.; Liandrat, S.; Bernardin, F.; Fiss, S.; Ivanov, I.; Peleg, R. Analysis of thermal Imaging Performance under Extreme Foggy Conditions: Applications to Autonomous Driving. J. Imaging 2022, 8, 306. [Google Scholar] [CrossRef]
  118. Youssfi Alaoui, A.; Tabii, Y.; Oulad Haj Thami, R.; Daoudi, M.; Berretti, S.; Pala, P. Fall Detection of Elderly People Using the Manifold of Positive Semidefinite Matrices. J. Imaging 2021, 7, 109. [Google Scholar] [CrossRef]
  119. EMESRT Vehicle Interaction. Available online: https://emesrt.org/vehicle-interaction/ (accessed on 8 March 2023).
  120. The EMERST 9-Layer Control Effectiveness Model. Available online: https://emesrt.org/stream/emesrt-journey-steps-v2/2-should-we-change/the-emesrt-nine-layer-control-effectiveness-model/ (accessed on 8 March 2023).
  121. Qian, M.; Zhao, K.; Li, B.; Gong, H.; Seneviratne, A. Survey of Collision Avoidance Systems for Underground Mines: Sensing Protocols. Sensors 2022, 22, 7400. [Google Scholar] [CrossRef]
  122. Singh, M.; Kumar, M.; Malhotra, J. Energy efficient cognitive body area network (CBAN) using lookup table and energy harvesting. J. Intell. Fuzzy Syst. 2018, 35, 1253–1265. [Google Scholar] [CrossRef]
  123. Zhang, F.; Tian, J.; Wang, J.; Liu, G.; Liu, Y. ECViST: Mine Intelligent Monitoring Based on Edge Computing and Vision Swin Transformer-YOLOv5. Energies 2022, 15, 9015. [Google Scholar] [CrossRef]
  124. Bing, Z.; Wang, X.; Dong, Z.; Dong, L.; He, T. A novel edge computing architecture for intelligent coal mining system. Wireless Netw. 2022, 1–10. [Google Scholar] [CrossRef]
  125. Sakshi, G.; Snigdh, I. Multi-Sensor Fusion in Autonomous Heavy Vehicles. In Autonomous and Connected Heavy Vehicle Technology; Elsevier: Amsterdam, The Netherlands, 2022; pp. 375–389. [Google Scholar] [CrossRef]
  126. Russell, B.; Dellaert, F.; Camurri, M.; Fallon, M. Learning Inertial Odometry for Dynamic Legged Robot State Estimation. In Proceedings of the 5th Conference on Robot Learning, London, UK, 8–11 January 2022. [Google Scholar]
  127. Fang, J.; Fan, C.; Wang, F.; Bai, D. Augmented Reality Platform for the Unmanned Mining Process in Underground Mines. Mining Metall. Explor. 2022, 39, 385–395. [Google Scholar] [CrossRef]
Figure 1. Disruptive mining technologies.
Figure 2. Underground mine description.
Figure 3. Accidents clustering.
Figure 4. Types of equipment.
Figure 5. Causes of accidents.
Figure 6. PRISMA flow diagram.
Figure 7. Applications of computer vision in underground mining operations: (a) navigation mapping [24]; (b) pedestrian detection [62]; (c) rescue operation [22].
Figure 8. The use of RGB camera with Spotlight LED. (a) RGB camera [64]; (b) spotlight LED [65].
Figure 10. Thermal range [67].
Figure 11. Stereoscopic camera [69].
Figure 12. LIDAR sensors: (a) 2D LIDAR sensor [70]; (b) 3D LIDAR sensor [71].
Figure 13. Drone [24].
Figure 14. Unmanned ground vehicle [26].
Figure 15. Load–haul–dump [90]: (a) load–haul–dump (LHD); (b) view from fixed camera on LHD.
Figure 16. View from a surveillance camera [95].
Figure 17. The EMESRT nine layer control effectiveness model reframing our understanding of vehicle interaction controls [120].
Figure 18. Detection zones around mobile machines during operations.
Table 1. Rate of fatal accidents caused by mobile and haulage equipment in some countries.
Country | Period | Fatal Accidents (%) | Causes
USA | 1995–2005 | 31.6% | Haulage equipment
Australia | 1982–1984 | 29% | Vehicles
United Kingdom | 1983–1993 | 41% | Haulage equipment
Turkey | 2004 | 16% | Haulage equipment
Ghana | 2008–2017 | 48.3% | Mobile equipment
Table 2. Types of accidents by type of machinery.
Type of Accident | Dumper | Load–Haul–Dump (LHD)
Collision with pedestrian | X |
Collision with machine | X | X
(Front/reversal) run over | X |
Fall from machine | | X
Rollover | |
Caught between | | X
Country | India | Australia
Table 3. Frequency of dumper accidents, according to its mode of action.
Type of Accident | Frequency of Accidents
Reversal run over | 51
Front run over | 68
Lost control | 30
Collision | 51
Other | 46
Table 4. Frequency of load–haul–dump (LHD) accidents, according to its mode of action.
Type of Accident | Frequency of Accidents
Caught between | 45
Collision with machine | 87
Fall from machine | 27
Table 5. Metadata of research conducted.
Date | In the initial phase, only papers published between 2000 and 2022 were taken into consideration. The year 2000 was selected after a sensitivity analysis of the quantity of publications identified using the designated keywords.
Paper type | We limited our consideration to research papers.
Language | Our evaluation was restricted exclusively to papers written in English.
Table 7. The most commonly used algorithms in underground mines with their purposes.
Algorithm | Data Type | Purpose
YOLO (v2, v3, v4, and v5) [21,22,29,30,31,32] | RGB images [21,22,30,31,32], thermal images [22,29] | Pedestrian detection [21,22,29,30,31,32]; electric locomotives and falling stones [30]
HOG [22] | RGB images [22], thermal images [22] | Pedestrian detection [22]
SVM [90,100] | RGB images [90,100], thermal images [100] | Enhancing underground visual place recognition [90]; pedestrian segmentation [100]
Image segmentation and thresholding [78,100,101] | RGB images [78], thermal images [78,100,101] | Overhead boulders detection [78]; pedestrian detection [100,101]
Navigation and mapping [23,24,25,26,27,28] | RGB images [23,25,27], stereoscopic images [24,28], thermal images [24], images from LIDAR [23,24,25,26,27,28] | Anti-collision [23]; exploration path planning solutions [24]; road sign recognition [25]; location estimation method [26]; trajectory controller [27]
Table 8. The most used algorithms with the accuracy they have reached.
Reference | Algorithm | AP (%) | MAP (%) | fps
[21] | YOLOv2 | N/A | 54 | 43
[21] | YWSSv1 | N/A | 54.4 | 43
[21] | YWSS2 | N/A | 66.3 | 5
[29] | YOLOv4 | 95.47 | N/A | 44.3
[29] | SPAD | 91.63 | N/A | 35.9
[29] | Faster-RCNN | 88.2 | N/A | 25.1
[29] | SSD | 86.2 | N/A | 38.7
[29] | Improved YOLOv4 | 69.25 | N/A | 48.2
[30] | A: Darknet53 YOLOv3 | 93.1 | N/A | 31
[30] | B: Model A + a fourth feature scale | 96.1 | N/A | 26
[30] | C: Model B + Darknet-37 | 97.5 | N/A | 38
[30] | D: Model C + DIOU + Focal | 98.2 | N/A | 38
Table 12. Summary of industrial collision-avoidance systems based on computer vision.
Company | Product | Type of Solution | Environment | Technology | Comments | Link
DotNetix | SAFEYE | Object detection (recognition of signs, pedestrians, mobile machines) | Surface and underground | Stereoscopic cameras, display warning system, multicamera controller (mini PC) | dotNetix has created a 3D camera capable of determining the distance to pedestrians (up to 25 m) and machines (up to 50 m). | https://www.dotnetix.co.za/safeye (accessed on 8 March 2023)
DotNetix | EDGEYE | Void and berm detection | | | EDGEYE provides a berm and void detection system for industrial machines using a 3D camera and machine learning. | https://www.dotnetix.co.za/edgeye (accessed on 8 March 2023)
Blaxtair | Blaxtair | Prevents collisions between industrial machinery and pedestrians | Surface and underground | Stereoscopic camera | When a pedestrian is in danger, the Blaxtair delivers a visual and audio warning to the driver owing to its AI algorithms, which are supported by a distinctive and large learning database. | https://blaxtair.com/en/products/blaxtair-pedestrian-obstacle-detection-camera (accessed on 8 March 2023)
Blaxtair | Blaxtair Origin | Pedestrian detection | | Monoscopic camera | | https://blaxtair.com/en/products/blaxtair-origin-pedestrian-detection-camera (accessed on 8 March 2023)
Blaxtair | Blaxtair Omega | Prevents collisions between industrial machinery and pedestrians | | Robust stereoscopic camera (−40 °C to +75 °C) | The Omega 3D industrial camera is a robust tool that computes the disparity map and gives the user access to metadata. | https://blaxtair.com/en/products/omega-smart-3d-vision-for-robust-automation (accessed on 8 March 2023)
AXORA | Radar and video anti-collision system | Collision avoidance | Surface and underground | LIDAR | AXORA provides a LIDAR system that helps vehicles avoid collisions in mines using rebounding laser beams to alert operators about obstacles. Additionally, this solution offers a 3D environment map that can be utilized to enhance visibility and exert control over the stability of the ceiling. | https://www.axora.com/marketplace/radar-and-video-anti-collision-system/ (accessed on 8 March 2023)
Sick | Over-height and narrowness detection in hard rock shafts | Collision-avoidance system | Underground | 2D LIDAR | To avoid collisions with low roofs and in cramped locations, they rely on compact 2D LIDAR sensors to prevent accidents. | https://www.sick.com/cn/en/industries/mining/underground-mining/vehicles-for-mining/over-height-and-narrowness-detection-in-hardrock-shafts/c/p661478 (accessed on 8 March 2023)
Sick | Identification of mine vehicles in areas with poor visibility | | | LIDAR | The sensors detect when vehicles are dangerously close, even if dust concentrations are high. They also detect whether a vehicle is moving or stationary. They pass the information on to a traffic control system, which warns incoming traffic with a traffic light system if necessary. | https://www.sick.com/cn/en/industries/mining/underground-mining/vehicles-for-mining/identification-of-mine-vehicles-in-areas-with-poor-visibility/c/p674800 (accessed on 8 March 2023)
Torsa | Collision avoidance for shovels | Collision avoidance | Surface and underground | LIDAR 3D | This system is based on LIDAR 3D technology and analyzes the interaction between vehicles and the shovel itself with 0.01 cm of precision, guaranteeing safety in the loading operation by informing the operator of the machinery about the type, position, and distance of the different vehicles and obstacles around the shovel. | https://torsaglobal.com/en/solution/collision-avoidance-shovels/ (accessed on 8 March 2023)
Torsa | Collision-avoidance system for haul trucks, auxiliary, and light vehicles | | | | This CAS for trucks and auxiliary vehicles includes LIDAR 3D technology, which is able to analyse its environment with a very high level of precision and definition. The system is designed to protect the vehicle operator at all times, proactively and predictively assessing and alerting potential risk situations. | https://torsaglobal.com/en/solution/collision-avoidance-mining/ (accessed on 8 March 2023)
Torsa | Collision avoidance for drillers and rigs | | | | The system has been conceived to identify any object (mining equipment, rocks, personnel, etc.) that could cause a hazardous situation in the driller's operating area, and it has a powerful communications interface that allows monitoring its safety remotely while operating the rig remotely and autonomously. | https://torsaglobal.com/en/solution/collision-avoidance-drillers/ (accessed on 8 March 2023)
Torsa | Collision-avoidance system for underground mining operations | | | Time of Flight (TOF) | This CAS can function over the equipment (Level 9) in underground mining operations to prevent any run-overs or collisions. Scoops, auxiliary and light vehicles, and operational staff are its primary targets. | https://torsaglobal.com/en/solution/collision-avoidance-underground/ (accessed on 8 March 2023)
Wabtec | Collision-management system | Collision-avoidance system | Surface | Camera, RFID, GPS | Wabtec's Collision Awareness solution is a reporting system developed specifically for the mining industry. In order to reduce the risk of collisions between actors and components of the mine, it gives 360-degree situational awareness of things close to a heavy vehicle throughout stationary, slow-speed, and high-speed operations. | https://www.wabteccorp.com/mining/digital-mine/environmental-health-safety-ehs (accessed on 8 March 2023)
Waytronic Security | Collision avoidance | Collision-avoidance system | Manufacturing | Ultrasonic sensors, camera | Pedestrian and forklift collision avoidance | http://www.wt-safe.com/factorycoll_1.html?device=c&kyw=proximity%20detection%20system (accessed on 8 March 2023)
LSM Technologies | RadarEye | Collision-avoidance system | Mining | Radar, camera | Radar sensor with detection range 2–20 m and virtually 360-degree viewing | https://www.lsm.com.au/item.cfm?category_id=2869&site_id=3 (accessed on 8 March 2023)
Matrix Design Group | IntelliZone | Collision-avoidance system | Underground, coal mine | Magnetic field with optional LIDAR/radar/camera integration | Machine-specific straight-line and angled zones | https://www.matrixteam.com/wp-content/uploads/2018/08/IntelliZone-8_18.pdf (accessed on 8 March 2023)
Preco Electronics | PreView | Collision-avoidance system | Surface and underground mine | Radar, camera | Various product lines | https://preco.com/product-manuals/ (accessed on 8 March 2023)
Caterpillar | MineStar Detect | Collision-avoidance system | Surface and underground mine | Camera, radar, GNSS | | https://www.westrac.com.au/en/technology/minestar/minestar-detect (accessed on 8 March 2023)
GE Mining | CAS | Collision-avoidance system | Surface and underground mine | Surface: GPS tracking, RF unit, and camera; underground: VLF magnetic and WiFi | Real-time data, data communication network | https://www.ge.com/digital/sites/default/files/downloadassets/GE-Digital-Mine-Collision-Avoidance-System-datasheet.pdf (accessed on 8 March 2023)
Jannatec | SmartView | Collision-avoidance system | Mining | Multi-camera, WiFi, Bluetooth (for communication) | Text, voice, and video communication | https://www.jannatec.com/ensosmartview (accessed on 8 March 2023)
Schauenburg Systems | SCAS surface PDS | Collision-avoidance system | Surface mine | GPS, GSM, RFID, camera | Uses time of flight with an accuracy <1 m | http://schauenburg.co.za/product/scas-surface-proximity-detection-system/ (accessed on 8 March 2023)
Schauenburg Systems | SCAS underground PDS | | Underground mine | Cameras | Tagless, artificial intelligence | http://schauenburg.co.za/mimacs/ (accessed on 8 March 2023)
Joy Global, P&H | HawkEye camera system | Collision-avoidance system | Mining | Fisheye cameras with infrared filters | Digital video recorder (DVR), 100 to 200 h of video | https://mining.komatsu/en-au/technology/proximity-detection/hawkeye-camera-system (accessed on 8 March 2023)
Intec Video Systems | Car Vision | Collision-avoidance system | Industrial | Camera | Vehicle safety monitoring cameras | http://www.intecvideo.com/products.html (accessed on 8 March 2023)
 | PreView | | Industrial | Radar, camera | Low-power 5.8 GHz radar signal |
Ifm Efector | O3M 3D Smart Sensor | Collision-avoidance system | Outdoor | Optical technology | PMD-based 3D imaging information | http://eval.ifm-electronic.com/ifmza/web/mobile-3d-app-02-Kollisionsvorhersage.htm (accessed on 8 March 2023)
Motion Metrics | ShovelMetrics | Collision-avoidance system | Mining and construction | Radar, thermal imaging | Interface with a centralised data analysis platform | https://www.motionmetrics.com/shovel-metrics/ (accessed on 8 March 2023)
3D Laser Mapping | SiteMonitor | Collision-avoidance system | Mining | Laser sensor | Accuracy of 10 mm at ranges up to 6000 m | https://www.mining-technology.com/contractors/exploration/3d-laser-mapping/ (accessed on 8 March 2023)
Hitachi Mining | SkyAngle | Collision-avoidance system | Mining | Camera | Bird's-eye view | https://www.mining.com/web/hitachi-introduces-skyangle-advanced-peripheral-vision-support-system-at-minexpo-international/ (accessed on 8 March 2023)
Guardvant | ProxGuard CAS | Collision-avoidance system | Mining | Radar, camera, and GPS | Light vehicles and heavy equipment | https://www.mining-technology.com/contractors/health-and-safety/guardvant/pressreleases/pressguardvant-proxguard-collision-avoidance/ (accessed on 8 March 2023)
Safety Vision | Vision system | Collision-avoidance system | Diverse range of uses | Camera | | http://www.safetyvision.com/products (accessed on 8 March 2023)
ECCO | Vision system | Collision-avoidance system | Diverse range of uses | Camera | | https://www.eccoesg.com/us/en/products/camera-systems (accessed on 8 March 2023)
Flir | Vision system | Collision-avoidance system | Wide range of applications | Thermal camera | | https://www.flir.com.au/applications/camera-cores-components/ (accessed on 8 March 2023)
Nautitech | Vision system | Collision-avoidance system | Harsh environment | Thermal camera | Harsh environment | https://nautitech.com.au/wp-content/uploads/2019/05/Nautitech-Camera-Brochure-2019.pdf (accessed on 8 March 2023)
Table 13. Summary of health monitoring solutions.
Company | Product | Type of Solution | Environment | Technology | Comments | Link
DotNetix | NEXUS | Monitoring operator fatigue | Surface and underground | | Using advanced sensors and algorithms, this system monitors the operator and determines from his facial features whether he is distracted or tired. | https://www.dotnetix.co.za/fatigue-monitoring (accessed on 8 March 2023)
Torsa | Human vibration exposure monitoring | Miner safety | Surface and underground | Vibration sensor | TORSA's vibration monitoring system measures and evaluates the vibrations to which the operators of mining vehicles are exposed, in order to reduce the number of possible injuries that could result from their daily activity. | https://torsaglobal.com/en/solution/human-vibration-exposure-system/ (accessed on 8 March 2023)
Mining3 | SmartCap | Monitoring operator fatigue | | | SmartCap is a wearable device that monitors driver and heavy-vehicle operator tiredness. The Life app gives operators the ability to control their own level of awareness; through the user-friendly in-cab Life display, the app gives the driver immediate visual and aural alerts. | https://www.mining3.com/solutions/smartcap/ (accessed on 8 March 2023)
Hexagon | MineProtect operator alertness system | Fatigue and distraction management solution | Surface and underground | | An integrated fatigue and distraction management solution, the HxGN MineProtect operator alertness system helps operators of heavy and light vehicles maintain the level of attention necessary for long hours and monotonous tasks. | https://hexagon.com/products/hxgn-mineprotect-operator-alertness-system (accessed on 8 March 2023)
Caterpillar | Cat Detect | Fatigue and distraction management | Surface and underground | | Any operation is susceptible to distraction and exhaustion, especially if the activities are monotonous. Cat Detect provides fatigue and distraction management, a solution that helps operators create a culture where safety comes first, giving them peace of mind. | https://www.westrac.com.au/technology/minestar/minestar-detect (accessed on 8 March 2023)
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
