Article

Comprehensive Bird Preservation at Wind Farms

1 Bioseco Sp. z o.o., Budowlanych 68, 80-298 Gdansk, Poland
2 Department of Mathematics and Natural Sciences, Blekinge Institute of Technology, 371 79 Karlskrona, Sweden
3 Department of Vertebrate Ecology and Zoology, University of Gdansk, Wita Stwosza 59, 80-308 Gdansk, Poland
* Author to whom correspondence should be addressed.
Sensors 2021, 21(1), 267; https://doi.org/10.3390/s21010267
Received: 26 November 2020 / Revised: 20 December 2020 / Accepted: 25 December 2020 / Published: 3 January 2021
(This article belongs to the Section Intelligent Sensors)

Abstract:
Wind as a clean and renewable energy source has been used by humans for centuries. However, in recent years with the increase in the number and size of wind turbines, their impact on avifauna has become worrisome. Researchers estimated that in the U.S. up to 500,000 birds die annually due to collisions with wind turbines. This article proposes a system for mitigating bird mortality around wind farms. The solution is based on a stereo-vision system embedded in distributed computing and IoT paradigms. After a bird’s detection in a defined zone, the decision-making system activates a collision avoidance routine composed of light and sound deterrents and the turbine stopping procedure. The development process applies a User-Driven Design approach along with the process of component selection and heuristic adjustment. This proposal includes a bird detection method and localization procedure. The bird identification is carried out using artificial intelligence algorithms. Validation tests with a fixed-wing drone and verifying observations by ornithologists proved the system’s desired reliability of detecting a bird with wingspan over 1.5 m from at least 300 m. Moreover, the suitability of the system to classify the size of the detected bird into one of three wingspan categories, small, medium and large, was confirmed.

1. Introduction

With the growth of the human population, the robust expansion of urban facilities such as wind farms, power lines and airports into the natural environment of animals, particularly birds and bats, may be observed [1,2,3,4,5,6,7]. Therefore, mutual cohabitation of wildlife and humans increasingly leads to unwanted conflicts and close contact. Bird strikes with synthetic structures are dangerous situations for both. On the one hand, turbine damage and airplane crashes cause human problems. On the other hand, human expansion inflicts itself on the local ecosystem leading not only to habitat loss and fragmentation but above all to the suffering and death of birds [8].
Although wind power is regarded as a green and renewable energy source, it may also cause the death of rare species of birds and bats. High-speed rotating wind turbine blades are barely visible to hunting predatory birds. It is hard to estimate the accurate mortality rate, but the most recent studies show that in the U.S. between 140,000 and 500,000 birds die annually [9,10,11]. With the increase of wind energy capacity, this number could even reach 1.4 million. Therefore, there is an immediate need for technical solutions mitigating the impact of wind turbines on local avifauna [9].
Sustainable development requires not only the reduction of carbon dioxide emissions; it must also be conducted without the depletion of nature and wildlife [12]. Therefore, among other things, humans need to develop sustainable, efficient and nature-friendly methods and instruments that help mitigate the impact of synthetic structures and machines on avifauna.
One of the main technological approaches to achieving Avifauna Individuals Avoidance (AIA) is the Automated Detection and Reaction Method (ADaRM). The ADaRM might be based on the detection of birds and/or bats using machine vision and appropriate reactions to achieve AIA. Such reactions may use light deterrents and/or sound signals, the slowing or stopping of a wind turbine, the delayed takeoff or landing of an aircraft, or the calling of a falconer to scare resting or foraging birds. A solution based on this approach using a stereo-vision system embedded in distributed computing and IoT paradigms is presented in this paper. The system development process applies a User-Driven Design method. The bird detection, localization and identification are carried out using vision methods and artificial intelligence algorithms. Validation tests with a fixed-wing drone and verifying observations by ornithologists proved the system can protect birds around wind farms, with the desired reliability.

2. Survey of Related Work

2.1. Collision Prevention

Several research papers have addressed effective bird protection on wind farms [13,14,15]. So far, the solution preferred by ornithologists is periodic turbine shut-downs during specific weather conditions. The shut-down of turbines is also obligatory during spring and autumn migrations. However, this solution limits the power production of the wind farm and thus the operators' profits. Therefore, an automatic collision prevention system that could reduce bird mortality is the subject of research for many scientists and engineers [16].
Pulsing light is one of the methods of repelling birds. Such a solution is widespread in airports to prevent bird collisions with airplanes. Blackwell et al. [17] show that birds register pulsating lights more quickly than static lights. Moreover, they claim that the best repelling reaction may be obtained for lights in the wavelength range of 380 nm–400 nm. Doppler et al. [18], continuing this research, applied light at a wavelength of 470 nm, obtaining promising results. In the most recent works, Goller et al. [19] tested LED light at 380 nm, 470 nm, 525 nm and 630 nm, showing that the best results were obtained with 470 nm and 630 nm LED light. Moreover, their research showed that wavelengths of 380 nm and 525 nm may actually attract birds.
Another tested method of repelling birds is sound. Bishop et al. [20] show that high-frequency sounds and ultrasounds are either inefficient or even dangerous to birds. They also proved that lower sound frequencies deter birds more efficiently. However, they observed a habituation effect in birds subjected to longer emissions of the same sound.
Cadets of the Air Force Academy proposed combining pulsing lights with sounds [21]. They obtained good effects in repelling birds using white light, and sound of 2 kHz at a strength of between 90 dB and 135 dB. The presented research shows that it is possible to repel birds from the wind turbine vicinity; however, to reduce the habituation effect it is recommended that the repelling method is only used when a bird is approaching the turbine. Moreover, to ensure enough reaction time for the turbine stopping routine, the bird needs to be detected from a sufficient distance away, which can vary for different species [16].

2.2. Detection Methods

The very first automated bird detection systems were created in the 1950s and were mostly based on radar [22,23]. The interest in the bird detection problem was aroused with the growth in aviation and the subsequent increase in bird strikes. Radar systems can detect any flying object in the monitored area and estimate the object's position, velocity and movement [3]. The detection range depends on several factors, including the system frequency band, beam angle and power, and antenna size. Presently, bird detection systems allow observations up to 5 km [24]. However, radars are not able to perform direct classification of the species or to distinguish birds from other flying objects, e.g., drones. Therefore, detailed analysis of the obtained data is still required, e.g., through biologist consultation [3]. Moreover, the price, the size of the system, the power consumption, and government emissions regulations limiting the beam frequency and power are the main barriers to wide-scale application of radar for bird detection [25].
Despite the limitations, radar is widely used for bird observations [23,24]. Nevertheless, in the last decade, with the development of image processing algorithms, Artificial Intelligence (AI) and advances in Graphics Processing Unit (GPU) capabilities, vision-based detection systems are becoming more and more powerful [26]. There are two well-known vision detection approaches applied in industrial applications: the single-camera and the stereoscopic methods. A single camera unit can detect bird movement and carry out species identification. Such an approach finds use in aerial systems [27,28] and in low-budget detection systems [4]. However, the most recent systems use stereoscopy, which extends the single-camera system capabilities with additional position and size information for the detected birds [29,30]. Presently, high-resolution cameras coupled in stereoscopic mode may ensure distance estimation performance similar to radar systems [31]. However, the detection range of vision-based systems is limited to about 1.0 km [30]. The main advantage of the vision approach over radar is its ability to detect a single bird or bat, which can then be followed by its identification [32].
In recent years, several approaches have been developed to solve the vision-based bird detection problem on wind farms. Companies such as DT Bird [33], SafeWind [34], IdentiFlight [35], BirdVision and Airelectronics [36] have already implemented and validated their solutions, see Table 1.
Most of the available solutions on the market are based on the monoscopic approach installed on the wind turbine. Only [35] applies stereo-vision installed on a separate tower; however, the system's orientation is unidirectional. Depending on the sensor used, the detection ranges of the presented solutions vary between 300 m and 1500 m. Most of them use sounds and turbine stopping for collision prevention. Only the IdentiFlight solution includes an embedded classifier able to distinguish three different species.

2.3. Identification Algorithms

The core of a vision-based system is a detection algorithm. With the growing capability of computers, AI-based detection algorithms are becoming more efficient [26]. Since 2012, when Krizhevsky's neural network won the ImageNet competition [37], AI-based solutions have become common for image identification tasks [38]. Furthermore, Machine Learning (ML) and Deep Learning (DL) techniques are being applied for object detection [39], image classification [40] and sound recognition [41].
Regardless of the image sensor used, it is challenging to distinguish birds from other flying objects such as insects, drones, or airplanes. Therefore, the Deep Learning approach is deemed as a suitable tool for bird identification [4,28,42,43,44,45,46]. The comparative analysis of AI-based methods used for bird identification is presented in Table 2.
There are many types of Convolutional Neural Networks (CNN) used for bird identification [50,51,52]. In general, besides the number of layers, the important parameters of a CNN architecture are the pooling method, the activation function and the optimization method. Pooling with a 2 × 2 max feature window is commonly used. Among the activation functions, ReLU and softmax are the most popular. In the training process of the CNN, the Stochastic Gradient Descent algorithm is used as an error minimization method. Other parameters, such as input image size, number of epochs, and size of the training dataset, are selected individually for each task.
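As an illustration only (not any of the cited architectures), the three standard building blocks named above, 2 × 2 max pooling, the ReLU activation and the softmax normalization, can be sketched in a few lines of NumPy:

```python
import numpy as np

def max_pool_2x2(x):
    """2 x 2 max pooling with stride 2, the window reported as the common choice."""
    h, w = x.shape
    # Trim odd edges, then group pixels into 2 x 2 blocks and take the block maximum.
    return x[:h - h % 2, :w - w % 2].reshape(h // 2, 2, w // 2, 2).max(axis=(1, 3))

def relu(x):
    """Rectified Linear Unit: zero out negative activations."""
    return np.maximum(x, 0.0)

def softmax(z):
    """Normalize logits into a probability distribution over classes."""
    e = np.exp(z - z.max())  # subtract the maximum for numerical stability
    return e / e.sum()
```

In a full CNN these operations are interleaved with convolutional layers and trained with Stochastic Gradient Descent; the snippet only makes the cited design choices concrete.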
There have also been attempts to apply identification methods other than neural networks, such as the Haar Feature Based Cascade Classifier [48], which can give better individual detection performance compared to a CNN but does not perform as well on tasks with many features [42]. Another method is the Long Short-Term Memory (LSTM) network [45], which gives better performance for bird identification near the moving blades of wind turbines. Nevertheless, identification of small birds with CNN-LSTM still requires improvement. A dense CNN [48] reported good identification performance with additional skip connections [46,53], which improve feature extraction. After 100 epochs, a CNN with skip connections reaches nearly 99% identification accuracy, whereas the same architecture without skip connections reaches 89%.

3. Problem Statement, Objectives and Main Contributions

As the survey of related works shows, there are several solutions for bird protection at wind farms. Most of them are based on a single camera; however, with monoscopic vision it is not possible to accurately estimate either a bird's size or its distance from a turbine. Those features are crucial for a reliable and efficient bird collision avoidance system, where the reduction of unnecessary turbine stopping is desired. To stop the wind turbine safely, a bird needs to be detected from a distance of 200–400 m, depending on the species and its flying characteristics. The detection range can also depend on a wind farm's surroundings and on local environmental authority requirements regarding safety. Most of the protected birds are of medium and large sizes, with a wingspan of more than 1.2 m. Reliable detection of avifauna and its classification is a challenge, especially at relatively long distances and in varying weather conditions.
The main objective of the paper is to find a structure for a vision-based bird collision avoidance system. The solution should detect and identify a bird from a range of at least 300 m and then classify it into one of three bird categories: small, medium and large. The system should work in real time to ensure that it is possible for the turbine to stop in enough time to avoid collision. The mechanical structure of the system must facilitate its installation on a wind turbine. The system needs to be customizable to adjust its functionalities with respect to the requirements of both the local environmental authorities and the wind farm developers. Moreover, the system should assure high reliability of detection, identification and classification without compromising the needs of low purchase cost along with installation and maintenance costs. To assure a real-time operation mode, the proposed solution applies distributed computing within the IoT paradigm [54], together with a stereoscopic vision acquisition system and AI-based identification and size classification algorithms. The modular structure of the system is proposed to monitor and track bird presence all around the wind turbine. Furthermore, based on processed information about the bird category and its distance to the turbine, the system provides a suitable reaction to avoid its collision with the turbine. The proposed decision-making system combines information from each detection module, estimates the bird's position, classifies it into one of three categories and takes suitable deterrent measures such as stroboscopic lights and/or pulsing sound, or stops the turbine. To design a system which would meet the requirements specified by the local environmental authorities, wind farm developers and turbine manufacturers, a User-Driven Design (UDD) methodology [55] is used.
The proposed system was implemented, and its prototypes’ performance was experimentally validated and then verified by ornithologists in a real environment on the wind turbine.

4. Design

The needs definition and system design phases of the Bird Protection System (BPS) were based on the User-Driven Design methodology [55]. In each step of the systematized design process, the following stakeholders are involved: wind farm owners and environmental authorities, future users, ornithologists or wind farm employees preparing reports about bird activities, and finally designers and manufacturers. Such an approach minimizes the risk of not meeting expected needs and allows for a market-tailored design solution.
The design of a bird protection system is complex due to the possible counteractive requirements of the stakeholders and environmental authorities. On the one hand, the environmental authorities require high reliability of collision prevention and thus wind turbine stopping on each rare or big bird occurrence. On the other hand, wind farm developers and operators need to prevent unnecessary breaks in power production and wish to minimize turbine-off time.
In Table 3, the functionalities expected of a system and the constraints, which limit possible solutions are shown. For the environmental authorities, wind farm developers and designers along with manufacturers, the overriding goal is to protect birds. Environmental authorities especially, expect high protection of rare birds and additional protection for other birds. Wind farm owners aim to meet the requirements of the authorities while maintaining the highest possible production with the least possible stoppage of turbines. The designers and manufacturers need to create systems that meet requirements of both contributors.
For detection and protection of the birds, the environmental authorities require that a monitoring system work effectively during daylight, because most birds do not fly at night [56]. Reliable collision avoidance in the form of turbine stopping is demanded for all rare species and most big birds with a wingspan larger than 1.5 m. For medium and small birds, deterrent methods such as sound and light signals are allowed from a long distance.
Recorded photos and videos of each event need to be provided for data validation and used as an assessment tool for the prevalence of individual bird species. Moreover, the resolution of the photos and videos should allow identification of the birds on the captured frames.
The wind farm owners expect reliable collision avoidance systems with minimal impact on the turbines and power production. This could be ensured with reliable classification of bird sizes. With precise information about the detected bird's size and, knowing its distance, it is possible to limit turbine stopping to rare bird species at close distances. Moreover, in some cases stakeholders require additional deterrent methods launched in advance to force the bird to change its flight path before reaching the turbine stopping zone.
As a nonfunctional requirement stakeholders require non-invasive installations, which will not affect the turbine lifetime and its warranty. Easy access to the data gathered by the system e.g., through web and/or mobile applications is crucial for all the stakeholders. Moreover, all gathered data are expected to be retained for at least two years raising the possibility of their presentation and aggregation, which will be useful in annual reports on bird activity.
It is important to develop easy to install and quick to run systems, which in the case of malfunction will be easy to replace. The expected lifetime of the system is to exceed 20 years. It is also crucial to offer a solution that is compatible with existing systems in the turbine, especially in the case of turbine stopping routines. In the case of system installations on different farms, in different countries on different continents it is also necessary to have a remote connection with the system for maintenance purposes and daily status checks.
To achieve good acceptance, the system should be highly reliable in bird detection, with a small number of false positive detections caused by non-bird objects such as airplanes, clouds or insects.
The functionalities and constraints shown were assessed as technically feasible. However, the most important among them is the possibility of system customization and configuration. Different wind farms, even within one country, could require different triggers for activating the turbine stopping routine and/or the deterrent signals. The triggers could be defined by the bird size and distance. Therefore, before deploying the system, its basic configuration should be established, according to Figure 1. Following environmental authority guidance, some wind farms need to stop the turbine only for birds considered big, with a wingspan of more than 1.5 m. In other cases it is forbidden to use acoustic signals due to the proximity of buildings. In some cases the operators prefer to use only deterrent signals, without stopping the turbine.
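A per-farm trigger configuration of this kind can be modelled as a small rule table. The sketch below is purely illustrative: the size classes, distance thresholds and action names are hypothetical, not values from the paper.

```python
# Hypothetical trigger table: (size class, maximal distance [m], action).
# Rules are checked in order; the first match wins.
RULES = [
    ("big",    300, "stop_turbine"),   # stop only for big birds at close range
    ("big",    600, "deterrent"),      # light/sound deterrents farther out
    ("medium", 150, "stop_turbine"),
    ("medium", 400, "deterrent"),
    ("small",  200, "deterrent"),      # small birds never trigger a stop
]

def action_for(size_cls, distance_m):
    """Return the configured reaction for a detected bird, or 'none'."""
    for cls, max_d, action in RULES:
        if cls == size_cls and distance_m <= max_d:
            return action
    return "none"
```

Swapping the `RULES` table per installation is one way such farm-specific requirements (e.g., no acoustic signals near buildings) could be expressed without changing the detection pipeline.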

5. Modelling

The general system configuration chosen is presented in Figure 2. The system is composed of five separate segments: Data Acquisition, Bird Detection, 3D Localization, Bird Size Classification, and the Collision Avoidance System. The Data Acquisition block represents the system hardware and its functionalities, which ensure the reproduction of the bird image onto an image plane. The Bird Detection algorithms allow real-time detection resulting in the object contour. The 3D Localization algorithm is used to estimate the detected object's distance and height from the turbine. The object's contour and its 3D localization from the turbine are used for Bird Size Classification. In the final stage, the Collision Avoidance decision is made and the method undertaken.
To achieve the project objectives, the interrelated parameters of the vision system, such as Vision Sensor Size (VSS), Field of View (FoV), and Image Resolution (IR), need to be selected according to system constraints that include cost-efficiency. The inter-dependent parameters of the hardware configuration of the bird protection system are presented in Figure 3.
To optimize the solution, we use the systematic approach presented in this section. In Section 5.1, the parameters of the hardware system components are selected in such a way that the focal length f, FoV and VSS are optimal. Then, in Section 5.2, the stereo-vision baseline, which ensures bird localization in the range of 300 m, is selected. In Section 5.3, the system processing architecture is presented.

5.1. Acquisition and Detection System

As the system needs to monitor the space around the turbine, a 360° horizontal field of view (FoV_h) is required. This can be met by multiplication of a single detection module, as presented in Figure 4. Furthermore, to prevent a bird's collision with the rotor blades, the detection system needs to monitor and detect objects in the space in front of the blades at a distance which allows time for suitable avoidance action.
Overall, the shape of the monitored space depends on the height of system installation l, the camera vertical FoV_v and the number of modules used N, see Figure 4. To maximize cost-efficiency, the number of modules needs to be limited to N_max = 10, which defines the minimal camera FoV_h as:
FoV_h ≥ 360° / N_max  ⟹  FoV_h ≥ 36°    (1)
From the Side View shown in Figure 4a, the dead zone of the system is defined by two variables: the dead zone maximal distance E and the blade rotation diameter R_B. Knowing that the detection range needs to be no less than 40 m to allow time for a successful repelling action, and assuming the maximal height of wind turbines to be around 100 m with a blade rotation diameter of around 80 m, the minimal required FoV_v can be calculated using the formula:
FoV_v ≥ 90° − arctan(E / R_B)  ⟹  FoV_v ≥ 63°    (2)
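The two field-of-view constraints can be checked numerically; the values E = 40 m and R_B = 80 m below follow the assumptions stated in the text (40 m detection margin, ~80 m rotor diameter):

```python
import math

N_MAX = 10   # maximal number of detection modules (cost constraint)
E = 40.0     # dead zone maximal distance [m], per the stated assumption
R_B = 80.0   # blade rotation diameter [m], per the stated assumption

# Constraint (1): the modules together must cover 360 degrees horizontally.
fov_h_min = 360.0 / N_MAX

# Constraint (2): the vertical FoV must reach over the dead zone to the blades.
fov_v_min = 90.0 - math.degrees(math.atan(E / R_B))
```

With these inputs, `fov_h_min` is exactly 36° and `fov_v_min` comes out to about 63.4°, matching the rounded bounds in the text.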
A vision system is defined by the focal length f [m], Vision Sensor Size VSS_h/v [m], Vision Sensor Resolution VSR_h/v [px], and Pixel Size p_W/H [m], which can be different in the horizontal (h) and vertical (v) axes. The relationship between FoV, VSS and f can be shown as:
FoV_h/v = 2 × arctan(VSS_h/v / (2f))    (3)
Knowing the distance D_b [m] of the object from the vision system, the object's size in width Size_W [m] and in height Size_H [m] can be estimated using the formula:
Size_W/H = D_b × (p_W/H / VSR_h/v) × (VSS_h/v / f)    (4)
where p_W [px] and p_H [px] are the object's width and height on the image in pixels, respectively. During the simulations, we assume that the bird is oriented perpendicular to the camera, as during, e.g., gliding flight. Therefore, p_W represents the wingspan and p_H the bird's body length.
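The size estimate above is a one-line computation. The sensor parameters in the usage example below are hypothetical round numbers (a 1920 px wide, 4.8 mm sensor behind a 3 mm lens), not values from Table 4:

```python
def object_size_m(d_b, p_px, vsr_px, vss_m, f_m):
    """Physical object size [m] from its pixel extent:
    Size = D_b * (p / VSR) * (VSS / f)."""
    return d_b * (p_px / vsr_px) * (vss_m / f_m)

# Hypothetical example: a 12 px wide object seen at 300 m corresponds
# to a 3 m wingspan for this assumed sensor/lens combination.
width_m = object_size_m(300.0, 12, 1920, 0.0048, 0.003)
```

Because every factor is a simple ratio, the same helper inverted gives the pixel footprint of a bird of known size, which is how the detectability thresholds below can be checked.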
As a rule of thumb, to distinguish the bird from the background, its image needs to be at least p_Wmin = 12 px and p_Hmin = 2 px. This requirement and the constraints of the monitored space (1) and (2) lead to the following sets of equations:
(Size_W / D) × (VSR_h / VSS_h) × f ≥ 12 px,  2 × arctan(VSS_h / (2f)) ≥ 36°    (5)
(Size_H / D) × (VSR_v / VSS_v) × f ≥ 2 px,  2 × arctan(VSS_v / (2f)) ≥ 63°    (6)
Four common vision sensors and their optical parameters are listed in Table 4.
The case study of system performance is carried out for the medium-sized protected bird, the Red Kite (Latin: Milvus milvus), whose length is between 0.61 m and 0.72 m and wingspan between 1.40 m and 1.65 m. Using data from Table 4, the image width p_W and height p_H of the bird at a distance of 300 m, as well as the horizontal (FoV_h) and vertical (FoV_v) fields of view, were calculated for six different lenses, see Table 5.
In Table 5, the cases where all FoV_v/h and p_W/H values fulfill requirements (5) and (6) are in bold. Of the four cases in bold, only C1 assures the best performance in terms of computational complexity for the cheapest lens of f = 3 mm.
To illustrate the system performance for the selected C1 camera with an f = 3 mm lens, Figure 5 shows how the bird's projection on the image plane depends on the bird's size and its distance from the system. The red lines are demarcations between the three main classes of bird sizes: big, with a wingspan of more than 1.5 m; medium, with a wingspan between 1.2 m and 1.5 m; and small, with a wingspan between 0.75 m and 1.2 m. The classification capabilities of the system design are represented by the gradient of the blue plane. It can be observed that for a target system range of 300 m, the system is capable of distinguishing between medium and large birds. The small bird detection range is below 180 m, which nevertheless meets the general system requirements.
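The size classes and the projected pixel footprint can be combined into a small sketch. The sensor numbers below (3840 px across a 4.8 mm sensor, f = 3 mm) are hypothetical stand-ins, not the Table 4/5 values; the class boundaries are the ones quoted above:

```python
def wingspan_px(wingspan_m, distance_m, f_m, vsr_px, vss_m):
    """Projected wingspan in pixels: the size formula inverted."""
    return wingspan_m / distance_m * f_m * vsr_px / vss_m

def size_class(wingspan_m):
    """Wingspan categories from the text: small 0.75-1.2 m,
    medium 1.2-1.5 m, big above 1.5 m."""
    if wingspan_m > 1.5:
        return "big"
    if wingspan_m >= 1.2:
        return "medium"
    return "small"
```

With these assumed optics, a 1.5 m wingspan at 300 m projects to 12 px, i.e., exactly the detectability threshold, which is the kind of trade-off Figure 5 visualizes.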

5.2. Collision Avoidance System

The Collision Avoidance System is based on a decision-making algorithm, which processes information about the detected object's distance and size. To estimate the object's distance from the tower, stereo-vision is used.
For convenience of installation and maintenance, the vision system is mounted at the lower part of the wind turbine tower. Therefore, to cover the required observation area in front of the blades, the baseline of the stereo cameras is in the vertical position and rotated by α = FoV_v / 2, as shown in Figure 6. By analogy, the stereoscopic scene is also rotated by α. The distance D_b between the baseline and the object can be calculated from stereoscopic imaging using the formula [57]:
D_b = (B × VSR_h) / (2 × (y_u − y_d) × tan(FoV_v / 2))    (7)
where (y_u − y_d) is the difference in pixels between the projections of the object on the upper (y_u) and lower (y_d) camera matrices, respectively, and the baseline B = B_u + B_d, where [58]:
B_u = D_b × tan(φ_u)    (8)
B_d = D_b × tan(φ_d)    (9)
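The stereoscopic distance formula above is straightforward to implement. The pixel coordinates and sensor resolution in the usage example are hypothetical, chosen only to exercise the function:

```python
import math

def stereo_distance(b_m, vsr_h, y_u, y_d, fov_v_deg):
    """Distance D_b [m] from the vertical-baseline stereo pair:
    D_b = B * VSR_h / (2 * (y_u - y_d) * tan(FoV_v / 2))."""
    disparity = y_u - y_d  # pixel difference between upper and lower projections
    return b_m * vsr_h / (2.0 * disparity * math.tan(math.radians(fov_v_deg) / 2.0))

# Hypothetical reading: 1 m baseline, 3840 px resolution, 10 px disparity, 63 deg FoV.
d_b = stereo_distance(1.0, 3840, 2000, 1990, 63.0)
```

Note the inverse dependence on the disparity (y_u − y_d): doubling the pixel difference halves the estimated distance, which is also why far objects, with disparities of only a few pixels, carry the largest uncertainty.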
The distance D between the object and the tower, and the height H of the object with respect to the lower camera, can be calculated from:
D = L_D + E_D    (10)
H = L_H + E_H    (11)
The components of (10) and (11) can be found from:
E_D = D_b × cos(α),  L_D = B_u × sin(α) = D_b × tan(φ_u) × sin(α)    (12)
E_H = D_b × sin(α),  L_H = B_d × cos(α) = D_b × tan(φ_d) × cos(α)    (13)
where φ_u and φ_d are the angles between the optical axis of the camera and the line connecting the center of the camera with the detected bird, for the upper and lower camera, respectively. They can be calculated as:
tan(φ_u) = (2y_u / VSR_h − 1) × tan(FoV_v / 2)    (14)
tan(φ_d) = (2y_d / VSR_h − 1) × tan(FoV_v / 2)    (15)
Using (10)–(15), the distance D and height H can be calculated from:
D = D_b × (tan(φ_u) × sin(α) + cos(α)) = (B × VSR_h) / (2 × (y_u − y_d) × tan(FoV_v / 2)) × [(2y_u / VSR_h − 1) × tan(FoV_v / 2) × sin(α) + cos(α)]    (16)
H = D_b × (tan(φ_d) × cos(α) + sin(α)) = (B × VSR_h) / (2 × (y_u − y_d) × tan(FoV_v / 2)) × [(2y_d / VSR_h − 1) × tan(FoV_v / 2) × cos(α) + sin(α)]    (17)
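The full distance/height computation chains these relations together. The inputs in the test are hypothetical sensor values, used only to confirm the function returns plausible positive distances and heights:

```python
import math

def bird_position(b_m, vsr_h, y_u, y_d, fov_v_deg, alpha_deg):
    """Distance D and height H of the object from the rotated vertical
    stereo baseline, following the closed-form expressions above."""
    t_half = math.tan(math.radians(fov_v_deg) / 2.0)
    d_b = b_m * vsr_h / (2.0 * (y_u - y_d) * t_half)   # stereoscopic range
    tan_phi_u = (2.0 * y_u / vsr_h - 1.0) * t_half     # upper-camera angle
    tan_phi_d = (2.0 * y_d / vsr_h - 1.0) * t_half     # lower-camera angle
    a = math.radians(alpha_deg)                        # baseline rotation
    dist = d_b * (tan_phi_u * math.sin(a) + math.cos(a))
    height = d_b * (tan_phi_d * math.cos(a) + math.sin(a))
    return dist, height
```

For small angles φ the cos(α)/sin(α) terms dominate, so D ≈ D_b cos(α) and H ≈ D_b sin(α): the bird sits roughly along the rotated optical axis at range D_b.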
The distance D is a non-linear function of VSR, B, and FoV. For the chosen parameters of VSR_h and FoV_v, the baseline length B determines the uncertainty of the distance measurement, which can be estimated using the exact differential method expressed by the formulas [59]:
ΔD_b = ± D_b / (2 × (y_u − y_d))    (18)
ΔD = ± [D + B × sin(α)] / (2 × (y_u − y_d))    (19)
ΔH = ± [H + B × cos(α)] / (2 × (y_u − y_d))    (20)
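These uncertainty bounds are easy to evaluate; the disparity of 5 px and the 31.5° rotation in the example are hypothetical values, not the paper's measured figures:

```python
import math

def distance_uncertainty(dist, b_m, y_diff, alpha_deg):
    """Delta D = (D + B * sin(alpha)) / (2 * (y_u - y_d))."""
    return (dist + b_m * math.sin(math.radians(alpha_deg))) / (2.0 * y_diff)

def height_uncertainty(height, b_m, y_diff, alpha_deg):
    """Delta H = (H + B * cos(alpha)) / (2 * (y_u - y_d))."""
    return (height + b_m * math.cos(math.radians(alpha_deg))) / (2.0 * y_diff)
```

Both bounds scale with 1/(y_u − y_d), so the uncertainty grows as the disparity shrinks at long range, and a longer baseline B helps mainly by increasing that disparity for a given distance.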
Since the desired object detection range is up to 300 m, the recommended baseline should be between 3 m and 10 m [57]. However, such a large baseline is not technically convenient; therefore, a 1 m baseline is applied. The impact of the baseline on the distance measurement uncertainty is presented in Figures 7 and 8. The figures show how the measurement uncertainty and y_diff vary with distance for three values of the baseline and for the given worst case of φ_u, i.e., y_u = VSR_v, for camera C1 with the 3 mm lens. The quantization error for B = 1 m is up to three times greater than for B = 3 m, but it is still acceptable, since at a 300 m distance the measurement uncertainty is around 30.7 m, close to the desired 90% accuracy.
The measurement resolution uncertainty and the pixel difference value y_diff with respect to distance and height for the applied 1 m baseline are shown in Figure 9 and Figure 10, respectively. To see how the pixel difference value y_diff and the uncertainty depend on the angles φ_u and φ_d, the values of y_u and the respective y_d are set to the minimum and maximum, i.e., 1 px and VSR_v, respectively. The distance estimation uncertainty at the desired detection range of 300 m varies from 15 m to 33 m, for y_diff equal to 7 px and 17 px, respectively. The uncertainty of the object height measurement is greater than that of the distance measurement, and at the desired detection range of 300 m it varies from 9 m to 36 m, for y_diff equal to 5 px and 19 px, respectively.

5.3. Processing System Architecture

The system processing architecture is shown in Figure 11. To ensure real-time performance, the system is based on the distributed computing concept. It consists of two main subsystems: the Detection Module and the Decision-Making System. The first uses the embedded CPU and GPU architecture of the Local Processing Unit, whereas data processing in the second subsystem is performed on the database server. The input data of the Detection Module are provided by the stereo-vision system, consisting of two integrated cameras. Data from each camera are used for independent Motion Detection and Object Identification.
When a moving object is identified as a bird, a trigger is activated and the information determined by the Motion Detection algorithm, the object's size o_s [px], its width p_W [px] and height p_H [px], and the geometric center coordinates x_c and y_c, is sent to the Decision-Making System. Additionally, the frame with the identified bird is received by the Decision-Making System for data handling. The Local Processing Unit is designed applying the Internet of Things (IoT) concept with IP addressing, so the Decision-Making System can easily identify the source of the data stream. Information from the Local Processing Units combines estimation of the object's distance with classification of the bird's size. Based on the classification, a decision about the action to be performed by the Collision Avoidance routine is taken. All data are stored in the database and available on the website via a GUI.

5.4. Detection Module Processing

The bird detection algorithm presented in Figure 12 is based on motion detection, which guarantees the low computational complexity needed for real-time processing. First, two image frames, the current and the previous one, are subjected to Mean blurring using the Gaussian blur method, also known as Gaussian smoothing. This step aims to minimize the impact of small lighting changes. Then, the frames are subtracted from each other to determine the differences between the images caused by an object's movement. The frame difference generates a gradient matrix containing the value of the difference at each pixel. To filter out negligible differences, the image is subjected to Difference thresholding. In the resulting image, the moving object may appear split in two. To determine a single envelope of the object, the split images need to be merged, which is done by Mean blurring using Gaussian blur filtering and Binary thresholding. Then, Contour detection is applied to the resulting binary image to obtain the envelope of the object. Knowing the object contour, the values of p_W and p_H can be calculated using the center of mass of the object contour and image moments [60]. If the object is smaller than p_Wmin = 12 px or p_Hmin = 2 px, it is neglected as an artefact. Otherwise, objects smaller than 100 px are cropped using a standard mask of 100 × 100 px. If the object is greater than 100 px in either dimension, the cropping mask is resized to the greater value of p_W or p_H. The object size, o_s, is calculated as the number of pixels over the contour, which is computed using the Green formula [61]. The object size is used in the Decision-Making System for object classification.
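The pipeline above can be sketched with plain NumPy. This is a simplified illustration, not the authors' implementation: a 3 × 3 box blur stands in for Gaussian smoothing, the bounding box of the merged binary mask stands in for contour detection, the pixel count stands in for the Green-formula area, a single moving object is assumed, and the thresholds are illustrative:

```python
import numpy as np

def blur3(img):
    """3x3 box filter as a cheap stand-in for Gaussian smoothing."""
    p = np.pad(img.astype(float), 1, mode="edge")
    h, w = img.shape
    return sum(p[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0

def detect_motion(prev, curr, diff_thr=25.0, p_w_min=12, p_h_min=2):
    d = np.abs(blur3(curr) - blur3(prev))        # blur, then frame difference
    mask = d > diff_thr                          # Difference thresholding
    mask = blur3(mask.astype(float)) > 0.2       # merge split blobs: blur + Binary thresholding
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    p_w = int(xs.max() - xs.min() + 1)           # envelope width [px]
    p_h = int(ys.max() - ys.min() + 1)           # envelope height [px]
    if p_w < p_w_min or p_h < p_h_min:
        return None                              # neglected as an artefact
    return {
        "p_W": p_w, "p_H": p_h,
        "o_s": int(mask.sum()),                  # pixel count over the envelope
        "x_c": float(xs.mean()),                 # centroid from zeroth/first moments
        "y_c": float(ys.mean()),
    }

# Synthetic example: a 30 x 20 px bright object appears between two frames.
prev = np.zeros((120, 160))
curr = prev.copy()
curr[40:60, 70:100] = 255.0
det = detect_motion(prev, curr)
```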
In the next step, the cropped image is subjected to the identification process, since the Motion Detection algorithm determines all moving objects: not only birds, but also insects, planes, drones or moving clouds. A Convolutional Neural Network, CNN, was applied, as it is a standard approach to object classification.
The architecture of the proposed CNN is presented in Figure 13. It consists of two convolutional layers with sizes L_C1 and L_C2, which are used for feature extraction. Two additional fully connected layers, L_FC1 and L_FC2, are responsible for class probability calculation and final object identification. The layer L_FC1 uses the Softmax function to obtain the binary bird/not-bird information. The layers L_C1, L_C2 and L_FC1 are activated by the Rectified Linear Unit function. Between the layers L_C1, L_C2 and L_FC1, Max pooling with a 2 × 2 pool size is used. The number of features was set to 50.
At the first stage of the CNN design, the model needed to be optimized and trained. The aim of the optimization process is to select a suitable number of neurons in each layer (L_C1 × L_C2 × L_FC1), ensuring real-time inference with high reliability. The size of L_FC2 for the binary decision-making task was a priori set to 2, and the validation split was set to 10%. The parameter ε of the ADAM optimizer was set to 10^−7, the learning rate to 10^−5, and the training length to 50 epochs. A 3 × 3 convolution kernel was used.
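A minimal NumPy sketch of the forward pass through such an architecture is shown below, using randomly initialized weights and the 32-32-128 layer sizes that the paper later selects as the trade-off configuration. Placing the Softmax on the final two-unit output is our reading of the binary decision stage; the weight values, input and helper names are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(x, w):
    """Valid 2-D convolution; x: (H, W, Cin), w: (kh, kw, Cin, Cout)."""
    kh, kw, cin, cout = w.shape
    H, W, _ = x.shape
    out = np.zeros((H - kh + 1, W - kw + 1, cout))
    for i in range(kh):           # accumulate the 3x3 kernel taps, vectorized per tap
        for j in range(kw):
            out += np.einsum("hwc,co->hwo", x[i:i + H - kh + 1, j:j + W - kw + 1], w[i, j])
    return out

def relu(x):
    return np.maximum(x, 0.0)

def maxpool2(x):
    """2x2 max pooling, cropping odd borders."""
    H, W, C = x.shape
    x = x[:H // 2 * 2, :W // 2 * 2]
    return x.reshape(H // 2, 2, W // 2, 2, C).max(axis=(1, 3))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

# Randomly initialized weights for the L_C1 = 32, L_C2 = 32, L_FC1 = 128 configuration.
w1 = rng.normal(0, 0.05, (3, 3, 3, 32))
w2 = rng.normal(0, 0.05, (3, 3, 32, 32))
w_fc1 = rng.normal(0, 0.05, (23 * 23 * 32, 128))   # flatten size after two conv+pool stages
w_fc2 = rng.normal(0, 0.05, (128, 2))              # L_FC2 = 2 for the binary decision

def feed_forward(img):                       # img: 100 x 100 x 3 cropped ROI
    x = maxpool2(relu(conv2d(img, w1)))      # 98x98x32 -> 49x49x32
    x = maxpool2(relu(conv2d(x, w2)))        # 47x47x32 -> 23x23x32
    x = x.reshape(-1)
    x = relu(x @ w_fc1)                      # 128 hidden units
    return softmax(x @ w_fc2)                # bird / not-bird probabilities

probs = feed_forward(rng.random((100, 100, 3)))
```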
For the CNN training, a database of 45,000 bird and 45,000 non-bird RGB images, previously identified by the Detection algorithm, was used. All images were manually double-checked to ensure the best quality of the training process. The bird dataset consists of images of different species of small, medium and large birds taken at distances of 40 m to 500 m from the wind turbine, whereas the non-bird dataset consists of all other objects identified by the Detection algorithm. Objects bigger than 100 px × 100 px were re-scaled to the Region of Interest (ROI). Examples of the images used in the CNN training are shown in Figure 14. The CNN was trained using 2 × NVIDIA Quadro RTX6000 + NVLink and an Intel Xeon W-2223 3.6/3.9 GHz with 128 GB DDR4 ECC RAM.
To optimize the CNN, it needs to be quantitatively evaluated with respect to the quality of the identification process. The following parameters were selected as the most commonly used [62]: Precision, Recall, F1, Specificity and Identification Accuracy.
Precision, also called the Positive Predictive Value, is the ratio of correctly identified birds to all objects identified as birds. It is a measure of confidence that an identified object is truly a bird:

Precision = TP / (TP + FP),

where TP stands for the number of True Positive detections, and FP means the number of False Positive detections.
Recall, also known as the Detection Sensitivity, is the ratio of correctly identified birds to all birds included in the test set. This parameter is a crucial measure of the birds missed by the system:

Recall = TP / (TP + FN),

where FN is the number of False Negative detections.
The Harmonic Mean, F1, combines Precision and Recall into one coefficient, which relates the correctly identified birds to all false detections, both positive and negative:

F1 = 2 × Precision × Recall / (Precision + Recall) = 2TP / (2TP + FN + FP).
Specificity is the ratio of correctly identified non-birds to all non-birds included in the test set. The parameter is also a measure of how many non-birds were misidentified by the system as birds:

Specificity = TN / (TN + FP),

where TN is the number of True Negative detections.
Identification Accuracy is the ratio of correctly identified birds and non-birds to all objects in the test set. This is also a measure of the system's reliability in distinguishing between birds and non-birds:

Accuracy = (TP + TN) / (TP + TN + FN + FP).
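The five metrics above follow directly from the confusion-matrix counts and can be computed in a few lines; the function name is ours, the formulas are those given in the text:

```python
def identification_metrics(tp, fp, tn, fn):
    """Evaluation metrics for the CNN, computed from confusion-matrix counts."""
    precision = tp / (tp + fp)                     # confidence an identified object is a bird
    recall = tp / (tp + fn)                        # fraction of birds not missed by the system
    f1 = 2 * tp / (2 * tp + fn + fp)               # harmonic mean of Precision and Recall
    specificity = tn / (tn + fp)                   # correctly rejected non-birds
    accuracy = (tp + tn) / (tp + tn + fn + fp)     # overall identification reliability
    return precision, recall, f1, specificity, accuracy

p, r, f, s, a = identification_metrics(tp=90, fp=10, tn=80, fn=20)
```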
To optimize the CNN, the performance evaluation was carried out using a test dataset of 40,000 bird and 40,000 non-bird images. The results are presented in Table 6. For each simulation, the Feed-Forward, FF, time was estimated on the NVIDIA Quad-core ARM Cortex-A57 MPCore processor. This parameter was calculated as the mean time of the FF process over the 80,000 testing images.
The fastest solution, with an FF time below 1 ms, is the CNN containing L_C1 = 32, L_C2 = 32 and L_FC1 = 32 neurons. However, its Precision and Specificity were the lowest among the tested configurations. The greatest Precision, F1 and Accuracy were obtained for the CNN of L_C1 = 32, L_C2 = 64 and L_FC1 = 256 neurons; however, in this case, the FF time of 2.85 ms could preclude real-time performance. Therefore, the CNN consisting of L_C1 = 32, L_C2 = 32 and L_FC1 = 128 neurons, with FF = 1.09 ms, was selected as a reasonable trade-off.
The Recall values shown in Table 6 are practically the same for each tested CNN; the variation of 0.002 is statistically insignificant. This could mean that birds differ strongly from other objects and can be easily distinguished by all tested CNNs. Overall, the system can ensure few FN detections, which is desirable for safety applications.

5.5. Decision-Making Module Software

The block diagram of the Decision-Making System is presented in Figure 15. This system combines information from all Local Processing Units installed on the wind turbine. The applied distributed computing configuration, along with IoT technology, allows for the real-time operation of up to 20 Detection Modules.
The Decision-Making System works on the datasets generated by the Local Processing Units from the upper and lower cameras. The data stream contains information about the bird's size o_s [px], its image width p_W [px] and height p_H [px], along with the image geometric center coordinates x_c and y_c. First, the data are stored and then processed in the Data Stream Synchronization block, which is responsible for merging the two data streams from each Local Processing Unit. Based on the timestamp, the data from the upper and the lower cameras are fused. The identified objects' coordinates are paired by Geometric distance matching, minimizing the difference of the objects' image center coordinates in the X and Y axes, using the following formula:
o_cdiff[k] = min over i ≤ L, j ≤ M, i, j ∈ N+ of sqrt( (x_cu[i] − x_cd[j])² + (y_cu[i] − y_cd[j])² ),  for k = 1, ..., min(L, M), k ∈ N+,
where i and j are the indexes of the L objects identified at the upper and the M objects identified at the lower camera, respectively. For each object at the upper camera, the geometric distance to each object at the lower camera is calculated, and then the set of K = min(L, M) pairs with minimum distances is taken into further consideration.
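A greedy reading of this matching step can be sketched as follows. It assumes (our interpretation) that the K minimum-distance pairs are taken without reusing an object from either camera; the function name and input format are illustrative:

```python
from math import hypot

def match_pairs(upper, lower):
    """Geometric distance matching of object centers (x_c, y_c) from the upper
    and lower cameras: repeatedly take the closest unmatched pair until
    K = min(L, M) pairs are selected."""
    dists = sorted(
        (hypot(xu - xl, yu - yl), i, j)
        for i, (xu, yu) in enumerate(upper)
        for j, (xl, yl) in enumerate(lower)
    )
    used_u, used_l, pairs = set(), set(), []
    for d, i, j in dists:
        if i not in used_u and j not in used_l:
            pairs.append((i, j, d))   # (upper index, lower index, center distance)
            used_u.add(i)
            used_l.add(j)
    return pairs

# Two objects seen by both cameras, listed in different orders.
pairs = match_pairs(upper=[(100, 200), (400, 50)], lower=[(402, 60), (101, 190)])
```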
Then, the False pair filtering algorithm removes pairs whose center differences in the x and y coordinates exceed 150 px. Such objects could be insects flying close to the system. Meanwhile, the minimum of y_cdiff is limited to 1 px, since a negative value of the distance is not possible with such directed stereo-vision cameras.
When the center points from the 2D image planes of the upper and lower cameras are paired, then, in the Distance estimation and classification block, the distance D_bc to the object's center can be calculated using (7), where y_cu and y_cd are the y coordinates of the image geometric centers at the upper and lower images, respectively.
Knowing the distance D_bc and using the information about the size of the bird's image in terms of p_W and p_H, the bird's wingspan P_W [m] and height P_H [m] can be estimated using:
P_W = (D_bc ± ΔD_bc) × p_W × VSS_h / (f × VSR_h),
P_H = (D_bc ± ΔD_bc) × p_H × VSS_h / (f × VSR_h).
From o_s [px], which is estimated as the number of pixels over the bird's contour, the object area O_s [m²] can be calculated as:
O_s = o_s × (D_b / f)² × (VSS_h / VSR_h) × (VSS_v / VSR_v).
An isosceles triangle, shown in Figure 16, has been used as the approximation method O_approx to evaluate the system performance. The triangle base corresponds to the bird's wingspan P_W, while the height of the triangle is equal to P_H and denotes the bird's height:
O_approx = (P_W × P_H) / 2.
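The size estimation step can be sketched numerically. The formulas are those above; the camera constants are assumptions for illustration only (loosely modeled on an IMX219-class sensor with a 3 mm lens), not values taken from the paper:

```python
# Assumed camera constants, for illustration only.
F = 3.0e-3        # focal length f [m]
VSS_H = 3.68e-3   # sensor width VSS_h [m]
VSS_V = 2.76e-3   # sensor height VSS_v [m]
VSR_H = 1280      # horizontal resolution VSR_h [px]
VSR_V = 960       # vertical resolution VSR_v [px]

def estimate_size(d_bc, p_w, p_h, o_s):
    """Convert image-plane measurements [px] at distance d_bc [m] into metric size."""
    m_per_px = d_bc * VSS_H / (F * VSR_H)        # meters per pixel at this distance
    P_W = p_w * m_per_px                         # wingspan [m]
    P_H = p_h * m_per_px                         # body height [m]
    O_s = o_s * (d_bc / F) ** 2 * (VSS_H / VSR_H) * (VSS_V / VSR_V)  # contour area [m^2]
    O_approx = P_W * P_H / 2                     # isosceles-triangle approximation
    return P_W, P_H, O_s, O_approx

# A 16 x 6 px object with a 60 px contour at 100 m comes out as a large bird.
pw, ph, area, approx = estimate_size(100.0, 16, 6, 60)
```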
Based on the size estimate, the developed classifier distinguishes three bird size classes: small, medium and large. In Table 7, the classification boundaries of small, medium and large birds are presented. Small birds are those whose wingspan is between 0.65 m and 1.25 m and whose height is between 0.32 m and 0.39 m. Birds with a wingspan between 1.26 m and 1.50 m and a height from 0.40 m to 0.55 m are classified as medium birds. Large birds have a wingspan above 1.50 m and a height above 0.55 m.
The representation of the bird on the image plane depends mostly on the object's distance from the system. Therefore, the uncertainty of the distance measurement has the greatest impact on the size classification accuracy. The uncertainty ranges of the image sizes for each of the three class average sizes are presented in Figure 17. Within the distance ranges of each class, there are no overlaps of the class average sizes. However, the boundaries between classes can be very fuzzy, and therefore the classification can be ambiguous, especially at long distances. To reduce the fuzziness, each object is differentiated based on three parameters: p_W, p_H and o_s. For safety reasons, the valid class is always the largest one indicated by any of the parameters. For instance, if one parameter indicates a large bird, then the bird is classified as large. Similarly, if just one parameter indicates a medium bird and the two remaining suggest a small bird, then the bird is classified as medium.
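The safety-oriented fusion rule can be sketched as a per-parameter vote where the largest class wins. The wingspan and height boundaries follow Table 7 as quoted above; the function names are ours, and treating additional parameters (such as the area) with the same max rule is our reading of the text:

```python
SMALL, MEDIUM, LARGE = 0, 1, 2  # ordered so that max() selects the largest class

def class_from_wingspan(p_w_m):
    """Table 7 wingspan boundaries: medium is 1.26-1.50 m, large is above 1.50 m."""
    return LARGE if p_w_m > 1.50 else (MEDIUM if p_w_m >= 1.26 else SMALL)

def class_from_height(p_h_m):
    """Table 7 height boundaries: medium is 0.40-0.55 m, large is above 0.55 m."""
    return LARGE if p_h_m > 0.55 else (MEDIUM if p_h_m >= 0.40 else SMALL)

def fuse_classes(*votes):
    """For safety, the valid class is always the largest one indicated."""
    return max(votes)
```

For example, a bird whose wingspan estimate says small but whose height estimate says medium is classified as medium, matching the rule described in the text.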
Based on the object’s distance and its size, the Collision Avoidance system activates one of its predefined actions. A user could specify the distance and size category for activation of sound and/or strobe repellents or even turbine stopping. An example of the system setting is presented in Figure 1.
The system includes archiving of undertaken actions, which could be later analyzed by authorities. The archive consists of photos and/or videos. This functionality is required by some stakeholders and users.

6. Prototyping and Testing

In this section, the prototype of the system and its installation are described. Furthermore, the system and its implementation have been validated, and the test results are presented.

6.1. System Prototype

The optimized hardware and software have been implemented on suitable platforms to enable the validation of the system in the field. The prototype of the detection modules presented in Figure 18 is composed of two IMX219 cameras with 3 mm lenses. An optional full-HD camera using an IMX219 sensor is installed for video event verification. As the Local Processing Unit, a Quad-core ARM Cortex-A57 MPCore processor with 2 GB RAM for object detection and a 512-core Volta GPU for object identification were used. The Decision-Making System was implemented on a Dell database server with a 3.6 GHz Xeon X5687 processor and 8 GB of RAM. Two 8 TB hard drives are included for media storage. The connection between the Decision-Making System and the Detection Modules is provided by the Ethernet protocol. The Detection Modules are powered using Safety Extra-Low Voltage, SELV.
For the strobe deterrent system, two 10 W, 800 lm, 4500 K lamps with a flash frequency of 5 Hz–7 Hz are applied. As an audio deterrent, two 124 dB speakers generating a signal in the frequency range of 2.4 kHz–6.5 kHz are used. The sound emitted by the audio repellents is randomly selected from a set of sounds prepared by ornithologists. The hardware parameters of the deterrent system were selected based on a survey of related works and the market availability of off-the-shelf products. To ensure low weight, the detection system is housed in an acrylic cover.

6.2. Implementation and Testing in the Field

The developed system has been installed on a wind turbine at a wind farm in the northern part of Poland, as shown in Figure 19. The system is designed for non-invasive installation, using steel clamps to fix the modules to the wind turbine. The detection modules are uniformly distributed around the wind turbine, connected to a server using the IoT concept and powered with low voltage. To allow easy access to the stored data and to monitor the status of the Detection Modules in real time, Long-Term Evolution, LTE, wireless broadband communication is used. The test field was chosen to allow horizontal and vertical coverage of the most crucial area, along with a good detection range of the system.
Figure 20 shows samples of time-lapse photos of a Red Kite caught by a detection module. The 2D pictures of the bird vary in size depending on its distance from the system.

6.3. Distance Measurement Evaluation

The system's localization and size classification performance at different distances was validated using birds painted on canvases. Three bird silhouettes simulating small, medium and large objects are shown in Figure 21. The measurements were taken from 50 m up to 300 m with a step of 50 m. The object dimensions of wingspan (p_W), height (p_H) and contoured area (o_s), along with y_u and y_d, were measured to estimate the distances (D_b) and the objects' wingspan (P_W), height (P_H), approximated size O_approx and contour size O_s. The test results are summarized in Table 8. The value of ΔD_bref is the theoretical quantization error for the measured D_b, while (D_bref − D_b) is the real measurement uncertainty.
Most importantly, the experimental results show high measurement repeatability. All tests of the small size class validated the theoretical model, and the measured distances are within the quantization range. As expected, the estimation of P_W is more accurate than that of P_H, since the bird's height is fairly small. Interestingly, the approximated size estimation O_approx matches the real size more precisely than that obtained from the contour. For the medium class, the experimental results are slightly worse than for the small size class, but only for the furthest distance of 250 m, where the accuracy is at a level of ±1 quantum instead of the assumed ±1/2 quantum. A similar conclusion at this distance can be drawn for the large size silhouettes. However, for the longest distance of 300 m, the measurements are very accurate, with an uncertainty of 0.7%. Overall, the distance estimation uncertainty, calculated as (D_ref − D), is less than 5% of the reference distance, which meets the user's expectation of 10% localization accuracy defined in Table 3. The precision of the objects' wingspan and height estimation decreases with distance; however, it is still sufficient to distinguish the class of the object. It has been shown that the proposed simplified approximation of the bird's size O_approx using the isosceles triangle is sufficient for its estimation.

6.4. System Validation

For the system dynamics validation, a fixed-wing drone was used. The bird-like drone, with a wingspan of 1.99 m, a height of 1.03 m and a weight of 2 kg, see Figure 22, was programmed to fly around the wind turbine at a height of 100 m and a distance of 150 m. The drone was equipped with a GPS sensor and an autopilot for remote control. The GPS flight path is shown in Figure 23. The GPS-measured average flight height and distance from the wind turbine were (102.9 ± 1.5) m and (143.3 ± 2.5) m, respectively. The average speed of the drone was 15 m/s. The flight was monitored by the eight modules installed on the wind turbine. In Figure 23, the zones of two randomly selected adjacent detection modules are depicted in blue (Module A GPS) and red (Module B GPS). Module A detected the drone 42 times, whereas Module B detected it 37 times.
Figure 24 presents time samples of y_u and y_d, and Figure 25 presents the corresponding estimated distances D, D_b and H obtained for Module A. The histogram of the measured pixel difference y_diff is presented in Figure 26. The most frequently measured values of y_diff were 15 px and 16 px for Module A and Module B, respectively, which correspond to D_b equal to 178.4 m and 167.2 m, respectively.
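The reported value pairs (15 px → 178.4 m, 16 px → 167.2 m) are consistent with the inverse-proportional disparity model of an idealized parallel-axis stereo rig, D_b ≈ b·f/(y_diff·s), with baseline b, focal length f and vertical pixel pitch s. The deployed system uses tilted cameras (equation (7)), so the sketch below is only an approximation, with its constant fitted to the first reported pair rather than taken from the paper:

```python
# Idealized parallel-axis stereo model: D_b = b * f / (y_diff * s), i.e.,
# distance inversely proportional to the pixel disparity. K plays the role of
# b * f / s and is fitted to the reported pair (15 px -> 178.4 m); this is an
# approximation of the tilted-camera geometry actually used by the system.
K = 15 * 178.4  # [px * m]

def distance_from_disparity(y_diff_px):
    return K / y_diff_px

def quantization_error(y_diff_px):
    """Half the distance spread caused by a +/- 1 px disparity quantization step."""
    return (distance_from_disparity(y_diff_px - 1) - distance_from_disparity(y_diff_px + 1)) / 2
```

Under this model, the fitted constant reproduces the second reported pair (16 px gives roughly 167 m), and the quantization error grows quickly as the disparity shrinks, matching the trend discussed for the 300 m detection range.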
Figure 27 shows samples of the estimated distance D and height H for Module A, Module B and the GPS data. In the figure, three ellipses illustrate the measurement statistics of each module and the GPS. For each ellipse, the center depicts the mean values of D and H, while the semi-major axis represents the standard deviation σ_D and the semi-minor axis corresponds to the standard deviation σ_H. The maximal measurement errors are 12.9 m and 18.2 m in distance for Module A and Module B, respectively, and 10.4 m and 11.9 m in the corresponding height measurements. The test measurement results of the heights H and distances D are summarized in Table 9. The differences in the mean values between the modules and the GPS data are 2.8 m and 2.9 m in distance and 1.2 m and 3.8 m in height for Module A and Module B, respectively. The average values of the measurements are within the range of the uncertainty ΔD_b = 3.85 m.

6.5. System Verification by Ornithological Observations

The system verification was carried out by experienced ornithologists at random dates, times and weather conditions. In total, 67.5 h of observations were conducted over 14 days between May and July 2020. The time and the roughly estimated distance and height of the birds observed by the ornithologists were recorded. The records were compared with the events reported by the system, as summarized in Table 10. All Wood Pigeons and Common Buzzards observed by the ornithologists up to 200 m, which is within the range of the small bird detection zone, were correctly detected by the system. The Wood Pigeons, which belong to the small bird category, were correctly classified at a rate of 8/10, while 2/10 were assigned as medium. In the case of the 10 observed Common Buzzards, whose size lies on the border of the small and medium classes, 5 were classified as small and 3 were marked as medium, but 2 were classified as large; over-sizing, however, is considered a less crucial classification mistake.
In the case of medium-sized birds such as the Raven and the Marsh Harrier, the system correctly detected all 26 birds up to 100 m, and just 1 of 20 was missed in the zone of 100 m–200 m. For distances over 200 m, the ornithologists observed 29 birds, of which the system detected 23. As the lower limit of the Raven and Marsh Harrier wingspan belongs to the small class, 16 out of 70 birds were classified as small, 23 were considered medium and 31 were marked as large. For the Herring Gull, whose wingspan spans the medium size boundaries, each of the three detected birds was assigned to a different class, which can be symptomatic for this kind of bird.
The ornithologists observed just six medium/big birds: two Red Kites and four Cranes. All of them were detected by the system up to 200 m, but one Crane was missed by the system at a distance of over 200 m. All observed Red Kites were classified as medium, while one of the Cranes was classified as medium and two as large.
Overall, the ornithologists identified 105 birds, of which the system missed 9; however, no bird was missed at a distance below 100 m, and only one medium-sized bird was omitted at a distance between 100 m and 200 m. Most omissions (7/9) happened for birds observed over 200 m, and one of these 7 was a big bird. Furthermore, in one case out of 98 bird classifications, the system classified a bird into a lower class than expected. The images of the detected Raven and Red Kite are shown in Figure 28a,b, respectively. The 2D flight routes of these birds are shown in Figure 29.
Figure 30 shows the top view of the bird detection density for one day of ornithologist observations. The map was created using the Google heat maps layer [64]. The areas of higher detection intensity are colored in red, whereas the areas of lower intensity appear in green. The majority of the detected birds in flight are to the west and north of the wind turbine. There are also some more distant detections to the east and south.

7. Conclusions

This article tackles the problem of avifauna preservation at a wind farm. To reduce bird mortality near the wind turbine, a vision-based collision avoidance system is proposed. To assure the real-time operation mode, the proposed solution applies a distributed computing paradigm embedded into the IoT methodology. This means that the data processing is split between the Local Processing Unit and the Decision-Making System. The latter undertakes predefined repelling actions based on the preprocessed information about the object's position in the images of the top and bottom cameras.
The system has been developed using a User-Driven Design (UDD) approach, which assured that the stakeholders, environmental authorities, future users and designers were actively involved in the design process. Such an approach ensured that the designed system was tailored not only to the market but, more importantly, to the authorities' requirements. Moreover, customization options have been implemented to increase the system's adaptability to different installations.
The developed stereoscopic vision acquisition system allows the detection of an object, determines its distance from the turbine and then estimates its size. The designed AI-based identification method and the size classification algorithm used for decision-making reduce false positive detections and limit turbine stopping to detections of rare big birds. The implemented repelling method was designed according to the state of the art and has a cascading form composed of light and sound deterrents, which are backed by the most secure collision prevention method: turbine stopping.
The presented stereoscopic vision acquisition system was evaluated by the measurement of bird silhouettes painted on a canvas. The performed tests confirmed the assumed detection, localization and size classification performance quality for small birds up to 150 m, medium size birds up to 250 m and large birds up to 300 m.
The constructed prototype, composed of eight Detection Modules and one Decision-Making System, was installed on a wind turbine in northern Poland. Two kinds of tests were applied. First, the system was validated using a bird-like, GPS-equipped drone with a wingspan of 2.0 m. The averaged drone localization uncertainty (2.85 m) was below the theoretical quantization error (3.85 m) during the flight at 143.3 m from the turbine.
Secondly, the results of the ornithologists' long-term observations were compared with the system records. During 67.5 h of observation, the ornithologists identified 105 small, medium and large birds. In this period, the system detected 96 birds. All 9 missed objects were observed at larger distances (>150 m). More importantly, within the 100 m range, all birds observed by the ornithologists were also detected by the system. At distances between 100 m and 200 m, only one medium-sized bird was not detected by the system. Furthermore, in one case out of 98 bird classifications, the system classified a bird into a lower class than the ornithologist. The tests proved the required performance quality of the developed detection, localization and classification algorithms.
The system's configuration and customization abilities allow its adjustment to the desired requirements. The detection range of the presented solution was designed to cover the requested 300 m observation zone for large birds such as the Red Kite. However, some authorities have recently introduced an even more restrictive obligation, with a detection range of up to 500 m [16]. Nevertheless, through the applied distributed computing paradigm and modular construction, the system can easily be re-configured to cover even more challenging observation zones.
Although the system was verified by ornithologists at random dates, different times of the day and weather conditions, there is still a need for a more systematic analysis of the system's performance under various overcast conditions. Also, the reliability of bird flock detection should be evaluated. These are the areas of our future work.
Future development may concern a tracking algorithm to anticipate bird flight paths to decrease unnecessary turbine stopping. Furthermore, implementation of the Kalman filter, Probability Hypothesis Density (PHD) filter or Multiple Hypothesis Tracking (MHT) can improve the localization measurement accuracy and thus bird classification. Due to the fuzziness of bird size classes, the classifier may be enhanced by applying a fuzzy logic approach.

Author Contributions

Conceptualization, D.G., D.D. and W.J.K.; methodology, D.G., D.D., and W.J.K.; software, D.D., D.K., D.G. and M.M.; validation, A.S.-K., D.K.; writing—original draft preparation, D.G., D.D., M.M., and W.J.K.; writing—review and editing, W.J.K.; visualization, D.K., D.D. and D.G.; supervision, W.J.K.; project administration, A.J.; funding acquisition, W.J.K., D.G., D.D. and A.J. All authors have read and agreed to publish this version of the manuscript.

Funding

This research was funded by The National Centre for Research and Development of Poland grant number POIR.01.02.00-00-0247/17. Project title: ’Realization of R&D works leading to the implementation of a new solution—MULTIREJESTRATOR PLUS for monitoring and control of the power system in terms of operational efficiency, the life span extending and optimizing the impact on the surrounding wind farms’.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The presented data are accessible to authorized staff in accordance with local regulations.

Acknowledgments

The authors would like to acknowledge Sandy Hamilton's support in improving this article. In addition, the enormous contribution of all Bioseco employees to the project implementation should be emphasized.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Bauer, S.; Shamoun-Baranes, J.; Nilsson, C.; Farnsworth, A.; Kelly, J.F.; Reynolds, D.R.; Dokter, A.M.; Krauel, J.F.; Petterson, L.B.; Horton, K.G.; et al. The grand challenges of migration ecology that radar aeroecology can help answer. Ecography 2019, 42, 861–875.
2. José Carlos Ribeiro, D.D. Analysis and evaluation of the risk of bird strikes in the International Guarulhos Airport surroundings. Indep. J. Manag. Prod. 2019, 10, 1193–1212.
3. Van Gasteren, H.; Krijgsveld, K.L.; Klauke, N.; Leshem, Y.; Metz, I.C.; Skakuj, M.; Sorbi, S.; Schekler, I.; Shamoun-Baranes, J. Aeroecology meets aviation safety: Early warning systems in Europe and the Middle East prevent collisions between birds and aircraft. Ecography 2019, 42, 899–911.
4. Bhusal, S.; Bhattarai, U.; Karkee, M. Improving Pest Bird Detection in a Vineyard Environment using Super-Resolution and Deep Learning. IFAC-PapersOnLine 2019, 52, 18–23.
5. Cryan, P.M.; Gorresen, P.M.; Hein, C.D.; Schirmacher, M.R.; Diehl, R.H.; Huso, M.M.; Hayman, D.T.; Fricker, P.D.; Bonaccorso, F.J.; Johnson, D.H.; et al. Behavior of bats at wind turbines. Proc. Natl. Acad. Sci. USA 2014, 111, 15126–15131.
6. Lambertucci, S.A.; Shepard, E.L.; Wilson, R.P. Human-wildlife conflicts in a crowded airspace. Science 2015, 348, 502–504.
7. Drewitt, A.L.; Langston, R.H. Assessing the impacts of wind farms on birds. Ibis 2006, 148, 29–42.
8. Environmental Impacts of Wind Energy|EnergySage. Available online: https://www.energysage.com/about-clean-energy/wind/environmental-impacts-wind-energy/ (accessed on 18 April 2020).
9. U.S. Fish & Wildlife Service—Migratory Bird Program Conserving America’s Birds. Available online: https://www.fws.gov/birds/bird-enthusiasts/threats-to-birds/collisions/wind-turbines.php (accessed on 18 April 2020).
10. Loss, S.R.; Will, T.; Marra, P.P. Estimates of bird collision mortality at wind facilities in the contiguous United States. Biol. Conserv. 2013, 168, 201–209.
11. Smallwood, K.S.; Bell, D.A. Effects of Wind Turbine Curtailment on Bird and Bat Fatalities. J. Wildl. Manag. 2020, 84, 685–696.
12. Scialabba, N. SAFA Guidelines: Sustainability Assessment of Food and Agriculture Systems, 3rd ed.; Food and Agriculture Organization of the United Nations: Rome, Italy, 2014.
13. Hötker, H. The Impact of Repowering of Wind Farms on Birds and Bats; Michael-Otto-Institut im NABU: Bergenhusen, Germany, 2006.
14. Furness, R.W.; Wade, H.M.; Masden, E.A. Assessing vulnerability of marine bird populations to offshore wind farms. J. Environ. Manag. 2013, 119, 56–66.
15. Cabrera-Cruz, S.A.; Cervantes-Pasqualli, J.; Franquesa-Soler, M.; Muñoz-Jiménez, Ó.; Rodríguez-Aguilar, G.; Villegas-Patraca, R. Estimates of aerial vertebrate mortality at wind farms in a bird migration corridor and bat diversity hotspot. Glob. Ecol. Conserv. 2020, 22, e00966.
16. Kompetenzzentrum Naturschutz und Energiewende. Available online: https://www.naturschutz-energiewende.de/fachwissen/veroeffentlichungen/synopse-detektionssysteme-zur-ereignisbezogenen-abschaltung-von-windenergieanlagen-zum-schutz-von-tagaktiven-brutvoegeln/ (accessed on 26 August 2020).
17. Blackwell, B.F.; DeVault, T.L.; Seamans, T.W.; Lima, S.L.; Baumhardt, P.; Fernández-Juricic, E. Exploiting avian vision with aircraft lighting to reduce bird strikes. J. Appl. Ecol. 2012, 49, 758–766.
18. Doppler, M.S.; Blackwell, B.F.; DeVault, T.L.; Fernández-Juricic, E. Cowbird responses to aircraft with lights tuned to their eyes: Implications for bird–aircraft collisions. Condor. Ornithol. Appl. 2015, 117, 165–177.
19. Goller, B.; Blackwell, B.F.; DeVault, T.L.; Baumhardt, P.E.; Fernández-Juricic, E. Assessing bird avoidance of high-contrast lights using a choice test approach: Implications for reducing human-induced avian mortality. PeerJ 2018, 6, e5404.
20. Bishop, J.; McKay, H.; Parrott, D.; Allan, J. Review of International Research Literature Regarding the Effectiveness of Auditory Bird Scaring Techniques and Potential Alternatives; Food and Rural Affairs: London, UK, 2003.
21. Air Force Times. Available online: http://www.airforcetimes.com/article/20140115/NEWS/301150011/Cadets-test-sound-light-system-deter-bird-strikes (accessed on 28 June 2020).
22. Fox, A.D.; Beasley, P.D.L. David Lack and the birth of radar ornithology. Arch. Nat. Hist. 2010, 37, 325–332.
23. Dokter, A.M.; Desmet, P.; Spaaks, J.H.; van Hoey, S.; Veen, L.; Verlinden, L.; Nilsson, C.; Haase, G.; Leijnse, H.; Farnsworth, A.; et al. bioRad: Biological analysis and visualization of weather radar data. Ecography 2019, 42, 852–860.
24. Advanced Radar Technology—Get to know our Bird Radar Solutions. Available online: https://www.robinradar.com/products (accessed on 19 April 2020).
25. Gründinger, W. The renewable energy sources act (EEG). In Drivers of Energy Transition; Springer: Berlin/Heidelberg, Germany, 2017; pp. 257–419.
26. Feng, X.; Jiang, Y.; Yang, X.; Du, M.; Li, X. Computer vision algorithms and hardware implementations: A survey. Integration 2019, 69, 309–320.
27. Chabot, D.; Francis, C.M. Computer-automated bird detection and counts in high-resolution aerial images: A review. J. Field Ornithol. 2016, 87, 343–359.
28. Hong, S.J.; Han, Y.; Kim, S.Y.; Lee, A.Y.; Kim, G. Application of deep-learning methods to bird detection using unmanned aerial vehicle imagery. Sensors 2019, 19, 1651.
29. May, R.F.; Hamre, Ø.; Vang, R.; Nygård, T. Evaluation of the DTBird Video-System at the Smøla Wind-Power Plant. Detection Capabilities for Capturing Near-Turbine Avian Behaviour; Norwegian Institute for Nature Research (NINA): Trondheim, Norway, 2012.
  29. May, R.F.; Hamre, Ø.; Vang, R.; Nygård, T. Evaluation of the DTBird Video-System at the Smøla Wind-Power Plant. Detection Capabilities for Capturing Near-Turbine Avian Behaviour; Norwegian Institute for Nature Research (NINA): Trondheim, Norway, 2012. [Google Scholar]
  30. McClure, C.J.; Martinson, L.; Allison, T.D. Automated monitoring for birds in flight: Proof of concept with eagles at a wind power facility. Biol. Conserv. 2018, 224, 26–33. [Google Scholar] [CrossRef]
  31. Gil, G.; Savino, G.; Piantini, S.; Pierini, M. Is stereo vision a suitable remote sensing approach for motorcycle safety? An analysis of LIDAR, RADAR, and machine vision technologies subjected to the dynamics of a tilting vehicle. In Proceedings of the 7th Transport Research Arena TRA 2018 (TRA 2018), Vienna, Austria, 16–19 April 2018. [Google Scholar]
  32. Nilsson, C.; Dokter, A.M.; Schmid, B.; Scacco, M.; Verlinden, L.; Bäckman, J.; Haase, G.; Dell’Omo, G.; Chapman, J.W.; Leijnse, H.; et al. Field validation of radar systems for monitoring bird migration. J. Appl. Ecol. 2018, 55, 2552–2564. [Google Scholar] [CrossRef]
  33. DTBird, System for Bird Monitoring. Available online: https://dtbird.com/ (accessed on 28 June 2020).
  34. SafeWind—Détecter et RéDuire les Risques de Collision de la Faune Volante sur les éoliennes. Available online: https://www.biodiv-wind.com/en/ (accessed on 28 June 2020).
  35. Artificial Vision Air Detection. Available online: https://www.identiflight.com (accessed on 28 June 2020).
  36. Airelectronics Vulture detection. Available online: http://www.airelectronics.es/products/neural_network/net_vulture/ (accessed on 26 August 2020).
  37. Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet classification with deep convolutional neural networks. Commun. ACM 2017, 60, 84–90. [Google Scholar] [CrossRef]
  38. Jiang, X.; Wang, Y.; Liu, W.; Li, S.; Liu, J. Capsnet, cnn, fcn: Comparative performance evaluation for image classification. Int. J. Mach. Learn. Comput. 2019, 9, 840–848. [Google Scholar] [CrossRef]
  39. Li, X.; Tian, Y.; Zhang, F.; Quan, S.; Xu, Y. Object Detection in the Context of Mobile Augmented Reality. arXiv 2020, arXiv:2008.06655. [Google Scholar]
  40. Qin, J.; Pan, W.; Xiang, X.; Tan, Y.; Hou, G. A biological image classification method based on improved CNN. Ecol. Inform. 2020, 58, 101093. [Google Scholar] [CrossRef]
  41. Tandel, N.H.; Prajapati, H.B.; Dabhi, V.K. Voice Recognition and Voice Comparison using Machine Learning Techniques: A Survey. In Proceedings of the 2020 6th International Conference on Advanced Computing and Communication Systems (ICACCS), Coimbatore, India, 6–7 March 2020; pp. 459–465. [Google Scholar]
  42. Yoshihashi, R.; Kawakami, R.; Iida, M.; Naemura, T. Evaluation of bird detection using time-lapse images around a wind farm. In Proceedings of the European Wind Energy Association Annual Conference and Exhibition 2015, Paris, France, 17–20 November 2015; pp. 104–107. [Google Scholar]
  43. Sofia, K.; Pillai, M.M.; Raghuwanshi, U.S. Deep Learning Neural Network for Identification of Bird Species Sofia. In Proceedings of the IRSCNS 2018, Goa, India, 30–31 August 2018; Volume 75, pp. 291–298. [Google Scholar] [CrossRef]
  44. Gavali, P.; Mhetre, M.P.; Patil, M.N.; Bamane, M.N.; Buva, M.H. Bird Species Identification using Deep Learning. Int. J. Eng. Res. Technol. 2019, 8, 68–72. [Google Scholar]
  45. Trinh, T.T.; Yoshihashi, R.; Kawakami, R.; Iida, M.; Naemura, T. Bird detection near wind turbines from high-resolution video using lstm networks. In Proceedings of the World Wind Energy Conference 2016, Tokyo, Japan, 30 October–1 November 2016. [Google Scholar]
  46. Huang, G.; Liu, Z.; Van Der Maaten, L.; Weinberger, K.Q. Densely connected convolutional networks. In Proceedings of the 30th IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2016; pp. 2261–2269. [Google Scholar] [CrossRef][Green Version]
  47. Yoshihashi, R.; Kawakami, R.; Iida, M.; Naemura, T. Construction of a bird image dataset for ecological investigation. In Proceedings of the International Conference on Image Processing (ICIP) 2015, Quebec City, QC, Canada, 27–30 September 2015; pp. 1–5. [Google Scholar]
  48. Huang, Y.P.; Basanta, H. Bird image retrieval and recognition using a deep learning platform. IEEE Access 2019, 7, 66980–66989. [Google Scholar] [CrossRef]
  49. Welinder, P.; Branson, S.; Mita, T.; Wah, C.; Schroff, F. Caltech-ucsd Birds 200; California Institute of Technology: Pasadena, CA, USA, 2010; Volume 200, pp. 1–15. [Google Scholar]
  50. Mohanty, R.; Mallik, B.K.; Solanki, S.S. Automatic bird species recognition system using neural Network based on spike. Appl. Acoust. 2020, 161, 107177. [Google Scholar] [CrossRef]
  51. Houpt, R.; Pearson, M.; Pearson, P.; Rink, T.; Seckler, S.; Stephenson, D.; VanderStoep, A. Using Neural Networks to Identify Bird Species from Birdsong Samples. In An Introduction to Undergraduate Research in Computational and Mathematical Biology; Springer: Berlin/Heidelberg, Germany, 2020; pp. 401–442. [Google Scholar]
  52. Triveni, G.; Malleswari, G.N.; Sree, K.N.S.; Ramya, M. Bird Species Identification using Deep Fuzzy Neural Network. Int. J. Res. Appl. Sci. Eng. Technol. (IJRASET) 2020, 8, 1214–1219. [Google Scholar] [CrossRef]
  53. Orhan, A.E.; Pitkow, X. Skip connections eliminate singularities. In Proceedings of the 6th International Conference on Learning Representations, Vancouver, BC, Canada, 30 April–3 May 2018. [Google Scholar]
  54. Gradolewski, D.; Maslowski, D.; Dziak, D.; Jachimczyk, B.; Mundlamuri, S.T.; Prakash, C.G.; Kulesza, W.J. A Distributed Computing Real-Time Safety System of Collaborative Robot. Elektronika ir Elektrotechnika 2020, 26, 4–14. [Google Scholar] [CrossRef]
  55. Dziak, D.; Jachimczyk, B.; Kulesza, W.J. IoT-based information system for healthcare application: Design methodology approach. Appl. Sci. 2017, 7, 596. [Google Scholar] [CrossRef]
  56. Martin, G. Birds by Night; A&C Black: London, UK, 2010. [Google Scholar]
  57. Mrovlje, J.; Vrani, D. Distance measuring based on stereoscopic pictures. In Proceedings of the 9th International PhD Workshop on Systems and Control: Young Generation Viewpoint, Izola-Simonov zaliv, Sloveni, 1–3 October 2008. [Google Scholar]
  58. Gradolewski, D.; Dziak, D.; Kaniecki, D.; Jaworski, A.; Kulesza, W. A Stereovision Method and System. U.K. Patent Application No. 2018391.9, 23 November 2020. [Google Scholar]
  59. Chen, J.; Khatibi, S.; Kulesza, W. Depth reconstruction uncertainty analysis and improvement–The dithering approach. Image Vis. Comput. 2010, 28, 1377–1385. [Google Scholar] [CrossRef]
  60. Liu, Y.; Yin, Y.; Zhang, S. Hand gesture recognition based on HU moments in interaction of virtual reality. In Proceedings of the 2012 4th International Conference on Intelligent Human-Machine Systems and Cybernetics, Nanchang, China, 26–27 August 2012; Volume 1, pp. 145–148. [Google Scholar]
  61. Riley, K.F.; Hobson, M.P.; Bence, S.J. Mathematical Methods for Physics and Engineering: A Comprehensive Guide; Cambridge University Press: Cambridge, UK, 2006. [Google Scholar]
  62. Kassim, Y.; Prasath, S.; Glinskii, O.; Glinsky, V.; Huxley, V.; Palaniappan, K. Microvasculature segmentation of arterioles using deep CNN. In Proceedings of the 2017 IEEE International Conference on Image Processing (ICIP), Beijing, China, 17–20 September 2017; pp. 580–584. [Google Scholar] [CrossRef]
  63. Svensson, L.; Mullarney, K.; Zetterström, D. Collins bird guide 2nd edition. Br. Birds 2010, 103, 248–252. [Google Scholar]
  64. Google Maps. Available online: https://www.google.com/maps/@54.3644633,18.5030449,14z (accessed on 26 August 2020).
Figure 1. Configuration and variables’ definitions of bird protection system, where (*) means an optional prevention method.
Figure 2. The general system configuration and data processing scheme.
Figure 3. Functionalities and constraints and their impact on hardware components.
Figure 4. Monitoring area of the system.
Figure 5. Projection of the bird on an image [px] as a function of bird wingspan [m] and its distance from the baseline, for C1 camera and lens of f = 3 mm.
Figure 6. Mapping of stereoscopic camera scenes, defining basic system parameters.
Figure 7. Impact of baseline [m] and distance [m] on the uncertainty of distance measurement [m]. Black and green lines denote the recommended baselines of 10 m and 3 m, respectively; the red line marks the selected trade-off of 1 m.
Figure 8. Relationship between object distance [m] and the pixel difference on the image [px] (blue) and the resolution of distance measurement [m] (brown), for baselines of 1 m, 3 m and 10 m.
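The curves in Figures 7 and 8 follow from the standard stereo-vision relation D = B·f·k/y_diff, where B is the baseline, f the focal length, k the sensor resolution in px/mm and y_diff the measured pixel difference (disparity). The sketch below illustrates that textbook model rather than the authors' exact implementation; the parameter values (C1 sensor at 891.3 px/mm, an f = 3 mm lens, a 1 m baseline) are assumptions taken from Tables 4 and 5 and Figure 7:

```python
def disparity_to_distance(baseline_m, f_mm, k_px_per_mm, y_diff_px):
    """Distance [m] from stereo disparity: D = B * f * k / y_diff."""
    return baseline_m * f_mm * k_px_per_mm / y_diff_px

def distance_resolution(baseline_m, f_mm, k_px_per_mm, y_diff_px):
    """Quantization step of the distance estimate: the change in D when
    the measured disparity changes by one pixel."""
    return (disparity_to_distance(baseline_m, f_mm, k_px_per_mm, y_diff_px)
            - disparity_to_distance(baseline_m, f_mm, k_px_per_mm, y_diff_px + 1))

# Assumed configuration: C1 sensor (891.3 px/mm), f = 3 mm lens, 1 m baseline
B, f, k = 1.0, 3.0, 891.3
for y_diff in (54, 27, 19):  # disparities of the kind reported in Table 8
    print(y_diff,
          round(disparity_to_distance(B, f, k, y_diff), 1),
          round(distance_resolution(B, f, k, y_diff), 1))
```

Because the exact calibration constant of the deployed system is not given, the distances printed here differ slightly from the D_b values reported in Table 8; the point is the 1/y_diff shape and the growth of the quantization step with distance.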
Figure 9. The measurement resolution uncertainty [m] and the pixel difference value y_diff [px] with respect to distance, for boundary values of the row number of the object projection on the image plane.
Figure 10. The measurement resolution uncertainty [m] and the pixel difference value y_diff [px] with respect to height, for boundary values of the row number of the object projection on the image plane.
Figure 11. Illustration of system general processing architecture.
Figure 12. Bird detection algorithm flowchart illustrated by original images from the system.
Figure 13. Architecture of Convolutional Neural Network used for bird identification.
Figure 14. Examples of images used for the training process.
Figure 15. Block diagram of decision-making system.
Figure 16. Graphical approximation of the bird’s size calculation.
Figure 17. The change of object size o_approx with distance caused by the quantization error of distance measurement, for average representatives of small, medium and large birds.
Figure 18. A photo of the detection module of the bird protection system.
Figure 19. The system prototype mounted on a wind turbine at the test field. The system consists of eight Detection modules fixed on the tower wall and the Decision-making system placed inside the tower.
Figure 20. Time-lapse photos of detection samples of Red Kite.
Figure 21. Pictures of three bird silhouettes simulating (a) small (P_W = 0.8 m, P_H = 0.3 m), (b) medium (P_W = 1.2 m, P_H = 0.4 m) and (c) large (P_W = 1.5 m, P_H = 0.5 m) birds at a distance of 150 m.
Figure 22. Photos of the fixed-wing drone used for the system validation (a) on the ground (b) in flight.
Figure 23. The GPS flight path of the test drone.
Figure 24. The variation of y_u and y_d with time (sample) for Module A and Module B in the drone test.
Figure 25. Variation of estimated distances D and D_b and height H for Module A in the drone test.
Figure 26. Histogram of variation in pixel difference y_diff [px] for modules A and B in the drone test.
Figure 27. Distance from the wind tower vs. height in the drone test. Green dots: GPS data. Red and blue dots: data from module A and module B, respectively. Corresponding colored ellipses illustrate standard deviations of the respective distance and height measurements.
Figure 28. Example images with depicted detected and classified (a) Raven, (b) Red Kite.
Figure 29. Examples of flight paths of (a) Raven, (b) Red Kite observed on 21 May 2020 visualized on Google Maps [64].
Figure 30. A heat-map of the one-day observation where the red color depicts the highest density of detections.
Table 1. Comparison of vision techniques for bird detection by DTBird, SafeWind, IdentiFlight, BirdVision and Airelectronics.

| Method | DTBird | SafeWind | IdentiFlight | BirdVision | Airelectronics |
|---|---|---|---|---|---|
| Detection method | Monoscopic | Monoscopic | Stereoscopic | Monoscopic | Monoscopic |
| Distance estimation | No | No | Yes | No | Yes |
| Localization | No | No | Stereo-vision-based | No | No |
| Maximum detection range | 650 m | - | 1500 m | 300 m | 600 m |
| Target classification | No | No | Golden & Bald Eagles, Red Kite | No | No |
| Installation | Wind turbine | Wind turbine | Separate tower | Wind turbine | Wind turbine |
| Collision prevention | Audio, turbine stop | Audio, turbine stop | Turbine stop | Turbine stop | Audio, turbine stop |
Table 2. Comparison of CNN architectures used for bird identification. BN—Batch Normalization; SC—Skip Connection; FP—False Positives.

| Paper | Database | Identification algorithm | Image size [px × px × channel] | Pooling window | Activation function | Identification accuracy |
|---|---|---|---|---|---|---|
| [47] | [42] | CNN | 28 × 28 × 3 | max 2 × 2 | ReLU | 80–90%, 0.2 FP |
| [45] | [45] | CNN | 256 × 256 × 1 | - | softmax | 90–98%, 0.2 FP |
| [48] | [49] | CNN (BN, SC) | 112 × 112 × 1 | max 2 × 2 | ReLU, softmax | 90–99% |
| [4] | [4] | CNN (SC) | 128 × 128 × 1; 96 × 96 × 1; 64 × 64 × 1; 32 × 32 × 1 | - | - | 70–90% |
Table 3. Functional and nonfunctional requirements and particular constraints.

| Stakeholder | General requirement | Itemized requirement | Particular constraints |
|---|---|---|---|
| Environmental authorities | Protection of the birds | Rare bird species | Very high effectiveness |
| | | Big birds | High effectiveness |
| | | Medium/small birds | Medium effectiveness |
| | | During daylight | >100 lux |
| | Collision avoidance | Turbine stopping | Compulsory for rare and big birds |
| | | Deterrence | Optional for further distances |
| | Validation data | Photo and video from events | High resolution allowing bird identification; data storage for 1 year |
| Wind farm developers (functional) | Bird localization | Distance estimation | 90% accuracy |
| | Bird classification/identification | Size | Small/Medium/Large |
| | | Species | Local rare species; high classification reliability |
| | Collision avoidance | Turbine stopping | Minimization of turbine-off time |
| | | Deterrence method | Audio/Strobo |
| | User interface | Easy access | Web/mobile application |
| Wind farm developers (nonfunctional) | Installation | Non-invasive installation | On turbine using stainless steel clamps |
| | System lifetime | As long as possible | Minimum five years |
| | System verification | Data from the events | High-resolution photos; high-resolution color smooth video |
| | | Collision monitoring | At least HD resolution smooth video |
| | System accessibility | Web app | Chrome, Mozilla, Safari |
| | | Mobile app | Android, iOS |
| | Data handling | Storage | At least 2 years |
| | | Reports | Selective, allowing the choice of only interesting data up to a year back |
| Manufacturer (functional) | Installation | Plug-and-play solution | Module construction, easy replacement |
| | | Compatibility with existing systems | Turbine stop using PLC or SCADA through ModBus |
| | Maintenance | Remote software upgrade | IoT |
| | | In situ auto-calibration | Daily |
| Manufacturer (nonfunctional) | High reliability | Small number of FP | Annual average number of FP less than 10% of all detections |
| | Remote connection | Daily status check | Fast and secure |
| | Customizability | Adjustment of system parameters such as detection range and size classification criteria | To bird species nesting nearby, local law and regulations, specific ornithologists' recommendations, turbine features |
Table 4. Optical parameters of selected vision sensors.

| Parameter | Unit | C1 (IMX219) | C2 (IMX447) | C3 (AR1335) | C4 (AR1820HS) |
|---|---|---|---|---|---|
| VSR_h | px | 3280 | 4056 | 4208 | 4912 |
| VSR_v | px | 2464 | 3040 | 3120 | 3684 |
| VSS_h | mm | 3.680 | 6.287 | 6.300 | 7.660 |
| VSS_v | mm | 2.760 | 4.712 | 5.700 | 4.560 |
| VSR_h/VSS_h | px/mm | 891.30 | 645.14 | 667.93 | 641.25 |
| VSR_v/VSS_v | px/mm | 892.75 | 645.16 | 547.36 | 807.95 |
Table 5. Impact of the lens on the camera detection capabilities. The parameters which fulfill the requirements are in bold; the selected options are underlined.

| f [mm] | C1 FoV [°]×[°] | C1 p_W/H [px] | C2 FoV [°]×[°] | C2 p_W/H [px] | C3 FoV [°]×[°] | C3 p_W/H [px] | C4 FoV [°]×[°] | C4 p_W/H [px] |
|---|---|---|---|---|---|---|---|---|
| 3 | 63.0 × 49.4 | 13.0 × 2.0 | 92.7 × 76.3 | 10.0 × 1.0 | 92.8 × 87.1 | 10.0 × 1.0 | 103.9 × 74.5 | 9.0 × 2.0 |
| 4 | 49.4 × 38.1 | 18.0 × 2.0 | 76.3 × 61.0 | 13.0 × 2.0 | 76.4 × 70.9 | 13.0 × 1.0 | 87.5 × 59.4 | 13.0 × 2.0 |
| 6 | 34.1 × 42.9 | 26.0 × 4.0 | 55.3 × 42.9 | 19.0 × 3.0 | 55.4 × 50.8 | 20.0 × 2.0 | 65.1 × 41.6 | 19.0 × 3.0 |
| 8 | 25.9 × 32.8 | 35.0 × 5.0 | 42.9 × 32.8 | 26.0 × 3.0 | 43.0 × 39.2 | 27.0 × 3.0 | 51.2 × 31.8 | 26.0 × 4.0 |
| 12 | 17.4 × 22.2 | 53.0 × 7.0 | 29.4 × 22.2 | 39.0 × 5.0 | 29.4 × 26.7 | 40.0 × 4.0 | 35.4 × 21.5 | 38.0 × 6.0 |
| 16 | 13.1 × 16.8 | 71.0 × 10.0 | 22.2 × 16.8 | 52.0 × 7.0 | 22.3 × 20.2 | 53.0 × 6.0 | 26.9 × 16.2 | 51.0 × 9.0 |
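The field-of-view and projection figures in Tables 4 and 5 follow the pinhole-camera relations FoV = 2·arctan(VSS/2f) and p = f·(P/D)·k, where VSS is the sensor size, k the resolution in px/mm, P the object size and D its distance. A minimal sketch; using a 1.5 m wingspan at the 300 m detection requirement for the projection example is our assumption, chosen because it matches the tabulated p_W values:

```python
import math

def fov_deg(sensor_size_mm, f_mm):
    """Angular field of view [deg] of a pinhole camera: 2 * atan(s / 2f)."""
    return math.degrees(2 * math.atan(sensor_size_mm / (2 * f_mm)))

def projection_px(f_mm, object_m, distance_m, k_px_per_mm):
    """Size of an object's projection on the image [px]: (f * P / D) * k."""
    return f_mm * (object_m / distance_m) * k_px_per_mm

# C1 (IMX219): 3.680 x 2.760 mm sensor, 891.30 px/mm horizontally (Table 4)
print(round(fov_deg(3.680, 3), 1))                # horizontal FoV for f = 3 mm
print(round(projection_px(3, 1.5, 300, 891.30)))  # 1.5 m wingspan at 300 m
```

With these inputs the sketch reproduces the C1, f = 3 mm entries of Table 5 (63.0° horizontal FoV and a 13 px wingspan projection).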
Table 6. Test results of CNN performance evaluation, where the bolded row highlights the selected configuration; the values in red highlight the best values for a given parameter.

| L_C1 | L_C2 | L_FC | FF time [ms] | Precision | Recall | F1 | Specificity | Accuracy |
|---|---|---|---|---|---|---|---|---|
| 32 | 32 | 32 | 0.80 | 0.987 | 0.989 | 0.988 | 0.987 | 0.988 |
| 32 | 32 | 64 | 0.97 | 0.990 | 0.989 | 0.989 | 0.990 | 0.989 |
| 32 | 32 | 128 | 1.09 | 0.996 | 0.989 | 0.993 | 0.996 | 0.993 |
| 32 | 32 | 256 | 1.59 | 0.995 | 0.989 | 0.992 | 0.995 | 0.992 |
| 32 | 64 | 32 | 1.28 | 0.995 | 0.989 | 0.992 | 0.995 | 0.992 |
| 32 | 64 | 64 | 1.42 | 0.998 | 0.988 | 0.993 | 0.998 | 0.993 |
| 32 | 64 | 128 | 1.93 | 0.995 | 0.989 | 0.992 | 0.995 | 0.992 |
| 32 | 64 | 256 | 2.85 | 0.998 | 0.989 | 0.994 | 0.999 | 0.994 |
| 64 | 32 | 32 | 1.54 | 0.979 | 0.989 | 0.984 | 0.979 | 0.984 |
| 64 | 32 | 64 | 1.65 | 0.961 | 0.989 | 0.975 | 0.960 | 0.975 |
| 64 | 32 | 128 | 1.84 | 0.997 | 0.989 | 0.993 | 0.997 | 0.993 |
| 64 | 32 | 256 | 2.31 | 0.987 | 0.989 | 0.988 | 0.987 | 0.988 |
| 64 | 64 | 32 | 2.33 | 0.997 | 0.989 | 0.993 | 0.997 | 0.993 |
| 64 | 64 | 64 | 2.51 | 0.994 | 0.989 | 0.992 | 0.994 | 0.992 |
| 64 | 64 | 128 | 3.32 | 0.987 | 0.989 | 0.988 | 0.987 | 0.988 |
| 64 | 64 | 256 | 3.84 | 0.998 | 0.989 | 0.994 | 0.998 | 0.994 |
| min | | | 0.80 | 0.961 | 0.989 | 0.975 | 0.960 | 0.975 |
| max | | | 3.84 | 0.998 | 0.989 | 0.994 | 0.999 | 0.994 |
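The metrics in Table 6 follow the standard confusion-matrix definitions. A compact sketch; the counts below are illustrative only, since the paper reports the resulting rates rather than raw counts:

```python
def metrics(tp, fp, fn, tn):
    """Standard binary-classification metrics from confusion-matrix counts."""
    precision   = tp / (tp + fp)
    recall      = tp / (tp + fn)          # sensitivity / true-positive rate
    f1          = 2 * precision * recall / (precision + recall)
    specificity = tn / (tn + fp)          # true-negative rate
    accuracy    = (tp + tn) / (tp + fp + fn + tn)
    return precision, recall, f1, specificity, accuracy

# Illustrative counts chosen to land in the range reported in Table 6
p, r, f1, spec, acc = metrics(tp=989, fp=4, fn=11, tn=995)
print(round(p, 3), round(r, 3), round(f1, 3), round(spec, 3), round(acc, 3))
# -> 0.996 0.989 0.992 0.996 0.992
```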
Table 7. Classification boundaries of small, medium and large birds.

| Class | Detection range [m] | Wingspan [m] | Height [m] | Size [m²] | Example bird |
|---|---|---|---|---|---|
| Uncategorized | - | <0.68 | <0.32 | <0.11 | Feral Pigeon, House Sparrow |
| Small | 10–183 | 0.68–1.25 | 0.32–0.39 | 0.11–0.24 | Common Kestrel, Peregrine Falcon |
| Medium | 10–312 | 1.26–1.50 | 0.40–0.55 | 0.25–0.41 | Steppe Buzzard, Marsh Harrier |
| Large | 10–392 | >1.50 | >0.55 | >0.41 | Red Kite, White Stork |
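The wingspan boundaries of Table 7 reduce to a simple threshold lookup. A hypothetical helper sketching that mapping (the function name and the treatment of the 1.25/1.26 m gap as the small/medium boundary are our assumptions):

```python
def classify_wingspan(wingspan_m: float) -> str:
    """Map an estimated wingspan [m] to the size classes of Table 7."""
    if wingspan_m < 0.68:
        return "Uncategorized"   # e.g., Feral Pigeon, House Sparrow
    if wingspan_m <= 1.25:
        return "Small"           # e.g., Common Kestrel, Peregrine Falcon
    if wingspan_m <= 1.50:
        return "Medium"          # e.g., Steppe Buzzard, Marsh Harrier
    return "Large"               # e.g., Red Kite, White Stork

print(classify_wingspan(0.75))   # Small
print(classify_wingspan(1.30))   # Medium
print(classify_wingspan(1.65))   # Large
```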
Table 8. Results of distance and size measurements at reference distances D_b_ref of 50–300 m for silhouettes simulating small, medium and large birds.

| Parameter | Unit | Ref. value | 50.00 | 100.00 | 150.00 | 200.00 | 250.00 | 300.00 |
|---|---|---|---|---|---|---|---|---|
| Small silhouette | | | | | | | | |
| y_diff | px | - | 54 | 27 | 19 | - | - | - |
| D_b | m | - | 50.35 | 100.69 | 143.09 | - | - | - |
| ΔD_b_ref | m | - | 0.43 | 1.86 | 3.77 | - | - | - |
| D_b_ref − D_b | m | - | 0.35 | 0.69 | 6.91 | - | - | - |
| P_W | m | 0.80 | 0.84 | 0.83 | 0.75 | - | - | - |
| P_H | m | 0.30 | 0.34 | 0.44 | 0.32 | - | - | - |
| O_s | m² | 0.12 | 0.16 | 0.18 | 0.13 | - | - | - |
| O_approx | m² | - | 0.14 | 0.19 | 0.12 | - | - | - |
| Medium silhouette | | | | | | | | |
| y_diff | px | - | 55 | 28 | 18 | 13 | 10 | - |
| D_b | m | - | 49.43 | 97.09 | 151.04 | 209.13 | 271.87 | - |
| ΔD_b_ref | m | - | 0.43 | 2.17 | 4.20 | 8.04 | 13.59 | - |
| D_b_ref − D_b | m | - | 0.57 | 2.91 | 1.04 | 9.13 | 21.87 | - |
| P_W | m | 1.20 | 1.18 | 1.16 | 1.24 | 1.09 | 1.22 | - |
| P_H | m | 0.40 | 0.41 | 0.44 | 0.51 | 0.39 | 0.41 | - |
| O_s | m² | 0.24 | 0.24 | 0.24 | 0.33 | 0.23 | 0.23 | - |
| O_approx | m² | - | 0.23 | 0.32 | 0.32 | 0.21 | 0.25 | - |
| Large silhouette | | | | | | | | |
| y_diff | px | - | 54 | 26 | 19 | 13 | 10 | 9 |
| D_b | m | - | 50.35 | 104.56 | 143.09 | 209.13 | 271.87 | 302.07 |
| ΔD_b_ref | m | - | 0.40 | 2.01 | 3.77 | 8.04 | 13.59 | 16.78 |
| D_b_ref − D_b | m | - | 0.35 | 4.56 | 6.91 | 9.13 | 21.87 | 2.07 |
| P_W | m | 1.50 | 1.42 | 1.56 | 1.50 | 1.49 | 1.53 | 1.36 |
| P_H | m | 0.50 | 0.51 | 0.51 | 0.48 | 0.47 | 0.51 | 0.45 |
| O_s | m² | 0.38 | 0.42 | 0.39 | 0.36 | 0.38 | 0.40 | 0.38 |
| O_approx | m² | - | 0.36 | 0.40 | 0.36 | 0.35 | 0.39 | 0.31 |
Table 9. Summary of drone test results.

| | D̄ [m] | σ_D [m] | H̄ [m] | σ_H [m] | D_b,min [m] | D_b,max [m] |
|---|---|---|---|---|---|---|
| Module A | 146.1 | 7.7 | 104.1 | 4.4 | 148.7 | 178.4 |
| Module B | 146.2 | 8.4 | 106.7 | 5.5 | 140.9 | 178.4 |
| Drone | 143.3 | 1.2 | 102.9 | 0.6 | - | - |
Table 10. Comparison of ornithologists' observations and the system's identification and classification of selected bird species. Identification rates are given as system/ornithologist counts per distance band.

| Species [Eng] / [Lat] | Wingspan [m] [63] | <100 m | 100–200 m | >200 m | Small/Medium/Large |
|---|---|---|---|---|---|
| Wood Pigeon / Columba palumbus | 0.67–0.77 | 5/5 | 5/5 | 0/1 | 8/2/0 |
| Common Buzzard / Buteo buteo | 1.10–1.30 | 6/6 | 4/4 | - | 5/3/2 |
| Raven / Corvus corax | 1.15–1.30 | 24/24 | 18/19 | 22/28 | 14/22/30 |
| Marsh Harrier / Circus aeruginosus | 1.15–1.40 | 2/2 | 1/1 | 1/1 | 2/1/1 |
| Herring Gull / Larus argentatus | 1.23–1.48 | 2/2 | 1/1 | - | 1/1/1 |
| Red Kite / Milvus milvus | 1.40–1.65 | 1/1 | - | 1/1 | 0/2/0 |
| Crane / Grus grus | 1.80–2.22 | - | 2/2 | 1/2 | 0/1/2 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Gradolewski, D.; Dziak, D.; Martynow, M.; Kaniecki, D.; Szurlej-Kielanska, A.; Jaworski, A.; Kulesza, W.J. Comprehensive Bird Preservation at Wind Farms. Sensors 2021, 21, 267. https://doi.org/10.3390/s21010267
