Article

Spatiotemporal Bayesian Machine Learning for Estimation of an Empirical Lower Bound for Probability of Detection with Applications to Stationary Wildlife Photography

by Mohamed Jaber, Robert D. Breininger, Farag Hamad †,‡ and Nezamoddin N. Kachouie *
Department of Mathematics and Systems Engineering, Florida Institute of Technology, 150 West University Blvd, Melbourne, FL 32901, USA
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
‡ Current address: Department of Mathematics, Benghazi University, Benghazi 16063, Libya.
Computers 2024, 13(10), 255; https://doi.org/10.3390/computers13100255
Submission received: 9 August 2024 / Revised: 20 September 2024 / Accepted: 23 September 2024 / Published: 8 October 2024
(This article belongs to the Special Issue Machine Learning Applications in Pattern Recognition)

Abstract

An important parameter in monitoring and surveillance systems is the probability of detection. Advanced wildlife monitoring systems rely on camera traps for stationary wildlife photography and have been broadly used to estimate population size and density. Camera encounters are collected for estimation and management of a growing population using spatial capture models. The accuracy of the estimated population size relies on the detection probability of the individual animals, which in turn depends on the observed frequency of animal encounters with the camera traps. Therefore, optimal coverage by the camera grid is essential for reliable estimation of the population size and density. The goal of this research is to implement a spatiotemporal Bayesian machine learning model to estimate a lower bound for the probability of detection of a monitoring system. To obtain an accurate estimate of the population size in this study, an empirical lower bound for the probability of detection is realized considering the sensitivity of the model to the augmented sample size. The monitoring system must attain a probability of detection greater than the established empirical lower bound to achieve a pertinent estimation accuracy. It was found that for stationary wildlife photography, a camera grid with a detection probability of at least 0.3 is required for accurate estimation of the population size. A notable outcome is that a moderate probability of detection or better is required to obtain a reliable estimate of the population size using spatiotemporal machine learning. As a result, the required probability of detection is recommended when designing an automated monitoring system. The number and location of cameras in the camera grid determine the camera coverage. Consequently, the camera coverage and the individual home range determine the probability of detection.

1. Introduction

Spatiotemporal analysis of population dynamics has a broad range of applications, such as preserving the population of endangered species and controlling the population of invasive species. One of the first steps in managing a population is estimating its size. Several methods have been developed to estimate the abundance of animals [1,2,3]. A popular approach is counting the individuals or their signs, which yields an estimated number of individuals per unit area. This estimate is proportional to the whole population and, combined with the habitat area of the population, can be used as a proxy for making inferences about the population size.
Capture-recapture methods have been widely used and have become a standard sampling and analytical framework for ecological statistics, with applications to population analysis such as population size and density [1,2,4,5,6,7]. The intuition behind the capture-mark-recapture technique is that if a significant number of individuals in a population are marked and released, the fraction that is recaptured in the next sample can be used to extrapolate the size of the entire population. Three pieces of information are needed to estimate the population size using this approach: the number of individuals that are marked in the first sampling occasion $M$, the number of individuals that are captured in the second sampling occasion, and the number of individuals that were captured in the first sample and recaptured in the second sample [1]. These methods were developed around physical traps that capture individuals to record their encounter histories. Due to technological advances, the ability to obtain encounter history data has improved. The encounter history of individuals can be collected using more efficient methods, such as camera traps, acoustic recordings, and DNA samples [8,9]. Camera traps, in particular, can be more effective for capturing elusive species. Some problems with using camera traps are the possibility of capturing encounters of the same animal on multiple cameras in a short time period, and the fact that captured individuals are not identified.
Even though capture-recapture methods have been commonly used, methods that consider the spatial structure of the population in the sampling and analysis have only been implemented recently [10,11,12,13,14]. Specifically, conventional capture-recapture methods do not use any explicit spatial information regarding the spatial nature of the sampling and the spatial distribution of individual encounters. A spatial capture-recapture method was introduced by Chandler and Royle [15] in which observed encounter histories of individuals are used for spatial population ecology, and new technologies such as remote cameras and acoustic sampling can be employed. While this method is promising, the estimated population size is not robust and suffers from spatial complexity problems.
Wildlife population monitoring is needed to manage the spatiotemporal dynamics of population size and density of different species. In our previous work [16], the accuracy of the spatial capture model with respect to the distribution of individuals' home ranges was investigated, and an enhanced capture model was proposed using an informative prior distribution for the home range. It was demonstrated that the proposed model improved the robustness and the accuracy of the estimated population size. The main objective of this research, which extends our previous work in [16], is to study the impact of the probability of detection on the accuracy of spatial capture models. The probability of detection is a latent variable that depends on the camera coverage in stationary wildlife photography, and it is an essential parameter for obtaining accurate spatiotemporal estimates of the population size when monitoring population dynamics using stationary wildlife photography.

2. Hierarchical Spatial Capture Model

In this model, the individuals are associated with a location parameter for home range, which means that each individual has a specific home range. The home range associated with each animal is unknown, and therefore the population size $N$ is equal to the number of unknown activity centers. Encounters of individuals are collected using camera traps in the study area. A camera grid is essentially a multisource network of sensors for concurrent data collection. Photographs collected by multiple cameras in the grid are integrated to generate a georeferenced unified dataset. Numerical approximation of the Bayesian capture-recapture model [15,16,17,18] using MCMC follows.
Spatial distances between the trap locations and centers of activity are calculated assuming that each individual $i$ in the population has a fixed center of activity with spatial coordinates $s_i = (s_x, s_y)$, where $i = 1, 2, \ldots, N$ and $N$ is the number of activity centers randomly distributed in the study area $S$. A uniform prior is used to model the unknown activity center $s_i$ by its bivariate coordinates:

$$s_i \sim \mathrm{Uniform}(S)$$
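As a concrete sketch of this prior, activity centers over a hypothetical unit-square study area can be drawn as follows (the unit square, the seed, and $N = 100$ are illustrative assumptions, not values from the paper):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical study area: the unit square S = [0, 1] x [0, 1]
# (an illustrative assumption; the paper does not fix a specific area).
N = 100  # assumed true number of activity centers

# s_i ~ Uniform(S): draw both coordinates of each activity center.
s = rng.uniform(low=0.0, high=1.0, size=(N, 2))
```

Each row of `s` is one center of activity; any bounded region could replace the unit square.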
The camera grid for stationary wildlife photography has $J$ cameras at spatial locations with coordinates $x_j$, $j = 1, 2, \ldots, J$. It is assumed that an individual can be virtually captured multiple times at the same camera trap or at different camera traps during a sampling occasion. The camera encounter history $z_{ijk}$ for individual $i$, at camera $j$, in occasion $k$, is modeled by a Poisson distribution:

$$z_{ijk} \sim \mathrm{Poisson}(\lambda_{ij})$$
where $\lambda_{ij}$ is the encounter rate at camera $j$ for individual $i$. The likelihood of individual $i$ being captured at camera $j$ is a function of the Euclidean distance between its activity center $s_i$ and the camera location, $d_{ij} = \lVert s_i - x_j \rVert$:

$$\lambda_{ij} = \lambda_0\, g_{ij}$$

with baseline encounter rate $\lambda_0$, and $g_{ij}$ defined as a monotonically decreasing half-Gaussian function of the distance with a scale parameter $\sigma$ estimated from the data:

$$g_{ij} = \exp\!\left(-\frac{d_{ij}^2}{2\sigma^2}\right)$$
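The distance-decay rate $\lambda_{ij} = \lambda_0 g_{ij}$ can be computed for all center-camera pairs in a few lines; this is a generic sketch, and the coordinates and parameter values in the usage note are hypothetical:

```python
import numpy as np

def encounter_rates(s, x, lam0, sigma):
    """Encounter rate lambda_ij = lam0 * exp(-d_ij^2 / (2 sigma^2))
    for every activity center s_i (rows of s) and camera location x_j
    (rows of x)."""
    # Pairwise Euclidean distances d_ij between centers and cameras.
    d = np.linalg.norm(s[:, None, :] - x[None, :, :], axis=-1)
    return lam0 * np.exp(-(d ** 2) / (2.0 * sigma ** 2))
```

For example, with one center at the origin and cameras at distances 0 and 1, `encounter_rates(np.array([[0.0, 0.0]]), np.array([[0.0, 0.0], [1.0, 0.0]]), lam0=0.5, sigma=0.25)` returns the baseline rate at the co-located camera and a much smaller rate at the distant one.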
The encounter history $z_{ijk}$ takes a binary value of 1 if individual $i$ is captured at camera $j$, or 0 otherwise, when assuming an individual can be captured at most once during sampling occasion $k$. More practically, assuming an individual can be captured more than once during a sampling occasion, a $(J \times K)$ encounter history matrix is defined for each individual to summarize the capture history, where $z_{ijk}$ is the number of times that individual $i$ is caught at camera $j$ on occasion $k$, with $k = 1, 2, \ldots, K$. Because individual animals are not physically marked, they cannot be identified. Hence, the capture histories $z_{ijk}$ cannot be directly observed. Therefore, to estimate the unknown population size, a data augmentation method is implemented. The number of camera encounters at camera $j$ in occasion $k$ is:

$$n_{jk} = \sum_{i=1}^{N} z_{ijk}$$
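A minimal simulation of the latent counts and the observable camera totals follows; the dimensions and the stand-in rates are assumptions for illustration (in the model, the rates come from the half-Gaussian above). Note how identities are discarded when summing over $i$:

```python
import numpy as np

rng = np.random.default_rng(0)

N, J, K = 100, 25, 10                       # individuals, cameras, occasions
lam = rng.uniform(0.0, 0.2, size=(N, J))    # stand-in encounter rates lambda_ij

# Latent counts z_ijk ~ Poisson(lambda_ij), one per individual/camera/occasion.
z = rng.poisson(lam[:, :, None], size=(N, J, K))

# Observable camera counts: n_jk = sum_i z_ijk (individual identities are lost).
n = z.sum(axis=0)
```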
The full conditional distribution of the latent encounter data is multinomial:

$$(z_{1jk}, z_{2jk}, \ldots, z_{Njk}) \sim \mathrm{Multinomial}(n_{jk};\, \pi_{1j}, \pi_{2j}, \ldots, \pi_{Nj})$$
where $\pi_{ij} = \lambda_{ij} / \sum_{i=1}^{N} \lambda_{ij}$. The camera encounter counts are modeled using a Poisson distribution:

$$n_{jk} \sim \mathrm{Poisson}(\Lambda_j)$$

where:

$$\Lambda_j = \lambda_0 \sum_{i=1}^{N} g_{ij}$$
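The multinomial full conditional above, which allocates an observed camera total among the individuals, can be sketched as follows (a generic draw, not the authors' sampler):

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_latent_z(n_jk, lam_j):
    """One draw of (z_1jk, ..., z_Njk) from the multinomial full
    conditional, with cell probabilities pi_ij = lambda_ij / sum_i lambda_ij
    given the observed total n_jk at camera j in occasion k."""
    pi = lam_j / lam_j.sum()
    return rng.multinomial(n_jk, pi)
```

Individuals with larger encounter rates at camera $j$ receive, on average, a larger share of the observed count.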
The total number of camera encounters at camera $j$ can be obtained by:

$$n_{j\cdot} = \sum_{k=1}^{K} n_{jk}$$

Because the counts $n_{jk}$ are independent across the $K$ occasions:

$$n_{j\cdot} \sim \mathrm{Poisson}(K \Lambda_j)$$
By augmenting the collected camera encounters with a set of all-zero camera encounter histories, an augmented hypothetical population size $M$ is obtained as a proxy for the unknown true population size $N$. To avoid truncation of the posterior distribution of $N$, the augmentation parameter $M$ must be an integer much greater than the unknown $N$ ($M \gg N$), although a large value of $M$ increases the computational time. Prior distributions of $\lambda_0$, $\sigma$, and $\psi$ are often defined as uninformative $\mathrm{Uniform}(0,1)$ distributions, where $\psi \sim \mathrm{Uniform}(0,1)$ is the probability that an individual in the occupancy model (of size $M$) belongs to the true population of size $N \sim \mathrm{Binomial}(M, \psi)$. Indicator variables $\omega_1, \omega_2, \ldots, \omega_M$ mark the individuals in the augmented population, with $L = M - N$ of them associated with all-zero encounter histories:

$$\omega_i = \begin{cases} 0, & \text{if individual } i \text{ is not a member of the population} \\ 1, & \text{if individual } i \text{ is a member of the population} \end{cases}$$

where $\omega_i \sim \mathrm{Bernoulli}(\psi)$, $i = 1, 2, \ldots, M$, with expected value $E[\omega_i] = \psi$ and variance $\mathrm{Var}[\omega_i] = \psi(1 - \psi)$.
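The augmentation step above amounts to drawing membership indicators and summing them; the sketch below uses an assumed $M = 500$ and a single draw of $\psi$ from its uniform prior:

```python
import numpy as np

rng = np.random.default_rng(2)

M = 500                 # augmented population size, chosen so that M >> N
psi = rng.uniform()     # psi ~ Uniform(0, 1), the membership probability

# omega_i ~ Bernoulli(psi): membership indicator for each augmented individual.
omega = rng.binomial(1, psi, size=M)

# Population-size estimate from one posterior draw: N_hat = sum_i omega_i.
N_hat = int(omega.sum())
```

In the MCMC sampler, $\psi$ and the $\omega_i$ are updated jointly with the other parameters; this block only illustrates the mechanics of one draw.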
Therefore, the camera encounters of individual $i$ in the augmented population are:

$$z_{ijk} \mid \omega_i \sim \mathrm{Poisson}(\lambda_{ij}\, \omega_i)$$

and the true population size is estimated by:

$$\hat{N} = \sum_{i=1}^{M} \omega_i$$
A joint prior distribution of the model parameters can be defined by:

$$[\psi, \lambda_0, \sigma] = [\psi][\lambda_0][\sigma]$$

with the joint posterior distribution of:

$$[z, \omega, s, \psi, \lambda_0, \sigma \mid n, X] \propto \prod_{i=1}^{M} \left\{ \prod_{j=1}^{J} \prod_{k=1}^{K} [n_{jk} \mid z_{ijk}]\,[z_{ijk} \mid \omega_i, s_i, \sigma, \lambda_0] \right\} [\omega_i \mid \psi]\,[s_i]\,[\psi]\,[\lambda_0]\,[\sigma]$$
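To make the posterior concrete, the sketch below evaluates an unnormalised log posterior for the core terms: the Poisson likelihood of the latent counts given $\lambda_{ij}\omega_i$ plus the Bernoulli membership terms. It assumes flat priors (which drop out as constants) and takes the rates as given rather than recomputing them from $s_i$ and $\sigma$; it is an illustration, not the authors' implementation:

```python
import numpy as np
from math import lgamma

_lgamma = np.vectorize(lgamma)

def log_poisson_pmf(z, mu):
    """log P(Z = z) for Z ~ Poisson(mu); the mu = 0, z = 0 case is set to 0."""
    z = np.asarray(z, dtype=float)
    mu = np.asarray(mu, dtype=float)
    with np.errstate(divide="ignore", invalid="ignore"):
        lp = z * np.log(mu) - mu - _lgamma(z + 1.0)
    return np.where((mu == 0) & (z == 0), 0.0, lp)

def log_posterior(z, omega, lam, psi):
    """Unnormalised log posterior of the augmented model: Poisson terms for
    z_ijk given lambda_ij * omega_i, plus Bernoulli(psi) terms for omega_i.
    Flat priors on psi, lambda_0, sigma contribute only constants."""
    mu = lam[:, :, None] * omega[:, None, None]   # lambda_ij * omega_i
    ll = log_poisson_pmf(z, mu).sum()
    lb = (omega * np.log(psi) + (1.0 - omega) * np.log(1.0 - psi)).sum()
    return ll + lb
```

An MCMC sampler would repeatedly propose or Gibbs-update $\omega$, $s$, $\psi$, $\lambda_0$, and $\sigma$ against this target.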

3. Results

In this section, we use simulation to demonstrate the sensitivity of the spatial model to its parameters. The data augmentation model is implemented to estimate the unknown population size $N$, home range radius $\sigma$, baseline encounter rate $\lambda_0$, and center of home range for each individual member of the population. Hence, a large value for the augmented population size $M = N + L$ (total number of hypothetical individuals) is selected as an upper bound to construct the augmented dataset. In this way, the estimated $N$ can assume values between zero and $M$. We performed several simulations to test the sensitivity of the model to the selected augmentation parameter $L$, the probability of detection $P$ based on the assumed camera coverage, and the sample size (number of occasions) $K$, given the true value of the population size $N$. The results are discussed below.
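As a rough companion to these experiments, one quantity that is easy to compute under the Poisson encounter model is the probability that an individual is captured at least once over $K$ occasions. The paper's $P$ is the grid-level detection probability, so this per-individual quantity is offered only as an assumed proxy for checking whether a candidate grid plausibly reaches a target such as 0.3:

```python
import numpy as np

def detection_probability(lam, K):
    """Under the Poisson encounter model, the probability that individual i
    is captured at least once over K occasions is
    P_i = 1 - exp(-K * sum_j lambda_ij)."""
    lam = np.asarray(lam, dtype=float)
    return 1.0 - np.exp(-K * lam.sum(axis=1))
```

Averaging `detection_probability(lam, K)` over individuals gives a quick screen of a hypothetical camera layout before running the full model.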

3.1. Sensitivity of N ^ to L for Different Values of P

As can be seen in Table 1, for a low probability of detection ($P \le 0.1$), the estimated $N$ has a broad range of values between 63 and 164. The sensitivity of the model to the added number of zeros $L$ is more noticeable for a low probability of detection, and the sensitivity of the estimated $N$ decreases as the probability of detection increases. The estimated $N$ is steady for a probability of detection of 0.3 or higher. We can observe that the width of the CI is negatively correlated with the probability of detection, such that higher probabilities of detection provide narrower CIs for the estimated $N$. Although for a probability of detection below 0.25 the width of the CI is positively correlated with the augmented population parameter $L$, the width of the CI is not sensitive to the augmented population size for a probability of detection of 0.25 or higher.
When the probability of detection is greater than 0.25, the width of the CI is stable and does not change substantially. This suggests that to obtain reliable estimates of the population, a minimum probability of detection of 0.25 must be achieved. In turn, it is essential to design the camera grid to enforce this minimum detection probability of 0.25.
As depicted in Figure 1, regardless of the set value of the augmented population parameter $L$, a fair estimate of $N$ is obtained for a probability of detection of 0.3 or higher. However, based on the estimated values of the standard error and the length of the confidence interval in Table 1, to achieve an absolute estimation error of 10% or less, the probability of detection should reach 0.5 or higher. The convergence of the posterior distributions of $\hat{N}$ and $\hat{P}$ for probabilities of detection $P = 0.05$ and $P = 0.3$ is shown in Figure 2 and Figure 3, respectively. These density plots clearly show that the Markov chains for $P = 0.3$ converge to almost the same posterior distribution. As can be observed in Figure 3, all chains are well mixed, and the running means for the first 1500 iterations are almost the same. Moreover, the autocorrelation of the chains for $P = 0.3$ drops considerably faster than that of $P = 0.05$.
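The kind of autocorrelation comparison made here can be reproduced with a simple sample-autocorrelation function; this is a generic diagnostic sketch, not the code the authors used:

```python
import numpy as np

def autocorr(chain, max_lag=50):
    """Sample autocorrelation of a (non-constant) MCMC chain at lags
    0..max_lag, normalised so that lag 0 equals 1."""
    x = np.asarray(chain, dtype=float)
    x = x - x.mean()
    acf = np.correlate(x, x, mode="full")[x.size - 1:]
    acf = acf / acf[0]
    return acf[: max_lag + 1]
```

A chain whose autocorrelation decays quickly (as reported for $P = 0.3$) mixes better and yields more effective samples than one with slowly decaying autocorrelation (as for $P = 0.05$).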

3.2. Sensitivity of N ^ to P for Different Values of L

The sensitivity of the estimated population size is then studied with respect to different values of the probability of detection at set values of the augmented population parameter $L$. As we can see in Figure 4, for $L = 100$, the model underestimates the population size for $P \le 0.15$. However, it provides a reasonable and consistent estimate of $N$ for $P > 0.15$. For $L = 200$, it underestimates the population size only for $P = 0.05$, and a satisfactory estimate of $N$ is obtained for $P \ge 0.2$. For $L = 300$ and $L = 400$, it overestimates the population size for $P < 0.3$. The standard error of the estimate increases as $L$ increases from 100 to 400. Nevertheless, a standard error below 10% can be achieved for $P > 0.5$ regardless of the value of $L$.

3.3. Sensitivity of P ^ to L for Different Values of P

The simulation results, demonstrated in Table 2 and depicted in Figure 5, show the estimated value of the probability of detection for different values of the augmented population parameter $L$. We can observe that the estimated probability of detection is fair regardless of $L$, given a probability of detection $P > 0.25$. Overall, the estimated probability of detection is less sensitive to $L$ than the estimated $N$. Consequently, based on the simulation results, a probability of detection better than 0.25 must be achieved to obtain a reliable estimate of the population size $N$. However, to obtain a standard error below 20%, the probability of detection must be 0.4 or higher. Notice that the probability of detection is itself an unknown parameter that depends on the home range of the animals. In turn, it is imperative to design the camera grid based on the animal home range to sustain the desired probability of detection $P > 0.25$.

3.4. Sensitivity of N ^ to K for Different Values of P

Next, the sensitivity of the estimated population size to the sample size was studied. The sample size depends on the sampling duration, or number of occasions. Hence, simulation studies were performed, and the population size and probability of detection were estimated for different numbers of occasions at set values of the probability of detection. From the simulation results shown in Table 3 and Figure 6, we can see the impact of the number of occasions on the estimated population size $N$. The estimated $N$ ranges from 88 (for $P = 0.10$ and $K = 3$) to 100 (for $P = 0.25$ and $K = 25$). Even for a small number of occasions ($K = 3$) and a low probability of detection ($P = 0.10$), a reasonable estimate of 88 is obtained. However, the standard error is over 55% for $P = 0.10$ and $K = 3$, and it drops to below 1% for $P = 0.25$ and $K = 25$. Moreover, for a large number of occasions such as $K = 25$, the estimated $N$ is highly accurate regardless of $P$, such that even for a low probability of detection of $P = 0.1$, the estimation error of $N$ is below 7%, while with a moderate probability of detection of $P = 0.25$, even with $K = 10$, an accurate estimate of $N$ with a standard error below 5% can be achieved. Therefore, if the probability of detection is low, or sufficient information about the abundance of animals is not available, increasing the number of occasions can noticeably improve the accuracy of the estimated population size.

3.5. Sensitivity of P ^ to K for Different Values of P

Next, the probability of detection $P$ was estimated for different numbers of occasions at set values of the probability of detection. Simulation results in Table 4 and Figure 7 show that the estimated probability of detection $\hat{P}$ is less sensitive to the number of occasions. The absolute error of the estimated $P$ ranges from zero to 0.028 (a 28% relative error) for $P = 0.1$ and $K = 3$. For a moderate probability of detection of 0.25, the absolute error of $\hat{P}$ is almost zero regardless of the number of occasions. However, with $P = 0.25$, for a reasonable accuracy with a standard error of 15% or lower, $K \ge 10$ must be attained. For a small number of occasions ($K = 3$), the standard error ranges from 36% for $P = 0.25$ to 83.5% for $P = 0.1$. This means that even with a moderate $P = 0.25$, if the number of occasions is small ($K = 3$), the estimated $P$ has a high standard error. Conversely, even with a small $P = 0.1$, a reasonably accurate estimate of $P$ with a standard error of 15% can be obtained when the number of occasions is large. Therefore, if the probability of detection is low, and/or sufficient information about the abundance of animals is not available, increasing the number of occasions can noticeably improve the accuracy of the estimated population size.

4. Discussion

The goal of this study was to design a camera grid for stationary wildlife photography to monitor population dynamics. Wildlife population monitoring is essential to manage and control the spatiotemporal variation of population size and density for one or more species. To assess and control a growing population, collected camera encounters are used to estimate the population size and density. Spatial capture-unmarked models, alternatively called capture-unidentified models, are broadly used for population analysis. The sensitivity of these models to the probability of detection was investigated in this study. To do so, the unknown probability of detection was sampled using a non-informative prior distribution and randomized camera locations about the centers of the animals' home ranges. In this way, the likelihood of detecting an individual animal can be calculated using the distance of the camera location from the spatial location of the individual. Hence, the number of cameras and their locations in a camera grid must be carefully chosen to reach an adequate probability of detection based on the random encounters of individual animals with the camera traps.
From the simulation results, it was demonstrated that the standard error of the estimated N depends on the probability of detection. It was verified by simulation that the accuracy of the estimated population size improves by increasing the probability of detection. It was found that a detection probability of at least 0.3 was needed to obtain moderate accuracy for estimating the population size. Furthermore, a standard error of below 10% can be achieved for P > 0.5 regardless of value of L. These are notable outcomes of this study which are recommended to ensure a minimum camera coverage for reliable estimation of population size when designing a monitoring system using camera traps.
The estimated population size is also sensitive to the sample size, which in turn depends on the sampling duration. The population size was estimated through numerous simulations with different sampling durations (numbers of occasions). Yet again, as was observed for the augmented population parameter $L$, with a moderate probability of detection $P = 0.25$, a reasonable estimate of the population size can be obtained even with a small number of occasions $K$ (shorter sampling duration). It was shown that not only could larger values of $L$ not make up for a poor probability of detection to obtain a reliable estimate of $N$, but with a low probability of detection, the accuracy of the estimated $N$ also deteriorated with increasing $L$. In contrast, an accurate estimate of $N$ can be obtained with a low probability of detection, contingent on increasing the sampling duration. It was revealed by simulation studies that larger values of $K$ can make up for a poor probability of detection to obtain a reliable estimate of $N$. Consequently, if the probability of detection is low, or adequate information about the abundance of animals is not available, increasing the number of occasions can noticeably improve the accuracy of the estimated population size.
Here, we studied the impact of the detection probability and derived a lower bound for it. However, it must be pointed out that there are several other factors to be considered when designing a camera grid in order to maintain this recommended lower bound for the detection probability for population analysis. There is no single remedy for all scenarios. One such factor is the size of the camera grid, i.e., the number of camera traps utilized and the area that will be surveyed. Because the size of the camera grid has an impact on the detection probability, a sufficient number of cameras is needed in order to achieve the lower-bound detection probability required to obtain an accurate estimate of population size and density [19,20,21].
Another important factor to consider in camera grid design is the placement of the camera traps along specific features, such as roads and trails [21]. There is the potential for significant biases in detection probability with non-random camera trap placements [19,22]. For example, placing camera traps along game trails can significantly increase the detection probability of certain species by up to 33%.
The species being studied must be also considered when designing a camera grid in order to achieve a proper detection probability, as there is significant inter-species variation in detection probability that is influenced by camera trap location [19,23,24]. For example, the number of camera traps required to estimate the population size and density of common species is much smaller than the number of camera traps needed for rare species [20]. A grid where cameras are placed several feet above the ground may capture slow moving or larger species more successfully than faster or smaller species.
Other factors that should be considered in camera trap grid design are the camera sensing region and the placement of the camera. To meet the lower-bound detection probability, the way an individual is encountered by a camera needs to be considered. In order for an individual to be detected by a camera, that individual must be present in the study area, and it must enter the camera sensing region. This constitutes a true positive detection. However, false positive and false negative detections are possible when using camera traps. False positive detections can occur when the camera is triggered but an individual is not present, for example when surrounding vegetation is moved by wind and triggers the motion sensor of a camera. False negative detections occur when an individual is present in the camera sensing region but is not recorded by the camera. In particular, false negatives must be considered, as they contribute to missing data. This type of detection can occur in many ways, such as when an individual is hidden by the surroundings in the camera sensing region or when an individual passes the camera trap swiftly and is not detected [24]. To reach the recommended lower bound for the detection probability, the false positive and false negative detections must also be taken into account based on the study area [25]. Our future work will focus on studying the sensitivity of the spatiotemporal models to these factors and how these factors can be incorporated into the model to maintain the recommended lower bound for the detection probability.

5. Conclusions

The sensitivity of the model to the augmented population parameter $L$ is more noticeable for a low probability of detection. It is important to point out that for a low probability of detection ($P \le 0.2$), the accuracy of the estimated $N$ decreases with increasing $L$, and the capture-unidentified models tend to noticeably overestimate the population size for large values of the augmented population $L$. This may sound counterintuitive, as it has been recommended to choose a large $L$ [14,15] to obtain accurate estimates of the population size. Nevertheless, it was demonstrated in this study that, to maintain reliable estimates of $N$ for large values of $L$, a probability of detection of $P = 0.25$ or higher must be attained. Moreover, the sensitivity of the estimated $N$ decreases as the probability of detection increases. Hence, to attenuate the sensitivity of the model to $L$, a probability of detection of $P \ge 0.35$ must be reached. Additionally, to obtain a standard error below 15%, a probability of detection of 0.45 or higher must be achieved.
Based on the simulation results, a minimum probability of detection of 0.25 must be attained to obtain a consistent estimate of the population size $N$. However, the probability of detection is itself an unknown parameter due to the random nature of the camera encounters. The probability of detection depends on how often individual animals are captured by the camera traps. Consequently, the frequency of the random camera encounters is a function of the animal's home range and its distance from the camera location. Therefore, it is imperative to design the camera grid based on the animal's home range to attain the desired probability of detection of $P > 0.25$. Our future work is focused on how to maintain the minimum probability of detection when designing the sampling camera grid.
As a bottom line, we should point out that the limited accuracy of the estimated population size [26] is due to intrinsic restrictions of capture-unidentified models. The purpose of this work was to study the sensitivity of capture-unidentified models with regard to the probability of detection, data augmentation size, and sample size (sampling duration). While it is essential to consider the sensitivity of the model in experiment design to attain a target probability of detection and an appropriate sample size, there are other complementary approaches that can potentially improve the estimated population size. Evidently, the implementation of these methods relies on the available resources. For example, marking captured individuals [15] facilitates their identification and, in turn, substantially improves the accuracy of the estimated population size. Another approach to identifying the individuals of the population is collecting genetic information using noninvasive methods [27].
Although the effective sample size can be quantified in the MCMC method [28], the expected effective sample size must be contemplated during experimental design in association with the probability of detection and the augmented population size. In this way, the target probability of detection and an appropriate sampling duration can be achieved. Finally, another important aspect of implementing capture-unidentified models using MCMC is the convergence of the Markov Chain [29]. It has been observed in our previous works that the convergence of the Markov Chain (MC) does not necessarily guarantee the reliability of the estimated population size [30,31]. Hence, our future research will study the sensitivity of the MC's convergence, as well as the estimation accuracy of the converged MC, to the augmented population size and the probability of detection.

Author Contributions

Conceptualization, N.N.K.; Methodology, M.J. and N.N.K.; Software, M.J., R.D.B., F.H. and N.N.K.; Validation, M.J., F.H. and N.N.K.; Formal analysis, M.J., R.D.B. and F.H.; Investigation, M.J. and N.N.K.; Resources, N.N.K.; Data curation, R.D.B.; Writing—original draft, M.J., R.D.B., F.H. and N.N.K.; Writing—review & editing, N.N.K.; Visualization, M.J., R.D.B. and F.H.; Supervision, N.N.K.; Project administration, N.N.K. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

No data were collected; random samples were generated for the simulation studies in this work.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Pollock, K.H. Capture-Recapture Models: A Review of Current Methods, Assumptions and Experimental Design. Available online: https://repository.lib.ncsu.edu/server/api/core/bitstreams/5b4fd48b-48fb-4eba-adf2-1366811bd4f7/content (accessed on 1 January 2023).
  2. Nichols, J.D. Capture-Recapture Models. BioScience 1992, 42, 94–102.
  3. Schwarz, C.J.; Seber, G.A. Estimating animal abundance: Review III. Stat. Sci. 1999, 14, 427–456.
  4. Pollock, K.H. A Capture-Recapture Design Robust to Unequal Probability of Capture. J. Wildl. Manag. 1982, 46, 752–757.
  5. Pollock, K.H.; Nichols, J.D.; Brownie, C.; Hines, J.E. Statistical Inference for Capture-Recapture Experiments. Wildl. Monogr. 1990, 107, 3–97.
  6. Karanth, K.U. Estimating Tiger Panthera Tigris Populations from Camera-Trap Data Using Capture Recapture Models. Biol. Conserv. 1995, 71, 333–338.
  7. O’Connell, A.F.; Nichols, J.D.; Karanth, K.U. Camera Traps in Animal Ecology: Methods and Analyses; Springer: Berlin/Heidelberg, Germany, 2011; Volume 271.
  8. Sollmann, R.; Gardner, B.; Belant, J.L. How does spatial study design influence density estimates from spatial capture-recapture models? PLoS ONE 2012, 7, e34575.
  9. Engeman, R.M.; Massei, G.; Sage, M.; Gentle, M.N. Monitoring Wild Pig Populations: A Review of Methods; USDA National Wildlife Research Center—Staff Publications: Fort Collins, CO, USA, 2013; Paper 1496.
  10. Royle, J.A.; Young, K.V. A Hierarchical Model for Spatial Capture–Recapture Data. Ecology 2008, 89, 2281–2289.
  11. Royle, J.A.; Karanth, K.U.; Gopalaswamy, A.M.; Kumar, N.S. Bayesian inference in camera trapping studies for a class of spatial capture–recapture models. Ecology 2009, 90, 3233–3244.
  12. Kery, M.; Gardner, B.; Stoeckle, T.; Weber, D.; Royle, J.A. Use of Spatial Capture-Recapture Modeling and DNA Data to Estimate Densities of Elusive Animals. Conserv. Biol. 2011, 25, 356–364.
  13. Borchers, D. A non-technical overview of spatially explicit capture–recapture models. J. Ornithol. 2012, 152 (Suppl. S2), S435–S444.
  14. Royle, J.A.; Chandler, R.B.; Sollmann, R.; Gardner, B. Spatial Capture-Recapture; Academic Press: Cambridge, MA, USA, 2013.
  15. Chandler, R.B.; Royle, J.A. Spatially Explicit Models for Inference About Density in Unmarked or Partially Marked Populations. Ann. Appl. Stat. 2013, 7, 936–954.
  16. Jaber, M.; Hamad, F.; Breininger, R.D.; Kachouie, N.N. An Enhanced Spatial Capture Model for Population Analysis Using Unidentified Counts through Camera Encounters. Axioms 2023, 12, 1094.
  17. Royle, J.A.; Dorazio, R.M.; Link, W.A. Analysis of multinomial models with unknown index using data augmentation. J. Comput. Graph. Stat. 2007, 16, 67–85.
  18. Royle, J.A.; Dorazio, R.M. Parameter-expanded data augmentation for Bayesian analysis of capture–recapture models. J. Ornithol. 2012, 152, 521–537.
  19. O’Connor, K.M.; Nathan, L.R.; Liberati, M.R.; Tingley, M.W.; Vokoun, J.C.; Rittenhouse, T.A.G. Camera trap arrays improve detection probability of wildlife: Investigating study design considerations using an empirical dataset. PLoS ONE 2017, 12, e0175684.
  20. Kays, R.; Arbogast, B.S.; Baker-Whatton, M.; Beirne, C.; Boone, H.M.; Bowler, M.; Burneo, S.F.; Cove, M.V.; Ding, P.; Espinosa, S.; et al. An empirical evaluation of camera trap study design: How many, how long and when? Methods Ecol. Evol. 2020, 11, 700–713.
  21. Hofmeester, T.R.; Cromsigt, J.P.G.M.; Odden, J.; Andrén, H.; Kindberg, J.; Linnel, J.D.C. Framing pictures: A conceptual framework to identify and correct for biases in detection probability of camera traps enabling multi-species comparison. Ecol. Evol. 2019, 9, 2320–2336.
  22. Kolowski, J.M.; Forrester, T.D. Camera trap placement and the potential for bias due to trails and other features. PLoS ONE 2017, 12, e0186679.
  23. Mann, G.K.H.; O’Riain, M.J.; Parker, D.M. The road less travelled: Assessing variation in mammal detection probabilities with camera traps in a semi-arid biodiversity hotspot. Biodivers. Conserv. 2015, 24, 531–545.
  24. Findlay, M.A.; Briers, R.A.; White, P.J.C. Component processes of detection probability in camera-trap studies: Understanding the occurrence of false-negatives. Mammal Res. 2020, 65, 167–180.
  25. Symington, A.; Waharte, S.; Julier, S.; Trigoni, N. Probabilistic Target Detection by Camera-Equipped UAVs. In Proceedings of the IEEE International Conference on Robotics and Automation, Anchorage, AK, USA, 3–7 May 2010; pp. 4076–4081.
  26. Yamaura, Y.; Kery, M.; Royle, J.A. Study of biological communities subject to imperfect detection: Bias and precision of community N-mixture abundance models in small-sample situations. Ecol. Res. 2016, 31, 289–305.
  27. Sollmann, R.; Tôrres, N.M.; Furtado, M.M.; Jácomo, A.T.d.A.; Palomares, F.; Roques, S.; Silveira, L. Combining camera-trapping and noninvasive genetic data in a spatial capture–recapture framework improves density estimates for the jaguar. Biol. Conserv. 2013, 167, 242–247.
  28. Martino, L.; Elvira, V.; Louzada, F. Effective sample size for importance sampling based on discrepancy measures. Signal Process. 2017, 131, 386–401.
  29. Fabreti, L.G.; Höhna, S. Convergence assessment for Bayesian phylogenetic analysis using MCMC simulation. Methods Ecol. Evol. 2022, 13, 77–90.
  30. Mohamed, J. A Spatiotemporal Bayesian Model for Population Analysis. Ph.D. Thesis, Florida Institute of Technology, Melbourne, FL, USA, 2022. Available online: https://repository.fit.edu/etd/880 (accessed on 1 January 2023).
  31. Jaber, M.; Van Woesik, R.; Kachouie, N.N. Probabilistic Detection Model for Population Estimation. In Proceedings of the Second International Conference on Mathematics of Data Science, Old Dominion University, Norfolk, VA, USA, 2018; p. 43. Available online: https://scholar.google.com/citations?user=ghxCwaAeRhoC&hl=en&oi=sra (accessed on 1 January 2023).
Figure 1. Estimated N for different values of the augmented population parameter L at set values of the probability of detection P, given N = 100. True N (black), estimated N (blue), and upper and lower confidence bounds (red and orange, respectively).
Figure 2. Diagnostic plots for estimated N with P = 0.05 and L = 100: density (top left), autocorrelation (top right), running mean (middle right), and trace (bottom). Red, green, and blue show the three MCMC chains.
Figure 3. Diagnostic plots for estimated N with P = 0.30 and L = 100: density (top left), autocorrelation (top right), running mean (middle right), and trace (bottom). Red, green, and blue show the three MCMC chains.
Figure 4. Estimated N for different values of P at set values of L, given N = 100. True N (black), estimated N (blue), and upper and lower confidence interval limits (red and orange, respectively).
Figure 5. Estimated value of P for different values of L at set values of the probability of detection P. True P (black), estimated P (blue), and upper and lower confidence interval limits (red and orange, respectively).
Figure 6. Estimated value of N for different numbers of occasions K at set values of the probability of detection P. True N (black), estimated N (blue), and upper and lower confidence interval limits (red and orange, respectively).
Figure 7. Estimated value of P for different values of the probability of detection P and different numbers of occasions K. True P (black), estimated P (blue), and upper and lower confidence interval limits (red and orange, respectively).
Table 1. Computed N̂ for N = 100 with different values of L and different values of P, along with the standard error (Sd_N) and the width of the credible interval (CI width).
| P | L | N̂ | Sd_N | Median | LB_Adj | UB_Adj | CI width |
|---|---|-----|------|--------|--------|--------|----------|
| 0.05 | 100 | 63.117 | 23.836 | 60.803 | 15.444 | 110.790 | 95.346 |
| 0.05 | 200 | 92.557 | 46.258 | 84.495 | 0.041 | 185.073 | 185.032 |
| 0.05 | 300 | 118.736 | 70.290 | 103.380 | 0 | 259.316 | 281.160 |
| 0.05 | 500 | 158.755 | 106.403 | 131.850 | 0 | 372.135 | 425.546 |
| 0.10 | 100 | 87.561 | 20.936 | 87.455 | 45.690 | 129.433 | 83.743 |
| 0.10 | 200 | 114.996 | 41.742 | 109.135 | 31.513 | 198.480 | 166.967 |
| 0.10 | 300 | 127.075 | 55.131 | 116.400 | 16.814 | 237.337 | 220.523 |
| 0.10 | 500 | 163.893 | 84.067 | 143.835 | 0 | 332.027 | 336.269 |
| 0.15 | 100 | 96.244 | 19.387 | 95.480 | 57.470 | 135.018 | 77.548 |
| 0.15 | 200 | 113.999 | 34.327 | 107.975 | 45.344 | 182.653 | 137.309 |
| 0.15 | 300 | 128.747 | 44.564 | 120.235 | 39.619 | 217.875 | 178.255 |
| 0.15 | 500 | 124.806 | 49.577 | 112.955 | 25.653 | 223.960 | 198.307 |
| 0.20 | 100 | 102.536 | 18.035 | 101.115 | 66.466 | 138.606 | 72.140 |
| 0.20 | 200 | 112.681 | 27.015 | 107.935 | 58.651 | 166.711 | 108.060 |
| 0.20 | 300 | 116.144 | 31.775 | 109.760 | 52.594 | 179.694 | 127.100 |
| 0.20 | 500 | 116.129 | 32.978 | 109.225 | 50.173 | 182.086 | 131.913 |
| 0.25 | 100 | 103.320 | 15.866 | 101.765 | 71.589 | 135.052 | 63.463 |
| 0.25 | 200 | 110.304 | 20.596 | 106.990 | 69.112 | 151.497 | 82.385 |
| 0.25 | 300 | 107.958 | 20.690 | 104.440 | 66.577 | 149.339 | 82.762 |
| 0.25 | 500 | 105.027 | 19.699 | 101.745 | 65.629 | 144.425 | 78.796 |
| 0.30 | 100 | 104.417 | 13.511 | 102.810 | 77.395 | 131.440 | 54.045 |
| 0.30 | 200 | 104.887 | 14.861 | 102.745 | 75.166 | 134.609 | 59.442 |
| 0.30 | 300 | 106.259 | 15.258 | 104.060 | 75.743 | 136.774 | 61.031 |
| 0.30 | 500 | 104.208 | 14.765 | 102.020 | 74.678 | 133.738 | 59.061 |
| 0.35 | 100 | 103.611 | 11.107 | 102.180 | 81.397 | 125.824 | 44.427 |
| 0.35 | 200 | 102.305 | 11.132 | 100.870 | 80.041 | 124.569 | 44.528 |
| 0.35 | 300 | 105.014 | 11.624 | 103.510 | 81.766 | 128.262 | 46.496 |
| 0.35 | 500 | 102.525 | 10.918 | 101.090 | 80.690 | 124.361 | 43.671 |
| 0.40 | 100 | 103.678 | 8.853 | 102.595 | 85.972 | 121.384 | 35.411 |
| 0.40 | 200 | 101.986 | 8.678 | 101.005 | 84.631 | 119.342 | 34.712 |
| 0.40 | 300 | 102.677 | 8.713 | 101.635 | 85.252 | 120.103 | 34.851 |
| 0.40 | 500 | 103.194 | 8.787 | 102.170 | 85.619 | 120.768 | 35.149 |
| 0.45 | 100 | 101.257 | 6.517 | 100.540 | 88.223 | 114.291 | 26.067 |
| 0.45 | 200 | 101.053 | 6.741 | 100.320 | 87.571 | 114.536 | 26.965 |
| 0.45 | 300 | 101.617 | 6.788 | 100.840 | 88.042 | 115.192 | 27.151 |
| 0.45 | 500 | 101.955 | 6.891 | 101.180 | 88.172 | 115.737 | 27.564 |
| 0.50 | 100 | 100.616 | 5.247 | 100.020 | 90.121 | 111.110 | 20.988 |
| 0.50 | 200 | 101.476 | 5.334 | 100.855 | 90.807 | 112.144 | 21.336 |
| 0.50 | 300 | 102.179 | 5.586 | 101.560 | 91.007 | 113.351 | 22.344 |
| 0.50 | 500 | 102.098 | 5.439 | 101.570 | 91.221 | 112.976 | 21.755 |
| 0.75 | 100 | 99.980 | 1.426 | 99.727 | 97.128 | 102.831 | 5.703 |
| 0.75 | 200 | 99.868 | 1.412 | 99.640 | 97.021 | 102.631 | 5.610 |
| 0.75 | 300 | 100.144 | 1.396 | 99.920 | 97.353 | 102.936 | 5.584 |
| 0.75 | 500 | 100.173 | 1.459 | 99.880 | 97.248 | 103.093 | 5.846 |
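The pattern in Table 1, severe underestimation with very wide intervals at P = 0.05 that stabilizes by roughly P = 0.30, can be reproduced qualitatively with a much simpler simulation. The sketch below is our illustration, not the Bayesian data-augmentation model fitted in the study: it assumes p is known and uses a moment estimator, so it only demonstrates why a low detection probability inflates the variance of N̂.

```python
import numpy as np

def simulate_counts(N, p, K, rng):
    """Simulate K detection occasions for N individuals with per-occasion
    detection probability p; return how many were detected at least once."""
    detected = rng.random((N, K)) < p
    return int(np.any(detected, axis=1).sum())

def estimate_N(n_detected, p, K):
    """Moment estimator assuming p is known: divide the count of distinct
    individuals seen by the probability of being seen at least once."""
    p_star = 1.0 - (1.0 - p) ** K
    return n_detected / p_star

# For the same true N = 100 and K = 3 occasions, estimates at p = 0.05
# are far noisier than at p = 0.50, echoing the CI widths in Table 1.
rng = np.random.default_rng(7)
for p in (0.05, 0.30, 0.50):
    est = [estimate_N(simulate_counts(100, p, 3, rng), p, 3) for _ in range(500)]
    print(f"p = {p:.2f}: mean = {np.mean(est):6.1f}, sd = {np.std(est):5.1f}")
```

The Bayesian model faces the harder joint problem of estimating N and p together, which is why its low-p intervals in Table 1 are even wider than this simplified sketch suggests.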
Table 2. Estimated mean of P for different values of L and different values of the probability of detection, along with the standard error (Sd_P) and the width of the credible interval (CI width).
| P | L | P̂ | Sd_P | Median | LB_Adj | UB | CI width |
|---|---|-----|------|--------|--------|----|----------|
| 0.05 | 100 | 0.105 | 0.054 | 0.092 | 0.000 | 0.214 | 0.217 |
| 0.05 | 200 | 0.088 | 0.054 | 0.074 | 0.000 | 0.196 | 0.216 |
| 0.05 | 300 | 0.074 | 0.053 | 0.059 | 0.000 | 0.179 | 0.211 |
| 0.05 | 500 | 0.070 | 0.052 | 0.056 | 0.000 | 0.173 | 0.206 |
| 0.10 | 100 | 0.128 | 0.042 | 0.121 | 0.045 | 0.212 | 0.167 |
| 0.10 | 200 | 0.109 | 0.046 | 0.101 | 0.018 | 0.201 | 0.182 |
| 0.10 | 300 | 0.113 | 0.049 | 0.105 | 0.015 | 0.212 | 0.197 |
| 0.10 | 500 | 0.098 | 0.046 | 0.089 | 0.005 | 0.191 | 0.186 |
| 0.15 | 100 | 0.169 | 0.043 | 0.164 | 0.083 | 0.255 | 0.172 |
| 0.15 | 200 | 0.156 | 0.048 | 0.151 | 0.060 | 0.251 | 0.191 |
| 0.15 | 300 | 0.144 | 0.047 | 0.139 | 0.050 | 0.238 | 0.189 |
| 0.15 | 500 | 0.154 | 0.050 | 0.149 | 0.054 | 0.254 | 0.201 |
| 0.20 | 100 | 0.211 | 0.044 | 0.207 | 0.123 | 0.300 | 0.177 |
| 0.20 | 200 | 0.198 | 0.048 | 0.195 | 0.103 | 0.293 | 0.190 |
| 0.20 | 300 | 0.192 | 0.048 | 0.190 | 0.096 | 0.289 | 0.193 |
| 0.20 | 500 | 0.198 | 0.049 | 0.196 | 0.101 | 0.296 | 0.195 |
| 0.25 | 100 | 0.253 | 0.045 | 0.251 | 0.163 | 0.343 | 0.180 |
| 0.25 | 200 | 0.243 | 0.046 | 0.241 | 0.151 | 0.336 | 0.185 |
| 0.25 | 300 | 0.250 | 0.047 | 0.248 | 0.156 | 0.344 | 0.189 |
| 0.25 | 500 | 0.251 | 0.048 | 0.250 | 0.156 | 0.346 | 0.190 |
| 0.30 | 100 | 0.299 | 0.044 | 0.297 | 0.210 | 0.388 | 0.178 |
| 0.30 | 200 | 0.296 | 0.045 | 0.295 | 0.205 | 0.387 | 0.182 |
| 0.30 | 300 | 0.295 | 0.045 | 0.294 | 0.204 | 0.385 | 0.181 |
| 0.30 | 500 | 0.297 | 0.046 | 0.296 | 0.205 | 0.388 | 0.183 |
| 0.35 | 100 | 0.345 | 0.043 | 0.345 | 0.258 | 0.432 | 0.174 |
| 0.35 | 200 | 0.349 | 0.044 | 0.348 | 0.261 | 0.437 | 0.175 |
| 0.35 | 300 | 0.342 | 0.044 | 0.342 | 0.255 | 0.430 | 0.175 |
| 0.35 | 500 | 0.351 | 0.044 | 0.351 | 0.263 | 0.439 | 0.175 |
| 0.40 | 100 | 0.393 | 0.042 | 0.393 | 0.310 | 0.476 | 0.167 |
| 0.40 | 200 | 0.396 | 0.042 | 0.396 | 0.312 | 0.480 | 0.168 |
| 0.40 | 300 | 0.398 | 0.042 | 0.398 | 0.314 | 0.481 | 0.167 |
| 0.40 | 500 | 0.395 | 0.042 | 0.394 | 0.311 | 0.478 | 0.167 |
| 0.45 | 100 | 0.454 | 0.040 | 0.455 | 0.375 | 0.533 | 0.158 |
| 0.45 | 200 | 0.449 | 0.040 | 0.449 | 0.368 | 0.529 | 0.160 |
| 0.45 | 300 | 0.447 | 0.040 | 0.447 | 0.367 | 0.527 | 0.160 |
| 0.45 | 500 | 0.444 | 0.040 | 0.445 | 0.364 | 0.525 | 0.160 |
| 0.50 | 100 | 0.502 | 0.038 | 0.502 | 0.426 | 0.577 | 0.151 |
| 0.50 | 200 | 0.501 | 0.038 | 0.501 | 0.425 | 0.577 | 0.152 |
| 0.50 | 300 | 0.489 | 0.038 | 0.490 | 0.413 | 0.566 | 0.153 |
| 0.50 | 500 | 0.496 | 0.038 | 0.496 | 0.420 | 0.571 | 0.152 |
| 0.75 | 100 | 0.748 | 0.027 | 0.749 | 0.694 | 0.802 | 0.108 |
| 0.75 | 200 | 0.750 | 0.027 | 0.751 | 0.696 | 0.804 | 0.107 |
| 0.75 | 300 | 0.752 | 0.027 | 0.753 | 0.698 | 0.806 | 0.107 |
| 0.75 | 500 | 0.744 | 0.027 | 0.745 | 0.690 | 0.799 | 0.109 |
Table 3. Estimated N for different numbers of occasions and different values of the probability of detection, along with the standard error (Sd_N) and the width of the credible interval (CI width).
| K | P | N̂ | Sd_N | Median | LB | UB | CI width |
|---|---|-----|------|--------|----|----|----------|
| 3 | 0.10 | 87.561 | 20.936 | 87.455 | 45.690 | 129.433 | 83.743 |
| 3 | 0.15 | 96.057 | 19.428 | 95.360 | 57.202 | 134.912 | 77.710 |
| 3 | 0.20 | 100.310 | 17.810 | 98.970 | 64.691 | 135.929 | 71.239 |
| 3 | 0.25 | 104.373 | 16.066 | 102.750 | 72.241 | 136.506 | 64.265 |
| 5 | 0.10 | 98.903 | 18.249 | 98.260 | 62.405 | 135.401 | 72.996 |
| 5 | 0.15 | 101.974 | 15.193 | 100.400 | 71.587 | 132.361 | 60.774 |
| 5 | 0.20 | 103.402 | 12.086 | 101.960 | 79.230 | 127.575 | 48.345 |
| 5 | 0.25 | 101.285 | 8.659 | 100.370 | 83.967 | 118.602 | 34.634 |
| 10 | 0.10 | 102.521 | 12.239 | 101.120 | 78.043 | 126.998 | 48.955 |
| 10 | 0.15 | 99.711 | 6.770 | 99.020 | 86.171 | 113.251 | 27.080 |
| 10 | 0.20 | 99.748 | 4.266 | 99.310 | 91.217 | 108.279 | 17.062 |
| 10 | 0.25 | 99.779 | 2.802 | 99.435 | 94.174 | 105.383 | 11.209 |
| 15 | 0.10 | 100.763 | 7.213 | 100.085 | 86.337 | 115.190 | 28.853 |
| 15 | 0.15 | 100.843 | 3.748 | 100.420 | 93.347 | 108.339 | 14.992 |
| 15 | 0.20 | 99.963 | 2.074 | 99.740 | 95.816 | 104.110 | 8.294 |
| 15 | 0.25 | 100.490 | 1.271 | 100.200 | 97.948 | 103.032 | 5.084 |
| 20 | 0.10 | 100.432 | 4.572 | 99.990 | 91.288 | 109.576 | 18.288 |
| 20 | 0.15 | 100.238 | 2.206 | 99.970 | 95.826 | 104.650 | 8.824 |
| 20 | 0.20 | 99.931 | 1.124 | 99.760 | 97.682 | 102.180 | 4.497 |
| 20 | 0.25 | 100.026 | 0.559 | 99.714 | 98.909 | 101.143 | 2.234 |
| 25 | 0.10 | 99.595 | 3.179 | 99.200 | 93.237 | 105.954 | 12.717 |
| 25 | 0.15 | 99.942 | 1.404 | 99.680 | 97.134 | 102.749 | 5.615 |
| 25 | 0.20 | 99.975 | 0.637 | 99.590 | 98.702 | 101.249 | 2.547 |
| 25 | 0.25 | 99.983 | 0.286 | 99.900 | 99.411 | 100.555 | 1.144 |
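The shrinking interval width with the number of occasions K in Table 3 has a simple explanation: the number of individuals detected at least once is Binomial(N, p*) with p* = 1 − (1 − p)^K, and p* rises quickly with K. The minimal Monte Carlo sketch below is our simplification (a known-p moment estimator, not the MCMC model used in the study) but it reproduces the same trend.

```python
import numpy as np

def trial_sd(N, p, K, reps, rng):
    """Monte Carlo standard deviation of N_hat = n / p_star, where n, the
    number of individuals detected at least once, is Binomial(N, p_star)."""
    p_star = 1.0 - (1.0 - p) ** K
    n = rng.binomial(N, p_star, size=reps)
    return float((n / p_star).std(ddof=1))

# At fixed p = 0.10, more occasions sharpen the estimate, mirroring the
# drop in Sd_N down the Table 3 rows for that column.
rng = np.random.default_rng(3)
for K in (3, 5, 10, 25):
    print(f"K = {K:2d}: sd(N_hat) = {trial_sd(100, 0.10, K, 5000, rng):.2f}")
```

This suggests a practical trade-off consistent with the paper's conclusion: a grid with modest per-occasion detection probability can partly compensate with a longer sampling duration, though the empirical lower bound of about 0.3 still applies for the joint estimation problem.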
Table 4. Estimated P for different numbers of occasions and different values of the probability of detection, along with the standard error (Sd_P) and the width of the credible interval (CI width).
| K | P | P̂ | Sd_P | Median | LB | UB | CI width |
|---|---|-----|------|--------|----|----|----------|
| 3 | 0.10 | 0.128 | 0.042 | 0.121 | 0.045 | 0.212 | 0.167 |
| 3 | 0.15 | 0.168 | 0.043 | 0.163 | 0.082 | 0.254 | 0.172 |
| 3 | 0.20 | 0.208 | 0.044 | 0.204 | 0.119 | 0.296 | 0.177 |
| 3 | 0.25 | 0.250 | 0.045 | 0.248 | 0.161 | 0.339 | 0.178 |
| 5 | 0.10 | 0.108 | 0.026 | 0.105 | 0.056 | 0.160 | 0.103 |
| 5 | 0.15 | 0.158 | 0.029 | 0.156 | 0.101 | 0.215 | 0.114 |
| 5 | 0.20 | 0.202 | 0.028 | 0.201 | 0.146 | 0.259 | 0.113 |
| 5 | 0.25 | 0.251 | 0.030 | 0.250 | 0.191 | 0.311 | 0.120 |
| 10 | 0.10 | 0.101 | 0.015 | 0.100 | 0.071 | 0.131 | 0.060 |
| 10 | 0.15 | 0.150 | 0.015 | 0.150 | 0.120 | 0.181 | 0.062 |
| 10 | 0.20 | 0.201 | 0.015 | 0.201 | 0.171 | 0.231 | 0.061 |
| 10 | 0.25 | 0.251 | 0.019 | 0.251 | 0.214 | 0.289 | 0.075 |
| 15 | 0.10 | 0.099 | 0.010 | 0.099 | 0.079 | 0.119 | 0.040 |
| 15 | 0.15 | 0.149 | 0.010 | 0.149 | 0.129 | 0.169 | 0.040 |
| 15 | 0.20 | 0.202 | 0.011 | 0.202 | 0.180 | 0.224 | 0.044 |
| 15 | 0.25 | 0.247 | 0.010 | 0.247 | 0.227 | 0.267 | 0.040 |
| 20 | 0.10 | 0.101 | 0.010 | 0.101 | 0.081 | 0.121 | 0.040 |
| 20 | 0.15 | 0.151 | 0.009 | 0.151 | 0.133 | 0.169 | 0.037 |
| 20 | 0.20 | 0.201 | 0.009 | 0.201 | 0.183 | 0.219 | 0.036 |
| 20 | 0.25 | 0.254 | 0.010 | 0.254 | 0.234 | 0.274 | 0.040 |
| 25 | 0.10 | 0.101 | 0.010 | 0.101 | 0.081 | 0.121 | 0.040 |
| 25 | 0.15 | 0.150 | 0.007 | 0.150 | 0.135 | 0.165 | 0.030 |
| 25 | 0.20 | 0.201 | 0.010 | 0.201 | 0.181 | 0.221 | 0.040 |
| 25 | 0.25 | 0.249 | 0.010 | 0.249 | 0.229 | 0.269 | 0.040 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
