Search Results (10)

Search Parameters:
Keywords = randomly occurring uncertainties

26 pages, 940 KiB  
Article
Dynamic Event-Triggered Robust Fusion Estimation for Multi-Sensor Systems Under Time-Correlated Fading Channels
by Taixian Zhao, Yiyang Cui, Cong Huang, Quan Shi and Hailong Chen
Electronics 2025, 14(11), 2211; https://doi.org/10.3390/electronics14112211 - 29 May 2025
Viewed by 306
Abstract
This paper investigates the problem of robust fusion state estimation for multi-sensor systems under the influence of time-correlated fading channels, incorporating a dynamic event-triggered mechanism (DETM). The randomly occurring parameter uncertainties are characterized by a stochastic variable following a Bernoulli distribution, while sensor measurements are transmitted to the corresponding estimators through time-correlated fading channels and dynamic event-triggered mechanisms. The DETM dynamically adjusts the triggering threshold via regulation and memory factors, enhancing adaptability in data transmission while effectively reducing redundant communication overhead. Furthermore, an augmented state model is constructed by integrating system states, channel coefficients, and the event-triggering mechanism, thereby comprehensively capturing the impact of dynamic environments on state estimation. Based on this model, a local state estimation algorithm is designed to ensure the convergence of the upper bound of the local estimation error covariance, which is further minimized at each time step through adaptive adjustment of local estimator gains. Subsequently, the local estimates obtained from multiple estimators are fused using the covariance intersection fusion strategy, improving the overall estimation accuracy. Simulation experiments demonstrate that the proposed recursive fusion state estimation framework significantly reduces communication overhead and enhances estimation performance in the presence of both time-correlated fading channels and randomly occurring parameter uncertainties, while maintaining an acceptable computational cost. Compared to the traditional Kalman filtering method, the proposed recursive fusion state estimation algorithm improves estimation accuracy by 58% while increasing computational time by only 32.4%. Additionally, the DETM effectively reduces communication frequency by 36.7%.
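The covariance intersection rule named in the abstract admits a compact sketch. The estimates, covariances, and grid search below are illustrative stand-ins, not values from the paper:

```python
import numpy as np

def covariance_intersection(x1, P1, x2, P2, omega):
    # Fused information is a convex combination of the local information matrices;
    # the result stays consistent even when cross-correlations are unknown.
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(omega * I1 + (1.0 - omega) * I2)
    x = P @ (omega * I1 @ x1 + (1.0 - omega) * I2 @ x2)
    return x, P

# Two hypothetical local estimates with complementary accuracy per axis
x1, P1 = np.array([1.0, 0.0]), np.diag([1.0, 4.0])
x2, P2 = np.array([1.2, 0.1]), np.diag([4.0, 1.0])

# Choose omega by a coarse grid search minimizing the fused covariance trace
omegas = np.linspace(0.01, 0.99, 99)
best = min(omegas, key=lambda w: np.trace(covariance_intersection(x1, P1, x2, P2, w)[1]))
x_f, P_f = covariance_intersection(x1, P1, x2, P2, best)
```

In this symmetric example the optimal weight is 0.5 and the fused covariance trace drops below that of either local estimate.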

18 pages, 4079 KiB  
Article
Patch-Based Surface Accuracy Control for Digital Elevation Models by Inverted Terrestrial Laser Scanning (TLS) Located on a Long Pole
by Juan F. Reinoso-Gordo, Francisco J. Ariza-López and José L. García-Balboa
Remote Sens. 2024, 16(23), 4516; https://doi.org/10.3390/rs16234516 - 2 Dec 2024
Cited by 1 | Viewed by 790
Abstract
Currently, many digital elevation models (DEMs) are derived from airborne LiDAR data acquisition flights. The vertical accuracy of both products has typically been evaluated using methods based on randomly sampled control points. However, due to the superficial nature of the DEM, logic suggests that it is more appropriate to use a superficial object as an evaluation and control element, that is, a “control surface” or “control patch”. Our approach proposes a method for obtaining each patch from a georeferenced point cloud (PC) measured with a terrestrial laser scanner (TLS). In order to reduce the dilution of precision due to very acute angles of incidence that occur between the terrain and the scanner’s rays when it is stationed on a conventional tripod, a system has been created that allows the scanner to be placed face down at a height of up to 7 m. Stationing the scanner at that height also has the advantage of reducing shadow areas in the presence of possible obstacles. In our experiment, the final result is an 18 m × 18 m PC patch which, after resampling, can be transformed into a high-density (10,000 points/m²) and high-quality (absolute positional uncertainty < 0.05 m) DEM patch, that is, with a regular mesh format. This DEM patch can be used as the ground truth to assess the surface accuracy of DEMs (DEM format) or airborne LiDAR data acquisition flights (PC format).
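The patch-based idea — evaluating a DEM against a dense control surface rather than sparse control points — can be sketched as a per-cell comparison. The grid size, resolution, and error levels below are hypothetical:

```python
import numpy as np

# Hypothetical 18 m x 18 m ground-truth patch at 0.1 m resolution (180 x 180 cells,
# i.e. 100 points/m^2 here; the paper's TLS patch reaches 10,000 points/m^2).
rng = np.random.default_rng(0)
truth = rng.normal(100.0, 0.5, size=(180, 180))              # TLS-derived DEM patch
candidate = truth + rng.normal(0.0, 0.05, size=truth.shape)  # DEM under evaluation

errors = candidate - truth   # per-cell vertical error over the full surface
rmse = float(np.sqrt(np.mean(errors ** 2)))
bias = float(np.mean(errors))
```

Because every cell of the surface contributes, the error statistics are far denser than a control-point sample of the same area would allow.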
(This article belongs to the Special Issue Applications of Laser Scanning in Urban Environment)

19 pages, 972 KiB  
Article
Robust H∞ Control for Autonomous Underwater Vehicle’s Time-Varying Delay Systems under Unknown Random Parameter Uncertainties and Cyber-Attacks
by Soundararajan Vimal Kumar and Jonghoek Kim
Appl. Sci. 2024, 14(19), 8827; https://doi.org/10.3390/app14198827 - 1 Oct 2024
Cited by 4 | Viewed by 913
Abstract
This paper investigates robust H∞-based control for autonomous underwater vehicle (AUV) systems under time-varying delay, model uncertainties, and cyber-attacks. Sensor and actuator cyber-attacks can cause faults in the overall AUV system. In addition, the behavior of the system can be affected by the presence of complexities, such as unknown random uncertainties that occur in system modeling. In this paper, the robustness against unpredictable random uncertainties is investigated by considering unknown but norm-bounded (UBB) random uncertainties. By constructing a proper Lyapunov–Krasovskii functional (LKF) and using linear matrix inequality (LMI) techniques, new stability criteria in the form of LMIs are derived such that the AUV system is stable. Moreover, this work is novel in addressing robust H∞ control, which considers time-varying delay, cyber-attacks, and randomly occurring uncertainties for AUV systems. Finally, the effectiveness of the proposed results is demonstrated through two examples and their computer simulations.
(This article belongs to the Section Robotics and Automation)

19 pages, 17130 KiB  
Article
Estimating the Quality of the Most Popular Machine Learning Algorithms for Landslide Susceptibility Mapping in 2018 Mw 7.5 Palu Earthquake
by Siyuan Ma, Xiaoyi Shao and Chong Xu
Remote Sens. 2023, 15(19), 4733; https://doi.org/10.3390/rs15194733 - 27 Sep 2023
Cited by 13 | Viewed by 2038
Abstract
The Mw 7.5 Palu earthquake that occurred on 28 September 2018 (UTC 10:02) on Sulawesi Island, Indonesia, triggered approximately 15,600 landslides, causing about 4000 fatalities and widespread destruction. The primary objective of this study is to perform landslide susceptibility mapping (LSM) associated with this event and assess the performance of the most widely used machine learning algorithms, logistic regression (LR) and random forest (RF). Eight controlling factors were considered, including elevation, hillslope gradient, aspect, relief, distance to rivers, peak ground velocity (PGV), peak ground acceleration (PGA), and lithology. To evaluate model uncertainty, training samples were randomly selected and used to establish the models 20 times, resulting in 20 susceptibility maps for each model. The quality of the landslide susceptibility maps was evaluated using several metrics, including the mean landslide susceptibility index (LSI), modelling uncertainty, and predictive accuracy. The results demonstrate that both models effectively capture the actual distribution of landslides, with areas exhibiting high LSI predominantly concentrated on both sides of the seismogenic fault. The RF model exhibits less sensitivity to changes in training samples, whereas the LR model displays significant variation in LSI with sample changes. Overall, both models demonstrate satisfactory performance; however, the RF model exhibits superior predictive capability compared to the LR model.
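The sample-sensitivity analysis described above — 20 models trained on 20 random samples, summarized by mean LSI and per-cell spread — can be sketched with stand-in susceptibility maps; the maps below are synthetic, not model outputs:

```python
import numpy as np

rng = np.random.default_rng(42)
n_runs, n_cells = 20, 1000

# Stand-in for 20 susceptibility maps from models trained on different random
# samples: one underlying pattern plus run-to-run perturbations, clipped to [0, 1].
base = rng.uniform(0.0, 1.0, n_cells)
maps = np.clip(base + rng.normal(0.0, 0.05, (n_runs, n_cells)), 0.0, 1.0)

mean_lsi = maps.mean(axis=0)     # mean landslide susceptibility index per cell
uncertainty = maps.std(axis=0)   # modelling uncertainty: spread across the 20 runs
```

A model that is robust to training-sample changes (like RF in the study) would show a small `uncertainty` map; a sensitive one (like LR) would show a large one.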

13 pages, 388 KiB  
Article
Network Analytics Enabled by Generating a Pool of Network Variants from Noisy Data
by Aamir Mandviwalla, Amr Elsisy, Muhammad Saad Atique, Konstantin Kuzmin, Chris Gaiteri and Boleslaw K. Szymanski
Entropy 2023, 25(8), 1118; https://doi.org/10.3390/e25081118 - 26 Jul 2023
Viewed by 1454
Abstract
Mapping network nodes and edges to communities and network functions is crucial to gaining a higher level of understanding of the network structure and functions. Such mappings are particularly challenging to design for covert social networks, which intentionally hide their structure and functions to protect important members from attacks or arrests. Here, we focus on correctly inferring the structures and functions of such networks, but our methodology can be broadly applied. Without the ground truth (knowledge about the allocation of nodes to communities and network functions), no single network based on the noisy data can represent all plausible communities and functions of the true underlying network. To address this limitation, we apply a generative model that randomly distorts the original network based on the noisy data, generating a pool of statistically equivalent networks. Each unique generated network is recorded, while each duplicate of an already recorded network just increases the repetition count of that network. We treat each such network as a variant of the ground truth, with the probability of arising in the real world approximated by the ratio of the count of this network’s duplicates plus one to the total number of all generated networks. Communities of variants with frequently occurring duplicates contain persistent patterns shared by their structures. Using Shannon entropy, we can find a variant that minimizes the uncertainty for operations planned on the network. Repeatedly generating new pools of networks from the best network of the previous step for several steps lowers the entropy of the best new variant. If the entropy is too high, the network operators can identify nodes whose monitoring can achieve the most significant reduction in entropy. Finally, we also present a heuristic for constructing a new variant, which is not randomly generated but has the lowest expected cost of operating on the distorted mappings of network nodes to communities and functions caused by noisy data.
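The duplicate-counting probability estimate and the entropy criterion can be sketched directly; the toy pool of edge sets below is hypothetical:

```python
from collections import Counter
import math

def variant_probabilities(variants):
    # Each generated network is reduced to a hashable canonical key; duplicates
    # increase its count, and probability = count / total generated.
    counts = Counter(variants)
    total = len(variants)
    return {v: c / total for v, c in counts.items()}

def shannon_entropy(probs):
    # Low entropy means one variant dominates the pool, i.e. low uncertainty
    # about which variant to treat as the ground truth.
    return -sum(p * math.log2(p) for p in probs.values() if p > 0.0)

# Toy pool: edge sets (frozensets of edges) standing in for generated networks
pool = ([frozenset({(1, 2), (2, 3)})] * 6
        + [frozenset({(1, 2)})] * 3
        + [frozenset({(2, 3)})])
probs = variant_probabilities(pool)
H = shannon_entropy(probs)
```

Here the dominant variant appears with probability 0.6, giving an entropy well below the 1.58 bits of a uniform three-variant pool.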
(This article belongs to the Special Issue Selected Featured Papers from Entropy Editorial Board Members)

16 pages, 1162 KiB  
Article
Probabilistic Risk Assessment of Combined Exposure to Deoxynivalenol and Emerging Alternaria Toxins in Cereal-Based Food Products for Infants and Young Children in China
by Xiaofeng Ji, Yingping Xiao, Wentao Lyu, Minglu Li, Wen Wang, Biao Tang, Xiaodan Wang and Hua Yang
Toxins 2022, 14(8), 509; https://doi.org/10.3390/toxins14080509 - 25 Jul 2022
Cited by 22 | Viewed by 2874
Abstract
Deoxynivalenol (DON) and emerging Alternaria toxins often co-occur in cereal-based products, but the current risk assessment is commonly conducted for only one type of mycotoxin at a time. Compared to adults, infants and young children are more susceptible to mycotoxins through food consumption, especially with cereal-based food products, which are the main source of exposure. This study aimed to perform a probabilistic risk assessment of combined exposure to DON and three major Alternaria toxins, namely alternariol monomethyl ether (AME), alternariol (AOH), and tenuazonic acid (TeA), through the consumption of cereal-based foods by Chinese infants and young children. A total of 872 cereal-based food products were randomly collected and tested for the occurrence of DON and the three major Alternaria toxins. The results showed that DON, TeA, AOH, and AME were detected in 56.4%, 47.5%, 7.5%, and 5.7% of the samples, respectively. Co-contamination with multiple mycotoxins was observed in 39.9% of the analyzed samples. A preliminary cumulative risk assessment using the hazard index (HI) and combined margin of exposure (MoET) models was performed, for the first time, on DON and Alternaria toxins present in cereal-based food products for infants and young children in China. The results showed that only 0.2% and 1.5% of individuals, respectively, exceeded the corresponding reference values for DON and TeA, indicating a low health risk. However, in the case of AME and AOH, the proportions of individuals exceeding the reference value were 24.1% and 33.5%, respectively, indicating potential health risks. In the cumulative risk assessment of AME and AOH, both HI and MoET values indicated a more serious risk than that associated with individual exposure. Further research is necessary to reduce the uncertainties associated with the toxicities of the Alternaria toxins and with cumulative risk assessment methods.
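The two cumulative-risk models named above, hazard index and combined margin of exposure, reduce to short formulas. The exposures and guidance values below are invented for illustration and are not the paper's data:

```python
def hazard_index(exposures, reference_values):
    # HI: sum of hazard quotients (exposure / health-based guidance value).
    # HI > 1 flags a potential combined risk.
    return sum(exposures[t] / reference_values[t] for t in exposures)

def combined_moe(exposures, reference_points):
    # MoET: reciprocal of the summed reciprocal margins of exposure,
    # where each MoE = reference point / exposure.
    return 1.0 / sum(exposures[t] / reference_points[t] for t in exposures)

# Hypothetical daily exposures and guidance values, ug/kg bw/day
exposures = {"AME": 0.02, "AOH": 0.03}
tdis = {"AME": 0.0025, "AOH": 0.0025}  # illustrative TTC-style thresholds

hi = hazard_index(exposures, tdis)
moet = combined_moe(exposures, tdis)
```

With these invented numbers HI = 20 and MoET = 0.05, i.e. the combined exposure would far exceed the illustrative thresholds, mirroring how the joint metrics flag risk that single-toxin assessment misses.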

28 pages, 21804 KiB  
Article
Modelling Fire Behavior to Assess Community Exposure in Europe: Combining Open Data and Geospatial Analysis
by Palaiologos Palaiologou, Kostas Kalabokidis, Michelle A. Day, Alan A. Ager, Spyros Galatsidas and Lampros Papalampros
ISPRS Int. J. Geo-Inf. 2022, 11(3), 198; https://doi.org/10.3390/ijgi11030198 - 15 Mar 2022
Cited by 7 | Viewed by 4455
Abstract
Predicting where the next large-scale wildfire event will occur can help fire management agencies better prepare for taking preventive actions and improving suppression efficiency. Wildfire simulations can be useful in estimating the spread and behavior of potential future fires using several available algorithms. The uncertainty of ignition location and of the weather data influencing fire propagation requires a stochastic approach integrated with fire simulations. In addition, the scarcity of required spatial data in different fire-prone European regions limits the creation of fire simulation outputs. In this study we provide a framework for processing and creating spatial layers and descriptive data from open-access international and national databases for use in Monte Carlo fire simulations with the Minimum Travel Time fire spread algorithm, targeted at assessing cross-boundary wildfire propagation and community exposure for a large-scale case study area (Macedonia, Greece). We simulated over 300,000 fires, each independently modelled with constant weather conditions from a randomly chosen simulation scenario derived from historical weather data. Simulations generated fire perimeters and raster estimates of annual burn probability and conditional flame length. Results were used to estimate community exposure by intersecting simulated fire perimeters with community polygons. We found that potential ignitions can grow large enough to reach communities across 27% of the study area, and we identified the 50 most exposed communities and the sources of their exposure. The proposed framework can guide efforts in European regions to prioritize fuel management activities in order to reduce wildfire risk.
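The burn-probability raster described above is, at its core, a per-cell count over Monte Carlo runs. The sketch below substitutes random rectangles for Minimum Travel Time perimeters, so it illustrates only the aggregation step:

```python
import numpy as np

rng = np.random.default_rng(7)
n_sims, grid = 1000, (50, 50)
burn_counts = np.zeros(grid)

for _ in range(n_sims):
    # Stand-in for one simulated fire: a random ignition cell grown into a
    # rectangular "perimeter" (a real run would use a fire spread algorithm).
    r, c = rng.integers(0, 50, 2)
    size = int(rng.integers(1, 6))
    burn_counts[max(0, r - size):r + size, max(0, c - size):c + size] += 1

burn_probability = burn_counts / n_sims   # fraction of simulations burning each cell
exposed = burn_probability > 0.01         # cells exceeding a 1% per-run threshold
```

Intersecting the `exposed` mask (or the individual perimeters) with community polygons would then give the community-exposure ranking the study reports.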
(This article belongs to the Special Issue The Use of Geo-Spatial Tools in Forestry)

32 pages, 14803 KiB  
Article
Retrieving Secondary Forest Aboveground Biomass from Polarimetric ALOS-2 PALSAR-2 Data in the Brazilian Amazon
by Henrique Luis Godinho Cassol, João Manuel de Brito Carreiras, Elisabete Caria Moraes, Luiz Eduardo Oliveira e Cruz de Aragão, Camila Valéria de Jesus Silva, Shaun Quegan and Yosio Edemir Shimabukuro
Remote Sens. 2019, 11(1), 59; https://doi.org/10.3390/rs11010059 - 29 Dec 2018
Cited by 21 | Viewed by 6605
Abstract
Secondary forests (SF) are important carbon sinks, removing CO₂ from the atmosphere through the photosynthesis process and storing photosynthates in their aboveground live biomass (AGB). This process, occurring at large scales, partially counteracts C emissions from land-use change, hence playing an important role in the global carbon cycle. The absorption rates of carbon in these forests depend on forest physiology, controlled by environmental and climatic conditions, as well as on the past land use, which is rarely considered when retrieving AGB from remotely sensed data. In this context, the main goal of this study is to evaluate the potential of polarimetric (quad-pol) ALOS-2 PALSAR-2 data for estimating AGB in a SF area. Land-use was assessed through Landsat time-series to extract the SF age, period of active land-use (PALU), and frequency of clear cuts (FC) to randomly select the SF plots. A chronosequence of 42 SF plots (20 ha) ranging from 3 to 28 years in age near the Tapajós National Forest in Pará state was surveyed to quantify AGB growth. The quad-pol data were explored by testing two regression methods: non-linear (NL) and multiple linear regression (MLR) models. We also evaluated the influence of past land-use on AGB retrieval through correlation analysis. The results showed that the biophysical variables were positively correlated with the volumetric scattering, meaning that SF areas presented greater volumetric scattering contribution with increasing forest age. Mean diameter, mean tree height, basal area, species density, and AGB were significant and had the highest Pearson coefficients with the Cloude decomposition (λ₃), which, in turn, refers to the volumetric backscattering contribution from the cross-polarization (HV) channel (ρ = 0.57–0.66, p-value < 0.001). On the other hand, historical use (PALU and FC) showed the highest correlation with angular decompositions, with the Touzi target phase angle (Φs) showing the strongest association (ρ = 0.37 and ρ = 0.38, respectively). The combination of multiple prediction variables with MLR improved the AGB estimation by 70% compared to the NL model (adj. R² = 0.51; RMSE = 38.7 Mg ha⁻¹; bias = 2.1 ± 37.9 Mg ha⁻¹) by incorporating the angular decompositions, related to historical use, and the volumetric scattering contribution, related to forest structure, into the model. The MLR model uses six variables; the selected polarimetric attributes were strongly related to different structural parameters, such as mean forest diameter, basal area, and mean tree height, rather than to AGB itself, as had been expected. The uncertainty was estimated to be 18.6%, considering all methodological steps of the MLR model. This approach helped us to better understand the relationship between parameters derived from SAR data and forest structure, and its relation to the growth of secondary forest after deforestation events.
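The MLR step — regressing AGB on several polarimetric attributes — can be sketched with ordinary least squares; the predictors, coefficients, and noise below are synthetic, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 42  # 42 secondary-forest plots, as in the study; predictors are hypothetical

# Columns stand in for six polarimetric attributes (e.g. lambda_3 volumetric
# scattering, Touzi phase angle, and four others); values are synthetic.
X = rng.normal(size=(n, 6))
true_beta = np.array([12.0, 6.0, 3.0, 0.0, 1.5, -2.0])
agb = 60.0 + X @ true_beta + rng.normal(0.0, 5.0, n)  # Mg/ha, synthetic

# Multiple linear regression: ordinary least squares with an intercept column
A = np.column_stack([np.ones(n), X])
beta_hat, *_ = np.linalg.lstsq(A, agb, rcond=None)
pred = A @ beta_hat
rmse = float(np.sqrt(np.mean((agb - pred) ** 2)))
```

With only 42 plots and 7 fitted parameters, in-sample RMSE understates out-of-sample error, which is one reason the study propagates uncertainty across all methodological steps.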

20 pages, 361 KiB  
Article
Fusion Estimation from Multisensor Observations with Multiplicative Noises and Correlated Random Delays in Transmission
by Raquel Caballero-Águila, Aurora Hermoso-Carazo and Josefa Linares-Pérez
Mathematics 2017, 5(3), 45; https://doi.org/10.3390/math5030045 - 4 Sep 2017
Cited by 13 | Viewed by 3779
Abstract
In this paper, the information fusion estimation problem is investigated for a class of multisensor linear systems affected by different kinds of stochastic uncertainties, using both the distributed and the centralized fusion methodologies. It is assumed that the measured outputs are perturbed by one-step autocorrelated and cross-correlated additive noises, and also stochastic uncertainties caused by multiplicative noises and randomly missing measurements in the sensor outputs are considered. At each sampling time, every sensor output is sent to a local processor and, due to some kind of transmission failures, one-step correlated random delays may occur. Using only covariance information, without requiring the evolution model of the signal process, a local least-squares (LS) filter based on the measurements received from each sensor is designed by an innovation approach. All these local filters are then fused to generate an optimal distributed fusion filter by a matrix-weighted linear combination, using the LS optimality criterion. Moreover, a recursive algorithm for the centralized fusion filter is also proposed and the accuracy of the proposed estimators, which is measured by the estimation error covariances, is analyzed by a simulation example.
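A matrix-weighted linear combination of local estimates can be sketched in the special case where the weights are inverse error covariances; this ignores the cross-covariances that the paper's LS-optimal matrix weights account for:

```python
import numpy as np

def weighted_fusion(estimates, covariances):
    # Matrix-weighted fusion with inverse-covariance (information) weights.
    # Assumes negligible cross-covariances between local errors -- a
    # simplification of the LS-optimal weighting used in the paper.
    info = [np.linalg.inv(P) for P in covariances]
    P_fused = np.linalg.inv(sum(info))
    x_fused = P_fused @ sum(I @ x for I, x in zip(info, estimates))
    return x_fused, P_fused

# Two hypothetical scalar local estimates with equal accuracy
x1, P1 = np.array([2.0]), np.array([[1.0]])
x2, P2 = np.array([4.0]), np.array([[1.0]])
x_f, P_f = weighted_fusion([x1, x2], [P1, P2])
```

Equal covariances make the fused estimate the plain average (3.0) with half the variance, the expected behavior for independent, equally accurate sensors.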
(This article belongs to the Special Issue Stochastic Processes with Applications)

17 pages, 658 KiB  
Article
Fairness and Trust in Structured Populations
by Corina E. Tarnita
Games 2015, 6(3), 214-230; https://doi.org/10.3390/g6030214 - 20 Jul 2015
Cited by 18 | Viewed by 6826
Abstract
Classical economic theory assumes that people are rational and selfish, but behavioral experiments often point to inconsistent behavior, typically attributed to “other-regarding preferences.” The Ultimatum Game, used to study fairness, and the Trust Game, used to study trust and trustworthiness, have been two of the most influential and well-studied examples of inconsistent behavior. Recently, evolutionary biologists have attempted to explain the evolution of such preferences using evolutionary game theoretic models. While deterministic evolutionary game theoretic models agree with the classical economics predictions, recent stochastic approaches that include uncertainty and the possibility of mistakes have been successful in accounting for both the evolution of fairness and the evolution of trust. Here I explore the role of population structure by generalizing and expanding these existing results to the case of non-random interactions. This is a natural extension since such interactions do not occur randomly in the daily lives of individuals. I find that, in the limit of weak selection, population structure increases the space of fair strategies that are selected for, but it has little to no effect on the optimum strategy played in the Ultimatum Game. In the Trust Game, in the limit of weak selection, I find that some amount of trust and trustworthiness can evolve even in a well-mixed population; however, the optimal strategy, although trusting if the return on investment is sufficiently high, is never trustworthy. Population structure biases selection towards strategies that are both trusting and trustworthy and reduces the critical return threshold, but, much like in the case of fairness, it does not affect the winning strategy. Further considering the effects of reputation and structure, I find that they act synergistically to promote the evolution of trustworthiness.
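The Ultimatum Game strategies discussed above can be encoded as (offer, acceptance-threshold) pairs; the payoff rule below is the standard one-shot game, with illustrative strategy values:

```python
def ultimatum_payoff(s1, s2):
    # One-shot Ultimatum Game between strategies s = (p, q): offer fraction p
    # as proposer; as responder, accept any offer >= q. Each player proposes
    # once and responds once; payoffs from both rounds are summed.
    (p1, q1), (p2, q2) = s1, s2
    pay1 = pay2 = 0.0
    if p1 >= q2:              # player 1 proposes, player 2 accepts
        pay1 += 1 - p1
        pay2 += p1
    if p2 >= q1:              # player 2 proposes, player 1 accepts
        pay1 += p2
        pay2 += 1 - p2
    return pay1, pay2

rational = (0.01, 0.0)  # minimal offer, accept anything (classical prediction)
fair = (0.5, 0.4)       # fair offer, reject low offers
```

Two fair players split everything (payoff 1.0 each), while the rational strategy's minimal offer is rejected by a fair responder, which is the mechanism that lets fairness survive under stochastic evolutionary dynamics.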
(This article belongs to the Special Issue Cooperation, Trust, and Reciprocity)