Search Results (551)

Search Parameters:
Keywords = generalized extreme-value distribution

55 pages, 28544 KB  
Article
Spatial Flows of Information Entropy as Indicators of Climate Variability and Extremes
by Bernard Twaróg
Entropy 2025, 27(11), 1132; https://doi.org/10.3390/e27111132 - 31 Oct 2025
Abstract
The objective of this study is to analyze spatial entropy flows that reveal the directional dynamics of climate change—patterns that remain obscured in traditional statistical analyses. This approach enables the identification of pathways for “climate information transport”, highlights associations with atmospheric circulation types, and allows for the localization of both sources and “informational voids”—regions where entropy is dissipated. The analytical framework is grounded in a quantitative assessment of long-term climate variability across Europe over the period 1901–2010, utilizing Shannon entropy as a measure of atmospheric system uncertainty and variability. The underlying assumption is that the variability of temperature and precipitation reflects the inherently dynamic character of climate as a nonlinear system prone to fluctuations. The study focuses on calculating entropy estimated within a 70-year moving window for each calendar month, using bivariate distributions of temperature and precipitation modeled with copula functions. Marginal distributions were selected based on the Akaike Information Criterion (AIC). To improve the accuracy of the estimation, a block bootstrap resampling technique was applied, along with numerical integration to compute the Shannon entropy values at each of the 4165 grid points with a spatial resolution of 0.5° × 0.5°. The results indicate that entropy and its derivative are complementary indicators of atmospheric system instability—entropy proving effective in long-term diagnostics, while its derivative provides insight into the short-term forecasting of abrupt changes. A lag analysis and Spearman rank correlation between entropy values and their potential supported the investigation of how circulation variability influences the occurrence of extreme precipitation events. Particularly noteworthy is the temporal derivative of entropy, which revealed strong nonlinear relationships between local dynamic conditions and climatic extremes. A spatial analysis of the information entropy field was also conducted, revealing distinct structures with varying degrees of climatic complexity on a continental scale. This field appears to be clearly structured, reflecting not only the directional patterns of change but also the potential sources of meteorological fluctuations. A field-theory-based spatial classification allows for the identification of transitional regions—areas with heightened susceptibility to shifts in local dynamics—as well as entropy source and sink regions. The study is embedded within the Fokker–Planck formalism, wherein the change in the stochastic distribution characterizes the rate of entropy production. In this context, regions of positive divergence are interpreted as active generators of variability, while sink regions function as stabilizing zones that dampen fluctuations. Full article
(This article belongs to the Special Issue 25 Years of Sample Entropy)
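As an illustration of the copula-based entropy estimation this abstract describes, the sketch below computes the Shannon (differential) entropy of a bivariate temperature-precipitation distribution built from parametric marginals and a copula, via numerical integration. The marginal families, the Gaussian copula, and all parameter values are illustrative assumptions; the article selects marginals by AIC, chooses copula families per grid cell, and adds block-bootstrap resampling, none of which is reproduced here.

```python
# Hedged sketch: differential Shannon entropy of a bivariate (temperature, precipitation)
# distribution assembled from assumed marginals and a Gaussian copula, integrated on a grid.
import numpy as np
from scipy import stats

rho = 0.4                                   # assumed copula correlation
f_t = stats.norm(loc=10.0, scale=5.0)       # assumed temperature marginal
f_p = stats.gamma(a=2.0, scale=30.0)        # assumed precipitation marginal

def gaussian_copula_density(u, v, rho):
    z1, z2 = stats.norm.ppf(u), stats.norm.ppf(v)
    num = 2 * rho * z1 * z2 - rho**2 * (z1**2 + z2**2)
    return np.exp(num / (2 * (1 - rho**2))) / np.sqrt(1 - rho**2)

# Grid over the effective support of both marginals
t = np.linspace(f_t.ppf(1e-4), f_t.ppf(1 - 1e-4), 400)
p = np.linspace(f_p.ppf(1e-4), f_p.ppf(1 - 1e-4), 400)
T, P = np.meshgrid(t, p, indexing="ij")

# Joint density via Sklar's theorem: f(t,p) = c(F_t(t), F_p(p)) * f_t(t) * f_p(p)
joint = gaussian_copula_density(f_t.cdf(T), f_p.cdf(P), rho) * f_t.pdf(T) * f_p.pdf(P)

# Shannon entropy H = -double integral of f ln f, by the trapezoidal rule
integrand = np.where(joint > 0, joint * np.log(joint), 0.0)
H = -np.trapz(np.trapz(integrand, p, axis=1), t)
print(f"joint Shannon entropy = {H:.3f} nats")
```
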
18 pages, 5442 KB  
Article
Tail-Aware Forecasting of Precipitation Extremes Using STL-GEV and LSTM Neural Networks
by Haoyu Niu, Samantha Murray, Fouad Jaber, Bardia Heidari and Nick Duffield
Hydrology 2025, 12(11), 284; https://doi.org/10.3390/hydrology12110284 - 30 Oct 2025
Abstract
Accurate prediction of extreme precipitation events remains a critical challenge in hydrological forecasting due to their rare occurrence and complex statistical behavior. These extreme events are becoming more frequent and intense under the influence of climate change. Their unpredictability not only hampers water resource management and disaster preparedness but also leads to disproportionate impacts on vulnerable communities and critical infrastructure. Therefore, in this article, we introduce a hybrid modeling framework that combines Generalized Extreme Value (GEV) distribution fitting with deep learning models to forecast monthly maximum precipitation extremes. Long Short-Term Memory (LSTM) models are proposed to predict the cumulative distribution function (CDF) values of the GEV-fitted remainder series. This approach transforms the forecasting problem into a bounded probabilistic learning task, improving model stability and interpretability. Crucially, a tail-weighted loss function is designed to emphasize rare but high-impact events in the training process, addressing the inherent class imbalance in extreme precipitation predictions. Results demonstrate strong predictive performance in both the CDF and residual domains, with the proposed model accurately identifying anomalously high precipitation months. This hybrid GEV–deep learning approach offers a promising solution for early warning systems and long-term climate resilience planning in hydrologically sensitive regions. Full article
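Two ingredients named in this abstract can be sketched compactly: mapping a GEV-fitted remainder series to bounded CDF targets, and a tail-weighted squared-error loss that up-weights rare high-CDF months. The weighting form, threshold, and synthetic data below are illustrative assumptions, not the authors' exact loss or data.

```python
# Hedged sketch: GEV-based CDF targets plus a tail-weighted loss for extreme months.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
remainder = genextreme.rvs(c=-0.1, loc=50, scale=15, size=240, random_state=rng)

# (i) Fit a GEV and convert the series to CDF values in (0, 1)
shape, loc, scale = genextreme.fit(remainder)
u_true = genextreme.cdf(remainder, shape, loc=loc, scale=scale)

def tail_weighted_mse(u_pred, u_true, threshold=0.9, tail_weight=5.0):
    """Squared error with extra weight on months whose true CDF exceeds `threshold`."""
    w = np.where(u_true > threshold, tail_weight, 1.0)
    return np.mean(w * (u_pred - u_true) ** 2)

# (ii) A naive climatological prediction (the mean CDF) is penalised much more heavily
# once the tail months are up-weighted
u_naive = np.full_like(u_true, u_true.mean())
print(tail_weighted_mse(u_naive, u_true))   # tail-weighted loss
print(np.mean((u_naive - u_true) ** 2))     # plain MSE, for comparison
```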

27 pages, 10653 KB  
Article
Intensified Rainfall, Growing Floods: Projecting Urban Drainage Challenges in South-Central China Under Climate Change Scenarios
by Zhengduo Bao, Yuxuan Wu, Weining He, Nian She and Zhenjun Li
Appl. Sci. 2025, 15(21), 11577; https://doi.org/10.3390/app152111577 - 29 Oct 2025
Viewed by 159
Abstract
Global climate change is intensifying extreme rainfall, exacerbating urban flood risks, and undermining the effectiveness of urban stormwater drainage systems (USDS) designed under stationary climate assumptions. While prior studies have identified general trends of increasing flood risk under climate change, they lack actionable connections between climate projections and practical flood risk assessment, specifically quantifiable forecasts of extreme rainfall for defined return periods and integrated frameworks linking climate modeling to hydrological simulation at the watershed scale. This study addresses these gaps by developing an integrated framework to assess USDS resilience under future climate scenarios, demonstrated through a case study in Changsha City, China. The framework combines dynamic downscaling of the MRI-CGCM3 global climate model using the Weather Research and Forecasting (WRF) model to generate high-resolution precipitation data, non-stationary frequency analysis via the Generalized Extreme Value (GEV) distribution to project future rainfall intensities (for 2–200-year return periods in the 2040s and 2060s), and a 1D-2D coupled urban flood model built in InfoWorks ICM to evaluate flood risk. Key findings reveal substantial intensification of extreme rainfall, particularly for long-return-period events, with the 24 h rainfall depth for 200-year events projected to increase by 32% by the 2060s. Flood simulations show significant escalation in risk: for 100-year events, the area with ponding depth > 500 mm grows from 1.38% (2020s) to 1.62% (2060s), and the 300–500 mm ponding zone expands by 21%, with long-return-period events (≥20 years) driving most future risk increases. These results directly demonstrate the inadequacy of stationary design approaches for USDS, a finding that carries substantial applied significance for policymakers and stakeholders. Specifically, it underscores the urgent need for these key actors to update engineering standards by adopting non-stationary intensity-duration-frequency (IDF) curves and to integrate Sustainable Urban Drainage Systems (SUDS) into formal flood management strategies. Full article
(This article belongs to the Special Issue Resilient Cities in the Context of Climate Change)
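The building block behind frequency analysis of this kind is the GEV return level: the T-year rainfall depth z_T solves F(z_T) = 1 - 1/T. A minimal sketch follows, using synthetic placeholder data and a stationary fit only; note that SciPy's genextreme shape parameter c equals the negative of the usual GEV shape.

```python
# Hedged sketch: T-year return levels of 24 h rainfall from a stationary GEV fit.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(1)
annual_max_24h = genextreme.rvs(c=-0.15, loc=90, scale=25, size=60, random_state=rng)  # mm

c, loc, scale = genextreme.fit(annual_max_24h)

for T in (2, 10, 50, 100, 200):
    # Return level z_T solves F(z_T) = 1 - 1/T
    z_T = genextreme.ppf(1 - 1 / T, c, loc=loc, scale=scale)
    print(f"{T:>4}-year 24 h rainfall = {z_T:6.1f} mm")
```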

24 pages, 1409 KB  
Article
A Lower-Bounded Extreme Value Distribution for Flood Frequency Analysis with Applications
by Fatimah E. Almuhayfith, Maher Kachour, Amira F. Daghestani, Zahid Ur Rehman, Tassaddaq Hussain and Hassan S. Bakouch
Mathematics 2025, 13(21), 3378; https://doi.org/10.3390/math13213378 - 23 Oct 2025
Viewed by 292
Abstract
This paper proposes the lower-bounded Fréchet–log-logistic distribution (LFLD), a probability model designed for robust flood frequency analysis (FFA). The LFLD addresses key limitations of traditional distributions (e.g., generalized extreme value (GEV) and log-Pearson Type III (LP3)) by combining lower-bounded support (α < x < ∞) to reflect physical flood thresholds, flexible tail behavior via Fréchet–log-logistic fusion for extreme-value accuracy, and maximum entropy characterization, ensuring optimal parameter estimation. Thus, we obtain the LFLD’s main statistical properties (PDF, CDF, and hazard rate), prove its asymptotic convergence to Fréchet distributions, and validate its superiority through simulation studies showing MLE consistency (bias < 0.02 and mean squared error < 0.0004 for α) and empirical flood data tests (52- and 98-year AMS series), where the LFLD outperforms 10 competitors (AIC reductions of 15–40%; Vuong test p < 0.01). The LFLD’s closed-form quantile function enables efficient return period estimation, critical for infrastructure planning. Results demonstrate its applicability to heavy-tailed, bounded hydrological data, offering a 20–30% improvement in flood magnitude prediction over LP3/GEV models. Full article
(This article belongs to the Special Issue Reliability Estimation and Mathematical Statistics)
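The model-comparison step in studies like this, ranking candidate flood-frequency distributions by AIC on an annual maximum series (AMS), can be sketched as below. The proposed LFLD is not available in SciPy, so standard candidates (GEV, lognormal, Pearson III) stand in, and the AMS values are synthetic placeholders.

```python
# Hedged sketch: AIC ranking of candidate flood-frequency distributions on an AMS.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
ams = stats.genextreme.rvs(c=-0.2, loc=300, scale=120, size=52, random_state=rng)  # m^3/s

candidates = {"GEV": stats.genextreme, "lognormal": stats.lognorm, "Pearson III": stats.pearson3}

for name, dist in candidates.items():
    params = dist.fit(ams)                      # maximum likelihood fit
    loglik = np.sum(dist.logpdf(ams, *params))
    aic = 2 * len(params) - 2 * loglik
    print(f"{name:12s} AIC = {aic:8.1f}")
```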

16 pages, 1176 KB  
Article
Flood Frequency Analysis Using the Bivariate Logistic Model with Non-Stationary Gumbel and GEV Marginals
by Laura Berbesi-Prieto and Carlos Escalante-Sandoval
Hydrology 2025, 12(11), 274; https://doi.org/10.3390/hydrology12110274 - 22 Oct 2025
Viewed by 280
Abstract
Flood frequency analysis is essential for designing resilient hydraulic infrastructure, but traditional stationary models fail to capture the influence of climate variability and land-use change. This study applies a bivariate logistic model with non-stationary marginals to eight gauging stations in Sinaloa, Mexico, each with over 30 years of maximum discharge records. We compared stationary and non-stationary Gumbel and Generalized Extreme Value (GEV) distributions, along with their bivariate combinations. Results show that the non-stationary bivariate GEV–Gumbel distribution provided the best overall performance according to AIC. Importantly, GEV and Gumbel marginals captured site-specific differences: GEV was most suitable for sites with highly variable extremes, while Gumbel offered a robust fit for more regular records. At station 10086, where a significant increasing trend was detected by the Mann–Kendall and Spearman tests, the stationary GEV estimated a 50-year return flow of 772.66 m³/s, while the non-stationary model projected 861.00 m³/s for 2075. Under stationary assumptions this discharge would be underestimated, as the event would recur roughly every 30 years by 2075. These findings demonstrate that ignoring non-stationarity leads to systematic underestimation of design floods, while non-stationary bivariate models provide more reliable, policy-relevant estimates for climate adaptation and infrastructure safety. Full article
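A common way to build a non-stationary GEV marginal of the kind used here is to let the location parameter drift linearly with time, mu(t) = mu0 + mu1·t, and fit by maximum likelihood. The sketch below uses synthetic discharge data and does not reproduce the bivariate logistic coupling described in the abstract.

```python
# Hedged sketch: non-stationary GEV with a linear trend in the location parameter.
import numpy as np
from scipy.stats import genextreme
from scipy.optimize import minimize

rng = np.random.default_rng(3)
t = np.arange(40)                                             # years since start of record
q = genextreme.rvs(c=-0.1, loc=400 + 3.0 * t, scale=90, size=40, random_state=rng)

def neg_loglik(theta):
    mu0, mu1, log_scale, xi = theta
    mu = mu0 + mu1 * t
    # SciPy's shape c equals -xi in the usual GEV parameterisation
    return -np.sum(genextreme.logpdf(q, -xi, loc=mu, scale=np.exp(log_scale)))

res = minimize(neg_loglik, x0=[np.mean(q), 0.0, np.log(np.std(q)), 0.1], method="Nelder-Mead")
mu0, mu1, log_scale, xi = res.x
print(f"trend in location: {mu1:.2f} m^3/s per year, shape xi = {xi:.2f}")
```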

22 pages, 1811 KB  
Article
Hierarchical Construction of Fuzzy Signature Models for Non-Destructive Assessment of Masonry Strength
by András Kaszás, Vanda O. Pomezanski and László T. Kóczy
Symmetry 2025, 17(10), 1764; https://doi.org/10.3390/sym17101764 - 19 Oct 2025
Viewed by 253
Abstract
Non-destructive testing methods are essential in civil engineering applications, such as evaluating the compressive strength of masonry. This paper presents a fuzzy signature model based on non-destructive in situ measurements and visual inspection, applying weighted geometric mean aggregation in the signature vertices determined by experts. The weights of the aggregation terms were optimized using the Monte Carlo method, genetic algorithm, and particle swarm algorithm to ensure that the evaluation by the signature aligned with the results of destructive tests performed on existing masonry. The results of the methods were compared for single and multiple assembled masonry structures using the same objective function. All three methods provided relatively high confidence in finding the extreme values of the objective function on a generated dataset, which accounted for the correlations observed in actual measurements. Accordingly, validation based on real data yielded the expected results, thus demonstrating the model's suitability for practical application. This study also analyzed whether symmetric or asymmetric weight distributions affected evaluation consistency. While symmetric weighting simplified aggregation, asymmetry allowed local structural irregularities to be highlighted. In addition, the cost analysis of the optimization methods revealed a disparity in computational cost increments between the approaches. The presented work outlines the advantages of the different methods and their applicability to structural assessment. Full article
(This article belongs to the Section Engineering and Materials)
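The aggregation operator named in this abstract, a weighted geometric mean over the child memberships of a fuzzy-signature vertex, is compact enough to sketch directly. The membership values and weights below are placeholders; in the article the weights are tuned by Monte Carlo, genetic, and particle swarm optimisation against destructive test results.

```python
# Hedged sketch: weighted geometric mean aggregation at one fuzzy-signature vertex.
import numpy as np

def weighted_geometric_mean(memberships, weights):
    """Aggregate child memberships m_i in (0, 1] with weights w_i (normalised to sum to 1)."""
    m = np.asarray(memberships, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return float(np.exp(np.sum(w * np.log(m))))

# Example vertex: rebound-hammer reading, visual mortar condition, unit (brick) condition
print(weighted_geometric_mean([0.72, 0.55, 0.80], weights=[0.5, 0.2, 0.3]))
```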

24 pages, 3070 KB  
Article
Examining the Probabilistic Characteristics of Maximum Rainfall in Türkiye
by Ibrahim Temel, Omer Levend Asikoglu and Harun Alp
Atmosphere 2025, 16(10), 1177; https://doi.org/10.3390/atmos16101177 - 11 Oct 2025
Viewed by 373
Abstract
Hydrologists need to predict extreme hydrological and meteorological events for design purposes; the magnitude and probability of such events are estimated using a probability distribution function (PDF). The choice of an appropriate PDF is crucial in describing the behavior of the phenomenon, and predictions can differ significantly depending on the PDF chosen. The success of the PDF in representing extreme value series of natural events in fields such as hydrology and climatology is therefore of great importance. Depending on whether the series consists of maximum or minimum values, the theoretical probability density function must be appropriately fit to the right or left tail of the extreme data, which contains the most critical information. This study includes a combined evaluation of the performance of four different tests for selecting the appropriate probability distribution of maximum rainfall in Türkiye: the Kolmogorov–Smirnov (KS) test, the Anderson–Darling (AD) test, the Probability Plot Correlation Coefficient (PPCC) test, and the L-Moments ZDIST test. Within the scope of the study, maximum rainfall series of seven rainfall durations from 15 to 1440 min, at rain gauge stations in 81 provinces of Türkiye, were examined. Goodness of fit was assessed through a ranking that combined the four numerical tests (KS, AD, PPCC, ZDIST). The probabilistic character of maximum rainfall was evaluated using a large dataset consisting of 567 time series with record lengths ranging from 45 to 80 years. The goodness of fit of distributions was examined from three different perspectives: the first considers rainfall durations, the second is a province-based examination, and the third is a general country-based assessment. In all three perspectives, the Wakeby distribution was determined as the best-fitting candidate to represent the maximum rainfall in Türkiye. Full article
(This article belongs to the Section Meteorology)
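Two of the four selection criteria named above are easy to sketch for a single candidate distribution: the Kolmogorov-Smirnov statistic and the probability plot correlation coefficient (PPCC) for a GEV fitted to an annual-maximum rainfall series. The series is a synthetic placeholder, and KS p-values computed with parameters estimated from the same data are only approximate.

```python
# Hedged sketch: KS statistic and PPCC for a fitted GEV on a maximum rainfall series.
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
rain = stats.genextreme.rvs(c=-0.1, loc=35, scale=12, size=60, random_state=rng)  # mm

params = stats.genextreme.fit(rain)

# Kolmogorov-Smirnov statistic against the fitted GEV
ks = stats.kstest(rain, "genextreme", args=params)

# PPCC: correlation between ordered observations and fitted quantiles at plotting positions
n = len(rain)
pp = (np.arange(1, n + 1) - 0.44) / (n + 0.12)          # Gringorten plotting positions
ppcc = np.corrcoef(np.sort(rain), stats.genextreme.ppf(pp, *params))[0, 1]

print(f"KS statistic = {ks.statistic:.3f}, PPCC = {ppcc:.4f}")
```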

21 pages, 19839 KB  
Article
Development of a Reduced Order Model for Turbine Blade Cooling Design
by Andrea Pinardi, Noraiz Mushtaq and Paolo Gaetani
Int. J. Turbomach. Propuls. Power 2025, 10(4), 37; https://doi.org/10.3390/ijtpp10040037 - 8 Oct 2025
Viewed by 450
Abstract
Rotating detonation engines (RDEs) are expected to have higher specific work and efficiency, but the high-temperature transonic flow delivered by the combustor poses relevant design and technological difficulties. This work proposes a 1D model for turbine internal cooling design which can be used to explore multiple design options during the preliminary design of the cooling system. Being based on an energy balance applied to an infinitesimal control volume, the model is general and can be adapted to other applications. The model is applied to design a cooling system for a pre-existing stator blade geometry. Both the inputs and the outputs of the 1D simulation are in good agreement with the values found in the literature. Subsequently, 1D results are compared to a full conjugate heat transfer (CHT) simulation. The agreement on the internal heat transfer coefficient is excellent and is entirely within the uncertainty of the correlation. Although agreement with the thermal power distribution proved more difficult to achieve, the Mach number, the total pressure drop, and the coolant temperature increase in the cooling channels are accurately predicted by the 1D code, thus confirming its value as a preliminary design tool. To guarantee the integrity of the blade at the extremities, a cooling solution with coolant injection at the leading and trailing edge is studied. A finite element analysis of the cooled blade ensures the structural feasibility of the cooling system. The computational economy of the 1D code is then exploited to perform a global sensitivity analysis using a polynomial chaos expansion (PCE) surrogate model to compute Sobol’ indices. Full article

14 pages, 427 KB  
Article
Performance Modeling of Cloud Systems by an Infinite-Server Queue Operating in Rarely Changing Random Environment
by Svetlana Moiseeva, Evgeny Polin, Alexander Moiseev and Janos Sztrik
Future Internet 2025, 17(10), 462; https://doi.org/10.3390/fi17100462 - 8 Oct 2025
Viewed by 328
Abstract
This paper considers a heterogeneous queuing system with an unlimited number of servers, where the parameters are determined by a random environment. A distinctive feature is that the parameters of the exponential distribution of the request processing time do not change their values until the end of service. Thus, the devices in the system under consideration are heterogeneous. For the study, a method of asymptotic analysis is proposed under the condition of extremely rare changes in the states of the random environment. We consider the following problem: a cloud node accepts requests of one type with a similar arrival intensity and processing duration. Occasionally, an input scheduler switches to accepting requests of another type with a different intensity and processing duration. We model the system as an infinite-server queue in a random environment, which influences the arrival intensity and service time of new requests. The random environment is modeled by a Markov chain with a finite number of states. Arrivals are modeled as a Poisson process with intensity dependent on the state of the random environment. Service times are exponentially distributed with rates also dependent on the state of the random environment at the time moment when the request arrived. When the environment changes its state, requests that are already in the system do not change their service times. So, we have requests of different types (serviced with different rates) present in the system at the same time. For the study, we consider a situation where changes of the random environment occur rarely, and the method of asymptotic analysis is applied under the condition of a rarely changing random environment (the entries of the generator of the corresponding Markov chain tend to zero). A multi-dimensional joint steady-state probability distribution of the number of requests of different types present in the system is obtained. Several numerical examples illustrate the comparisons of asymptotic results to simulations. Full article
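The system described here can be checked against a small discrete-event simulation: an infinite-server queue whose arrival rate and service rate depend on a slowly switching two-state environment, with each request keeping the service rate of the state in which it arrived. The rates below are illustrative placeholders; the article derives the joint distribution analytically by asymptotic analysis rather than by simulation alone.

```python
# Hedged sketch: simulation of an infinite-server queue in a rarely changing random environment.
import heapq
import numpy as np

rng = np.random.default_rng(5)
lam = [5.0, 12.0]        # arrival rates per environment state
mu = [1.0, 0.5]          # service rates per environment state (fixed at arrival)
switch = 0.01            # environment switching rate (rare changes)
T_END = 10_000.0

t, env = 0.0, 0
departures = []          # min-heap of (departure time, request type)
counts = np.zeros(2)     # time-integrated number in system per request type
next_switch = rng.exponential(1 / switch)

while t < T_END:
    next_arrival = t + rng.exponential(1 / lam[env])   # valid re-draw: exponential is memoryless
    next_dep = departures[0][0] if departures else np.inf
    t_next = min(next_arrival, next_dep, next_switch, T_END)
    in_system = np.bincount([k for _, k in departures], minlength=2)
    counts += in_system * (t_next - t)                  # accumulate time-weighted head counts
    t = t_next
    if t == next_switch:
        env = 1 - env
        next_switch = t + rng.exponential(1 / switch)
    elif t == next_dep:
        heapq.heappop(departures)
    elif t == next_arrival:
        heapq.heappush(departures, (t + rng.exponential(1 / mu[env]), env))

print("mean number in system by type:", counts / T_END)
```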

28 pages, 3034 KB  
Review
Review of Thrust Vectoring Technology Applications in Unmanned Aerial Vehicles
by Yifan Luo, Bo Cui and Hongye Zhang
Drones 2025, 9(10), 689; https://doi.org/10.3390/drones9100689 - 6 Oct 2025
Viewed by 1066
Abstract
Thrust vectoring technology significantly improves the manoeuvrability and environmental adaptability of unmanned aerial vehicles by dynamically regulating the direction and magnitude of thrust. In this paper, the principles and applications of mechanical thrust vectoring technology, fluidic thrust vectoring technology, and distributed electric propulsion systems are systematically reviewed. It is shown that mechanical vector nozzles achieve high-precision control but impose a structural burden, fluidic thrust vectoring improves response speed through a design with no moving parts but incurs thrust losses, and distributed electric propulsion improves hovering efficiency compared with a traditional helicopter. Addressing multi-physics coupling and non-linear control challenges in unmanned aerial vehicles, this paper elucidates the disturbance compensation advantages of self-disturbance rejection control technology and the optimal path generation capabilities of an enhanced path planning algorithm. These two approaches offer complementary technical benefits: the former ensures stable flight attitude, while the latter optimises flight trajectory efficiency. Through case studies such as the Skate demonstrator, the practical value of these technologies in enhancing UAV manoeuvrability and adaptability is further demonstrated. However, thermal management in extreme environments, energy efficiency, and the lack of standards remain engineering bottlenecks. In the future, breakthroughs in high-temperature-resistant materials and intelligent control architectures are needed to promote the development of UAVs towards ultra-autonomous operation. This paper provides a systematic reference for the theory and application of thrust vectoring technology. Full article

26 pages, 5202 KB  
Article
Time-Varying Bivariate Modeling for Predicting Hydrometeorological Trends in Jakarta Using Rainfall and Air Temperature Data
by Suci Nur Setyawati, Sri Nurdiati, I Wayan Mangku, Ionel Haidu and Mohamad Khoirun Najib
Hydrology 2025, 12(10), 252; https://doi.org/10.3390/hydrology12100252 - 26 Sep 2025
Viewed by 548
Abstract
Changes in rainfall patterns and irregular air temperature have become essential issues in analyzing hydrometeorological trends in Jakarta. This study aims to select the best copula among stationary and non-stationary copula models and to visualize and explore the relationship between rainfall and air temperature in order to predict hydrometeorological trends. The methods used include combining univariate Lognormal and Generalized Extreme Value (GEV) distributions with Clayton, Gumbel, and Frank copulas, as well as parameter estimation using the fminsearch algorithm, Markov Chain Monte Carlo (MCMC) simulation, and a combination of both. The results show that the best model is the non-stationary Clayton copula estimated using MCMC simulation, which has the lowest Akaike Information Criterion (AIC) value. This model effectively captures extreme dependence in the lower tail of the distribution, indicating a potential increase in extreme low events such as cold droughts. Visualization of the best model through contour plots shows a shifting center of the distribution over time. This study contributes to developing dynamic hydrometeorological models for adaptation planning of changing hydrometeorological trends in Indonesia. Full article
(This article belongs to the Special Issue Trends and Variations in Hydroclimatic Variables: 2nd Edition)
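The copula-fitting step in the stationary case can be sketched by estimating the Clayton copula parameter on rank-based pseudo-observations by maximum likelihood, with a bounded scalar search standing in for the fminsearch/MCMC estimation used in the article. The synthetic rainfall-temperature data and the stationary restriction are assumptions; the non-stationary version lets the parameter vary in time.

```python
# Hedged sketch: MLE of the Clayton copula parameter on pseudo-observations.
import numpy as np
from scipy.optimize import minimize_scalar
from scipy.stats import rankdata

rng = np.random.default_rng(6)
z = rng.standard_normal(300)                                  # shared factor inducing dependence
rain = np.exp(4 + 0.8 * z + 0.3 * rng.standard_normal(300))   # synthetic monthly rainfall
temp = 27 + 1.5 * z + rng.standard_normal(300)                # synthetic air temperature

# Pseudo-observations u, v in (0, 1)
u = rankdata(rain) / (len(rain) + 1)
v = rankdata(temp) / (len(temp) + 1)

def clayton_neg_loglik(theta):
    if theta <= 0:
        return np.inf
    s = u**(-theta) + v**(-theta) - 1
    # Clayton density: c(u,v) = (1+theta) (uv)^(-theta-1) s^(-2-1/theta)
    logc = np.log1p(theta) - (theta + 1) * (np.log(u) + np.log(v)) - (2 + 1 / theta) * np.log(s)
    return -np.sum(logc)

res = minimize_scalar(clayton_neg_loglik, bounds=(1e-4, 20), method="bounded")
print(f"Clayton theta = {res.x:.3f}, lower-tail dependence = {2 ** (-1 / res.x):.3f}")
```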

19 pages, 3619 KB  
Article
Surface Urban Heat Island Risk Index Computation Using Remote-Sensed Data and Meta Population Dataset on Naples Urban Area (Italy)
by Massimo Musacchio, Alessia Scalabrini, Malvina Silvestri, Federico Rabuffi and Antonio Costanzo
Remote Sens. 2025, 17(19), 3306; https://doi.org/10.3390/rs17193306 - 26 Sep 2025
Viewed by 636
Abstract
Extreme climate events such as heatwaves are becoming more frequent and pose serious challenges in cities. Urban areas are particularly vulnerable because built surfaces absorb and release heat, while human activities generate additional greenhouse gases. This increases health risks, making it crucial to study population exposure to heat stress. This research focuses on Naples, Italy’s most densely populated city, where intense human activity and unique geomorphological conditions influence local temperatures. The presence of a Surface Urban Heat Island (SUHI) is assessed by deriving high-resolution Land Surface Temperature (LST) in a time series ranging from 2013 to 2023, processed with the Statistical Mono Window (SMW) algorithm in the Google Earth Engine (GEE) environment. SMW needs brightness temperature (Tb) extracted from the Landsat 8 (L8) Thermal Infrared Sensor (TIRS), emissivity from the Advanced Spaceborne Thermal Emission and Reflection Radiometer Global Emissivity Database (ASTER GED), and atmospheric correction coefficients from the National Centers for Environmental Prediction/National Center for Atmospheric Research (NCEP/NCAR) reanalysis. A total of 64 nighttime images were processed and analyzed to assess long-term trends and identify the main heat islands in Naples. The hottest image was compared with population data, including demographic categories such as children, elderly people, and pregnant women. A risk index was calculated by combining temperature values, exposure levels, and the vulnerability of each group. Results identified three major heat islands, showing that risk is strongly linked to both population density and heat island distribution. Incorporating Local Climate Zone (LCZ) classification further highlighted the urban areas most prone to extreme heat based on morphology. Full article

15 pages, 7345 KB  
Article
Increased Exposure Risk of Natural Reserves to Rainstorm in the Eastern Monsoon Region of China
by Yixuan Zhou, Hanming Cao, Lin Zhao and Shao Sun
Atmosphere 2025, 16(9), 1096; https://doi.org/10.3390/atmos16091096 - 18 Sep 2025
Viewed by 371
Abstract
Due to climate warming, extreme precipitation events have intensified in frequency and intensity. This trend has raised significant concerns about its impact on natural reserves in eastern China’s monsoon region. A risk assessment is, therefore, needed to evaluate the vulnerability of these protected areas. Based on observed and simulated daily precipitation data, this study analyzed the spatiotemporal trends of heavy rainfall in the eastern monsoon region of China and assessed the exposure risk of the protected areas to rainstorm events in both the historical and future periods. Results indicate that the annual average number of heavy rainfall days gradually increases from northwest to southeast, displaying a distinct zonal distribution pattern. The proportion of heavy rainfall days to total precipitation days and the average intensity of heavy rainfall show peak centers in the southeastern coastal areas, western Sichuan region, and North China Plain, with minimum values observed toward the northwest. Protected areas in China’s Eastern Monsoon Region display a north–south gradient of precipitation exposure risk that intensifies from the historical period (1995–2014) to the near future (2031–2050) and far future (2081–2100) under the SSP245 scenario, with the highest vulnerability in southeastern coastal areas. National reserves generally experience lower exposure than provincial and municipal ones, though all categories face increasing precipitation risks over time. Full article

35 pages, 4885 KB  
Article
Evaluating Sectoral Vulnerability to Natural Disasters in the US Stock Market: Sectoral Insights from DCC-GARCH Models with Generalized Hyperbolic Innovations
by Adriana AnaMaria Davidescu, Eduard Mihai Manta, Margareta-Stela Florescu, Robert-Stefan Constantin and Cristina Manole
Sustainability 2025, 17(18), 8324; https://doi.org/10.3390/su17188324 - 17 Sep 2025
Viewed by 800
Abstract
The escalating frequency and severity of natural disasters present significant challenges to the stability and sustainability of global financial systems, with the US stock market being especially vulnerable. This study examines sector-level exposure and contagion dynamics during climate-related disaster events, providing insights essential for sustainable investing and resilient financial planning. Using an advanced econometric framework—dynamic conditional correlation GARCH (DCC-GARCH) augmented with Generalized Hyperbolic Processes (GHPs) and an asymmetric specification (ADCC-GARCH)—we model daily stock returns for 20 publicly traded US companies across five sectors (insurance, energy, automotive, retail, and industrial) between 2017 and 2022. The results reveal considerable sectoral heterogeneity: insurance and energy sectors exhibit the highest vulnerability, with heavy-tailed return distributions and persistent volatility, whereas retail and selected industrial firms demonstrate resilience, including counter-cyclical behavior during crises. GHP-based models improve tail risk estimation by capturing return asymmetries, skewness, and leptokurtosis beyond Gaussian specifications. Moreover, the ADCC-GHP-GARCH framework shows that negative shocks induce more persistent correlation shifts than positive ones, highlighting asymmetric contagion effects during stress periods. By integrating sector-specific financial responses into a climate-disaster framework, this research supports the design of targeted climate risk mitigation strategies, sustainable investment portfolios, and regulatory stress-testing approaches that account for volatility clustering and tail dependencies. The findings contribute to the literature on financial resilience by providing a robust statistical basis for assessing how extreme climate events impact asset values, thereby informing both policy and practice in advancing sustainable economic development. Full article
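The DCC recursion at the core of this class of models is short enough to sketch: given residuals already standardised by their univariate GARCH volatilities, the quasi-correlation matrix Q_t is updated and rescaled into the dynamic correlation R_t. The parameters a and b and the residuals below are illustrative placeholders, not the estimated values from the study.

```python
# Hedged sketch: DCC(1,1) correlation recursion on standardised residuals for two assets.
import numpy as np

rng = np.random.default_rng(7)
eps = rng.standard_normal((1500, 2))            # standardised residuals (placeholder)
a, b = 0.03, 0.95                               # DCC parameters, with a + b < 1

Q_bar = np.cov(eps, rowvar=False)               # unconditional correlation target
Q = Q_bar.copy()
R_t = np.empty((len(eps), 2, 2))

for t in range(len(eps)):
    if t > 0:
        e = eps[t - 1][:, None]
        Q = (1 - a - b) * Q_bar + a * (e @ e.T) + b * Q
    d = np.diag(1.0 / np.sqrt(np.diag(Q)))
    R_t[t] = d @ Q @ d                          # dynamic conditional correlation matrix

print("last-period correlation:", R_t[-1][0, 1].round(3))
```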

13 pages, 7030 KB  
Article
Mean and Dispersion Regression Model for Extremes with Application in Humidity
by Lara Roberta da Silva Costa, Fernando Ferraz do Nascimento and Marcelo Bourguignon
Mathematics 2025, 13(18), 2993; https://doi.org/10.3390/math13182993 - 16 Sep 2025
Viewed by 952
Abstract
The generalized extreme value (GEV) distribution has been extensively applied in predicting extreme events across diverse fields, enabling accurate estimation of extreme quantiles of maxima. Although the mean and variance of the GEV can be expressed as functions of its parameters, a reparameterization that directly uses the mean and standard deviation as parameters provides interpretative advantages. This is particularly useful in regression-based models, as it allows a more straightforward interpretation of how covariates influence the mean and the standard deviation of the data. The proposed models are estimated within the Bayesian framework, assuming normal prior distributions for the regression coefficients. In this work, we introduce a reparameterization of the GEV distribution in terms of the mean and standard deviation, and apply it to minimum humidity data from the northeast region of Brazil. The results highlight the benefits of the proposed methodology, demonstrating clearer interpretability and practical relevance for extreme value modeling, where the influence of seasonal and locality variables on the mean and variance of minimum humidity can be accurately assessed. Full article
(This article belongs to the Special Issue Application of Regression Models, Analysis and Bayesian Statistics)
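The reparameterisation idea can be illustrated with the standard GEV moment formulas: for shape xi < 1/2 and xi != 0, the mean is mu + sigma(Gamma(1-xi) - 1)/xi and the variance is sigma^2(Gamma(1-2xi) - Gamma(1-xi)^2)/xi^2, so a given mean and standard deviation can be mapped back to location and scale. The sketch below is only this mapping under assumed values; the Bayesian regression on covariates described in the abstract is not reproduced.

```python
# Hedged sketch: converting a (mean, sd, shape) GEV specification back to (loc, scale).
import numpy as np
from scipy.special import gamma as G
from scipy.stats import genextreme

def gev_from_mean_sd(mean, sd, xi):
    """Return (loc, scale) of a GEV with the given mean, standard deviation and shape xi."""
    g1, g2 = G(1 - xi), G(1 - 2 * xi)
    scale = sd * abs(xi) / np.sqrt(g2 - g1**2)
    loc = mean - scale * (g1 - 1) / xi
    return loc, scale

loc, scale = gev_from_mean_sd(mean=45.0, sd=8.0, xi=-0.1)   # illustrative values only
# Cross-check with SciPy (its shape c equals -xi): sampled mean and sd should match
x = genextreme.rvs(0.1, loc=loc, scale=scale, size=200_000,
                   random_state=np.random.default_rng(8))
print(loc, scale, x.mean().round(2), x.std().round(2))
```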