Search Results (36)

Search Parameters:
Keywords = NLDA

35 pages, 10001 KB  
Article
Analytical Seismic Vulnerability and Performance Assessment of a Special-Importance Steel Building: Application Under the NCSE-02 Code
by Rocio Romero-Jaren, Laura Navas-Sanchez, Carlos Gamboa-Canté, Maria Belen Benito and Carmen Jaren
Appl. Sci. 2026, 16(3), 1515; https://doi.org/10.3390/app16031515 - 2 Feb 2026
Viewed by 333
Abstract
This study develops a comprehensive workflow for the analytical seismic vulnerability and structural performance assessment of a special-importance steel building located in a region of elevated seismic hazard in southern Spain. The work addresses the need for reliable analytical methodologies for facilities that must remain operational during earthquakes. The proposed framework integrates a probabilistic seismic hazard assessment, including uniform hazard spectra and hazard disaggregation to identify control earthquakes. Additionally, an analytical vulnerability assessment under the Spanish seismic design code, NCSE-02, is performed. Operational modal analysis and nonlinear analysis are combined to retrofit the numerical model of the building and capture the building’s realistic seismic response. The resulting demand spectra are derived from site-specific ground-motion scenarios for Los Barrios (Cádiz, Spain). Retrofitting strategies are designed and assessed to ensure compliance with the code-defined performance requirements. Results indicate that the retrofitted model reproduces the building’s dynamic behaviour with improved reliability, and that the strengthening interventions enhance seismic performance while still allowing moderate damage in specific components. These findings highlight the importance of analytical vulnerability approaches and code-oriented retrofitting when evaluating the seismic performance and vulnerability of essential facilities. The study demonstrates that rigorous analytical methods provide a robust basis for defining seismic vulnerability in special-importance buildings and support improved decision-making for structural safety and resilience. Full article
(This article belongs to the Special Issue Seismic Design and Analysis of Building Structures)

26 pages, 5454 KB  
Article
The Importance of Structural Configuration in the Seismic Performance and Reliability of Buildings
by Rodolfo J. Tirado-Gutiérrez, Ramón González-Drigo and Yeudy F. Vargas-Alzate
Infrastructures 2025, 10(12), 325; https://doi.org/10.3390/infrastructures10120325 - 26 Nov 2025
Cited by 1 | Viewed by 618
Abstract
The optimal performance of buildings strongly depends on their structural configuration, as it influences the structural response to expected loads during their service life. For instance, structural arrangements oriented to reduce torsional effects increase performance and, in turn, mitigate vulnerability to seismic events. However, several structural analyses should be performed to ensure that these structural arrangements are robust. This can be computationally expensive depending on the type of analysis. The objective of this research is twofold. The first objective is to compare the dynamic response of two reinforced concrete buildings that are almost identical in height and floor area but whose structural elements are placed differently. The dynamic response of both structures was calculated via nonlinear dynamic analysis (NLDA) by considering a large set of ground motion records. Second, NLDA results were compared with those stemming from a spectral-based methodology. The comparison is made on the basis of the fragility and damage functions given different return periods. The results show that an adequate spatial distribution of structural elements reduces materials and increases safety and stability, since the expected damage is lower. Likewise, it is observed that the results based on reduced-order procedures accurately represent those obtained from NLDA while entailing a significantly lower computational cost. Full article
(This article belongs to the Topic Resilient Civil Infrastructure, 2nd Edition)

27 pages, 8010 KB  
Article
Deep Learning-Based Short- and Mid-Term Surface and Subsurface Soil Moisture Projections from Remote Sensing and Digital Soil Maps
by Saman Rabiei, Ebrahim Babaeian and Sabine Grunwald
Remote Sens. 2025, 17(18), 3219; https://doi.org/10.3390/rs17183219 - 18 Sep 2025
Cited by 3 | Viewed by 1570
Abstract
Accurate real-time information about soil moisture (SM) at a large scale is essential for improving hydrological modeling, managing water resources, and monitoring extreme weather events. This study presents a framework using a convolutional long short-term memory (ConvLSTM) network to produce short- (1, 3, and 7 days ahead) and mid-term (14 and 30 days ahead) forecasts of SM at surface (0–10 cm) and subsurface (10–40 and 40–100 cm) soil layers across the contiguous U.S. The model was trained on five years (2018–2022) of data, including Soil Moisture Active Passive (SMAP) level 3 ancillary covariables, the North American Land Data Assimilation System phase 2 (NLDAS-2) SM product, shortwave infrared reflectance from the Moderate Resolution Imaging Spectroradiometer (MODIS), and terrain features (e.g., elevation, slope, curvature), as well as soil texture and bulk density maps from the Soil Landscape of the United States (SOLUS100) database. To develop and evaluate the model, the dataset was divided into three subsets: training (January 2018–January 2021), validation (2021), and testing (2022). The outputs were validated with observed in situ data from the Soil Climate Analysis Network (SCAN) and the United States Climate Reference Network (USCRN) soil moisture networks. The results indicated that the accuracy of SM forecasts decreased with increasing lead time, particularly in the surface (0–10 cm) and subsurface (10–40 cm) layers, where strong fluctuations driven by rainfall variability and evapotranspiration fluxes introduced greater uncertainty. Across all soil layers and lead times, the model achieved a median unbiased root mean square error (ubRMSE) of 0.04 cm3 cm−3 with a Pearson correlation coefficient of 0.61. Further, the performance of the model was evaluated with respect to both land cover and soil texture databases.
Forecast accuracy was highest in coarse-textured soils, followed by medium- and fine-textured soils, likely because the greater penetration depth of microwave observations improves SM retrieval in sandy soils. Among land cover types, performance was strongest in grasslands and savannas and weakest in dense forests and shrublands, where dense vegetation attenuates the microwave signal and reduces SM estimation accuracy. These results demonstrate that the ConvLSTM framework provides skillful short- and mid-term forecasts of surface and subsurface soil moisture, offering valuable support for large-scale drought and flood monitoring. Full article
(This article belongs to the Special Issue Earth Observation Satellites for Soil Moisture Monitoring)
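The validation metrics quoted above, ubRMSE and the Pearson correlation coefficient, are straightforward to compute. The sketch below shows one common formulation on synthetic data; the toy series and variable names are invented for illustration and are not the authors' code or data.

```python
import numpy as np

def ubrmse(pred, obs):
    """Unbiased RMSE: RMSE after removing the mean bias between the series."""
    bias = np.mean(pred - obs)
    return np.sqrt(np.mean((pred - obs - bias) ** 2))

def pearson_r(pred, obs):
    """Pearson correlation coefficient between two 1-D series."""
    return np.corrcoef(pred, obs)[0, 1]

# Synthetic soil-moisture series (cm3/cm3): a noisy forecast with a constant bias
rng = np.random.default_rng(0)
obs = 0.25 + 0.05 * np.sin(np.linspace(0, 6, 200))
pred = obs + 0.02 + rng.normal(0, 0.01, obs.size)

print(ubrmse(pred, obs), pearson_r(pred, obs))
```

Because ubRMSE subtracts the mean bias first, a forecast that is systematically offset but tracks the temporal dynamics well can still score close to the reported 0.04 cm3 cm−3.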

22 pages, 4857 KB  
Article
Evaluating an Ensemble-Based Machine Learning Approach for Groundwater Dynamics by Downscaling GRACE Data
by Zahra Ghaffari, Abdel Rahman Awawdeh, Greg Easson, Lance D. Yarbrough and Lucas James Heintzman
Limnol. Rev. 2025, 25(3), 39; https://doi.org/10.3390/limnolrev25030039 - 21 Aug 2025
Viewed by 1740
Abstract
Groundwater depletion poses a critical challenge to global water security, threatening ecosystems, agriculture, and sustainable development. The Mississippi Delta, a region heavily reliant on groundwater for agriculture, has experienced significant groundwater level declines due to intensive irrigation. Traditional in situ monitoring methods, while valuable, lack the spatial coverage necessary to capture regional groundwater dynamics comprehensively. This study addresses these limitations by leveraging downscaled Gravity Recovery and Climate Experiment (GRACE) data to estimate groundwater levels using random forest modeling (RFM). We applied a machine-learning approach, utilizing the “Forest-based and Boosted Classification and Regression” tool in ArcGIS Pro (ESRI, Redlands, CA), to predict groundwater levels for April and October over a 10-year period. The model was trained and validated with well-water level records from over 400 monitoring wells, incorporating input variables such as NDVI, temperature, precipitation, and NLDAS data. Cross-validation results demonstrate the model’s high accuracy, with R2 values confirming its robustness and reliability. The outputs reveal significant groundwater depletion in the central Mississippi Delta, with the lowest water levels observed in eastern Sunflower and western Leflore Counties. Notably, April 2014 recorded a minimum water level of 18.6 m, while October 2018 showed the lowest post-irrigation water level at 54.9 m. By integrating satellite data with machine learning, this research provides a framework for addressing regional water management challenges and advancing sustainable practices in water-stressed agricultural regions. Full article
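The study ran its random forest inside ArcGIS Pro; a rough open-source equivalent can be sketched with scikit-learn. Everything below (the covariate names, the synthetic water levels, the model settings) is an assumption for illustration, not the authors' workflow.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

# Synthetic stand-in: one row per monitoring well, with covariates
# NDVI, temperature, precipitation, and an NLDAS-derived soil-moisture value.
rng = np.random.default_rng(42)
n_wells = 400
X = rng.random((n_wells, 4))
y = 10 + 30 * X[:, 0] - 5 * X[:, 1] + rng.normal(0, 1, n_wells)  # water level (m)

model = RandomForestRegressor(n_estimators=200, random_state=0)
scores = cross_val_score(model, X, y, cv=5, scoring="r2")  # cross-validated R2
print(scores.mean())
```

Cross-validated R2, as used here, is the same kind of evidence the abstract cites for the model's robustness: it measures skill on wells held out of training.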

21 pages, 3947 KB  
Article
Combining Feature Extraction Methods and Categorical Boosting to Discriminate the Lettuce Storage Time Using Near-Infrared Spectroscopy
by Xuan Zhou, Xiaohong Wu, Zhihang Cao and Bin Wu
Foods 2025, 14(9), 1601; https://doi.org/10.3390/foods14091601 - 1 May 2025
Viewed by 1115
Abstract
Lettuce is a nutritious leafy vegetable, and its storage time has a significant impact on its nutrition and taste. Therefore, to classify lettuce samples with different storage times accurately and non-destructively, this study built classification models by combining several feature extraction methods with categorical boosting (CatBoost). Firstly, the near-infrared (NIR) spectral data of lettuce samples were collected using a NIR spectrometer and preprocessed using six preprocessing methods. Next, feature extraction was carried out on the spectral data using approximate linear discriminant analysis (ALDA), common-vector linear discriminant analysis (CLDA), maximum-uncertainty linear discriminant analysis (MLDA), and null-space linear discriminant analysis (NLDA). These four feature extraction methods can handle the small-sample-size problem. Finally, classification was performed using classification and regression trees (CARTs) and CatBoost, respectively. The experimental results showed that the classification accuracy of NLDA combined with CatBoost reached 97.67%. Therefore, combining feature extraction methods (NLDA) with CatBoost on NIR spectra is an effective way to classify lettuce storage time. Full article
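Null-space LDA, the best-performing extractor here, can be sketched in a few lines of NumPy under its standard formulation: project onto the null space of the within-class scatter (which is large when samples are far fewer than wavelengths), then maximize between-class scatter there. This is an illustrative implementation on an invented toy dataset, not the authors' code.

```python
import numpy as np

def nlda(X, y, tol=1e-8):
    """Null-space LDA: find directions in the null space of the within-class
    scatter Sw that maximize the between-class scatter Sb."""
    classes = np.unique(y)
    mean = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in classes:
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        Sb += len(Xc) * np.outer(mc - mean, mc - mean)
    # Null space of Sw: eigenvectors of (near-)zero eigenvalues
    w, V = np.linalg.eigh(Sw)
    null = V[:, w < tol * max(w.max(), 1.0)]
    # Maximize Sb restricted to that null space
    w2, V2 = np.linalg.eigh(null.T @ Sb @ null)
    order = np.argsort(w2)[::-1]
    return null @ V2[:, order[: len(classes) - 1]]

# Toy "spectra": 6 samples, 50 wavelengths, 3 storage-time classes
rng = np.random.default_rng(1)
y = np.array([0, 0, 1, 1, 2, 2])
X = rng.normal(size=(6, 50)) + 3.0 * y[:, None]
W = nlda(X, y)   # 50 x 2 projection matrix
Z = X @ W        # 2-D discriminant features
```

A characteristic property of NLDA (and the reason it suits small-sample NIR data) is that, in the null space of Sw, all training samples of a class collapse onto their class mean, so classes become trivially separable on the training set.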

24 pages, 3748 KB  
Article
Leveraging Recurrent Neural Networks for Flood Prediction and Assessment
by Elnaz Heidari, Vidya Samadi and Abdul A. Khan
Hydrology 2025, 12(4), 90; https://doi.org/10.3390/hydrology12040090 - 16 Apr 2025
Cited by 7 | Viewed by 2752
Abstract
Recent progress in Artificial Intelligence and Machine Learning (AIML) has accelerated improvements in the prediction performance of many hydrological processes. Yet, flood prediction remains a challenging task due to its complex nature. Two common challenges afflicting the task are flood volatility and the sensitivity and complexity of flood generation attributes. This study explores the application of Recurrent Neural Networks (RNNs)—specifically Vanilla Recurrent Neural Networks (VRNNs), Long Short-Term Memory (LSTM), and Gated Recurrent Unit (GRU)—in flood prediction and assessment. By integrating catchment-specific hydrological and meteorological variables, the RNN models leverage sequential data processing to capture the temporal dynamics and seasonal patterns characteristic of flooding. These models were employed across diverse terrains, including mountainous watersheds in the state of South Carolina, USA, to examine their robustness and adaptability. To identify significant hydrological events for flash flood analysis, a discharge frequency analysis was conducted using the Pearson Type III distribution. The 1-year and 2-year return period flows were estimated based on this analysis, and the 1-year return flow was selected as a conservative threshold for flash flood event identification to ensure a sufficient number of training instances. Comparative benchmarking with the National Water Model (NWM v3.0) revealed that the RNN-based approaches offer notable enhancements in capturing the intensity and timing of flood events, particularly for short-duration and high-magnitude floods (flash floods). Comparison of predicted discharges with discharges recorded at the gauges revealed that GRU had the best performance, as it achieved the highest mean NSE values and exhibited low variability across diverse watersheds.
LSTM results were slightly less consistent than those of GRU, albeit achieving satisfactory performance, proving its value in hydrological forecasting. In contrast, VRNN had the highest variability and the lowest NSE values among the three. The NWM model trailed the machine learning-based models. The study highlights the efficacy of the RNN models in advancing hydrological predictions. Full article
(This article belongs to the Section Water Resources and Risk Management)
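The NSE values used above to rank GRU, LSTM, and VRNN follow the standard Nash–Sutcliffe definition: 1 is a perfect fit, 0 means no better than predicting the observed mean, and negative values are worse than the mean. A minimal sketch on made-up discharge numbers (not the study's data):

```python
import numpy as np

def nse(sim, obs):
    """Nash-Sutcliffe efficiency of simulated vs. observed values."""
    obs = np.asarray(obs, float)
    sim = np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = np.array([10.0, 80.0, 250.0, 120.0, 30.0])  # gauge discharge (m3/s)
sim = np.array([12.0, 70.0, 240.0, 110.0, 35.0])  # model prediction
print(nse(sim, obs))
```

Because the denominator is the variance of the observations, NSE rewards capturing the high-flow peaks that dominate flash-flood records, which is why it is a natural score for this comparison.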

20 pages, 4770 KB  
Article
Surface and Subsurface Soil Moisture Estimation Using Fusion of SMAP, NLDAS-2, and SOLUS100 Data with Deep Learning
by Saman Rabiei, Ebrahim Babaeian and Sabine Grunwald
Remote Sens. 2025, 17(4), 659; https://doi.org/10.3390/rs17040659 - 14 Feb 2025
Cited by 4 | Viewed by 2666
Abstract
Accurate knowledge of surface and subsurface soil moisture (SM) is essential for hydrologic modeling, weather forecasting, and agricultural water management. NASA’s Soil Moisture Active Passive (SMAP) satellite (level 3) provides ‘surface’ SM at 2–3-day temporal resolution and hence lacks daily and subsurface SM information. This study developed a convolutional neural network–long short-term memory (ConvLSTM) deep learning model to produce ‘daily’ surface (5 cm) and subsurface (25 cm) SM products (9 km) by integrating SMAP level 3 ancillary data, North American Land Data Assimilation System (NLDAS-2; 12 km) SM, and Soil Landscapes of the United States (SOLUS100) digital maps across the contiguous U.S. Two input scenarios were evaluated: scenario 1 used only SMAP ancillary data, while scenario 2 included both SMAP ancillary data and SOLUS100 soil maps. Model evaluation with in situ SM data showed higher accuracy for scenario 2, indicating the importance of soil properties (texture and bulk density) in SM estimation. Coarse-textured soils showed the highest estimation accuracy, followed by medium- and fine-textured soils. The model also performed better in estimating subsurface SM than surface SM for most land-cover types. Incorporating SMAP ancillary data and SOLUS100 digital soil maps into the ConvLSTM improved the spatial and temporal estimation of surface and subsurface SM. The results highlight the potential of deep learning for integrating multi-source, multi-scale observations to improve SM estimation at a large scale. Full article
(This article belongs to the Special Issue Satellite Soil Moisture Estimation, Assessment, and Applications)

23 pages, 9644 KB  
Article
Modeling Urban Microclimates for High-Resolution Prediction of Land Surface Temperature Using Statistical Models and Surface Characteristics
by Md Golam Rabbani Fahad, Maryam Karimi, Rouzbeh Nazari and Mohammad Reza Nikoo
Urban Sci. 2025, 9(2), 28; https://doi.org/10.3390/urbansci9020028 - 28 Jan 2025
Cited by 5 | Viewed by 4825
Abstract
Surface properties in complex urban environments can significantly impact local-level temperature gradients and distribution on several scales. Studying temperature anomalies and identifying heat pockets in urban settings is challenging, and the limited high-resolution datasets available do not translate into an accurate assessment of near-surface temperature. This study developed an urban microclimate (UC) model to predict land surface temperature (LST) and air temperature at high spatial–temporal resolution for inner urban areas through a land surface and built-up scheme, using Landsat data and meteorological inputs from NLDAS. The innovative aspect of the model is the inclusion of micro-features in land use characteristics, which incorporate surface types, urban vegetation, building density and heights, shortwave radiation, and relative humidity. Statistical models, including the Generalized Additive Model (GAM) and spatial autoregression (SAR), were developed to predict LST based on surface characteristics and weather parameters. The model was applied to urban microclimates in densely populated regions, focusing on Manhattan and New York City. The results indicated that the SAR model performed better (R2 = 0.85, RMSE = 0.736) in predicting micro-scale LST variations compared to the GAM (R2 = 0.39, RMSE = 1.203) and validated the accuracy of the LST prediction model with R2 ranging from 0.79 to 0.95. Full article

18 pages, 3161 KB  
Article
Bluetongue Risk Map for Vaccination and Surveillance Strategies in India
by Mohammed Mudassar Chanda, Bethan V. Purse, Luigi Sedda, David Benz, Minakshi Prasad, Yella Narasimha Reddy, Krishnamohan Reddy Yarabolu, S. M. Byregowda, Simon Carpenter, Gaya Prasad and David John Rogers
Pathogens 2024, 13(7), 590; https://doi.org/10.3390/pathogens13070590 - 16 Jul 2024
Cited by 5 | Viewed by 3136
Abstract
Bluetongue virus (BTV, Sedoreoviridae: Orbivirus) causes an economically important disease, namely, bluetongue (BT), in domestic and wild ruminants worldwide. BTV is endemic to South India and has occurred with varying severity every year since the virus was first reported in 1963. BT can cause high morbidity and mortality to sheep flocks in this region, resulting in serious economic losses to subsistence farmers, with impacts on food security. The epidemiology of BTV in South India is complex, characterized by an unusually wide diversity of susceptible ruminant hosts, multiple vector species of biting midges (Culicoides spp., Diptera: Ceratopogonidae) implicated in the transmission of BTV, and numerous co-circulating virus serotypes and strains. BT presence data (1997–2011) for South India were obtained from multiple sources to develop a presence/absence model for the disease. A non-linear discriminant analysis (NLDA) was carried out using temporal Fourier transformed variables that were remotely sensed as potential predictors of BT distribution. Predictive performance was then characterized using a range of different accuracy statistics (sensitivity, specificity, and Kappa). The top ten variables selected to explain BT distribution were primarily thermal metrics (land surface temperature, i.e., LST, and middle infrared, i.e., MIR) and a measure of plant photosynthetic activity (the Normalized Difference Vegetation Index, i.e., NDVI). A model that used pseudo-absence points, with three presence and absence clusters each, outperformed the model that used only the recorded absence points and showed high correspondence with past BTV outbreaks. The resulting risk maps may be suitable for informing disease managers concerned with vaccination, prevention, and control of BT in high-risk areas and for planning future state-wide vector and virus surveillance activities. Full article
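The accuracy statistics named above (sensitivity, specificity, and Kappa) all come from the 2x2 presence/absence confusion table. A minimal sketch with invented labels (not the study's data):

```python
def accuracy_stats(truth, pred):
    """Sensitivity, specificity, and Cohen's kappa for binary
    presence(1)/absence(0) predictions."""
    tp = sum(1 for t, p in zip(truth, pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(truth, pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(truth, pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(truth, pred) if t == 1 and p == 0)
    n = tp + tn + fp + fn
    sens = tp / (tp + fn)                      # true-positive rate
    spec = tn / (tn + fp)                      # true-negative rate
    po = (tp + tn) / n                         # observed agreement
    pe = ((tp + fn) * (tp + fp) + (tn + fp) * (tn + fn)) / n**2  # chance agreement
    kappa = (po - pe) / (1 - pe)
    return sens, spec, kappa

truth = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]
pred  = [1, 1, 1, 0, 0, 0, 0, 0, 1, 0]
print(accuracy_stats(truth, pred))
```

Kappa corrects the raw accuracy for agreement expected by chance, which matters for a disease map where absences heavily outnumber presences.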

17 pages, 3817 KB  
Article
A Reconstruction of May–June Mean Temperature since 1775 for Conchos River Basin, Chihuahua, Mexico, Using Tree-Ring Width
by Aldo Rafael Martínez-Sifuentes, José Villanueva-Díaz, Ramón Trucíos-Caciano, Nuria Aide López-Hernández, Juan Estrada-Ávalos and Víctor Manuel Rodríguez-Moreno
Atmosphere 2024, 15(7), 808; https://doi.org/10.3390/atmos15070808 - 5 Jul 2024
Cited by 1 | Viewed by 1934
Abstract
Currently there are several precipitation reconstructions for northern Mexico; however, there is a lack of temperature reconstructions to understand past climate change, the impact on ecosystems and societies, etc. The central region of Chihuahua is located in a transition zone between the Sierra Madre Occidental and the Great Northern Plain, characterized by extreme temperatures and marked seasonal variability. The objectives of this study were (1) to generate a climatic association between variables from reanalysis models and the earlywood series for the center of Chihuahua, (2) to generate a reconstruction of mean temperature, (3) to determine extreme events, and (4) to identify the influence of ocean–atmosphere phenomena. Chronologies were downloaded from the International Tree-Ring Data Bank and climate information from the NLDAS-2 and ClimateNA reanalysis models. The response function was performed using climate models and regional dendrochronological series. A reconstruction of mean temperature was generated, and extreme periods were identified. The representativeness of the reconstruction was evaluated through spatial correlation, and low-frequency events were determined through multitaper spectral analysis and wavelet analysis. The influence of ocean–atmosphere phenomena on temperature reconstruction was analyzed using Pearson correlation, and the influence of ENSO was examined through wavelet coherence analysis. Highly significant correlations were found for maximum, minimum, and mean temperature, as well as for precipitation and relative humidity, before and after the growth year. However, the seasonal period with the highest correlation was found from May to June for mean temperature, which was used to generate the reconstruction from 1775 to 2022. The most extreme periods were 1775, 1801, 1805, 1860, 1892–1894, 1951, 1953–1954, and 2011–2012. 
Spectral analysis showed significant frequencies of 56.53 and 2.09 years, and wavelet analysis from 0 to 2 years from 1970 to 1980, from 8 to 11 years from 1890 to 1910, and from 30 to 70 years from 1860 to 2022. A significant association was found with the Multivariate ENSO Index phenomenon (r = 0.40; p = 0.009) and Pacific Decadal Oscillation (r = −0.38; p = 0.000). Regarding the ENSO phenomenon, an antiphase association of r = −0.34; p = 0.000 was found, with significant periods of 1 to 4 years from 1770 to 1800, 1845 to 1850, and 1860 to 1900, with periods of 6 to 10 years from 1875 to 1920, and from 6 to 8 years from 1990 to 2000. This study allowed a reconstruction of mean temperature through reanalysis data, as well as a historical characterization of temperature for central Chihuahua beyond the observed records. Full article
(This article belongs to the Special Issue Paleoclimate Reconstruction (2nd Edition))

31 pages, 2593 KB  
Review
Advancing Hydrology through Machine Learning: Insights, Challenges, and Future Directions Using the CAMELS, Caravan, GRDC, CHIRPS, PERSIANN, NLDAS, GLDAS, and GRACE Datasets
by Fahad Hasan, Paul Medley, Jason Drake and Gang Chen
Water 2024, 16(13), 1904; https://doi.org/10.3390/w16131904 - 3 Jul 2024
Cited by 27 | Viewed by 14282
Abstract
Machine learning (ML) applications in hydrology are revolutionizing our understanding and prediction of hydrological processes, driven by advancements in artificial intelligence and the availability of large, high-quality datasets. This review explores the current state of ML applications in hydrology, emphasizing the utilization of extensive datasets such as CAMELS, Caravan, GRDC, CHIRPS, NLDAS, GLDAS, PERSIANN, and GRACE. These datasets provide critical data for modeling various hydrological parameters, including streamflow, precipitation, groundwater levels, and flood frequency, particularly in data-scarce regions. We discuss the types of ML methods used in hydrology and the significant successes achieved with these models, highlighting their enhanced predictive accuracy and the integration of diverse data sources. The review also addresses the challenges inherent in hydrological ML applications, such as data heterogeneity, spatial and temporal inconsistencies, issues regarding downscaling the LSH, and the need for incorporating human activities. In addition to discussing the limitations, this article highlights the benefits of utilizing high-resolution datasets compared to traditional ones. We then examine the emerging trends and future directions, including the integration of real-time data and the quantification of uncertainties to improve model reliability. We also place a strong emphasis on incorporating citizen science and the IoT for data collection in hydrology. By synthesizing the latest research, this paper aims to guide future efforts in leveraging large datasets and ML techniques to advance hydrological science and enhance water resource management practices. Full article

33 pages, 13438 KB  
Article
Investigating the Reliability of Nonlinear Static Procedures for the Seismic Assessment of Existing Masonry Buildings
by Sofia Giusto, Serena Cattari and Sergio Lagomarsino
Appl. Sci. 2024, 14(3), 1130; https://doi.org/10.3390/app14031130 - 29 Jan 2024
Cited by 5 | Viewed by 2414
Abstract
This paper presents, firstly, an overview of the nonlinear static procedures (NSPs) given in different codes and research studies available in the literature, followed by the results achieved by the authors to evaluate the reliability of the safety level that they guarantee. The latter is estimated by adopting the fragility curve concept. In particular, 125 models of a masonry building case study are generated through a Monte Carlo process to obtain numerical fragility curves by applying various NSPs. More specifically, among the NSPs, the N2 method (based on the use of inelastic response spectra) with different alternatives and the capacity spectrum method (CSM)—based on the use of overdamped response spectra—are investigated. As a reference solution to estimate the reliability of the nonlinear static approach, nonlinear dynamic analyses (NLDAs) are carried out using the cloud method and a set of 125 accelerograms; the results are post-processed to derive fragility curves under the assumption of a lognormal distribution. The focus of this investigation is to quantify the influence that the NSP method’s choices imply, such as the criteria adopted to calculate the displacement demand of a structure or those for the bilinearization of the pushover curve. The results show that the N2 methods are all non-conservative. The only method that provides a good approximation of the capacity of the analyzed URM structures as derived from NLDAs is the CSM. In particular, bilinearization is proven to have a relevant impact on the results when using the N2 method to calculate displacement capacities, whereas the CSM method is not affected at all by such an assumption. The results obtained may have a significant impact on engineering practice and in outlining future directions regarding the methods to be recommended in codes. Full article
(This article belongs to the Special Issue Structural Design and Analysis for Constructions and Buildings)
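The reference fragility curves here are derived from cloud-method NLDAs under a lognormal assumption. A generic sketch of that post-processing is shown below on synthetic results: regress log demand on log intensity measure over the record set, then express the exceedance probability as a lognormal fragility curve. All numbers, thresholds, and variable names are invented for illustration; this is the general cloud-analysis recipe, not the authors' model.

```python
import numpy as np
from math import erf, log, sqrt

# Synthetic cloud: 125 records with intensity measure (e.g., PGA) and a
# resulting drift demand following a power law with lognormal scatter.
rng = np.random.default_rng(3)
im = rng.uniform(0.05, 1.0, 125)
demand = 0.04 * im**1.1 * rng.lognormal(0.0, 0.35, 125)

# Cloud regression: ln(demand) = a + b * ln(IM)
b, a = np.polyfit(np.log(im), np.log(demand), 1)
resid = np.log(demand) - (a + b * np.log(im))
beta = resid.std(ddof=2)        # record-to-record dispersion

capacity = 0.03                  # hypothetical drift capacity at the limit state

def fragility(x):
    """P[demand > capacity | IM = x] under the lognormal assumption."""
    z = (a + b * log(x) - log(capacity)) / beta
    return 0.5 * (1 + erf(z / sqrt(2)))

print(fragility(0.2), fragility(0.8))
```

The resulting curve is what a nonlinear static procedure's fragility estimate is benchmarked against: if an NSP's curve sits to the right of the NLDA-based one, the NSP is non-conservative for that limit state.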

12 pages, 4083 KB  
Communication
Application of Near-Infrared Spectroscopy and Fuzzy Improved Null Linear Discriminant Analysis for Rapid Discrimination of Milk Brands
by Xiaohong Wu, Yiheng Fang, Bin Wu and Man Liu
Foods 2023, 12(21), 3929; https://doi.org/10.3390/foods12213929 - 26 Oct 2023
Cited by 20 | Viewed by 2412
Abstract
The quality of milk is closely linked to its brand, and well-known brands generally maintain good quality. This study therefore designs a new fuzzy feature extraction method, called fuzzy improved null linear discriminant analysis (FiNLDA), to cluster the spectra of collected milk samples and identify milk brands. To improve classification accuracy, FiNLDA was applied to process the near-infrared (NIR) spectra of milk acquired with a portable NIR spectrometer. Principal component analysis and the Savitzky–Golay (SG) filtering algorithm were employed to reduce dimensionality and suppress noise, respectively. Thereafter, improved null linear discriminant analysis (iNLDA) and FiNLDA were applied to extract the discriminant information from the NIR spectra. Finally, a K-nearest neighbor classifier was used to assess the performance of the identification system. The results indicated that the maximum classification accuracies of LDA, iNLDA and FiNLDA were 74.7%, 88% and 94.67%, respectively. Accordingly, a portable NIR spectrometer combined with FiNLDA can classify milk brands correctly and effectively. Full article
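The processing chain described in the abstract (SG filtering, dimensionality reduction, discriminant projection, KNN classification) can be sketched as follows. FiNLDA itself has no public implementation, so ordinary LDA stands in for the discriminant step, and the three "brands" below are synthetic spectra; everything here is an illustrative assumption, not the authors' code.

```python
import numpy as np
from scipy.signal import savgol_filter
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_per_class, n_wavelengths = 40, 200
X, y = [], []
for label in range(3):  # three hypothetical milk brands
    base = np.sin(np.linspace(0, 6, n_wavelengths) + label)  # class-specific spectral shape
    X.append(base + 0.3 * rng.standard_normal((n_per_class, n_wavelengths)))
    y += [label] * n_per_class
X, y = np.vstack(X), np.array(y)

X = savgol_filter(X, window_length=11, polyorder=2, axis=1)  # SG smoothing
X = PCA(n_components=10).fit_transform(X)                    # dimensionality reduction
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3,
                                      random_state=0, stratify=y)
lda = LinearDiscriminantAnalysis(n_components=2).fit(Xtr, ytr)  # discriminant projection
acc = KNeighborsClassifier(n_neighbors=3).fit(
    lda.transform(Xtr), ytr).score(lda.transform(Xte), yte)
```

Swapping the `LinearDiscriminantAnalysis` step for a fuzzy/null-space variant is exactly where a method like FiNLDA would slot into this pipeline.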
(This article belongs to the Section Food Quality and Safety)

18 pages, 5484 KB  
Article
Triple Collocation of Ground-, Satellite- and Land Surface Model-Based Surface Soil Moisture Products in Oklahoma Part II: New Multi-Sensor Soil Moisture (MSSM) Product
by Zhen Hong, Hernan A. Moreno, Laura V. Alvarez, Zhi Li and Yang Hong
Remote Sens. 2023, 15(13), 3450; https://doi.org/10.3390/rs15133450 - 7 Jul 2023
Cited by 1 | Viewed by 2801
Abstract
This study develops a triple-collocation (TC)-based, multi-source shallow-soil moisture product for Oklahoma. The method uses a least squared weights (LSW) optimization to find the set of parameters that yields the lowest root mean squared error (RMSE) with respect to the "unknown truth". Soil moisture information from multiple sources and resolutions, including the Soil Moisture Active Passive SMAP L3_SM_P_E product (9 km, daily), physically based land surface model (LSM) estimates from NLDAS_NOAH0125_H (1/8°, hourly), and the Oklahoma Mesonet ground sensor network (9 km, interpolated from point measurements, 30 min), is merged into a product with 9 km spatial and daily temporal resolution across the state of Oklahoma from April 2015 to July 2019. This multi-sensor surface soil moisture (MSSM) product is assessed against a state-wide, previously tested in situ-based benchmark soil moisture product and SMAP L4. Results show that: (1) the independent source products have differential value according to the regional conditions they represent, including land cover type, soils, irrigation, and climate regime; (2) beyond serving as validation sets, in situ measurements are of significant value for improving the accuracy of multi-sensor soil moisture datasets through TC; and (3) state-wide RMSE values obtained with MSSM are similar to the typical measurement error of in situ ground measurements, which provides some degree of confidence in the new product. MSSM is an improvement over currently available products in Oklahoma due to its minimized uncertainty, ease of production, and continuous temporal and geographic coverage. Nevertheless, further tests of this methodology are needed in different climates, land cover types, and geographic regions, and with other independent products and spatiotemporal resolutions. Full article
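The covariance-notation form of triple collocation, plus inverse-error-variance merging weights in the spirit of the LSW step, can be sketched as below. The three synthetic products and their noise levels are assumptions for illustration; the formulas are the standard TC error-variance estimates for three collocated products with mutually independent errors.

```python
import numpy as np

def triple_collocation(x, y, z):
    """Error variances of three collocated products, assuming
    mutually independent errors (covariance notation)."""
    C = np.cov(np.vstack([x, y, z]))
    ex2 = C[0, 0] - C[0, 1] * C[0, 2] / C[1, 2]
    ey2 = C[1, 1] - C[0, 1] * C[1, 2] / C[0, 2]
    ez2 = C[2, 2] - C[0, 2] * C[1, 2] / C[0, 1]
    return ex2, ey2, ez2

# Synthetic truth plus independent noise of known variance (0.01, 0.04, 0.09)
rng = np.random.default_rng(2)
truth = rng.standard_normal(5000)
x = truth + 0.1 * rng.standard_normal(5000)
y = truth + 0.2 * rng.standard_normal(5000)
z = truth + 0.3 * rng.standard_normal(5000)
ex2, ey2, ez2 = triple_collocation(x, y, z)

# Merge with inverse-error-variance weights (illustrative LSW-style merging)
err = np.array([ex2, ey2, ez2])
w = (1.0 / err) / (1.0 / err).sum()
merged = w[0] * x + w[1] * y + w[2] * z
```

The merged series has a lower error variance than any single input, which is the rationale for weighting products by their TC-estimated errors.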
(This article belongs to the Special Issue Satellite Soil Moisture Validation and Applications)

17 pages, 1246 KB  
Article
Kernel Reverse Neighborhood Discriminant Analysis
by Wangwang Li, Hengliang Tan, Jianwei Feng, Ming Xie, Jiao Du, Shuo Yang and Guofeng Yan
Electronics 2023, 12(6), 1322; https://doi.org/10.3390/electronics12061322 - 10 Mar 2023
Cited by 4 | Viewed by 2231
Abstract
Neighborhood linear discriminant analysis (nLDA) exploits reverse nearest neighbors (RNN) to avoid the assumption of linear discriminant analysis (LDA) that all samples from the same class are independently and identically distributed (i.i.d.). nLDA performs well when a dataset contains multimodal classes. However, in complex pattern recognition tasks, such as visual classification, the complex appearance variations caused by deformation, illumination and viewing angle often introduce non-linearity. Furthermore, it is not easy to separate multimodal classes in a lower-dimensional feature space. One solution to these problems is to map the features to a higher-dimensional feature space for discriminant learning. Hence, in this paper, we employ kernel functions to map the original data to a higher-dimensional feature space, where the nonlinear multimodal classes can be better classified. We derive the proposed kernel reverse neighborhood discriminant analysis (KRNDA) in detail using the kernel trick. The proposed KRNDA outperforms the original nLDA on most datasets of the UCI benchmark database. In high-dimensional visual recognition tasks of handwritten digit recognition, object categorization and face recognition, KRNDA achieves the best recognition results compared to several sophisticated LDA-based discriminators. Full article
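The kernel-trick idea motivating KRNDA (map data implicitly to a higher-dimensional space where a linear discriminant can separate non-linear classes) can be illustrated with a stand-in: RBF kernel PCA followed by ordinary LDA on a toy non-linear problem. This is not KRNDA itself; the dataset, kernel, and parameters below are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_circles
from sklearn.decomposition import KernelPCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Concentric circles: a non-linear problem plain LDA cannot separate
X, y = make_circles(n_samples=300, factor=0.4, noise=0.05, random_state=0)

# Plain LDA in the original 2-D space performs near chance level
lda_acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=5).mean()

# Kernel trick: implicit RBF feature map, then a linear discriminant there
Phi = KernelPCA(n_components=20, kernel="rbf", gamma=2.0).fit_transform(X)
klda_acc = cross_val_score(LinearDiscriminantAnalysis(), Phi, y, cv=5).mean()
```

The same construction underlies kernelized discriminant methods generally: the kernel matrix replaces inner products in the scatter computations, so the discriminant is linear in the mapped space but non-linear in the input space.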
(This article belongs to the Section Artificial Intelligence)
