Search Results (116)

Search Parameters:
Keywords = inverse gamma distribution

19 pages, 2913 KiB  
Article
Radiation Mapping: A Gaussian Multi-Kernel Weighting Method for Source Investigation in Disaster Scenarios
by Songbai Zhang, Qi Liu, Jie Chen, Yujin Cao and Guoqing Wang
Sensors 2025, 25(15), 4736; https://doi.org/10.3390/s25154736 - 31 Jul 2025
Abstract
Structural collapses caused by accidents or disasters can create unexpected radiation shielding, resulting in sharp gradients within the radiation field. Traditional radiation mapping methods often fail to capture these complex variations accurately, making rapid and precise localization of radiation sources a significant challenge in emergency response. To address this issue, a novel multi-kernel Gaussian process regression (MK-GPR) model is proposed for high-fidelity radiation mapping in environments with physical obstructions, building on standard Gaussian process regression (GPR) models that rely on a single Gaussian kernel to reflect the inverse-square law in free space. MK-GPR integrates two additional kernel functions with adaptive weighting: one models the attenuation characteristics of intervening materials, and the other captures the energy-dependent penetration behavior of radiation. To validate the model, gamma-ray distributions in complex, shielded environments were simulated using GEometry ANd Tracking 4 (Geant4). Compared with conventional methods, including linear interpolation, nearest-neighbor interpolation, and standard GPR, MK-GPR demonstrated substantial improvements in key evaluation metrics such as MSE, RMSE, and MAE; notably, the coefficient of determination (R²) increased to 0.937. For practical deployment, the optimized MK-GPR model was ported to an RK-3588 edge computing platform and integrated into a mobile robot equipped with a NaI(Tl) detector. Field experiments confirmed the system’s ability to accurately map radiation fields and localize gamma sources; combined with SLAM, the system achieved localization errors of 10 cm for single sources and 15 cm for dual sources. These results highlight the potential of the proposed approach as an effective and deployable solution for radiation source investigation in post-disaster environments.
(This article belongs to the Section Navigation and Positioning)
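
The abstract does not give the functional forms of the two extra kernels, so the sketch below, in Python with scikit-learn, only illustrates the kernel-combination idea: an RBF term stands in for the free-space inverse-square component, while the Matern and RationalQuadratic terms are placeholders for the paper’s attenuation and penetration kernels. Fitting the variance of each term plays the role of the adaptive weighting.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import (RBF, Matern,
                                              RationalQuadratic, WhiteKernel)

# Toy dose-rate survey: positions (x, y) and noisy intensity measurements.
rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(40, 2))
y = 1.0 / (0.1 + np.sum((X - 5.0) ** 2, axis=1)) + 0.01 * rng.standard_normal(40)

kernel = (1.0 * RBF(length_scale=2.0)                  # free-space surrogate
          + 1.0 * Matern(length_scale=1.0, nu=1.5)     # placeholder: attenuation
          + 1.0 * RationalQuadratic(length_scale=1.0)  # placeholder: penetration
          + WhiteKernel(noise_level=1e-4))             # measurement noise

gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)
grid = np.stack(np.meshgrid(np.linspace(0, 10, 50),
                            np.linspace(0, 10, 50)), -1).reshape(-1, 2)
mean, std = gpr.predict(grid, return_std=True)  # radiation map + uncertainty
```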

13 pages, 600 KiB  
Article
Frequentist and Bayesian Estimation Under Progressive Type-II Random Censoring for a Two-Parameter Exponential Distribution
by Rajni Goel, Mahmoud M. Abdelwahab and Tejaswar Kamble
Symmetry 2025, 17(8), 1205; https://doi.org/10.3390/sym17081205 - 29 Jul 2025
Viewed by 125
Abstract
In medical research, random censoring often occurs due to unforeseen subject withdrawals, whereas progressive censoring is intentionally applied to minimize time and resource requirements during experimentation. This work focuses on estimating the parameters of a two-parameter exponential distribution under a progressive Type-II random censoring scheme, which integrates both censoring strategies. The use of symmetric properties in the failure and censoring time models, arising from a shared location parameter, facilitates a balanced and robust inferential framework; this symmetry ensures interpretational clarity and enhances the tractability of both frequentist and Bayesian methods. Maximum likelihood estimators (MLEs) are obtained, along with asymptotic confidence intervals. A Bayesian approach is also introduced, utilizing inverse gamma priors, and Gibbs sampling is implemented to derive Bayesian estimates. The effectiveness of the proposed methodologies is assessed through extensive Monte Carlo simulations and demonstrated on a real dataset.
(This article belongs to the Section Mathematics)
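
A minimal Gibbs sampler for this setup, assuming a complete (uncensored) sample and, additionally, a flat prior on the location μ — so the paper’s progressive Type-II random censoring adjustments are not reproduced. With an inverse gamma IG(a, b) prior on the scale θ, both conditionals are available in closed form.

```python
import numpy as np

rng = np.random.default_rng(1)
mu_true, theta_true = 2.0, 1.5
x = mu_true + rng.exponential(theta_true, size=50)  # complete sample
n, x1, a, b = len(x), x.min(), 2.0, 2.0             # IG(a, b) prior on theta

mu, theta, draws = x1, x.mean(), []
for _ in range(5000):
    # theta | mu, x ~ InverseGamma(a + n, b + sum(x - mu))
    theta = 1.0 / rng.gamma(a + n, 1.0 / (b + np.sum(x - mu)))
    # mu | theta, x has density proportional to exp(n*mu/theta) on mu < x_(1);
    # inverting its CDF gives mu = x_(1) + (theta/n) * log(U).
    mu = x1 + (theta / n) * np.log(rng.uniform())
    draws.append((mu, theta))

mu_hat, theta_hat = np.mean(draws[1000:], axis=0)   # posterior means after burn-in
```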

16 pages, 666 KiB  
Article
Bayesian Analysis of the Maxwell Distribution Under Progressively Type-II Random Censoring
by Rajni Goel, Mahmoud M. Abdelwahab and Mustafa M. Hasaballah
Axioms 2025, 14(8), 573; https://doi.org/10.3390/axioms14080573 - 25 Jul 2025
Viewed by 145
Abstract
Accurate modeling of product lifetimes is vital in reliability analysis and engineering to ensure quality and maintain competitiveness. This paper proposes the progressively randomly censored Maxwell distribution, which incorporates both progressive Type-II and random censoring within the Maxwell distribution framework. The model allows for the planned removal of surviving units at specific stages of an experiment, accounting for both deliberate and random censoring events. It is assumed that survival and censoring times each follow a Maxwell distribution, though with distinct parameters. Both frequentist and Bayesian approaches are employed to estimate the model parameters. In the frequentist approach, maximum likelihood estimators and their corresponding confidence intervals are derived. In the Bayesian approach, Bayes estimators are obtained using an inverse gamma prior and evaluated through a Markov Chain Monte Carlo (MCMC) method under the squared error loss function (SELF). A Monte Carlo simulation study evaluates the performance of the proposed estimators. The practical relevance of the methodology is demonstrated using a real data set.
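
Under the common parameterization of the Maxwell distribution with scale θ (likelihood proportional to θ^(-3n/2) exp(-Σx²/(2θ))), the inverse gamma prior is conjugate, so the Bayes estimate under SELF is the posterior mean in closed form. A sketch for a complete sample, without the paper’s censoring machinery:

```python
import numpy as np

rng = np.random.default_rng(2)
theta_true = 2.0
# Maxwell draws: norm of a 3-D Gaussian with per-component variance theta.
x = np.sqrt(theta_true) * np.linalg.norm(rng.standard_normal((100, 3)), axis=1)

a, b = 3.0, 4.0                       # IG(a, b) prior hyperparameters (assumed)
a_post = a + 1.5 * len(x)             # posterior shape: a + 3n/2
b_post = b + 0.5 * np.sum(x ** 2)     # posterior rate:  b + sum(x^2)/2

theta_self = b_post / (a_post - 1.0)  # SELF estimate = posterior mean
```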

19 pages, 946 KiB  
Article
Enhanced Fast Fractional Fourier Transform (FRFT) Scheme Based on Closed Newton-Cotes Rules
by Aubain Nzokem, Daniel Maposa and Anna M. Seimela
Axioms 2025, 14(7), 543; https://doi.org/10.3390/axioms14070543 - 20 Jul 2025
Viewed by 205
Abstract
The paper presents an enhanced numerical framework for computing the one-dimensional fast Fractional Fourier Transform (FRFT) by integrating closed-form Composite Newton-Cotes quadrature rules. We show that an FRFT of a QN-length weighted sequence can be decomposed analytically into two mathematically commutative compositions: one involving the composition of an FRFT of an N-length sequence and an FRFT of a Q-length weighted sequence, and the other in reverse order. The composite FRFT approach is applied to the inversion of Fourier and Laplace transforms, with a focus on estimating probability densities for distributions with complex-valued characteristic functions. Numerical experiments on the Variance-Gamma (VG) and Generalized Tempered Stable (GTS) models show that the proposed scheme significantly improves accuracy over the standard (non-weighted) fast FRFT and classical Newton-Cotes quadrature, while preserving computational efficiency. The findings suggest that the composite FRFT framework offers a robust and mathematically sound tool for transform-based numerical approximations, particularly in applications involving oscillatory integrals and complex-valued characteristic functions.
(This article belongs to the Special Issue Numerical Analysis and Applied Mathematics)
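
The composite FRFT scheme itself does not fit in a short snippet, but the underlying task — recovering a density by numerically inverting a characteristic function — can be illustrated with plain composite quadrature (Simpson’s rule here, standing in for the paper’s Newton-Cotes-weighted FRFT). The standard normal CF is used so the result is checkable.

```python
import numpy as np
from scipy.integrate import simpson

# For a real, symmetric CF: f(x) = (1/pi) * integral_0^inf cos(t*x) * phi(t) dt.
phi = lambda t: np.exp(-0.5 * t ** 2)   # standard normal CF (checkable target)
t = np.linspace(0.0, 40.0, 2001)        # truncation and grid control the error

def density(x):
    return simpson(np.cos(t * x) * phi(t), x=t) / np.pi

print(density(0.0), 1.0 / np.sqrt(2.0 * np.pi))   # both approx. 0.398942
```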

17 pages, 3856 KiB  
Article
Wavelet Fusion with Sobel-Based Weighting for Enhanced Clarity in Underwater Hydraulic Infrastructure Inspection
by Minghui Zhang, Jingkui Zhang, Jugang Luo, Jiakun Hu, Xiaoping Zhang and Juncai Xu
Appl. Sci. 2025, 15(14), 8037; https://doi.org/10.3390/app15148037 - 18 Jul 2025
Viewed by 290
Abstract
Underwater inspection images of hydraulic structures often suffer from haze, severe color distortion, low contrast, and blurred textures, impairing the accuracy of automated crack, spalling, and corrosion detection. However, many existing enhancement methods fail to preserve structural details and suppress noise in turbid environments. To address these limitations, we propose a compact image enhancement framework called Wavelet Fusion with Sobel-based Weighting (WWSF). This method first corrects global color and luminance distributions using multiscale Retinex and gamma mapping, followed by local contrast enhancement via CLAHE in the L channel of the CIELAB color space. Two preliminarily corrected images are decomposed using discrete wavelet transform (DWT); low-frequency bands are fused based on maximum energy, while high-frequency bands are adaptively weighted by Sobel edge energy to highlight structural features and suppress background noise. The enhanced image is reconstructed via inverse DWT. Experiments on real-world sluice gate datasets demonstrate that WWSF outperforms six state-of-the-art methods, achieving the highest scores on UIQM and AG while remaining competitive on entropy (EN). Moreover, the method retains strong robustness under high turbidity conditions (T ≥ 35 NTU), producing sharper edges, more faithful color representation, and improved texture clarity. These results indicate that WWSF is an effective preprocessing tool for downstream tasks such as segmentation, defect classification, and condition assessment of hydraulic infrastructure in complex underwater environments.
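
A rough sketch of the fusion stage, assuming PyWavelets and SciPy: the low band is fused by pointwise maximum energy, and each high band is blended with pointwise weights derived from the Sobel edge energy of the band itself. The abstract does not specify the exact weighting scheme, so this is an interpretation, not the paper’s implementation.

```python
import numpy as np
import pywt
from scipy.ndimage import sobel

def wwsf_fuse(img_a, img_b, wavelet="db2"):
    """Fuse two pre-corrected grayscale images (sketch of the fusion stage)."""
    cA1, highs1 = pywt.dwt2(img_a, wavelet)
    cA2, highs2 = pywt.dwt2(img_b, wavelet)

    # Low-frequency band: keep the coefficient with the larger energy.
    cA = np.where(cA1 ** 2 >= cA2 ** 2, cA1, cA2)

    # High-frequency bands: blend using Sobel edge energy as weights.
    fused_highs = []
    for h1, h2 in zip(highs1, highs2):
        g1 = np.hypot(sobel(h1, 0), sobel(h1, 1)) + 1e-12
        g2 = np.hypot(sobel(h2, 0), sobel(h2, 1)) + 1e-12
        w1 = g1 / (g1 + g2)
        fused_highs.append(w1 * h1 + (1.0 - w1) * h2)

    return pywt.idwt2((cA, tuple(fused_highs)), wavelet)

rng = np.random.default_rng(3)
a, b = rng.random((128, 128)), rng.random((128, 128))  # stand-in corrected images
fused = wwsf_fuse(a, b)
```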

24 pages, 3798 KiB  
Article
A Robust Tracking Method for Aerial Extended Targets with Space-Based Wideband Radar
by Linlin Fang, Yuxin Hu, Lihua Zhong and Lijia Huang
Remote Sens. 2025, 17(14), 2360; https://doi.org/10.3390/rs17142360 - 9 Jul 2025
Viewed by 200
Abstract
Space-based radar systems offer significant advantages for air surveillance, including wide-area coverage and extended early-warning capabilities. The integrated design of detection and imaging in space-based wideband radar further enhances its accuracy. However, in the wideband tracking mode, large aircraft targets exhibit extended characteristics, with measurements from the same target spanning multiple range resolution cells. Additionally, the nonlinear observation model and uncertain measurement noise characteristics under space-based long-distance observation substantially increase the tracking complexity. To address these challenges, we propose a robust aerial target tracking method for space-based wideband radar applications. First, we extend the observation model of the gamma Gaussian inverse Wishart probability hypothesis density filter to three-dimensional space by incorporating a spherical-radial cubature rule for improved nonlinear filtering. Second, variational Bayesian processing is integrated to enable the joint estimation of the target state and measurement noise parameters, and a recursive process is derived for both Gaussian and Student’s t-distributed measurement noise, enhancing the method’s robustness against noise uncertainty. Comprehensive simulations evaluating varying target extension parameters and noise conditions demonstrate that the proposed method achieves superior tracking accuracy and robustness.
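
The spherical-radial cubature rule at the core of the extension is compact enough to show on its own: for an n-dimensional Gaussian it propagates 2n equally weighted points through the nonlinearity. A sketch independent of the GGIW-PHD machinery:

```python
import numpy as np

def cubature_points(mean, cov):
    """Third-degree spherical-radial cubature rule: 2n points, weights 1/(2n)."""
    n = mean.size
    L = np.linalg.cholesky(cov)
    offsets = np.sqrt(n) * np.hstack([L, -L])   # n x 2n
    return mean[:, None] + offsets              # each column is one point

m = np.array([1.0, 2.0, 0.5])
P = np.diag([0.4, 0.2, 0.1])
pts = cubature_points(m, P)
f = lambda x: np.linalg.norm(x, axis=0)         # e.g., a range measurement model
approx = f(pts).mean()                          # approximates E[f(x)] under N(m, P)
```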

24 pages, 3774 KiB  
Article
A Novel Stochastic SVIR Model Capturing Transmission Variability Through Mean-Reverting Processes and Stationary Reproduction Thresholds
by Yassine Sabbar and Saud Fahad Aldosary
Mathematics 2025, 13(13), 2097; https://doi.org/10.3390/math13132097 - 26 Jun 2025
Viewed by 350
Abstract
This study presents a stochastic SVIR epidemic model in which disease transmission rates fluctuate randomly over time, driven by independent, mean-reverting processes with multiplicative noise. These dynamics capture environmental variability and behavioral changes affecting disease spread. We derive analytical expressions for the conditional moments of the transmission rates and establish the existence of their stationary distributions under broad conditions. By averaging over these distributions, we define a stationary effective reproduction number that enables a probabilistic classification of outbreak scenarios. Specifically, we estimate the likelihood of disease persistence or extinction based on transmission uncertainty. Sensitivity analyses reveal that the shape and intensity of transmission variability play a decisive role in epidemic outcomes. Monte Carlo simulations validate our theoretical findings, showing strong agreement between empirical distributions and theoretical predictions. Our results underscore how randomness in disease transmission can fundamentally alter epidemic trajectories, offering a robust mathematical framework for risk assessment under uncertainty.
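
A minimal simulation sketch, assuming the transmission rate follows a mean-reverting SDE with multiplicative noise of the form dβ = κ(β̄ − β)dt + ξβ dW, with a hypothetical recovery rate γ; the stationary effective reproduction number is approximated by time-averaging after a burn-in. The paper’s SVIR threshold additionally involves vaccination terms not modeled here.

```python
import numpy as np

rng = np.random.default_rng(4)
kappa, beta_bar, xi = 2.0, 0.3, 0.4   # reversion speed, long-run mean, noise intensity
gamma_rec = 0.2                       # hypothetical recovery rate
dt, n_steps = 1e-3, 500_000

beta = np.empty(n_steps)
beta[0] = beta_bar
for k in range(n_steps - 1):          # Euler-Maruyama discretization
    dW = np.sqrt(dt) * rng.standard_normal()
    beta[k + 1] = max(beta[k] + kappa * (beta_bar - beta[k]) * dt
                      + xi * beta[k] * dW, 1e-8)

beta_stat = beta[n_steps // 2:]             # discard the transient
R_eff = beta_stat.mean() / gamma_rec        # stationary reproduction number (sketch)
```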

28 pages, 11942 KiB  
Article
Reliability Analysis of Improved Type-II Adaptive Progressively Inverse XLindley Censored Data
by Refah Alotaibi, Mazen Nassar and Ahmed Elshahhat
Axioms 2025, 14(6), 437; https://doi.org/10.3390/axioms14060437 - 2 Jun 2025
Viewed by 347
Abstract
This study offers a newly improved Type-II adaptive progressive censoring scheme with data sampled from an inverse XLindley (IXL) distribution for more efficient and adaptive reliability assessments. Through this sampling mechanism, we evaluate the parameters of the IXL distribution, as well as its reliability and hazard rate features. To handle flexible and time-constrained testing frameworks in high-reliability environments, we formulate maximum likelihood estimators and Bayesian estimates derived via Markov chain Monte Carlo techniques under gamma priors, which effectively capture prior knowledge. Two asymptotic interval estimates are constructed through the normal approximation of the classical estimates and of the log-transformed classical estimates; two credible interval estimates are likewise constructed from the Markov chains. A robust simulation study compares the classical and Bayesian point estimation methods, along with the four interval estimation methods. The study’s practical usefulness is demonstrated by the analysis of a real-world dataset. The results reveal that both conventional and Bayesian inferential methods perform accurately, with the Bayesian outcomes surpassing those of the conventional method.
(This article belongs to the Special Issue Computational Statistics and Its Applications, 2nd Edition)
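
The two asymptotic intervals mentioned — the normal approximation on the original scale and on the log scale — take one line each. A small helper, assuming the MLE and its standard error are already in hand; the log-scale interval keeps the lower bound positive:

```python
import numpy as np
from scipy.stats import norm

def asymptotic_cis(theta_hat, se, level=0.95):
    """Normal-approximation CIs for a positive parameter, plain and log-scale."""
    z = norm.ppf(0.5 + level / 2.0)
    plain = (theta_hat - z * se, theta_hat + z * se)
    factor = np.exp(z * se / theta_hat)     # delta method: se(log est.) ~ se/est.
    logged = (theta_hat / factor, theta_hat * factor)
    return plain, logged

print(asymptotic_cis(0.8, 0.35))   # the plain CI can dip below zero; the log CI cannot
```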

18 pages, 2382 KiB  
Article
Bethe–Heitler Cascades and Hard Gamma-Ray Spectra in Flaring TeV Blazars: 1ES 0414+009 and 1ES 1959+650
by Samuel Victor Bernardo da Silva, Luiz Augusto Stuani Pereira and Rita de Cássia Dos Anjos
Universe 2025, 11(6), 177; https://doi.org/10.3390/universe11060177 - 31 May 2025
Viewed by 1443
Abstract
In this work, we present updated models of the spectral energy distributions (SEDs) for two high-frequency-peaked BL Lac objects (HBLs), namely 1ES 0414+009 and 1ES 1959+650. The hard gamma-ray spectra observed during their flaring states suggest the presence of an additional emission component beyond the standard synchrotron self-Compton (SSC) scenario. We explore the possibility that this hard gamma-ray emission arises from inverse Compton (IC) scattering by Bethe–Heitler pairs produced along the line of sight, pointing to a more complex high-energy emission mechanism in these sources.
(This article belongs to the Special Issue 10th Anniversary of Universe: Galaxies and Their Black Holes)

19 pages, 660 KiB  
Article
A Versatile Distribution Based on the Incomplete Gamma Function: Characterization and Applications
by Jimmy Reyes, Carolina Marchant, Karol I. Santoro and Yuri A. Iriarte
Mathematics 2025, 13(11), 1749; https://doi.org/10.3390/math13111749 - 25 May 2025
Viewed by 464
Abstract
In this study, we introduce a novel distribution related to the gamma distribution, referred to as the generalized incomplete gamma distribution. This new family is defined through a stochastic representation involving a linear transformation of a random variable following a distribution derived from the upper incomplete gamma function. As a result, the proposed distribution exhibits a probability density function that effectively captures data exhibiting asymmetry and both mild and high levels of kurtosis, providing greater flexibility compared to the conventional gamma distribution. We analyze the probability density function and explore fundamental properties, including moments, skewness, and kurtosis coefficients. Parameter estimation is conducted via the maximum likelihood method, and a Monte Carlo simulation study is performed to assess the asymptotic properties of the maximum likelihood estimators. To illustrate the applicability of the proposed distribution, we present two case studies involving real-world datasets related to mineral concentration and the length of odontoblasts in guinea pigs, demonstrating that the proposed distribution provides a superior fit compared to the gamma, inverse Gaussian, and slash-type distributions.
(This article belongs to the Section D1: Probability and Statistics)
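
The proposed generalized incomplete gamma distribution is not in standard libraries, but the model-comparison step of such a case study is easy to mimic: fit each candidate by maximum likelihood with SciPy and compare AIC values. Synthetic data stand in for the mineral-concentration measurements below.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
data = rng.gamma(shape=2.0, scale=1.5, size=200)   # stand-in dataset

def aic(dist, sample):
    params = dist.fit(sample)
    loglik = np.sum(dist.logpdf(sample, *params))
    return 2 * len(params) - 2 * loglik

for name, dist in [("gamma", stats.gamma),
                   ("inverse Gaussian", stats.invgauss),
                   ("lognormal", stats.lognorm)]:
    print(f"{name:>17s}: AIC = {aic(dist, data):.1f}")
```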

26 pages, 12878 KiB  
Article
Reliability Estimation for the Inverse Chen Distribution Under Adaptive Progressive Censoring with Binomial Removals: A Framework for Asymmetric Data
by Refah Alotaibi, Mazen Nassar and Ahmed Elshahhat
Symmetry 2025, 17(6), 812; https://doi.org/10.3390/sym17060812 - 23 May 2025
Viewed by 362
Abstract
Traditional reliability methods using fixed removal plans often overlook withdrawal randomness, leading to biased estimates for asymmetric data. This study advances classical and Bayesian frameworks for the inverse Chen distribution, which is suited to modeling asymmetric data, under adaptive progressive Type-II censoring with binomial removals. Here, removals after each failure follow a dynamic binomial process, providing a more realistic setting for reliability studies. Maximum likelihood estimates are computed numerically, with confidence intervals derived asymptotically. Bayesian approaches employ gamma priors, the symmetric squared error loss, and posterior sampling for estimates and credible intervals. A simulation study validates the methods, while two asymmetric real-world applications demonstrate practicality: (1) analyzing diamond sizes from South-West Africa, capturing skewed geological distributions, and (2) modeling failure times of airborne communication transceivers, vital for aviation safety. The flexibility of the inverse Chen distribution in handling asymmetric data addresses the limitations of symmetric assumptions, offering precise reliability tools for complex scenarios. This integration of adaptive censoring and asymmetric distributions advances reliability analysis, providing robust solutions where traditional approaches falter.
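
The binomial-removal mechanism is straightforward to simulate: after the i-th observed failure, R_i survivors are withdrawn with R_i ~ Binomial(n − m − ΣR_j, p), which guarantees that all m failures can still be observed. A sketch with stand-in Weibull lifetimes (not the inverse Chen of the paper):

```python
import numpy as np

def progressive_typeII_binomial(lifetimes, m, p, rng):
    """Progressive Type-II censoring with binomial removals (simulation sketch)."""
    alive = sorted(float(t) for t in lifetimes)
    observed, removals = [], []
    for i in range(1, m + 1):
        observed.append(alive.pop(0))           # next failure in the risk set
        max_r = len(alive) - (m - i)            # leave enough units for future failures
        r = rng.binomial(max_r, p) if i < m else max_r  # remove all survivors at the end
        for _ in range(r):                      # withdraw r random survivors
            alive.pop(rng.integers(len(alive)))
        removals.append(r)
    return np.array(observed), removals

rng = np.random.default_rng(6)
x = rng.weibull(0.8, size=30)                   # stand-in lifetimes
obs, rem = progressive_typeII_binomial(x, m=10, p=0.3, rng=rng)
```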

23 pages, 741 KiB  
Article
Empirical Bayes Estimators for Mean Parameter of Exponential Distribution with Conjugate Inverse Gamma Prior Under Stein’s Loss
by Zheng Li, Ying-Ying Zhang and Ya-Guang Shi
Mathematics 2025, 13(10), 1658; https://doi.org/10.3390/math13101658 - 19 May 2025
Viewed by 264
Abstract
A Bayes estimator for the mean parameter of an exponential distribution is calculated under Stein’s loss, which equally penalizes gross overestimation and underestimation, and the corresponding Posterior Expected Stein’s Loss (PESL) is determined. A Bayes estimator for the mean parameter is also obtained under squared error loss, along with its corresponding PESL. Furthermore, two methods are used to derive empirical Bayes estimators for the mean parameter of the exponential distribution with an inverse gamma prior. Numerical simulations are conducted to illustrate five aspects. Finally, the theoretical studies are illustrated using Static Fatigue 90% Stress Level data.
(This article belongs to the Special Issue Bayesian Statistical Analysis of Big Data and Complex Data)
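
Under the conjugate setup both estimators have closed forms: the posterior is inverse gamma, the squared-error-loss estimate is its mean, and the Stein’s-loss Bayes estimate is the reciprocal of the posterior mean of 1/θ. A sketch under one common parameterization (the paper’s hyperparameter conventions may differ):

```python
import numpy as np

# Exponential mean theta with conjugate InverseGamma(alpha, beta) prior:
# the posterior is InverseGamma(alpha + n, beta + sum(x)).
rng = np.random.default_rng(7)
x = rng.exponential(scale=3.0, size=40)
alpha, beta = 2.5, 4.0
a_post, b_post = alpha + len(x), beta + x.sum()

theta_sel = b_post / (a_post - 1.0)   # squared error loss: posterior mean

# Stein's loss L(theta, d) = d/theta - log(d/theta) - 1 is minimized by
# d = 1 / E[1/theta | x], and 1/theta | x ~ Gamma(a_post, rate=b_post).
theta_stein = b_post / a_post

print(theta_stein < theta_sel)        # always True: b/a < b/(a - 1)
```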

15 pages, 3639 KiB  
Article
Prediction of Temperature Factors in Proteins: Effect of Data Pre-Processing and Experimental Conditions
by Jure Pražnikar
Crystals 2025, 15(5), 455; https://doi.org/10.3390/cryst15050455 - 12 May 2025
Viewed by 427
Abstract
The B-factor, or temperature factor, is one of the most important parameters refined during protein structure determination, alongside the atomic coordinates, and is stored in the Protein Data Bank. It reflects the uncertainty of the atomic positions and is closely linked to atomic flexibility. Using graphlet degree vectors as feature descriptors in a linear model, together with appropriate data transformation and consideration of various experimental factors, yields better prediction results. For example, including crystal contacts in the linear model significantly improves the prediction accuracy. Since the distributions of B-factors typically follow an inverse gamma distribution, applying a logarithmic transformation further improves the performance of the model. Large ligands, such as those found in protein–DNA complexes, also have a significant impact on prediction quality. A linear model based on graphlet degree vectors proves effective not only for the prediction of B-factors and the validation of deposited protein structures but also for the qualitative estimation of root-mean-square fluctuations derived from molecular dynamics.
(This article belongs to the Section Macromolecular Crystals)
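
The effect of the logarithmic transformation is easy to reproduce: inverse-gamma-like B-factors are strongly right-skewed, and taking logs before fitting a linear model removes most of that skew. The feature matrix below is a random stand-in for the graphlet degree vectors.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
b_factors = stats.invgamma.rvs(a=4.0, scale=60.0, size=500, random_state=rng)

print(stats.skew(b_factors), stats.skew(np.log(b_factors)))  # skew drops sharply

X = rng.standard_normal((500, 10))            # hypothetical feature descriptors
X = np.column_stack([np.ones(500), X])        # intercept column
coef, *_ = np.linalg.lstsq(X, np.log(b_factors), rcond=None)  # linear model on log(B)
```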

20 pages, 1097 KiB  
Article
Rao and Wald Tests in Nonzero-Mean Non-Gaussian Sea Clutter
by Haoqi Wu, Hongzhi Guo, Zhihang Wang and Zishu He
Remote Sens. 2025, 17(10), 1696; https://doi.org/10.3390/rs17101696 - 12 May 2025
Viewed by 300
Abstract
The non-Gaussian nature of radar-observed clutter echoes degrades the performance of conventional Gaussian detectors in remote sensing target detection. To enhance detection performance, this study addresses adaptive detection in nonzero-mean non-Gaussian sea clutter environments. A nonzero-mean compound Gaussian model, composed of a texture and complex Gaussian speckle, is used to capture the sea clutter, with the texture component characterized by inverse gamma, gamma, and inverse Gaussian distributions. Novel adaptive detectors based on the two-step Rao and Wald tests are designed, using the maximum a posteriori (MAP) method to estimate the textures. More specifically, test statistics of the proposed Rao- and Wald-based detectors are derived in the first step by assuming the speckle covariance matrix (CM), mean vector (MV), and clutter texture are known; these parameters are then replaced with their estimates, yielding fully adaptive detectors. Monte Carlo performance evaluations using both simulated and measured sea clutter data validate the constant false alarm rate (CFAR) properties and detection performance of the proposed nonzero-mean detectors. Additionally, the proposed Rao and Wald detectors show strong robustness and good selectivity, respectively, for mismatched signals.
(This article belongs to the Special Issue Array and Signal Processing for Radar)
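
A sketch of the clutter model and the MAP texture step, assuming an identity speckle covariance matrix so the inverse gamma posterior is available in closed form; the paper additionally estimates the CM and MV adaptively.

```python
import numpy as np

rng = np.random.default_rng(9)
N, K = 8, 10_000                        # pulses per cell, number of range cells
nu, scale = 3.0, 2.0                    # inverse gamma texture parameters
mv = (0.5 + 0.5j) * np.ones(N)          # nonzero mean vector

tau = scale / rng.gamma(nu, size=K)     # tau ~ InverseGamma(nu, scale)
g = (rng.standard_normal((K, N)) + 1j * rng.standard_normal((K, N))) / np.sqrt(2)
clutter = mv + np.sqrt(tau)[:, None] * g     # heavy-tailed, nonzero-mean returns

# Given mv and identity speckle CM, tau | x ~ InverseGamma(nu + N, scale + q)
# with q = ||x - mv||^2, so the MAP estimate is the posterior mode:
q = np.sum(np.abs(clutter - mv) ** 2, axis=1)
tau_map = (scale + q) / (nu + N + 1.0)
```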

20 pages, 2421 KiB  
Article
Socioeconomic Profile of Agricultural Producers and Production Systems in Municipalities of Piauí, Brazil
by Creusa Carvalho da Costa, Ana Cristina Alves Rodrigues, Caroline Chaves Arantes, Graciliano Galdino Alves dos Santos and Emil José Hernández Ruz
Sustainability 2025, 17(9), 4137; https://doi.org/10.3390/su17094137 - 2 May 2025
Viewed by 642
Abstract
Floodplain agriculture is a practice that involves cultivating arable soils along riverbanks and reservoirs, which become submerged during the rainy season. This study analyzes the socioeconomic aspects of floodplain farming in the municipalities of Amarante, Floriano, and Uruçuí along the banks of the Parnaíba River in northeastern Brazil. We conducted semi-structured interviews using the rapport technique. Data were analyzed using generalized linear models with four distributions (gamma, inverse Gaussian, exponential, and Gaussian) to identify patterns and relationships between socioeconomic variables and production system profiles. The average age of respondents was 49 years across the three communities, with a predominance of male farmers; communities in Uruçuí had lived in the area the longest. In terms of monthly income, 80% of farmers earned up to one minimum wage, and properties in Amarante had the highest average land area in hectares. We conclude that agriculture in the studied region is dominated by manual planting, low adoption of technologies, and scarce use of soil conservation techniques, pointing to the need for more sustainable agricultural practices, management plans, and rural extension services.
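
Fitting one of the four generalized linear models is a one-liner with statsmodels; the variables and coefficients below are hypothetical stand-ins for the survey data, and swapping the family (InverseGaussian, Gaussian, etc.) and comparing AIC mirrors the model-selection step.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(10)
age = rng.uniform(25, 70, 120)                  # hypothetical covariates
land = rng.uniform(0.5, 20, 120)
mu = np.exp(0.5 + 0.01 * age + 0.03 * land)
income = rng.gamma(shape=2.0, scale=mu / 2.0)   # gamma response with mean mu

X = sm.add_constant(np.column_stack([age, land]))
fit = sm.GLM(income, X,
             family=sm.families.Gamma(link=sm.families.links.Log())).fit()
print(fit.aic)
```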
