Article

Newly Improved Two-Parameter Ridge Estimators: A Better Approach for Mitigating Multicollinearity in Regression Analysis

by Muteb Faraj Alharthi 1 and Nadeem Akhtar 2,*

1 Department of Mathematics and Statistics, College of Science, Taif University, Taif 21944, Saudi Arabia
2 Higher Education Department, Peshawar 26281, Khyber Pakhtunkhwa, Pakistan
* Author to whom correspondence should be addressed.
Axioms 2025, 14(3), 186; https://doi.org/10.3390/axioms14030186
Submission received: 22 January 2025 / Revised: 26 February 2025 / Accepted: 28 February 2025 / Published: 2 March 2025
(This article belongs to the Special Issue Computational Statistics and Its Applications)

Abstract

This study tackles multicollinearity, a common issue in regression models that arises when predictor variables are highly correlated, leading to unreliable coefficient estimates and inflated variances that ultimately affect the model’s accuracy. To address this issue, we introduce four improved two-parameter ridge estimators, named MIRE1, MIRE2, MIRE3, and MIRE4, which incorporate innovative adjustments such as logarithmic transformations and customized penalization strategies to enhance estimation efficiency. These biased estimators are evaluated through a comprehensive Monte Carlo simulation using the minimum estimated mean square error (MSE) criterion. Although no single ridge estimator performs optimally under all conditions, our proposed estimators consistently outperform existing estimators in most scenarios; notably, MIRE2 and MIRE3 emerge as the best-performing estimators across a variety of conditions. Their practical utility is further demonstrated through applications to two real-world datasets. The results confirm that the proposed ridge estimators offer a reliable and effective approach for improving estimation precision in regression models, as they consistently yield the lowest MSE compared to other estimators.

1. Introduction

Linear regression is a widely used statistical technique for modeling the relationship between a dependent variable and one or more independent variables. The standard linear regression model is expressed as:

$$ y = X\beta + \epsilon, \quad (1) $$

where $y = (y_1, y_2, \ldots, y_n)'$ is an $n \times 1$ vector of response observations, $\beta = (\beta_1, \beta_2, \ldots, \beta_p)'$ is a $p \times 1$ vector of unknown regression coefficients, $X = \begin{pmatrix} x_{11} & x_{12} & \cdots & x_{1p} \\ x_{21} & x_{22} & \cdots & x_{2p} \\ \vdots & \vdots & & \vdots \\ x_{n1} & x_{n2} & \cdots & x_{np} \end{pmatrix}$ is an $n \times p$ matrix of predictor observations, and $\epsilon = (\epsilon_1, \epsilon_2, \ldots, \epsilon_n)'$ is an $n \times 1$ vector of error terms.
The ordinary least squares (OLS) solution for estimating the regression coefficients is given in Equation (2):

$$ \hat{\beta}_{ols} = (\hat{\beta}_1, \hat{\beta}_2, \ldots, \hat{\beta}_p)' = (X'X)^{-1}X'y. \quad (2) $$
However, OLS estimates can become unstable in the presence of multicollinearity among predictor variables, leading to inflated variances of the regression coefficients [1]. Ridge regression is a robust alternative that addresses this problem by adding a penalty term to the OLS criterion. The ridge regression estimator is defined in Equation (3) as:

$$ \hat{\beta}_{Rd} = (X'X + kI)^{-1}X'y. \quad (3) $$
In this equation, $I$ is the $p \times p$ identity matrix and $k$ is a positive scalar usually called the ridge parameter. The ridge parameter controls the amount of shrinkage applied to the coefficient estimates, and the choice of $k$ is central to the tradeoff between bias and variance.
To enhance ridge regression’s flexibility, a two-parameter ridge estimator has been proposed [2]. This generalized form introduces a scaling factor q and is expressed in Equation (4) as:
$$ \hat{\beta}_{q,k} = q(X'X + kI)^{-1}X'y. \quad (4) $$
Here, q is a scaling factor, and the presence of both q and k gives a more flexible class of solutions to the ridge regression problem. The scaling factor q can be computed as:
$$ \hat{q} = \frac{y'X(X'X + kI)^{-1}X'y}{y'X(X'X + kI)^{-1}X'X(X'X + kI)^{-1}X'y}. \quad (5) $$
Equation (4) reduces to the conventional ridge estimator when $q = 1$, and to the OLS estimator when $q = 1$ and $k = 0$. The two-parameter ridge estimator is therefore a useful generalization for addressing multicollinearity in regression analysis. Ref. [3] provided a further analysis of the two-parameter ridge estimator under the mean square error criterion and its development for mitigating multicollinearity.
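To make these estimators concrete, the following R sketch (illustrative only; the toy dataset, the induced collinearity, and the fixed value of $k$ are all assumptions, not taken from the paper) computes the OLS, ridge, and two-parameter ridge estimates of Equations (2)–(5):

```r
# Illustrative sketch: OLS, ridge, and two-parameter ridge on a toy dataset.
set.seed(1)
n <- 50; p <- 4
X <- matrix(rnorm(n * p), n, p)
X[, 2] <- X[, 1] + rnorm(n, sd = 0.05)       # induce strong collinearity
y <- X %*% c(1, 1, 0.5, -0.5) + rnorm(n)

k   <- 0.5                                   # ridge parameter (arbitrary here)
XtX <- crossprod(X)                          # X'X
Xty <- crossprod(X, y)                       # X'y
beta_ols   <- solve(XtX, Xty)                # Equation (2)
beta_ridge <- solve(XtX + k * diag(p), Xty)  # Equation (3)

A     <- solve(XtX + k * diag(p))
q_hat <- drop(t(Xty) %*% A %*% Xty) /        # Equation (5)
         drop(t(Xty) %*% A %*% XtX %*% A %*% Xty)
beta_qk <- q_hat * beta_ridge                # Equation (4)
```

Adding $k$ to the diagonal of $X'X$ bounds the smallest eigenvalue away from zero, which is what stabilizes the inverse; the scaling factor $q$ then rescales the shrunken coefficients.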
Several studies have explored specific improvements in ridge regression. Ref. [4] conducted a Monte Carlo evaluation of various ridge-type estimators, providing empirical comparisons of their performance. Further, the work of [5] offered refined methods for selecting ridge parameters, essential for optimizing model accuracy. Ref. [6] also evaluated new ridge regression estimators, contributing to the empirical understanding of these methods.
The development of ridge regression has also crossed into specialized fields. For instance, Ref. [7] examined its role in logistic regression, while [8] focused on improved ridge regression estimators for logistic models, offering more robust options in binary classification problems. Ref. [9] reviewed ridge-type estimators, highlighting raise estimators for geometric data perturbation and parameter selection. More recently, Ref. [10] developed new ridge estimators for data with severe multicollinearity. The authors of [11] further explored robust ridge estimators, particularly under conditions of multicollinearity and the presence of outliers. These robust methods provide a more reliable estimation framework for real-world datasets.
Modern applications of ridge regression are evident in various fields, including environmental science and econometrics. For example, Ref. [12] developed a carbon price forecasting model based on Lasso regression and optimal integration, blending traditional ridge regression with machine learning techniques for improved forecasting accuracy. Ref. [10] developed two-parameter estimators to address linear regression models with correlated predictors. Additionally, the work of [13] introduced two-parameter ridge estimators, further advancing ridge regression in dealing with correlated predictors.
Although significant progress has been made in the development of ridge estimators, no single ridge estimator has consistently demonstrated optimal performance across varying multicollinearity conditions. Existing estimators often excel in specific scenarios but lack robustness and adaptability when applied to diverse datasets. To fill this gap, this study proposes four novel ridge estimators, referred to as MIRE1, MIRE2, MIRE3, and MIRE4. These estimators are rigorously evaluated through comprehensive Monte Carlo simulations and compared to existing estimators using the MSE criterion. The results indicate that the proposed estimators offer superior performance, particularly on multicollinear datasets, and thus provide a set of estimators that perform reliably across diverse conditions, marking a significant advancement in the development of flexible and effective regression techniques.
The article is structured as follows. Section 2 discusses ridge regression methodology, introducing the proposed and existing estimators. Section 3 covers the simulation study and results analysis. Section 4 applies the estimators to two real datasets, and Section 5 concludes the study.

2. Statistical Methodology for Ridge Regression

To simplify the model, we can express Equation (1) in its canonical form as:
$$ y = Z\varphi + \epsilon, $$

where
• $Z = XD$ is the transformed design matrix;
• $\varphi = D'\beta$, where $D$ is an orthogonal matrix containing the eigenvectors of $X'X$;
• $\beta$ is the parameter vector, and $\epsilon$ represents the noise;
• $D'D = I_p$, ensuring orthogonality, with $I_p$ being the identity matrix;
• the transformation $Z = XD$ aligns $Z$ with the principal components of $X$.

Additionally, we define:

$$ \Lambda = D'X'XD. $$

Here, $\Lambda$ is a diagonal matrix of the eigenvalues $\lambda_1, \lambda_2, \ldots, \lambda_p$, arranged in ascending order.
Using the canonical form, ridge regression estimators can be rewritten in terms of the transformed parameters φ instead of .
The standard OLS estimator in canonical form becomes:
$$ \hat{\varphi} = \Lambda^{-1}Z'y. $$

Here, $\Lambda^{-1}$ directly scales the projections $Z'y$ by the reciprocals of the eigenvalues, leading to instability when the $\lambda_i$ are small.
The ridge regression estimator in its canonical form is given as:
$$ \hat{\varphi}_k = (\Lambda + kI_p)^{-1}Z'y. $$
This formulation ensures stability by adding k to each eigenvalue, effectively shrinking the coefficients corresponding to small eigenvalues.
The two-parameter ridge regression estimator in canonical form is:
$$ \hat{\varphi}_{q,k} = q(\Lambda + kI_p)^{-1}Z'y. $$
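A short R sketch of the canonical transformation, reusing `X`, `y`, `k`, and `q_hat` from the previous snippet (again a sketch, not the authors' code):

```r
# Canonical form: rotate X onto the eigenvectors of X'X.
eig    <- eigen(crossprod(X), symmetric = TRUE)
D      <- eig$vectors               # orthogonal matrix of eigenvectors
Lambda <- eig$values                # eigenvalues (R returns them in decreasing order)
Z      <- X %*% D                   # Z'Z = Lambda, so the problem decouples

phi_ols   <- drop(crossprod(Z, y)) / Lambda        # canonical OLS
phi_ridge <- drop(crossprod(Z, y)) / (Lambda + k)  # canonical ridge
phi_qk    <- q_hat * phi_ridge                     # canonical two-parameter ridge
beta_back <- D %*% phi_qk           # map back to the original coefficients
```

Because $Z'Z = \Lambda$ is diagonal, every estimator acts coordinate-wise on the eigenvalues, which is why the canonical form is convenient for defining ridge parameters.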

2.1. Ridge Estimators

In this section, we discuss both existing ridge regression techniques and the newly proposed ridge estimators.

2.1.1. Existing Ridge Estimators

Several ridge estimators have been introduced in the literature; here, we consider those most relevant to our study. In the following, $\hat{\sigma}^2$ represents the estimated error variance of the regression model and $\lambda_{max}$ denotes the largest eigenvalue of the matrix $X'X$.
The authors of [1] introduced the first ridge estimator, referred to as HK. It is defined as:
$$ \hat{k}_{HK} = \frac{\hat{\sigma}^2}{\hat{\varphi}_{Max}^2}, \quad \text{where} \quad \hat{\varphi}_{Max} = \mathrm{Max}(\hat{\varphi}_1, \hat{\varphi}_2, \ldots, \hat{\varphi}_p). \quad (10) $$
Ref. [14] modified the HK estimator by incorporating the number of predictors and the modified version is referred to as the HKB estimator. Its form is:
$$ \hat{k}_{HKB} = \frac{p\,\hat{\sigma}^2}{\sum_{i=1}^{p}\hat{\varphi}_i^2}. \quad (11) $$
Kibria [6] introduced three ridge estimators based on averaging techniques, referred to as KAM (Arithmetic Mean), KGM (Geometric Mean), and KMed (Median). These estimators are defined as follows:
$$ \hat{k}_{AM} = \frac{1}{p}\sum_{i=1}^{p}\frac{\hat{\sigma}^2}{\hat{\varphi}_i^2}, \qquad \hat{k}_{GM} = \frac{\hat{\sigma}^2}{\left(\prod_{i=1}^{p}\hat{\varphi}_i^2\right)^{1/p}}, \qquad \hat{k}_{Med} = \mathrm{Med}\left(\frac{\hat{\sigma}^2}{\hat{\varphi}_i^2}\right). \quad (12) $$
Researchers in [15] focused on eigenvalue-based adjustments to introduce a new estimator referred to as KMS with the following expression:
$$ \hat{k}_{KMS} = \lambda_{max}\sum_{i=1}^{p}\frac{\hat{\varphi}_i\,\hat{\sigma}^2}{\hat{\varphi}_{Max}^2}. \quad (13) $$
In [16], an enhanced ridge regression method was proposed by introducing a two-parameter ridge estimator, termed the LC estimator in this study. Its ridge parameter $k$ is determined through Equation (10), while the second parameter $q$ is derived using Equation (5). Similarly, the authors of [3] developed another two-parameter ridge estimator, referred to as the TK estimator, with optimal values of $q$ and $k$ calculated using Equation (14):
$$ \hat{q}_{opt} = \frac{\sum_{i=1}^{p}\frac{\hat{\varphi}_i^2\lambda_i}{\lambda_i + k}}{\sum_{i=1}^{p}\frac{\hat{\sigma}^2\lambda_i + \hat{\varphi}_i^2\lambda_i^2}{(\lambda_i + k)^2}}, \qquad \hat{k}_{opt} = \frac{\hat{q}_{opt}\sum_{i=1}^{p}\hat{\sigma}^2\lambda_i + (\hat{q}_{opt} - 1)\sum_{i=1}^{p}\hat{\varphi}_i^2\lambda_i^2}{\sum_{i=1}^{p}\hat{\varphi}_i^2\lambda_i}. \quad (14) $$
The authors of [17] proposed three modified versions of the two-parameter ridge estimators, referred to as MTPR1, MTPR2, and MTPR3, which are defined as follows:
$$ \text{MTPR1: } \hat{k}_{AM}^{*} = \frac{\sum_{i=1}^{p}k_i^{*}}{p}, \qquad \text{MTPR2: } \hat{k}_{GM}^{*} = \left(\prod_{i=1}^{p}k_i^{*}\right)^{1/p}, \qquad \text{MTPR3: } \hat{k}_{HM}^{*} = \frac{p}{\sum_{i=1}^{p}\frac{1}{k_i^{*}}}. \quad (15) $$
In each instance above, the adjusted $k$-value for the $i$th predictor is computed as:

$$ k_i^{*} = w_i\,\hat{k}_{opt}, $$

where $\hat{k}_{opt}$ is the optimal $k$-value determined through an optimization process in these ridge estimation models. The weight $w_i$ is defined as:

$$ w_i = \frac{\lambda_i}{|\hat{\varphi}_i|}, $$

where $|\hat{\varphi}_i|$ is the absolute value of the $i$th regression coefficient and $\lambda_i$ is the corresponding eigenvalue of $X'X$.
The $k$-values of the estimators mentioned above are used to determine $q$ from Equation (14).
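Under the stated definitions, the single-parameter rules above reduce to one-line computations. A hedged R sketch follows, reusing `Z`, `y`, `phi_ols`, `n`, and `p` from the earlier snippets (the TK rule is omitted because its $q$ and $k$ in Equation (14) depend on each other and require an iterative or plug-in solution):

```r
# Classical ridge-parameter rules in canonical form.
phi_hat    <- phi_ols
sigma2_hat <- sum((y - Z %*% phi_hat)^2) / (n - p)   # estimated error variance

k_HK  <- sigma2_hat / max(phi_hat^2)                 # Hoerl-Kennard
k_HKB <- p * sigma2_hat / sum(phi_hat^2)             # Hoerl-Kennard-Baldwin
k_AM  <- mean(sigma2_hat / phi_hat^2)                # Kibria, arithmetic mean
k_GM  <- sigma2_hat / prod(phi_hat^2)^(1 / p)        # Kibria, geometric mean
k_Med <- median(sigma2_hat / phi_hat^2)              # Kibria, median
```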
The above competitive ridge estimators, HK, HKB, KAM, KGM, KMed, KMS, LC, TK, MTPR1, MTPR2, and MTPR3, were assessed and compared with our newly proposed ridge estimators, MIRE1, MIRE2, MIRE3, and MIRE4 using simulations and real-life datasets exhibiting severe multicollinearity.

2.1.2. New Proposed Ridge Estimators

We introduce four modified Lipovetsky–Conklin ridge estimators, referred to in this study as MIRE1, MIRE2, MIRE3, and MIRE4, which integrate the methodologies of [3,16] to determine the optimal values of $q$ and $k$. Given the maximum eigenvalue $\lambda_{max}$, the $k$-values for our four estimators are outlined as follows:
$$ \text{MIRE1: } k_1 = \log\!\left(1 + \frac{1}{p}\sum_{i=1}^{p}\frac{(\lambda_i\hat{\varphi}_i)^2\,\hat{\sigma}^2}{\hat{\varphi}_{max}^2}\right), \quad (16) $$

$$ \text{MIRE2: } k_2 = \frac{\lambda_{max}\,\hat{\varphi}_i\,\hat{\sigma}^2}{\hat{\varphi}_{max}^2}, \quad (17) $$

$$ \text{MIRE3: } k_3 = \lambda_{max}\log(1 + \hat{\varphi}_i), \quad (18) $$

$$ \text{MIRE4: } k_4 = \frac{\lambda_{max}\,\hat{\varphi}_i^2\,\hat{\sigma}^2}{\hat{\varphi}_{max}^2}. \quad (19) $$
Equation (5) is used for calculating the estimated q -value. Based on these calculations, our four novel two-parameter ridge estimators, denoted as MIRE1, MIRE2, MIRE3, and MIRE4, can be obtained.
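The following R sketch computes the proposed $k$-values as reconstructed above. Because the published formulas carry a free index $i$, MIRE2–MIRE4 are evaluated per coefficient here and returned as vectors; how they are collapsed to a single scalar (e.g., by a maximum or a mean) is not stated, so treat this as one plausible reading:

```r
# Proposed k-values (one reading of the formulas; reuses Lambda, phi_hat, sigma2_hat).
lambda_max <- max(Lambda)
phi_max2   <- max(phi_hat^2)                 # squared largest coefficient

k1 <- log(1 + mean((Lambda * phi_hat)^2) * sigma2_hat / phi_max2)  # MIRE1 (scalar)
k2 <- lambda_max * abs(phi_hat) * sigma2_hat / phi_max2            # MIRE2 (per i)
k3 <- lambda_max * log(1 + abs(phi_hat))                           # MIRE3 (per i)
k4 <- lambda_max * phi_hat^2  * sigma2_hat / phi_max2              # MIRE4 (per i)
```

The absolute values in `k2` and `k3` are added defensively so that the $k$-values stay positive; the source formulas do not show them explicitly.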

2.1.3. Performance of Estimators Under Mean Square Error

The performance of the proposed estimators was evaluated using the Mean Squared Error (MSE) criterion. This approach has been widely employed in previous studies, including [13,18], to analyze the effectiveness of various estimators. The MSE is represented by the following expression:

$$ \mathrm{MSE}(\hat{\varphi}) = E\left[(\hat{\varphi} - \varphi)'(\hat{\varphi} - \varphi)\right] = \frac{1}{p}\sum_{i=1}^{p}(\hat{\varphi}_i - \varphi_i)^2. $$
Since theoretical comparisons of ridge estimators based on MSE are complex, the ridge estimators, shown in Equation (10) to Equation (19), were evaluated through Monte Carlo simulations as described in the following section.

3. Monte Carlo Simulations

This section outlines the Monte Carlo simulation process used to assess the performance of the proposed ridge estimators in comparison to OLS and other existing ridge estimators.

3.1. Simulation Study

This study investigates ridge regression performance by generating predictor variables under controlled multicollinearity settings. The predictors $x_{ij}$ were simulated using Equation (22), a process adopted by many researchers, including [19,20]:

$$ x_{ij} = (1 - \rho^2)^{1/2}z_{ji} + \rho z_{jp}, \quad i = 1, 2, \ldots, p, \quad j = 1, 2, \ldots, n, \quad (22) $$
where $x_{ij}$ represents the $j$th observation for the $i$th predictor, $\rho$ denotes the correlation between predictors, the $z_{ji}$ are independent random samples drawn from a standard normal distribution, $p$ is the total number of predictors, and $n$ represents the sample size. To examine a range of multicollinearity conditions, $\rho$ was varied across 0.80, 0.90, 0.95, and 0.99. Sample sizes ($n = 20, 50, 100$) and predictor counts ($p = 4, 10$) were selected to assess model robustness across different dimensionalities.
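A minimal R implementation of this generator is sketched below; the shared component is taken as an extra $(p+1)$-th standard normal column, which is the usual reading of this classic scheme:

```r
# Generate an n x p standardized predictor matrix with collinearity set by rho.
gen_X <- function(n, p, rho) {
  Z <- matrix(rnorm(n * (p + 1)), n, p + 1)           # independent N(0,1) draws
  X <- sqrt(1 - rho^2) * Z[, 1:p] + rho * Z[, p + 1]  # add a shared component
  scale(X)                                            # standardize each column
}
X <- gen_X(n = 50, p = 4, rho = 0.95)
```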
The response variable y j is generated using the following model:
$$ y_j = \varphi_0 + \varphi_1 x_{j1} + \varphi_2 x_{j2} + \cdots + \varphi_p x_{jp} + \epsilon_j, \quad j = 1, 2, \ldots, n, \quad (23) $$

where $\varphi_0$ is the intercept (set to zero), the $\varphi_i$ are the regression coefficients chosen based on optimal conditions, and $\epsilon_j$ is an error term drawn from a normal distribution with variance $\sigma^2$. The error variance was examined at four levels: $\sigma^2 = 0.4, 1, 5,$ and $10$. The regression coefficients $\varphi_i$ were calculated by identifying the optimal direction, as outlined in the methodology proposed in [21]. Algorithm 1 was used to compute the estimated MSE of the estimators.
Algorithm 1: MSE Computation.
i. Generate and standardize the matrix of independent variables using Equation (22).
ii. Calculate the eigenvalues $\lambda_1, \ldots, \lambda_p$ and eigenvectors $e_1, \ldots, e_p$ of $X'X$, ensuring $\sum_{i=1}^{p}\lambda_i = p$.
iii. Determine the regression coefficients $\varphi$ in canonical form from $P = (e_1, \ldots, e_p)$, using the eigenvector corresponding to the largest eigenvalue $\lambda_{max}$, and generate random error terms from the $N(0, 1)$ distribution.
iv. Compute the dependent variable values using Equation (23).
v. Calculate the OLS and ridge regression estimates of $\varphi$ using the appropriate equations.
vi. Repeat steps i–v for $N$ = 10,000 Monte Carlo iterations.
vii. Calculate the MSE for all estimators using:

$$ \mathrm{MSE}(\hat{\varphi}) = \frac{1}{N}\sum_{j=1}^{N}(\hat{\varphi}_j - \varphi)'(\hat{\varphi}_j - \varphi). \quad (24) $$
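A compact R skeleton of Algorithm 1 is sketched below (illustrative; the full study loops over all combinations of $n$, $p$, $\rho$, and $\sigma^2$ and over every estimator, and the coefficient choice follows the common device of taking the unit-norm eigenvector of $X'X$ associated with $\lambda_{max}$):

```r
# Monte Carlo estimate of MSE, Equation (24); gen_X() is defined above.
mc_mse <- function(N = 1000, n = 50, p = 4, rho = 0.95, sigma2 = 1) {
  sse <- 0
  for (r in 1:N) {
    X    <- gen_X(n, p, rho)
    eig  <- eigen(crossprod(X), symmetric = TRUE)
    beta <- eig$vectors[, 1]                       # unit-norm coefficients (step iii)
    y    <- drop(X %*% beta) + rnorm(n, sd = sqrt(sigma2))
    Z    <- X %*% eig$vectors                      # canonical design
    phi  <- drop(crossprod(eig$vectors, beta))     # canonical coefficients, D'beta
    phi_hat <- drop(crossprod(Z, y)) / eig$values  # OLS; swap in any ridge rule
    sse <- sse + sum((phi_hat - phi)^2)
  }
  sse / N
}
mc_mse()
```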
A simulation study with $N$ = 10,000 runs was performed in the R programming language to evaluate the MSE for varying values of $\rho$, $n$, and $p$. The results are summarized in Tables A1–A7 in Appendix A, where bolded values indicate the lowest MSE in each scenario, highlighting higher accuracy and improved predictive performance. A detailed analysis and discussion of these results is provided in the following section.

3.2. Discussion of the Results

The performance of the newly proposed estimators (MIRE1, MIRE2, MIRE3, and MIRE4), OLS, and other existing estimators was tested using the estimated MSE as the evaluation criterion. The study comprehensively considered varying sample sizes ($n = 20, 50, 100$), numbers of predictors ($p = 4, 10$), levels of correlation $\rho$, and error variances $\sigma^2$. The simulation findings are highlighted in the following points.
Proposed Estimators (MIRE1, MIRE2, MIRE3, and MIRE4): Across the different scenarios of $n$, $p$, $\rho$, and $\sigma^2$, the proposed estimators consistently achieved lower MSE values than most existing estimators. Among them, MIRE3 often outperformed MIRE1 and MIRE2, particularly in scenarios with higher correlations ($\rho \geq 0.95$) and larger numbers of predictors ($p = 10$), suggesting that MIRE3 is better suited to handling multicollinearity in these settings.
Existing Estimators: OLS and the KAM estimator showed higher MSE values in nearly all scenarios, particularly when $\rho$ and $p$ were large, highlighting their sensitivity to multicollinearity. Other existing estimators, such as KMed, KMS, and HK, tended to perform better but were generally outperformed by our proposed MIRE estimators. The performance of the LC, TK, MTPR1, MTPR2, and MTPR3 estimators depended heavily on the combination of $n$, $p$, and $\sigma^2$. For instance, LC and TK achieved lower MSE under small error variances ($\sigma^2 = 0.40$) but fell short in scenarios with high $\sigma^2$.
Impact of Correlation ($\rho$): As the correlation increased, the challenge of multicollinearity became more pronounced. The proposed estimators demonstrated robustness under high correlations ($\rho = 0.95, 0.99$), maintaining lower MSE values than the other estimators. The OLS and HK estimators suffered significantly under high $\rho$, with their MSE values increasing dramatically, particularly when $p = 10$.
Impact of Sample Size ($n$): Larger sample sizes ($n = 50, 100$) generally led to reduced MSE values across all estimators, reflecting the benefits of increased data availability. For small sample sizes ($n = 20$), our MIRE estimators remained competitive, with MTPR3 emerging as particularly reliable.
The performance gap between the proposed and existing estimators narrowed with increasing $n$; however, the proposed estimators maintained their advantage, especially in higher-dimensional settings ($p = 10$).
Impact of Predictors ($p$): Increasing the number of predictors (for example, $p = 10$) exacerbated multicollinearity, leading to a noticeable rise in MSE for the existing estimators. The proposed estimators showed their strength in handling high-dimensional data, with MIRE3 consistently delivering the lowest MSE among all methods.
Impact of Error Variance ($\sigma^2$): A smaller error variance ($\sigma^2 = 0.40$) favored all estimators, resulting in reduced MSE. As $\sigma^2$ increased, the gap between the proposed and existing estimators widened, highlighting the robustness of MIRE1, MIRE2, MIRE3, and MIRE4 under high error variances.
The summary in Table A7 (provided in Appendix A) reports the best-performing estimator across sample sizes ($n$), error variances ($\sigma^2$), and multicollinearity levels ($\rho$), showing that the proposed estimators (MIRE1, MIRE2, MIRE3, and MIRE4) performed best in 30 out of 32 scenarios. They excelled under severe multicollinearity ($\rho > 0.95$) and high noise ($\sigma^2 = 5, 10$). For a small sample ($n = 20$), MTPR1 performed well in five cases under low noise, while MIRE3 and MIRE2 dominated in eight scenarios as noise increased. For larger samples ($n = 50, 100$), MIRE3 and MIRE2 achieved the lowest MSE in 15 scenarios, with the MTPR estimators excelling in three low-noise cases. These results underline the adaptability and consistency of the proposed estimators, particularly in challenging multicollinear settings.

4. Real-Life Applications

The first dataset we considered is the Blood Pressure dataset, sourced from Applied Regression Analysis [22], which serves as a benchmark in regression analysis. The second dataset, Acetylene Conversion data, was taken from [23] and foundational research by [24]. We utilized these two real datasets to evaluate the performance of the proposed and existing ridge estimators.

4.1. Blood Pressure Patients Dataset

The Blood Pressure (BP) Patients dataset includes 20 patients and seven variables, with BP as the response variable and six predictors: age ($X_1$), weight ($X_2$), BSA ($X_3$), duration ($X_4$), pulse ($X_5$), and stress ($X_6$). We fit a linear regression model with BP as the dependent variable and the six predictors as independent variables:

$$ y = \varphi_0 + \varphi_1 X_1 + \varphi_2 X_2 + \varphi_3 X_3 + \varphi_4 X_4 + \varphi_5 X_5 + \varphi_6 X_6 + \epsilon, $$

where $\varphi_0$ is the intercept, $\varphi_1, \varphi_2, \ldots, \varphi_6$ are the coefficients for each predictor, and $\epsilon$ represents the error term.
To check for multicollinearity in the Blood Pressure Patients dataset, we use the condition number (CN), the eigenvalues, variance inflation factors (VIFs), and a heatmap of pairwise correlations. A CN above 30, small eigenvalues, or VIF values exceeding 5 indicate multicollinearity, which can distort regression results. Addressing multicollinearity may require methods such as ridge regression.
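These diagnostics are straightforward to reproduce. A hedged R sketch is given below; the data frame `dat` and its column names are hypothetical placeholders for the Blood Pressure data:

```r
# Condition number and VIFs for a set of standardized predictors.
X  <- scale(as.matrix(dat[, c("Age", "Weight", "BSA", "Dur", "Pulse", "Stress")]))
ev <- eigen(crossprod(X), symmetric = TRUE)$values
CN <- max(ev) / min(ev)       # condition number: largest / smallest eigenvalue

# VIF for predictor j: 1 / (1 - R^2) from regressing X_j on the other predictors.
vif <- sapply(seq_len(ncol(X)), function(j) {
  r2 <- summary(lm(X[, j] ~ X[, -j]))$r.squared
  1 / (1 - r2)
})
```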
The dataset demonstrates severe multicollinearity, evidenced by a high CN (1746.33) and a smallest eigenvalue of 0.0022, indicating near-linear dependency among predictors. VIF values are also excessively high for key variables such as BP (259.76), age (29.02), and weight (161.35). Furthermore, Figure 1 reveals strong positive correlations between BP and predictors such as weight and BSA, while stress shows weaker correlations with the other variables. These findings confirm significant multicollinearity, which can lead to unstable and unreliable regression coefficient estimates. To mitigate this issue, the newly proposed and existing ridge estimators were applied to stabilize the model and improve estimation precision.
The analysis of this real dataset validates the simulation results, confirming that our proposed ridge parameter estimator MIRE2 performs better than the other existing ridge estimators, as shown in Table 1.

4.2. Acetylene Conversion Dataset

This dataset includes 16 observations on a response variable $y$ and three predictors $X_1$, $X_2$, and $X_3$. The corresponding linear regression model is expressed as:

$$ y = \varphi_0 + \varphi_1 X_1 + \varphi_2 X_2 + \varphi_3 X_3 + \epsilon, $$

where $\varphi_0$ is the intercept, $\varphi_1$, $\varphi_2$, and $\varphi_3$ are the coefficients for each predictor, and $\epsilon$ represents the error term.
Figure 2 reveals severe multicollinearity, particularly a strong correlation between the independent variables X 1 and X 3 (−0.96). These dependencies can undermine the stability of regression models. To address this, we compared the performance of our proposed and existing ridge estimators using the estimated MSE as the evaluation criterion.
It is clear from Table 2 that our proposed estimator (MIRE3) achieved the minimum estimated MSE compared to OLS and other existing ridge estimators. This demonstrates the superior performance and effectiveness of MIRE3 in handling multicollinearity and providing more accurate regression estimates.
Additionally, we assessed the significance of the regression coefficients for each estimator for the Acetylene Conversion dataset by calculating the corresponding t-statistics. These t-statistics were obtained by dividing the estimated coefficients by their respective standard errors, which were derived from the MSE values. The t-statistics for the coefficients across all estimators are summarized in Table 3.
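The text does not spell out the standard-error convention, but the reported values are consistent with taking $\mathrm{SE} = \sqrt{\mathrm{MSE}}/m$ (with $m$ the number of coefficients) and $n - m - 1$ degrees of freedom; under that assumption, the sketch below reproduces the MIRE3 row of Table 3 from its Table 2 entries:

```r
# t-statistics and two-sided p-values from an estimator's MSE (assumed convention).
t_and_p <- function(coef_hat, mse, n) {
  m  <- length(coef_hat)
  se <- sqrt(mse) / m                  # assumed: common SE derived from the MSE
  t  <- coef_hat / se
  p  <- 2 * pt(-abs(t), df = n - m - 1)
  cbind(t = round(t, 4), p = round(p, 4))
}
# MIRE3 coefficients and MSE from Table 2 (Acetylene data, n = 16):
t_and_p(c(0.488197, -0.09392, 0.493304, -0.09916), mse = 0.335577, n = 16)
# yields t = 3.371, -0.6485, 3.406, -0.6847 and p = 0.0062, 0.5300, 0.0059, 0.5077
```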
The MIRE2, MIRE3, and MTPR1 estimators showed significance for most of their coefficients in this dataset, suggesting that these models are particularly effective in capturing the relationships between the predictors and the response variable. In contrast, OLS and the other existing estimators, including HK, HKB, KAM, KGM, KMed, KMS, LC, and TK, exhibited several non-significant coefficients, indicating that while these models capture certain relationships within the data, they may not fully account for the variability in the dependent variable. Overall, the existing estimator MTPR1 and the newly proposed estimator MIRE3 outperformed the others in both predictive accuracy and the statistical significance of their coefficients.

5. Conclusions

In this study, we introduced four novel improved two-parameter ridge estimators (MIRE1, MIRE2, MIRE3, and MIRE4) developed especially to combat the issue of multicollinearity in regression analysis. Based on the results of simulations and the analysis of the two real-life datasets, our newly proposed estimators consistently demonstrated superior performance compared to existing estimators, particularly in scenarios characterized by severe multicollinearity and high error variance.
MIRE2 and MIRE3, in particular, had the minimum estimated MSE and offered superior performance in both simulated and real datasets. When using t-tests to evaluate the significance of the regression coefficients for both OLS and ridge regression estimators in the Acetylene Conversion dataset, our newly proposed estimators, MIRE2 and MIRE3, demonstrated statistical significance for most of their coefficients, further confirming their effectiveness. Overall, these new estimators provided more accurate and stable estimates, making them a reliable solution for handling multicollinearity in regression models.
Future research could focus on improving the proposed ridge estimators for more complex data, incorporating variable selection techniques, and evaluating their performance across diverse real-world applications to further enhance their robustness and applicability.

Author Contributions

All the authors have contributed equally to the article. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Taif University, Saudi Arabia.

Data Availability Statement

The data that support the findings of this study are available within the article.

Acknowledgments

The authors thank the editor and reviewers for their valuable suggestions that improved this study. The authors would like to acknowledge the Deanship of Graduate Studies and Scientific Research, Taif University for funding this work.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

Table A1. Estimated MSE of ridge estimators for $n = 20$ and $p = 4$.
σ² | ρ | OLS | HK | HKB | KAM | KGM | KMed | KMS | LC | TK | MTPR1 | MTPR2 | MTPR3 | MIRE1 | MIRE2 | MIRE3 | MIRE4
0.40.80.15720.13910.09420.15330.06670.09830.12730.01060.60580.00320.00410.00520.14270.084930.00590.0850
0.900.32220.24990.15260.30790.08750.11190.2140.01570.46710.00250.00340.03120.2710.10460.0040.105
0.950.64390.40390.22970.59960.12950.16090.35620.02210.19660.00230.00350.13360.53740.15230.0030.1534
0.993.29361.14480.89462.88190.51830.56731.80610.15680.07760.00250.01742.41512.81530.16010.00220.156
10.80.93560.53240.37970.860.22960.30760.54650.06011.25840.03720.04690.20990.49660.08730.0370.0881
0.902.07030.90150.58981.85150.34750.44761.15020.09440.82750.05010.04240.36251.12610.1180.02470.1186
0.954.3271.70761.30323.80140.57580.64482.55530.15820.70690.02280.04371.12012.08650.29160.01920.2948
0.9921.0577.13754.424517.71721.48351.141814.2800.25340.08880.01430.08729.249210.4181.49640.01411.5316
50.824.9888.7346.537621.53952.61542.42418.9871.76657.60451.48342.60937.43361.13471.19963.11171.2293
0.9045.36213.8229.947337.99243.08942.978134.2201.40896.99481.10412.815812.9310.96070.86072.97190.8687
0.95103.8733.07226.74888.0446.39985.937884.4821.07188.14530.8066.130135.5752.55940.65943.12650.7386
0.99547.51156.54133.47459.80619.788331.0987485.0360.50528.58940.861238.161253.9316.5070.60834.6190.8839
100.892.62730.60922.25978.81727.20477.730377.2688.482925.3039.062114.52033.2490.88026.646115.7716.7665
0.90203.3571.72848.052172.80711.747913.166174.885.229929.50445.68423.18978.2291.89674.346215.7754.2586
0.95400.27130.0389.160337.13818.054623.724352.3804.582423.87064.22636.609164.971.38923.734215.8523.7716
0.992043.90625.91422.461709.1846.26699.0101903.981.943219.06212.5263177.571063.25.12562.097933.1362.2387
Table A2. Estimated MSE of ridge estimators for $n = 50$ and $p = 4$.
σ² | ρ | OLS | HK | HKB | KAM | KGM | KMed | KMS | LC | TK | MTPR1 | MTPR2 | MTPR3 | MIRE1 | MIRE2 | MIRE3 | MIRE4
0.40.80.06130.05820.04590.06060.0360.07420.05480.00440.73830.00130.00150.00120.05830.02980.00230.0297
0.900.10510.1280.06620.10270.03660.0550.08350.00540.52330.00530.00160.0230.09420.01960.00150.0196
0.950.22090.18110.11790.21260.0580.06630.15210.00980.07010.0010.00140.00150.19370.0490.00130.049
0.991.06310.38470.31730.96270.18260.21730.53560.04310.31470.00110.00310.14560.87020.13880.00100.1386
10.80.36310.27520.20280.34560.11740.14440.24550.02451.2160.00760.00860.01450.18140.02030.01370.0204
0.900.66610.28920.27780.61560.14210.17470.37160.03190.88690.00730.00880.03850.27020.01720.00900.0177
0.951.35170.65410.41611.21330.2230.26860.68580.05420.27260.00510.00660.03540.62440.0210.00650.0212
0.996.52692.12171.61955.59270.76540.73543.8520.20090.35280.00780.01910.68282.94250.05810.00590.058
50.88.67783.36662.46397.4741.17081.08955.79490.63484.00540.37760.50371.08190.37770.27890.65350.2789
0.9017.20755.96263.88314.6101.5961.42211.93790.54884.16250.35820.56872.63120.40230.18150.43030.1816
0.9536.440211.6058.417130.9262.74482.180627.0720.52774.5950.36230.80036.75180.71130.16980.33320.1739
0.99157.6942.78332.004130.176.91368.6012127.970.26532.61450.13862.487733.5861.76020.13760.1640.1413
100.836.30512.3888.877531.0323.53833.374628.2072.989210.4912.1582.95595.6990.58951.74464.88581.7469
0.9066.73420.64916.47656.2615.38625.465752.5972.339613.65851.61464.017513.9680.53931.40685.62261.4041
0.95135.8743.72228.860113.156.5887.7221110.891.630712.8821.22945.221628.8350.61011.1997.96731.1823
0.99670.12219.99162.79561.0621.35033.903598.210.77249.03540.792523.504222.151.06510.61217.49940.6189
Table A3. Estimated MSE of ridge estimators for $n = 100$ and $p = 4$.
σ² | ρ | OLS | HK | HKB | KAM | KGM | KMed | KMS | LC | TK | MTPR1 | MTPR2 | MTPR3 | MIRE1 | MIRE2 | MIRE3 | MIRE4
0.40.80.02340.14640.02130.02330.02380.07560.02220.0020.00960.00090.00090.00350.02280.00290.00120.0029
0.900.04460.10770.03640.04420.02190.03640.04010.0030.05240.00570.00090.00580.04250.00250.00080.0025
0.950.09370.08560.06120.09190.03210.03480.07610.00530.02180.00060.00060.00070.08460.01440.00050.0144
0.990.50020.33350.19280.46770.10290.10850.2720.02270.00210.0050.00050.00130.37230.01580.00040.0159
10.80.14630.19610.08980.14270.07110.11490.11790.01220.23320.01040.00580.02470.08310.00570.00750.0057
0.900.28310.18850.14150.27070.08660.10440.18860.0190.60820.0050.00450.01110.11130.00440.00490.0044
0.950.60370.38410.22070.56350.13920.15650.34970.03010.15680.00280.00290.02620.26220.0060.00350.006
0.993.05041.18490.81272.66410.43820.4911.62230.12660.17720.00270.00420.15711.06610.00840.00260.0085
50.83.52381.29611.04253.10940.67720.75262.15240.35472.580.11750.14610.25890.27380.10020.20820.1003
0.907.52122.44852.12596.49971.00680.97664.81890.37792.01420.11160.22610.53330.1870.1030.13870.1005
0.9515.18395.17563.693812.93291.56081.363810.2440.39512.8970.07070.16461.25940.20620.06880.08880.0689
0.9978.48123.348818.766966.51334.62674.337362.1240.21832.70130.06191.073111.1980.67220.06180.07420.062
100.814.56755.17414.020412.55731.7681.726510.39441.44035.95460.77010.95791.43620.63210.72981.9490.7299
0.9029.70179.45087.437925.19382.79752.594621.7871.12346.4870.60160.96172.44280.44970.53481.6180.5347
0.9563.629722.279515.296454.13934.62124.12250.17160.85067.42140.5441.92388.59530.32720.41891.07440.4188
0.99306.26792.039667.4402256.01012.348619.4741262.2000.39726.27640.31574.31654.2010.40140.29541.46030.2945
Table A4. Estimated MSE of ridge estimators for $n = 20$ and $p = 10$.
σ² | ρ | OLS | HK | HKB | KAM | KGM | KMed | KMS | LC | TK | MTPR1 | MTPR2 | MTPR3 | MIRE1 | MIRE2 | MIRE3 | MIRE4
0.40.80.51880.39740.18950.50940.05920.0630.32160.00420.68820.00250.00280.01410.43070.14270.00590.1429
0.901.13970.7050.30151.10640.07190.07770.57370.00340.24270.00130.00190.10330.95190.30460.00260.305
0.952.47191.18730.49332.37660.14160.16831.19510.00480.16870.00730.00630.46072.11170.59250.00180.5922
0.9914.31295.71132.593113.55620.58780.7248.39190.01650.15990.00120.01257.603512.3724.01680.00104.0173
10.83.3921.69480.80443.26380.24660.29271.95730.02711.29770.01820.0230.27182.30380.6380.03650.6398
0.907.41563.06811.36277.07060.36250.47214.26230.02170.86380.02260.02120.82323.96140.53390.01770.5307
0.9514.45815.37482.543113.65230.66960.87178.48050.03740.58390.04000.0352.22717.97190.82350.01040.8234
0.9985.898433.90613.11980.84463.01263.270562.6020.12180.41480.00620.189733.87246.6857.04120.00617.0255
50.882.177134.50613.69778.20933.97584.481966.6221.110619.79291.69685.617622.1615.46941.24972.3911.2906
0.90191.45172.19837.971182.0996.98197.522160.0600.780715.9541.3478.028959.50816.3851.42551.49161.4682
0.95383.163140.50473.041362.34612.567513.543325.0280.735714.2590.917115.572143.1531.4021.34080.83251.3746
0.992042.02782.290354.401915.5950.006667.5191844.850.405327.2361.3232141.611250.6171.505.8210.37515.9954
100.8342.561146.34159.225325.67113.856816.363303.8519.925769.12215.49538.581124.136.33698.858622.9129.1906
0.90744.955294.663137.05706.52122.539428.591664.9866.445780.67011.25968.164293.6612.9576.577423.1426.4253
0.951543.34592.034262.431450.6539.722951.0261390.714.782474.9926.4008121.44696.9730.1884.471215.4565.5657
0.998073.692801.081320.87568.42181.469278.5667622.063.2748128.0222.326911.425150.9213.3210.81030.98813.419
Table A5. Estimated MSE of ridge estimators for $n = 50$ and $p = 10$.
σ² | ρ | OLS | HK | HKB | KAM | KGM | KMed | KMS | LC | TK | MTPR1 | MTPR2 | MTPR3 | MIRE1 | MIRE2 | MIRE3 | MIRE4
0.40.80.16110.15030.09590.16020.02870.03120.12880.00120.1910.00060.00060.00050.14940.04030.00130.0402
0.900.35370.30420.1520.34980.04040.04360.23160.00170.00390.00070.00060.00070.30250.04030.00080.0404
0.950.65170.49280.21540.640.05910.06620.34530.00190.00120.00070.00080.0010.54190.05750.00050.0576
0.993.70161.8940.83553.59110.30090.40161.88930.01190.010.00040.00060.32743.16820.41870.00030.4192
10.81.07020.76390.33711.05010.11320.14580.61810.00730.17670.00380.0040.00560.58220.02210.00830.0222
0.902.33351.36450.55992.27340.18760.24941.22390.010.34320.00880.01550.0341.23030.0280.00470.0281
0.954.29132.16640.88814.15560.33170.48542.21310.01180.24240.00270.00480.09562.16030.03740.00340.0378
0.9922.22119.32094.467721.39041.38661.893114.42390.05620.03440.00210.02151.621511.9680.22620.00200.2242
50.825.32711.51734.868224.43261.77712.247718.19770.2024.37370.21210.52311.31021.16490.10530.23550.107
0.9052.948423.57789.88950.91193.08963.773539.28480.21532.82840.07170.42642.32032.19670.07030.11670.0772
0.95115.67549.830523.1694111.2186.55487.703491.82030.32631.81010.06370.64474.6424.5490.05970.07540.0607
0.99523.631226.73295.0532502.20323.355735.6529454.4830.19372.22010.05197.096859.37717.290.06470.05020.0681
100.8104.23743.12720.2642100.5775.68477.625186.02791.836527.52091.93576.093414.2990.70941.41825.39041.4143
0.90214.73193.444638.3373206.63210.56414.7109181.6121.332832.55831.416511.82936.4471.13640.88145.24330.8673
0.95434.928189.45978.4042417.63419.52628.7565378.2180.643437.87340.529927.172103.651.79780.34132.46050.3378
0.992158.95945.147457.8222074.8582.546150.3782001.760.344225.84020.2267177.57757.5611.0530.21987.98830.2274
Table A6. Estimated MSE of ridge estimators for $n = 100$ and $p = 10$.
σ² | ρ | OLS | HK | HKB | KAM | KGM | KMed | KMS | LC | TK | MTPR1 | MTPR2 | MTPR3 | MIRE1 | MIRE2 | MIRE3 | MIRE4
0.40.80.083220.079630.056940.08280.018390.02080.070310.000690.778340.000350.000320.000380.07880.019130.00060.0191
0.900.18740.169240.092330.185960.021430.02330.13210.000690.00350.000260.000280.000240.16650.017910.00030.0179
0.950.38690.31640.155920.381410.036160.04150.225340.00130.001920.000240.000220.000290.32780.02310.00020.02312
0.991.9511.12310.45271.88910.15510.20220.91630.00430.000310.000250.000220.000431.66370.09220.00020.09234
10.80.54360.42860.2280.53490.06530.07170.330.00390.03280.00200.00180.00170.28140.00380.00380.0039
0.901.12730.75360.35361.10010.10540.12940.58320.00450.0330.00140.00150.00190.62720.00820.00220.0082
0.952.51241.38040.56692.43140.19560.25611.23750.00680.02570.00240.00570.01561.19310.00870.00150.0087
0.9912.9565.4782.457712.40410.85151.24757.62830.02340.78110.00130.00890.76026.16910.02850.00110.0284
50.814.3446.31462.949313.82070.98771.41189.60450.09511.96110.04690.06630.16110.59990.04880.10020.0491
0.9028.38011.59725.382527.15531.79072.245219.2110.11681.54170.04080.09110.46811.70730.03330.05040.0336
0.9560.13625.021710.17857.50293.29494.103744.0860.1671.42230.03450.15861.22421.88040.03390.04100.0341
0.99313.459122.49255.550298.86212.94619.024261.9990.17120.44280.02860.45598.415716.1820.03270.03000.0331
100.855.79823.418510.17153.63563.14363.955943.3350.451311.45420.31150.74443.02020.42370.17730.83350.1776
0.90112.12145.604820.177107.1335.84097.522588.2210.35617.94320.17831.14724.87440.32270.13190.52340.1322
0.95244.09197.582639.892233.21511.76415.372202.590.359513.43490.11743.738820.46180.91180.10960.39720.1100
0.991309.14560.092243.8011251.6649.39581.0491188.60.21319.02870.105122.754219.6994.77560.10480.14940.1054
Table A7. Detailed summary of the performance of ridge estimators (best-performing estimator in each scenario).

n | σ² | p = 4, ρ = 0.80 | 0.90 | 0.95 | 0.99 | p = 10, ρ = 0.80 | 0.90 | 0.95 | 0.99
20 | 0.4 | MTPR1 | MTPR1 | MTPR1 | MIRE3 | MTPR1 | MTPR1 | MIRE3 | MIRE3
20 | 1 | MIRE3 | MIRE3 | MIRE3 | MIRE3 | MTPR2 | MIRE3 | MIRE3 | MIRE3
20 | 5 | MIRE3 | MIRE2 | MIRE2 | MIRE2 | LC | LC | LC | MIRE3
20 | 10 | MIRE1 | MIRE1 | MIRE1 | MIRE2 | MIRE1 | MIRE4 | MIRE2 | LC
50 | 0.4 | MTPR3 | MIRE3 | MIRE3 | MIRE3 | MTPR3 | MTPR2 | MIRE3 | MIRE3
50 | 1 | MTPR2 | MTPR2 | MIRE3 | MIRE3 | MTPR1 | MIRE3 | MIRE3 | MIRE3
50 | 5 | MIRE4 | MIRE2 | MIRE2 | MIRE2 | MIRE2 | MIRE2 | MIRE2 | MIRE3
50 | 10 | MIRE1 | MIRE1 | MIRE1 | MIRE2 | MIRE1 | MIRE4 | MIRE4 | MIRE2
100 | 0.4 | MTPR1 | MTPR1 | MIRE3 | MIRE3 | MTPR2 | MTPR3 | MIRE3 | MIRE3
100 | 1 | MIRE2 | MIRE2 | MTPR1 | MIRE3 | MTPR3 | MTPR1 | MIRE3 | MIRE3
100 | 5 | MIRE2 | MIRE2 | MIRE2 | MIRE2 | MTPR1 | MIRE2 | MIRE2 | MIRE3
100 | 10 | MIRE1 | MIRE1 | MIRE1 | MIRE4 | MIRE4 | MIRE2 | MIRE2 | MIRE2

References

  1. Hoerl, A.E.; Kennard, R.W. Ridge Regression: Biased Estimation for Nonorthogonal Problems. Technometrics 1970, 12, 55–67.
  2. Muniz, G.; Kibria, B.M.G. On Some Ridge Regression Estimators: An Empirical Comparisons. Commun. Stat.-Simul. Comput. 2009, 38, 621–630.
  3. Toker, S.; Kaçıranlar, S. On the performance of two parameter ridge estimator under the mean square error criterion. Appl. Math. Comput. 2013, 219, 4718–4728.
  4. McDonald, G.C.; Galarneau, D.I. A Monte Carlo Evaluation of Some Ridge-Type Estimators. J. Am. Stat. Assoc. 1975, 70, 407–416.
  5. Khalaf, G.; Shukur, G. Choosing Ridge Parameter for Regression Problems. Commun. Stat.-Theory Methods 2005, 34, 1177–1182.
  6. Kibria, B.M.G. Performance of Some New Ridge Regression Estimators. Commun. Stat.-Simul. Comput. 2003, 32, 419–435.
  7. Månsson, K.; Shukur, G. On Ridge Parameters in Logistic Regression. Commun. Stat.-Theory Methods 2011, 40, 3366–3381.
  8. Saleh, A.K.Md.E.; Kibria, B.M.G. Improved ridge regression estimators for the logistic regression model. Comput. Stat. 2013, 28, 2519–2558.
  9. O’Driscoll, D.; Ramirez, D.E. Mitigating collinearity in linear regression models using ridge, surrogate and raised estimators. Cogent Math. 2016, 3, 1144697.
  10. Khan, M.S.; Ali, A.; Suhail, M.; Kibria, B.M.G. On some two parameter estimators for the linear regression models with correlated predictors: Simulation and application. Commun. Stat.-Simul. Comput. 2024, 1–15.
  11. Suhail, M.; Chand, S.; Kibria, B.M.G. Quantile-based robust ridge M-estimator for linear regression model in presence of multicollinearity and outliers. Commun. Stat.-Simul. Comput. 2021, 50, 3194–3206.
  12. Li, Y.; Yang, R.; Wang, X.; Zhu, J.; Song, N. Carbon Price Combination Forecasting Model Based on Lasso Regression and Optimal Integration. Sustainability 2023, 15, 9354.
  13. Akhtar, N.; Alharthi, M.F.; Khan, M.S. Mitigating Multicollinearity in Regression: A Study on Improved Ridge Estimators. Mathematics 2024, 12, 3027.
  14. Hoerl, A.E.; Kannard, R.W.; Baldwin, K.F. Ridge regression: Some simulations. Commun. Stat.-Theory Methods 1975, 4, 105–123.
  15. Khalaf, G.; Månsson, K.; Shukur, G. Modified Ridge Regression Estimators. Commun. Stat.-Theory Methods 2013, 42, 1476–1487.
  16. Lipovetsky, S.; Conklin, W.M. Ridge regression in two-parameter solution. Appl. Stoch. Models Bus. Ind. 2005, 21, 525–540.
  17. Yasin, S.; Salem, S.; Ayed, H.; Kamal, S.; Suhail, M.; Khan, Y.A. Modified Robust Ridge M-Estimators in Two-Parameter Ridge Regression Model. Math. Probl. Eng. 2021, 2021, 1845914.
  18. Jensen, D.R.; Ramirez, D.E. On mitigating collinearity through mixtures. J. Stat. Comput. Simul. 2018, 88, 1437–1453.
  19. Akhtar, N.; Alharthi, M.F. A comparative study of the performance of new ridge estimators for multicollinearity: Insights from simulation and real data application. AIP Adv. 2024, 14, 14.
  20. Özbay, N. Two-Parameter Ridge Estimation for the Coefficients of Almon Distributed Lag Model. Iran. J. Sci. Technol. Trans. A Sci. 2019, 43, 1819–1828.
  21. Halawa, A.M.; El Bassiouni, M.Y. Tests of regression coefficients under ridge regression models. J. Stat. Comput. Simul. 2000, 65, 341–356.
  22. Draper, N.R.; Smith, H. Applied Regression Analysis, 3rd ed.; Wiley: Hoboken, NJ, USA, 1998.
  23. Irandoukht, A. Optimum Ridge Regression Parameter Using R-Squared of Prediction as a Criterion for Regression Analysis. J. Stat. Theory Appl. 2021, 20, 242–250.
  24. Kunugi, T.; Tamura, T.; Naito, T. New Acetylene Process Uses Hydrogen Dilution. Chem. Eng. Prog. 1961, 57, 43–49.
Figure 1. Blood Pressure Patients dataset pairwise correlations display.
Figure 2. Heatmap of Acetylene Conversion dataset.
Table 1. Estimated MSE of ridge estimators for the Blood Pressure dataset.

Estimators | MSE | $\hat{\varphi}_0$ | $\hat{\varphi}_1$ | $\hat{\varphi}_2$ | $\hat{\varphi}_3$ | $\hat{\varphi}_4$ | $\hat{\varphi}_5$ | $\hat{\varphi}_6$
OLS | 2.188755 | −0.0454 | 0.163987 | 0.120501 | 0.031811 | 0.015822 | −0.25681 | −0.12711
HK | 0.17853 | 0.222696 | 0.473453 | 0.013541 | −0.005 | −0.08594 | −0.08294 | 0.040346
HKB | 0.217631 | 0.037362 | −0.25313 | −0.08097 | −0.07582 | 0.03347 | 0.229902 | 0.073195
KAM | 1.433083 | −0.2745 | −0.04539 | 0.035688 | 0.240916 | 0.031939 | 0.022055 | −0.02748
KGM | 0.174262 | 0.180185 | 0.222602 | 0.039666 | 0.02516 | −0.00502 | −0.12498 | −0.07193
KMed | 0.175311 | 0.651653 | 0.03733 | −0.00662 | −0.14654 | −0.05415 | 0.05123 | 0.240152
KMS | 0.534015 | −3.69391 | −0.27418 | −0.04536 | 0.062261 | 0.25003 | 0.051809 | 0.051425
LC | 0.179084 | −0.04377 | 0.179822 | 0.222172 | 0.065845 | 0.038097 | −0.00832 | −0.16124
TK | 0.17798 | 0.202627 | 0.646686 | 0.03718 | −0.01076 | −0.26317 | −0.04791 | 0.096102
MTPR1 | 0.179128 | 0.030995 | −2.89173 | −0.27269 | −0.10028 | 0.14928 | 0.234429 | 0.042678
MTPR2 | 0.17908 | −0.2146 | −0.03557 | 0.178178 | 0.185786 | 0.279471 | 0.03916 | −0.00226
MTPR3 | 0.192746 | 0.122337 | 0.128407 | 0.624832 | 0.015046 | −0.06611 | −0.28684 | −0.00647
MIRE1 | 0.400804 | −0.04505 | 0.039991 | 0.191766 | 0.029953 | 0.039203 | −1.05962 | −0.19052
MIRE2 | 0.173548 | 0.218182 | 0.045384 | 0.015773 | −0.0047 | −0.28349 | −0.0614 | 0.031881
MIRE3 | 0.17806 | 0.035825 | −0.00764 | −0.08565 | −0.09856 | 0.178682 | 0.253537 | 0.395008
MIRE4 | 0.751334 | −0.25938 | −0.03442 | 0.033347 | 0.192158 | 0.507683 | 0.008764 | −1.9928
Table 2. Estimated MSE of ridge estimators for the Acetylene Conversion dataset.

Estimators | MSE | $\hat{\varphi}_0$ | $\hat{\varphi}_1$ | $\hat{\varphi}_2$ | $\hat{\varphi}_3$
OLS | 1.224021 | 0.488659 | −0.13759 | 0.527294 | −0.11288
HK | 1.141057 | −0.53522 | −0.74052 | −0.33586 | −0.48008
HKB | 1.156782 | −0.14273 | 0.48835 | −0.01194 | 0.488664
KAM | 1.219503 | −0.80977 | −0.5341 | −0.02796 | −0.53522
KGM | 1.113903 | 0.488323 | −0.13925 | 0.514907 | −0.1427
KMed | 1.147368 | −0.53399 | −0.76219 | −0.43638 | −0.80942
KMS | 1.176266 | −0.13895 | 0.488473 | −0.02578 | 0.488996
LC | 0.760537 | −0.75815 | −0.53454 | −0.0642 | −0.53521
TK | 0.659068 | 0.488391 | −0.14062 | 0.498588 | −0.13954
MTPR1 | 0.33718 | −0.53424 | −0.78045 | −0.5124 | −0.79075
MTPR2 | 0.430533 | −0.1397 | 0.493592 | −0.06977 | 0.527509
MTPR3 | 0.56701 | −0.76818 | −0.53073 | −0.21949 | −0.33276
MIRE1 | 1.223442 | −0.14253 | 0.495315 | −0.14236 | 0.489342
MIRE2 | 1.192328 | −0.80704 | −0.52534 | −0.80425 | −0.53506
MIRE3 | 0.335577 | 0.488197 | −0.09392 | 0.493304 | −0.09916
MIRE4 | 1.172576 | −0.53354 | −0.34577 | −0.53142 | −0.78466
Table 3. Significance of the coefficients of the ridge estimators (Acetylene Conversion dataset).

Estimator | MSE | t ($\hat{\varphi}_0$) | t ($\hat{\varphi}_1$) | t ($\hat{\varphi}_2$) | t ($\hat{\varphi}_3$) | p ($\hat{\varphi}_0$) | p ($\hat{\varphi}_1$) | p ($\hat{\varphi}_2$) | p ($\hat{\varphi}_3$)
OLS | 1.224021 | 1.766735 | −0.49745 | 1.906419 | −0.40812 | 0.10497 | 0.628663 | 0.083043 | 0.691021
HK | 1.141057 | −2.00419 | −2.77296 | −1.25766 | −1.79771 | 0.070298 | 0.018131 | 0.234557 | 0.099696
HKB | 1.156782 | −0.53082 | 1.816207 | −0.04441 | 1.817375 | 0.606093 | 0.096663 | 0.965377 | 0.096474
KAM | 1.219503 | −2.93312 | −1.9346 | −0.10128 | −1.93866 | 0.013614 | 0.079165 | 0.921154 | 0.078621
KGM | 1.113903 | 1.850732 | −0.52775 | 1.951484 | −0.54083 | 0.091225 | 0.608151 | 0.076923 | 0.599406
KMed | 1.147368 | −1.99408 | −2.84624 | −1.62957 | −3.02261 | 0.071526 | 0.015903 | 0.131467 | 0.011601
KMS | 1.176266 | −0.51247 | 1.801556 | −0.09508 | 1.803485 | 0.618457 | 0.099058 | 0.925961 | 0.09874
LC | 0.760537 | −3.4774 | −2.45177 | −0.29447 | −2.45484 | 0.005172 | 0.032145 | 0.773885 | 0.03197
TK | 0.659068 | 2.406371 | −0.69285 | 2.456613 | −0.68753 | 0.034838 | 0.502767 | 0.03187 | 0.505985
MTPR1 | 0.33718 | −3.68015 | −5.37618 | −3.5297 | −5.44714 | 0.003625 | 0.000225 | 0.004717 | 0.000202
MTPR2 | 0.430533 | −0.85163 | 3.009019 | −0.42533 | 3.215782 | 0.412574 | 0.011887 | 0.678798 | 0.082211
MTPR3 | 0.56701 | −4.08063 | −2.81928 | −1.16595 | −1.76765 | 0.001818 | 0.016689 | 0.268294 | 0.104811
MIRE1 | 1.223442 | −0.51544 | 1.791224 | −0.51482 | 1.769623 | 0.616449 | 0.10078 | 0.616865 | 0.104467
MIRE2 | 1.192328 | −2.95636 | −1.92443 | −2.94614 | −1.96004 | 0.01306 | 0.080545 | 0.013301 | 0.075809
MIRE3 | 0.335577 | 3.371002 | −0.64852 | 3.406266 | −0.6847 | 0.006241 | 0.529956 | 0.005864 | 0.507703
MIRE4 | 1.172576 | −1.97086 | −1.27725 | −1.96303 | −2.89848 | 0.074422 | 0.227814 | 0.075423 | 0.014484