Article

New Robust Estimators for Handling Multicollinearity and Outliers in the Poisson Model: Methods, Simulation and Applications

1 Department of Mathematics, Al-Aqsa University, Gaza 4051, Palestine
2 Department of Quantitative Analysis, College of Business Administration, King Saud University, Riyadh 11362, Saudi Arabia
3 Electrical Engineering Department, Faculty of Engineering & Technology, Future University in Egypt, New Cairo 11835, Egypt
4 Department of Applied Statistics and Econometrics, Faculty of Graduate Studies for Statistical Research, Cairo University, Giza 12613, Egypt
* Author to whom correspondence should be addressed.
Axioms 2022, 11(11), 612; https://doi.org/10.3390/axioms11110612
Submission received: 23 August 2022 / Revised: 25 October 2022 / Accepted: 27 October 2022 / Published: 3 November 2022
(This article belongs to the Special Issue Computational Statistics & Data Analysis)

Abstract:
The Poisson maximum likelihood (PML) method is used to estimate the coefficients of the Poisson regression model (PRM). Since the resulting estimators are sensitive to outliers, different studies have provided robust Poisson regression estimators to alleviate this problem. Additionally, the PML estimator is sensitive to multicollinearity, and several biased Poisson estimators have been provided to cope with this problem, such as the Poisson ridge estimator, the Poisson Liu estimator, the Poisson Kibria–Lukman estimator, and the Poisson modified Kibria–Lukman estimator. Although different biased Poisson regression estimators have been proposed, no robust versions of these estimators have been analyzed to deal with the two above-mentioned problems simultaneously, except for the robust Poisson ridge regression estimator. We extend this line of work by proposing three new robust Poisson one-parameter regression estimators: the robust Poisson Liu (RPL), the robust Poisson Kibria–Lukman (RPKL), and the robust Poisson modified Kibria–Lukman (RPMKL). Theoretical comparisons and Monte Carlo simulations were conducted to compare the performance of the proposed estimators with that of the existing estimators. The simulation results indicated that the proposed RPL, RPKL, and RPMKL estimators outperformed the other estimators in different scenarios in which both problems existed. Finally, we analyzed two real datasets to confirm the results.

1. Introduction

In general, outliers are data points that differ significantly from the other data points; in other words, they represent uncommon dataset values. For many statistical models, outliers are an issue because they can either cause tests to miss important findings or bias real results. Several statistical methods are used to detect outliers, such as the index plot of Cook's distance and the potential–residual plot; see Hadi [1]. Several robust regression estimators have been provided to address the problem of outliers in different regression models; see Lawrence [2]. On the other hand, when the explanatory variables are highly correlated, the model has a multicollinearity problem. This problem causes the maximum likelihood estimates to be unstable and inefficient. Therefore, many biased estimators have been provided to address this problem in different regression models; see, e.g., Wu et al. [3], Arashi et al. [4], Algamal et al. [5], Rasheed et al. [6], Majid et al. [7], Asar and Algamal [8], and Jabur et al. [9].
The Poisson regression model (PRM) is popular for modeling and analyzing count data. The Poisson maximum likelihood (PML) method is used to obtain the estimators for the PRM based on the iterative weighted least squares (IWLS) algorithm. However, the presence of multicollinearity in the PRM causes the PML estimators to be unreliable and to have unexpected signs. To mitigate this, biased estimators have been suggested for the PRM, such as the Poisson ridge regression (PRR) estimator [10], the Poisson Liu (PL) estimator [11], the Poisson Kibria–Lukman (PKL) estimator [12], and the Poisson modified Kibria–Lukman (PMKL) estimator [13]. The properties of these estimators have been investigated via the mean squared error (MSE) criterion. Additionally, various approaches for estimating the biasing parameters have been proposed and adapted in many studies; for more details see Månsson and Shukur [10], Månsson et al. [11], Amin et al. [14], Yehia [15], Lukman et al. [12], and Jadhav [16].
The PRM is a popular model for analyzing count data when the dependent variable $y_i$ follows a $\mathrm{Poisson}(\mu_i)$ distribution with $\mu_i(\beta)=\exp(x_i'\beta)$, where $x_i'$ is the $i$th row of the $n\times(p+1)$ design matrix $X$ (with $p$ explanatory variables), $\beta$ is an unknown $(p+1)\times 1$ coefficient vector, and the model is assumed to have an intercept. Then, the PML estimator is obtained via the IWLS algorithm as follows:
$$\hat{\beta}_{PML}=(X'\hat{W}X)^{-1}X'\hat{W}\hat{z},$$
where $\hat{W}=\operatorname{diag}(\hat{\mu}_i)$ and $\hat{z}$ is the vector whose $i$th element is $\hat{z}_i=\log[\hat{\mu}_i(\hat{\beta}_{PML})]+\frac{y_i-\hat{\mu}_i(\hat{\beta}_{PML})}{\hat{\mu}_i(\hat{\beta}_{PML})}$. The matrix MSE (MSEM) of the PML estimator is given by:
$$MSEM(\hat{\beta}_{PML})=(X'\hat{W}X)^{-1}.$$
Suppose that $\gamma$ and $\lambda_j$ are the eigenvectors and eigenvalues of the $X'\hat{W}X$ matrix, so that $X'\hat{W}X=\gamma\Lambda\gamma'$ with $\Lambda=\operatorname{diag}(\lambda_j)$. Hence, the scalar MSE of the PML estimator is given by:
$$MSE(\hat{\beta}_{PML})=\operatorname{trace}[(X'\hat{W}X)^{-1}]=\sum_{j=1}^{p+1}\frac{1}{\lambda_j}.$$
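The IWLS recipe above is easy to sketch in code. The paper's own simulations were written in R; the following Python sketch is illustrative only, and the function name `poisson_irls` is ours, not from the paper.

```python
import numpy as np

def poisson_irls(X, y, n_iter=25, tol=1e-8):
    """Poisson maximum likelihood (PML) via iteratively reweighted least
    squares. X is the n x (p+1) design matrix (first column all ones) and
    y is the vector of counts."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)              # mu_i = exp(x_i' beta)
        z = X @ beta + (y - mu) / mu       # working response z_i
        Xw = X * mu[:, None]               # rows of X scaled by W = diag(mu_i)
        beta_new = np.linalg.solve(X.T @ Xw, Xw.T @ z)
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta
```

On simulated Poisson data the iterations typically converge in a handful of steps, and $(X'\hat{W}X)^{-1}$ at the final iterate gives the MSEM above.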
The PRM assumes that the mean and variance of the outcome (y) are equal. However, datasets in many real-world applications contain outliers that cause the variance of the outcome to exceed its mean, which is known as overdispersion. It is proposed here that outliers in count data can be more adequately addressed using a robust Poisson estimator than with the usual PML estimator. A few robust Poisson estimators have been provided, such as those by Cantoni and Ronchetti [17], Hall and Shen [18], Chen et al. [19], Hosseinian and Morgenthaler [20], Chen et al. [21], Abonazel and Saber [22], Marazzi [23], and Abonazel et al. [24]. The robust Poisson estimator proposed by Hosseinian and Morgenthaler [20] is the most common; it is obtained via the weighted maximum likelihood method (WMLM) for handling outliers in the y-direction [22]. The weight function ($W^*$) of the WMLM depends on $\mu_i$ and two constants ($c_1$ and $c_2$):
$$W^*=\begin{cases}1 & \text{if } \frac{m}{c_1}<\mu_i<c_1 m;\\ \frac{c_1\mu_i}{m} & \text{if } \mu_i\le\frac{m}{c_1};\\ \frac{c_2 m-\mu_i}{m} & \text{if } c_1 m\le\mu_i<c_2 m;\\ 0 & \text{otherwise,}\end{cases}$$
where $m$ is the median of the $\mu_i$, $c_1=2$, and $c_2=3$. The WMLM estimating equation is then defined as:
$$\sum_{i=1}^{n}\frac{\mu_i'}{\mu_i}\,W^*\,(y_i-\mu_i)\,x_i=0.\qquad(4)$$
By solving the system of equations in Equation (4) using the Newton–Raphson method, we obtain a robust PML (RPML) estimator; see Hosseinian [25] and Hosseinian and Morgenthaler [20] for more details. The RPML estimator [26] is:
$$\hat{\beta}_{RPML}=(X'\hat{W}^*X)^{-1}X'\hat{W}^*\hat{z}^*,$$
where $\hat{W}^*=\operatorname{diag}[\hat{\mu}_i(\hat{\beta}_{RPML})]$ and $\hat{z}^*$ is the vector whose $i$th element is $\hat{z}_i^*=\log[\hat{\mu}_i(\hat{\beta}_{RPML})]+\frac{y_i-\hat{\mu}_i(\hat{\beta}_{RPML})}{\hat{\mu}_i(\hat{\beta}_{RPML})}$.
The MSEM and MSE of the RPML estimator are given by:
$$MSEM(\hat{\beta}_{RPML})=(X'\hat{W}^*X)^{-1},$$
$$MSE(\hat{\beta}_{RPML})=\sum_{j=1}^{p+1}\frac{1}{\lambda_j^*},$$
where $\gamma^*$ and $\lambda_j^*$ are the eigenvectors and eigenvalues of $X'\hat{W}^*X$, with $X'\hat{W}^*X=\gamma^*\Lambda^*\gamma^{*\prime}$ and $\Lambda^*=\operatorname{diag}(\lambda_j^*)$.
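Given the piecewise form of $W^*$ above (with $m$ the median of the $\mu_i$, $c_1=2$, $c_2=3$), the weight function and an RPML-type fit can be sketched as follows. This is a Python sketch under our own simplifying assumption that $W^*$ multiplies the usual IWLS weight at each iteration; the exact Newton–Raphson scheme of Hosseinian and Morgenthaler [20,25] differs in its details, and both function names are hypothetical.

```python
import numpy as np

def hm_weights(mu, c1=2.0, c2=3.0):
    """The W* weight function: full weight near the median m of the mu_i,
    tapering linearly to zero below m/c1 and above c2*m."""
    m = np.median(mu)
    w = np.ones_like(mu)
    low = mu <= m / c1
    high = (mu >= c1 * m) & (mu < c2 * m)
    w[low] = c1 * mu[low] / m
    w[high] = (c2 * m - mu[high]) / m
    w[mu >= c2 * m] = 0.0
    return w

def robust_poisson_irls(X, y, n_iter=50):
    """Sketch of an RPML-type fit: IWLS in which each observation's weight
    mu_i is multiplied by W*(mu_i), downweighting observations whose fitted
    mean is far from the bulk."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(np.clip(X @ beta, -20.0, 20.0))
        w = hm_weights(mu) * mu            # robustness weight times IWLS weight
        z = X @ beta + (y - mu) / mu       # working response
        Xw = X * w[:, None]
        beta = np.linalg.solve(X.T @ Xw, Xw.T @ z)
    return beta
```

Because the weights depend only on the fitted means (not on $y$), the estimating equation remains unbiased, so on clean data the robust fit stays close to the PML fit while large responses lose influence.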
Although the RPML estimator is robust to outliers, it is not suitable when multicollinearity is present in the PRM. On the other hand, the PL, PKL, and PMKL estimators address multicollinearity but are not suitable in the presence of outliers. Therefore, in this paper, we combine the RPML estimator introduced by Hosseinian and Morgenthaler [20] with the PL, PKL, and PMKL estimators to obtain new robust estimators that can efficiently handle both problems simultaneously in the PRM.
This paper is organized as follows. Section 2 presents the three proposed robust one-parameter estimators, theoretical comparisons, and biasing parameters selection techniques. The Monte Carlo simulation and empirical applications are described in Section 3 and Section 4, respectively. Section 5 provides some concluding remarks.

2. Methodology

2.1. Poisson One-Parameter Regression Estimators

As mentioned above, multicollinearity is a frequent problem among the explanatory variables in many datasets, and it causes the PML estimator to be unstable and inefficient. Therefore, different authors have developed several biased Poisson regression estimators as remedies for multicollinearity: Månsson and Shukur [10] proposed the PRR estimator, Månsson et al. [11] the PL estimator, Lukman et al. [12] the PKL estimator, and Aladeitan et al. [13] the PMKL estimator. These estimators are defined as:
$$\hat{\beta}_{PRR}=(X'\hat{W}X+kI)^{-1}X'\hat{W}X\,\hat{\beta}_{PML},$$
$$\hat{\beta}_{PL}=(X'\hat{W}X+I)^{-1}(X'\hat{W}X+dI)\,\hat{\beta}_{PML},$$
$$\hat{\beta}_{PKL}=(X'\hat{W}X+kI)^{-1}(X'\hat{W}X-kI)\,\hat{\beta}_{PML},$$
$$\hat{\beta}_{PMKL}=(X'\hat{W}X+kI)^{-1}(X'\hat{W}X-kI)(X'\hat{W}X+kI)^{-1}X'\hat{W}X\,\hat{\beta}_{PML}.$$
The MSEs of the PRR, PL, PKL, and PMKL estimators are given by:
$$MSE(\hat{\beta}_{PRR})=\sum_{j=1}^{p+1}\frac{\lambda_j}{(\lambda_j+k)^2}+\sum_{j=1}^{p+1}\frac{k^2\alpha_j^2}{(\lambda_j+k)^2},$$
$$MSE(\hat{\beta}_{PL})=\sum_{j=1}^{p+1}\frac{(\lambda_j+d)^2}{\lambda_j(\lambda_j+1)^2}+(1-d)^2\sum_{j=1}^{p+1}\frac{\alpha_j^2}{(\lambda_j+1)^2},$$
$$MSE(\hat{\beta}_{PKL})=\sum_{j=1}^{p+1}\frac{(\lambda_j-k)^2}{\lambda_j(\lambda_j+k)^2}+4k^2\sum_{j=1}^{p+1}\frac{\alpha_j^2}{(\lambda_j+k)^2},$$
$$MSE(\hat{\beta}_{PMKL})=\sum_{j=1}^{p+1}\frac{\lambda_j(\lambda_j-k)^2}{(\lambda_j+k)^4}+k^2\sum_{j=1}^{p+1}\frac{(k+3\lambda_j)^2\alpha_j^2}{(\lambda_j+k)^4},$$
where $\alpha_j$ is the $j$th element of $\gamma'\hat{\beta}_{PML}$.
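For concreteness, all four estimators can be formed directly from $X'\hat{W}X$ and the PML fit; the following sketch (helper name ours; $k$ and $d$ must be chosen as discussed later in the paper) applies the closed forms above.

```python
import numpy as np

def one_parameter_estimators(XtWX, beta_ml, k, d):
    """Hypothetical helper: form the PRR, PL, PKL, and PMKL estimators from
    X'WX (XtWX) and the PML coefficient vector, per the closed forms above."""
    I = np.eye(XtWX.shape[0])
    inv_k = np.linalg.inv(XtWX + k * I)
    prr  = inv_k @ XtWX @ beta_ml
    pl   = np.linalg.inv(XtWX + I) @ (XtWX + d * I) @ beta_ml
    pkl  = inv_k @ (XtWX - k * I) @ beta_ml
    pmkl = inv_k @ (XtWX - k * I) @ inv_k @ XtWX @ beta_ml
    return prr, pl, pkl, pmkl
```

When $X'\hat{W}X$ is diagonal with entries $\lambda_j$, the four estimators reduce componentwise to the shrinkage factors $\lambda_j/(\lambda_j+k)$, $(\lambda_j+d)/(\lambda_j+1)$, $(\lambda_j-k)/(\lambda_j+k)$, and $\lambda_j(\lambda_j-k)/(\lambda_j+k)^2$, which is a convenient sanity check against the MSE formulas.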

2.2. Robust Poisson Ridge Regression Estimator

Recently, Abonazel and Dawoud [26] proposed the robust Poisson ridge regression (RPRR) estimator to eliminate the effects of the outliers and multicollinearity problems simultaneously in the PRM, where the RPRR is given by:
$$\hat{\beta}_{RPRR}=(X'\hat{W}^*X+kI)^{-1}X'\hat{W}^*X\,\hat{\beta}_{RPML},$$
and the MSE of the RPRR estimator is:
$$MSE(\hat{\beta}_{RPRR})=\sum_{j=1}^{p+1}\frac{\lambda_j^*}{(\lambda_j^*+k)^2}+\sum_{j=1}^{p+1}\frac{k^2\alpha_j^{*2}}{(\lambda_j^*+k)^2},$$
where $\alpha_j^*$ is the $j$th element of $\gamma^{*\prime}\hat{\beta}_{RPML}$.
To reduce the bias of the RPRR estimator, Abonazel and Dawoud [26] proposed the robust jackknife ridge estimator for the PRM. Furthermore, the robust modified jackknife ridge estimator for the PRM was provided by Arum et al. [27].

2.3. Proposed Robust Poisson One-Parameter Regression Estimators

Since the PL, PKL, and PMKL estimators are more efficient than the PRR estimator in the PRM, we propose robust versions of these estimators, extending the one-parameter estimators to the PRM in the presence of both multicollinearity and outliers. The proposed estimators (RPL, RPKL, and RPMKL) are defined as follows:
$$\hat{\beta}_{RPL}=(X'\hat{W}^*X+I)^{-1}(X'\hat{W}^*X+dI)\,\hat{\beta}_{RPML},$$
$$\hat{\beta}_{RPKL}=(X'\hat{W}^*X+kI)^{-1}(X'\hat{W}^*X-kI)\,\hat{\beta}_{RPML},$$
$$\hat{\beta}_{RPMKL}=(X'\hat{W}^*X+kI)^{-1}(X'\hat{W}^*X-kI)(X'\hat{W}^*X+kI)^{-1}X'\hat{W}^*X\,\hat{\beta}_{RPML}.$$
The MSEs of the RPL, RPKL, and RPMKL estimators are given by:
$$MSE(\hat{\beta}_{RPL})=\sum_{j=1}^{p+1}\frac{(\lambda_j^*+d)^2}{\lambda_j^*(\lambda_j^*+1)^2}+(1-d)^2\sum_{j=1}^{p+1}\frac{\alpha_j^{*2}}{(\lambda_j^*+1)^2},$$
$$MSE(\hat{\beta}_{RPKL})=\sum_{j=1}^{p+1}\frac{(\lambda_j^*-k)^2}{\lambda_j^*(\lambda_j^*+k)^2}+4k^2\sum_{j=1}^{p+1}\frac{\alpha_j^{*2}}{(\lambda_j^*+k)^2},$$
$$MSE(\hat{\beta}_{RPMKL})=\sum_{j=1}^{p+1}\frac{\lambda_j^*(\lambda_j^*-k)^2}{(\lambda_j^*+k)^4}+k^2\sum_{j=1}^{p+1}\frac{(k+3\lambda_j^*)^2\alpha_j^{*2}}{(\lambda_j^*+k)^4}.$$

2.4. Superiority of Proposed Estimators

The theorems below give necessary and sufficient conditions for the superiority of the proposed RPL, RPKL, and RPMKL estimators.
Theorem 1.
If $\sum_{j=1}^{p+1}\frac{(1-d)\alpha_j^{*2}-2}{(\lambda_j^*+1)^2}<\sum_{j=1}^{p+1}\frac{1+d}{\lambda_j^*(\lambda_j^*+1)^2}$, then $MSE(\hat{\beta}_{RPL})<MSE(\hat{\beta}_{RPML})$.
Proof. 
The MSE difference between the RPML and the RPL estimators is given by:
$$\Delta_1=MSE(\hat{\beta}_{RPL})-MSE(\hat{\beta}_{RPML})=\sum_{j=1}^{p+1}\frac{(\lambda_j^*+d)^2+\lambda_j^*(1-d)^2\alpha_j^{*2}-(\lambda_j^*+1)^2}{\lambda_j^*(\lambda_j^*+1)^2}.\qquad(24)$$
Some algebraic manipulation shows that $\Delta_1$ in Equation (24) is negative if and only if the condition of Theorem 1 holds; hence, the RPL estimator is better than the RPML estimator under this condition. ☐
Theorem 2.
If $\sum_{j=1}^{p+1}\frac{k\alpha_j^{*2}}{(\lambda_j^*+k)^2}<\sum_{j=1}^{p+1}\frac{1}{(\lambda_j^*+k)^2}$, then $MSE(\hat{\beta}_{RPKL})<MSE(\hat{\beta}_{RPML})$.
Proof. 
The MSE difference between the RPML and the RPKL estimators is given by:
$$\Delta_2=MSE(\hat{\beta}_{RPKL})-MSE(\hat{\beta}_{RPML})=\sum_{j=1}^{p+1}\frac{(\lambda_j^*-k)^2+4k^2\lambda_j^*\alpha_j^{*2}-(\lambda_j^*+k)^2}{\lambda_j^*(\lambda_j^*+k)^2}.\qquad(25)$$
Some algebraic manipulation shows that $\Delta_2$ in Equation (25) is negative if and only if the condition of Theorem 2 holds; hence, the RPKL estimator is better than the RPML estimator under this condition. ☐
Theorem 3.
If $\sum_{j=1}^{p+1}\frac{k^2(k+3\lambda_j^*)^2\alpha_j^{*2}}{(\lambda_j^*+k)^4}<\sum_{j=1}^{p+1}\frac{(\lambda_j^*+k)^4-\lambda_j^{*2}(\lambda_j^*-k)^2}{\lambda_j^*(\lambda_j^*+k)^4}$, then $MSE(\hat{\beta}_{RPMKL})<MSE(\hat{\beta}_{RPML})$.
Proof. 
The MSE difference between the RPML and the RPMKL estimators is given by:
$$\Delta_3=MSE(\hat{\beta}_{RPMKL})-MSE(\hat{\beta}_{RPML})=\sum_{j=1}^{p+1}\frac{\lambda_j^{*2}(\lambda_j^*-k)^2+k^2\lambda_j^*(k+3\lambda_j^*)^2\alpha_j^{*2}-(\lambda_j^*+k)^4}{\lambda_j^*(\lambda_j^*+k)^4}.\qquad(26)$$
Some algebraic manipulation shows that $\Delta_3$ in Equation (26) is negative if and only if the condition of Theorem 3 holds; hence, the RPMKL estimator is better than the RPML estimator under this condition. ☐
Theorem 4.
If $\sum_{j=1}^{p+1}\frac{(\lambda_j^*+d)^2+(1-d)^2\lambda_j^*\alpha_j^{*2}}{\lambda_j^*(\lambda_j^*+1)^2}<\sum_{j=1}^{p+1}\frac{\lambda_j^*+k^2\alpha_j^{*2}}{(\lambda_j^*+k)^2}$, then $MSE(\hat{\beta}_{RPL})<MSE(\hat{\beta}_{RPRR})$.
Proof. 
The MSE difference between the RPRR and the RPL estimators is given by:
$$\Delta_4=MSE(\hat{\beta}_{RPL})-MSE(\hat{\beta}_{RPRR})=\sum_{j=1}^{p+1}\left[\frac{(\lambda_j^*+d)^2+(1-d)^2\lambda_j^*\alpha_j^{*2}}{\lambda_j^*(\lambda_j^*+1)^2}-\frac{\lambda_j^*+k^2\alpha_j^{*2}}{(\lambda_j^*+k)^2}\right].\qquad(27)$$
Some algebraic manipulation shows that $\Delta_4$ in Equation (27) is negative if and only if the condition of Theorem 4 holds; hence, the RPL estimator is better than the RPRR estimator under this condition. ☐
Theorem 5.
If $\sum_{j=1}^{p+1}\frac{3k\alpha_j^{*2}}{(\lambda_j^*+k)^2}<\sum_{j=1}^{p+1}\frac{2\lambda_j^*-k}{\lambda_j^*(\lambda_j^*+k)^2}$, then $MSE(\hat{\beta}_{RPKL})<MSE(\hat{\beta}_{RPRR})$.
Proof. 
The MSE difference between the RPRR and the RPKL estimators is given by:
$$\Delta_5=MSE(\hat{\beta}_{RPKL})-MSE(\hat{\beta}_{RPRR})=\sum_{j=1}^{p+1}\frac{(\lambda_j^*-k)^2-\lambda_j^{*2}+3k^2\lambda_j^*\alpha_j^{*2}}{\lambda_j^*(\lambda_j^*+k)^2}.\qquad(28)$$
Some algebraic manipulation shows that $\Delta_5$ in Equation (28) is negative if and only if the condition of Theorem 5 holds; hence, the RPKL estimator is better than the RPRR estimator under this condition. ☐
Theorem 6.
If $\sum_{j=1}^{p+1}\frac{k\lambda_j^*\alpha_j^{*2}(2\lambda_j^*+k)}{(\lambda_j^*+k)^4}<\sum_{j=1}^{p+1}\frac{\lambda_j^{*2}}{(\lambda_j^*+k)^4}$, then $MSE(\hat{\beta}_{RPMKL})<MSE(\hat{\beta}_{RPRR})$.
Proof. 
The MSE difference between the RPRR and the RPMKL estimators is given by:
$$\Delta_6=MSE(\hat{\beta}_{RPMKL})-MSE(\hat{\beta}_{RPRR})=\sum_{j=1}^{p+1}\frac{\lambda_j^*(\lambda_j^*-k)^2+k^2(k+3\lambda_j^*)^2\alpha_j^{*2}-\lambda_j^*(\lambda_j^*+k)^2-k^2(\lambda_j^*+k)^2\alpha_j^{*2}}{(\lambda_j^*+k)^4}.\qquad(29)$$
Some algebraic manipulation shows that $\Delta_6$ in Equation (29) is negative if and only if the condition of Theorem 6 holds; hence, the RPMKL estimator is better than the RPRR estimator under this condition. ☐
Theorem 7.
If $\sum_{j=1}^{p+1}\frac{(\lambda_j^*-k)^2+4k^2\lambda_j^*\alpha_j^{*2}}{\lambda_j^*(\lambda_j^*+k)^2}<\sum_{j=1}^{p+1}\frac{(\lambda_j^*+d)^2+(1-d)^2\lambda_j^*\alpha_j^{*2}}{\lambda_j^*(\lambda_j^*+1)^2}$, then $MSE(\hat{\beta}_{RPKL})<MSE(\hat{\beta}_{RPL})$.
Proof. 
The MSE difference between the RPL and the RPKL estimators is given by:
$$\Delta_7=MSE(\hat{\beta}_{RPKL})-MSE(\hat{\beta}_{RPL})=\sum_{j=1}^{p+1}\left[\frac{(\lambda_j^*-k)^2+4k^2\lambda_j^*\alpha_j^{*2}}{\lambda_j^*(\lambda_j^*+k)^2}-\frac{(\lambda_j^*+d)^2+(1-d)^2\lambda_j^*\alpha_j^{*2}}{\lambda_j^*(\lambda_j^*+1)^2}\right].\qquad(30)$$
Some algebraic manipulation shows that $\Delta_7$ in Equation (30) is negative if and only if the condition of Theorem 7 holds; hence, the RPKL estimator is better than the RPL estimator under this condition. ☐
Theorem 8.
If $\sum_{j=1}^{p+1}\frac{\lambda_j^*(\lambda_j^*-k)^2+k^2(k+3\lambda_j^*)^2\alpha_j^{*2}}{(\lambda_j^*+k)^4}<\sum_{j=1}^{p+1}\frac{(\lambda_j^*+d)^2+(1-d)^2\lambda_j^*\alpha_j^{*2}}{\lambda_j^*(\lambda_j^*+1)^2}$, then $MSE(\hat{\beta}_{RPMKL})<MSE(\hat{\beta}_{RPL})$.
Proof. 
The MSE difference between the RPL and the RPMKL estimators is given by:
$$\Delta_8=MSE(\hat{\beta}_{RPMKL})-MSE(\hat{\beta}_{RPL})=\sum_{j=1}^{p+1}\left[\frac{\lambda_j^*(\lambda_j^*-k)^2+k^2(k+3\lambda_j^*)^2\alpha_j^{*2}}{(\lambda_j^*+k)^4}-\frac{(\lambda_j^*+d)^2+(1-d)^2\lambda_j^*\alpha_j^{*2}}{\lambda_j^*(\lambda_j^*+1)^2}\right].\qquad(31)$$
Some algebraic manipulation shows that $\Delta_8$ in Equation (31) is negative if and only if the condition of Theorem 8 holds; hence, the RPMKL estimator is better than the RPL estimator under this condition. ☐
Theorem 9.
If $\sum_{j=1}^{p+1}\frac{k\alpha_j^{*2}(5\lambda_j^{*2}-2\lambda_j^*k-3k^2)}{(\lambda_j^*+k)^4}<\sum_{j=1}^{p+1}\frac{(\lambda_j^*-k)^2(2\lambda_j^*+k)}{\lambda_j^*(\lambda_j^*+k)^4}$, then $MSE(\hat{\beta}_{RPMKL})<MSE(\hat{\beta}_{RPKL})$.
Proof. 
The MSE difference between the RPKL and the RPMKL estimators is given by:
$$\Delta_9=MSE(\hat{\beta}_{RPMKL})-MSE(\hat{\beta}_{RPKL})=\sum_{j=1}^{p+1}\left[\frac{\lambda_j^*(\lambda_j^*-k)^2+k^2(k+3\lambda_j^*)^2\alpha_j^{*2}}{(\lambda_j^*+k)^4}-\frac{(\lambda_j^*-k)^2+4k^2\lambda_j^*\alpha_j^{*2}}{\lambda_j^*(\lambda_j^*+k)^2}\right].\qquad(32)$$
Some algebraic manipulation shows that $\Delta_9$ in Equation (32) is negative if and only if the condition of Theorem 9 holds; hence, the RPMKL estimator is better than the RPKL estimator under this condition. ☐
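The scalar-MSE formulas translate directly into code, which also allows a theorem's condition to be checked numerically. The sketch below (function names ours) shows Theorem 2: since $\Delta_2=4k\sum_j(k\alpha_j^{*2}-1)/(\lambda_j^*+k)^2$, the condition and the MSE ordering differ only by the positive factor $4k$ and therefore agree exactly.

```python
import numpy as np

# Scalar MSEs of the robust estimators, as functions of the eigenvalues
# lam (the lambda_j^*) and the components alpha (the alpha_j^*).
def mse_rpml(lam, alpha):
    return np.sum(1.0 / lam)

def mse_rpkl(lam, alpha, k):
    return (np.sum((lam - k)**2 / (lam * (lam + k)**2))
            + 4.0 * k**2 * np.sum(alpha**2 / (lam + k)**2))

def theorem2_condition(lam, alpha, k):
    """True when the Theorem 2 condition holds, i.e. exactly when
    MSE(RPKL) < MSE(RPML)."""
    lhs = np.sum(k * alpha**2 / (lam + k)**2)
    rhs = np.sum(1.0 / (lam + k)**2)
    return lhs < rhs
```

The same pattern applies to Theorems 1 and 3 through 9: code each side of the stated inequality and compare it with the sign of the corresponding $\Delta$.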

2.5. Selection of Biasing Parameters

Following Månsson and Shukur [10], Månsson et al. [11], Lukman et al. [12], and Abonazel and Dawoud [26], we suggest the following biasing parameters for the PRR, RPRR, PL, RPL, PKL, RPKL, PMKL, and RPMKL estimators.
  • The suggested $\hat{k}$ for the PRR estimator is:
    $\hat{k}_{PRR}=\max\left(\hat{\alpha}_j^2\right)_{j=1}^{p+1}.$
  • The suggested $\hat{k}$ for the RPRR estimator is:
    $\hat{k}_{RPRR}=\max\left(\hat{\alpha}_j^{*2}\right)_{j=1}^{p+1}.$
  • The suggested $\hat{d}$ for the PL estimator is:
    $\hat{d}_{PL}=\max\left(0,\ \frac{\hat{\alpha}_j^2-1}{\hat{\alpha}_j^2+\frac{1}{\lambda_j}}\right)_{j=1}^{p+1}.$
  • The suggested $\hat{d}$ for the RPL estimator is:
    $\hat{d}_{RPL}=\max\left(0,\ \frac{\hat{\alpha}_j^{*2}-1}{\hat{\alpha}_j^{*2}+\frac{1}{\lambda_j^*}}\right)_{j=1}^{p+1}.$
  • The two suggested $\hat{k}$ for the PKL estimator are $\hat{k}_{PKL.1}=\hat{k}_{PRR}$ and
    $\hat{k}_{PKL.2}=\left[\min\left(\frac{1}{2\hat{\alpha}_j^2+\frac{1}{\lambda_j}}\right)_{j=1}^{p+1}\right]^{1/2}.$
  • The two suggested $\hat{k}$ for the RPKL estimator are $\hat{k}_{RPKL.1}=\hat{k}_{RPRR}$ and
    $\hat{k}_{RPKL.2}=\left[\min\left(\frac{1}{2\hat{\alpha}_j^{*2}+\frac{1}{\lambda_j^*}}\right)_{j=1}^{p+1}\right]^{1/2}.$
Here, we obtain the optimal value of $k$ for the PMKL estimator by minimizing
$$MSE(\hat{\beta}_{PMKL})=\sum_{j=1}^{p+1}\frac{\lambda_j(\lambda_j-k)^2}{(\lambda_j+k)^4}+k^2\sum_{j=1}^{p+1}\frac{(k+3\lambda_j)^2\alpha_j^2}{(\lambda_j+k)^4}.$$
Differentiating $MSE(\hat{\beta}_{PMKL})$ with respect to $k$ and setting $\partial MSE(\hat{\beta}_{PMKL})/\partial k=0$, we have:
$$k=\frac{-(3\lambda_j\alpha_j^2+1)\pm\sqrt{9\lambda_j^2(\alpha_j^2)^2+10\lambda_j\alpha_j^2+1}}{2\alpha_j^2}\quad\text{or}\quad k=3\lambda_j.$$
Since $k>0$, we take:
$$k=\frac{\sqrt{9\lambda_j^2(\alpha_j^2)^2+10\lambda_j\alpha_j^2+1}-(3\lambda_j\alpha_j^2+1)}{2\alpha_j^2}\quad\text{or}\quad k=3\lambda_j.$$
Then, we replace the unknown parameter $\alpha_j^2$ with its estimator in the optimal value of $k$ in (41) for use in applications. Therefore, we have:
$$k=\frac{\sqrt{9\lambda_j^2(\hat{\alpha}_j^2)^2+10\lambda_j\hat{\alpha}_j^2+1}-(3\lambda_j\hat{\alpha}_j^2+1)}{2\hat{\alpha}_j^2}\quad\text{or}\quad k=3\lambda_j.$$
  • Three $\hat{k}$ are suggested for the PMKL estimator: $\hat{k}_{PMKL.1}=\hat{k}_{PRR}$, $\hat{k}_{PMKL.2}=\hat{k}_{PKL.2}$, and
    $\hat{k}_{PMKL.3}=\left[\min\left\{\frac{\sqrt{9\lambda_j^2(\hat{\alpha}_j^2)^2+10\lambda_j\hat{\alpha}_j^2+1}-(3\lambda_j\hat{\alpha}_j^2+1)}{2\hat{\alpha}_j^2},\ 3\lambda_j\right\}_{j=1}^{p+1}\right]^{1/2}.$
  • The three suggested $\hat{k}$ for the RPMKL estimator are $\hat{k}_{RPMKL.1}=\hat{k}_{RPRR}$, $\hat{k}_{RPMKL.2}=\hat{k}_{RPKL.2}$, and
    $\hat{k}_{RPMKL.3}=\left[\min\left\{\frac{\sqrt{9\lambda_j^{*2}(\hat{\alpha}_j^{*2})^2+10\lambda_j^*\hat{\alpha}_j^{*2}+1}-(3\lambda_j^*\hat{\alpha}_j^{*2}+1)}{2\hat{\alpha}_j^{*2}},\ 3\lambda_j^*\right\}_{j=1}^{p+1}\right]^{1/2}.$
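The plug-in rules above can be sketched as follows, coding the formulas as printed (helper name ours; pass the starred $\lambda_j^*$ and $\hat{\alpha}_j^*$ to obtain the robust versions $\hat{k}_{RPRR}$, $\hat{d}_{RPL}$, $\hat{k}_{RPKL.2}$, and $\hat{k}_{RPMKL.3}$).

```python
import numpy as np

def biasing_parameters(lam, alpha):
    """Hypothetical helper: plug-in biasing parameters from the eigenvalues
    lam and the components alpha of gamma' beta_hat."""
    a2 = alpha**2
    k_prr = np.max(a2)                                   # k-hat for PRR / RPRR
    d_pl = max(0.0, np.max((a2 - 1.0) / (a2 + 1.0 / lam)))   # d-hat for PL / RPL
    k_kl2 = np.sqrt(np.min(1.0 / (2.0 * a2 + 1.0 / lam)))    # second k-hat for PKL / RPKL
    # third k-hat for PMKL / RPMKL: the positive root of dMSE/dk = 0, capped at 3*lam
    root = (np.sqrt(9.0 * lam**2 * a2**2 + 10.0 * lam * a2 + 1.0)
            - (3.0 * lam * a2 + 1.0)) / (2.0 * a2)
    k_mkl3 = np.sqrt(np.min(np.minimum(root, 3.0 * lam)))
    return k_prr, d_pl, k_kl2, k_mkl3
```

Note that the root in the last rule is always positive, since $\sqrt{9\lambda_j^2\alpha_j^4+10\lambda_j\alpha_j^2+1}>3\lambda_j\alpha_j^2+1$, so the final square root is well defined.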

3. Simulation Study

We conducted a Monte Carlo simulation study to examine the performance of non-robust estimators (PML, PRR, PL, PKL, and PMKL) and robust estimators (RPML, RPRR, RPL, RPKL, and RPMKL). The program of the simulation study was written in the R programming language.

3.1. Simulation Design

The simulated data were generated with the following settings: $y_i$ was drawn from a Poisson distribution with mean $\mu_i=\exp(x_i'\beta)$, for $i=1,2,\dots,n$. The sample sizes (n) were set to 75, 100, 200, and 300. The number of explanatory variables (p) was set to 2 and 6. The coefficients satisfied $\beta'\beta=1$ with $\beta_2=\dots=\beta_{p+1}$, as in Kaçıranlar and Dawoud [28] and Abonazel and Dawoud [26]. The intercept was set to $\beta_1$ = −1, 0, and 1, as in Lukman et al. [12] and Abonazel and Dawoud [26]. The explanatory variables (X), as in Abonazel and Dawoud [26], were generated from a multivariate normal distribution $MVN(0,\Sigma_X)$, where $\operatorname{diag}(\Sigma_X)=1$ and the off-diagonal elements equal $\rho$, the correlation coefficient between the explanatory variables, with $\rho$ = 0.90, 0.95, and 0.99. Outliers were introduced at different percentages ($\tau$ = 0%, 10%, and 20%) by randomly replacing the selected percentage $\tau$ of the response values. As in Abonazel and Saber [22], Abonazel et al. [24], Dawoud and Abonazel [29], Awwad et al. [30], and Abonazel and Dawoud [26], the outliers were generated from a Poisson distribution with mean $Q_3+10\times IQR$, where $Q_3$ was the third quartile and $IQR$ the interquartile range of the $\mu_i(\beta)$ values.
We used the average MSE (AMSE) measure for verification, which was computed as:
$$\mathrm{AMSE}(\hat{\beta})=\frac{1}{1000}\sum_{l=1}^{1000}(\hat{\beta}_l-\beta)'(\hat{\beta}_l-\beta),$$
where β ^ l was the estimated vector at the l-th experiment of the simulation (the experiment was replicated 1000 times), and β was the true parameter vector.
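One cell of the design above can be sketched end to end for the PML estimator (the robust fits would slot into the same loop). This is a Python sketch, not the authors' R program: the function name is ours, all coefficients (including the intercept) are set equal with $\beta'\beta=1$ as a simplification of the paper's grid, and the replication count is reduced from 1000.

```python
import numpy as np

def simulate_amse(n=100, p=2, rho=0.90, tau=0.10, reps=200, seed=0):
    """AMSE of the PML estimator in one simulation cell: correlated
    regressors, Poisson response, and a tau-fraction of outliers drawn
    from Poisson(Q3 + 10*IQR of the mu_i)."""
    rng = np.random.default_rng(seed)
    Sigma = np.full((p, p), rho)
    np.fill_diagonal(Sigma, 1.0)
    L = np.linalg.cholesky(Sigma)
    beta = np.full(p + 1, 1.0 / np.sqrt(p + 1))   # beta'beta = 1 (simplified layout)
    sse = 0.0
    for _ in range(reps):
        X = np.column_stack([np.ones(n), rng.normal(size=(n, p)) @ L.T])
        mu = np.exp(X @ beta)
        y = rng.poisson(mu).astype(float)
        # inject outliers with mean Q3 + 10*IQR of the mu_i values
        q1, q3 = np.percentile(mu, [25, 75])
        idx = rng.choice(n, size=int(tau * n), replace=False)
        y[idx] = rng.poisson(q3 + 10.0 * (q3 - q1), size=idx.size)
        # PML fit via IWLS (clipped linear predictor for numerical safety)
        b = np.zeros(p + 1)
        for _ in range(25):
            m = np.exp(np.clip(X @ b, -20.0, 20.0))
            z = X @ b + (y - m) / m
            Xw = X * m[:, None]
            b = np.linalg.solve(X.T @ Xw, Xw.T @ z)
        sse += np.sum((b - beta)**2)
    return sse / reps
```

Running the sketch with $\tau=0$ versus $\tau=0.2$ reproduces the qualitative pattern reported below: the PML AMSE rises sharply once outliers are present.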

3.2. Simulation Results

The simulation results are summarized in Table A1, Table A2, Table A3, Table A4, Table A5, Table A6, Table A7, Table A8 and Table A9 in Appendix A. In each table, the best value of the AMSE is highlighted in bold. According to the simulation results, we concluded the following:
  • The PML performance was the worst of the given estimators in the existence of both outliers and multicollinearity problems, as expected;
  • The estimators’ AMSE values increased as the multicollinearity degree ($\rho$), the number of explanatory variables (p), or the outlier percentage ($\tau$) increased;
  • The estimators’ AMSE values decreased as the sample size (n) increased, when the other factors were fixed;
  • When only the multicollinearity problem existed in the model (as in Table A1, Table A2 and Table A3 or when no outliers existed, i.e., τ = 0 % ), the non-robust estimators (PML, PRR, PL, PKL, and PMKL) were better than the corresponding robust estimators (RPML, RPRR, RPL, RPKL, and RPMKL) for all ρ , p, and n values;
  • When both problems existed in the model (as in Table A4, Table A5, Table A6, Table A7, Table A8 and Table A9 or when outliers existed, i.e., τ > 0 % ), the RPML, RPRR, RPL, RPKL, and RPMKL estimators were better than the PML, PRR, PL, PKL, and PMKL estimators, respectively, for all ρ , p, and n values;
  • For n = 75, 100, β 1 = −1, τ = 0% and ρ = 0.90, 0.95, the PKL.1 AMSE values were the lowest among all models, and for n = 75, 100, β1 = −1, τ = 10% and ρ = 0.90, the RPKL.1 AMSE values were the lowest among all models;
  • When τ = 0 % , the PMKL estimator, and in particular, the PMKL.1 was the best, followed by the PKL, and in particular, the PKL.1 in most situations;
  • In addition, when τ > 0 % , the RPMKL estimator was the best, particularly the RPMKL.1, followed by the RPKL estimator, particularly the RPKL.1 in most situations;
  • Finally, the PMKL estimator achieved the best performance among all given estimators when only the multicollinearity problem existed in the model. If both outliers and multicollinearity problems existed in the model, the RPMKL estimator achieved the best performance among all given estimators in most situations.

4. Empirical Applications

In this section, we consider two real life applications to evaluate the performance of the proposed robust estimators.

4.1. Aircraft Damage Data

To evaluate the performance of the proposed robust estimators, we first considered the aircraft damage dataset, which Myers et al. [31] discussed in detail. The dataset provides information on two aircraft, the McDonnell Douglas A-4 Skyhawk and the Grumman A-6 Intruder. The sample size was n = 30, with one dependent variable (y) and three explanatory variables (X’s): y was the number of locations where damage was inflicted on the aircraft, X1 was the type of aircraft, X2 was the bomb load measured in tons, and X3 was the aircraft experience in months.
Myers et al. [31] indicated the existence of severe multicollinearity in this dataset. Indeed, the eigenvalues of these data were 4.3333, 374.8961, and 2085.2251, and the condition number (CN) was 219.37 > 30, confirming a multicollinearity issue among the explanatory variables. Regarding outliers, there were three outlying observations (1, 29, and 30) in the data, as shown in Figure 1. Therefore, these data suffered from both multicollinearity and outliers.
Since this section aims to assess the superiority of the new robust estimators in a real-life application, the efficiency of the considered estimators was assessed through the MSE. Table 1 presents the estimated regression coefficients and MSE values of the ten estimators for these data. From Table 1, we can note the following:
  • The PML estimator performed the worst among all given estimators;
  • The robust estimators achieved a better performance than the corresponding non-robust estimators;
  • The PMKL performed better in general, followed by the PKL, and then the other non-robust estimators. Additionally, the RPMKL achieved a better performance in general, followed by the RPKL and then the other robust estimators;
  • Finally, the RPMKL estimator, particularly RPMKL.1, was the best, having the lowest MSE value.
We verified the theoretical results through the aircraft damage data, as follows:
  • Since the condition $\sum_{j=1}^{p+1}\frac{(1-d)\alpha_j^{*2}-2}{(\lambda_j^*+1)^2}=-0.083<\sum_{j=1}^{p+1}\frac{1+d}{\lambda_j^*(\lambda_j^*+1)^2}=0.067$ was satisfied, the RPL estimator was better than the RPML estimator;
  • Since the condition $\sum_{j=1}^{p+1}\frac{k\alpha_j^{*2}}{(\lambda_j^*+k)^2}=0.089<\sum_{j=1}^{p+1}\frac{1}{(\lambda_j^*+k)^2}=0.141$ was satisfied, the RPKL estimator was better than the RPML estimator;
  • Since the condition $\sum_{j=1}^{p+1}\frac{k^2(k+3\lambda_j^*)^2\alpha_j^{*2}}{(\lambda_j^*+k)^4}=0.431<\sum_{j=1}^{p+1}\frac{(\lambda_j^*+k)^4-\lambda_j^{*2}(\lambda_j^*-k)^2}{\lambda_j^*(\lambda_j^*+k)^4}=0.536$ was satisfied, the RPMKL estimator was better than the RPML estimator;
  • Since the condition $\sum_{j=1}^{p+1}\frac{(\lambda_j^*+d)^2+(1-d)^2\lambda_j^*\alpha_j^{*2}}{\lambda_j^*(\lambda_j^*+1)^2}=0.392<\sum_{j=1}^{p+1}\frac{\lambda_j^*+k^2\alpha_j^{*2}}{(\lambda_j^*+k)^2}=0.400$ was satisfied, the RPL estimator was better than the RPRR estimator;
  • Since the condition $\sum_{j=1}^{p+1}\frac{3k\alpha_j^{*2}}{(\lambda_j^*+k)^2}=0.249<\sum_{j=1}^{p+1}\frac{2\lambda_j^*-k}{\lambda_j^*(\lambda_j^*+k)^2}=0.252$ was satisfied, the RPKL estimator was better than the RPRR estimator;
  • Since the condition $\sum_{j=1}^{p+1}\frac{k\lambda_j^*\alpha_j^{*2}(2\lambda_j^*+k)}{(\lambda_j^*+k)^4}=0.098<\sum_{j=1}^{p+1}\frac{\lambda_j^{*2}}{(\lambda_j^*+k)^4}=0.111$ was satisfied, the RPMKL estimator was better than the RPRR estimator;
  • Since the condition $\sum_{j=1}^{p+1}\frac{(\lambda_j^*-k)^2+4k^2\lambda_j^*\alpha_j^{*2}}{\lambda_j^*(\lambda_j^*+k)^2}=0.386<\sum_{j=1}^{p+1}\frac{(\lambda_j^*+d)^2+(1-d)^2\lambda_j^*\alpha_j^{*2}}{\lambda_j^*(\lambda_j^*+1)^2}=0.392$ was satisfied, the RPKL estimator was better than the RPL estimator;
  • Since the condition $\sum_{j=1}^{p+1}\frac{\lambda_j^*(\lambda_j^*-k)^2+k^2(k+3\lambda_j^*)^2\alpha_j^{*2}}{(\lambda_j^*+k)^4}=0.172<\sum_{j=1}^{p+1}\frac{(\lambda_j^*+d)^2+(1-d)^2\lambda_j^*\alpha_j^{*2}}{\lambda_j^*(\lambda_j^*+1)^2}=0.302$ was satisfied, the RPMKL estimator was better than the RPL estimator;
  • Since the condition $\sum_{j=1}^{p+1}\frac{k\alpha_j^{*2}(5\lambda_j^{*2}-2\lambda_j^*k-3k^2)}{(\lambda_j^*+k)^4}=0.007<\sum_{j=1}^{p+1}\frac{(\lambda_j^*-k)^2(2\lambda_j^*+k)}{\lambda_j^*(\lambda_j^*+k)^4}=0.132$ was satisfied, the RPMKL estimator was better than the RPKL estimator.
Figure 2 presents the MSE values of the non-robust estimators against k and d from 0 to 1. The MSE values of the PKL and PMKL estimators were smaller than the MSE values of the other estimators. However, the PMKL estimator had the smallest MSE values. Figure 3 presents the MSE values of the robust estimators against k and d from 0 to 1. Additionally, we noted that the MSE values of the RPKL and RPMKL estimators were smaller than the MSE values of the other estimators, and the RPMKL estimator had the smallest MSE values.

4.2. Somerville Lake Data

We used a dataset from Cameron and Trivedi [32] to estimate a recreation demand function. These data come from a survey on the number of recreational boating trips to Somerville Lake, East Texas, in 1980. The sample size was 201 observations, with one response variable (y) and three explanatory variables (X’s): y was the number of recreational boating trips to Somerville Lake in 1980, X1 was the cost of visiting Lake Conroe, X2 was the cost of visiting Somerville Lake, and X3 was the cost of visiting Houston Lake. These data were also used by Månsson and Kibria [33] and Abonazel and Dawoud [26].
To investigate the existence of multicollinearity, Abonazel and Dawoud [26] showed that the condition number (CN) was 24.56, the variance inflation factor (VIF) values of the explanatory variables were 41.6, 13.2, and 25.03, respectively, and all correlation coefficients among the three explanatory variables were greater than 0.90. All these measures (correlations, VIF, and CN) confirmed the presence of severe multicollinearity. Regarding outliers, there were 22 outlying values in the data, as shown in Figure 4. Therefore, these data suffered from both multicollinearity and outliers. Table 2 presents the estimated regression coefficients and the estimated MSE values of the ten estimators for these data. From Table 2, we can note the following:
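Diagnostics of this kind are straightforward to reproduce. The sketch below computes VIF values and the condition number for a toy collinear design (the matrix `X` here is simulated, not the Somerville Lake data); columns correlated above 0.90, as in this application, push both measures well past the usual rule-of-thumb cutoffs.

```python
import numpy as np

# Toy design with strong collinearity: three noisy copies of one signal.
rng = np.random.default_rng(0)
base = rng.normal(size=200)
X = np.column_stack([base + 0.05 * rng.normal(size=200) for _ in range(3)])

def vif(X):
    # VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing
    # column j on the remaining columns (with an intercept).
    out = []
    for j in range(X.shape[1]):
        y, Z = X[:, j], np.delete(X, j, axis=1)
        Z1 = np.column_stack([np.ones(len(y)), Z])
        resid = y - Z1 @ np.linalg.lstsq(Z1, y, rcond=None)[0]
        r2 = 1 - resid.var() / y.var()
        out.append(1 / (1 - r2))
    return np.array(out)

def condition_number(X):
    # CN = sqrt(max/min eigenvalue) of the correlation matrix.
    eig = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))
    return np.sqrt(eig.max() / eig.min())

print(vif(X), condition_number(X))
```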
  • The PML performed the worst among all of the estimators considered;
  • The robust estimators achieved a better performance than the corresponding non-robust estimators;
  • The PMKL achieved a better performance in general, followed by the PKL, and then the other non-robust estimators. Additionally, the RPMKL achieved a better performance in general, followed by the RPKL, and then the other robust estimators;
  • Finally, the RPMKL estimators, particularly RPMKL.1, performed best, with the lowest AMSE value.
Using the Somerville Lake data, we verified the theoretical results as follows:
  • Since the condition $\sum_{j=1}^{p+1} \frac{(1-d)^{2}\alpha_j^{*2}}{(\lambda_j^{*}+1)^{2}} = 0.042 < \sum_{j=1}^{p+1} \frac{(1-d)(1+d+2\lambda_j^{*})}{\lambda_j^{*}(\lambda_j^{*}+1)^{2}} = 0.089$ was satisfied, the RPL estimator was better than the RPML estimator;
  • Since the condition $\sum_{j=1}^{p+1} \frac{k\alpha_j^{*2}}{(\lambda_j^{*}+k)^{2}} = 0.066 < \sum_{j=1}^{p+1} \frac{1}{(\lambda_j^{*}+k)^{2}} = 0.123$ was satisfied, the RPKL estimator was better than the RPML estimator;
  • Since the condition $\sum_{j=1}^{p+1} \frac{k^{2}(k+3\lambda_j^{*})^{2}\alpha_j^{*2}}{(\lambda_j^{*}+k)^{4}} = 0.438 < \sum_{j=1}^{p+1} \frac{(\lambda_j^{*}+k)^{4}-\lambda_j^{*2}(\lambda_j^{*}-k)^{2}}{\lambda_j^{*}(\lambda_j^{*}+k)^{4}} = 0.614$ was satisfied, the RPMKL estimator was better than the RPML estimator;
  • Since the condition $\sum_{j=1}^{p+1} \frac{(\lambda_j^{*}+d)^{2}+(1-d)^{2}\lambda_j^{*}\alpha_j^{*2}}{\lambda_j^{*}(\lambda_j^{*}+1)^{2}} = 0.453 < \sum_{j=1}^{p+1} \frac{\lambda_j^{*}+k^{2}\alpha_j^{*2}}{(\lambda_j^{*}+k)^{2}} = 0.555$ was satisfied, the RPL estimator was better than the RPRR estimator;
  • Since the condition $\sum_{j=1}^{p+1} \frac{3k\alpha_j^{*2}}{(\lambda_j^{*}+k)^{2}} = 0.079 < \sum_{j=1}^{p+1} \frac{2\lambda_j^{*}-k}{\lambda_j^{*}(\lambda_j^{*}+k)^{2}} = 0.455$ was satisfied, the RPKL estimator was better than the RPRR estimator;
  • Since the condition $\sum_{j=1}^{p+1} \frac{k \lambda_j^{*} \alpha_j^{*2} (2\lambda_j^{*}+k)}{(\lambda_j^{*}+k)^{4}} = 0.063 < \sum_{j=1}^{p+1} \frac{\lambda_j^{*2}}{(\lambda_j^{*}+k)^{4}} = 0.129$ was satisfied, the RPMKL estimator was better than the RPRR estimator;
  • Since the condition $\sum_{j=1}^{p+1} \frac{(\lambda_j^{*}-k)^{2}+4k^{2}\lambda_j^{*}\alpha_j^{*2}}{\lambda_j^{*}(\lambda_j^{*}+k)^{2}} = 0.373 < \sum_{j=1}^{p+1} \frac{(\lambda_j^{*}+d)^{2}+(1-d)^{2}\lambda_j^{*}\alpha_j^{*2}}{\lambda_j^{*}(\lambda_j^{*}+1)^{2}} = 0.453$ was satisfied, the RPKL estimator was better than the RPL estimator;
  • Since the condition $\sum_{j=1}^{p+1} \frac{\lambda_j^{*}(\lambda_j^{*}-k)^{2}+k^{2}(k+3\lambda_j^{*})^{2}\alpha_j^{*2}}{(\lambda_j^{*}+k)^{4}} = 0.241 < \sum_{j=1}^{p+1} \frac{(\lambda_j^{*}+d)^{2}+(1-d)^{2}\lambda_j^{*}\alpha_j^{*2}}{\lambda_j^{*}(\lambda_j^{*}+1)^{2}} = 0.429$ was satisfied, the RPMKL estimator was better than the RPL estimator;
  • Since the condition $\sum_{j=1}^{p+1} \frac{k\alpha_j^{*2}(5\lambda_j^{*2}-2\lambda_j^{*}k-3k^{2})}{(\lambda_j^{*}+k)^{4}} = 0.029 < \sum_{j=1}^{p+1} \frac{(\lambda_j^{*}-k)^{2}(2\lambda_j^{*}+k)}{\lambda_j^{*}(\lambda_j^{*}+k)^{4}} = 0.166$ was satisfied, the RPMKL estimator was better than the RPKL estimator.
We checked the efficiency of our proposed estimators for 0 < k = d < 1 using the Somerville Lake data. Figures 5 and 6 present the MSE values of the non-robust and robust estimators, respectively. As in the first application, these figures indicated that the PKL and PMKL estimators (and, among the robust versions, the RPKL and RPMKL estimators) had smaller MSE values than the other estimators, with the PMKL and RPMKL estimators attaining the smallest MSE values.
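The grid scan behind such figures is simple to sketch. Assuming hypothetical spectral values for `lam` and `alpha` (not the Somerville Lake fit), the following traces the theoretical MSE of the ridge- and modified Kibria-Lukman-type estimators as the single parameter k = d moves across (0, 1) and locates the minimizing value on each curve.

```python
import numpy as np

# Hypothetical eigenvalues and transformed coefficients; in the applications
# these come from the (robust) fit of the real data.
lam = np.array([4.0, 0.9, 0.08])
alpha = np.array([0.6, 0.4, 0.3])

def mse_ridge(k):
    # Theoretical scalar MSE of the ridge-type estimator.
    return np.sum((lam + k**2 * alpha**2) / (lam + k) ** 2)

def mse_mkl(k):
    # Theoretical scalar MSE of the modified Kibria-Lukman-type estimator.
    return np.sum((lam * (lam - k) ** 2 + k**2 * (k + 3 * lam) ** 2 * alpha**2)
                  / (lam + k) ** 4)

# Scan k = d over (0, 1), as in the MSE-versus-k figures.
grid = np.linspace(0.01, 0.99, 99)
ridge_curve = np.array([mse_ridge(k) for k in grid])
mkl_curve = np.array([mse_mkl(k) for k in grid])

best_k_ridge = grid[ridge_curve.argmin()]
best_k_mkl = grid[mkl_curve.argmin()]
print(best_k_ridge, best_k_mkl)
```

At k = 0 both expressions collapse to the maximum likelihood MSE, so the curves share a common starting point.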

5. Conclusions

In the PRM, outliers cause the PML estimator to be inefficient; in this case, the RPML is used to reduce the effect of the outlying values. The PML also performs poorly when multicollinearity is present, so several biased estimators have been proposed for the PRM to handle multicollinearity, such as the PRR, PL, PKL, and PMKL (Månsson and Shukur [10], Månsson et al. [11], Lukman et al. [12], and Aladeitan et al. [13], respectively). Following Abonazel and Dawoud [26], who provided the RPRR estimator to deal with both outliers and multicollinearity simultaneously in the PRM, we proposed in this paper new robust versions of three one-parameter biased estimators for the PRM, namely, the RPL, RPKL, and RPMKL estimators. Theoretically, necessary and sufficient conditions for the superiority of the proposed estimators (RPL, RPKL, and RPMKL) were obtained. A simulation study and two applications were then conducted to compare the performance of the proposed estimators with that of the existing ones. The results indicated that the proposed estimators outperformed the others, especially the RPMKL, in most situations where both outliers and multicollinearity occurred in the model. Therefore, when outliers and multicollinearity occur together in the PRM, we recommend that practitioners use the RPMKL and RPKL estimators to estimate the regression parameters of the model. For future work, we aim to develop further estimators for handling both problems simultaneously in the PRM, such as robust jackknife Liu, robust jackknife Kibria–Lukman, and robust jackknife modified Kibria–Lukman estimators, extending Abonazel and Dawoud [26] and Arum et al. [27].

Author Contributions

M.R.A., I.D. and F.A.A. contributed to conception and structural design of the manuscript. M.R.A. performed the simulation and applications. M.R.A., I.D., F.A.A. and E.T.E. contributed to manuscript revision. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The datasets used in this study are cited within the paper.

Acknowledgments

The authors thank the Deanship of Scientific Research at King Saud University, represented by the Research Center at CBA, for financially supporting this research.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

Table A1. AMSE of different estimators when τ = 0% and ρ = 0.90.
Columns: p, n, then the non-robust estimators (PML, PRR, PL, PKL.1, PKL.2, PMKL.1, PMKL.2, PMKL.3) followed by the robust estimators (RPML, RPRR, RPL, RPKL.1, RPKL.2, RPMKL.1, RPMKL.2, RPMKL.3).
β1 = −1
2750.36230.17560.28300.10900.20780.13590.17010.18970.70510.20910.48760.19120.27280.23900.20660.2410
1000.19330.13040.17080.10490.14880.11570.13520.14220.30220.17320.25310.13470.20550.16350.17880.1925
2000.09580.07940.08980.06890.08400.06560.07970.08200.14080.10730.12750.08620.11520.08100.10610.1109
3000.04450.04090.04310.03830.04180.03700.04080.04130.06870.06040.06520.05440.06220.05120.05970.0610
6750.75200.47990.56200.29700.44890.22300.36060.40721.17070.60600.83240.31970.60710.23660.46620.5402
1000.20710.18330.18960.16650.18090.15790.17330.17710.27140.23130.24470.20590.23150.19630.22150.2263
2000.11740.10950.11130.10260.10810.09720.10420.10630.16570.15130.15410.13880.14770.12930.14050.1444
3000.09180.08820.08760.08480.08510.08180.08220.08380.12930.12160.12140.11450.11750.10860.11250.1152
β1 = 0
2750.09300.08290.08240.07350.07730.06580.07080.07430.13040.11010.10930.09180.09900.07840.08720.0936
1000.05730.05400.05360.05080.05150.04800.04890.05030.07780.07160.07090.06590.06690.06110.06250.0649
2000.03040.02920.02910.02800.02850.02690.02760.02810.04730.04430.04420.04150.04270.03890.04060.0418
3000.01740.01700.01700.01670.01680.01640.01660.01670.02480.02410.02400.02330.02360.02260.02300.0234
6750.13730.12900.12870.12110.12470.11410.11910.12210.17790.16490.16450.15260.15820.14210.14960.1543
1000.08820.08480.08470.08140.08320.07840.08080.08210.11640.11040.11030.10460.10760.09940.10350.1057
2000.04370.04270.04270.04170.04220.04080.04150.04190.05800.05620.05610.05440.05530.05280.05410.0548
3000.01620.01610.01610.01590.01600.01580.01590.01600.02210.02190.02190.02170.02180.02150.02160.0217
β1 = 1
2750.05910.05310.05650.04760.05420.04320.05200.05320.07990.06900.07530.05910.07100.05160.06700.0692
1000.02250.02180.02220.02120.02190.02070.02170.02180.03380.03230.03310.03080.03250.02950.03190.0322
2000.01020.01010.01020.00990.01010.00980.01000.01010.01610.01570.01590.01530.01570.01500.01560.0157
3000.00630.00630.00630.00620.00630.00620.00630.00630.00880.00870.00880.00860.00880.00850.00870.0087
6750.10060.09450.09710.08860.09450.08350.09180.09330.13490.12440.12890.11460.12450.10620.11980.1224
1000.02650.02620.02630.02590.02620.02570.02610.02620.03410.03360.03380.03310.03360.03270.03340.0335
2000.01060.01060.01060.01050.01060.01050.01050.01060.01330.01330.01330.01320.01330.01310.01320.0132
3000.00920.00920.00920.00920.00920.00910.00920.00920.01280.01270.01270.01260.01270.01250.01260.0127
Table A2. AMSE of different estimators when τ = 0% and ρ = 0.95.
Columns: p, n, then the non-robust estimators (PML, PRR, PL, PKL.1, PKL.2, PMKL.1, PMKL.2, PMKL.3) followed by the robust estimators (RPML, RPRR, RPL, RPKL.1, RPKL.2, RPMKL.1, RPMKL.2, RPMKL.3).
β1 = −1
2750.63850.19480.43560.09810.23830.12870.17140.20691.27000.18840.76940.42520.27510.26750.19020.2376
1000.31490.17110.25660.09990.20000.10530.16680.18430.50100.20810.37890.10740.25800.14510.20170.2311
2000.16180.11740.14270.08510.12520.07180.11190.11900.24430.15180.20260.09150.16460.07620.13920.1528
3000.07820.06720.07290.05800.06860.05180.06460.06680.11890.09370.10670.07350.09650.06210.08790.0926
6750.61380.44050.47460.31250.39920.25220.33600.36960.79780.51360.59290.33210.47530.26740.39130.4355
1000.60590.44180.48890.31500.41880.25010.35690.38990.76700.51700.59350.33760.48860.26270.40470.4491
2000.34770.29250.30680.24450.28350.21030.25780.27170.44610.35920.38090.28610.34400.23830.30560.3262
3000.08100.07800.07860.07520.07730.07280.07570.07660.11120.10570.10690.10070.10470.09630.10170.1033
β1 = 0
2750.17040.13270.13150.10020.11310.07920.09390.10430.26260.17910.18550.11350.14770.08120.11490.1325
1000.08420.07510.07460.06660.06990.05970.06400.06720.12940.10860.10770.09000.09700.07630.08490.0915
2000.04410.04160.04150.03910.04030.03690.03850.03950.08190.07310.07270.06490.06840.05810.06270.0658
3000.02670.02570.02560.02470.02510.02390.02440.02480.04220.03980.03970.03750.03840.03540.03660.0376
6750.13580.12610.12600.11690.12170.10890.11540.11880.15710.14410.14400.13190.13820.12160.13000.1345
1000.13370.12220.12200.11140.11690.10240.10970.11360.18330.16300.16280.14420.15370.12960.14170.1482
2000.05700.05570.05570.05440.05510.05310.05410.05470.08480.08200.08190.07920.08070.07670.07870.0798
3000.05110.04990.04990.04870.04930.04760.04850.04900.07020.06800.06790.06580.06690.06380.06540.0662
β1 = 1
2750.05990.05390.05730.04830.05490.04380.05270.05390.08500.07290.07970.06190.07490.05380.07040.0729
1000.05210.04700.04970.04220.04760.03820.04550.04670.07350.06300.06860.05340.06430.04620.06030.0625
2000.01640.01600.01630.01560.01610.01520.01590.01600.02580.02460.02530.02350.02490.02250.02440.0247
3000.00980.00970.00970.00950.00970.00940.00960.00960.01560.01510.01540.01470.01520.01430.01500.0151
6750.06770.06540.06630.06330.06540.06130.06430.06490.08720.08340.08480.07970.08320.07650.08140.0824
1000.09920.09220.09490.08560.09200.08010.08870.09050.13240.12050.12500.10940.12000.10050.11450.1175
2000.02760.02730.02740.02690.02730.02660.02710.02720.03780.03720.03740.03650.03720.03590.03680.0370
3000.01710.01700.01700.01680.01700.01670.01690.01690.02400.02370.02380.02340.02370.02320.02360.0236
Table A3. AMSE of different estimators when τ = 0% and ρ = 0.99.
Columns: p, n, then the non-robust estimators (PML, PRR, PL, PKL.1, PKL.2, PMKL.1, PMKL.2, PMKL.3) followed by the robust estimators (RPML, RPRR, RPL, RPKL.1, RPKL.2, RPMKL.1, RPMKL.2, RPMKL.3).
β1 = −1
2752.81120.20681.46521.71030.19210.10670.11760.16565.86020.11873.07735.06310.30390.41510.14660.1912
1001.23860.16030.73620.26710.23300.10880.14660.19642.04040.13961.15560.86430.24850.20000.15320.2120
2000.54350.19380.36580.03820.20960.02960.14330.17900.80620.17040.49100.05860.20730.03720.13030.1728
3000.48030.17910.33000.03390.19150.02280.13070.16340.79830.18210.49720.03760.21370.02460.13160.1766
6753.43820.68602.07650.69100.95460.16820.64930.81444.40140.64262.63431.24421.06470.20370.71090.9032
1001.98670.72751.28350.25530.76740.15550.54620.66452.47330.72431.55560.32280.84170.16040.58500.7230
2000.95010.59470.72360.34080.56970.21380.45740.51771.23460.67680.88580.32300.64750.23820.49580.5771
3000.52280.40420.44520.30310.39380.23900.34490.37140.71430.50880.58040.34290.49080.25300.41320.4551
β1 = 0
2750.79110.20460.41630.03950.19540.01920.12060.16131.19380.17150.62350.17520.23340.02910.14120.1929
1000.36560.20570.22490.09600.15500.04020.10940.13360.56370.23820.32320.06580.19300.06080.12860.1628
2000.31480.18790.20210.09550.14520.03670.10390.12590.44060.20580.25560.06600.15880.05990.10560.1339
3000.11440.09370.09260.07510.08160.06170.06960.07610.18270.13480.13460.09450.11010.07060.08730.0996
6752.41530.76341.41380.22400.78960.11830.53030.66813.09410.72161.78670.36170.88010.12230.58000.7410
1000.69680.49880.51140.34120.42570.25860.34460.38810.97510.61820.67840.36370.53240.26090.41530.4779
2000.29280.24910.24870.20960.22910.18040.20450.21790.40720.32660.32870.25690.29330.21120.25310.2749
3000.17210.15970.15950.14770.15400.13720.14580.15030.23920.21610.21580.19430.20560.17600.19110.1990
β1 = 1
2750.37110.17770.27940.05790.19000.01690.14130.16760.53370.18480.37000.02600.20820.03260.14080.1773
1000.12900.09840.11420.07210.10110.05040.09010.09610.20710.13280.17060.07570.13750.05580.11370.1266
2000.08250.06950.07640.05760.07110.04870.06610.06880.11530.09020.10350.06830.09300.05380.08380.0888
3000.03750.03430.03590.03120.03450.02850.03320.03390.05690.04970.05340.04300.05040.03770.04750.0491
6750.72940.45630.56120.26770.43940.19880.35880.40230.89800.51370.66780.27380.49930.20120.39880.4530
1000.29230.25290.26750.21670.25060.18870.23270.24250.36010.30160.32320.24910.29790.21080.27220.2862
2000.19610.17680.18490.15870.17680.14350.16810.17290.26950.23550.24980.20400.23540.17910.22050.2287
3000.07320.07030.07130.06750.07010.06480.06860.06940.10240.09690.09890.09160.09660.08670.09380.0953
Table A4. AMSE of different estimators when τ = 10% and ρ = 0.90.
Columns: p, n, then the non-robust estimators (PML, PRR, PL, PKL.1, PKL.2, PMKL.1, PMKL.2, PMKL.3) followed by the robust estimators (RPML, RPRR, RPL, RPKL.1, RPKL.2, RPMKL.1, RPMKL.2, RPMKL.3).
β1 = −1
2752.81512.71292.74312.62752.68022.57292.62992.65670.90380.53280.71270.42110.57540.43260.52560.5513
1002.24682.23452.23092.22262.21462.21182.20042.20810.32640.31290.31640.30530.31250.30660.31250.3122
2002.19812.19222.18972.18642.18042.18092.17242.17680.25100.24540.24590.24130.24360.24030.24240.2430
3002.18462.18172.18022.17892.17482.17612.17022.17280.22230.22360.22230.22240.22260.22210.22350.2230
6755.50135.24935.36965.02535.26724.85915.16475.22041.23070.88701.03950.68180.90210.61500.80980.8582
1005.22725.11885.14605.01595.09184.92695.02995.06370.94360.78220.81980.65740.74230.59150.67980.7127
2006.32196.29956.30916.27716.30016.25516.28936.29520.20180.20220.20170.20180.20210.20170.20270.2023
3003.84743.83893.83693.83043.82903.82213.81993.82490.19430.19330.19340.19370.19380.19290.19450.1941
β1 = 0
2751.79491.75361.76231.71421.74331.67961.71971.73260.33080.27250.27810.22340.24900.19380.22200.2365
1002.34202.31882.32432.29602.31462.27422.30142.30870.16010.15070.15020.14190.14510.13480.13900.1423
2002.06992.06292.06452.05592.06152.04902.05732.05960.09150.09010.09000.08880.08930.08760.08830.0889
3002.14352.13902.13962.13452.13772.13012.13492.13650.07420.07380.07380.07340.07360.07310.07330.0734
6755.46925.32845.44495.19355.40735.07445.37735.39380.60370.53850.54920.47960.52140.43490.48720.5058
1004.66694.61164.64584.55694.62694.50424.60734.61810.25070.24080.24050.23120.23590.22250.22910.2328
2004.38474.36644.38094.34824.37584.33024.37134.37380.11860.11670.11670.11490.11580.11310.11450.1152
3003.25193.24323.24823.23463.24503.22593.24163.24350.07460.07410.07410.07350.07380.07300.07340.0736
β1 = 1
2752.30462.28162.30152.25872.29622.23632.29202.29430.07600.07140.07430.06710.07260.06310.07100.0719
1001.90571.89131.90311.87681.89921.86261.89601.89770.05790.05520.05680.05250.05580.05000.05480.0553
2002.64692.63722.64572.62752.64362.61782.64202.64290.03650.03540.03610.03430.03570.03330.03520.0355
3002.32762.32202.32692.31652.32572.31092.32482.32530.03020.02960.02990.02890.02970.02830.02940.0296
6756.43956.31806.43356.19876.41626.08616.40466.41100.29740.28150.28760.26630.28100.25250.27330.2775
1006.74086.69066.73906.64086.73326.59216.72946.73150.12050.11660.11820.11280.11660.10920.11470.1158
2003.21333.20043.21223.18753.20983.17473.20813.20910.04320.04260.04280.04200.04260.04140.04220.0424
3004.26284.25454.26244.24614.26124.23784.26044.26080.03490.03450.03470.03410.03450.03370.03430.0344
Table A5. AMSE of different estimators when τ = 10% and ρ = 0.95.
Columns: p, n, then the non-robust estimators (PML, PRR, PL, PKL.1, PKL.2, PMKL.1, PMKL.2, PMKL.3) followed by the robust estimators (RPML, RPRR, RPL, RPKL.1, RPKL.2, RPMKL.1, RPMKL.2, RPMKL.3).
β1 = −1
2753.38882.95053.20412.67093.03202.57272.91612.97741.57940.52471.10280.55140.69260.44990.58540.6438
1002.49242.43432.44372.38162.40252.34152.36682.38600.45410.36790.39280.30920.35690.29350.33490.3462
2002.31632.29682.29372.27842.27162.26232.25302.26300.39500.35160.35840.31710.33780.29980.32110.3299
3002.39722.38872.38542.38042.37222.37282.36112.36710.28770.27590.27600.26590.26980.26010.26420.2671
6758.77867.39178.51356.35668.16305.84847.89678.04132.08241.22871.68980.78031.37890.63531.17911.2848
1006.35765.90416.16685.50856.00375.23615.84965.93321.18730.90210.97900.69290.84920.59800.74860.8018
2006.05405.84375.96035.64105.88345.45935.80225.84670.54890.50440.50250.46420.47520.43360.44710.4623
3004.79094.76544.76294.74014.74344.71564.72044.73300.31690.30930.30830.30210.30300.29590.29700.3002
β1 = 0
2752.57812.42192.50882.28432.45462.18832.40102.43020.44670.31660.34910.21930.29110.17720.24620.2702
1003.40233.34033.37583.28073.35353.22743.33023.34300.24280.21030.21230.18160.19610.16140.17890.1882
2002.41612.40232.40692.38872.40132.37572.39402.39800.13110.12530.12490.11980.12180.11510.11780.1200
3002.26672.26002.26192.25332.25912.24672.25532.25740.08700.08530.08520.08350.08420.08200.08300.0837
6758.24797.94038.18627.64658.10237.38838.03178.07050.88030.73810.78940.61950.73530.54440.67820.7091
1005.19835.00775.14934.82525.09044.66445.03855.06700.59800.52910.53620.46640.50600.41870.46860.4889
2003.85413.81443.84023.77523.82673.73743.81323.82060.17260.16630.16620.16010.16340.15440.15910.1614
3004.64064.61734.63424.59414.62674.57134.61984.62360.10520.10350.10350.10190.10280.10030.10160.1022
β1 = 1
2752.54752.46862.53512.39272.51472.32532.49882.50750.15780.13450.14760.11330.13820.09720.12960.1343
1002.69552.63082.68692.56842.67132.51232.65952.66600.12800.11040.11980.09420.11260.08180.10580.1095
2002.01962.00912.01801.99872.01531.98852.01322.01440.04990.04780.04900.04580.04820.04390.04740.0478
3002.35242.34532.35142.33832.34982.33132.34852.34920.03650.03540.03600.03440.03560.03340.03520.0354
6758.07757.81148.06507.55738.02807.33568.00368.01700.34480.31310.32570.28370.31240.26030.29820.3060
1008.27817.97708.26007.68558.21287.42078.18058.19830.26960.25010.25680.23160.24860.21570.23900.2442
2005.39315.36835.39215.34375.38885.31945.38675.38790.06590.06470.06510.06350.06460.06230.06400.0644
3003.48263.46763.48113.45263.47803.43783.47573.47700.03490.03460.03470.03420.03450.03390.03430.0344
Table A6. AMSE of different estimators when τ = 10% and ρ = 0.99.
Columns: p, n, then the non-robust estimators (PML, PRR, PL, PKL.1, PKL.2, PMKL.1, PMKL.2, PMKL.3) followed by the robust estimators (RPML, RPRR, RPL, RPKL.1, RPKL.2, RPMKL.1, RPMKL.2, RPMKL.3).
β1 = −1
2757.69832.77946.59514.20844.55802.50643.92564.26636.92110.68714.22725.62950.91990.58990.63820.7954
1004.32292.78263.92362.30633.41042.21723.14903.28791.74060.38451.09380.45470.54180.26520.40870.4825
2004.61993.29584.28862.67693.86902.58813.62923.75691.50890.41680.99750.30990.56300.22960.43030.5022
3002.59142.29302.44962.07442.32701.98172.23852.28550.84500.42570.60060.22220.43560.19840.35360.3971
67517.87585.474316.22416.222312.13083.494910.448811.35199.60921.09616.95013.79933.13570.51422.23872.7283
10030.91019.938329.84934.467225.74723.013123.676024.79988.36121.32606.23911.84213.39190.39362.47442.9681
2007.80126.13857.40094.90896.92294.34896.56246.75722.22091.30361.69850.71181.33790.50871.08121.2171
30019.437412.902019.17738.288818.33266.551317.817618.09941.55891.10031.23360.74731.03870.57830.87330.9616
β1 = 0
2755.57533.33585.24722.42574.67402.26744.34414.52191.46700.33900.92630.21410.48750.11590.34610.4225
1006.27993.76345.95542.68365.34362.53554.98985.18081.11380.30570.68890.11340.37700.07820.26390.3248
2003.43282.79993.27042.34573.09162.16842.95723.02980.72770.33240.48260.12400.32880.08950.24460.2892
3002.94342.81582.89042.69992.84642.61252.80322.82680.29950.22830.23640.16960.20370.13660.17300.1894
67519.938413.956419.648310.156318.71888.904618.165518.46792.92181.15152.26730.65771.68130.36611.36631.5343
10025.289317.110425.019011.351824.05589.078223.477323.79412.07871.26551.66410.74961.36740.56431.15561.2688
20012.730611.078012.59259.604312.29048.540312.080412.19560.97120.78040.81500.61740.73560.51260.64960.6960
3006.44126.06216.35915.70886.24935.42126.15856.20820.70520.59060.59400.48820.54250.41480.48070.5141
β1 = 1
2754.93233.79594.83962.97474.61812.63964.47704.55400.41410.23520.33350.11120.25540.07130.20570.2325
1004.24853.48534.18052.88354.03292.57853.93433.98820.39030.22420.31810.10780.24610.06880.19980.2247
2003.14502.94403.12072.76173.07442.62513.04103.05930.21240.15840.18800.11310.16510.08650.14660.1566
3002.81732.75042.80892.68642.79312.63082.78132.78770.11140.09670.10480.08320.09890.07280.09330.0963
67533.192727.229733.120422.409932.733619.659032.508532.63231.42300.96641.25870.63171.11450.48151.00271.0632
10015.686713.054015.583210.789515.24269.329815.028615.14610.82850.66000.73790.51440.66810.41920.60400.6388
20014.604413.916114.574513.254914.483812.667814.424214.45700.23460.21940.22490.20490.21860.19200.21120.2152
30010.05559.879110.04949.706110.02909.543210.015810.02300.13600.13100.13290.12610.13090.12150.12840.1298
Table A7. AMSE of different estimators when τ = 20% and ρ = 0.90.
Columns: p, n, then the non-robust estimators (PML, PRR, PL, PKL.1, PKL.2, PMKL.1, PMKL.2, PMKL.3) followed by the robust estimators (RPML, RPRR, RPL, RPKL.1, RPKL.2, RPMKL.1, RPMKL.2, RPMKL.3).
β1 = −1
2754.28114.21524.22694.15504.18484.10734.14324.16581.57771.13511.39470.94301.23950.91521.16291.2030
1003.64743.61543.61293.58503.58133.55873.55253.56811.20851.04811.11260.93651.04700.89521.00571.0272
2004.87394.86364.86264.85334.85594.84314.84714.85200.74300.73490.73430.72750.72860.72200.72470.7267
3004.13084.12664.12514.12244.11994.11824.11454.11750.71720.71680.71720.71660.71860.71650.72050.7194
67511.976111.863911.964711.754411.941411.652111.924311.93371.94501.73901.86451.57781.79791.48261.73931.7708
1009.40219.34039.39179.27959.37549.22139.36239.36951.60991.51041.55181.42721.50691.37091.46561.4878
2007.01646.98997.00406.96366.99416.93796.98306.98910.73820.72740.72590.71720.7170 0.7081 0.70790.7128
3006.52496.51126.51816.49766.51286.48416.50686.51010.65460.65330.65290.65200.6515 0.6509 0.65010.6509
β1 = 0
2753.90843.87273.89693.83753.88493.80363.87343.87970.39230.37410.37530.35690.36660.34250.35550.3616
1004.23984.21234.23394.18504.22594.15864.21914.22290.38620.37110.37130.35670.36380.34460.35410.3594
2004.29034.27884.28804.26734.28484.25594.28204.28350.29210.28930.28920.28660.28780.28400.28580.2869
3003.96593.95903.96443.95203.96243.94513.96073.96160.27260.27120.27110.26990.27040.26860.26940.2700
6756.97476.88946.96716.80556.95056.72596.93856.94511.78631.64351.71211.51141.66111.40511.60411.6352
1006.07376.01426.06635.95556.05285.89886.04256.04820.72690.69670.69850.66780.68440.64230.66510.6756
2004.98274.96634.98044.94994.97664.93374.97354.97520.27030.26820.26810.26610.26710.26410.26560.2664
3009.59789.58009.59699.56239.59439.54479.59269.59350.20530.20450.20440.20360.20400.20280.20340.2038
β1 = 1
2753.77593.74933.77433.72273.77023.69653.76733.76890.14950.14300.14620.13660.14370.13060.14090.1424
1003.10713.08773.10563.06853.10223.04963.09973.10110.19470.18340.19110.17240.18710.16270.18350.1855
2003.62463.61553.62403.60633.62253.59733.62153.62200.10540.10350.10460.10160.10380.09970.10310.1035
3004.07324.06684.07284.06054.07184.05424.07114.07150.09230.09120.09190.09010.09140.08910.09100.0912
6758.53108.46878.52948.40698.52318.34688.51928.52140.62940.60680.61920.58490.61150.56480.60290.6076
1009.35809.31989.35729.28179.35399.24419.35189.35290.41540.40490.40900.39460.40480.38480.39960.4024
2007.73077.71427.73027.69777.72867.68137.72767.72810.14100.13920.13980.13740.13910.13570.13810.1387
3005.86515.85695.86485.84865.86385.84045.86325.86350.06890.06850.06860.06800.06840.06760.06820.0683
Table A8. AMSE of different estimators when τ = 20% and ρ = 0.95.
Columns: p, n, then the non-robust estimators (PML, PRR, PL, PKL.1, PKL.2, PMKL.1, PMKL.2, PMKL.3) followed by the robust estimators (RPML, RPRR, RPL, RPKL.1, RPKL.2, RPMKL.1, RPMKL.2, RPMKL.3).
β1 = −1
2754.76004.49504.63414.29364.52784.18954.43984.48702.46221.13362.01551.03281.52900.89651.34771.4451
1003.99453.85753.90593.74243.83403.66963.77253.80551.75131.12951.48690.86881.25960.84171.14821.2069
2004.12934.11044.10814.09194.09114.07494.07354.08310.91830.85060.86230.79350.82250.75890.79340.8088
3004.41174.40244.40014.39314.39104.38434.38114.38650.81110.79270.79150.77600.77620.76350.76530.7710
67515.835214.263215.696712.876215.401811.891515.195415.30854.50942.88394.04241.91973.53581.58123.19213.3762
1009.30479.04639.22668.79909.14618.58079.07009.11172.36722.03082.19421.74752.06131.56631.93982.0053
2009.94859.85409.92779.76149.89979.67409.87579.88891.29901.21891.22711.14541.17611.08761.12601.1531
3007.48917.45157.47117.41437.45687.37817.44097.44970.97950.96690.96400.95470.9519 0.9435 0.93970.9464
β1 = 0
2754.34834.26684.32984.18944.30504.12254.28434.29570.65940.55450.58940.46690.54720.41480.50620.5283
1005.05174.99345.03954.93705.02264.88585.00845.01620.55620.48120.49820.41680.46600.37560.43280.4507
2004.10824.08914.10324.07034.09724.05244.09184.09480.34770.33440.33410.32180.32750.31120.31890.3236
3004.90984.89964.90824.88944.90554.87934.90344.90460.26870.26560.26550.26260.26400.25970.26180.2630
67513.147312.936913.140212.733913.115812.551313.100213.10882.70492.22162.60161.87592.51231.68132.42612.4731
10011.682811.508811.675611.341011.652911.190511.638111.64621.22411.08951.14330.96801.09390.87761.03831.0686
2009.53549.45809.52919.38139.51489.30699.50469.51020.46760.45700.45670.44650.45180.43660.44420.4483
3007.23267.20717.23037.18177.22527.15677.22167.22360.34170.33710.33700.33260.33480.32820.33140.3333
β1 = 1
2753.52023.47573.51703.43213.50923.39103.50383.50680.22550.20630.21690.18810.20940.17270.20190.2060
1004.23604.20334.23414.17104.22924.13994.22584.22760.20630.19270.19950.17970.19400.16820.18830.1914
2004.01123.99654.01033.98194.00803.96764.00644.00720.14530.14020.14320.13510.14130.13050.13930.1404
3003.69873.69173.69823.68473.69703.67783.69623.69670.11030.10840.10940.10650.10870.10470.10800.1084
6758.75218.54298.74568.34118.72248.16018.70788.71581.97751.55451.90411.24531.82511.08171.75741.7944
1007.98527.87887.98187.77437.96987.67567.96227.96640.52640.50060.51100.47570.50040.45310.48820.4949
2006.89666.86386.89536.83126.89146.79906.88886.89020.17410.17250.17300.17080.17230.16920.17140.1719
3006.87896.86416.87856.84936.87686.83466.87586.87640.09240.09170.09190.09100.09160.09030.09120.0914
Table A9. AMSE of different estimators when τ = 20% and ρ = 0.99.
Columns: p, n, then the non-robust estimators (PML, PRR, PL, PKL.1, PKL.2, PMKL.1, PMKL.2, PMKL.3) followed by the robust estimators (RPML, RPRR, RPL, RPKL.1, RPKL.2, RPMKL.1, RPMKL.2, RPMKL.3).
β1 = −1
2758.22564.52397.46374.79396.14824.03405.62875.90659.67141.00147.17087.54242.47200.89271.81542.1930
1006.38674.00955.79213.76224.95073.51044.58814.78115.94210.90014.37333.71321.91810.89361.47151.7250
2004.98274.55204.80734.24944.65034.13394.53544.59682.27390.92701.74880.71291.23020.63071.03441.1388
3004.26914.07114.15823.91284.06613.82843.99404.03251.64160.97441.32420.69831.06220.66900.93981.0042
67523.452214.022622.719410.469820.67969.568819.558920.166515.37132.267513.21296.21618.58961.12217.00717.8596
10020.907112.115620.21088.546818.37497.672517.340517.901514.98862.576113.13273.99308.89571.02927.22598.1218
20013.387210.349213.04788.085512.43237.058512.011312.24054.71622.35403.91901.18303.08680.93092.60832.8626
30012.697311.651012.528210.715412.267610.042012.067612.17702.56311.96412.23481.48801.98641.23891.77911.8899
β1 = 0
2756.33455.42546.20824.79385.97544.56315.82455.90651.43970.62121.10900.30300.81410.25790.66230.7431
1005.32724.74925.22434.31305.05954.11834.94685.00811.28610.62640.97600.31710.73480.27110.60190.6725
2005.73035.54925.70495.38655.65885.26815.62535.64370.82180.58810.68070.41930.58950.35320.51790.5561
3004.21934.12404.19354.03624.16213.96734.13574.15020.66320.52410.56180.41210.50430.35290.45100.4796
67517.225810.017716.60826.295715.03665.432814.133614.623813.01792.296111.29373.34887.74600.61676.37087.1103
10024.118119.085323.962515.036723.411212.778323.074023.25906.16463.47905.67382.25714.93961.85994.52424.7479
20016.961216.211116.928015.501316.825814.898516.759216.79581.33861.17831.22421.03211.15970.92011.08431.1252
30010.14409.712210.10959.309010.02698.98029.970110.00131.37521.18031.22021.00581.13770.87931.04231.0940
β1 = 1
2756.22014.31976.09893.20885.74112.96115.53265.64631.77940.45421.38640.16980.86240.10260.65190.7650
1005.70885.32435.68644.98685.62384.75755.58305.60540.59200.38720.52350.23550.44440.17360.39000.4194
2005.73025.57335.72295.42745.70045.31025.68585.69390.37180.30100.34040.23970.31120.20000.28630.2999
3005.01664.97125.01434.92735.00764.88745.00325.00560.19290.17780.18590.16370.17980.15200.17380.1771
67539.880634.219639.818429.381239.480126.197439.282739.39132.97092.25372.81841.70682.64381.42752.50562.5807
10024.444622.755324.408921.170624.258519.858224.166524.21711.76111.42581.64201.14401.53300.96691.43661.4890
20011.900611.262711.879610.661711.804910.158911.757711.78360.91730.80570.85470.70350.80870.62460.76150.7872
30012.201411.953112.194611.710512.169211.484012.153212.16200.41400.39360.40150.37380.39310.35580.38320.3886

References

  1. Hadi, A.S. A modification of a method for the detection of outliers in multivariate samples. J. R. Stat. Soc. Ser. B (Methodol.) 1994, 56, 393–396.
  2. Lawrence, K.D. Robust Regression: Analysis and Applications; Routledge: Oxfordshire, UK, 2019.
  3. Wu, J.; Asar, Y.; Arashi, M. On the restricted almost unbiased Liu estimator in the logistic regression model. Commun. Stat. Theory Methods 2018, 47, 4389–4401.
  4. Arashi, M.; Saleh, A.M.E.; Kibria, B.G. Theory of Ridge Regression Estimation with Applications; John Wiley & Sons: Hoboken, NJ, USA, 2019.
  5. Algamal, Z.Y.; Lukman, A.F.; Abonazel, M.R.; Awwad, F.A. Performance of the ridge and Liu estimators in the zero-inflated Bell regression model. J. Math. 2022, 2022, 9503460.
  6. Rasheed, H.A.; Sadik, N.J.; Algamal, Z.Y. Jackknifed Liu-type estimator in the Conway–Maxwell Poisson regression model. Int. J. Nonlinear Anal. Appl. 2022, 13, 3153–3168.
  7. Majid, A.; Amin, M.; Akram, M.N. On the Liu estimation of Bell regression model in the presence of multicollinearity. J. Stat. Comput. Simul. 2022, 92, 262–282.
  8. Asar, Y.; Algamal, Z. A new two-parameter estimator for the gamma regression model. Stat. Optim. Inf. Comput. 2022, 10, 750–761.
  9. Jabur, D.M.; Rashad, N.K.; Algamal, Z.Y. Jackknifed Liu-type estimator in the negative binomial regression model. Int. J. Nonlinear Anal. Appl. 2022, 13, 2675–2684.
  10. Månsson, K.; Shukur, G. A Poisson ridge regression estimator. Econ. Model. 2011, 28, 1475–1481.
  11. Månsson, K.; Kibria, B.G.; Sjolander, P.; Shukur, G. Improved Liu estimators for the Poisson regression model. Int. J. Stat. Probab. 2012, 1, 2–6.
  12. Lukman, A.F.; Adewuyi, E.; Månsson, K.; Kibria, B.G. A new estimator for the multicollinear Poisson regression model: Simulation and application. Sci. Rep. 2021, 11, 3732.
  13. Aladeitan, B.B.; Adebimpe, O.; Lukman, A.F.; Oludoun, O.; Abiodun, O.E. Modified Kibria–Lukman (MKL) estimator for the Poisson regression model: Application and simulation. F1000Res. 2021, 10, 548.
  14. Amin, M.; Akram, M.N.; Kibria, B.G. A new adjusted Liu estimator for the Poisson regression model. Concurr. Comput. Pract. Exp. 2021, 33, e6340.
  15. Yehia, E.G. On the restricted Poisson ridge regression estimator. Sci. J. Appl. Math. Stat. 2021, 9, 106.
  16. Jadhav, N.H. A new linearized ridge Poisson estimator in the presence of multicollinearity. J. Appl. Stat. 2022, 49, 2016–2034.
  17. Cantoni, E.; Ronchetti, E. Robust inference for generalized linear models. J. Am. Stat. Assoc. 2001, 96, 1022–1030.
  18. Hall, D.B.; Shen, J. Robust estimation for zero-inflated Poisson regression. Scand. J. Stat. 2010, 37, 237–252.
  19. Chen, W.; Shi, J.; Qian, L.; Azen, S.P. Comparison of robustness to outliers between robust Poisson models and log-binomial models when estimating relative risks for common binary outcomes: A simulation study. BMC Med. Res. Methodol. 2014, 14, 82.
  20. Hosseinian, S.; Morgenthaler, S. Weighted maximum likelihood estimates in Poisson regression. In Proceedings of the International Conference on Robust Statistics, Valladolid, Spain, 27 June–1 July 2011.
  21. Chen, W.; Qian, L.; Shi, J.; Franklin, M. Comparing performance between log-binomial and robust Poisson regression models for estimating risk ratios under model misspecification. BMC Med. Res. Methodol. 2018, 18, 63.
  22. Abonazel, M.R.; Saber, O. A comparative study of robust estimators for Poisson regression model with outliers. J. Stat. Appl. Probab. 2020, 9, 279–286.
  23. Marazzi, A. Improving the efficiency of robust estimators for the generalized linear model. Stats 2021, 4, 88–107.
  24. Abonazel, M.R.; El-Sayed, S.M.; Saber, O.M. Performance of robust count regression estimators in the case of overdispersion, zero inflated, and outliers: Simulation study and application to German health data. Commun. Math. Biol. Neurosci. 2021, 2021, 55.
  25. Hosseinian, S. Robust Inference for Generalized Linear Models: Binary and Poisson Regression. Ph.D. Thesis, EPFL, Lausanne, Switzerland, 2009.
  26. Abonazel, M.R.; Dawoud, I. Developing robust ridge estimators for Poisson regression model. Concurr. Comput. Pract. Exp. 2022, 34.
  27. Arum, K.C.; Ugwuowo, F.I.; Oranye, H.E. Robust modified jackknife ridge estimator for the Poisson regression model with multicollinearity and outliers. Sci. Afr. 2022, 17, e01386.
  28. Kaçıranlar, S.; Dawoud, I. On the performance of the Poisson and the negative binomial ridge predictors. Commun. Stat. Simul. Comput. 2018, 47, 1751–1770.
  29. Dawoud, I.; Abonazel, M.R. Robust Dawoud–Kibria estimator for handling multicollinearity and outliers in the linear regression model. J. Stat. Comput. Simul. 2021, 91, 3678–3692.
  30. Awwad, F.A.; Dawoud, I.; Abonazel, M.R. Development of robust Özkale–Kaçiranlar and Yang–Chang estimators for regression models in the presence of multicollinearity and outliers. Concurr. Comput. Pract. Exp. 2022, 34, e6779.
  31. Myers, R.H.; Montgomery, D.C.; Vining, G.G.; Robinson, T.J. Generalized Linear Models: With Applications in Engineering and the Sciences; John Wiley & Sons: Hoboken, NJ, USA, 2012; Volume 791.
  32. Cameron, A.C.; Trivedi, P.K. Regression Analysis of Count Data, 2nd ed.; Econometric Society Monograph No. 53; Cambridge University Press: Cambridge, UK, 2013.
  33. Månsson, K.; Kibria, B.G. Estimating the unrestricted and restricted Liu estimators for the Poisson regression model: Method and application. Comput. Econ. 2021, 58, 311–326.
Figure 1. Residuals analysis of the aircraft damage data.
Figure 2. MSE values of the non-robust estimators against k and d of the aircraft damage data.
Figure 3. MSE values of the robust estimators against k and d of the aircraft damage data.
Figure 4. Residuals analysis of the Somerville Lake data.
Figure 5. MSE values of the non-robust estimators against k and d of the Somerville Lake data.
Figure 6. MSE values of the robust estimators against k and d of the Somerville Lake data.
Table 1. Estimated coefficients and MSEs of the aircraft damage data.
Non-Robust Estimators
            PML      PRR      PL       PKL.1    PKL.2    PMKL.1   PMKL.2   PMKL.3
Intercept   0.868    0.624    0.588    0.381    0.418    0.283    0.318    0.372
X1         −0.038   −0.095   −0.103   −0.153   −0.145   −0.166   −0.161   −0.151
X2         −0.024   −0.010   −0.008    0.003    0.001    0.008    0.006    0.004
X3          0.003    0.005    0.006    0.007    0.007    0.008    0.008    0.007
MSE         1.239    0.540    0.482    0.209    0.240    0.135    0.159    0.202
Robust Estimators
            RPML     RPRR     RPL      RPKL.1   RPKL.2   RPMKL.1  RPMKL.2  RPMKL.3
Intercept  −0.882   −0.485   −0.454   −0.087   −0.222   −0.053   −0.143   −0.194
X1         −0.024    0.048    0.052    0.119    0.100    0.106    0.101    0.097
X2          0.218    0.198    0.196    0.178    0.184    0.177    0.181    0.183
X3         −0.009   −0.012   −0.012   −0.015   −0.014   −0.015   −0.015   −0.014
MSE         0.623    0.344    0.322    0.146    0.180    0.088    0.140    0.172
Table 2. Estimated coefficients and MSEs of the Somerville Lake data.
Non-Robust Estimators
            PML      PRR      PL       PKL.1    PKL.2    PMKL.1   PMKL.2   PMKL.3
Intercept   1.804    1.812    1.805    1.820    1.809    1.805    1.811    1.810
X1         −1.450   −0.808   −1.395   −0.166   −1.201   −0.214   −1.106   −1.159
X2          2.284    1.556    2.231    0.827    2.069    0.616    1.976    2.027
X3         −1.308   −1.193   −1.309   −1.077   −1.337   −0.786   −1.337   −1.337
MSE         1.059    0.756    0.945    1.061    0.670    0.630    0.631    0.650
Robust Estimators
            RPML     RPRR     RPL      RPKL.1   RPKL.2   RPMKL.1  RPMKL.2  RPMKL.3
Intercept   1.348    1.345    1.347    1.342    1.347    1.334    1.344    1.345
X1         −0.774   −0.497   −0.594   −0.219   −0.411   −0.197   −0.342   −0.387
X2          0.923    0.684    0.772    0.446    0.638    0.351    0.552    0.602
X3         −0.483   −0.509   −0.505   −0.536   −0.550   −0.450   −0.526   −0.534
MSE         0.683    0.335    0.309    0.188    0.436    0.111    0.203    0.241
Share and Cite

Dawoud, I.; Awwad, F.A.; Tag Eldin, E.; Abonazel, M.R. New Robust Estimators for Handling Multicollinearity and Outliers in the Poisson Model: Methods, Simulation and Applications. Axioms 2022, 11, 612. https://doi.org/10.3390/axioms11110612