Symmetry 2019, 11(6), 820; https://doi.org/10.3390/sym11060820

Article
On Comparing and Classifying Several Independent Linear and Non-Linear Regression Models with Symmetric Errors
1 College of Mathematics, Dianxi Science and Technology Normal University, Lincang 677000, China
2 Department of Statistics, Faculty of Science, Fasa University, Fasa 74616 86131, Iran
3 Department of Mathematics, Faculty of Art and Sciences, Cankaya University Balgat, Ankara 06530, Turkey
4 Department of Statistics, Faculty of Science, Shiraz University, Shiraz 71946 85115, Iran
* Author to whom correspondence should be addressed.
Received: 30 May 2019 / Accepted: 19 June 2019 / Published: 20 June 2019

## Abstract

In many real-world problems, in science fields such as biology, computer science, data mining, electrical and mechanical engineering, and signal processing, researchers aim to compare and classify several regression models. In this paper, a computational approach, based on non-parametric methods, is used to investigate the similarities among, and to classify, several linear and non-linear regression models with symmetric errors. The ability of the given approach is then evaluated using simulated and real-world practical datasets.
Keywords:
comparison; Friedman test; linear regression; nonlinear regression; sign test; symmetric errors; Wilcoxon test

## 1. Introduction

In many situations, we aim to study the effects of one or more explanatory variables on a response variable $Y$. Simple and multiple regressions are data analysis techniques for modeling these effects. The authors of [1,2] applied simple and multiple linear regression models in different science fields, such as agriculture, biology, materials science, mechanical engineering, and signal processing. In many real-world problems, scientists want to compare the relationship between the dependent variable and the independent variables across several separate datasets.
Different techniques for comparing the correlation between variables $X$ and $Y$ in two separate datasets were provided by [3,4,5]. Different methods for comparing the correlation between variables $X$ and $Y$ in one dataset with the correlation between variables $X$ and $W$ in another dataset were developed by [6,7,8,9,10]. The correlation between variables $X$ and $Y$ in one dataset and the correlation between variables $W$ and $Z$ in another dataset were compared by different methods in [9,11,12]. The comparison and classification of two or more simple linear regression models have been considered in [13,14,15,16]. The comparison of two regression models has been reported in [14,15,16,17,18,19,20,21,22].
In the present research, we aim to compare and classify several linear and non-linear regression models fitted to several independent datasets. Non-parametric methods are used to construct an approach for investigating the similarity of, and classifying, the linear and non-linear regression models. The given approach is then evaluated using simulation and real-world studies. The introduced approach is powerful and general: it can be applied to compare any linear or non-linear regression models.

## 2. Model Comparison and Classification

Assume $\{(x_{ij}, y_{ij}) : j = 1, …, n_i\}$ is a sample dataset of size $n_i$ from the $i$th population, $i = 1, …, m$. The equations of the $m$ linear or non-linear regression models can be written as:
$Y_i = g_i(X_i) + ε_i, \quad i = 1, …, m,$ (1)
such that, for $i = 1, …, m$, the errors $ε_i$ are zero-mean symmetric random variables with unknown and equal variance $σ_i^2$.
By considering Equation (1), the conditional expectation of $Y_i$ given $X_i = x$, which we denote by $g_i(x)$, is given by:
$E(Y_i ∣ X_i = x) = g_i(x), \quad i = 1, …, m.$
In real-world problems the aim is to test the hypothesis
$H_0 : g_1 = g_2 = ⋯ = g_m.$
Under the rejection of $H_0$, we conclude that at least two of the $m$ regression models are not statistically similar; if $H_0$ is accepted, then it can be concluded that the $m$ regression models are statistically equal.
The regression equations can be represented by:
$y_{ij} = g_i(x_{ij}) + ε_{ij}, \quad j = 1, …, n_i, \; i = 1, …, m,$
such that $y_{ij}$ are the values for the dependent variable $Y$, $x_{ij}$ are the values for the independent variables, and $ε_{ij}$ are zero-mean random variables with unknown and equal variance $σ_i^2$.
First, all $m$ regression models are estimated, and the fitted values
$\hat{y}_{ij} = \hat{g}_i(x_j), \quad i = 1, …, m, \; j = 1, …, n,$
are computed for all values of $x_j$, where $\hat{y}_{ij}$ are the estimated values for the dependent variable $Y$, based on the $i$th regression model. Since the $ε_{ij}$ are zero-mean symmetric random variables, the $\hat{y}_{ij}$ are unbiased estimators for $g_i(x_j)$, respectively. In other words, the $\hat{y}_{ij}$ are random variables with mean $g_i(x_j)$.
Remark 1.
$x_1, …, x_n$ denote the distinct values of the pooled independent variable; that is, the repeated points are counted once.
Now, to compare the fitted regression models, the Friedman test [23,24,25,26] will be applied to the $n$ $m$-tuples $(\hat{y}_{1j}, …, \hat{y}_{mj})$, $j = 1, …, n$.
The Friedman test, a non-parametric alternative to the repeated-measures ANOVA, is used to compare related datasets (datasets that are repeated on the same subjects). This test is commonly applied when the data do not satisfy parametric conditions, such as the normality assumption.
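As a minimal sketch of this comparison step (hypothetical fitted values and model names; `scipy.stats.friedmanchisquare` stands in for the authors' R implementation), the Friedman test can be applied to the fitted values of each model evaluated at the common design points:

```python
import numpy as np
from scipy.stats import friedmanchisquare

# Common design points and three hypothetical fitted models
# (illustrative stand-ins for the m estimated regressions).
x = np.linspace(0, 1, 30)
yhat_1 = 1.0 + 2.0 * x   # fitted model 1
yhat_2 = 1.1 + 1.9 * x   # fitted model 2 (similar mechanism)
yhat_3 = 0.2 + 4.0 * x   # fitted model 3 (different mechanism)

# Friedman test on the n m-tuples of fitted values: a small p-value
# suggests at least two regression mechanisms differ.
stat, p = friedmanchisquare(yhat_1, yhat_2, yhat_3)
print(stat, p)
```

Here rejecting $H_0$ would trigger the pairwise classification step described next in the paper.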

#### Classification

In the previous discussion, if $H_0$ is rejected, then we conclude that the mechanism of at least one model is significantly different from the others. However, to determine which models are significantly different from each other, the sign test or the Wilcoxon test is applied to compare each pair of regression models.
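This pairwise step can be sketched as follows (hypothetical simulated data; `scipy.stats.wilcoxon` is one of the two tests the authors mention, and `numpy.polyfit` plays the role of the model-fitting step):

```python
import numpy as np
from itertools import combinations
from scipy.stats import wilcoxon

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 30)

def fit_line(beta0, beta1):
    # Simulate one noisy dataset and return its least-squares
    # fitted values at the common design points (illustrative only).
    y = beta0 + beta1 * x + rng.normal(0, 0.5, x.size)
    b1, b0 = np.polyfit(x, y, 1)
    return b0 + b1 * x

fitted = {
    "model_1": fit_line(1.0, 2.0),
    "model_2": fit_line(1.0, 2.0),  # same true mechanism as model 1
    "model_3": fit_line(0.2, 4.0),  # different mechanism
}

# Wilcoxon signed-rank test on each pair of fitted-value sequences;
# pairs with small p-values are placed in different clusters.
pvals = {}
for (a, ya), (b, yb) in combinations(fitted.items(), 2):
    _, p = wilcoxon(ya, yb)
    pvals[(a, b)] = p
print(pvals)
```

The resulting p-values (with a multiplicity adjustment, if desired) then induce the clusters of statistically similar models.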

## 3. Simulation Study

This section assesses the ability of the introduced approach using simulated datasets. First, different datasets are generated from different regression models. Then, we compute the values of the estimated type I error probability ($\hat{α}$) and the estimated power ($\hat{π}$) of the introduced approach. For the comparisons, the Wilcoxon and Friedman tests are applied. The simulations are based on 1000 runs, using the R 3.5.3 software (R Development Core Team, 2018) on a PC (Processor: Intel(R) Core(TM)2 Duo CPU T7100 @ 1.80 GHz, RAM: 2.00 GB, System Type: 32-bit).
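The Monte Carlo estimation of $\hat{α}$ and $\hat{π}$ can be sketched as follows. This is a generic illustration rather than the authors' exact procedure: here the Friedman test is applied to three simulated samples observed at common design points, and 200 runs are used instead of 1000 for speed.

```python
import numpy as np
from scipy.stats import friedmanchisquare

rng = np.random.default_rng(2023)
x = np.linspace(0, 1, 25)  # common design points

def one_run(betas, alpha=0.05):
    # Simulate one dataset Y = beta*X + eps per group, then test
    # equality of the three groups with the Friedman test.
    samples = [b * x + rng.normal(0, 1, x.size) for b in betas]
    _, p = friedmanchisquare(*samples)
    return p < alpha

runs = 200  # the paper uses 1000 runs
# Estimated type I error: all betas equal, so H0 is true.
alpha_hat = np.mean([one_run([1.0, 1.0, 1.0]) for _ in range(runs)])
# Estimated power: one beta differs, so H0 is false.
power_hat = np.mean([one_run([1.0, 1.0, 3.0]) for _ in range(runs)])
print(alpha_hat, power_hat)
```

With this setup, `alpha_hat` should land near the nominal 0.05 and `power_hat` well above it, mirroring the pattern reported in Tables 1 to 5.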
Example 1.
Assume the simple linear regression model:
$Y = β X + ε ,$
such that $ε$ and $X$ are independent.
Example 2.
Let
$Y = β_0 + β_1 X + β_2 Xε,$
such that $ε$ and $X$ are independent.
Example 3.
Assume:
$Y = 1 + β X + ε ,$
such that $ε$ and $X$ are independent.
Example 4.
Assume the multiple linear regression model:
$Y = β_0 + β_1 X_1 + 2β_2 X_2 + ε,$
such that $ε$, $X 1$ and $X 2$ are independent.
Example 5.
For the first dataset, assume the simple nonlinear regression model:
$Y = e^X + ε,$
such that $ε$ and $X$ are independent.
For the second and the third datasets, let $Y = \{e^X + ε,\; 1 + βX + ε\}$ and $Y = \{e^X + ε,\; 1 + βX + ε,\; 2X + ε\}$, respectively.
Figure 1 and Figure 2 show the density plots of some parts of the response variable $Y$. As can be seen in these figures, the density plots are symmetric, but not necessarily normal (Figure 2).
The values of $\hat{α}$ (first four rows) and $\hat{π}$ (other rows) for Examples 1 to 5 are summarized in Table 1, Table 2, Table 3, Table 4 and Table 5, respectively. As these tables indicate, the values of $\hat{α}$ are very close to the test size ($α = 0.05$), and consequently the introduced approach controls the type I error. Also, the values of $\hat{π}$ show that the given technique can distinguish between the null and alternative hypotheses.

## 4. Real Data

In this section, a practical real-world dataset is considered to study the power of the introduced approach in real-world problems. Drought is a damaging natural phenomenon. To mitigate its effects, hydrologists model and predict drought datasets over a standard time period. In this research, the average monthly rainy days (1966–2010) at three Iranian synoptic stations (Fasa, Sarvestan, and Shiraz) were considered and modeled.
To model and forecast the average monthly rainy days, polynomial regression models of orders 1 to 3 (linear, quadratic, and cubic) and an exponential model were fitted to the datasets. The formulas of the considered models are as follows:
Linear model: $Y = β_0 + β_1 X + ε$
Quadratic model: $Y = β_0 + β_1 X + β_2 X^2 + ε$
Cubic model: $Y = β_0 + β_1 X + β_2 X^2 + β_3 X^3 + ε$
The numerical computations are done using the R 3.5.3 software (Library ‘nlstools’, lm() function for linear regression and nls() function for nonlinear regression) and Minitab 18 software.
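The R²/RMSE comparison in Table 6 can be reproduced in outline as follows. This is a sketch with synthetic stand-in data, since the station records are not included here; `numpy.polyfit` replaces R's `lm()` for the polynomial fits.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical stand-in for a station's average monthly rainy days
# over 12 months; the real station data are not reproduced here.
month = np.arange(1, 13, dtype=float)
y = 8 - 0.6 * month + 0.04 * month**2 + rng.normal(0, 0.3, 12)

def poly_fit_metrics(x, y, degree):
    # Fit a polynomial of the given order and return (R^2, RMSE).
    coeffs = np.polyfit(x, y, degree)
    yhat = np.polyval(coeffs, x)
    ss_res = np.sum((y - yhat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    r2 = 1 - ss_res / ss_tot
    rmse = np.sqrt(ss_res / y.size)
    return r2, rmse

for degree in (1, 2, 3):  # linear, quadratic, cubic
    r2, rmse = poly_fit_metrics(month, y, degree)
    print(degree, round(r2, 3), round(rmse, 3))
```

Because the polynomial models are nested, R² can only increase (and in-sample RMSE only decrease) with the order, which matches the pattern for the linear, quadratic, and cubic rows of Table 6.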
The results of the fitted regression models are summarized in Table 6. It can be observed that, for all of the stations, the polynomial regression of order 3 (cubic) and, after it, the exponential model had the highest R-square (R²) and the lowest root mean square error (RMSE) among all fitted models.
Now, we use the proposed approach to compare and classify the stations, for each model. The results of the Friedman test are shown in Table 7. This table indicates that the fitted cubic and exponential models are significantly different across the stations (p < 0.05), while there is no significant difference between the stations for the fitted linear and quadratic models (p > 0.05).
As Table 8 indicates, for the cubic and exponential models, the stations can be classified into two clusters: the first cluster contains Fasa and Sarvestan, and the second cluster contains Shiraz.

## 5. Conclusions

In many real-world problems, researchers wish to compare and classify regression models across several datasets. In this paper, non-parametric methods were used to construct an approach for investigating the similarity of several linear and non-linear regression models with symmetric errors. The approach was evaluated using simulated and practical datasets. The simulation study indicated that the introduced approach controls the type I error, and that the proposed technique distinguishes well between the null and alternative hypotheses. The introduced approach also has several advantages. First, it is powerful. Second, it is not computationally expensive. Third, it can be applied to compare any linear or non-linear regression models. Fourth, it does not require normality of the errors and can be applied to all models with symmetric errors.

## Author Contributions

Conceptualization, J.-J.P., M.R.M. and D.B.; Formal analysis, M.R.M., D.B. and M.M.; Investigation, J.-J.P., M.R.M. and D.B.; Methodology, J.-J.P., M.R.M., D.B. and M.M.; Project administration, J.-J.P.; Software, J.-J.P., M.R.M., D.B. and M.M.; Supervision, J.-J.P., M.R.M. and D.B.; Validation, J.-J.P., M.R.M. and D.B.; Visualization, J.-J.P., M.R.M. and D.B.; Writing—Original Draft, J.-J.P. and M.R.M.; Writing—Review and Editing, J.-J.P., M.R.M., D.B. and M.M.

## Funding

This research received no external funding.

## Conflicts of Interest

The authors declare no conflict of interest.

## References

1. Wan, J.; Zhang, D.; Xu, W.; Guo, Q. Parameter Estimation of Multi Frequency Hopping Signals Based on Space-Time-Frequency Distribution. Symmetry 2019, 11, 648. [Google Scholar] [CrossRef]
2. Sajid, M.; Shafique, T.; Riaz, I.; Imran, M.; Jabbar Aziz Baig, M.; Baig, S.; Manzoor, S. Facial asymmetry-based anthropometric differences between gender and ethnicity. Symmetry 2018, 10, 232. [Google Scholar] [CrossRef]
3. Mahmoudi, M.R.; Maleki, M.; Pak, A. Testing the Difference between Two Independent Time Series Models. Iran. J. Sci. Technol. Trans. A Sci. 2017, 41, 665–669. [Google Scholar] [CrossRef]
4. Fisher, R.A. On the Probable Error of a Coefficient of Correlation Deduced from a Small Sample. Metron 1921, 1, 3–32. [Google Scholar]
5. Mahmoudi, M.R.; Mahmoodi, M. Inference on the Ratio of Correlations of Two Independent Populations. J. Math. Ext. 2014, 7, 71–82. [Google Scholar]
6. Howell, D.C. Statistical Methods for Psychology, 6th ed.; Thomson Wadsworth: Stamford, CT, USA, 2007. [Google Scholar]
7. Hotelling, H. The Selection of Variates for Use in Prediction with Some Comments on the General Problem of Nuisance Parameters. Ann. Math. Stat. 1940, 11, 271–283. [Google Scholar] [CrossRef]
8. Williams, E.J. The Comparison of Regression Variables. J. R. Stat. Soc. Ser. B 1959, 21, 396–399. [Google Scholar] [CrossRef]
9. Steiger, J.H. Tests for Comparing Elements of a Correlation Matrix. Psychol. Bull. 1980, 87, 245–251. [Google Scholar] [CrossRef]
10. Meng, X.; Rosenthal, R.; Rubin, D.B. Comparing Correlated Correlation Coefficients. Psychol. Bull. 1992, 111, 172–175. [Google Scholar] [CrossRef]
11. Peters, C.C.; Van Voorhis, W.R. Statistical Procedures and Their Mathematical Bases; McGraw-Hill: New York, NY, USA, 1940. [Google Scholar]
12. Raghunathan, T.E.; Rosenthal, R.; Rubin, D.B. Comparing Correlated but Nonoverlapping Correlations. Psychol. Methods 1996, 1, 178–183. [Google Scholar] [CrossRef]
13. Liu, W.; Jamshidian, M.; Zhang, Y. Multiple Comparison of Several Linear Regression Lines. J. Am. Stat. Assoc. 2004, 99, 395–403. [Google Scholar]
14. Liu, W.; Hayter, A.J.; Wynn, H.P. Operability Region Equivalence: Simultaneous Confidence Bands for the Equivalence of Two Regression Models Over Restricted Regions. Biom. J. 2007, 49, 144–150. [Google Scholar] [CrossRef] [PubMed]
15. Liu, W.; Jamshidian, M.; Zhang, Y.; Bertz, F.; Han, X. Pooling Batches in Drug Stability Study by Using Constant-width Simultaneous Confidence Bands. Stat. Med. 2007, 26, 2759–2771. [Google Scholar] [CrossRef] [PubMed]
16. Liu, W.; Jamshidian, M.; Zhang, Y.; Bertz, F.; Han, X. Some New Methods for the Comparison of Two Linear Regression Models. J. Stat. Plan. Inference 2007, 137, 57–67. [Google Scholar] [CrossRef]
17. Hayter, A.J.; Liu, W.; Wynn, H.P. Easy-to-Construct Confidence Bands for Comparing Two Simple Linear Regression Lines. J. Stat. Plan. Inference 2007, 137, 1213–1225. [Google Scholar] [CrossRef]
18. Jamshidian, M.; Liu, W.; Bretz, F. Simultaneous Confidence Bands for all Contrasts of Three or More Simple Linear Regression Models over an Interval. Comput. Stat Data Anal. 2010, 54, 1475–1483. [Google Scholar] [CrossRef]
19. Marques, F.J.; Coelho, C.A.; Rodrigues, P.C. Testing the equality of several linear regression models. Comput. Stat. 2016, 32, 1453–1480. [Google Scholar] [CrossRef]
20. Mahmoudi, M.R.; Mahmoudi, M.; Nahavandi, E. Testing the Difference between Two Independent Regression Models. Commun. Stat. Theory Methods 2016, 45, 6284–6289. [Google Scholar] [CrossRef]
21. Mahmoudi, M.R. On Comparing Two Dependent Linear and Nonlinear Regression Models. J. Test. Eval. 2018, 47, 449–458. [Google Scholar] [CrossRef]
22. Mahmoudi, M.R.; Maleki, M.; Pak, A. Testing the Equality of Two Independent Regression Models. Commun. Stat. Theory Methods 2018, 47, 2919–2926. [Google Scholar] [CrossRef]
23. Conover, W.J. Practical Nonparametric Statistics, 3rd ed.; John Wiley: Hoboken, NJ, USA, 1980. [Google Scholar]
24. Friedman, M. The Use of Ranks to Avoid the Assumption of Normality Implicit in the Analysis of Variance. J. Am. Stat. Assoc. 1937, 32, 675–701. [Google Scholar] [CrossRef]
25. Friedman, M. A Correction: The Use of Ranks to Avoid the Assumption of Normality Implicit in the Analysis of Variance. J. Am. Stat. Assoc. 1939, 34, 109. [Google Scholar] [CrossRef]
26. Friedman, M. A Comparison of Alternative Tests of Significance for the Problem of m Rankings. Ann. Math. Stat. 1940, 11, 86–92. [Google Scholar] [CrossRef]
Figure 1. The density plots of some parts of the response variable Y (Black or Pie: Normal (0,1); Red or Triangle: Normal (0,2); Green or Star: Normal (0,3); Blue or Plus: Normal (0,5)).
Figure 2. The density plots of some parts of the response variable Y (Black or Pie: 0.5 − Beta (1.5,1.5); Red or Triangle: 0.5 − Beta (1.75,1.75); Green or Star: 0.5 − Beta (2,2); Blue or Plus: 0.5 − Beta (2.5,2.5)).
Table 1. The values of $\hat{α}$ and $\hat{π}$ for Example 1.

| ε | X | β (Second) | β (Third) | (10, 10, 10) | (20, 40, 60) | (50, 75, 100) | (75, 100, 150) |
|---|---|---|---|---|---|---|---|
| – | – | 1 | 1 | 0.053 | 0.051 | 0.051 | 0.049 |
| – | – | 1 | 1 | 0.052 | 0.052 | 0.051 | 0.048 |
| – | – | 1 | 1 | 0.053 | 0.052 | 0.050 | 0.049 |
| – | – | 1 | 1 | 0.053 | 0.052 | 0.050 | 0.049 |
| – | – | 1 | 2 | 0.738 | 0.882 | 0.934 | 0.958 |
| – | – | 1 | 2 | 0.753 | 0.801 | 0.950 | 0.981 |
| – | – | 1 | 2 | 0.754 | 0.854 | 0.945 | 0.972 |
| – | – | 1 | 2 | 0.749 | 0.889 | 0.913 | 0.970 |
| – | – | 1 | 3 | 0.703 | 0.825 | 0.941 | 0.993 |
| – | – | 1 | 3 | 0.710 | 0.859 | 0.917 | 0.975 |
| – | – | 1 | 3 | 0.728 | 0.824 | 0.910 | 0.984 |
| – | – | 1 | 3 | 0.707 | 0.864 | 0.934 | 0.953 |
| – | – | 2 | 1 | 0.768 | 0.828 | 0.913 | 0.978 |
| – | – | 2 | 1 | 0.703 | 0.824 | 0.928 | 0.951 |
| – | – | 2 | 1 | 0.794 | 0.846 | 0.930 | 0.968 |
| – | – | 2 | 1 | 0.794 | 0.800 | 0.903 | 0.955 |
| – | – | 2 | 2 | 0.745 | 0.813 | 0.946 | 0.971 |
| – | – | 2 | 2 | 0.718 | 0.858 | 0.937 | 0.981 |
| – | – | 2 | 2 | 0.784 | 0.866 | 0.901 | 0.953 |
| – | – | 2 | 2 | 0.726 | 0.821 | 0.944 | 0.999 |
| – | – | 2 | 3 | 0.795 | 0.849 | 0.924 | 0.982 |
| – | – | 2 | 3 | 0.755 | 0.856 | 0.928 | 0.961 |
| – | – | 2 | 3 | 0.763 | 0.845 | 0.936 | 0.988 |
| – | – | 2 | 3 | 0.710 | 0.865 | 0.914 | 0.975 |
Table 2. The values of $\hat{α}$ and $\hat{π}$ for Example 2.

| ε | X | (β0, β1, β2) (Second) | (β0, β1, β2) (Third) | (10, 10, 10) | (20, 40, 60) | (50, 75, 100) | (75, 100, 150) |
|---|---|---|---|---|---|---|---|
| – | – | (2, 1, 2) | (2, 1, 2) | 0.052 | 0.052 | 0.051 | 0.049 |
| – | – | (2, 1, 2) | (2, 1, 2) | 0.053 | 0.051 | 0.050 | 0.049 |
| – | – | (2, 1, 2) | (2, 1, 2) | 0.053 | 0.052 | 0.051 | 0.049 |
| – | – | (2, 1, 2) | (2, 1, 2) | 0.052 | 0.052 | 0.051 | 0.049 |
| – | – | (2, 1, 2) | (0, 2, 1) | 0.770 | 0.843 | 0.903 | 0.981 |
| – | – | (2, 1, 2) | (0, 2, 1) | 0.743 | 0.817 | 0.909 | 0.979 |
| – | – | (2, 1, 2) | (0, 2, 1) | 0.771 | 0.842 | 0.918 | 0.992 |
| – | – | (2, 1, 2) | (0, 2, 1) | 0.791 | 0.855 | 0.934 | 0.967 |
| – | – | (2, 1, 2) | (3, 2, 1) | 0.737 | 0.891 | 0.941 | 0.997 |
| – | – | (2, 1, 2) | (3, 2, 1) | 0.798 | 0.860 | 0.932 | 0.988 |
| – | – | (2, 1, 2) | (3, 2, 1) | 0.740 | 0.849 | 0.947 | 0.993 |
| – | – | (2, 1, 2) | (3, 2, 1) | 0.712 | 0.827 | 0.916 | 0.997 |
| – | – | (0, 2, 1) | (2, 1, 2) | 0.782 | 0.837 | 0.932 | 0.966 |
| – | – | (0, 2, 1) | (2, 1, 2) | 0.780 | 0.830 | 0.936 | 0.960 |
| – | – | (0, 2, 1) | (2, 1, 2) | 0.720 | 0.857 | 0.945 | 0.998 |
| – | – | (0, 2, 1) | (2, 1, 2) | 0.767 | 0.897 | 0.902 | 0.958 |
| – | – | (0, 2, 1) | (0, 2, 1) | 0.790 | 0.809 | 0.921 | 0.992 |
| – | – | (0, 2, 1) | (0, 2, 1) | 0.741 | 0.814 | 0.935 | 0.992 |
| – | – | (0, 2, 1) | (0, 2, 1) | 0.710 | 0.844 | 0.945 | 0.981 |
| – | – | (0, 2, 1) | (0, 2, 1) | 0.760 | 0.871 | 0.906 | 0.972 |
| – | – | (0, 2, 1) | (3, 2, 1) | 0.776 | 0.807 | 0.919 | 0.969 |
| – | – | (0, 2, 1) | (3, 2, 1) | 0.701 | 0.875 | 0.928 | 0.963 |
| – | – | (0, 2, 1) | (3, 2, 1) | 0.780 | 0.803 | 0.936 | 0.987 |
| – | – | (0, 2, 1) | (3, 2, 1) | 0.720 | 0.886 | 0.923 | 0.960 |
Table 3. The values of $\hat{α}$ and $\hat{π}$ for Example 3.

| ε | X | β (Second) | β (Third) | (10, 10, 10) | (20, 40, 60) | (50, 75, 100) | (75, 100, 150) |
|---|---|---|---|---|---|---|---|
| – | – | 1 | 1 | 0.053 | 0.051 | 0.050 | 0.049 |
| – | – | 1 | 1 | 0.053 | 0.051 | 0.051 | 0.050 |
| – | – | 1 | 1 | 0.053 | 0.051 | 0.051 | 0.050 |
| – | – | 1 | 1 | 0.053 | 0.051 | 0.051 | 0.050 |
| – | – | 1 | 2 | 0.724 | 0.846 | 0.924 | 0.996 |
| – | – | 1 | 2 | 0.734 | 0.813 | 0.942 | 0.952 |
| – | – | 1 | 2 | 0.737 | 0.818 | 0.914 | 0.959 |
| – | – | 1 | 2 | 0.764 | 0.819 | 0.949 | 0.998 |
| – | – | 1 | 5 | 0.797 | 0.808 | 0.904 | 0.959 |
| – | – | 1 | 5 | 0.760 | 0.869 | 0.919 | 0.978 |
| – | – | 1 | 5 | 0.793 | 0.843 | 0.917 | 0.988 |
| – | – | 1 | 5 | 0.765 | 0.876 | 0.910 | 0.983 |
| – | – | 2 | 1 | 0.742 | 0.868 | 0.934 | 0.954 |
| – | – | 2 | 1 | 0.730 | 0.810 | 0.925 | 0.966 |
| – | – | 2 | 1 | 0.725 | 0.867 | 0.911 | 0.981 |
| – | – | 2 | 1 | 0.769 | 0.868 | 0.930 | 0.996 |
| – | – | 2 | 2 | 0.763 | 0.816 | 0.905 | 0.982 |
| – | – | 2 | 2 | 0.706 | 0.895 | 0.935 | 0.951 |
| – | – | 2 | 2 | 0.723 | 0.866 | 0.909 | 0.981 |
| – | – | 2 | 2 | 0.765 | 0.857 | 0.903 | 0.974 |
| – | – | 2 | 5 | 0.710 | 0.867 | 0.910 | 0.950 |
| – | – | 2 | 5 | 0.764 | 0.837 | 0.904 | 0.981 |
| – | – | 2 | 5 | 0.778 | 0.891 | 0.933 | 0.987 |
| – | – | 2 | 5 | 0.726 | 0.819 | 0.946 | 0.967 |
Table 4. The values of $\hat{α}$ and $\hat{π}$ for Example 4.

| X1 | X2 | (β0, β1, β2) (Second) | (β0, β1, β2) (Third) | (10, 10, 10) | (20, 40, 60) | (50, 75, 100) | (75, 100, 150) |
|---|---|---|---|---|---|---|---|
| – | – | (2, 1, 2) | (2, 1, 2) | 0.052 | 0.052 | 0.050 | 0.049 |
| – | – | (2, 1, 2) | (2, 1, 2) | 0.053 | 0.052 | 0.050 | 0.049 |
| – | – | (2, 1, 2) | (2, 1, 2) | 0.052 | 0.052 | 0.051 | 0.049 |
| – | – | (2, 1, 2) | (2, 1, 2) | 0.052 | 0.051 | 0.050 | 0.048 |
| – | – | (2, 1, 2) | (0, 2, 1) | 0.734 | 0.893 | 0.923 | 0.961 |
| – | – | (2, 1, 2) | (0, 2, 1) | 0.787 | 0.887 | 0.947 | 0.964 |
| – | – | (2, 1, 2) | (0, 2, 1) | 0.766 | 0.813 | 0.943 | 0.973 |
| – | – | (2, 1, 2) | (0, 2, 1) | 0.762 | 0.897 | 0.909 | 0.993 |
| – | – | (2, 1, 2) | (3, 2, 1) | 0.706 | 0.866 | 0.936 | 0.966 |
| – | – | (2, 1, 2) | (3, 2, 1) | 0.746 | 0.882 | 0.946 | 0.960 |
| – | – | (2, 1, 2) | (3, 2, 1) | 0.716 | 0.875 | 0.948 | 0.975 |
| – | – | (2, 1, 2) | (3, 2, 1) | 0.757 | 0.811 | 0.939 | 0.950 |
| – | – | (0, 2, 1) | (2, 1, 2) | 0.792 | 0.866 | 0.936 | 0.985 |
| – | – | (0, 2, 1) | (2, 1, 2) | 0.768 | 0.824 | 0.902 | 0.995 |
| – | – | (0, 2, 1) | (2, 1, 2) | 0.773 | 0.841 | 0.933 | 0.983 |
| – | – | (0, 2, 1) | (2, 1, 2) | 0.795 | 0.801 | 0.940 | 0.992 |
| – | – | (0, 2, 1) | (0, 2, 1) | 0.790 | 0.891 | 0.912 | 0.953 |
| – | – | (0, 2, 1) | (0, 2, 1) | 0.784 | 0.855 | 0.924 | 0.951 |
| – | – | (0, 2, 1) | (0, 2, 1) | 0.739 | 0.842 | 0.908 | 0.961 |
| – | – | (0, 2, 1) | (0, 2, 1) | 0.749 | 0.880 | 0.905 | 0.963 |
| – | – | (0, 2, 1) | (3, 2, 1) | 0.745 | 0.854 | 0.918 | 0.956 |
| – | – | (0, 2, 1) | (3, 2, 1) | 0.739 | 0.825 | 0.946 | 0.955 |
| – | – | (0, 2, 1) | (3, 2, 1) | 0.743 | 0.883 | 0.926 | 0.960 |
| – | – | (0, 2, 1) | (3, 2, 1) | 0.734 | 0.840 | 0.918 | 0.976 |
Table 5. The values of $\hat{α}$ and $\hat{π}$ for Example 5.

| ε | X | Y (Second) | Y (Third) | (10, 10, 10) | (20, 40, 60) | (50, 75, 100) | (75, 100, 150) |
|---|---|---|---|---|---|---|---|
| – | – | $e^X + ε$ | $e^X + ε$ | 0.052 | 0.051 | 0.051 | 0.048 |
| – | – | $e^X + ε$ | $e^X + ε$ | 0.053 | 0.051 | 0.051 | 0.049 |
| – | – | $e^X + ε$ | $e^X + ε$ | 0.052 | 0.051 | 0.051 | 0.049 |
| – | – | $e^X + ε$ | $e^X + ε$ | 0.052 | 0.051 | 0.050 | 0.048 |
| – | – | $e^X + ε$ | $1 + βX + ε$ | 0.787 | 0.895 | 0.901 | 0.965 |
| – | – | $e^X + ε$ | $1 + βX + ε$ | 0.787 | 0.829 | 0.930 | 0.974 |
| – | – | $e^X + ε$ | $1 + βX + ε$ | 0.725 | 0.848 | 0.912 | 0.991 |
| – | – | $e^X + ε$ | $1 + βX + ε$ | 0.759 | 0.898 | 0.944 | 0.984 |
| – | – | $e^X + ε$ | $2X + ε$ | 0.734 | 0.891 | 0.949 | 0.962 |
| – | – | $e^X + ε$ | $2X + ε$ | 0.788 | 0.811 | 0.921 | 0.981 |
| – | – | $e^X + ε$ | $2X + ε$ | 0.759 | 0.877 | 0.941 | 0.965 |
| – | – | $e^X + ε$ | $2X + ε$ | 0.704 | 0.868 | 0.948 | 0.989 |
| – | – | $e^X + ε$ | $e^X + ε$ | 0.798 | 0.845 | 0.908 | 0.956 |
| – | – | $e^X + ε$ | $e^X + ε$ | 0.753 | 0.809 | 0.927 | 0.989 |
| – | – | $e^X + ε$ | $e^X + ε$ | 0.731 | 0.865 | 0.910 | 0.990 |
| – | – | $e^X + ε$ | $e^X + ε$ | 0.731 | 0.820 | 0.906 | 0.962 |
| – | – | $1 + βX + ε$ | $1 + βX + ε$ | 0.723 | 0.897 | 0.934 | 0.960 |
| – | – | $1 + βX + ε$ | $1 + βX + ε$ | 0.799 | 0.807 | 0.949 | 0.982 |
| – | – | $1 + βX + ε$ | $1 + βX + ε$ | 0.713 | 0.877 | 0.916 | 0.952 |
| – | – | $1 + βX + ε$ | $1 + βX + ε$ | 0.743 | 0.872 | 0.925 | 0.965 |
| – | – | $1 + βX + ε$ | $2X + ε$ | 0.725 | 0.892 | 0.901 | 0.996 |
| – | – | $1 + βX + ε$ | $2X + ε$ | 0.795 | 0.886 | 0.944 | 0.959 |
| – | – | $1 + βX + ε$ | $2X + ε$ | 0.707 | 0.821 | 0.925 | 0.972 |
| – | – | $1 + βX + ε$ | $2X + ε$ | 0.798 | 0.825 | 0.924 | 0.974 |
Table 6. Indices to evaluate the fitted regression models.

| Model | Station | R Square | RMSE |
|---|---|---|---|
| Linear | Fasa | 0.624 | 1.693 |
| Linear | Sarvestan | 0.638 | 1.516 |
| Linear | Shiraz | 0.689 | 1.501 |
| Quadratic | Fasa | 0.734 | 1.350 |
| Quadratic | Sarvestan | 0.743 | 1.285 |
| Quadratic | Shiraz | 0.767 | 1.265 |
| Cubic | Fasa | 0.895 | 0.910 |
| Cubic | Sarvestan | 0.899 | 0.855 |
| Cubic | Shiraz | 0.976 | 0.529 |
| Exponential | Fasa | 0.767 | 0.978 |
| Exponential | Sarvestan | 0.778 | 0.926 |
| Exponential | Shiraz | 0.876 | 0.713 |
Table 7. Friedman test to compare the stations.

| Model | p |
|---|---|
| Linear | 0.123 |
| Quadratic | 0.224 |
| Cubic | <0.001 |
| Exponential | <0.001 |
Table 8. Wilcoxon test to compare and classify the stations.

| Model | Pair | Stations | p |
|---|---|---|---|
| Cubic | Pair 1 | Shiraz - Fasa | 0.011 |
| Cubic | Pair 2 | Shiraz - Sarvestan | 0.003 |
| Cubic | Pair 3 | Fasa - Sarvestan | 0.144 |
| Exponential | Pair 1 | Shiraz - Fasa | 0.019 |
| Exponential | Pair 2 | Shiraz - Sarvestan | <0.001 |
| Exponential | Pair 3 | Fasa - Sarvestan | 0.112 |

© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).