# Estimation of Linear Regression with the Dimensional Analysis Method


## Abstract


## 1. Introduction

- Minimize the median squared difference between observed and fitted response [35];

- We formulate a method to address the drawbacks of existing estimators related to efficiency, instability, and error minimization.
- This study explores the application of the linear regression model under Dimensional Analysis.
- The novelty of the current study also lies in considering the degree of importance of the decision makers or experts involved in solving the problem.
- Finally, the proposed method includes an application of DA to linear regression to address an inventory forecasting problem.

## 2. Basic Concepts

#### 2.1. Linear Regression

**Definition 1.**
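For concreteness, simple linear regression $y = \beta_0 + \beta_1 x + \varepsilon$ is conventionally fitted by minimizing the sum of squared errors (SSE). The sketch below is illustrative only (the variable names are ours, and it shows the ordinary least-squares estimator, not the GLR-DA method proposed later):

```python
# Minimal sketch of ordinary least squares for simple linear regression.
def ols_fit(x, y):
    """Return (intercept, slope) minimizing the sum of squared errors."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    sxx = sum((xi - mx) ** 2 for xi in x)
    slope = sxy / sxx
    intercept = my - slope * mx
    return intercept, slope

# Small illustrative data set (hypothetical values, not from the paper).
b0, b1 = ols_fit([1, 2, 3, 4], [2.1, 3.9, 6.2, 7.8])
```

The closed-form estimators $\hat{\beta}_1 = S_{xy}/S_{xx}$ and $\hat{\beta}_0 = \bar{y} - \hat{\beta}_1 \bar{x}$ implemented above follow directly from setting the partial derivatives of the SSE to zero.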

#### 2.2. Dimensional Analysis

**Definition 2.**

**Example 1.**
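As a standard textbook illustration of Definition 2 (the simple pendulum; this is a generic example, not necessarily the paper's Example 1): assume the period $T$ depends on the length $L$, gravitational acceleration $g$, and mass $m$ as $T = k\,L^{a} g^{b} m^{c}$ for a dimensionless constant $k$. Balancing dimensions gives

```latex
% [T] = \mathrm{T},\; [L] = \mathrm{L},\; [g] = \mathrm{L\,T^{-2}},\; [m] = \mathrm{M}
\mathrm{T}^{1} = \mathrm{L}^{\,a+b}\,\mathrm{T}^{-2b}\,\mathrm{M}^{\,c}
\;\Longrightarrow\;
c = 0,\quad b = -\tfrac{1}{2},\quad a = \tfrac{1}{2},
\qquad\text{so}\qquad T = k\sqrt{L/g}.
```

Dimensional analysis thus fixes the functional form up to the constant $k$ without solving the equations of motion.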

## 3. Main Results

#### 3.1. Generalized Linear Regression Method under Dimensional Analysis Environment

#### 3.2. Compute the Mean and Variance Estimators

## 4. Numerical Example

#### 4.1. Numerical Example from a Real Case Study

#### 4.2. Numerical Example 2

## 5. Validations

## 6. Discussion and Conclusions

## Author Contributions

## Funding

## Institutional Review Board Statement

## Informed Consent Statement

## Acknowledgments

## Conflicts of Interest

## Abbreviations

| Abbreviation | Meaning |
|---|---|
| DA | Dimensional Analysis |
| LR | Linear regression |
| SSE | Sum of the squares of the error |
| GLR-DA | Generalized linear regression method under a Dimensional Analysis environment |
| MSE | Mean square error |

## References

- Hothorn, T.; Bretz, F.; Westfall, P. Simultaneous inference in general parametric models. Biom. J. **2008**, 50, 346–363.
- Liu, M.; Hu, S.; Ge, Y.; Heuvelink, G.B.; Ren, Z.; Huang, X. Using multiple linear regression and random forests to identify spatial poverty determinants in rural China. Spat. Stat. **2020**, 42, 100461.
- Cook, J.R.; Stefanski, L.A. Simulation-extrapolation estimation in parametric measurement error models. J. Am. Stat. Assoc. **1994**, 89, 1314–1328.
- Park, J.Y.; Phillips, P.C. Statistical inference in regressions with integrated processes: Part 2. Econom. Theory **1989**, 5, 95–131.
- Johnston, F.; Boylan, J.E.; Shale, E.A. An examination of the size of orders from customers, their characterisation and the implications for inventory control of slow moving items. J. Oper. Res. Soc. **2003**, 54, 833–837.
- Babai, M.; Chen, H.; Syntetos, A.; Lengu, D. A compound-Poisson Bayesian approach for spare parts inventory forecasting. Int. J. Prod. Econ. **2021**, 232, 107954.
- Mansur, A.; Kuncoro, T. Product inventory predictions at small medium enterprise using market basket analysis approach-neural networks. Procedia Econ. Financ. **2012**, 4, 312–320.
- Dekker, R.; Bloemhof, J.; Mallidis, I. Operations Research for green logistics–An overview of aspects, issues, contributions and challenges. Eur. J. Oper. Res. **2012**, 219, 671–679.
- Bakker, M.; Riezebos, J.; Teunter, R.H. Review of inventory systems with deterioration since 2001. Eur. J. Oper. Res. **2012**, 221, 275–284.
- Saha, E.; Ray, P.K. Modelling and analysis of inventory management systems in healthcare: A review and reflections. Comput. Ind. Eng. **2019**, 137, 106051.
- van Steenbergen, R.; Mes, M. Forecasting demand profiles of new products. Decis. Support Syst. **2020**, 139, 113401.
- Kourentzes, N.; Trapero, J.R.; Barrow, D.K. Optimising forecasting models for inventory planning. Int. J. Prod. Econ. **2020**, 225, 107597.
- Wilson, B.T.; Knight, J.F.; McRoberts, R.E. Harmonic regression of Landsat time series for modeling attributes from national forest inventory data. ISPRS J. Photogramm. Remote Sens. **2018**, 137, 29–46.
- Seifbarghy, M.; Amiri, M.; Heydari, M. Linear and nonlinear estimation of the cost function of a two-echelon inventory system. Sci. Iran. **2013**, 20, 801–810.
- Ryu, S.; Noh, J.; Kim, H. Deep neural network based demand side short term load forecasting. Energies **2017**, 10, 3.
- Dalla Corte, A.P.; Souza, D.V.; Rex, F.E.; Sanquetta, C.R.; Mohan, M.; Silva, C.A.; Zambrano, A.M.A.; Prata, G.; de Almeida, D.R.A.; Trautenmüller, J.W.; et al. Forest inventory with high-density UAV-Lidar: Machine learning approaches for predicting individual tree attributes. Comput. Electron. Agric. **2020**, 179, 105815.
- Junttila, V.; Laine, M. Bayesian principal component regression model with spatial effects for forest inventory variables under small field sample size. Remote Sens. Environ. **2017**, 192, 45–57.
- Ulrich, M.; Jahnke, H.; Langrock, R.; Pesch, R.; Senge, R. Distributional regression for demand forecasting in e-grocery. Eur. J. Oper. Res. **2021**, 294, 831–842.
- Georgi, H. Generalized dimensional analysis. Phys. Lett. **1993**, 298, 187–189.
- Butterfield, R. Dimensional analysis for geotechnical engineers. Geotechnique **1999**, 49, 357–366.
- Cheng, Y.T.; Cheng, C.M. Scaling, dimensional analysis, and indentation measurements. Mater. Sci. Eng. R Rep. **2004**, 44, 91–149.
- Bellamine, F.; Elkamel, A. Model order reduction using neural network principal component analysis and generalized dimensional analysis. Eng. Comput. **2008**, 25, 443–463.
- Moran, M.; Marshek, K. Some matrix aspects of generalized dimensional analysis. J. Eng. Math. **1972**, 6, 291–303.
- Longo, S. Principles and Applications of Dimensional Analysis and Similarity; Springer Nature Switzerland AG: Cham, Switzerland, 2022.
- Szava, I.R.; Sova, D.; Peter, D.; Elesztos, P.; Szava, I.; Vlase, S. Experimental Validation of Model Heat Transfer in Rectangular Hole Beams Using Modern Dimensional Analysis. Mathematics **2022**, 10, 409.
- Szirtes, T. Applied Dimensional Analysis and Modeling; Butterworth-Heinemann: Oxford, UK, 2007.
- Shen, W.; Lin, D.K. Statistical theories for dimensional analysis. Stat. Sin. **2019**, 29, 527–550.
- Shen, W.; Davis, T.; Lin, D.K.; Nachtsheim, C.J. Dimensional analysis and its applications in statistics. J. Qual. Technol. **2014**, 46, 185–198.
- Albrecht, M.C.; Nachtsheim, C.J.; Albrecht, T.A.; Cook, R.D. Experimental design for engineering dimensional analysis. Technometrics **2013**, 55, 257–270.
- Bridgman, P.W. Dimensional Analysis; Yale University Press: New Haven, CT, USA, 1922.
- Gibbings, J.C. Dimensional Analysis; Springer Science & Business Media: New York, NY, USA, 2011.
- Dovi, V.; Reverberi, A.; Maga, L.; De Marchi, G. Improving the statistical accuracy of dimensional analysis correlations for precise coefficient estimation and optimal design of experiments. Int. Commun. Heat Mass Transf. **1991**, 18, 581–590.
- Breiman, L. Bagging predictors. Mach. Learn. **1996**, 24, 123–140.
- Kohli, S.; Godwin, G.T.; Urolagin, S. Sales Prediction Using Linear and KNN Regression. In Advances in Machine Learning and Computational Intelligence; Springer: Singapore, 2021; pp. 321–329.
- Gentleman, J.F. New developments in statistical computing. Am. Stat. **1986**, 40, 228–237.
- Liang, K.Y.; Zeger, S.L. Longitudinal data analysis using generalized linear models. Biometrika **1986**, 73, 13–22.
- Prion, S.K.; Haerling, K.A. Making Sense of Methods and Measurements: Simple Linear Regression. Clin. Simul. Nurs. **2020**, 48, 94–95.
- Kuhn, M.; Johnson, K. Applied Predictive Modeling; Springer: New York, NY, USA, 2013; Volume 26.
- Wright, J.; Yang, A.Y.; Ganesh, A.; Sastry, S.S.; Ma, Y. Robust face recognition via sparse representation. IEEE Trans. Pattern Anal. Mach. Intell. **2008**, 31, 210–227.
- Cheng, J.; Ai, M. Optimal designs for panel data linear regressions. Stat. Probab. Lett. **2020**, 163, 108769.
- Huseyin Tunc, B.G. A column generation based heuristic algorithm for piecewise linear regression. Expert Syst. Appl. **2021**, 171, 114539.
- Fieller, E.C.; Hartley, H.O.; Pearson, E.S. Tests for rank correlation coefficients. I. Biometrika **1957**, 44, 470–481.
- Conrad, S. Sales data and the estimation of demand. J. Oper. Res. Soc. **1976**, 27, 123–127.
- Tukey, J.W. Comparing individual means in the analysis of variance. Biometrics **1949**, 5, 99–114.

| x | y | x | y |
|---|---|---|---|
| 1 | 32 | 7 | 89 |
| 2 | 40 | 8 | 94 |
| 3 | 40 | 9 | 96 |
| 4 | 45 | 10 | 96 |
| 5 | 64 | 11 | 121 |
| 6 | 71 | 12 | 125 |

| A | B |
|---|---|
| 0.11 | 5.63142487 |
| 0.14 | 4.45475331 |
| 0.19 | 3.63920858 |
| 0.21 | 3.34356494 |
| 0.26 | 3.56587087 |
| 0.22 | 3.42892914 |
| 0.19 | 3.55394256 |
| 0.35 | 3.4168655 |
| 0.35 | 3.25595148 |
| 0.41 | 3.0892899 |
| 0.43 | 3.30630209 |
| 0.46 | 3.21766878 |
| mean 3.32 | mean 43.903772 |

| Real | Prediction | $\mathrm{Case}_M$ | $\xi_s$ | $\lvert\xi_s\rvert$ | $\xi_s^2$ |
|---|---|---|---|---|---|
| 32.00 | 20.50 | 27.9 | 11.5 | 11.5 | 132.2 |
| 40.00 | 30.61 | 36.6 | 9.4 | 9.4 | 88.2 |
| 40.00 | 40.71 | 45.4 | −0.7 | 0.7 | 0.5 |
| 45.00 | 50.81 | 54.2 | −5.8 | 5.8 | 33.8 |
| 64.00 | 60.92 | 62.9 | 3.1 | 3.1 | 9.5 |
| 71.00 | 71.02 | 71.7 | −0.0 | 0.0 | 0.0 |
| 89.00 | 81.12 | 80.5 | 7.9 | 7.9 | 62.0 |
| 94.00 | 91.23 | 89.2 | 2.8 | 2.8 | 7.7 |
| 96.00 | 101.33 | 98.0 | −5.3 | 5.3 | 28.4 |
| 96.00 | 111.43 | 106.8 | −15.4 | 15.4 | 238.2 |
| 121.00 | 121.54 | 115.5 | −0.5 | 0.5 | 0.3 |
| 125.00 | 131.64 | 124.3 | −6.6 | 6.6 | 44.1 |
| Total | | | 0.14 | 69.11 | 644.97 |
| Average | | | 0.01 | 5.76 | 53.75 |
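The conventional least-squares column ($\mathrm{Case}_M$) can be reproduced from the data of Section 4.1. The sketch below (variable names ours; it fits ordinary least squares, not the paper's GLR-DA method, whose predictions appear in the "Prediction" column) reports the first and last fitted values and the OLS mean square error:

```python
# Refit the case-study data with ordinary least squares and compare the
# fitted values against the Case_M column of the table above.
x = list(range(1, 13))
y = [32, 40, 40, 45, 64, 71, 89, 94, 96, 96, 121, 125]

n = len(x)
mx, my = sum(x) / n, sum(y) / n
slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
intercept = my - slope * mx

pred = [intercept + slope * xi for xi in x]       # reproduces Case_M
resid = [yi - pi for yi, pi in zip(y, pred)]
sse = sum(e * e for e in resid)                   # sum of squared errors
mse = sse / n                                     # mean square error
```

The first and last fitted values round to 27.9 and 124.3, matching the $\mathrm{Case}_M$ column above.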

| | Real Data | Prediction with GLR-DA | Conventional Regression |
|---|---|---|---|
| Real Data | 1.0000000 | 0.9827522 | 0.9827528 |
| Prediction with GLR-DA | 0.9827522 | 1.0000000 | 1.0000000 |
| Conventional Regression | 0.9827528 | 1.0000000 | 1.0000000 |
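The ≈0.9828 entries above can be sanity-checked: for a straight-line fit, the correlation between the response and the fitted values equals, in absolute value, the correlation between $x$ and $y$. A small sketch using the Section 4.1 data (variable names ours):

```python
# Pearson correlation between x and y for the case-study data; for a linear
# fit this equals |corr(y, fitted)|, the value reported in the table above.
from math import sqrt

def pearson(a, b):
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((ai - ma) * (bi - mb) for ai, bi in zip(a, b))
    var_a = sum((ai - ma) ** 2 for ai in a)
    var_b = sum((bi - mb) ** 2 for bi in b)
    return cov / sqrt(var_a * var_b)

x = list(range(1, 13))
y = [32, 40, 40, 45, 64, 71, 89, 94, 96, 96, 121, 125]
r = pearson(x, y)  # close to the 0.9827528 reported for the OLS fit
```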

| x | y |
|---|---|
| 1 | 9 |
| 2 | 8 |
| 3 | 10 |
| 4 | 10 |
| 5 | 10 |
| 6 | 8 |
| 7 | 7 |
| 8 | 10 |
| 9 | 8 |
| 10 | 10 |
| 11 | 10 |
| 12 | 9 |
| 13 | 9 |

| Real | Prediction | $\xi_s$ | $\lvert\xi_s\rvert$ | $\xi_s^2$ |
|---|---|---|---|---|
| 9.00 | 2.34 | 6.7 | 6.7 | 44.4 |
| 8.00 | 3.22 | 4.8 | 4.8 | 22.9 |
| 10.00 | 4.10 | 5.9 | 5.9 | 34.8 |
| 10.00 | 4.98 | 5.0 | 5.0 | 25.2 |
| 10.00 | 5.86 | 4.1 | 4.1 | 17.1 |
| 8.00 | 6.74 | 1.3 | 1.3 | 1.6 |
| 7.00 | 7.62 | −0.6 | 0.6 | 0.4 |
| 10.00 | 8.50 | 1.5 | 1.5 | 2.2 |
| 8.00 | 9.38 | −1.4 | 1.4 | 1.9 |
| 10.00 | 10.27 | −0.3 | 0.3 | 0.1 |
| 10.00 | 11.15 | −1.1 | 1.1 | 1.3 |
| 9.00 | 12.03 | −3.0 | 3.0 | 9.2 |
| 9.00 | 12.91 | −3.9 | 3.9 | 15.3 |
| Total | | 18.90 | 39.62 | 176.40 |
| Average | | 1.90 | 2.98 | 13.43 |

| Factors | Adj. Total Mean | Adj. Total StDev | Item-Adj. Corr. | Cronbach's Alpha |
|---|---|---|---|---|
| Real Data | 152.16 | 68.03 | 0.9828 | 0.995 |
| Prediction with GLR-DA | 152.17 | 63.49 | 0.9956 | 0.9912 |
| Conventional regression | 152.16 | 68.29 | 0.9962 | 0.9874 |

| Factor | N | Mean | StDev | 95% CI |
|---|---|---|---|---|
| Real Data | 12 | 76.08 | 32.16 | (56.43, 95.74) |
| Prediction with GLR-DA | 12 | 76.1 | 36.4 | (56.4, 95.7) |
| Conventional regression | 12 | 76.08 | 31.61 | (56.43, 95.74) |

| Factor | N | Mean | Grouping |
|---|---|---|---|
| Conventional regression | 12 | 76.08 | A |
| Real Data | 12 | 76.08 | A |
| Prediction with GLR-DA | 12 | 76.1 | A |

| Difference of Levels | Difference of Means | SE of Difference | 95% CI | T-Value | Adjusted p-Value |
|---|---|---|---|---|---|
| Prediction − Real Data | 0 | 13.7 | (−33.5, 33.5) | 0 | 1 |
| Conventional − Real Data | 0 | 13.7 | (−33.5, 33.5) | 0 | 1 |
| Conventional − Prediction | 0 | 13.7 | (−33.5, 33.5) | 0 | 1 |


© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Pérez-Domínguez, L.; Garg, H.; Luviano-Cruz, D.; García Alcaraz, J.L.
Estimation of Linear Regression with the Dimensional Analysis Method. *Mathematics* **2022**, *10*, 1645.
https://doi.org/10.3390/math10101645
