Weight-Constrained Neural Networks in Forecasting Tourist Volumes: A Case Study
Abstract
1. Introduction
2. Related Work
3. Weight-Constrained Neural Networks
Adaptive Nonmonotone Active Set-Weight Constrained-Neural Network Training Algorithm
Algorithm 1: AANN

Input:
− Initial weights.
− … implies the subproblem was efficiently solved.
− Decay factor used to decrease … in aNGPA.
− Integer connected with active-set repetitions or changes.
− Integer connected with active-set repetitions or changes.

Output:
− Weights of the trained network.

Step 1.  repeat
Step 2.    Execute aNGPA. /* Phase I: Gradient projection */
Step 3.    if (…) then
Step 4.      if (…) then
Step 5.        …;
Step 6.      else
Step 7.        Goto Step 14;
Step 8.      end if
Step 9.    else if (…) then
Step 10.     if (…) then
Step 11.       Goto Step 14;
Step 12.     end if
Step 13.   end if
Step 14.   Goto Step 2;
Step 15.   Execute CG. /* Phase II: Unconstrained optimization */
Step 16.   if (…) then
Step 17.     Restart aNGPA and goto Step 3;
Step 18.   else if (…) then
Step 19.     if (… and …) then
Step 20.       Restart aNGPA and goto Step 3;
Step 21.     else
Step 22.       Restart CG and goto Step 15;
Step 23.     end if
Step 24.  until (stopping criterion)
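To make the role of the weight bounds concrete, the following is a minimal NumPy sketch of training a one-hidden-layer regression network whose weights are restricted to a box −B ≤ w ≤ B, enforced by projecting (clipping) the weights after each gradient step. It illustrates only the gradient-projection idea underlying Phase I, not the authors' two-phase AANN (aNGPA + CG); the network size, bound, learning rate, and epoch count are illustrative assumptions.

```python
# Minimal sketch (not the authors' AANN): a one-hidden-layer regression network
# whose weights are kept inside the box [-bound, bound] by projection (clipping)
# after every gradient step.
import numpy as np

rng = np.random.default_rng(0)

def project(w, bound):
    # Projection onto the feasible box -bound <= w <= bound.
    return np.clip(w, -bound, bound)

def train_wcnn(X, y, hidden=8, bound=2.0, lr=0.05, epochs=2000):
    n, d = X.shape
    W1 = project(rng.normal(scale=0.5, size=(d, hidden)), bound)
    b1 = np.zeros(hidden)
    W2 = project(rng.normal(scale=0.5, size=(hidden, 1)), bound)
    b2 = np.zeros(1)
    y = y.reshape(-1, 1)
    for _ in range(epochs):
        # Forward pass: tanh hidden layer, linear output.
        H = np.tanh(X @ W1 + b1)
        err = (H @ W2 + b2) - y
        # Gradients of the mean squared error.
        dW2 = H.T @ err / n
        db2 = err.mean(axis=0)
        dH = (err @ W2.T) * (1.0 - H ** 2)
        dW1 = X.T @ dH / n
        db1 = dH.mean(axis=0)
        # Gradient step followed by projection back onto the weight box.
        W1 = project(W1 - lr * dW1, bound)
        W2 = project(W2 - lr * dW2, bound)
        b1 -= lr * db1
        b2 -= lr * db2
    return W1, b1, W2, b2
```

Predictions are then obtained with np.tanh(X_new @ W1 + b1) @ W2 + b2; the bias terms are left unconstrained, since the bounds apply to the weights.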
4. Datasets
5. Numerical Experiments
5.1. Performance Evaluation of WCNNs Against ANNs
- “WCNN” stands for a weight-constrained neural network with bounds on all its weights.
- “ANN” stands for a classical artificial neural network.
5.2. Evaluation of WCNNs Against State-of-the-Art Regression Algorithms
- SVR [29] is a regression algorithm used to predict continuous output variables rather than class labels, adapting the classical support vector machine formulation to regression. It follows a different philosophy from most regression models: instead of directly minimizing the error between predicted and actual values, it tries to fit the best line within a predefined error tolerance around the targets.
- kNN [30] is a simple learning method applied to both classification and regression tasks, which uses distance functions to measure feature similarity in order to predict the value of a new data point. The final prediction of the kNN regression algorithm is the average of the values of the k training points closest to the new point.
- RBF [31] is an ANN which uses radial basis functions in all hidden units. Each hidden unit contains a prototype vector, and its activation depends on the similarity of the input to that prototype. The output layer of the network is a linear combination of the radial basis functions of the inputs and the neuron parameters.
- M5 [32] is a decision-tree regression algorithm characterized by its simplicity of implementation. Initially, the algorithm splits the input data into subsets, creating a decision tree over the attributes. Subsequently, a multiple linear regression model is built for each node, using the attributes that are referenced by tests or linear models in the subtree of that node. Smoothing and pruning techniques can also be applied to improve the prediction accuracy of the M5 algorithm.
- GP [33] is a stochastic process, that is, a collection of random variables indexed by time or space, every finite collection of which follows a multivariate normal distribution. The prediction of a machine-learning model based on a GP is a one-dimensional Gaussian distribution, computed from the similarity between the training instances and the new, unseen instance. (A brief scikit-learn sketch of several of these baselines follows this list.)
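As a rough illustration of how such baselines can be set up, the sketch below fits SVR, kNN, and GP regressors with scikit-learn and reports MAE and RMSE on a held-out split. It is a stand-in under stated assumptions rather than the paper's experimental setup: scikit-learn offers no Pearson VII (PUK) kernel and no M5 model tree, so an RBF kernel and synthetic data are used purely for illustration.

```python
# Hedged sketch: fitting a few of the baseline regressors with scikit-learn.
# scikit-learn provides no Pearson VII (PUK) kernel and no M5 model tree, so an
# RBF kernel and synthetic data stand in here purely for illustration.
import numpy as np
from sklearn.svm import SVR
from sklearn.neighbors import KNeighborsRegressor
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
X = rng.uniform(0.0, 1.0, size=(200, 3))          # e.g. three lagged monthly volumes
y = X @ np.array([0.5, -0.2, 0.1]) + rng.normal(scale=0.02, size=200)
X_train, X_test, y_train, y_test = X[:150], X[150:], y[:150], y[150:]

models = {
    "SVR": SVR(kernel="rbf", C=1.0, epsilon=0.01),
    "kNN": KNeighborsRegressor(n_neighbors=1, metric="euclidean"),
    "GP":  GaussianProcessRegressor(kernel=RBF() + WhiteKernel(noise_level=1.0)),
}

for name, model in models.items():
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    mae = np.mean(np.abs(pred - y_test))            # mean absolute error
    rmse = np.sqrt(np.mean((pred - y_test) ** 2))   # root mean square error
    print(f"{name}: MAE = {mae:.4f}, RMSE = {rmse:.4f}")
```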
6. Conclusions and Future Research
Author Contributions
Funding
Conflicts of Interest
References
- Institute SETE (INSETE). Tourism’s Contribution to the Greek Economy 2016–2017. Available online: http://www.insete.gr (accessed on 10 August 2019).
- Akın, M. A novel approach to model selection in tourism demand modeling. Tour. Manag. 2015, 48, 64–72.
- Kleinlein, R.; García-Faura, Á.; Luna Jiménez, C.; Montero, J.M.; Díaz-de María, F.; Fernández-Martínez, F. Predicting Image Aesthetics for Intelligent Tourism Information Systems. Electronics 2019, 8, 671.
- Li, Z.C.; Sheng, D. Forecasting passenger travel demand for air and high-speed rail integration service: A case study of Beijing-Guangzhou corridor, China. Transp. Res. Part A Policy Pract. 2016, 94, 397–410.
- Mehmood, F.; Ahmad, S.; Kim, D. Design and Development of a Real-Time Optimal Route Recommendation System Using Big Data for Tourists in Jeju Island. Electronics 2019, 8, 506.
- Spoladore, D.; Arlati, S.; Carciotti, S.; Nolich, M.; Sacco, M. RoomFort: An ontology-based comfort management application for hotels. Electronics 2018, 7, 345.
- Kulendran, N.; Witt, S.F. Forecasting the demand for international business tourism. J. Travel Res. 2003, 41, 265–271.
- Song, H.; Li, G. Tourism demand modelling and forecasting—A review of recent research. Tour. Manag. 2008, 29, 203–220.
- Claveria, O.; Monte, E.; Torra, S. Tourism demand forecasting with neural network models: Different ways of treating information. Int. J. Tour. Res. 2015, 17, 492–500.
- Sun, S.; Wei, Y.; Tsui, K.L.; Wang, S. Forecasting tourist arrivals with machine learning and internet search index. Tour. Manag. 2019, 70, 1–10.
- Claveria, O.; Torra, S. Forecasting tourism demand to Catalonia: Neural networks vs. time series models. Econ. Model. 2014, 36, 220–228.
- Lerner, B.; Guterman, H.; Aladjem, M. A comparative study of neural network based feature extraction paradigms. Pattern Recognit. Lett. 1999, 20, 7–14.
- Manieniyan, V.; Vinodhini, G.; Senthilkumar, R.; Sivaprakasam, S. Wear element analysis using neural networks of a DI diesel engine using biodiesel with exhaust gas recirculation. Energy 2016, 114, 603–612.
- Kamel, N.; Atiya, A.F.; El Gayar, N.; El-Shishiny, H. Tourism demand forecasting using machine learning methods. ICGST Int. J. Artif. Intell. Mach. Learn. 2008, 8, 1–7.
- Chen, K.Y. Combining linear and nonlinear model in forecasting tourism demand. Expert Syst. Appl. 2011, 38, 10368–10376.
- Peng, L.; Lai, L. A service innovation evaluation framework for tourism e-commerce in China based on BP neural network. Electron. Mark. 2014, 24, 37–46.
- Reed, R.; Marks, R.J. Neural Smithing: Supervised Learning in Feedforward Artificial Neural Networks; MIT Press: Cambridge, MA, USA, 1999.
- Brownlee, J. Machine Learning Mastery; VIC: Melbourne, Australia, 2014; Available online: http://machinelearningmastery.com (accessed on 10 August 2019).
- Livieris, I.E. Improving the classification efficiency of an ANN utilizing a new training methodology. Informatics 2019, 6, 1.
- Livieris, I.E. Forecasting economy-related data utilizing weight-constrained recurrent neural networks. Algorithms 2019, 12, 85.
- Livieris, I.E.; Kotsilieris, T.; Stavroyiannis, S.; Pintelas, P. Forecasting stock price index movement using a constrained deep neural network training algorithm. Intell. Decis. Technol. 2019. (accepted).
- Livieris, I.E.; Pintelas, P. An adaptive nonmonotone active set-weight constrained-neural network training algorithm. Neurocomputing 2019, 360, 294–303.
- Dai, Y.H.; Hager, W.W.; Schittkowski, K.; Zhang, H. The cyclic Barzilai-Borwein method for unconstrained optimization. IMA J. Numer. Anal. 2006, 26, 604–627.
- Hellenic Statistical Authority. Hotels, Rooms for Rent and Tourist Campsites/2018. Available online: https://www.statistics.gr/en/statistics/-/publication/STO12/- (accessed in 2018).
- Chai, T.; Draxler, R.R. Root mean square error (RMSE) or mean absolute error (MAE)?—Arguments against avoiding RMSE in the literature. Geosci. Model Dev. 2014, 7, 1247–1250.
- Dolan, E.D.; Moré, J.J. Benchmarking optimization software with performance profiles. Math. Program. 2002, 91, 201–213.
- Hsieh, W.W. Machine Learning Methods in the Environmental Sciences: Neural Networks and Kernels; Cambridge University Press: Cambridge, UK, 2009.
- Russell, S.J.; Norvig, P. Artificial Intelligence: A Modern Approach; Pearson Education Limited: Kuala Lumpur, Malaysia, 2016.
- Deng, N.; Tian, Y.; Zhang, C. Support Vector Machines: Optimization Based Theory, Algorithms, and Extensions; Chapman and Hall/CRC: London, UK, 2012.
- Aha, D.W. Lazy Learning; Springer: Berlin, Germany, 2013.
- Chen, W.; Fu, Z.J.; Chen, C.S. Recent Advances in Radial Basis Function Collocation Methods; Springer: Berlin, Germany, 2014.
- Landwehr, N.; Hall, M.; Frank, E. Logistic model trees. Mach. Learn. 2005, 59, 161–205.
- Paul, W.; Baschnagel, J. Stochastic Processes; Springer: Berlin, Germany, 2013; Volume 1.
- Witten, I.H.; Frank, E.; Hall, M.A.; Pal, C.J. Data Mining: Practical Machine Learning Tools and Techniques; Morgan Kaufmann: Burlington, MA, USA, 2016.
- Wu, X.; Kumar, V. The Top Ten Algorithms in Data Mining; CRC Press: Boca Raton, FL, USA, 2009.
- Hodges, J.L.; Lehmann, E.L. Rank methods for combination of independent experiments in analysis of variance. Ann. Math. Stat. 1962, 33, 482–497.
- Finner, H. On a monotonicity problem in step-down multiple test procedures. J. Am. Stat. Assoc. 1993, 88, 920–923.
- Aminu, M.; Ludin, A.N.B.M.; Matori, A.N.; Yusof, K.W.; Dano, L.U.; Chandio, I.A. A spatial decision support system (SDSS) for sustainable tourism planning in Johor Ramsar sites, Malaysia. Environ. Earth Sci. 2013, 70, 1113–1124.
- Bousset, J.P.; Skuras, D.; Těšitel, J.; Marsat, J.B.; Petrou, A.; Fiallo-Pantziou, E.; Kušová, D.; Bartoš, M. A decision support system for integrated tourism development: Rethinking tourism policies and management strategies. Tour. Geogr. 2007, 9, 387–404.
- Song, H.; Gao, B.Z.; Lin, V.S. Combining statistical and judgmental forecasts via a web-based tourism demand forecasting system. Int. J. Forecast. 2013, 29, 295–310.
| Algorithm | Parameters |
|---|---|
| SVR | …; kernel type = universal Pearson VII function-based kernel. |
| kNN | Number of neighbors = 1; Euclidean distance. |
| RBF | Tolerance parameter = 10⁻⁶; training algorithm = BFGS. |
| M5 | Minimum number of instances to allow at a leaf node = 4; unsmoothed predictions used. |
| GP | Level of Gaussian noise = 1; kernel type = universal Pearson VII function-based kernel. |
| Model | Domestic tourists (horizon = 6) | Domestic tourists (horizon = 9) | Domestic tourists (horizon = 12) | Foreign tourists (horizon = 6) | Foreign tourists (horizon = 9) | Foreign tourists (horizon = 12) |
|---|---|---|---|---|---|---|
| SVR | 0.0870 | 0.0711 | 0.0399 | 0.1085 | 0.1515 | 0.1519 |
| kNN | 0.0687 | 0.0704 | 0.0525 | 0.1621 | 0.1733 | 0.1387 |
| RBF | 0.0844 | 0.0973 | 0.0500 | 0.4011 | 0.2063 | 0.1479 |
| M5 | 0.1202 | 0.0790 | 0.0511 | 0.5112 | 0.4109 | 0.3486 |
| GP | 0.0706 | 0.0594 | 0.0435 | 0.3733 | 0.3872 | 0.0833 |
| WCNN | 0.0025 | 0.0029 | 0.0028 | 0.0072 | 0.0059 | 0.0052 |
| Model | Domestic tourists (horizon = 6) | Domestic tourists (horizon = 9) | Domestic tourists (horizon = 12) | Foreign tourists (horizon = 6) | Foreign tourists (horizon = 9) | Foreign tourists (horizon = 12) |
|---|---|---|---|---|---|---|
| SVR | 0.0720 | 0.0580 | 0.0323 | 0.0866 | 0.1185 | 0.1280 |
| kNN | 0.0491 | 0.0498 | 0.0380 | 0.1378 | 0.1547 | 0.1269 |
| RBF | 0.0706 | 0.0796 | 0.0451 | 0.3660 | 0.1606 | 0.1170 |
| M5 | 0.0570 | 0.0522 | 0.0355 | 0.3228 | 0.2983 | 0.2842 |
| GP | 0.1103 | 0.0624 | 0.0401 | 0.3154 | 0.2124 | 0.0681 |
| WCNN | 0.0020 | 0.0022 | 0.0025 | 0.0059 | 0.0056 | 0.0042 |
| Model | FAR | p-Value (Finner post-hoc test) | Null Hypothesis |
|---|---|---|---|
| WCNN | 5.167 | - | - |
| kNN | 14.167 | 0.056389 | accepted |
| SVR | 16.667 | 0.040204 | rejected |
| RBF | 24.500 | 0.030307 | rejected |
| M5 | 25.000 | 0.020308 | rejected |
| GP | 25.500 | 0.010206 | rejected |

Based on the RMSE metric.

| Model | FAR | p-Value (Finner post-hoc test) | Null Hypothesis |
|---|---|---|---|
| WCNN | 5.667 | - | - |
| kNN | 15.000 | 0.07898 | accepted |
| SVR | 15.833 | 0.04280 | rejected |
| RBF | 23.167 | 0.00414 | rejected |
| GP | 24.667 | 0.00414 | rejected |
| M5 | 26.667 | 0.00414 | rejected |

Based on the MAE metric.
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Livieris, I.E.; Pintelas, E.; Kotsilieris, T.; Stavroyiannis, S.; Pintelas, P. Weight-Constrained Neural Networks in Forecasting Tourist Volumes: A Case Study. Electronics 2019, 8, 1005. https://doi.org/10.3390/electronics8091005