Analyzing Monofractal Short and Very Short Time Series: A Comparison of Detrended Fluctuation Analysis and Convolutional Neural Networks as Classifiers
Abstract
1. Introduction
- The significance of classifying monofractal time series lies in the insight it provides for forecasting and anomaly detection. While most studies focus on long-term series [29], short- and very-short-term series have received less attention. This research therefore investigates the performance and application of DFA and neural networks as classification methods for synthetic monofractal short-term series.
- We compare two distinct approaches for classifying monofractal short and very short time series of different lengths: DFA and CNN-SVM.
- Our findings show that CNN-SVM achieves higher classification rates than DFA, and that the performance of both methods declines as the length of the short and very short time series decreases.
2. Detrended Fluctuation Analysis (DFA)
- Take a finite time series $\{x_k\}_{k=1}^{M}$ of length $M$, with a minor portion of its elements being zero, and compute a new time series $Y(i)$ (the profile), where $\langle x \rangle$ denotes the mean of the series:
  $$Y(i) = \sum_{k=1}^{i} \left( x_k - \langle x \rangle \right), \qquad i = 1, \ldots, M.$$
- Divide the new series into $N_s = \lfloor M/s \rfloor$ segments of size $s$ from the beginning and repeat the process starting from the end to obtain $2N_s$ segments.
- Calculate the local trend of order $m$, denoted as $y_{\nu}$, by a least-squares polynomial fit for all segments $\nu$ and all sizes $s$, and compute the variance as:
  $$F^2(\nu, s) = \frac{1}{s} \sum_{i=1}^{s} \left[ Y\big((\nu - 1)s + i\big) - y_{\nu}(i) \right]^2.$$
- Compute the second-order fluctuations as an average over all segments of a given size $s$:
  $$F_2(s) = \left[ \frac{1}{2N_s} \sum_{\nu=1}^{2N_s} F^2(\nu, s) \right]^{1/2}.$$
- For a range of sizes, $s_{\min} \leq s \leq s_{\max}$, observe the scaling relationship:
  $$F_2(s) \sim s^{\alpha},$$
  where the exponent $\alpha$ corresponds to the H-index characterizing the monofractal correlations of the series.
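The procedure above can be condensed into a short NumPy sketch (a minimal illustration; the scale range, detrending order, and fitting details here are illustrative choices, not the paper's exact configuration):

```python
import numpy as np

def dfa(x, scales, order=1):
    """Detrended fluctuation analysis of a 1-D series x.

    Returns the fluctuation function F2(s) for each window size s in scales.
    """
    y = np.cumsum(x - np.mean(x))            # profile Y(i)
    M = len(y)
    F2 = []
    for s in scales:
        ns = M // s
        variances = []
        # segments from the beginning, then from the end (2*Ns segments total)
        for start in (0, M - ns * s):
            for v in range(ns):
                seg = y[start + v * s: start + (v + 1) * s]
                t = np.arange(s)
                trend = np.polyval(np.polyfit(t, seg, order), t)  # local trend of order m
                variances.append(np.mean((seg - trend) ** 2))
        F2.append(np.sqrt(np.mean(variances)))                    # RMS over all segments
    return np.array(F2)

# scaling exponent alpha from a log-log fit: F2(s) ~ s**alpha
rng = np.random.default_rng(0)
x = rng.standard_normal(4096)                # uncorrelated noise, expected alpha near 0.5
scales = np.array([16, 32, 64, 128, 256])
F2 = dfa(x, scales)
alpha = np.polyfit(np.log(scales), np.log(F2), 1)[0]
```

For uncorrelated noise the fitted exponent should be close to 0.5; persistent (H > 0.5) or anti-persistent (H < 0.5) monofractal series shift it accordingly.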
3. Machine Learning Classification Approaches
3.1. Support Vector Machines
3.2. Convolutional Neural Network
- Convolutional layer. A convolutional layer is the main building block of a CNN. It includes a set of filters, or kernels, whose parameters learn local informative features of the input, controlled by the padding and the stride. The padding handles the edges of the input, while the stride determines how many pixels the filter moves at each step of the convolution. The kernel, a matrix of weights, is applied to local regions of the input to generate output features, multiplying and summing values at each position of the sliding window. Mathematically, the latent output representation of the $s$-th feature map of the current layer $l$, denoted as $h_s^{(l)}$, is as follows:
  $$h_s^{(l)} = f\!\left( \sum_{r} h_r^{(l-1)} * W_{rs}^{(l)} + b_s^{(l)} \right),$$
  where $h_r^{(l-1)}$ is the $r$-th feature map of the previous layer, $W_{rs}^{(l)}$ and $b_s^{(l)}$ are the kernel weights and bias, $*$ denotes the convolution operation, and $f(\cdot)$ is the activation function (e.g., ReLU).
- Pooling layer. This layer reduces the resolution of the feature map in each channel through a pooling operator, retaining the most relevant spatial features of the current convolutional layer. Common types of pooling layers include max pooling, average pooling, and global pooling. The mathematical expression corresponding to this layer (for max pooling) is as follows:
  $$p_s^{(l)}(i) = \max_{j \in \mathcal{R}_i} h_s^{(l)}(j),$$
  where $\mathcal{R}_i$ denotes the $i$-th pooling region of the $s$-th feature map.
- Fully connected layers. These consist of a block of fully connected layers trained on the previously extracted features. At this stage, an optimization problem built around a cost function guides the search for the optimal configuration of the model parameters. The cost function quantifies the error between the model prediction $\hat{y}$ and the true label $y$. To illustrate, consider the cross-entropy cost function. The associated minimization problem is defined as follows:
  $$\min_{\theta} \; -\frac{1}{N} \sum_{i=1}^{N} \sum_{c=1}^{C} y_{ic} \log \hat{y}_{ic}(\theta),$$
  where $N$ is the number of training samples, $C$ the number of classes, and $\theta$ the model parameters.
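To make the three building blocks concrete, the following sketch implements a single-channel, 1-D version of each operation in plain NumPy; the kernel values, window sizes, and inputs are illustrative, not the paper's architecture:

```python
import numpy as np

def conv1d(x, w, b=0.0):
    """Valid cross-correlation: slide the kernel w over x, multiply and sum."""
    n, k = len(x), len(w)
    return np.array([np.dot(x[i:i + k], w) for i in range(n - k + 1)]) + b

def relu(z):
    """ReLU activation applied to the convolution output."""
    return np.maximum(z, 0.0)

def max_pool(z, size):
    """Max pooling: keep the largest value in each non-overlapping window."""
    m = len(z) // size
    return z[:m * size].reshape(m, size).max(axis=1)

def cross_entropy(p, y):
    """Cross-entropy between predicted class probabilities p and one-hot label y."""
    return -float(np.sum(y * np.log(p)))

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
h = relu(conv1d(x, np.array([1.0, 1.0])))    # feature map of the convolutional layer
p = max_pool(h, 2)                           # reduced-resolution feature map
loss = cross_entropy(np.array([0.5, 0.5]), np.array([1.0, 0.0]))
```

In a CNN-SVM scheme such as the one compared here, the feature maps produced by the stacked convolution/pooling stages are flattened and passed to an SVM in place of (or after) the fully connected block.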
4. Materials and Methods
4.1. Monofractal Time Series
Monofractal Synthetic Data
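This excerpt does not show the paper's synthesis procedure (refs. [147,148] point to wavelet-based fBm generators); as one standard alternative, the circulant-embedding (Davies–Harte) method generates exact fractional Gaussian noise with a prescribed Hurst exponent $H$ — a hedged sketch, not the authors' implementation:

```python
import numpy as np

def fgn(n, H, rng):
    """Fractional Gaussian noise of length n with Hurst exponent H,
    via circulant embedding of the fGn autocovariance (Davies-Harte)."""
    k = np.arange(n + 1)
    # autocovariance gamma(k) of unit-variance fGn
    gamma = 0.5 * (np.abs(k + 1) ** (2 * H) - 2 * np.abs(k) ** (2 * H)
                   + np.abs(k - 1) ** (2 * H))
    # first row of the circulant embedding matrix (length 2n)
    c = np.concatenate([gamma, gamma[-2:0:-1]])
    lam = np.clip(np.fft.fft(c).real, 0, None)      # nonnegative eigenvalues
    z = rng.standard_normal(2 * n) + 1j * rng.standard_normal(2 * n)
    y = np.fft.ifft(np.sqrt(lam) * z)
    return np.sqrt(2 * n) * y.real[:n]              # keep the first n samples

rng = np.random.default_rng(1)
noise = fgn(1024, H=0.8, rng=rng)   # persistent monofractal noise
fbm = np.cumsum(noise)              # integrate to obtain fractional Brownian motion
```

Varying $H$ over a grid and truncating to the desired lengths (e.g., 128 to 1024 samples) yields labeled monofractal series of the kind classified in this study.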
4.2. Environment
4.3. Performance Metrics
Algorithm 1 Monofractal synthetic signal classification process
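The one-vs-rest metrics reported in the results (Acc, PPV, Sen, Spe) follow directly from the TP/FP/TN/FN counts; a minimal sketch, where the function name and toy labels are illustrative:

```python
import numpy as np

def class_metrics(y_true, y_pred, cls):
    """Accuracy, PPV (precision), sensitivity, and specificity for one class
    against the rest, computed from the confusion-matrix counts."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_pred == cls) & (y_true == cls))
    fp = np.sum((y_pred == cls) & (y_true != cls))
    fn = np.sum((y_pred != cls) & (y_true == cls))
    tn = np.sum((y_pred != cls) & (y_true != cls))
    acc = (tp + tn) / (tp + tn + fp + fn)
    ppv = tp / (tp + fp) if tp + fp else 0.0
    sen = tp / (tp + fn) if tp + fn else 0.0
    spe = tn / (tn + fp) if tn + fp else 0.0
    return acc, ppv, sen, spe

# toy example: three H-index classes, six classified signals
y_true = [0, 0, 1, 1, 2, 2]
y_pred = [0, 1, 1, 1, 2, 0]
acc, ppv, sen, spe = class_metrics(y_true, y_pred, cls=0)
```

Averaging these per-class values over all H-index classes gives the overall figures shown in the tables below.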
5. Results and Discussion
5.1. Performance of DFA on Monofractal Synthetic Data
5.2. Performance of CNN-SVM on Monofractal Synthetic Data
5.2.1. Training of the Deep Learning Model
5.2.2. Analysis of Classification Metrics of Monofractal Synthetic Data
- The limited amount of monofractal data may constrain the training and generalization capabilities of the DFA and CNN-SVM models. Likewise, because of their short length, the series are more susceptible to noise, which poses an additional challenge for capturing complex temporal patterns. Another inherent disadvantage of these classifiers is the processing time, especially as the complexity of the time series increases.
- Our research focuses on the classification of short-length synthetic data, which may restrict the model's ability to generalize to prediction tasks that require longer series. In addition, monofractal analysis may not be suitable for the many application problems that demand multifractal models. This implies that the data may not fully reflect the complexity of the task, which can skew the model's performance.
- This study compares the DFA and CNN-SVM schemes; nevertheless, it would be desirable to extend the evaluation of the classification system to other advanced machine learning models currently available.
- In future work, we intend to connect this research with practical applications. For example, building on synthetic monofractal signals, deep learning techniques or machine learning algorithms could detect types of heart disease and categorize their severity. Such an implementation will serve as the basis for analyzing more complex models, which could improve the classification rate through careful parameter tuning.
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
CNN | Convolutional Neural Network |
SVM | Support Vector Machines |
ReLU | Rectified Linear Unit |
PPV | Positive Predictive Value |
TP | True Positive |
FP | False Positive |
TN | True Negative |
FN | False Negative |
MFDFA | Multifractal Detrended Fluctuation Analysis |
DFA | Detrended Fluctuation Analysis |
WT | Wavelet Transform |
LSTM | Long Short-term Memory |
RNNs | Recurrent Neural Networks |
seq2seq | Sequence-to-sequence |
References
- Mandelbrot, B.B. Intermittent turbulence in self-similar cascades: Divergence of high moments and dimension of the carrier. In Multifractals and 1/f Noise: Wild Self-Affinity in Physics (1963–1976); Springer: New York, NY, USA, 1999; pp. 317–357. [Google Scholar] [CrossRef]
- Bunde, A.; Havlin, S. Fractals in Science; Springer: Berlin/Heidelberg, Germany, 2013. [Google Scholar]
- Ivanov, P.C.; Amaral, L.A.N.; Goldberger, A.L.; Havlin, S.; Rosenblum, M.G.; Struzik, Z.R.; Stanley, H.E. Multifractality in human heartbeat dynamics. Nature 1999, 399, 461–465. [Google Scholar] [CrossRef]
- Kantelhardt, J.W. Fractal and multifractal time series. arXiv 2008, arXiv:0804.0747. [Google Scholar]
- Peng, C.K.; Buldyrev, S.V.; Havlin, S.; Simons, M.; Stanley, H.E.; Goldberger, A.L. Mosaic organization of DNA nucleotides. Phys. Rev. E 1994, 49, 1685–1689. [Google Scholar] [CrossRef] [PubMed]
- Kantelhardt, J.W.; Zschiegner, S.A.; Koscielny-Bunde, E.; Havlin, S.; Bunde, A.; Stanley, H. Multifractal detrended fluctuation analysis of nonstationary time series. Phys. Stat. Mech. Its Appl. 2002, 316, 87–114. [Google Scholar] [CrossRef]
- Gu, G.F.; Zhou, W.X. Detrended fluctuation analysis for fractals and multifractals in higher dimensions. Phys. Rev. E 2006, 74, 061104. [Google Scholar] [CrossRef]
- Podobnik, B.; Stanley, H.E. Detrended cross-correlation analysis: A new method for analyzing two nonstationary time series. Phys. Rev. Lett. 2008, 100, 084102. [Google Scholar] [CrossRef] [PubMed]
- Podobnik, B.; Horvatic, D.; Petersen, A.M.; Stanley, H.E. Cross-correlations between volume change and price change. Proc. Natl. Acad. Sci. USA 2009, 106, 22079–22084. [Google Scholar] [CrossRef]
- Zhou, W.X. Multifractal detrended cross-correlation analysis for two nonstationary signals. Phys. Rev. E 2008, 77, 066211. [Google Scholar] [CrossRef]
- Oświecimka, P.; Kwapień, J.; Drożdż, S. Wavelet versus detrended fluctuation analysis of multifractal structures. Phys. Rev. E 2006, 74, 016103. [Google Scholar] [CrossRef]
- Kantelhardt, J.W.; Rybski, D.; Zschiegner, S.A.; Braun, P.; Koscielny-Bunde, E.; Livina, V.; Havlin, S.; Bunde, A. Multifractality of river runoff and precipitation: Comparison of fluctuation analysis and wavelet methods. Phys. Stat. Mech. Its Appl. 2003, 330, 240–245. [Google Scholar] [CrossRef]
- Kavasseri, R.G.; Nagarajan, R. A multifractal description of wind speed records. Chaos Solitons Fractals 2005, 24, 165–173. [Google Scholar] [CrossRef]
- Makowiec, D.; Gała, R.; Dudkowska, A.; Rynkiewicz, A.; Zwierz, M. Long-range dependencies in heart rate signals—Revisited. Phys. Stat. Mech. Its Appl. 2006, 369, 632–644. [Google Scholar] [CrossRef]
- Dutta, S.; Ghosh, D.; Samanta, S.; Dey, S. Multifractal parameters as an indication of different physiological and pathological states of the human brain. Phys. Stat. Mech. Its Appl. 2014, 396, 155–163. [Google Scholar] [CrossRef]
- Gospodinova, E.; Lebamovski, P.; Georgieva-Tsaneva, G.; Negreva, M. Evaluation of the Methods for Nonlinear Analysis of Heart Rate Variability. Fractal Fract. 2023, 7, 388. [Google Scholar] [CrossRef]
- Oświęcimka, P.; Kwapień, J.; Drożdż, S. Multifractality in the stock market: Price increments versus waiting times. Phys. Stat. Mech. Its Appl. 2005, 347, 626–638. [Google Scholar]
- Wang, L.; Lee, R.S. Stock Index Return Volatility Forecast via Excitatory and Inhibitory Neuronal Synapse Unit with Modified MF-ADCCA. Fractal Fract. 2023, 7, 292. [Google Scholar] [CrossRef]
- Wang, Y.; Wu, C.; Pan, Z. Multifractal detrending moving average analysis on the US Dollar exchange rates. Phys. Stat. Mech. Its Appl. 2011, 390, 3512–3523. [Google Scholar] [CrossRef]
- Oh, G.; Eom, C.; Havlin, S.; Jung, W.S.; Wang, F.; Stanley, H.E.; Kim, S. A multifractal analysis of Asian foreign exchange markets. Eur. Phys. J. B 2012, 85, 1–6. [Google Scholar] [CrossRef]
- Ebrahimi, F.; Setarehdan, S.K.; Ayala-Moyeda, J.; Nazeran, H. Automatic sleep staging using empirical mode decomposition, discrete wavelet transform, time-domain, and nonlinear dynamics features of heart rate variability signals. Comput. Methods Programs Biomed. 2013, 112, 47–57. [Google Scholar] [CrossRef]
- Gough, N.A.J. Fractal analysis of foetal heart rate variability. Physiol. Meas. 1993, 14, 309–315. [Google Scholar] [CrossRef]
- Shah, E.; Reddy, N.; Rothschild, B. Fractal analysis of acceleration signals from patients with CPPD, rheumatoid arthritis, and spondyloarthroparthy of the finger joint. Comput. Methods Programs Biomed. 2005, 77, 233–239. [Google Scholar] [CrossRef]
- Shiomi, T.; Guilleminault, C.; Sasanabe, R.; Hirota, I.; Maekawa, M.; Kobayashi, T. Augmented very low frequency component of heart rate variability during obstructive sleep apnea. Sleep 1996, 19, 370–377. [Google Scholar] [CrossRef]
- Hadase, M.; Azuma, A.; Zen, K.; Asada, S.; Kawasaki, T.; Kamitani, T.; Kawasaki, S.; Sugihara, H.; Matsubara, H. Very Low Frequency Power of Heart Rate Variability is a Powerful Predictor of Clinical Prognosis in Patients With Congestive Heart Failure. Circ. J. Off. J. Jpn. Circ. Soc. 2004, 68, 343–347. [Google Scholar] [CrossRef]
- Usui, H.; Nishida, Y. The very low-frequency band of heart rate variability represents the slow recovery component after a mental stress task. PLoS ONE 2017, 12, e0182611. [Google Scholar] [CrossRef]
- Koscielny-Bunde, E.; Bunde, A.; Havlin, S.; Goldreich, Y. Analysis of daily temperature fluctuations. Phys. Stat. Mech. Its Appl. 1996, 231, 393–396. [Google Scholar] [CrossRef]
- Otsuka, K.; Nakajima, M.S.Y.K.; Yamanaka, T. Heart rate variability including 1/f fluctuations versus conventional autonomic functions. J. Ambul. Monit. 1995, 8, 91–100. [Google Scholar]
- López, J.L.; Contreras, J.G. Performance of multifractal detrended fluctuation analysis on short time series. Phys. Rev. E 2013, 87, 022918. [Google Scholar] [CrossRef]
- Deng, J.; Jirutitijaroen, P. Short-term load forecasting using time series analysis: A case study for Singapore. In Proceedings of the 2010 IEEE Conference on Cybernetics and Intelligent Systems, Singapore, 28–30 June 2010; pp. 231–236. [Google Scholar]
- Braei, M.; Wagner, S. Anomaly detection in univariate time-series: A survey on the state-of-the-art. arXiv 2020, arXiv:2004.00433. [Google Scholar]
- Gajbhiye, S.; Meshram, C.; Mirabbasi, R.; Sharma, S. Trend analysis of rainfall time series for Sindh river basin in India. Theor. Appl. Climatol. 2016, 125, 593–608. [Google Scholar] [CrossRef]
- Wu, Y.; Shang, P.; Li, Y. Modified generalized multiscale sample entropy and surrogate data analysis for financial time series. Nonlinear Dyn. 2018, 92, 1335–1350. [Google Scholar] [CrossRef]
- Accardo, A.; Affinito, M.; Carrozzi, M.; Bouquet, F. Use of the fractal dimension for the analysis of electroencephalographic time series. Biol. Cybern. 1997, 77, 339–350. [Google Scholar] [PubMed]
- Dlask, M.; Kukal, J.; Poplová, M.; Sovka, P.; Cifra, M. Short-time fractal analysis of biological autoluminescence. PLoS ONE 2019, 14, e0214427. [Google Scholar] [CrossRef] [PubMed]
- Wei, X.; Zhang, L.; Yang, H.Q.; Zhang, L.; Yao, Y.P. Machine learning for pore-water pressure time-series prediction: Application of recurrent neural networks. Geosci. Front. 2021, 12, 453–467. [Google Scholar] [CrossRef]
- Chaurasia, V.; Pal, S. Application of machine learning time series analysis for prediction COVID-19 pandemic. Res. Biomed. Eng. 2020, 38, 35–47. [Google Scholar] [CrossRef]
- Delignieres, D.; Ramdani, S.; Lemoine, L.; Torre, K.; Fortes, M.; Ninot, G. Fractal analyses for short time series: A re-assessment of classical methods. J. Math. Psychol. 2006, 50, 525–544. [Google Scholar] [CrossRef]
- Li, R.; Wang, J.; Yu, H.; Deng, B.; Wei, X.; Chen, Y. Fractal analysis of the short time series in a visibility graph method. Phys. Stat. Mech. Its Appl. 2016, 450, 531–540. [Google Scholar] [CrossRef]
- Gao, J.; Hu, J.; Tung, W.W.; Zheng, Y. Multiscale analysis of economic time series by scale-dependent Lyapunov exponent. Quant. Financ. 2013, 13, 265–274. [Google Scholar] [CrossRef]
- Kleiger, R.E.; Stein, P.K.; Bosner, M.S.; Rottman, J.N. Time domain measurements of heart rate variability. Cardiol. Clin. 1992, 10, 487–498. [Google Scholar] [CrossRef]
- Look AHEAD Research Group. Long-term effects of a lifestyle intervention on weight and cardiovascular risk factors in individuals with type 2 diabetes mellitus: Four-year results of the Look AHEAD trial. Arch. Intern. Med. 2010, 170, 1566–1575. [Google Scholar]
- Sharma, S.; Guleria, K. Deep learning models for image classification: Comparison and applications. In Proceedings of the 2022 2nd International Conference on Advance Computing and Innovative Technologies in Engineering (ICACITE), Greater Noida, India, 28–29 April 2022; pp. 1733–1738. [Google Scholar]
- Ardeti, V.A.; Kolluru, V.R.; Varghese, G.T.; Patjoshi, R.K. An Overview on State-of-the-Art Electrocardiogram Signal Processing Methods: Traditional to AI-Based Approaches. Expert Syst. Appl. 2023, 217, 119561. [Google Scholar] [CrossRef]
- He, K.; Zhang, X.; Ren, S.; Sun, J. Deep residual learning for image recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 26 June–1 July 2016; pp. 770–778. [Google Scholar]
- Miotto, R.; Wang, F.; Wang, S.; Jiang, X.; Dudley, J.T. Deep learning for healthcare: Review, opportunities and challenges. Briefings Bioinform. 2018, 19, 1236–1246. [Google Scholar] [CrossRef] [PubMed]
- Gal, Y.; Ghahramani, Z. A theoretically grounded application of dropout in recurrent neural networks. Adv. Neural Inf. Process. Syst. 2016, 29. [Google Scholar] [CrossRef]
- Li, M.; Andersen, D.G.; Park, J.W.; Smola, A.J.; Ahmed, A.; Josifovski, V.; Long, J.; Shekita, E.J.; Su, B.Y. Scaling distributed machine learning with the parameter server. In Proceedings of the 11th USENIX Symposium on Operating Systems Design and Implementation (OSDI 14), Broomfield, CO, USA, 6–8 October 2014; pp. 583–598. [Google Scholar]
- Canizo, M.; Triguero, I.; Conde, A.; Onieva, E. Multi-head CNN–RNN for multi-time series anomaly detection: An industrial case study. Neurocomputing 2019, 363, 246–260. [Google Scholar] [CrossRef]
- Lutsiv, N.; Maksymyuk, T.; Beshley, M.; Lavriv, O.; Andrushchak, V.; Sachenko, A.; Vokorokos, L.; Gazda, J. Deep Semisupervised Learning-Based Network Anomaly Detection in Heterogeneous Information Systems. Comput. Mater. Contin. 2022, 70, 413–431. [Google Scholar] [CrossRef]
- Hu, M.; Ji, Z.; Yan, K.; Guo, Y.; Feng, X.; Gong, J.; Zhao, X.; Dong, L. Detecting anomalies in time series data via a meta-feature based approach. IEEE Access 2018, 6, 27760–27776. [Google Scholar] [CrossRef]
- Demertzis, K.; Iliadis, L.; Tziritas, N.; Kikiras, P. Anomaly detection via blockchained deep learning smart contracts in industry 4.0. Neural Comput. Appl. 2020, 32, 17361–17378. [Google Scholar] [CrossRef]
- Uribarri, G.; Mindlin, G.B. Dynamical time series embeddings in recurrent neural networks. Chaos Solitons Fractals 2022, 154, 111612. [Google Scholar] [CrossRef]
- Seabe, P.L.; Moutsinga, C.R.B.; Pindza, E. Forecasting cryptocurrency prices using LSTM, GRU, and bi-directional LSTM: A deep learning approach. Fractal Fract. 2023, 7, 203. [Google Scholar] [CrossRef]
- Lim, B.; Zohren, S. Time-series forecasting with deep learning: A survey. Philos. Trans. R. Soc. A 2021, 379, 20200209. [Google Scholar] [CrossRef]
- Ismail Fawaz, H.; Forestier, G.; Weber, J.; Idoumghar, L.; Muller, P.A. Deep learning for time series classification: A review. Data Min. Knowl. Discov. 2019, 33, 917–963. [Google Scholar] [CrossRef]
- Chalapathy, R.; Chawla, S. Deep learning for anomaly detection: A survey. arXiv 2019, arXiv:1901.03407. [Google Scholar]
- Wang, Z.; Yan, W.; Oates, T. Time series classification from scratch with deep neural networks: A strong baseline. In Proceedings of the 2017 International Joint Conference on Neural Networks (IJCNN), Anchorage, AK, USA, 14–19 May 2017; pp. 1578–1585. [Google Scholar]
- Cui, Z.; Chen, W.; Chen, Y. Multi-scale convolutional neural networks for time series classification. arXiv 2016, arXiv:1603.06995. [Google Scholar]
- Sutskever, I.; Vinyals, O.; Le, Q.V. Sequence to sequence learning with neural networks. Adv. Neural Inf. Process. Syst. 2014, 27. [Google Scholar]
- Gers, F.A.; Schmidhuber, J.; Cummins, F. Learning to forget: Continual prediction with LSTM. Neural Comput. 2000, 12, 2451–2471. [Google Scholar] [CrossRef] [PubMed]
- Wang, S.; Xiang, J.; Zhong, Y.; Zhou, Y. Convolutional neural network-based hidden Markov models for rolling element bearing fault identification. Knowl. Based Syst. 2018, 144, 65–76. [Google Scholar] [CrossRef]
- Esmael, B.; Arnaout, A.; Fruhwirth, R.K.; Thonhauser, G. Improving time series classification using Hidden Markov Models. In Proceedings of the 2012 12th International Conference on Hybrid Intelligent Systems (HIS), Pune, India, 4–7 December 2012; pp. 502–507. [Google Scholar]
- Hu, K.; Ivanov, P.C.; Chen, Z.; Carpena, P.; Eugene Stanley, H. Effect of trends on detrended fluctuation analysis. Phys. Rev. E 2001, 64, 011114. [Google Scholar] [CrossRef] [PubMed]
- Cortes, C.; Vapnik, V. Support-vector networks. Mach. Learn. 1995, 20, 273–297. [Google Scholar] [CrossRef]
- Lorena, A.C.; De Carvalho, A.C.; Gama, J.M. A review on the combination of binary classifiers in multiclass problems. Artif. Intell. Rev. 2008, 30, 19–37. [Google Scholar] [CrossRef]
- Hsu, C.W.; Lin, C.J. A comparison of methods for multiclass support vector machines. IEEE Trans. Neural Netw. 2002, 13, 415–425. [Google Scholar]
- Jing, C.; Hou, J. SVM and PCA based fault classification approaches for complicated industrial process. Neurocomputing 2015, 167, 636–642. [Google Scholar] [CrossRef]
- Khanday, N.Y.; Sofi, S.A. Deep insight: Convolutional neural network and its applications for COVID-19 prognosis. Biomed. Signal Process. Control 2021, 69, 102814. [Google Scholar] [CrossRef] [PubMed]
- Ribeiro, M.; Lazzaretti, A.E.; Lopes, H.S. A study of deep convolutional auto-encoders for anomaly detection in videos. Pattern Recognit. Lett. 2018, 105, 13–22. [Google Scholar] [CrossRef]
- López, J.L.; Vásquez-Coronel, J.A. Congestive Heart Failure Category Classification Using Neural Networks in Short-Term Series. Appl. Sci. 2023, 13, 13211. [Google Scholar] [CrossRef]
- Lopes, R.; Betrouni, N. Fractal and multifractal analysis: A review. Med. Image Anal. 2009, 13, 634–649. [Google Scholar] [CrossRef] [PubMed]
- Shi, W.; Shang, P.; Wang, J.; Lin, A. Multiscale multifractal detrended cross-correlation analysis of financial time series. Phys. Stat. Mech. Its Appl. 2014, 403, 35–44. [Google Scholar] [CrossRef]
- Xu, Y.; Qian, C.; Pan, L.; Wang, B.; Lou, C. Comparing monofractal and multifractal analysis of corrosion damage evolution in reinforcing bars. PLoS ONE 2012, 7, e29956. [Google Scholar] [CrossRef]
- Pastén, D.; Muñoz, V.; Cisternas, A.; Rogan, J.; Valdivia, J.A. Monofractal and multifractal analysis of the spatial distribution of earthquakes in the central zone of Chile. Phys. Rev. E 2011, 84, 066123. [Google Scholar] [CrossRef]
- Saâdaoui, F. Skewed multifractal scaling of stock markets during the COVID-19 pandemic. Chaos Solitons Fractals 2023, 170, 113372. [Google Scholar] [CrossRef]
- Huang, Z.W.; Liu, C.Q.; Shi, K.; Zhang, B. Monofractal and multifractal scaling analysis of pH time series from Dongting lake inlet and outlet. Fractals 2010, 18, 309–317. [Google Scholar] [CrossRef]
- Shi, K.; Liu, C.Q.; Ai, N.S. Monofractal and multifractal approaches in investigating temporal variation of air pollution indexes. Fractals 2009, 17, 513–521. [Google Scholar] [CrossRef]
- Curto-Risso, P.; Medina, A.; Hernández, A.C.; Guzman-Vargas, L.; Angulo-Brown, F. Monofractal and multifractal analysis of simulated heat release fluctuations in a spark ignition heat engine. Phys. Stat. Mech. Its Appl. 2010, 389, 5662–5670. [Google Scholar] [CrossRef]
- Lopez, J.L.; Hernández, S.; Urrutia, A.; López-Cortés, X.A.; Araya, H.; Morales-Salinas, L. Effect of missing data on short time series and their application in the characterization of surface temperature by detrended fluctuation analysis. Comput. Geosci. 2021, 153, 104794. [Google Scholar] [CrossRef]
- Abry, P.; Sellan, F. The wavelet-based synthesis for fractional Brownian motion proposed by F. Sellan and Y. Meyer: Remarks and fast implementation. Appl. Comput. Harmon. Anal. 1996, 3, 377–383. [Google Scholar] [CrossRef]
- Bardet, J.M.; Lang, G.; Oppenheim, G.; Philippe, A.; Taqqu, M.S. Generators of long-range dependent processes: A survey. Theory Appl. Long-Range Depend. 2003, 1, 579–623. [Google Scholar]
H-Index | Acc (%), Length 128 | Acc (%), Length 256 | Acc (%), Length 512 | Acc (%), Length 1024
---|---|---|---|---
 | 28.2 | 33.1 | 36.8 | 44.3
 | 25.2 | 52.6 | 68.2 | 80.2
 | 20.2 | 49.2 | 64.8 | 81.0
 | 17.2 | 39.4 | 56.2 | 74.1
 | 14.0 | 33.9 | 50.9 | 67.4
 | 12.2 | 30.3 | 44.7 | 61.2
 | 10.8 | 25.6 | 41.2 | 56.3
 | 9.9 | 25.8 | 40.1 | 56.0
 | 8.9 | 24.5 | 35.8 | 52.2
Depth Level | Layer Type | Filter Size | Stride | Padding | Activation
---|---|---|---|---|---
1 | Convolutional |  | 1 | same | ReLU
 | Max Pooling |  | 2 | — | —
2 | Convolutional |  | 1 | same | ReLU
 | Max Pooling |  | 2 | — | —
3 | Convolutional |  | 1 | same | ReLU
 | Max Pooling |  | 2 | — | —
4 | Convolutional |  | 1 | same | ReLU
 | Max Pooling |  | 2 | — | —
5 | Convolutional |  | 1 | same | ReLU
 | Max Pooling |  | 2 | — | —
6 | Convolutional |  | 1 | same | ReLU
 | Max Pooling |  | 2 | — | —
7 | Convolutional |  | 1 | same | ReLU
Dataset | Signal Length | CNN-SVM3 Acc (%) | CNN-SVM4 Acc (%) | CNN-SVM5 Acc (%) | CNN-SVM6 Acc (%) | CNN-SVM7 Acc (%)
---|---|---|---|---|---|---
Short-time synthetic signals | 128 | 64.00 | 67.06 | 67.89 | 68.12 | 68.01
 | 256 | 73.64 | 78.67 | 81.54 | 81.88 | 81.67
 | 512 | 90.31 | 90.48 | 91.60 | 92.35 | 92.44
 | 1024 | 97.66 | 97.98 | 97.83 | 97.85 | 98.22
H-Index | Acc (128) | PPV (128) | Sen (128) | Spe (128) | Acc (256) | PPV (256) | Sen (256) | Spe (256) | Acc (512) | PPV (512) | Sen (512) | Spe (512) | Acc (1024) | PPV (1024) | Sen (1024) | Spe (1024)
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
 | 95.2 | 77.2 | 80.0 | 97.1 | 97.2 | 87.5 | 87.6 | 98.4 | 98.9 | 94.8 | 95.2 | 99.3 | 99.7 | 98.4 | 98.6 | 99.8
 | 91.1 | 59.9 | 59.6 | 95.0 | 94.7 | 75.9 | 76.5 | 96.9 | 97.9 | 90.5 | 90.8 | 98.8 | 99.4 | 97.5 | 97.1 | 99.7
 | 91.3 | 60.4 | 62.0 | 94.9 | 94.9 | 76.7 | 78.1 | 97.0 | 97.9 | 91.0 | 90.7 | 98.9 | 99.4 | 97.5 | 97.4 | 99.7
 | 91.0 | 59.6 | 59.7 | 94.9 | 94.7 | 76.3 | 75.9 | 97.1 | 97.8 | 89.8 | 90.2 | 98.7 | 99.4 | 97.1 | 97.3 | 99.6
 | 90.7 | 58.6 | 57.1 | 94.9 | 94.6 | 75.9 | 75.3 | 97.0 | 97.6 | 89.6 | 89.0 | 98.7 | 99.4 | 97.1 | 97.1 | 99.6
 | 91.2 | 61.1 | 58.5 | 95.3 | 95.1 | 78.1 | 77.2 | 97.3 | 97.7 | 89.8 | 89.7 | 98.7 | 99.4 | 97.4 | 97.2 | 99.7
 | 92.8 | 67.2 | 67.8 | 95.9 | 96.2 | 82.9 | 82.8 | 97.9 | 98.3 | 92.6 | 92.4 | 99.1 | 99.6 | 98.0 | 98.2 | 99.8
 | 94.9 | 77.4 | 77.3 | 97.2 | 97.4 | 88.4 | 88.5 | 98.6 | 98.9 | 95.0 | 95.4 | 99.4 | 99.7 | 98.5 | 98.6 | 99.8
 | 98.0 | 90.8 | 91.3 | 98.9 | 98.9 | 95.2 | 95.1 | 99.4 | 99.5 | 97.9 | 97.8 | 99.7 | 99.8 | 99.2 | 99.1 | 99.9
H-Index | Acc (128) | PPV (128) | Sen (128) | Spe (128) | Acc (256) | PPV (256) | Sen (256) | Spe (256) | Acc (512) | PPV (512) | Sen (512) | Spe (512) | Acc (1024) | PPV (1024) | Sen (1024) | Spe (1024)
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---
 | 95.1 | 77.1 | 79.9 | 97.0 | 97.3 | 87.6 | 88.0 | 98.4 | 98.9 | 94.9 | 95.2 | 99.4 | 99.7 | 98.6 | 99.1 | 99.8
 | 91.1 | 59.9 | 58.9 | 95.1 | 94.8 | 76.4 | 76.3 | 97.1 | 97.9 | 90.9 | 90.9 | 98.9 | 99.5 | 98.1 | 97.7 | 99.8
 | 91.3 | 60.2 | 62.8 | 94.8 | 94.9 | 76.4 | 78.5 | 96.9 | 98.1 | 91.2 | 91.4 | 98.9 | 99.6 | 98.1 | 98.1 | 99.8
 | 91.1 | 59.9 | 58.6 | 95.1 | 94.7 | 76.1 | 75.8 | 97.0 | 97.8 | 90.1 | 90.2 | 98.8 | 99.5 | 97.7 | 97.9 | 99.7
 | 90.7 | 58.4 | 58.1 | 94.8 | 94.5 | 75.6 | 74.9 | 96.9 | 97.6 | 89.2 | 89.2 | 98.7 | 99.5 | 97.8 | 97.6 | 99.7
 | 91.3 | 61.4 | 58.5 | 95.4 | 94.9 | 77.6 | 76.9 | 97.2 | 97.8 | 90.0 | 89.9 | 98.8 | 99.5 | 97.7 | 97.9 | 99.7
 | 92.8 | 67.4 | 67.9 | 95.9 | 96.1 | 82.7 | 81.9 | 97.9 | 98.4 | 93.3 | 92.4 | 99.2 | 99.6 | 98.4 | 98.3 | 99.8
 | 94.8 | 76.9 | 76.6 | 97.1 | 97.3 | 88.1 | 87.6 | 98.5 | 98.9 | 94.9 | 94.9 | 99.4 | 99.7 | 98.6 | 98.3 | 99.8
 | 97.9 | 90.0 | 90.8 | 98.8 | 98.9 | 94.7 | 95.1 | 99.3 | 99.5 | 97.4 | 97.7 | 99.7 | 99.8 | 98.9 | 99.1 | 99.9
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
López, J.L.; Vásquez-Coronel, J.A. Analyzing Monofractal Short and Very Short Time Series: A Comparison of Detrended Fluctuation Analysis and Convolutional Neural Networks as Classifiers. Fractal Fract. 2024, 8, 460. https://doi.org/10.3390/fractalfract8080460