Article

An Ant-Lion Optimizer-Trained Artificial Neural Network System for Chaotic Electroencephalogram (EEG) Prediction

Department of Computer Engineering, Suleyman Demirel University, Isparta 32260, Turkey
Appl. Sci. 2018, 8(9), 1613; https://doi.org/10.3390/app8091613
Submission received: 25 July 2018 / Revised: 1 September 2018 / Accepted: 4 September 2018 / Published: 11 September 2018

Abstract
The prediction of future events based on available time series measurements is a relevant research area, specifically for healthcare applications such as prognostics and the assessment of interventions. Electroencephalogram time series, a measure of brain dynamics, are routinely analyzed to obtain information about current, as well as future, mental states, and to detect and diagnose diseases or environmental factors. Due to their chaotic nature, electroencephalogram time series require specialized techniques for effective prediction. The objective of this study was to introduce a hybrid system developed with artificial intelligence techniques to deal with electroencephalogram time series. Both artificial neural networks and the ant-lion optimizer, a recent intelligent optimization technique, were employed to build the related system and perform prediction applications over electroencephalogram time series. According to the obtained findings, the system can successfully predict the future states of the target time series, and it even outperforms some other hybrid artificial neural network-based systems and alternative time series prediction approaches from the literature.

Graphical Abstract

1. Introduction

In mathematics and statistics, there are different types of problems that can be solved via artificial intelligence-based approaches. One remarkable research issue in this regard is the analysis of time series; collections of data that flow over a period of time often attract researchers, as it is possible to derive a great deal of information from them. In the literature, the term "time series" is defined as the flow of data saved over a previous time period, which can be a week, month, or even a year [1,2,3,4]. The use of time series can make it easier to understand many aspects of the subject or event from which the data were obtained. For this reason, time series analysis has become a remarkable application area, and the popularity of applications for predicting the future flow of time series has grown steadily. Such prediction involves analyzing the previous states of a time series to obtain ideas about its flow in the future, and it benefits from the idea that the past patterns of a time series will also be observed in the future [1].
Time series prediction relates to events and problems within almost all fields of modern life. Future states of financial values, sales, production, or even the behavior of natural dynamics can be predicted thanks to different approaches applied to a time series. In this way, a scientist, researcher, or an expert in a certain field can form an idea about future states and, finally, reach decisions that can be useful for new applications or problem solutions. In general, the prediction of time series has become an important application as a result of the unstoppable, chaotic changes of real life and the 'uncertainty' that arises from them. In particular, chaotic time series, whose complex flows make accurate analysis difficult, have a remarkable place in real-life applications.
The prediction of chaotic time series is a research interest that is followed widely by the scientific community. If the background of time series prediction is examined in detail, it can be seen that previously introduced traditional approaches or methods sometimes failed, especially for chaotic time series, which are more complex and dynamic [5]. Due to this, there have been remarkable efforts to employ alternative solutions to overcome the prediction problem for difficult or challenging time series. It was also noticed that artificial intelligence can overcome the related issues and, furthermore, brings the advantages of 'intelligent mechanisms' to solving real-world problems. In time, this field has become a popular application area whenever time series prediction is important for an encountered problem. Today, the development of artificial intelligence-based hybrid systems is one of the most attractive solutions and, because of this, it is possible to see many examples of time series prediction applications performed via hybrid systems structured with more than one method or technique.
Based on the explanations so far, this study aims to employ an alternative artificial intelligence-based hybrid system to predict chaotic time series and, because of its vital importance for healthcare, to deal with electroencephalogram (EEG) time series. The study briefly focuses on designing an approach that combines artificial neural networks (ANN) and the ant-lion optimizer (ALO), a recent optimization technique, in order to realize the expressed objectives. As an alternative method of training the artificial neural network, the ant-lion optimizer is a recently introduced algorithm inspired by the hunting behavior of ant-lions in their larval form [6,7,8,9]. The study yields several outputs, such as an evaluation of the potential of predicting EEG with artificial intelligence-based systems, and the success rate of an ANN-ALO (ALO-trained ANN) system on this problem. Furthermore, the practical value of the ANN-ALO system is analyzed thanks to recent evaluations by physicians. The motivation of this study and its main contributions are as follows:
- The study deals with the remarkable, widely followed research interest of time series prediction and focuses especially on 'chaotic' series.
- It also considers the healthcare field and tries to predict electroencephalogram (EEG) time series, which are important medical data for analyzing brain activity and obtaining information about possible diseases (like depression, autism, epilepsy, and even Alzheimer's disease) or environmental factors that interact with the brain.
- While performing predictions over EEG time series, the study contributes to the associated literature by introducing the ANN-ALO hybrid system, which has not been employed for this purpose before.
- As EEG is the target data to be predicted, the study aims to predict an advanced time series with chaotic characteristics and generally shows that hybrid artificial intelligence techniques are effective in predicting time series, even chaotic ones. In this way, readers can appreciate the practical and effective sides of artificial intelligence techniques compared to traditional approaches.
- Finally, the study shows that healthcare can benefit from wider use of artificial intelligence. The issue considered here, 'prediction' over data flows, is one of the key elements for healthcare, especially in terms of early diagnosis. In this context, the study demonstrates the advantages of the introduced approach through evaluations that include both comparative analyses and the experiences of experts (physicians).

2. Background

Apart from other kinds of analysis performed on time series, the prediction of future states has always been a remarkable issue because of the advantages of 'seeing the future' and 'acting accordingly'. Moving on from that fact, it is possible to indicate that the prediction of time series is a vital approach, especially in healthcare. On the other hand, there are many other fields that need predictions over data gathered in the form of time series. Chaotic time series have a remarkable value here because of their more complex structures. In the past, the disadvantages of traditional approaches led researchers to search for alternative solutions, and artificial intelligence has become an important means of achieving better results in the prediction of time series, including chaotic ones. The related literature has been strongly shaped by this trend.

2.1. A Brief Review of the Literature

Extending a recent literature review by the author and his colleague [10], some of the background on time series prediction (mostly on chaotic series) can be explained as follows:
Considering the technique of artificial neural networks (ANN), Gan et al. and Wong et al. both introduced prediction-oriented research studies [11,12]. In terms of hybrid approaches, Gentili et al. employed a nonlinear local predictor, the feed forward artificial neural network, and fuzzy logic to realize predictions on time series of aperiodic hydrodynamic oscillations [13]. Predicting multivariate chaotic time series is a challenging research issue; at this point, Chen and Han performed a study using the radial basis function (RBF) to deal with the related research issues [14]. In the literature, some well-known chaotic time series have been used to evaluate different approaches. In a remarkable example, Wu et al. predicted the Mackey–Glass and gas furnace (Box–Jenkins) time series thanks to the iterated extended Kalman filter with the single multiplicative neuron model [15]. In addition, they also performed predictions over EEG time series. Many alternative studies based on the single multiplicative neuron (SMN) for time series prediction can be found; for example, Yadav et al. used this technique in their study published in 2007 [16]. Additionally, Zhao and Yang performed a study similar to [14], but they used the SMN-PSO (particle swarm optimization trained SMN) approach to predict the Mackey–Glass, gas furnace (Box–Jenkins), and EEG time series [17]. Yao and Liu benefited from a fuzzy logic (FL)-based approach to deal with prediction issues regarding atmospheric visibility in the city of Shanghai, China [18]. Particle swarm optimization (PSO), a widely used swarm intelligence technique, was used by Unler to perform optimization-based time series prediction applications [19]. In another study, Porto et al. used PSO in order to obtain an optimum ensemble of extreme learning machines (ELM) for time series prediction [20].
Ant colony optimization (ACO), a combinatorial optimization algorithm that is widely known in the literature, has become another effective prediction method for chaotic flows; some examples are shown in [5,21,22,23,24]. Yeh used a model which provides a parameter-free simplified swarm optimization for the training of artificial neural networks (ANN) [25]. In that study, the model was run over different time series, including EEG. In a remarkable study, Nourani and Andalib employed a wavelet least square support vector machine (WLSSVM) system to predict hydrological time series [26]. Focusing on the issue of predicting the (monthly) suspended sediment load (SSL) for the Aji-Chay River, the authors performed a comparison-based prediction approach which analyzed approaches such as an ad-hoc least square support vector machine (LSSVM) and artificial neural network (ANN) models together. Regarding the sub-field of machine learning, Bontempi et al. produced a book chapter reviewing recent approaches for predicting time series via specific machine learning techniques [27]. As it focuses on the prediction of general time series, that study differs from the alternatives that include only chaotic time series; it is an important background study due to its publication year and wide scope.
The literature also includes some studies in which the authors employed hybrid systems structured via optimization algorithms and SVM to overcome the time series prediction problem. For example, Hu and Zhang employed support vector machines (SVM) and the chaotic simulated annealing algorithm (CSAA) to predict time series [28]. On the other hand, Liu and Yao employed a hybrid system including PSO and the least square SVM to perform prediction processes [29]. Readers are also referred to [30,31,32,33,34,35,36] to get a better idea about using SVM in prediction problems. Ren et al. carried out a study on the prediction of short-term traffic flow with an artificial intelligence-based approach; in this context, they used a prediction approach including the back-propagation neural network–niche genetic algorithm (NGA) [37]. In another study on predicting traffic flow, Ding et al. predicted the 'lane-change trajectory by drivers' in urban traffic flow [38]. In the context of predicting urban traffic flow, Yin et al. also developed an alternative approach and introduced the fuzzy-neural model (FNM) for predicting traffic flow in urban street networks [39]. Dunne and Ghosh presented an alternative approach to predict traffic flow (hourly) by taking the effects of rainfall into consideration [40]. In detail, they employed a neuro-wavelet model including the stationary wavelet transform (SWT). As a three-technique hybrid system, Pulido et al. employed ensemble neural networks (ENN), fuzzy logic (Type 1–Type 2), and particle swarm optimization (PSO) to perform predictions related to the Mexican Stock Exchange [41]. Huang et al. used the chaos over BP artificial neural networks (CBPANNs) system supported by the genetic algorithm to perform predictions on time series [42]; briefly, they employed their system to predict wind power. In another study on predicting wind power and speed, Jiang et al. used a hybrid system based on the support vector regression (SVR) model and cross correlation (CC) analysis; they supported their system with the cuckoo search (CS) and brainstorm optimization (BSO) algorithms [43] and applied the research to wind turbines running on a wind farm in China. Doucoure et al. used a multi-resolution analysis along with the artificial wavelet neural network model to predict wind speed [44]. Briefly, the authors developed a prediction system that can be used in the context of renewable energy sources; the main objective was to obtain an intelligent, alternative management approach for a micro-grid system that could use renewable energy within isolated and grid-connected power systems [44]. Chandra used a recurrent neural network system, trained thanks to the technique of cooperative coevolution (CC), to predict chaotic time series [45]. Chai and Lim used artificial neural networks with weighted fuzzy membership functions (NEWFM) to perform predictions on the business cycle [46]. In detail, the authors used chaotic time series (adjusted leading composite index time series with coordinate (time-delay) embedding) for the NEWFM and predicted the flow of business in this way [46]. Seo et al. employed artificial neural networks and, also, the adaptive neuro-fuzzy inference system to perform prediction operations on daily tracked water levels [47]; wavelet decomposition theory was also used within that research study. In their study, Marzban et al. employed differently structured dynamic neural networks (DNN) to predict chaotic systems like Mackey–Glass and Henon Map [48]. Okkan employed a wavelet neural network (WNN) to realize predictions for monthly reservoir in-flow (from data obtained from the basin of the Kemer Dam in Turkey) [49].
In detail, the system used here included the discrete wavelet transform (DWT) and feed forward neural networks (FFNN), supported by the Levenberg–Marquardt optimization algorithm. Zhou et al. developed a dendritic neuron model (with dendritic functions and phase space reconstruction) for predictions over the financial time series of the Shanghai Stock Exchange Composite Index, the Deutscher Aktienindex, the DJI Average, and the N225 [50]. In a research study, Wang et al. used differential evolution (DE) supported by teaching-learning based optimization (TLBO) to predict some chaotic time series [51]. In relation to the prediction of chaotic time series, Heydari et al. employed the Takagi–Sugeno–Kang (TSK) second-order fuzzy system and ANFIS (adaptive neuro-fuzzy inference system) [52]. Wang et al. provided a study on the prediction of time series using a back propagation-oriented neural network model trained by adaptive differential evolution (ADE) [53]; it was reported that the authors produced successful, improved results. Catalao et al. used a hybrid system in their study to predict short-term electricity prices [54]. In detail, they employed particle swarm optimization (PSO), wavelet transform (WT), and ANFIS for their research purposes. A hybrid system of ANFIS and the vortex optimization algorithm (VOA) was used by Kose and Arslan to perform predictions over EEG time series; it was reported that the related system is successful enough in the prediction of EEG data [10]. A model of an adaptive local linear (optimized) radial basis functional neural network (LLRBFNN) was employed by Patra et al. to predict some financial time series [55]. It was reported in that study that the LLRBFNN can perform predictions successfully and performs better than some other alternative systems considered. In another study on financial time series prediction, Ravi et al. built a hybrid model including three different solution elements [56].
They designed some three-stage hybrid models that employ chaos to construct the phase space in stage 1, the Multi-Layer Perceptron (MLP) network model in stage 2, and particle swarm optimization with a multi-objective mechanism (MOPSO) and the elitist non-dominated sorting genetic algorithm (NDSGA) in stage 3. Both long-term and short-term time series prediction were achieved by Méndez et al. by using a different approach [57]. In their study, they built a modular (structured) neural network (MNN) model with two methods: competitive clustering and winner-takes-all.
Following an examination of works in the literature that place more emphasis on the prediction of EEG time series (with a chaotic nature), several works are explained briefly. In their study, Weil et al. used radial basis function neural networks with the adaptive projective learning algorithm to predict some EEG time series [58]. In detail, they showed that, with an optimal choice of the alpha parameters within the system, EEG data can be predicted effectively, and that this choice depends on the correlation dimension of the objective EEG time series. As a traditional use, Hou employed the BP neural network model to perform some prediction operations over EEG time series [59]; brain electrical activity mapping (BEAM) was also used there as an alternative research approach. In a recent study by Wei et al., both the convolution neural network (CNN) and the bi-directional recurrent neural network (BRNN) were employed to develop a universal predictor for both EEG and ECG time series [60]. The related system showed positive performance in predicting chaotic time series of two medical data types: EEG and ECG. Blinowska and Malinowski used non-linear and autoregressive (AR) techniques to predict both EEG and simulated time series [61]. The study pursued the objective of evaluating whether the techniques can distinguish chaotic time series from noisy ones, and positive results were obtained. In another study, Wei et al. used a third-order Volterra filter to predict chaotic time series including EEG data [62]. The findings showed that the used technique was better than the second-order Volterra filter and was successful in predicting high dimensional chaotic EEG time series. Coyle et al. used two neural network models to perform prediction operations on an EEG time series in the context of a feature extraction approach for a brain–computer interface [63].
In that study, the prediction of EEG was supported with a classification process performed by linear discriminant analysis (LDA). Following the concept of brain fingerprinting, Coelho et al. used the general variable neighborhood search (GVNS) technique to optimize fuzzy rules to achieve a classification of individuals [64]. Although the study is not directly about predicting EEG time series, it can be evaluated within the same scope because of the prediction done with current EEG patterns to catch differences. An ANFIS was used by Komijani et al. to distinguish healthy individuals from those with epilepsy by using chaotic EEG time series [65]. Here, the formed system also included an EEG prediction phase involving a mixture of normal and epileptic EEG time series, and the authors obtained positive results in the context of the study objectives thanks to the use of ANFIS. Prasad and Prasad developed deep recurrent neural networks (DRNN) to realize an effective prediction approach for chaotic EEG time series [66]. In detail, they also designed a dynamic programming (DP) training process to improve efficiency by benefiting from matrix operations. The Elman RNN technique was used by Forney to classify EEG time series by supporting the process with prediction phases [67]. The classifiers used in the study included winner-takes-all, linear discriminant analysis, and quadratic discriminant analysis.

2.2. Evaluation of the Works

Based on the expressed works, it is possible to mention the following notable points:
- EEG is a widely examined data type in terms of time series prediction. In addition to EEG, some remarkable data systems evaluated for prediction are Mackey–Glass, gas furnace (Box–Jenkins), and Henon Map. Chaotic maps are also widely used in prediction-oriented works.
- Chaotic data from real life has great potential because of unique characteristics that change across different locations around the world, regions, etc.
- Dealing with chaotic time series through hybrid-structured solution approaches is already a remarkable strategy. For predicting chaotic time series in particular, the artificial neural network is a widely chosen technique among researchers.
- Generally, researchers obtain positive results when they try to predict time series (even chaotic ones) via artificial intelligence-based approaches, methods, and techniques.
- Training the artificial neural network and its different variations remains an open problem, as long as it can be trained with optimization-based solutions.
- The prediction of EEG time series is a critical research topic for diagnosing brain diseases and better understanding potential brain activity that may appear in the future. There is also a remarkable interest in evaluating EEG data for research on brain–computer interfaces.
- In the context of intelligent optimization algorithms, the ant-lion optimizer has not yet been used with artificial neural networks to directly predict chaotic EEG time series.
In addition to the brief but expansive look at the background of this topic, it is now appropriate to express some of the information about how chaotic time series prediction over EEG is achieved by using the hybrid system of artificial neural networks and the ant-lion optimizer technique. To achieve this, each technique and the general prediction solution designed in this study are explained to the readers in the next section.

3. Materials and Methods

In regard to the materials and methods used in this study, the essentials of the techniques employed within the hybrid system and also, important details regarding the designed prediction approach are explained in the following text:

3.1. Artificial Neural Networks

In the artificial intelligence literature, there are some general techniques that are still strong enough to deal with encountered real-life problems. The artificial neural network (ANN) is one of them, and the technique is widely used by researchers, often within hybrid-structured approaches. As a machine learning technique, ANN is a data processing system model inspired by the structure of the human brain. Generally, a typical ANN model includes artificial neurons, which are distributed, parallel calculation elements. By using sets of artificial neurons, it is possible to form different ANN models. A typical ANN model can deal with many tasks, like classification, prediction, intelligent optimization, control, pattern recognition, etc. [68,69,70,71,72,73,74,75,76,77,78,79]. The default structure of a multi-layer ANN is shown in Figure 1 [78].
The default structure of an artificial neuron was introduced by McCulloch and Pitts [80]. Since then, the literature regarding the ANN has gained momentum, and there has been a rapid increase in research studies based on that technique. Eventually, several different ANN models (i.e., multi-layer, single perceptron, self-organized map, ADALINE, adaptive neuro-fuzzy, probabilistic network) were introduced by researchers [56,57,68,70,72,79,81,82,83].
The learning mechanism employed in an ANN model is briefly described as follows: an ANN learns from the environment and gives appropriate responses to newly encountered situations thanks to the learned experiences and information. This is like the mechanism by which humans (and even some other living organisms) learn new things from experience and use them to solve new problems (or solve already known ones in a better way) [70,71,72,79,81,82,83]. In this context, the learning process of an ANN can be based on three different machine learning strategies: supervised learning, unsupervised learning, and reinforcement learning. Supervised learning uses inputs and desired outputs to train an ANN, unsupervised learning uses only inputs, and reinforcement learning focuses on the received feedback value(s) [70,71,72,79,81,82,83].
A common ANN model structure used in research studies is as follows: the network includes artificial neurons connected to each other by weights. In general, an artificial neuron employs input(s) with weight(s), a summation function, and finally, an output. Inputs are multiplied by weight values, and their sum is passed to a transfer function; the output of the transfer function is the output of the neuron [79,81,82,83]. Thanks to such connections, more artificial neurons and larger ANN models can be obtained [79,83].
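As a concrete illustration of the neuron just described, the following minimal Python sketch computes the weighted sum of the inputs and passes it through a transfer function. The sigmoid transfer function and the example values are illustrative assumptions, not parameters taken from this study:

```python
import math

def neuron_output(inputs, weights, bias=0.0):
    """A single artificial neuron: weighted sum of inputs passed through a transfer function."""
    s = sum(x * w for x, w in zip(inputs, weights)) + bias  # summation function
    return 1.0 / (1.0 + math.exp(-s))  # sigmoid transfer function (an illustrative choice)

# Two inputs with two hypothetical weights: s = 0.5*0.8 + (-1.0)*0.2 = 0.2
y = neuron_output([0.5, -1.0], [0.8, 0.2])  # sigmoid(0.2) ≈ 0.55
```

Connecting the output of such neurons to the inputs of others, layer by layer, yields the multi-layer structure of Figure 1.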
Today, it is clear that the ANN technique is still popular in the context of artificial intelligence-based applications, as was indicated previously. At this point, hybrid system formation is a common usage of ANN. Although there can be many different ways to use an ANN in hybrid form, the support of its learning process with alternative algorithms has always been attractive for researchers. One remarkable approach is to use artificial intelligence-based, intelligent optimization algorithms instead of known traditional algorithms. In this study, the hybrid structure was formed by following an intelligent optimization-oriented approach and a recent algorithm. The ant-lion optimizer was chosen as the training algorithm for the process. The next subsection briefly focuses on the essentials of that algorithm.
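To give a concrete idea of the optimization-oriented training just described, the sketch below flattens all connection weights of a small network into a single vector and scores a candidate vector by its mean squared error over training samples; that error is the objective function a metaheuristic such as the ALO would minimize. The single-hidden-layer topology and the layer sizes are illustrative assumptions, not the exact ANN-ALO configuration of this study:

```python
import math

N_IN, N_HIDDEN = 2, 3  # illustrative layer sizes, not the study's configuration

def forward(x, weights):
    """One-hidden-layer network whose weights come from a single flat vector,
    so that any metaheuristic can treat training as continuous optimization."""
    sig = lambda s: 1.0 / (1.0 + math.exp(-s))
    idx, hidden = 0, []
    for _ in range(N_HIDDEN):
        s = sum(x[i] * weights[idx + i] for i in range(N_IN))
        idx += N_IN
        hidden.append(sig(s))
    # linear output neuron
    return sum(hidden[j] * weights[idx + j] for j in range(N_HIDDEN))

def fitness(weights, samples):
    """Mean squared error over (input, target) pairs -- the objective the optimizer minimizes."""
    return sum((forward(x, weights) - t) ** 2 for x, t in samples) / len(samples)

# A candidate solution is simply a vector of N_IN*N_HIDDEN + N_HIDDEN numbers:
candidate = [0.1] * (N_IN * N_HIDDEN + N_HIDDEN)
error = fitness(candidate, [([1.0, 0.0], 0.5), ([0.0, 1.0], 0.3)])
```

With this encoding, "training" means searching the weight space for the vector with the lowest error, which is exactly the kind of task the ant-lion optimizer of the next subsection addresses.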

3.2. Ant-Lion Optimizer

The ant-lion optimizer (ALO) is a recently developed artificial intelligence-based optimization algorithm introduced by Mirjalili [6]. It is a swarm intelligence-based approach inspired by the behavior of ant-lions in their larval form. The algorithm employs particles as ant-lions and ants, respectively, in order to perform an optimization process designed mathematically and logically around the hunting behavior of ant-lions. In real life, ant-lions dig traps in the ground in order to hunt ants (Figure 2 [6]). The ALO builds on this fact and focuses on interactions between the ant-lion and ant particles. To model these interactions, ants are moved over the search space while ant-lions are allowed to hunt them with their traps [6,7]. At this point, the ant-lions become fitter by using the traps. In the ALO, the movements of ants are modelled using a 'random walk' approach [6,7,8,9]. The changing positions of the ants are used as parameters for the objective function(s), so that optimum values are searched for [6].
ALO has already been used in different studies in the associated literature [6,7,8,9,84,85]. For the ALO, the following conditions are considered through the optimization steps [6]:
- Ants move in the objective search space with different random walks, considering all dimensions of the problem.
- The performed random walks are affected by the traps of the ant-lions.
- Pits indicating traps are built by ant-lions in proportion to their calculated fitness, and larger pits (ant-lions) have a greater probability of catching ants.
- Any ant can be caught by an ant-lion (including the best one) during ALO iterations.
- Ants move towards ant-lions while adaptively decreasing the range of their random walks.
- If an ant is fitter than an ant-lion, it is caught and hidden under the sand by the ant-lion.
- Ant-lions to be processed are chosen during ALO iterations using the roulette-wheel selection approach applied in genetic algorithms.
- An ant-lion is relocated to its last caught prey, and a pit is built there to improve the chance of catching other prey.
A pseudo-code regarding the default ALO can be written as follows, which also includes essential equations [6,7,8,9]:
Step 1 
(Initialization Phase): Randomly place M number of ants, and N number of ant-lions over the search space.
Step 2: 
Calculate the fitness levels of all ants and ant-lions. Additionally, determine the best ant-lion according to the calculated fitness values.
Step 3: 
Perform the following steps until the stopping criteria is met.
Step 3.1: 
Perform the following sub-steps for each ant.
Step 3.1.1: 
Select an ant-lion with the roulette wheel.
Step 3.1.2: 
Update the minimum and maximum (c and d) values with the ratio (I), by using the following equations:
c^t = c^t / I (1)
d^t = d^t / I (2)
Step 3.1.3: 
Perform a random walk (Equation (3)) and normalize it (Equation (4)):
X(t) = [0, cs(2r(t_1) − 1), cs(2r(t_2) − 1), …, cs(2r(t_n) − 1)] (3)
where cs is the cumulative sum, and r(t) = 1 if a random number > 0.5, and r(t) = 0 otherwise.
X_i^t = ((X_i^t − a_i) × (d_i − c_i^t)) / (d_i^t − a_i) + c_i (4)
Step 3.1.4: 
Update the related ant’s position by using the following equation:
AntPosition_i^t = (R_AntLion^t + R_BestAntLion^t) / 2 (5)
where R_AntLion^t is the random walk around the ant-lion chosen with the roulette wheel, while R_BestAntLion^t is the random walk around the best ant-lion.
Step 3.2: 
Calculate the fitness values of all ants.
Step 3.3: 
Replace an ant-lion with its related ant if it is fitter by using the following equation:
AntLionPosition_j^t = AntPosition_i^t (6)
Step 3.4: 
Update the best ant-lion if an ant-lion is better than it.
Step 4: 
The best ant-lion is the optimum solution(s) for the related problem.
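To make the steps above concrete, the following Python sketch (an illustration of the stated equations, not the implementation used in this study; all function names are hypothetical) shows the random walk of Equation (3), its normalization into the shrinking bounds of Equation (4), the roulette wheel selection of Step 3.1.1, and the ant position update of Step 3.1.4:

```python
import numpy as np

def random_walk(n_steps, lo, hi):
    # Eq. (3): the walk is the cumulative sum of +1/-1 steps, where a step is
    # +1 when a random number exceeds 0.5 and -1 otherwise, starting at 0.
    steps = np.where(np.random.rand(n_steps) > 0.5, 1.0, -1.0)
    walk = np.concatenate(([0.0], np.cumsum(steps)))
    # Eq. (4): min-max normalize the walk into the current bounds [lo, hi],
    # which shrink over iterations via the ratio I (Eqs. (1) and (2)).
    a, b = walk.min(), walk.max()
    return (walk - a) * (hi - lo) / (b - a) + lo

def roulette_wheel(fitness):
    # For minimization, fitter (smaller-fitness) ant-lions receive larger
    # selection weights; returns the index of the chosen ant-lion.
    weights = 1.0 / (np.asarray(fitness) - np.min(fitness) + 1e-9)
    return int(np.random.choice(len(fitness), p=weights / weights.sum()))

def ant_position(walk_antlion, walk_elite, t):
    # Eq. (5): the ant moves to the mean of a walk around the roulette-chosen
    # ant-lion and a walk around the best ant-lion (the elite).
    return (walk_antlion[t] + walk_elite[t]) / 2.0
```

In a full ALO loop, these pieces would be applied to every ant in each iteration, with the bounds divided by the ratio I before each walk.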
The whole process can be explained in a more technical way, as follows [86]. In the ALO, the matrices representing ants and ant-lions are first randomly initialized. In each iteration, an ant-lion is chosen for each ant with the roulette wheel approach, while the best ant-lion found so far (the elite) is updated. The boundaries of the random walks are first shrunk relative to the current iteration number, and each ant's position is then updated via two random walks: one around the chosen ant-lion and one around the elite. In every random walk, the points are evaluated over the fitness function. If any ant becomes fitter than an ant-lion, that ant's position becomes the new position of the ant-lion in the next iteration. Finally, the elite is compared with the best ant-lion obtained during the iteration and is replaced if a better one has been found.
Although it is a recent algorithm, ALO has been widely applied in problems within different fields with different optimization problem structures (i.e., multi-objective optimization) [87,88,89,90,91]. Additionally, as expressed before, ALO is an algorithm which is the subject of the swarm intelligence sub-field under artificial intelligence. Swarm intelligence briefly deals with particle-based systems coming from the idea of behaviors seen among social swarms (i.e., fishes, birds, ants, bees, and even humans) that are observed as they are collectively realized [79,92]. Under swarm intelligence, many different intelligent algorithms dealing with optimization problems have been developed in the related literature. Particle swarm optimization (PSO), ant colony optimization (ACO), cuckoo search (CS), and artificial bee colony (ABC) are some recent, widely-employed swarm intelligence algorithms. Readers interested in the subfield of swarm intelligence and some known algorithms are referred to [93,94,95,96,97,98,99,100] for more detailed information.
In this study, the ALO was used to optimize the weights of the ANN, which corresponds to training the model. The next subsection provides more information about the formed ANN–ALO system and the chaotic time series prediction approach used here for EEG data.

3.3. Chaotic Time Series Prediction with the ANN–ALO System

EEG time series are in chaotic forms. Due to this, there is a need for a prediction solution strong enough to deal with chaotic time series. In order to achieve that, a hybrid approach rising over the ANN–ALO combination was considered in this study. The system briefly employs an ANN model which is trained by the ALO. Since it is a recent, effective optimization technique, ALO has been employed to train the ANN model rather than using traditional training approaches, e.g., the back-propagation algorithm.
The approach designed here is novel in that it employs the ALO for the first time to train an ANN for chaotic time series prediction. The novelty lies both in the prediction success relative to alternative approaches and in being a first-time alternative for the associated literature. The prediction approach is as follows: For training, the particles of the ALO are associated with the weights and biases of the ANN (the ALO was employed to optimize the weights and biases). The ANN is trained according to the mean squared error (MSE) criterion by considering the differences between the obtained output values and the desired output values. In this context, the ANN model is, briefly, a multi-layer perceptron (MLP) that employs four inputs and one output. The ANN model tries to predict the value of x(time + 3) by using the values x(time), x(time − 3), x(time − 6), and x(time − 9). It is important to note that many different combinations of lags could be used for the prediction approach. However, past studies performed by the author showed that these lags are the most appropriate ones for the different chaotic time series considered [101]. Additionally, some alternative lag options were tested beforehand in order to determine which is the most appropriate (a report on this is provided in the following paragraphs). Of course, further investigation into alternative lags remains open, as the success of prediction approaches depends on choosing the optimum number of lags and the time delays between them. In this study, the general prediction performance of the approach was evaluated according to the mean absolute error (MAE) criterion.
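As an illustrative sketch (not the author's implementation; the helper name is hypothetical), the lagged input/target pairs described above can be built from a raw series as follows:

```python
import numpy as np

def build_lagged_dataset(series, lag=3, n_inputs=4):
    # Builds rows matching the structure in the text: the inputs are
    # x(time), x(time-3), x(time-6), x(time-9) and the target is x(time+3).
    X, y = [], []
    start = lag * (n_inputs - 1)      # earliest index with all lags available
    for t in range(start, len(series) - lag):
        X.append([series[t - k * lag] for k in range(n_inputs)])
        y.append(series[t + lag])
    return np.array(X), np.array(y)
```

A 70/30 row-wise split of the resulting pairs then yields training and test sets of the kind used in the applications.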
Figure 3 shows a brief view of the ANN–ALO prediction system.
While forming a hybrid system for a specific problem, it is important to adjust all parameters of the employed techniques to ensure sufficient synchronization between them. Thus, it was important to determine how both the ANN and the ALO should be adjusted to ensure the desired interaction between them and to solve the problem of predicting chaotic time series, which is one of the objectives of this research study.

4. Applications on Electroencephalogram (EEG) Prediction

The applications of this research were based on prediction operations performed over the considered EEG data. Here, it was also important to determine the general structure of the ANN–ALO system employed for prediction. First, the optimum structure of the ANN was determined by evaluating five different ANN models with alternative lags (inputs), outputs, and hidden layers. The ANN models were trained with the traditional back-propagation algorithm (BPA), and the prediction accuracy of the models was analyzed by considering the sample and simulated EEG time series used in [10]. Following that, five different ALO adjustments (regarding the parameters of the ALO) were applied to the optimum ANN model to determine the best adjustment, which was then evaluated further in terms of its success in predicting chaotic EEG time series. The next paragraphs briefly explain all of these research stages.

4.1. Chaotic EEG Time Series Prediction with the ANN–ALO System

The electroencephalogram (EEG) is a typical monitoring approach for electrical activities that occur in our brains. Since the brain is a complex component of the body that is associated with many vital functions, the electrical activity in it can be observed in chaotic forms. Due to this, a general analysis of brain activity requires the employment of advanced approaches to understand the chaotic flows in time series that are derived from electrical activities in them. It is possible to consider this to be a vital issue because of the following key points: In terms of the healthcare and medical perspective, the early diagnosis of some diseases (like depression, autism, epilepsy, and even Alzheimer’s disease) related to the brain is very important. Additionally, the evaluation of interactions of the brain with other environmental factors (e.g., triggering factors around living organisms, psychological aspects, living standards, and health state affected by problems in other organs) are also critical.
Considering the mentioned key points, predicting the future of such data flow is also a remarkable research approach and is followed in this study.
In order to perform predictions over chaotic EEG time series, data were recorded from individuals coming to the Isparta State Hospital located in the city center of Isparta, Turkey. At this point, a total of 10 EEG data gathered from 10 (five males and five females) individuals (from whom necessary ethical permission was obtained) were considered for the prediction approach. In this context, a total of 10 different prediction applications were determined to achieve the prediction approach with the ANN–ALO hybrid system and also to evaluate it.
A typical EEG recording is done by locating electrodes on certain points over the individual's head (Figure 4a [102]). The internationally used 10–20 system is followed to place the electrodes at points from which it is possible to gather the desired electrical data easily (Figure 4b [103]). In this study, the sampling frequency for the EEG recording was 250 Hz and, in order to make the related data ready for the prediction tasks, it was preprocessed to remove noise, eye blinks, and muscular artifacts. For the noise and artifact removal, FastICA, an automated, simple technique introduced by Jadhav et al. [104] (which benefits generally from the wavelet transform), was used in this study.
Each chaotic EEG time series is called an 'Application' and is named with the letter A followed by the application number (A1, …, A10). Figure 5 briefly presents the chaotic EEG time series considered in this research study.
Since it is important to observe chaotic behaviors within each EEG time series (for the scope of the research), it is possible to evaluate their phase-space state portraits. In this context, Figure 6 represents the phase-space state portrait for each chaotic EEG time series within the related applications. It can be seen from Figure 6 that for each piece of EEG data, the phase-spaces are not periodic and do not extend to infinity, which is a sign of chaos.
In order to validate that the considered EEG time series are chaotic, a formal approach—calculation of the largest Lyapunov exponent—was used for each EEG time series. The related calculation was done with the following equation [105]:
$\lambda = \frac{1}{N} \sum_{n=1}^{N} \ln \frac{dist(s(n+1),\ s(m+1))}{dist(s(n),\ s(m))}$
where dist denotes the Euclidean distance between two points, s(n) is the reference point, s(m) is its nearest neighbor, and N is the number of Euclidean distance calculations.
Table 1 provides the Lyapunov exponents calculated for each EEG time series application. In order to accept a time series as chaotic, at least one of the calculated exponents should be positive (positive values are shown in bold style in Table 1). It can be seen from Table 1 that all of the EEG time series are chaotic, because each of them has at least one positive Lyapunov exponent.
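A minimal Python sketch of this estimator can be written as follows (illustrative only; the embedding dimension and delay are assumptions for the example, not the settings of the study):

```python
import numpy as np

def largest_lyapunov(series, dim=3, delay=3):
    # Estimates the largest Lyapunov exponent via Eq. (7): for each phase-space
    # point s(n), find its nearest neighbor s(m) and average the log of the
    # one-step divergence ratio dist(s(n+1), s(m+1)) / dist(s(n), s(m)).
    series = np.asarray(series, dtype=float)
    n_pts = len(series) - (dim - 1) * delay
    pts = np.array([series[i : i + dim * delay : delay] for i in range(n_pts)])
    logs = []
    for n in range(n_pts - 1):
        # restrict neighbor candidates m to those for which s(m+1) exists
        d = np.linalg.norm(pts[: n_pts - 1] - pts[n], axis=1)
        d[n] = np.inf                      # exclude the point itself
        m = int(np.argmin(d))
        d0 = np.linalg.norm(pts[n] - pts[m])
        d1 = np.linalg.norm(pts[n + 1] - pts[m + 1])
        if d0 > 0 and d1 > 0:
            logs.append(np.log(d1 / d0))
    return float(np.mean(logs))
```

Applied to a chaotic signal (e.g., the logistic map at full chaos), this estimate comes out positive, while a non-chaotic linear ramp yields zero.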

4.2. Organization of the ANN–ALO

As was indicated before, an optimum ANN model was first determined by evaluating its accuracy on the EEG data used in [10], and then five different ALO adjustments were applied to that ANN model structure for the prediction applications of the study. After performing the prediction applications with each hybrid system, the system with the best results was used for the comparative evaluations in further stages. Here, remarkable consideration was also given to obtaining an ANN–ALO system without an overfitting problem. This was controlled through prediction applications over the different ANN–ALO adjustments by evaluating the relation between training and testing errors. At the end of the process, care was taken to employ the best adjustment of the ANN–ALO with no overfitting problem.
Table 2 shows a report on the mean prediction accuracy of the different BPA-trained ANN models (the best model is in bold style). The models were run 10 times over four different EEG data sets (explained in [10]) and, to simplify the evaluation, only the mean prediction accuracy for the test data (50% of the whole EEG data) was used. Furthermore, in order to obtain a single value of prediction accuracy, the mean of the 10 runs was calculated over a single mean accuracy value computed for each ANN model across the entire EEG time series. Two of the lag alternatives are the ones reported in [10,65]. As a widely used activation function, the sigmoid was used in all ANN models in this study.
It can be seen from Table 2 that a greater number of inputs (previous data points) does not always lead to better prediction performance for the ANN model. On the other hand, it is also important to determine the optimum number of hidden layers and the number of artificial neurons within each of them. Since it is not ideal to use different numbers of neurons in different hidden layers for the training and test performance of an ANN model, the same number of neurons was included in each hidden layer, in line with related works by the author. Here, it can be seen that the alternative ANN model formation No. 3 provided better prediction accuracy than the other alternative formations. Additionally, the lag structure chosen in model No. 3 led to better findings than the lag structure preferred in [10,65].
Moving on from the obtained findings, Table 3 provides information about the essential adjustments of the optimum ANN model structure that are used in the prediction applications of chaotic EEG data. In addition to that, different adjustments (parameter values) that were used to form five different ALO based training approaches are presented in Table 4. It is notable that ALO has just two parameters that can be adjusted for optimization.
Hybrid systems employing the same ANN and five different ALO adjustments were applied to the chaotic EEG data. In detail, each EEG data set (applications A1, …, A10) contained a total of 3990 rows corresponding to x(time + 3): x(time), x(time − 3), x(time − 6), and x(time − 9). For each time series, 70% of the data (2793 rows) was employed for the training processes and, generally, predictions were observed over the whole time period by including all the points, whether they were associated (70% of the data) or not associated (the remaining 30% of the data) with the training process. In this study, the data were called training and test data, respectively, rather than using the in-sample and out-of-sample terms (i.e., training, selection, and out-of-sample) suggested by [106], to keep the evaluation simple. The general findings obtained from the performed applications are provided in the following section.

5. Findings from Prediction Applications

To be sure about which ANN–ALO solution is more successful at predicting the used time series, the prediction errors obtained with five different applications were compared. In order to make comparisons, errors in the prediction performances were calculated for the whole dataset (including both training and test data) with the mean absolute error (MAE), while the mean squared error (MSE) was used for ANN training processes. MAE and MSE can be represented as follows [107,108]:
Let o i be the observation (i) and p i be the prediction regarding o i .
$MAE = \frac{1}{n} \sum_{i=1}^{n} |e_i|$
$MSE = \frac{1}{n} \sum_{i=1}^{n} e_i^2$
where $e_i = o_i - p_i$, and $n$ is the total number of data points (rows) considered.
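These two criteria can be sketched in Python as follows (an illustrative sketch with hypothetical function names, not the code used in the study):

```python
import numpy as np

def mae(observed, predicted):
    # Eq. (8): mean of the absolute errors |e_i|, with e_i = o_i - p_i
    e = np.asarray(observed, dtype=float) - np.asarray(predicted, dtype=float)
    return float(np.mean(np.abs(e)))

def mse(observed, predicted):
    # Eq. (9): mean of the squared errors e_i^2
    e = np.asarray(observed, dtype=float) - np.asarray(predicted, dtype=float)
    return float(np.mean(e ** 2))
```

Because the MSE squares each error, it penalizes large individual deviations more heavily than the MAE, which is why it is a natural training objective while the MAE gives a more direct view of average prediction error.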
The MAE and MSE values obtained with five different ANN–ALO systems are presented in Table 5 and Table 6 respectively (best values are in bold style).
Table 5 and Table 6 show that the ANN–ALO system with Adjustment 4 (ALO-A4/Table 4) had the best performance in seven out of 10 prediction applications. Figure 7 presents the prediction flow for the related EEG time series (Figure 5) in order to give a visual idea of the success of the employed hybrid system. In Figure 7 and Figure 8, the predictions made for both the training and test data of the considered chaotic EEG time series are shown (predictions for training and test data are separated with a red line). The original data is shown with a white line, and predictions are represented with red dots (line). Additionally, some visible, remarkable periods of prediction error are marked with blue squares.
Visualization of the prediction performances (Figure 7 and Figure 8) showed that the ANN–ALO system is successful enough to predict the future states of the considered chaotic EEG time series. To be sure, the chaotic flow was also checked by considering the points in the prediction phase of the processes. Additionally, no overfitting problem was observed when the error values were evaluated again. On the other hand, it is still important to evaluate the system in detail by comparing it with some alternative approaches, and even validating it statistically.

6. More Comparisons with Alternative Approaches

In addition to the evaluation of the predictions made by ANN–ALO, a comparison-based evaluation was performed to gain more insight into the effectiveness of the ANN–ALO model. At this point, the chosen (best) ANN–ALO system (ANN–ALO A4) from the former prediction results was compared with alternative hybrid systems having the same ANN model structure (Table 3) but with different optimization algorithms as trainers. The evaluation was done according to the same criteria in Equations (8) and (9). However, this time, a total of 30 different runs were performed for each system to eliminate findings obtained by chance. For the evaluation, the means of the MAE and MSE were used for each EEG time series across the different ANN-based systems. As when determining the appropriate ANN–ALO adjustment, the chaotic flow was checked repeatedly for all alternative approaches, considering the points in the prediction phase of the related processes. Additionally, all approaches were checked for any possible overfitting issue that could have affected an objective evaluation, and further analyses were done only after it was certain that they did not have an overfitting problem.
As for the trainers, particle swarm optimization (PSO) [98,109,110], cuckoo search (CS) [111,112], the firefly algorithm (FA) [113,114], the bat algorithm (BA) [115,116], and the traditional back-propagation algorithm (BPA) [117] were used for the ANN, and each system was employed over the same time series considered in this study (the ANN–ALO A4 system was re-run rather than reusing the previously obtained findings, and a mean of 30 different runs was considered for this evaluation in order to determine the level of stability across all runs).
The mean MAE and mean MSE values (with standard deviations) obtained over the time series via different ANN-based systems are reported in Table 7 and Table 8 respectively (the best values are in bold style).
It can be seen from the findings provided in Table 7 and Table 8 that ANN–ALO A4 outperformed the other systems in six out of the 10 applications. Moreover, its findings for the four remaining applications were among the top three obtained by all of the different ANN-based hybrid systems. Here, CS appears to be a competitive algorithm for the ALO (ALO A4) in the context of chaotic EEG time series. However, the other algorithms do not seem strong enough to even come close with the same ANN, compared with the ANN–ALO A4 system. As the traditional trainer, the BPA also generally differs from the other systems in terms of the obtained findings.
After the comparison with the different ANN-based hybrid systems, the ANN–ALO A4 was compared with some alternative chaotic time series prediction approaches. In detail, the alternative approaches were the dynamic Boltzmann machine (DyBM) [118], the support vector machine (SVM) [119], the hidden Markov model (HMM) [120], Bayesian learning on a Gaussian process model (BG) [121], the autoregressive integrated moving average (ARIMA) [122], the autoregressive model (ARM) [123], and the K-nearest neighbor algorithm (K-NN) [124]. The same approach of 30 different runs was performed over the 10 objective chaotic EEG time series, and the mean MAE values obtained for each approach were assessed to determine the 'outperforming approaches'. For the subsequent general ranking, the mean MAE values for ANN–ALO A4 (as reported in Table 7) were used directly, which means only the seven alternative approaches were newly run and compared with ANN–ALO A4.
Table 9 and Table 10 report the mean values of MAE that were obtained for the time series via ANN–ALO A4 and the other alternative prediction approaches (best values are in bold style).
It can be observed from Table 9 and Table 10 that ANN–ALO A4 provided the top performance for all EEG time series applications when compared with the alternative time series prediction approaches. Although some approaches provided somewhat similar performances, ANN–ALO A4 mostly differed in terms of values and provided successful findings, outperforming the other methods for all components considered in the evaluation stage.
It is also important to evaluate the accuracy of ANN–ALO A4 by focusing only on the predictions made for the test data. Thus, the mean rates of correctly predicted test data (based on 30 runs over the 1197 test rows) were calculated for all 13 approaches, with each application calculated separately. The obtained findings are shown in Table 11 and Table 12, respectively (best values are shown in bold style).
Table 11 and Table 12 show that the ANN–ALO A4 mostly outperformed the other approaches based on the mean accuracy evaluation done for the predictions using only the test data considering each chaotic EEG time series (application).

6.1. Validation Test

It was important to validate the obtained findings among the related approaches, including ANN–ALO, to confirm that they were not obtained by chance. In this way, it was possible to validate and assess the performance and success of the introduced ANN–ALO system. To achieve this, a statistical technique, the Giacomini–White Test [125], was applied in this study. This technique was applied to determine whether the minimum mean value of the MAE also meant that the corresponding approach was good at prediction. Following the pairwise comparisons done over all approaches, the general test results showing which approach provided the better performance (statistically outperforming the others at a significance level of 5%) in the related prediction application are provided in Table 13.
In Table 13, the presence of more than one approach means that there was statistical equivalence among the related approaches for the corresponding application. Additionally, Table 13 indicates that the good performances of ANN–ALO (ANN–ALO A4) were validated statistically, and that competitive approaches occasionally appear through the validation of their findings. One remarkable point is that ANN–CS was the best performing method for A9, while ANN–ALO A4 produced the best performance(s) for the remaining applications.

6.2. Ranking

By gathering the findings from Table 7, Table 9 and Table 10, a ranking was also made for all 13 approaches to chaotic EEG time series prediction. In order to determine an order of approaches, each of them was given a point for each prediction application; points were given according to the following equation (a lower mean value of MAE gave more points to the corresponding approach):
$ApproachPoint = ApproachPoint + ApproachValueOrder_{ApplicationNo}$
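Assuming a scoring rule in which the r-th ranked approach among N receives N - r + 1 points per application (an assumption consistent with the parenthetical above, not a rule confirmed by the text), the accumulation can be sketched as:

```python
def ranking_points(mae_by_approach):
    # mae_by_approach maps an approach name to its list of mean MAE values,
    # one per prediction application (A1..A10 in the study).
    # Assumed scoring rule: in each application, the approach ranked r-th
    # among N approaches (by ascending mean MAE, lower is better) receives
    # N - r + 1 points, accumulated over all applications.
    names = list(mae_by_approach)
    n = len(names)
    n_apps = len(next(iter(mae_by_approach.values())))
    points = {name: 0 for name in names}
    for app in range(n_apps):
        ordered = sorted(names, key=lambda name: mae_by_approach[name][app])
        for rank, name in enumerate(ordered, start=1):
            points[name] += n - rank + 1
    return points
```

With three approaches over two applications, for instance, the approach with the lowest MAE in an application earns 3 points there, the next 2, and the last 1.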
Moving on from the explanations, Table 14 and Table 15 provide a report of the ranks and the obtained points from each approach in the context of the prediction applications.
Figure 9 presents the total points obtained by the related approaches in the form of a graphic. It can be seen that ANN–ALO A4 took first place with a total of 125 points, ANN–CS took second place with 121 points, and ANN–BA took third place with 102 points. Except for the SVM and K-NN approaches, the ANN-based systems generally occupy consecutive places. The worst rank went to the ARM, followed by BG, while the other approaches (DyBM, HMM, and ARIMA) had total point values of around 38–43. Finally, ANN–BPA (with the traditional training algorithm) took eighth place among all 13 approaches evaluated.

6.3. Practical Application and Experiences by Physicians

The development of this scientific method with ANN–ALO to predict chaotic EEG time series also required its practical value to be shown. Due to this, it was critical to gather some evidence of the practical value of the system. To achieve this, the ANN–ALO system was coded in the Java programming language to form a prediction software system that could be used directly on computers to analyze chaotic EEG data flows in real settings. The developed software was distributed to the neurosurgery polyclinics of four local hospitals in Isparta, Turkey: Isparta State Hospital (from which the EEG data used in the study were obtained), the Hospital of Suleyman Demirel University (SDU), Meddem Hospital, and the Davraz Life Hospital. Currently, the software system is actively used by a total of six physicians from the related polyclinics of the hospitals and, in order to obtain enough information about the practical value of the system, anonymous interviews were conducted to gather the experts' impressions of the developed system and solution approach. The statements received from the physicians were generally positive. Some remarkable statements are as follows: ‘This software has improved my efficiency by ensuring predictive analyze of EEG’ (Meddem Hospital). ‘I am currently using the system for investigating epilepsy in my patients’ (Isparta State Hospital). ‘I think such a prediction approach can be useful for also cardiologists and other physicians dealing with medical time series’ (Davraz Life Hospital). ‘I am able to predict brain diseases better thanks to this system. It is something like I have an experienced assistant who does not know being tired’ (Hospital of SDU). ‘I will shape my future academic research in nature of EEG data after seeing that it is possible to perform effective analyzes with Machine Learning’ (Hospital of SDU).
‘The system is good enough at predicting future states of brain activity and making effective decision making in terms of treatments’ (Meddem Hospital). ‘The system can be used as a mobile application I think. Currently I need to use my laptop everywhere’ (Davraz Life Hospital). ‘I started to reduce my diagnosis time thanks to using that automated system for prediction and analyze of EEG data’ (Meddem Hospital). ‘Prediction done with this software is useful to see many details you can miss with human eyes’ (Isparta State Hospital). ‘This software is fast at analyzing future states of the electrical activity in human brain. I wonder its application on also animals’ (Davraz Life Hospital).
In addition to the interviews, the physicians were also asked to fill out a small survey to evaluate the software system in terms of its usability, accuracy, speed, effectiveness, and novelty. At this point, each physician gave a score for each characteristic on the following scale: too bad (1); bad (2); normal (3); good (4); and very good (5). The obtained findings, with mean scores, are given briefly in Table 16.
Table 16 shows that the software system running ANN–ALO in the background is 'very good' in terms of its usability, accuracy, and effectiveness. Although the physicians rated the speed of the system between 'normal' and 'good', the novelty of the system was rated at around the level of 'good'.
Figure 10 below also shows prediction examples for two different patients at the Hospital of SDU. The necessary ethical permission was obtained from both patients: one was a 37-year-old female with a possible epilepsy problem, and the other, with normal EEG data, was a 32-year-old male. One prediction was for early epilepsy seizure diagnosis for the female patient (Figure 10a), and the other was for the male patient with a normal EEG flow (no disease diagnosis) (Figure 10b). The data flow in blue corresponds to predictions and, from Figure 10a, it can be seen that the ANN–ALO system was effective enough to predict even epilepsy seizures, which is a vital medical diagnosis issue.
All the findings obtained through the explained evaluation stages and practical application indicate that the designed ANN–ALO hybrid system (with the ALO A4 form) was successful and effective enough for predicting chaotic EEG time series, which means such healthcare-oriented data can be predicted by that system to support diagnosis and treatment-based tasks. On the other hand, the system has good potential to predict the future states of chaotic time series which are generally in the form of medical data or other data types related to natural occurrences and dynamics in real-life. More discussion is provided in the following section.

7. Discussion

Based on the work done so far in the study, it is possible to discuss some remarkable points that are important for the character of the study and its general effects. At this point, the discussion concerns both the findings of the system and its general effects in a wider scope.

7.1. Discussion over Findings

Based on the comparisons, it is possible to state that the ANN system supported by the ALO produces better results than the alternative systems considered. The results were also validated with the Giacomini–White Test, applied to the findings of all approaches in the related prediction applications. Thus, the results of this study can be accepted as a good contribution to the research done so far on predicting chaotic time series, which is an important problem within time series prediction. This study briefly showed the effectiveness of the introduced ANN–ALO hybrid system/approach (and thus, the role of artificial intelligence and the related techniques) in successfully predicting chaotic time series. According to the findings, not only the hybrid systems and alternative approaches, but also the traditional ANN–BPA system, provided worse results than the employed ANN–ALO approach, although ANN–BPA ranked at a medium level among the alternative approaches. The alternative error evaluations with both the MAE and the MSE confirmed the performance and success of the ANN–ALO system, along with the states of the other prediction systems and approaches considered. Additional statistical tests confirmed the reliability of the obtained findings. The practical applications and the received feedback point out the value of the developed ANN–ALO system for use in hospitals and healthcare units by physicians.

7.2. General Discussion

The prediction of medical/healthcare-oriented data such as EEG is a remarkable approach for the reasons expressed in the previous paragraphs. In addition to being vital in terms of detection and diagnosis, determining the future state of the EEG is especially important in today's world, with its great interaction with the digital environment. The understanding of brain activities and their relations with the world will always be a research interest, and studies like the one done here will always be triggering factors in the shaping of technology and the future world. Recent evidence for this is the research currently being done on brain–computer interfaces; ANN–ALO systems can be effectively used to support the development of such interfaces. The prediction of chaotic time series is a very attractive issue in the informatics era, which requires the rapid formation of information/data, its manipulation and sharing, and the derivation of alternative, usable information/data (i.e., future states, explanations for problems/solutions). Thus, the study conducted here is an important alternative to similar studies being done in the scientific arena. The observed findings and the results reached here show the importance of artificial intelligence and its role in solving real-life problems. It is also notable that artificial intelligence is a science that will continue to progress because of its multidisciplinary scope. Since the literature is a dynamic environment that will always have better candidates, there is an open opportunity for the author(s) to carry out future studies. The prediction of natural dynamics in particular is an important point for dealing with real-life problems, and the study realized here can be accepted as a good solution in this manner. Swarm intelligence is an important subfield of artificial intelligence, and it has great potential for use in the future.
The use of ALO here and its effective role in shaping the solution are remarkable points supporting these ideas about swarm intelligence. The author believes that the future of artificial intelligence will be based on the design of appropriate hybrid systems using the most recent approaches, methods, and techniques, and this study is a remarkable example of developing a hybrid artificial intelligence system to solve real-world problems. The hybrid system introduced here has the potential to become part of modular components that derive feedback for people/users (based on the meaning of the predicted future states of time series) and to form general medical artificial intelligence-based systems such as expert systems and/or medical/clinical decision support systems [126,127,128,129,130]. Furthermore, it may also be a small part of larger, adaptive control systems that continuously support real-time processes performed in medical/healthcare settings. The results obtained here can be thought of as 'butterfly effects' which may shape greater research through small, but effective, outputs.

8. Conclusions

This study introduced a novel hybrid system formed by an artificial neural network (ANN) and the ant-lion optimizer (ALO), applied to the chaotic time series of electroencephalograms (EEGs). As a recent, effective optimization algorithm, ALO was developed through inspiration from the hunting behaviors shown by ant-lions while they are larvae. In this study, ALO was employed in the training phase of the ANN, which uses its inputs to obtain outputs corresponding to future predictions. From a general perspective, this hybrid system was used to predict chaotic time series in the form of medical data (EEG). EEG was chosen as the objective time series because it is a vital data collection method used to diagnose brain diseases early, allowing better healthcare options. Furthermore, EEG is associated with many occurrences in the body of a living organism, so it is possible to diagnose many diseases or interactions with environmental factors through EEG evaluation. Since EEG also has chaotic flow, predicting such time series requires effective prediction approaches; because of this, the ANN–ALO system was employed in this study as an artificial intelligence-based solution. The obtained results and planned future work are as follows:

8.1. Obtained Results

In order to see whether the ANN–ALO system is successful enough at predicting future states of EEGs, it was applied to a total of 10 different EEG datasets recorded from random individuals at the Isparta State Hospital in Isparta, Turkey. The obtained findings showed that the ANN–ALO system can predict the future states of the objective time series, which means it can be used as a predictive infrastructure for EEG time series. In addition, it was determined that the ANN–ALO even outperforms some other alternative hybrid systems formed with an ANN and different swarm intelligence algorithm variations. Finally, the ANN–ALO is also a better prediction system than the traditional ANN–BPA system and some traditional chaotic time series prediction approaches introduced in the associated literature. It is also important that the results here were validated statistically with the Giacomini–White test. In addition to the comparative work, the experiences of physicians who had used the ANN–ALO system in four separate hospitals in Isparta, Turkey were evaluated; the reported experiences showed positive results regarding the practical application of the system. All of these results showed that the ANN–ALO system can be used as an effective, and also quick, prediction approach, and that it can be effectively used by physicians for early diagnosis and further investigation of brain activity.
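As a rough illustration of the training scheme summarized above — a population-based optimizer searching the ANN's weight space to minimize prediction error — the following Python sketch replaces the full ALO mechanics (roulette-wheel selection, random walks of ants around ant-lions) with a simplified elitist random walk whose radius shrinks over iterations, echoing how ALO shrinks the trap around a hunting ant-lion. The network size, toy sine target, and parameter values are illustrative assumptions, not the study's actual configuration:

```python
import math
import random

random.seed(42)

N_IN, N_HID = 3, 4                      # 3 lagged inputs, 4 hidden neurons (illustrative)
DIM = (N_IN + 1) * N_HID + (N_HID + 1)  # total weights and biases to optimize

def ann_predict(w, x):
    """One-hidden-layer feed-forward network with tanh hidden units."""
    idx, hidden = 0, []
    for _ in range(N_HID):
        s = sum(w[idx + i] * x[i] for i in range(N_IN)) + w[idx + N_IN]
        idx += N_IN + 1
        hidden.append(math.tanh(s))
    return sum(w[idx + j] * hidden[j] for j in range(N_HID)) + w[idx + N_HID]

def fitness(w, samples):
    """MSE of one-step-ahead predictions: the quantity the optimizer minimizes."""
    return sum((y - ann_predict(w, x)) ** 2 for x, y in samples) / len(samples)

def train(samples, iters=200, pop=20):
    """Elitist random-walk search standing in for the full ALO mechanics."""
    agents = [[random.uniform(-1, 1) for _ in range(DIM)] for _ in range(pop)]
    best = min(agents, key=lambda w: fitness(w, samples))
    best_fit = fitness(best, samples)
    for t in range(iters):
        radius = (1.0 - t / iters) + 0.01    # walk radius shrinks, like ALO's trap
        for _ in range(pop):
            cand = [wi + random.gauss(0.0, 0.1 * radius) for wi in best]
            f = fitness(cand, samples)
            if f < best_fit:                 # keep only improvements (elitism)
                best, best_fit = cand, f
    return best, best_fit

# Toy target: a lagged sine series stands in for chaotic EEG data.
series = [math.sin(0.4 * t) for t in range(100)]
samples = [(series[t - N_IN:t], series[t]) for t in range(N_IN, len(series))]
weights, err = train(samples)
print("training MSE:", err)
```

In the actual ANN–ALO system, the fitness function would analogously be the prediction error of the ANN over the EEG training data, with the trained weights then used for forecasting.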

8.2. Future Work

The positive results obtained in this study have encouraged the author to continue working with the ANN–ALO system. In this context, future studies include applying the ANN–ALO to alternative chaotic time series in order to investigate its success in time series prediction further. Since the scope of this study was healthcare through the use of EEG data, it is important to evaluate the system with other chaotic time series from different fields. Future work will also include applications to the electrocardiogram (ECG), the key time series of heart activity. Another future study will analyze the effects of different adjustments applied to the ANN and ALO, respectively, and the use of alternative lags/delays (beyond the previously unsuccessful ones) as inputs to the ANN, which have not been investigated before. Beyond the ANN–ALO structure, future studies will also use different hybrid techniques (e.g., ANNs with different intelligent optimization algorithms) to determine their success with chaotic time series, including healthcare-oriented ones.
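The lag/delay inputs mentioned above can be made concrete with a small helper that turns a single series into (inputs, target) pairs; the lag set below is a hypothetical example, not a configuration from the study:

```python
def lagged_samples(series, lags):
    """Build (inputs, target) pairs: each input vector holds the values at the
    given lags/delays behind the target point."""
    max_lag = max(lags)
    return [([series[t - d] for d in lags], series[t])
            for t in range(max_lag, len(series))]

# Hypothetical lag set: 1, 2, and 5 steps behind each target value.
data = list(range(10))
print(lagged_samples(data, [1, 2, 5])[0])  # ([4, 3, 0], 5)
```

Changing `lags` changes which past states the network sees, which is exactly the kind of adjustment such a future study would vary.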

Funding

This research received no external funding.

Conflicts of Interest

The author declares no conflicts of interest.

References

  1. Douglas, A.I.; Williams, G.M.; Samuel, A.W.; Carol, A.W. Basic Statistics for Business & Economics, 3rd ed.; McGraw-Hill: New York, NY, USA, 2009. [Google Scholar]
  2. Esling, P.; Agon, C. Time-series data mining. ACM Comput. Surv. (CSUR) 2012, 45, 12. [Google Scholar] [CrossRef]
  3. NIST SEMATECH. Introduction to Time Series Analysis. In Engineering Statistics Handbook; NIST: Gaithersburg, MD, USA, 2016. Available online: http://www.itl.nist.gov/div898/handbook/pmc/section4/pmc4.htm (accessed on 10 July 2016).
  4. Penn State Eberly College of Science. Overview of Time Series Characteristics, STAT-510 (App. Time Series Analysis). Available online: https://onlinecourses.science.psu.edu/stat510/node/47 (accessed on 10 July 2016).
  5. Gromov, G.A.; Shulga, A.N. Chaotic time series prediction with employment of ant colony optimization. Expert Syst. Appl. 2012, 39, 8474–8478. [Google Scholar] [CrossRef]
  6. Mirjalili, S. The ant lion optimizer. Adv. Eng. Softw. 2015, 83, 80–98. [Google Scholar] [CrossRef]
  7. Mirjalili, S.; Jangir, P.; Saremi, S. Multi-objective ant lion optimizer: A multi-objective optimization algorithm for solving engineering problems. Appl. Intell. 2017, 46, 79–95. [Google Scholar] [CrossRef]
  8. Yao, P.; Wang, H. Dynamic Adaptive Ant Lion Optimizer applied to route planning for unmanned aerial vehicle. Soft Comput. 2017, 21, 5475–5488. [Google Scholar] [CrossRef]
  9. Mani, M.; Bozorg-Haddad, O.; Chu, X. Ant Lion Optimizer (ALO) Algorithm. In Advanced Optimization by Nature-Inspired Algorithms; Springer: Singapore, 2018; pp. 105–116. [Google Scholar]
  10. Kose, U.; Arslan, A. Forecasting chaotic time series via anfis supported by vortex optimization algorithm: Applications on electroencephalogram time series. Arab. J. Sci. Eng. 2017, 42, 3103–3114. [Google Scholar] [CrossRef]
  11. Gan, M.; Peng, H.; Peng, X.; Chen, X.; Inoussa, G. A locally linear RBF network-based state-dependent AR model for nonlinear time series modeling. Inf. Sci. 2010, 180, 4370–4383. [Google Scholar] [CrossRef]
  12. Wong, W.K.; Xia, M.; Chu, W.C. Adaptive neural network model for time-series forecasting. Eur. J. Oper. Res. 2010, 207, 807–816. [Google Scholar] [CrossRef]
  13. Gentili, P.L.; Gotoda, H.; Dolnik, M.; Epstein, I.R. Analysis and prediction of aperiodic hydrodynamic oscillatory time series by feed-forward neural networks, fuzzy logic, and a local nonlinear predictor. Chaos 2015, 25, 013104. [Google Scholar] [CrossRef] [PubMed]
  14. Chen, D.; Han, W. Prediction of multivariate chaotic time series via radial basis function neural network. Complexity 2013, 18, 55–66. [Google Scholar] [CrossRef]
  15. Wu, X.; Li, C.; Wang, Y.; Zhu, Z.; Liu, W. Nonlinear time series prediction using iterated extended Kalman filter trained single multiplicative neuron model. J. Inf. Comput. Sci. 2013, 10, 385–393. [Google Scholar]
  16. Yadav, R.N.; Kalra, P.K.; John, J. Time series prediction with single multiplicative neuron model. Appl. Soft Comput. 2007, 7, 1157–1163. [Google Scholar] [CrossRef]
  17. Zhao, L.; Yang, Y. PSO-based single multiplicative neuron model for time series prediction. Expert Syst. Appl. 2009, 36, 2805–2812. [Google Scholar] [CrossRef]
  18. Yao, J.; Liu, W. Nonlinear time series prediction of atmospheric visibility in shanghai. In Time Series Analysis, Modeling and Applications; Intelligent Systems Reference Library; Pedrycz, W., Chen, S.-M., Eds.; Springer: Berlin/Heidelberg, Germany, 2013; Volume 47. [Google Scholar]
  19. Unler, A. Improvement of energy demand forecasts using swarm intelligence: The case of Turkey with projections to 2025. Energy Policy 2008, 36, 1937–1944. [Google Scholar] [CrossRef]
  20. Porto, A.; Irigoyen, E.; Larrea, M. A PSO boosted ensemble of extreme learning machines for time series forecasting. In The 13th International Conference on Soft Computing Models in Industrial and Environmental Applications; Springer International Publishing AG: Cham, Switzerland, 2018; pp. 324–333. [Google Scholar]
  21. Weng, S.S.; Liu, Y.H. Mining time series data for segmentation by using ant colony optimization. Eur. J. Oper. Res. 2006, 173, 921–937. [Google Scholar] [CrossRef]
  22. Toskari, M.D. Estimating the net electricity energy generation and demand using the ant colony optimization approach. Energy Policy 2009, 37, 1181–1187. [Google Scholar]
  23. Hong, W.C. Application of chaotic ant swarm optimization in electric load forecasting. Energy Policy 2010, 38, 5830–5839. [Google Scholar] [CrossRef]
  24. Niu, D.; Wang, Y.; Wu, D.D. Power load forecasting using support vector machine and ant colony optimization. Expert Syst. Appl. 2010, 37, 2531–2539. [Google Scholar] [CrossRef]
  25. Yeh, W.-C. New parameter-free simplified swarm optimization for artificial neural network training and its application in the prediction of time series. IEEE Trans. Neural Netw. Learn. Syst. 2013, 24, 661–665. [Google Scholar] [PubMed]
  26. Nourani, V.; Andalib, G. Wavelet based Artificial Intelligence approaches for prediction of hydrological time series. In Proceedings of the Australasian Conference on Artificial Life and Computational Intelligence, Newcastle, NSW, Australia, 5–7 February 2015; pp. 422–435. [Google Scholar]
  27. Bontempi, G.; Taieb, S.B.; Le Borgne, Y.-A. Machine learning strategies for time series forecasting. In Business Intelligence; Lecture Notes in Business Information Processing; Aufaure, M.-A., Zimanyi, E., Eds.; Springer: Berlin/Heidelberg, Germany, 2013; Volume 138. [Google Scholar]
  28. Hu, Y.X.; Zhang, H.T. Prediction of the chaotic time series based on chaotic simulated annealing and support vector machine. In Proceedings of the International Conference on Solid State Devices and Materials Science, Macao, China, 1–2 April 2012; pp. 506–512. [Google Scholar]
  29. Liu, P.; Yao, J.A. Application of least square support vector machine based on particle swarm optimization to chaotic time series prediction. In Proceedings of the IEEE International Conference on Intelligent Computing and Intelligent Systems, Shanghai, China, 20–22 November 2009; pp. 458–462. [Google Scholar]
  30. Quian, J.S.; Cheng, J.; Guo, Y.N. A novel multiple support vector machines architecture for chaotic time series prediction. In Proceedings of the ICNC: International Conference on Natural Computation, Xi’an, China, 24–28 September 2006; Volume 4221, pp. 147–156. [Google Scholar]
  31. Yang, Z.H.O.; Wang, Y.S.; Li, D.D.; Wang, C.J. Predict the time series of the parameter-varying chaotic system based on reduced recursive lease square support vector machine. In Proceedings of the IEEE International Conference on Artificial Intelligence and Computational Intelligence, Shanghai, China, 7–8 November 2009; pp. 29–34. [Google Scholar]
  32. Zhang, J.S.; Dang, J.L.; Li, H.C. Local support vector machine prediction of spatiotemporal chaotic time series. Acta Phys. Sin. 2007, 56, 67–77. [Google Scholar]
  33. Farooq, T.; Guergachi, A.; Krishnan, S. Chaotic time series prediction using knowledge based Green’s kernel and least-squares support vector machines. In Proceedings of the IEEE International Conference on Systems, Man and Cybernetics, Montreal, QC, Canada, 7–10 October 2007; pp. 2669–2674. [Google Scholar]
  34. Shi, Z.W.; Han, M. Support vector echo-state machine for chaotic time-series prediction. IEEE Trans. Neural Netw. 2007, 18, 359–372. [Google Scholar] [CrossRef] [PubMed]
  35. Li, H.T.; Zhang, X.F. Precipitation time series predicting of the chaotic characters using support vector machines. In Proceedings of the International Conference on Information Management, Innovation Management and Industrial Engineering, Xi’an, China, 26–27 December 2009; pp. 407–410. [Google Scholar]
  36. Zhu, C.H.; Li, L.L.; Li, J.H.; Gao, J.S. Short-term wind speed forecasting by using chaotic theory and SVM. Appl. Mech. Mater. 2013, 300–301, 842–847. [Google Scholar] [CrossRef]
  37. Ren, C.-X.; Wang, C.-B.; Yin, C.-C.; Chen, M.; Shan, X. The prediction of short-term traffic flow based on the niche genetic algorithm and BP neural network. In Proceedings of the 2012 International Conference on Information Technology and Software Engineering, Beijing, China, 8–10 December 2012; pp. 775–781. [Google Scholar]
  38. Ding, C.; Wang, W.; Wang, X.; Baumann, M. A neural network model for driver’s lane-changing trajectory prediction in urban traffic flow. Math. Probl. Eng. 2013. [Google Scholar] [CrossRef]
  39. Yin, H.; Wong, S.C.; Xu, J.; Wong, C.K. Urban traffic flow prediction using a fuzzy-neural approach. Transp. Res. Part C Emerg. Technol. 2002, 10, 85–98. [Google Scholar] [CrossRef]
  40. Dunne, S.; Ghosh, B. Weather adaptive traffic prediction using neurowavelet models. IEEE Trans. Intell. Transp. Syst. 2013, 14, 370–379. [Google Scholar] [CrossRef]
  41. Pulido, M.; Melin, P.; Castillo, O. Particle swarm optimization of ensemble neural networks with fuzzy aggregation for time series prediction of the Mexican Stock Exchange. Inf. Sci. 2014, 280, 188–204. [Google Scholar] [CrossRef]
  42. Huang, D.Z.; Gong, R.X.; Gong, S. Prediction of wind power by chaos and BP artificial neural networks approach based on genetic algorithm. J. Electr. Eng. Technol. 2015, 10, 41–46. [Google Scholar] [CrossRef]
  43. Jiang, P.; Qin, S.; Wu, J.; Sun, B. Time series analysis and forecasting for wind speeds using support vector regression coupled with artificial intelligent algorithms. Math. Probl. Eng. 2015, 2015, 939305. [Google Scholar] [CrossRef]
  44. Doucoure, B.; Agbossou, K.; Cardenas, A. Time series prediction using artificial wavelet neural network and multi-resolution analysis: Application to wind speed data. Renew. Energy 2016, 92, 202–211. [Google Scholar] [CrossRef]
  45. Chandra, R. Competition and collaboration in cooperative coevolution of Elman recurrent neural networks for time-series prediction. IEEE Trans. Neural Netw. Learn. Syst. 2015, 26, 3123–3136. [Google Scholar] [CrossRef] [PubMed]
  46. Chai, S.H.; Lim, J.S. Forecasting business cycle with chaotic time series based on neural network with weighted fuzzy membership functions. Chaos Solitons Fractals 2016, 90, 118–126. [Google Scholar] [CrossRef]
  47. Seo, Y.; Kim, S.; Kisi, O.; Singh, V.P. Daily water level forecasting using wavelet decomposition and Artificial Intelligence techniques. J. Hydrol. 2015, 520, 224–243. [Google Scholar] [CrossRef]
  48. Marzban, F.; Ayanzadeh, R.; Marzban, P. Discrete time dynamic neural networks for predicting chaotic time series. J. Artif. Intell. 2014, 7, 24. [Google Scholar] [CrossRef]
  49. Okkan, U. Wavelet neural network model for reservoir inflow prediction. Sci. Iran. 2012, 19, 1445–1455. [Google Scholar] [CrossRef]
  50. Zhou, T.; Gao, S.; Wang, J.; Chu, C.; Todo, Y.; Tang, Z. Financial time series prediction using a dendritic neuron model. Knowl.-Based Syst. 2016, 105, 214–224. [Google Scholar] [CrossRef]
  51. Wang, L.; Zou, F.; Hei, X.; Yang, D.; Chen, D.; Jiang, Q.; Cao, Z. A hybridization of teaching–learning-based optimization and differential evolution for chaotic time series prediction. Neural Comput. Appl. 2014, 25, 1407–1422. [Google Scholar] [CrossRef]
  52. Heydari, G.; Vali, M.; Gharaveisi, A.A. Chaotic time series prediction via artificial neural square fuzzy inference system. Expert Syst. Appl. 2016, 55, 461–468. [Google Scholar] [CrossRef]
  53. Wang, L.; Zeng, Y.; Chen, T. Back propagation neural network with adaptive differential evolution algorithm for time series forecasting. Expert Syst. Appl. 2015, 42, 855–863. [Google Scholar] [CrossRef]
  54. Catalao, J.P.S.; Pousinho, H.M.I.; Mendes, V.M.F. Hybrid wavelet-PSO-ANFIS approach for short-term electricity prices forecasting. IEEE Trans. Power Syst. 2011, 26, 137–144. [Google Scholar] [CrossRef]
  55. Patra, A.; Das, S.; Mishra, S.N.; Senapati, M.R. An adaptive local linear optimized radial basis functional neural network model for financial time series prediction. Neural Comput. Appl. 2017, 28, 101–110. [Google Scholar] [CrossRef]
  56. Ravi, V.; Pradeepkumar, D.; Deb, K. Financial time series prediction using hybrids of chaos theory, multi-layer perceptron and multi-objective evolutionary algorithms. Swarm Evol. Comput. 2017, 36, 136–149. [Google Scholar] [CrossRef]
  57. Méndez, E.; Lugo, O.; Melin, P. A competitive modular neural network for long-term time series forecasting. In Nature-Inspired Design of Hybrid Intelligent Systems; Springer International Publishing: Cham, Switzerland, 2017; pp. 243–254. [Google Scholar]
  58. Wei, B.L.; Luo, X.S.; Wang, B.H.; Guo, W.; Fu, J.J. Prediction of EEG signal by using radial basis function neural networks. Chin. J. Biomed. Eng. 2003, 22, 488–492. [Google Scholar]
  59. Hou, M.-Z.; Han, X.-L.; Huang, X. Application of BP neural network for forecast of EEG signal. Comput. Eng. Des. 2006, 14, 061. [Google Scholar]
  60. Wei, C.; Zhang, C.; Wu, M. A study on the universal method of EEG and ECG prediction. In Proceedings of the 2017 10th International Congress on Image and Signal Processing, BioMedical Engineering and Informatics (CISP-BMEI), Shanghai, China, 14–16 October 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 1–5. [Google Scholar]
  61. Blinowska, K.J.; Malinowski, M. Non-linear and linear forecasting of the EEG time series. Biol. Cybern. 1991, 66, 159–165. [Google Scholar] [CrossRef] [PubMed]
  62. Lin, W.B.; Shu, L.X.; Hong, W.B.; Jun, Q.H.; Wei, G.; Jie, F.J. A method based on the third-order Volterra filter for adaptive predictions of chaotic time series. Acta Phys. Sin. 2002, 10, 006. [Google Scholar]
  63. Coyle, D.; Prasad, G.; McGinnity, T.M. A time-series prediction approach for feature extraction in a brain-computer interface. IEEE Trans. Neural Syst. Rehabil. Eng. 2005, 13, 461–467. [Google Scholar] [CrossRef] [PubMed]
  64. Coelho, V.N.; Coelho, I.M.; Coelho, B.N.; Souza, M.J.; Guimarães, F.G.; Luz, E.D.S.; Barbosa, A.C.; Coelho, M.N.; Netto, G.G.; Costa, R.C.; et al. EEG time series learning and classification using a hybrid forecasting model calibrated with GVNS. Electron. Notes Discret. Math. 2017, 58, 79–86. [Google Scholar] [CrossRef]
  65. Komijani, H.; Nabaei, A.; Zarrabi, H. Classification of normal and epileptic EEG signals using adaptive neuro-fuzzy network based on time series prediction. Neurosci. Biomed. Eng. 2016, 4, 273–277. [Google Scholar] [CrossRef]
  66. Prasad, S.C.; Prasad, P. Deep recurrent neural networks for time series prediction. arXiv, 2014; arXiv:1407.5949. [Google Scholar]
  67. Forney, E.M. Electroencephalogram Classification by Forecasting with Recurrent Neural Networks. Master’s Dissertation, Department of Computer Science, Colorado State University, Fort Collins, CO, USA, 2011. [Google Scholar]
  68. Carpenter, G.A. Neural network models for pattern recognition and associative memory. Neural Netw. 1989, 2, 243–257. [Google Scholar] [CrossRef]
  69. Cochocki, A.; Unbehauen, R. Neural Networks for Optimization and Signal Processing; John Wiley & Sons, Inc.: Chichester, UK, 1993. [Google Scholar]
  70. Miller, W.T.; Sutton, R.S.; Werbos, P.J. Neural Networks for Control; MIT Press: Cambridge, MA, USA, 1995. [Google Scholar]
  71. Ripley, B.D. Neural networks and related methods for classification. J. R. Stat. Soc. Ser. B 1994, 56, 409–456. [Google Scholar]
  72. Basheer, I.A.; Hajmeer, M. Artificial neural networks: Fundamentals, computing, design and application. J. Microbiol. Methods 2000, 43, 3–31. [Google Scholar] [CrossRef]
  73. Badri, A.; Ameli, Z.; Birjandi, A.M. Application of artificial neural networks and fuzzy logic methods for short term load forecasting. Energy Procedia 2012, 14, 1883–1888. [Google Scholar] [CrossRef]
  74. Ghorbanian, J.; Ahmadi, M.; Soltani, R. Design predictive tool and optimization of journal bearing using neural network model and multi-objective genetic algorithm. Sci. Iran. 2011, 18, 1095–1105. [Google Scholar] [CrossRef]
  75. Gholizadeh, S.; Seyedpoor, S.M. Shape optimization of arch dams by metaheuristics and neural networks for frequency constraints. Sci. Iran. 2011, 18, 1020–1027. [Google Scholar] [CrossRef]
  76. Firouzi, A.; Rahai, A. An integrated ANN-GA for reliability based inspection of concrete bridge decks considering extent of corrosion-induced cracks and life cycle costs. Sci. Iran. 2012, 19, 974–981. [Google Scholar] [CrossRef]
  77. Shahreza, M.L.; Moazzami, D.; Moshiri, B.; Delavar, M.R. Anomaly detection using a self-organizing map and particle swarm optimization. Sci. Iran. 2011, 18, 1460–1468. [Google Scholar] [CrossRef]
  78. Isokawa, T.; Nishimura, H.; Matsui, N. Quaternionic multilayer perceptron with local analyticity. Information 2012, 3, 756–770. [Google Scholar] [CrossRef]
  79. Kose, U.; Arslan, A. Optimization of self-learning in Computer Engineering courses: An intelligent software system supported by Artificial Neural Network and Vortex Optimization Algorithm. Comput. Appl. Eng. Educ. 2017, 25, 142–156. [Google Scholar] [CrossRef] [Green Version]
  80. McCulloch, W.S.; Pitts, W. A logical calculus of the ideas immanent in nervous activity. Bull. Math. Biophys. 1943, 5, 115–133. [Google Scholar] [CrossRef]
  81. Anderson, D.; McNeill, G. Artificial Neural Networks Technology; A DACS State-of-the-Art Report; Kaman Sciences Corporation: Utica, NY, USA, 1992. [Google Scholar]
  82. Ugur, A.; Kinaci, A.C. A web-based tool for teaching neural network concepts. Comput. Appl. Eng. Educ. 2010, 18, 449–457. [Google Scholar] [CrossRef]
  83. Yegnanarayana, B. Artificial Neural Networks; PHI Learning Pvt. Ltd.: Delhi, India, 2009. [Google Scholar]
  84. Raju, M.; Saikia, L.C.; Sinha, N. Automatic generation control of a multi-area system using ant lion optimizer algorithm based PID plus second order derivative controller. Int. J. Electr. Power Energy Syst. 2016, 80, 52–63. [Google Scholar] [CrossRef]
  85. Kamboj, V.K.; Bhadoria, A.; Bath, S.K. Solution of non-convex economic load dispatch problem for small-scale power systems using ant lion optimizer. Neural Comput. Appl. 2017, 28, 2181–2192. [Google Scholar] [CrossRef]
  86. Yamany, W.; Tharwat, A.; Hassanin, M.F.; Gaber, T.; Hassanien, A.E.; Kim, T.H. A new multi-layer perceptrons trainer based on ant lion optimization algorithm. In Proceedings of the 2015 Fourth International Conference on Information Science and Industrial Applications (ISI), Busan, Korea, 20–22 September 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 40–45. [Google Scholar]
  87. Maher, M.; Ebrahim, M.A.; Mohamed, E.A.; Mohamed, A. Ant-lion Optimizer Based Optimal Allocation of Distributed Generators in Radial Distribution Networks. Int. J. Eng. Inf. Syst. 2017, 1, 225–238. [Google Scholar]
  88. Kilic, H.; Yuzgec, U. Improved antlion optimization algorithm via tournament selection. In Proceedings of the 2017 9th International Conference on Computational Intelligence and Communication Networks (CICN), Girne, Cyprus, 16–17 September 2017; IEEE: Piscataway, NJ, USA, 2017; pp. 200–205. [Google Scholar]
  89. Ali, A.H.; Youssef, A.R.; George, T.; Kamel, S. Optimal DG allocation in distribution systems using Ant lion optimizer. In Proceedings of the 2018 International Conference on Innovative Trends in Computer Engineering (ITCE), Aswan, Egypt, 19–21 February 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 324–331. [Google Scholar]
  90. Pradhan, R.; Majhi, S.K.; Pradhan, J.K.; Pati, B.B. Performance Evaluation of PID Controller for an Automobile Cruise Control System using Ant Lion Optimizer. Eng. J. 2017, 21, 347–361. [Google Scholar] [CrossRef] [Green Version]
  91. Rajan, A.; Jeevan, K.; Malakar, T. Weighted elitism based Ant Lion Optimizer to solve optimum VAr planning problem. Appl. Soft Comput. 2017, 55, 352–370. [Google Scholar] [CrossRef]
  92. Blum, C.; Li, X. Swarm intelligence in optimization. In Swarm Intelligence; Blum, C., Merkle, D., Eds.; Springer: Berlin/Heidelberg, Germany, 2008. [Google Scholar]
  93. Engelbrecht, A.P. Fundamentals of Computational Swarm Intelligence; John Wiley & Sons: Hoboken, NJ, USA, 2006. [Google Scholar]
  94. Bonabeau, E.; Dorigo, M.; Theraulaz, G. Swarm Intelligence: From Natural to Artificial Systems (No. 1); Oxford University Press: Oxford, UK, 1999. [Google Scholar]
  95. Panigrahi, B.K.; Shi, Y.; Lim, M.H. (Eds.) Handbook of Swarm Intelligence: Concepts, Principles and Applications; Springer Science & Business Media: Berlin, Germany, 2011; Volume 8. [Google Scholar]
  96. Fukuyama, Y. Fundamentals of particle swarm optimization techniques. In Modern Heuristic Optimization Techniques: Theory and Applications to Power Systems; Lee, K.Y., El-Sharkawi, M.A., Eds.; John Wiley & Sons: Hoboken, NJ, USA, 2008. [Google Scholar]
  97. Bonabeau, E.; Dorigo, M.; Theraulaz, G. Inspiration for optimization from social insect behaviour. Nature 2000, 406, 39–42. [Google Scholar] [CrossRef] [PubMed]
  98. Kennedy, J. Particle swarm optimization. In Encyclopedia of Machine Learning; Sammut, C., Webb, G.I., Eds.; Springer: New York, NY, USA, 2011. [Google Scholar]
  99. Dorigo, M.; Blum, C. Ant colony optimization theory: A survey. Theor. Comput. Sci. 2005, 344, 243–278. [Google Scholar] [CrossRef]
  100. Karaboga, D. Artificial Intelligence Optimization Algorithms; Nobel Publishing: Ankara, Turkey, 2004; ISBN 975-6574. (In Turkish) [Google Scholar]
  101. Kose, U. Development of Artificial Intelligence Based Optimization Algorithms. Ph.D. Thesis, Institute of Natural Sciences, Department of Computer Engineering, Selcuk University, Konya, Turkey, 2017. (In Turkish). [Google Scholar]
  102. MyScienceSchool.org. What Is Electroencephalography (EEG)? Available online: http://myscienceschool.org/index.php?/archives/3208-What-is-Electroencephalography-EEG.html (accessed on 10 March 2017).
  103. Sjölie, D. Reality-Based Brain-Computer Interaction. Ph.D. Thesis, Department of Computing Science, Umeå University, Umeå, Sweden, 2011. Available online: https://www.researchgate.net/publication/215509007_Reality-Based_Brain-Computer_Interaction (accessed on 10 March 2017).
  104. Jadhav, P.; Shanamugan, D.; Chourasia, A.; Ghole, A.R.; Acharyya, A.; Naik, G.R. Automated detection and correction of eye blink and muscular artefacts in EEG signal for analysis of Autism Spectrum Disorder. In Proceedings of the 2014 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC 2014), Chicago, IL, USA, 26–30 August 2014; pp. 1881–1884. [Google Scholar]
  105. Sandri, M. Numerical calculation of Lyapunov exponents. Math. J. 1996, 6, 78–84. [Google Scholar]
  106. Bishop, C.M. Neural Networks for Pattern Recognition; Oxford University Press: Oxford, UK, 1995. [Google Scholar]
  107. OTexts.org. Evaluating Forecast Accuracy. Available online: https://www.otexts.org/fpp/2/5 (accessed on 16 July 2016).
  108. Willmott, C.J.; Matsuura, K. Advantages of the mean absolute error (MAE) over the root mean square error (RMSE) in assessing average model performance. Clim. Res. 2005, 30, 79–82. [Google Scholar] [CrossRef] [Green Version]
  109. Eberhart, R.C.; Kennedy, J. A new optimizer using particle swarm theory. In Proceedings of the Sixth International Symposium on Micro Machine and Human Science, Nagoya, Japan, 4–6 October 1995; pp. 39–43. [Google Scholar]
  110. Kennedy, J. The particle swarm: Social adaptation of knowledge. In Proceedings of the 1997 IEEE International Conference on Evolutionary Computation, Indianapolis, IN, USA, 13–16 April 1997; IEEE: Piscataway, NJ, USA, 1997; pp. 303–308. [Google Scholar]
  111. Yang, X.S.; Deb, S. Cuckoo search via Lévy flights. In Proceedings of the 2009 World Congress on Nature & Biologically Inspired Computing, Coimbatore, India, 9–11 December 2009; IEEE: Piscataway, NJ, USA, 2009; pp. 210–214. [Google Scholar]
  112. Yang, X.S.; Deb, S. Cuckoo search: Recent advances and applications. Neural Comput. Appl. 2014, 24, 169–174. [Google Scholar] [CrossRef]
  113. Yang, X.S. Nature-Inspired Metaheuristic Algorithms; Luniver Press: Frome, UK, 2010. [Google Scholar]
  114. Yang, X.S. Firefly algorithms for multimodal optimization. In Stochastic Algorithms: Foundations and Applications; Watanabe, O., Zeugmann, T., Eds.; Springer: Berlin/Heidelberg, Germany, 2009. [Google Scholar]
  115. Yang, X.S. A new metaheuristic bat-inspired algorithm. In Nature Inspired Cooperative Strategies for Optimization (NICSO 2010); González, J.R., Pelta, D.A., Cruz, C., Terrazas, G., Krasnogor, N., Eds.; Springer: Berlin/Heidelberg, Germany, 2010. [Google Scholar]
  116. Yang, X.S.; Hossein Gandomi, A. Bat algorithm: A novel approach for global engineering optimization. Eng. Comput. 2012, 29, 464–483. [Google Scholar] [CrossRef]
  117. Chauvin, Y.; Rumelhart, D.E. Backpropagation: Theory, Architectures, and Applications; Psychology Press: Hillsdale, NJ, USA, 2013. [Google Scholar]
  118. Dasgupta, S.; Osogami, T. Nonlinear Dynamic Boltzmann Machines for Time-Series Prediction. In Proceedings of the AAAI, San Francisco, CA, USA, 4–9 February 2017; pp. 1833–1839.
  119. Kim, K.J. Financial time series forecasting using support vector machines. Neurocomputing 2003, 55, 307–319.
  120. Hassan, M.R.; Nath, B. Stock market forecasting using hidden Markov model: A new approach. In Proceedings of the 5th International Conference on Intelligent Systems Design and Applications (ISDA ’05), Warsaw, Poland, 8–10 September 2005; IEEE: Piscataway, NJ, USA, 2005; pp. 192–196.
  121. Brahim-Belhouari, S.; Bermak, A. Gaussian process for nonstationary time series prediction. Comput. Stat. Data Anal. 2004, 47, 705–712.
  122. Ediger, V.Ş.; Akar, S. ARIMA forecasting of primary energy demand by fuel in Turkey. Energy Policy 2007, 35, 1701–1708.
  123. Poggi, P.; Muselli, M.; Notton, G.; Cristofari, C.; Louche, A. Forecasting and simulating wind speed in Corsica by using an autoregressive model. Energy Convers. Manag. 2003, 44, 3177–3196.
  124. Larose, D.T.; Larose, C.D. K-nearest neighbor algorithm. In Discovering Knowledge in Data: An Introduction to Data Mining, 2nd ed.; Wiley: Hoboken, NJ, USA, 2005; pp. 149–164.
  125. Giacomini, R.; White, H. Tests of conditional predictive ability. Econometrica 2006, 74, 1545–1578.
  126. Giarratano, J.C.; Riley, G. Expert Systems; PWS Publishing Co.: Boston, MA, USA, 1998.
  127. Turban, E.; Frenzel, L.E. Expert Systems and Applied Artificial Intelligence; Prentice Hall Professional Technical Reference: New York, NY, USA, 1992.
  128. David, J.M.; Krivine, J.P.; Simmons, R. (Eds.) Second Generation Expert Systems; Springer Science & Business Media: Berlin, Germany; New York, NY, USA, 2012.
  129. Guerlain, S.; Smith, P.J.; Smith, J.W.; Rudmann, S.; Obradovich, J.; Strohm, P. Decision Support in Medical Systems. In Automation and Human Performance: Theory and Applications; CRC Press: Boca Raton, FL, USA, 1996; pp. 385–406.
  130. Musen, M.A.; Middleton, B.; Greenes, R.A. Clinical decision-support systems. In Biomedical Informatics; Springer: London, UK, 2014; pp. 643–674.
Figure 1. Typical structure of a multi-layer artificial neural network (ANN) [78] (the figure is reproduced as-is from an open-access source).
Figure 2. Ant-lions (shown with blue arrow), ants, and the hunting behavior, which are inspiration sources for the ant-lion optimizer (ALO) algorithm [6] (the figures are free to use according to the permission policies of Inderscience, publisher of the referenced article).
Figure 3. Brief scheme of the artificial neural network (ANN) and ant-lion optimizer (ALO) hybrid system (ANN–ALO) for chaotic electroencephalogram (EEG) prediction.
Figure 4. (a) Electroencephalogram (EEG) recording approach [102]; (b) placement of the electrodes according to the 10–20 system while recording EEG [103] (the figures were sourced from the public domain).
Figure 5. The chaotic electroencephalogram (EEG) time series that was included in the study for prediction applications.
Figure 6. Phase-space state portraits of the EEG time series (Figure 5) included in the study.
Figure 7. Predictions for both the training and test data of the EEG time series A1–A5 (the white line represents the original data and the red dots the predictions; some clearly visible periods containing errors are marked with blue squares).
Figure 8. Predictions for both the training and test data of the EEG time series A6–A10 (the white line represents the original data and the red dots the predictions; some clearly visible periods containing errors are marked with blue squares).
Figure 9. Ranking (by points) of the approaches employed to predict the EEG time series in this study.
Figure 10. (a) Early epilepsy seizure diagnosis done with the ANN–ALO approach at the Hospital of SDU; and (b) normal patient EEG prediction done with the ANN–ALO approach at the Hospital of SDU.
Table 1. Lyapunov exponents for each EEG time series application (Figure 5) in the study.
Application (EEG Time Series) | Lyapunov Exponent 1 | Lyapunov Exponent 2 | Lyapunov Exponent 3
A1 | 0.6515 × 10−3 | 0.5078 × 10−4 | −0.7469 × 10−3
A2 | 0.2002 × 10−3 | −0.2354 × 10−4 | −0.4054 × 10−3
A3 | −0.0480 × 10−3 | −0.9309 × 10−4 | 0.8753 × 10−3
A4 | 0.2552 × 10−3 | −0.2734 × 10−4 | 0.5395 × 10−3
A5 | 0.4205 × 10−3 | 0.5442 × 10−4 | −0.0385 × 10−3
A6 | −0.8610 × 10−3 | 0.7984 × 10−4 | −0.1260 × 10−3
A7 | 0.1245 × 10−3 | −0.0166 × 10−4 | 0.2455 × 10−3
A8 | −0.3045 × 10−3 | 0.7614 × 10−4 | 0.3279 × 10−3
A9 | 0.6972 × 10−3 | −0.1718 × 10−4 | 0.9491 × 10−3
A10 | 0.2330 × 10−3 | −0.4401 × 10−4 | −0.8605 × 10−3
Positive values are in bold style.
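As a quick illustration of how Table 1 supports the chaos claim, the following sketch flags a series as chaotic when at least one of its estimated Lyapunov exponents is positive (the exponent values are transcribed from Table 1 for three applications; the helper name `is_chaotic` is ours, not from the study):

```python
# Estimated Lyapunov exponents per application, transcribed from Table 1.
exponents = {
    "A1": (0.6515e-3, 0.5078e-4, -0.7469e-3),
    "A2": (0.2002e-3, -0.2354e-4, -0.4054e-3),
    "A3": (-0.0480e-3, -0.9309e-4, 0.8753e-3),
}

def is_chaotic(lyapunov_exponents):
    """A time series is taken as chaotic if any estimated exponent is positive."""
    return any(value > 0 for value in lyapunov_exponents)

print([name for name, values in exponents.items() if is_chaotic(values)])
# ['A1', 'A2', 'A3']
```

The same check applied to the full table marks all ten applications as chaotic, since each row in Table 1 contains at least one positive exponent.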
Table 2. Average true prediction rate values of different back propagation algorithm (BPA)-trained ANN models for the EEG time series (Figure 5) in the study.
ANN–BPA Model No. | Values of ANN Input Neurons | Value of ANN Output Neuron | Total Hidden Layers | Total Neurons in Each Hidden Layer | Average True Prediction Rate
1 | x(time); x(time − 3); x(time − 6) [10] | x(time + 3) | 3 | 5 | 88.58%
2 | x(time − 2); x(time − 2); x(time − 3); x(time − 4) [65] | x(time + 2) | 3 | 6 | 90.63%
3 | x(time); x(time − 3); x(time − 6); x(time − 9) | x(time + 3) | 3 | 7 | 93.81%
4 | x(time); x(time − 2); x(time − 4) | x(time + 2) | 4 | 7 | 79.84%
5 | x(time); x(time − 3); x(time − 6); x(time − 9) | x(time + 3) | 4 | 6 | 84.19%
Total data rows for each EEG time series: 4000; total data rows for test: 2000. The best model is in bold style.
Table 3. Adjustments of ANN models employed in prediction applications for the EEG time series (Figure 5) in the study.
Total Neurons in the Input Layer | Values of ANN Input Neurons | Total Neurons in the Output Layer | Value of ANN Output Neuron | Total Hidden Layers | Total Neurons in Each Hidden Layer | Activation Function
4 | x(time); x(time − 3); x(time − 6); x(time − 9) | 1 | x(time + 3) | 3 | 7 | Sigmoid
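The input/output layout in Table 3 amounts to a sliding-window dataset: each training pair takes the samples at lags 0, 3, 6, and 9 as inputs and the sample three steps ahead as the target. A minimal sketch of such a builder (function and variable names are ours, not from the study):

```python
def make_dataset(series, lags=(0, 3, 6, 9), horizon=3):
    """Build (inputs, target) pairs matching the Table 3 layout:
    inputs x(t), x(t-3), x(t-6), x(t-9); target x(t+3)."""
    pairs = []
    # t must leave room for the deepest lag behind and the horizon ahead.
    for t in range(max(lags), len(series) - horizon):
        inputs = [series[t - lag] for lag in lags]
        target = series[t + horizon]
        pairs.append((inputs, target))
    return pairs

# With a toy ramp series, the first usable time step is t = 9:
pairs = make_dataset(list(range(20)))
print(pairs[0])  # ([9, 6, 3, 0], 12)
```

With 4000 data rows per EEG time series (Table 2's footnote), this windowing yields one pair per admissible time step, which the ANN then learns to map through its three hidden layers of seven neurons.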
Table 4. Five different ALO adjustments considered for training ANN models along the prediction applications for the EEG time series (Figure 5) in the study.
Parameters | Adjust. 1 (ALO-A1) | Adjust. 2 (ALO-A2) | Adjust. 3 (ALO-A3) | Adjust. 4 (ALO-A4) | Adjust. 5 (ALO-A5)
Number of ant-lions (particles) | 50 | 75 | 100 | 100 | 125
Total iterations (stopping criterion) | 2000 | 3000 | 4000 | 5000 | 6000
Table 5. Mean absolute error (MAE) values calculated for each EEG time series (Figure 5) via the ANN–ALO system with different ALO adjustments.
Application (EEG Time Series) | ALO-A1 | ALO-A2 | ALO-A3 | ALO-A4 | ALO-A5
A1 | 12.4531 | 12.2219 | 11.2195 | 11.0653 | 12.1029
A2 | 13.8490 | 14.5119 | 13.3933 | 12.9048 | 12.6544
A3 | 15.8722 | 16.6799 | 15.4337 | 14.1190 | 15.3290
A4 | 12.6193 | 11.0907 | 10.8994 | 10.7726 | 11.1167
A5 | 16.3980 | 17.8881 | 16.2110 | 14.3851 | 15.2962
A6 | 13.1566 | 14.8791 | 12.1109 | 12.9671 | 13.0991
A7 | 15.6777 | 16.3109 | 16.1981 | 15.4309 | 15.7193
A8 | 20.4559 | 20.3984 | 19.8566 | 17.0994 | 17.7556
A9 | 22.3983 | 21.7839 | 20.9061 | 19.6770 | 19.4211
A10 | 22.7159 | 19.9941 | 20.4185 | 18.9134 | 20.1498
Best MAE values are in bold style.
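For reference, the MAE reported in Table 5 and the MSE reported in Table 6 follow the standard definitions; a minimal sketch:

```python
def mae(actual, predicted):
    """Mean absolute error: average of |actual - predicted| (as in Table 5)."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def mse(actual, predicted):
    """Mean squared error: average of (actual - predicted)^2 (as in Table 6)."""
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)
```

Both are computed over the same prediction horizon; MAE stays in the units of the EEG amplitude, while MSE penalizes occasional large deviations more heavily, which is why the two tables can rank the adjustments slightly differently.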
Table 6. Mean squared error (MSE) values calculated for each EEG time series (Figure 5) via the ANN–ALO system with different ALO adjustments.
Application (EEG Time Series) | ALO-A1 | ALO-A2 | ALO-A3 | ALO-A4 | ALO-A5
A1 | 0.0453 | 0.0430 | 0.0335 | 0.0333 | 0.0348
A2 | 0.0372 | 0.0381 | 0.0366 | 0.0359 | 0.0356
A3 | 0.0398 | 0.0408 | 0.0393 | 0.0376 | 0.0392
A4 | 0.0355 | 0.0333 | 0.0330 | 0.0328 | 0.0333
A5 | 0.0605 | 0.0623 | 0.0616 | 0.0579 | 0.0596
A6 | 0.0363 | 0.0386 | 0.0348 | 0.0360 | 0.0362
A7 | 0.0796 | 0.0904 | 0.0802 | 0.0793 | 0.0796
A8 | 0.1161 | 0.1052 | 0.0946 | 0.0914 | 0.0931
A9 | 0.0673 | 0.0667 | 0.0557 | 0.0544 | 0.0441
A10 | 0.1013 | 0.0847 | 0.0952 | 0.0835 | 0.0849
Best MSE values are in bold style.
Table 7. Mean MAE values of predictions done for the EEG time series via different ANN-based systems.
Application (EEG Time Series) | ANN–ALO A4 | ANN–PSO | ANN–CS | ANN–FA | ANN–BA | ANN–BPA
A1 | 11.1917 | 14.7332 | 12.7511 | 14.1713 | 13.1559 | 14.6133
A2 | 11.9843 | 13.1911 | 11.8013 | 12.1361 | 12.4997 | 13.8799
A3 | 14.1883 | 15.6098 | 14.9438 | 15.4333 | 15.1281 | 16.1351
A4 | 11.2273 | 12.8560 | 11.7917 | 12.1099 | 11.0997 | 13.3209
A5 | 14.4710 | 18.1555 | 15.3191 | 16.2097 | 16.1199 | 20.1941
A6 | 13.1309 | 16.7091 | 12.9455 | 14.5099 | 14.2150 | 16.8771
A7 | 15.1047 | 19.1433 | 16.8490 | 17.9087 | 17.8025 | 19.5016
A8 | 17.0301 | 19.0981 | 17.6133 | 18.2290 | 18.7994 | 21.1430
A9 | 20.1571 | 23.3055 | 19.7751 | 20.0477 | 20.2691 | 23.1190
A10 | 17.7051 | 19.7380 | 18.1447 | 18.5831 | 18.8436 | 20.0544
Values are the mean MAE over 30 runs. Best values are in bold style.
Table 8. Mean MSE and standard deviation of predictions done for the EEG time series via different ANN-based systems.
Application (EEG Time Series) | ANN–ALO A4 | ANN–PSO | ANN–CS | ANN–FA | ANN–BA | ANN–BPA
A1 | 0.0419 ± 0.2729 | 0.0466 ± 0.0927 | 0.0422 ± 0.0513 | 0.0459 ± 0.0752 | 0.0445 ± 0.2624 | 0.0449 ± 0.7108
A2 | 0.0530 ± 0.3033 | 0.0606 ± 0.3583 | 0.0527 ± 0.1106 | 0.0532 ± 0.1774 | 0.0537 ± 0.2021 | 0.0631 ± 0.1066
A3 | 0.0359 ± 0.3558 | 0.0376 ± 0.2912 | 0.0368 ± 0.4955 | 0.0374 ± 0.3660 | 0.0410 ± 0.2896 | 0.0579 ± 0.6750
A4 | 0.0319 ± 0.2098 | 0.0341 ± 0.0838 | 0.0327 ± 0.1152 | 0.0331 ± 0.4604 | 0.0317 ± 0.0742 | 0.0519 ± 0.2013
A5 | 0.0562 ± 0.4964 | 0.0706 ± 0.2709 | 0.0573 ± 0.2957 | 0.0583 ± 0.4634 | 0.0582 ± 0.4296 | 0.0762 ± 0.1610
A6 | 0.0345 ± 0.3396 | 0.0389 ± 0.2976 | 0.0343 ± 0.4569 | 0.0363 ± 0.3178 | 0.0359 ± 0.0357 | 0.0411 ± 0.0611
A7 | 0.0671 ± 0.2970 | 0.0917 ± 0.0656 | 0.0791 ± 0.0072 | 0.0703 ± 0.4341 | 0.0702 ± 0.1020 | 0.1013 ± 0.7071
A8 | 0.0993 ± 0.1007 | 0.1116 ± 0.4819 | 0.1011 ± 0.3104 | 0.1207 ± 0.1835 | 0.1213 ± 0.3578 | 0.1103 ± 0.1403
A9 | 0.0428 ± 0.4862 | 0.0461 ± 0.4234 | 0.0424 ± 0.3916 | 0.0426 ± 0.2287 | 0.0429 ± 0.3806 | 0.0511 ± 0.2661
A10 | 0.0801 ± 0.2930 | 0.1124 ± 0.1480 | 0.0906 ± 0.3587 | 0.0911 ± 0.2829 | 0.0913 ± 0.4550 | 0.1317 ± 0.0720
Each cell shows the mean MSE ± standard deviation over 30 runs. Best values are in bold style.
Table 9. Mean MAE values of predictions done for the EEG time series using ANN–ALO A4 and some other prediction approaches (set 1).
Application (EEG Time Series) | ANN–ALO A4 | DyBM [107] | SVM [108] | HMM [109]
A1 | 11.1917 | 15.4511 | 12.4190 | 13.3118
A2 | 11.9843 | 17.4799 | 15.3087 | 14.3792
A3 | 14.1883 | 20.6103 | 17.6981 | 21.1768
A4 | 11.2273 | 18.4193 | 14.1288 | 17.4963
A5 | 14.4710 | 17.3779 | 15.5199 | 19.4320
A6 | 13.1309 | 16.3691 | 14.0433 | 18.1982
A7 | 15.1047 | 19.1190 | 18.7322 | 22.0577
A8 | 17.0301 | 23.6903 | 20.1136 | 21.8593
A9 | 20.1571 | 24.1185 | 21.5336 | 23.1088
A10 | 17.7051 | 22.7091 | 18.2180 | 24.6911
Values represent the mean MAE from 30 runs. Best values are in bold style.
Table 10. Mean MAE values of predictions done for the EEG time series using ANN–ALO A4 and some other prediction approaches (set 2).
Application (EEG Time Series) | BG [110] | ARIMA [111] | ARM [112] | K-NN [113]
A1 | 15.9853 | 15.0230 | 16.7111 | 12.9003
A2 | 17.2901 | 16.8102 | 18.1947 | 16.1661
A3 | 20.4193 | 20.3441 | 21.1539 | 16.9335
A4 | 16.6638 | 15.7901 | 16.1221 | 16.4662
A5 | 18.9661 | 20.6771 | 22.1417 | 15.9107
A6 | 21.0553 | 19.8360 | 23.1110 | 16.8157
A7 | 23.5111 | 20.7811 | 20.0419 | 18.9406
A8 | 21.6953 | 19.1148 | 22.8731 | 19.9704
A9 | 23.4194 | 25.1190 | 23.3994 | 21.7188
A10 | 24.3317 | 20.0161 | 24.1310 | 20.4101
Values represent the mean MAE from 30 runs. Best values are in bold style.
Table 11. Mean accuracy of different approaches for predicting the test data of each EEG time series (set 1).
Application (EEG Time Series) | ANN–ALO A4 | ANN–PSO | ANN–CS | ANN–FA | ANN–BA | ANN–BPA | DyBM
A1 | 88.74% | 75.08% | 83.11% | 76.59% | 80.10% | 76.17% | 72.81%
A2 | 90.79% | 86.27% | 92.46% | 88.41% | 88.13% | 85.18% | 76.18%
A3 | 94.06% | 88.31% | 93.17% | 89.58% | 90.24% | 86.15% | 80.15%
A4 | 92.17% | 89.35% | 91.58% | 89.58% | 93.44% | 88.70% | 79.17%
A5 | 91.25% | 84.37% | 90.19% | 85.04% | 86.51% | 77.69% | 84.75%
A6 | 89.04% | 84.50% | 89.71% | 86.74% | 87.09% | 82.51% | 86.49%
A7 | 92.56% | 86.65% | 90.19% | 89.10% | 89.39% | 85.10% | 87.80%
A8 | 87.60% | 82.77% | 85.01% | 84.38% | 83.49% | 80.07% | 77.58%
A9 | 89.05% | 77.52% | 90.75% | 89.16% | 88.18% | 80.60% | 72.45%
A10 | 89.28% | 80.79% | 87.90% | 82.19% | 81.66% | 80.13% | 77.06%
Best values are in bold style.
Table 12. Mean accuracy of different approaches for predicting the test data of each EEG time series (set 2).
Application (EEG Time Series) | SVM | HMM | BG | ARIMA | ARM | K-NN
A1 | 84.26% | 77.11% | 71.44% | 73.64% | 68.52% | 81.45%
A2 | 83.55% | 84.97% | 78.59% | 79.61% | 75.77% | 80.18%
A3 | 85.61% | 78.10% | 80.42% | 83.68% | 79.28% | 85.95%
A4 | 86.14% | 80.68% | 83.42% | 85.79% | 84.17% | 83.80%
A5 | 89.78% | 81.54% | 83.90% | 75.01% | 72.55% | 87.52%
A6 | 88.61% | 80.95% | 79.08% | 80.15% | 78.69% | 82.99%
A7 | 88.41% | 79.90% | 79.45% | 80.07% | 83.14% | 88.10%
A8 | 80.55% | 78.60% | 79.17% | 82.46% | 77.97% | 80.65%
A9 | 85.60% | 82.46% | 75.33% | 71.05% | 75.70% | 83.50%
A10 | 84.48% | 75.05% | 76.41% | 80.40% | 76.57% | 78.90%
Best values are in bold style.
Table 13. Giacomini–White Test results for determining the best performance(s).
Application (EEG Time Series) | The Best Performance(s)
A1 | ANN–ALO A4
A2 | ANN–ALO A4 / ANN–CS
A3 | ANN–ALO A4 / ANN–CS / ANN–BA
A4 | ANN–BA / ANN–ALO A4 / ANN–CS
A5 | ANN–ALO A4
A6 | ANN–ALO A4 / ANN–CS
A7 | ANN–ALO A4
A8 | ANN–ALO A4
A9 | ANN–CS
A10 | ANN–ALO A4 / ANN–CS / ANN–FA
Significance level of 5%.
Table 14. The ranks and obtained points from each approach for the predictions of the EEG time series (set 1).
Application (EEG Time Series) | ANN–ALO A4 | ANN–PSO | ANN–CS | ANN–FA | ANN–BA | ANN–BPA | DyBM
A1 | 1 (+13) | 9 (+5) | 3 (+11) | 7 (+7) | 5 (+9) | 8 (+6) | 11 (+3)
A2 | 2 (+12) | 5 (+9) | 1 (+13) | 3 (+11) | 4 (+10) | 6 (+8) | 12 (+2)
A3 | 1 (+13) | 5 (+9) | 2 (+12) | 4 (+10) | 3 (+11) | 6 (+8) | 11 (+3)
A4 | 2 (+12) | 5 (+9) | 3 (+11) | 4 (+10) | 1 (+13) | 6 (+8) | 13 (+1)
A5 | 1 (+13) | 8 (+6) | 2 (+12) | 6 (+8) | 5 (+9) | 11 (+3) | 7 (+7)
A6 | 2 (+12) | 7 (+7) | 1 (+13) | 5 (+9) | 4 (+10) | 9 (+5) | 6 (+8)
A7 | 1 (+13) | 8 (+6) | 2 (+12) | 4 (+10) | 3 (+11) | 9 (+5) | 7 (+7)
A8 | 1 (+13) | 5 (+9) | 2 (+12) | 3 (+11) | 4 (+10) | 9 (+5) | 13 (+1)
A9 | 3 (+11) | 9 (+5) | 1 (+13) | 2 (+12) | 4 (+10) | 8 (+6) | 12 (+2)
A10 | 1 (+13) | 6 (+8) | 2 (+12) | 4 (+10) | 5 (+9) | 8 (+6) | 10 (+4)
Total Points | 125 | 73 | 121 | 98 | 102 | 60 | 38
Table 15. The ranks and obtained points from each approach for the prediction of the EEG time series (set 2).
Application (EEG Time Series) | SVM | HMM | BG | ARIMA | ARM | K-NN
A1 | 2 (+12) | 6 (+8) | 12 (+2) | 10 (+4) | 13 (+1) | 4 (+10)
A2 | 8 (+6) | 7 (+7) | 11 (+3) | 10 (+4) | 13 (+1) | 9 (+5)
A3 | 8 (+6) | 13 (+1) | 10 (+4) | 9 (+5) | 12 (+2) | 7 (+7)
A4 | 7 (+7) | 12 (+2) | 11 (+3) | 8 (+6) | 9 (+5) | 10 (+4)
A5 | 3 (+11) | 10 (+4) | 9 (+5) | 12 (+2) | 13 (+1) | 4 (+10)
A6 | 3 (+11) | 10 (+4) | 12 (+2) | 11 (+3) | 13 (+1) | 8 (+6)
A7 | 5 (+9) | 12 (+2) | 13 (+1) | 11 (+3) | 10 (+4) | 6 (+8)
A8 | 8 (+6) | 11 (+3) | 10 (+4) | 6 (+8) | 12 (+2) | 7 (+7)
A9 | 5 (+9) | 7 (+7) | 11 (+3) | 13 (+1) | 10 (+4) | 6 (+8)
A10 | 3 (+11) | 13 (+1) | 12 (+2) | 7 (+7) | 11 (+3) | 9 (+5)
Total Points | 88 | 39 | 29 | 43 | 24 | 70
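The rank-to-points mapping behind Tables 14 and 15 follows directly from the cell entries: with 13 compared approaches, rank 1 earns 13 points and rank 13 earns 1. A one-line sketch (the helper name is ours):

```python
def points_from_rank(rank, n_approaches=13):
    """Points scheme of Tables 14-15: rank 1 -> n points, rank n -> 1 point."""
    return n_approaches + 1 - rank

# e.g., the entry "9 (+5)" in Table 14: rank 9 among 13 approaches earns 5 points.
print(points_from_rank(9))  # 5
```

Summing these points over the ten applications gives the "Total Points" rows, e.g., 125 for ANN–ALO A4 and 121 for ANN–CS in Table 14.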
Table 16. Findings from the survey work done with the physicians using ANN–ALO for EEG prediction.
Hospital | Physician | Usability | Accuracy | Speed | Effectiveness | Novelty
Isparta State Hospital | P1 | 5 | 5 | 4 | 5 | 5
Isparta State Hospital | P2 | 4 | 5 | 3 | 4 | 4
Hospital of Suleyman Demirel University (SDU) | P3 | 5 | 5 | 4 | 5 | 4
Meddem Hospital | P4 | 4 | 4 | 5 | 5 | 4
Meddem Hospital | P5 | 5 | 5 | 3 | 5 | 5
Davraz Life Hospital | P6 | 5 | 4 | 4 | 4 | 4
Mean | — | 4.67 | 4.67 | 3.83 | 4.67 | 4.33
Scale: too bad (1); bad (2); normal (3); good (4); very good (5).

Kose, U. An Ant-Lion Optimizer-Trained Artificial Neural Network System for Chaotic Electroencephalogram (EEG) Prediction. Appl. Sci. 2018, 8, 1613. https://doi.org/10.3390/app8091613
