Abstract
Chaotic systems implemented by artificial neural networks are good candidates for data encryption. Accordingly, this paper introduces a cryptographic application of the Hopfield and the Hindmarsh–Rose neurons. The contribution is focused on finding suitable coefficient values of the neurons to generate robust random binary sequences that can be used in image encryption. This task is performed by evaluating bifurcation diagrams, from which one chooses appropriate coefficient values of the mathematical models that produce high positive Lyapunov exponent and Kaplan–Yorke dimension values, computed using TISEAN. The randomness of both the Hopfield and the Hindmarsh–Rose neurons is evaluated from chaotic time series data by performing National Institute of Standards and Technology (NIST) tests. Both neurons are implemented on field-programmable gate arrays, whose architectures are used to develop an encryption system for RGB images. The success of the encryption system is confirmed by performing correlation, histogram, variance, entropy, and Number of Pixel Change Rate (NPCR) tests.
1. Introduction
Image encryption is a well-known mechanism to preserve confidentiality when data are transmitted over an unrestricted public channel. However, public channels are vulnerable to attacks, and hence efficient encryption algorithms must be developed for secure data transfer. In [1], the authors surveyed ten conventional and five chaos-based encryption techniques, used to encrypt three test images of different sizes, against various performance metrics; the important conclusion was that none of the conventional schemes was designed especially for images, and hence none of them depends on the initial image. Thus, image encryption remains an open topic, and several researchers have proposed the use of chaotic systems to mask information so that it can be transmitted over a secure channel. In this direction, this paper highlights the usefulness of the Hopfield and Hindmarsh–Rose neural networks to generate chaotic behavior, and their suitability for designing random number generators (RNGs) implemented on field-programmable gate arrays (FPGAs).
In [2], J.J. Hopfield introduced the neuron model that is nowadays known as the Hopfield neural network. Ten years later, a modified Hopfield neural network model was proposed in [3] and applied to information processing. Soon after, the Hopfield neural network was adapted to generate chaotic behavior in [4], where the authors explored bifurcation diagrams. In [5], a simplified Hopfield neuron model with a sigmoid activation function was designed, and three neurons were used to generate chaotic behavior; in addition, the authors performed an optimization process updating the weights of the neuron interconnections. The Hopfield neuron was combined with a chaotic map in [6] and applied to chaotic masking. More recently, the authors in [7] proposed an image encryption algorithm using the Hopfield neural network. In the same direction, the authors in [8] detailed how the Hindmarsh–Rose neuron generates chaotic behavior. Its bifurcation diagrams were described in [9], and the results were used to select the values of the model that improve the chaotic behavior. Hindmarsh–Rose neurons were synchronized in [10] by optimizing a Lyapunov-function scheme with two gain coefficients, so that the synchronization region is estimated by evaluating Lyapunov stability. Two Hindmarsh–Rose neurons were synchronized in [11], and the system was used to mask information in continuous time. To show that the neurons generate chaotic behavior, one must compute Lyapunov exponents; for the Hindmarsh–Rose neuron they were evaluated with the TISEAN package in [12].
The Hopfield neural network has been widely applied in chaotic systems [13,14,15]. This network consists of three neurons, and the authors in [13] proposed a simplified model by removing the synaptic weight connection between the third and second neurons of the original Hopfield network. Numerical simulations were carried out considering values from the bifurcation diagrams, and Lyapunov exponents were evaluated to conclude that the simplified model exhibits rich nonlinear dynamical behaviors, including symmetry breaking, chaos, periodic windows, antimonotonicity, and coexisting self-excited attractors. An FPGA-based modified Hopfield neural network was introduced in [14] to generate multiple attractors, but no details of the hardware design or its computer arithmetic were provided. The authors in [15] showed the existence of hidden chaotic sets in a simplified Hopfield neural network with three neurons. Similar to the Hopfield neural network, the Hindmarsh–Rose neuron is quite useful. For example, using the Hindmarsh–Rose neuron model, the authors in [16] showed that, in the parameter region close to the bifurcation value where the only attractor of the system is the limit cycle of tonic spiking type, noise can transform the spiking oscillatory regime into a bursting one. The fractional-order version of the Hindmarsh–Rose neuron was used in [17] for the synchronization of fractional-order chaotic systems. In [18], based on a two-dimensional Hindmarsh–Rose neuron and a non-ideal threshold memristor, a five-dimensional model of two adjacent neurons coupled by memristive electromagnetic induction was introduced. In a similar way, the authors in [19] showed the effects of time delay on burst synchronization transitions of a neural network locally modeled by Hindmarsh–Rose neurons. On the one hand, the main drawback of those works was the lack of statistical tests according to the National Institute of Standards and Technology (NIST), as done for other chaotic systems in [20,21,22], to guarantee the randomness of the chaotic sequences. On the other hand, and in addition to NIST tests, the authors in [23] recommend enlarging the key space when using chaotic maps, thus enhancing image encryption schemes. In this work we show the application of neural networks in the design of random number generators (RNGs), whose binary sequences are applied to implement an image encryption scheme [24]. This idea has been exploited before; for example, the Hopfield neural network was used in [25] to design an RNG, but the resulting sequences showed low randomness. In this manner, this paper introduces the selection, from bifurcation diagrams, of the best coefficients of both the Hopfield and Hindmarsh–Rose neurons to generate robust chaotic sequences that pass the NIST tests and enhance chaotic image encryption.
Section 2 describes both the Hopfield and the Hindmarsh–Rose neuron models, showing their chaotic behavior. Section 3 shows simulation results of the cases that generate better chaotic time series, applying the 4th-order Runge–Kutta method; bifurcation diagrams are generated to select appropriate values that improve the generation of chaotic time series, which are evaluated with TISEAN to verify the Lyapunov exponents. Section 4 details the FPGA-based implementation of both the Hopfield and Hindmarsh–Rose neuron models. Section 5 shows the selection of the series with the highest values of the positive Lyapunov exponent, which are used to generate binary sequences whose randomness is evaluated by performing NIST tests. Section 6 shows the application of the generated binary sequences to encrypt an image in a chaotic secure communication system, and the success of the RGB image encryption system is confirmed by performing correlation, histogram, variance, entropy, and Number of Pixel Change Rate (NPCR) tests. Finally, Section 7 summarizes the main results of this work.
2. Mathematical Models of Hopfield and Hindmarsh–Rose Neurons
This section describes the mathematical models of both the Hopfield and the Hindmarsh–Rose neural networks. For instance, the complex dynamics of the Hopfield-type neural network with three neurons are analyzed in [26], along with the observation of stable points, limit cycles, single-scroll chaotic attractors, and double-scroll chaotic attractors. The numerical simulations performed in [27] show that simple Hopfield neural networks can display chaotic attractors and periodic orbits for different parameter values, which are related to different Lyapunov exponent values and bifurcation plots. The Hindmarsh–Rose neural network is analyzed in [28], where, using the polynomial model previously introduced in [29], the authors perform a detailed bifurcation analysis of the full fast-slow system for bursting patterns.
2.1. Hopfield Neuron
The Hopfield neural network can be modeled by Equation (1), of the form dv/dt = −cv + W f(v), where v represents the state variables, c is a proportional constant, W is the weights matrix, and f(·) is the activation function [5].
Commonly, c is set equal to one, and when chaotic behavior is desired, the activation function is the hyperbolic tangent and the weights matrix W, whose size depends on the number of neurons, is modified. In this work, W has size 3 × 3, meaning that the Hopfield neural network has three state variables, one associated with each neuron. That way, Equations (2)–(4) describe the model of the three neurons, as shown in [5]. In this case, v in (1) is replaced by a three-element vector containing the state variables x, y, and z, and the control parameter p is set to a value found by exploring those that maximize the positive Lyapunov exponent (LE+). Figure 1 and Figure 2 show the chaotic time series and the attractors obtained by applying the 4th-order Runge–Kutta method.
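To make the procedure concrete, the following Python sketch integrates a three-neuron Hopfield network of the form dv/dt = −v + W tanh(v) with the 4th-order Runge–Kutta method. The weight matrix W shown here is only an illustrative placeholder taken from the general literature, not one of the sets of this paper; the values of Table 2 (or of [5]) should be substituted to reproduce the reported series.

```python
# Minimal RK4 sketch of a three-neuron Hopfield network, dv/dt = -v + W*tanh(v).
# The weight matrix W is an illustrative placeholder, NOT one of the paper's sets.
import numpy as np

W = np.array([[ 2.0, -1.2,  0.0 ],
              [ 2.0,  1.71, 1.15],
              [-4.75, 0.0,  1.1 ]])

def hopfield_rhs(v):
    # dv/dt = -v + W tanh(v) for the state vector v = (x, y, z)
    return -v + W @ np.tanh(v)

def rk4_step(v, h):
    k1 = hopfield_rhs(v)
    k2 = hopfield_rhs(v + 0.5 * h * k1)
    k3 = hopfield_rhs(v + 0.5 * h * k2)
    k4 = hopfield_rhs(v + h * k3)
    return v + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

h, steps = 0.01, 100_000
v = np.array([1.951738939809982, -1.207112821944644, -0.284321234701517])
series = np.empty((steps, 3))
for n in range(steps):
    v = rk4_step(v, h)
    series[n] = v            # columns hold the time series of x, y, and z
```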
Figure 1.
Simulation results of the chaotic time series of the state variables: (a) x, (b) y, and (c) z, of the Hopfield neural network given in [5]. The initial conditions are: x(0) = 1.951738939809982, y(0) = −1.207112821944644, and z(0) = −0.284321234701517.
Figure 2.
Attractors generated by plotting the state variables shown in Figure 1.
The equilibrium points are obtained by applying the Newton–Raphson method, v_(k+1) = v_k − J^(−1)(v_k) f(v_k), where J is the Jacobian, because the neural network contains nonlinear terms of the form tanh(·). Three equilibrium points are found. The eigenvalues are obtained for each equilibrium point by evaluating det(λI − J) = 0: those associated with the first equilibrium point are λ1 = 1.9416 and λ2,3 = −0.0658 ± j1.8793, while those associated with the other two equilibrium points are λ1 = −0.9870 and λ2,3 = 0.5381 ± j1.2861.
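The equilibrium and eigenvalue analysis can be reproduced numerically as in the sketch below, which uses a Newton-type solver (scipy's fsolve) with the analytical Jacobian of f(v) = −v + W tanh(v). The weight matrix is again a placeholder, so the printed values will differ from the eigenvalues quoted above.

```python
# Sketch of the equilibrium-point and eigenvalue analysis with a Newton-type
# solver and the Jacobian J(v) = -I + W*diag(1 - tanh(v)^2); eigenvalues are
# the roots of det(lambda*I - J) = 0.  W is a placeholder weight matrix.
import numpy as np
from scipy.optimize import fsolve

W = np.array([[ 2.0, -1.2,  0.0 ],
              [ 2.0,  1.71, 1.15],
              [-4.75, 0.0,  1.1 ]])

f = lambda v: -v + W @ np.tanh(v)
jac = lambda v: -np.eye(3) + W * (1.0 - np.tanh(v) ** 2)   # d f_i / d v_j

for guess in (np.zeros(3), np.ones(3), -np.ones(3)):        # several starting points
    eq = fsolve(f, guess, fprime=jac)
    eigenvalues = np.linalg.eigvals(jac(eq))
    print("equilibrium:", np.round(eq, 4), "eigenvalues:", np.round(eigenvalues, 4))
```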
2.2. Hindmarsh–Rose Neuron
The Hindmarsh–Rose neural network can be modeled by three state variables, as given by Equation (5). This model is used to analyze the charge and discharge of a neuron, and in addition, when it provides chaotic behavior, its applications can be extended to cryptography, as shown in this work.
In Equation (5), x is associated with the membrane voltage, y is the recovery variable associated with the fast current, and z is the slow adaptation current. The remaining coefficients are parameters of the neuron, two of which set the time scale; their values are taken from [11]. Figure 3 shows the time series of the state variable x of the Hindmarsh–Rose neuron, and Figure 4 shows the phase-space portraits.
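A minimal simulation sketch of Equation (5) is given below, using the 4th-order Runge–Kutta method. The coefficients are the textbook Hindmarsh–Rose values and the initial conditions are illustrative; they are not necessarily those of [11] nor the improved sets proposed later in Table 4.

```python
# Minimal RK4 sketch of the Hindmarsh-Rose neuron, Equation (5), with textbook
# coefficients (a=1, b=3, c=1, d=5, s=4, xR=-1.6, r=0.006, I=3.25) and
# illustrative initial conditions.
import numpy as np

a, b, c, d, s, xR, r, I = 1.0, 3.0, 1.0, 5.0, 4.0, -1.6, 0.006, 3.25

def hr_rhs(v):
    x, y, z = v
    dx = y - a * x**3 + b * x**2 - z + I    # membrane voltage
    dy = c - d * x**2 - y                   # fast recovery current
    dz = r * (s * (x - xR) - z)             # slow adaptation current
    return np.array([dx, dy, dz])

def rk4_step(v, h):
    k1 = hr_rhs(v)
    k2 = hr_rhs(v + 0.5 * h * k1)
    k3 = hr_rhs(v + 0.5 * h * k2)
    k4 = hr_rhs(v + h * k3)
    return v + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

h, steps = 0.01, 200_000
v = np.array([0.1, 0.2, 0.3])               # illustrative initial conditions
xs = np.empty(steps)
for n in range(steps):
    v = rk4_step(v, h)
    xs[n] = v[0]                            # chaotic bursting series of x
```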
Figure 3.
Chaotic time series of the state variable x of the Hindmarsh–Rose neuron for the chosen initial conditions.
Figure 4.
Phase-space portraits of the Hindmarsh–Rose neuron given in Equation (5) for the chosen initial conditions.
The model in Equation (5) has three equilibrium points. The eigenvalues associated with the first equilibrium point are λ1 = 0.261784, λ2 = 0.0204526, and λ3 = −0.495936; those associated with the remaining two equilibrium points are 2.2075 ± j1.5659, −0.002488 ± j0.0001127, and −0.67089 ± j0.4012.
3. Bifurcation Diagrams and Selection of the Best Values to Generate Enhanced Chaotic Time Series
Bifurcation diagrams are quite useful to find appropriate values for the mathematical models of the neurons, and in this paper they are generated to find the best Lyapunov exponents and Kaplan–Yorke dimension, which are considered appropriate metrics to enhance the generation of chaotic time series. In the case of the Hopfield neuron, the state variable x is selected to plot the bifurcation with respect to the control parameter p. This process is performed by varying all the coefficients of the mathematical model, and for all the state variables, over several iterations until dynamical characteristics of the chaotic system, such as the positive Lyapunov exponent (LE+) and the Kaplan–Yorke dimension [30], are improved. By varying the weights matrix W, one can find better characteristics; for example, in this paper the nine elements of W were varied in the ranges and steps listed in Table 1. All these cases generated different bifurcation diagrams from which the feasible values were selected. For example, Figure 5 shows the bifurcation diagram obtained by varying one of the weights, where it can be appreciated that the feasible values to generate chaotic behavior must be lower than −4.5. In this manner, after exploring the bifurcation diagrams by varying the values in the ranges given in Table 1, three feasible sets of values are given in Table 2, where their variations with respect to the original values given in [5] can be appreciated. The chaotic time series associated with those sets of values, obtained by applying the 4th-order Runge–Kutta method, are shown in Figure 6 for the state variable x. Those chaotic time series are used to evaluate the Lyapunov exponents and the Kaplan–Yorke dimension using TISEAN, and the procedure is sketched below.
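The following sketch illustrates the bifurcation-diagram procedure used throughout this section: one coefficient is swept over a range, the transient is discarded, and the local maxima of x are plotted against the swept value. For brevity it sweeps the external current of the textbook Hindmarsh–Rose model; in the paper the sweep is performed over the nine Hopfield weights of Table 1 and the eight Hindmarsh–Rose coefficients.

```python
# Bifurcation-diagram sketch: sweep one coefficient, drop the transient, and
# record the local maxima of x.  Coarse settings keep the example short.
import numpy as np
import matplotlib.pyplot as plt

a, b, c, d, s, xR, r = 1.0, 3.0, 1.0, 5.0, 4.0, -1.6, 0.006

def rk4_step(v, I, h=0.01):
    def rhs(v):
        x, y, z = v
        return np.array([y - a*x**3 + b*x**2 - z + I,
                         c - d*x**2 - y,
                         r*(s*(x - xR) - z)])
    k1 = rhs(v); k2 = rhs(v + 0.5*h*k1); k3 = rhs(v + 0.5*h*k2); k4 = rhs(v + h*k3)
    return v + (h/6.0)*(k1 + 2*k2 + 2*k3 + k4)

p_axis, x_axis = [], []
for I in np.linspace(2.5, 3.5, 120):          # swept coefficient
    v = np.array([0.1, 0.2, 0.3])
    xs = []
    for n in range(40_000):
        v = rk4_step(v, I)
        if n > 15_000:                        # discard the transient
            xs.append(v[0])
    xs = np.array(xs)
    peaks = xs[1:-1][(xs[1:-1] > xs[:-2]) & (xs[1:-1] > xs[2:])]   # local maxima
    p_axis.extend([I] * len(peaks)); x_axis.extend(peaks)

plt.plot(p_axis, x_axis, ',k'); plt.xlabel('swept coefficient'); plt.ylabel('peaks of x'); plt.show()
```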
Table 1.
Variation conditions of the elements in W.
Figure 5.
Bifurcation diagram obtained by varying one element of W in the range given in Table 1.
Table 2.
Proposed sets of values to generate enhanced chaotic behavior using Hopfield neuron.
Figure 6.
Chaotic time series generated by using the sets of values listed in Table 2: (a) original, (b) HNNset1, (c) HNNset2, and (d) HNNset3, plotting the state variable x.
Table 3 lists all the Lyapunov exponent values and the associated Kaplan–Yorke dimension for each case from Table 2.
Table 3.
Lyapunov exponents and Kaplan–Yorke dimension associated with each set of values from Table 2.
In the case of the Hindmarsh–Rose neural network model given in (5), eight coefficients can be varied. A heuristic process was performed with the goal of improving the chaotic behavior: each of the eight coefficients, including a, b, k, and s, was varied in steps of 0.001 while observing the degradation of the chaotic behavior. After performing the variations and observing the bifurcation diagrams, two sets of values were found, which are listed in Table 4. Figure 7 shows the chaotic time series associated with these sets of values.
Table 4.
Proposed sets of values to generate enhanced chaotic behavior using the Hindmarsh–Rose neuron.
Figure 7.
Chaotic time series generated by using the sets of values listed in Table 4: (a) original, (b) HRNset1, and (c) HRNset2, plotting the state variable x.
The chaotic time series were simulated using the chosen initial conditions and then introduced to TISEAN to evaluate the Lyapunov exponents and the Kaplan–Yorke dimension given in Table 5. Since the maximum Lyapunov exponent is positive, chaotic behavior is confirmed. In this case, the set of values HRNset1 is the best because its Kaplan–Yorke dimension is 3, i.e., the highest possible value for a three-dimensional dynamical system.
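Given the Lyapunov spectrum reported by TISEAN, the Kaplan–Yorke dimension follows from the standard formula D_KY = j + (λ1 + … + λj)/|λ(j+1)|, where j is the largest index for which the partial sum of exponents is still non-negative. A small sketch with a hypothetical spectrum:

```python
# Kaplan-Yorke dimension from a Lyapunov spectrum (e.g., as reported by TISEAN).
import numpy as np

def kaplan_yorke(spectrum):
    lams = np.sort(np.asarray(spectrum, dtype=float))[::-1]   # descending order
    sums = np.cumsum(lams)
    if not np.any(sums >= 0):
        return 0.0
    j = int(np.max(np.where(sums >= 0)[0])) + 1
    if j == len(lams):                        # all partial sums non-negative
        return float(len(lams))
    return j + sums[j - 1] / abs(lams[j])

print(kaplan_yorke([0.12, 0.0, -0.25]))       # hypothetical spectrum -> ~2.48
```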
Table 5.
Lyapunov exponents and Kaplan–Yorke dimension associated with each set of values from Table 4.
4. FPGA-Based Implementation of the Neurons
The hardware implementation of both neurons can be performed from the equations discretized with a specific numerical method. For example, in [31] one can find the discretization of a dynamical model by applying the Forward Euler and 4th-order Runge–Kutta methods, where it can be appreciated that there is a trade-off between exactness and hardware resources. Since the Hopfield neural network is a small dynamical system, the 4th-order Runge–Kutta method is used herein to develop the FPGA-based implementation, as shown in Figure 8, which is based on Equations (2)–(4). The details of the numerical method are sketched in Figure 9, where one can appreciate the block for the hyperbolic tangent function given in Equation (3), which is implemented as already shown in [31]. The general architecture shown in Figure 8 consists of a finite state machine (FSM) that controls the iterations of the numerical method, whose data are saved in the registers (x, y, z). The block labeled Hopfield Chaotic Neuronal Network contains the hardware that evaluates the 4th-order Runge–Kutta method, receiving the data at iteration n, (x_n, y_n, z_n), and providing the data at the next iteration, (x_{n+1}, y_{n+1}, z_{n+1}). The Mux blocks introduce the initial conditions (x_0, y_0, z_0) and select the values (x_{n+1}, y_{n+1}, z_{n+1}) for all the remaining iterations. The Reg blocks are parallel-input, parallel-output registers that save the data being processed within the dynamical system. The output of the whole architecture provides a binary string associated with a specific state variable (x, y, z).
Figure 8.
High-level description of the FPGA-based implementation of the Hopfield neural network.
Figure 9.
Details of the implementation of the 4th-order Runge–Kutta method solving the Hopfield chaotic neural network, whose block is embedded in Figure 8.
The hyperbolic tangent function given in [31] and used in Figure 9 is described by Equations (6) and (7), with the constant values reported therein. Table 6 shows the hardware resources for the implementation of the four cases given in Table 2 for the Hopfield neuron. The numerical method is the 4th-order Runge–Kutta, and the FPGA Cyclone IV EP4CE115F29C7 was used.
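As an illustration of how a hardware-friendly hyperbolic tangent can be built, the sketch below uses a small look-up table with linear interpolation and saturation. The segmentation and constants are arbitrary choices for this example; the actual Equations (6) and (7) and their constants are those reported in [31].

```python
# Illustrative LUT-based tanh(x) with saturation and linear interpolation,
# a common hardware-friendly technique (not necessarily the scheme of [31]).
import numpy as np

LIMIT = 4.0                                   # |x| >= LIMIT  ->  tanh(x) ~ +/-1
N_SEG = 64                                    # uniform segments stored in a ROM/LUT
grid = np.linspace(-LIMIT, LIMIT, N_SEG + 1)
table = np.tanh(grid)

def tanh_lut(x):
    if x >= LIMIT:
        return 1.0
    if x <= -LIMIT:
        return -1.0
    i = int((x + LIMIT) / (2 * LIMIT) * N_SEG)            # segment index
    slope = (table[i + 1] - table[i]) / (grid[i + 1] - grid[i])
    return table[i] + slope * (x - grid[i])               # linear interpolation

print(tanh_lut(0.7), np.tanh(0.7))            # close agreement
```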
Table 6.
FPGA resources for the implementation of the Hopfield neuron from the sets of values given in Table 2.
In a similar way, the FPGA-based implementation of the Hindmarsh–Rose neural network given in Equation (5) is developed by applying a numerical method, in this case the Forward Euler method. The hardware description is shown in Figure 10, where one can appreciate a finite state machine (FSM) that controls the iterations of the numerical method, multiplexers that process the initial conditions and afterwards the remaining iterations, registers that save the data of the state variables, and blocks that evaluate the equations discretized by Forward Euler. The whole iterative process to generate the next value of the state variables requires seven clock (CLK) cycles.
Figure 10.
FPGA-based implementation of Hindmarsh–Rose neural network described in Equation (5).
In both FPGA-based implementations, the computer arithmetic is performed using 32-bit fixed-point notation: 5.27 (5 integer and 27 fractional bits) for the Hopfield and 3.29 (3 integer and 29 fractional bits) for the Hindmarsh–Rose neural network. The FPGA resources for the three sets of values of the Hindmarsh–Rose neural network are listed in Table 7.
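The following sketch shows how the fixed-point arithmetic and one Forward Euler iteration can be emulated in software: values are scaled by 2^29 for the 3.29 format (2^27 would be used for the 5.27 Hopfield format), and every multiplication is followed by an arithmetic right shift. The parameter values are textbook ones used only for illustration, and the FPGA datapath may use wider intermediate words to avoid overflow.

```python
# Software emulation of 3.29 fixed-point arithmetic and one Forward Euler step of
# Equation (5).  Parameters are textbook values; intermediates on the FPGA may be
# wider to avoid overflow.
FRAC = 29                                        # fractional bits of the 3.29 format

def to_fix(v):   return int(round(v * (1 << FRAC)))
def from_fix(q): return q / (1 << FRAC)
def mul(a, b):   return (a * b) >> FRAC          # fixed-point multiplication

a, b, c, d, s, xR, r, I, h = map(to_fix, (1.0, 3.0, 1.0, 5.0, 4.0, -1.6, 0.006, 3.25, 0.01))

def euler_step(x, y, z):
    dx = y - mul(a, mul(x, mul(x, x))) + mul(b, mul(x, x)) - z + I
    dy = c - mul(d, mul(x, x)) - y
    dz = mul(r, mul(s, x - xR) - z)
    return x + mul(h, dx), y + mul(h, dy), z + mul(h, dz)

x, y, z = map(to_fix, (0.1, 0.2, 0.3))           # illustrative initial conditions
for _ in range(1000):
    x, y, z = euler_step(x, y, z)
print(from_fix(x), from_fix(y), from_fix(z))
```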
Table 7.
FPGA resources for the implementation of the Hindmarsh–Rose neuron from the sets of values given in Table 4.
5. Randomness Test: NIST
In this section, the results of the NIST tests [32,33] for both neural networks are shown. For the four Hopfield cases, the state variable x was used to generate 1000 chaotic time series (binary strings) of one million bits each, and the resulting NIST tests are given in Table 8. The results using the original values taken from [5], the three sets of values given in Table 2, and the weight matrix from [14] can be compared. All cases passed the NIST tests with proportions around 99%, and the set of values HNNset2 generated a higher average p-value of 0.7065.
Table 8.
National Institute of Standards and Technology (NIST) tests for the binary sequences generated by the Hopfield neural network for the state variable x, and for the sets of values given in Table 2.
The computer arithmetic for the Hindmarsh–Rose neural network is 3.29, but since the largest variation occurs in the least significant bits (LSBs), only the 16 LSBs of each 32-bit sample were used. The NIST tests were performed using 1000 chaotic time series of 1 million bits each. The results are summarized in Table 9, including the averages of the p-values and proportions.
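A sketch of how the NIST input streams can be assembled from the fixed-point samples is given below: each 32-bit word of the state variable x is masked to its 16 least significant bits and the bits are concatenated. The variable names are illustrative; `samples` stands for the 32-bit words produced by the FPGA or by simulation.

```python
# Build a NIST bit stream from 32-bit fixed-point samples by keeping the 16 LSBs.
import numpy as np

def lsb_bitstream(samples, n_lsb=16):
    bits = []
    for w in samples:
        w &= 0xFFFFFFFF                       # view the word as unsigned 32-bit
        for k in range(n_lsb - 1, -1, -1):    # MSB-first within the kept field
            bits.append((w >> k) & 1)
    return np.array(bits, dtype=np.uint8)

# 1,000,000 bits require 62,500 samples when 16 bits are kept per sample.
```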
Table 9.
NIST tests for the binary sequences generated by the Hindmarsh–Rose neural network for the state variable x, and for the sets of values given in Table 4.
6. Image Encryption Application
The binary sequences tested in the previous section can be taken as pseudorandom number generators (PRNGs) and used to design a chaotic secure communication system to encrypt images, as shown in Figure 11. Those PRNGs can be implemented by either the Hopfield or the Hindmarsh–Rose neural network, because both provide high randomness. Both neurons can also be implemented using memristors, as shown in [34], which constitutes another research direction in hardware security. For instance, using the four binary sequences from the FPGA-based implementation of the Hopfield neural network, we show the encryption of three images (Lena, Fruits, and Baboon) in Figure 12. The three RGB images have a resolution of 512 × 512 pixels.
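A minimal sketch of the masking operation of Figure 11 is shown below, assuming the keystream bits generated by the neuron are packed into bytes and XORed with the RGB pixels; because XOR is an involution, the receiver recovers the image by regenerating the same keystream (same coefficients and initial conditions) and applying the same operation. The array names are illustrative.

```python
# XOR masking of an RGB image with a neuron-generated keystream (sketch).
# `keystream_bits` is assumed to hold at least 512*512*3*8 bits (0/1 values).
import numpy as np

def xor_mask_rgb(image, keystream_bits):
    """image: uint8 array of shape (H, W, 3); keystream_bits: flat 0/1 uint8 array."""
    needed = image.size * 8
    key_bytes = np.packbits(keystream_bits[:needed]).reshape(image.shape)
    return image ^ key_bytes                  # the same call encrypts and decrypts

# cipher    = xor_mask_rgb(plain,  bits)
# recovered = xor_mask_rgb(cipher, bits)      # identical to plain
```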
Figure 11.
General description of a chaotic secure communication scheme for image encryption based on pseudorandom number generators (PRNGs) implemented by the Hopfield and Hindmarsh–Rose neural networks.
Figure 12.
Image encryption using the binary sequences from the FPGA implementation of the Hopfield neuron given in Table 6 with the parameters corresponding to HNNset2. The original image is in the left column, the encrypted image in the center, and the recovered image in the right column, for the (a) Lena, (b) Fruits, and (c) Baboon images.
The correlation analysis is performed using Equations (8)–(10) from [35], and it provides the values given in Table 10. In the first row, the chaotic time series of the state variable x is used to encrypt R (red), y to encrypt G (green), and z to encrypt B (blue). The second row means that R, G, and B are all encrypted using the data from x, and so on.
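The correlation entries can be reproduced with the Pearson coefficient, in the spirit of Equations (8)–(10) of [35]; values close to zero indicate that the chaotic channel and the image channel are essentially uncorrelated. A minimal sketch, with illustrative argument names:

```python
# Pearson correlation between a chaotic keystream channel and an image channel.
import numpy as np

def correlation(seq_a, seq_b):
    a = np.asarray(seq_a, dtype=np.float64).ravel()
    b = np.asarray(seq_b, dtype=np.float64).ravel()
    return float(np.mean((a - a.mean()) * (b - b.mean())) / (a.std() * b.std()))

# e.g., correlation(x_keystream_bytes, cipher_image[:, :, 0]) -> an entry of Table 10
```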
Table 10.
Correlations between the chaotic channel and the RGB Lena, Fruits and Baboon images using the sequences generated by the Hopfield neuron for the state variables x, y, and z.
Figure 13 shows the histograms of Lena before and after encryption using HNNset2 from Table 2. To describe the distribution characteristics of the histograms quantitatively, the variance of the histograms for the three images (Lena, Fruits and Baboon) is calculated according to [36], and the results are shown in Table 11.
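The histogram-variance metric can be sketched as follows; the double-sum definition commonly used for this metric (as in [36]), var = (1/n²) Σ_i Σ_j (z_i − z_j)²/2 over the n = 256 bin counts z_i, reduces to the ordinary population variance of the bins, so lower values mean flatter histograms.

```python
# Histogram-variance metric for one 8-bit channel (sketch).
import numpy as np

def histogram_variance(channel):
    hist, _ = np.histogram(np.asarray(channel).ravel(), bins=256, range=(0, 256))
    return float(np.var(hist))                # lower variance -> flatter histogram
```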
Figure 13.
Histograms of the Lena image encrypted using HNNset2 from Table 2. Original channels on the left and the encrypted R, G, and B channels on the right.
Table 11.
Variance of the histograms for the Lena, Fruits and Baboon images using the sequences generated by HNNset2.
The entropy is evaluated by Equation (11), of the form H = −Σ p(s_i) log2 p(s_i), where p(s_i) represents the probability of the datum s_i. Using 8 bits per pixel (256 levels), Table 12 shows the entropies for the three images (Lena, Fruits and Baboon).
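A sketch of the entropy computation for an 8-bit channel follows; a well-encrypted image approaches the ideal value of 8 bits per pixel.

```python
# Information entropy of Equation (11) for an 8-bit channel (sketch).
import numpy as np

def entropy8(channel):
    hist, _ = np.histogram(np.asarray(channel).ravel(), bins=256, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]                              # empty bins contribute 0 * log 0 = 0
    return float(-np.sum(p * np.log2(p)))
```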
Table 12.
Entropies of the original Lena, Fruits and Baboon images and the encrypted ones using the sets of values HNNset2 from Table 2.
To verify the encryption capability against differential attacks, the NPCR test is evaluated by Equations (12) and (13) [37], where C1 and C2 are two cipher images obtained from two plain images with only a one-bit difference. In this case, using HNNset2 for the Lena, Fruits and Baboon images, the NPCR values are: 99.2672 when using the state variable x, 99.2886 when using y, and 99.3420 when using z. All NPCR results pass the criterion given in [38].
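The NPCR value can be reproduced as in the sketch below, counting the percentage of pixel positions at which the two cipher images C1 and C2 differ; the result is then compared against the critical value given in [38].

```python
# NPCR of Equations (12) and (13) [37]: percentage of differing pixel positions.
import numpy as np

def npcr(c1, c2):
    diff = np.asarray(c1) != np.asarray(c2)
    return 100.0 * float(diff.mean())
```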
The binary sequences from the FPGA implementation of the Hindmarsh–Rose neural network, whose time series are shown in Figure 7, were used as a PRNG to encrypt the Lena, Fruits and Baboon images; the results are shown in Figure 14.
Figure 14.
Image encryption of (a) Lena, (b) Fruits and (c) Baboon, using the binary sequences from the FPGA implementation of the Hindmarsh–Rose neuron with HRNset1 (see Table 7). The original image is in the left column, the encrypted image in the center, and the recovered image in the right column.
The correlation analysis between the images and the sequences generated by the FPGA implementations listed in Table 7 provides the values given in Table 13. In the first row, the chaotic time series of the state variable x is used to encrypt R (red), y to encrypt G (green), and z to encrypt B (blue). The second row means that R, G, and B are all encrypted using the data from x, and so on.
Table 13.
Correlations between the chaotic channel and the RGB Lena, Fruits and Baboon images using the sequences generated by the Hindmarsh–Rose neuron for the state variables x, y, and z.
Figure 15 shows the histograms of the Lena image before and after encryption using HRNset1 from Table 4. The variance of the histograms for the three images (Lena, Fruits and Baboon) using HRNset1 is calculated according to [36], and the results are shown in Table 14.
Figure 15.
Histograms of the Lena image encrypted using the set of values HRNset1. Original channels on the left and the encrypted R, G, and B channels on the right.
Table 14.
Variance of the histograms for the Lena, Fruits and Baboon images using the sequences generated by HRNset1.
The entropy is evaluated by Equation (11) using HRNset1 and 8 bits per pixel; Table 15 shows the entropies for the three images (Lena, Fruits and Baboon).
Table 15.
Entropies of the original Lena, Fruits and Baboon images and the encrypted ones using the sets of values HRNset1 from Table 4.
The key space is equal to 160 bits because each datum is encoded with 32 bits: the initial condition of each state variable counts, and the step size of the numerical method, which can also be changed, is encoded with 32 bits as well. The NPCR test results using HRNset1 for the color images were: 99.60365 when using the state variable x, 99.49608 when using y, and 98.49586 when using z.
In both FPGA-based implementations for the Hopfield and Hindmarsh–Rose neural networks, the image is transmitted from a personal computer running MATLAB to the FPGA through an RS-232 serial port, as described in [31].
7. Conclusions
The use of two well-known neural networks, the Hopfield and the Hindmarsh–Rose ones, for image encryption applications has been described. With the help of bifurcation diagrams, new feasible sets of values were proposed in order to generate binary strings with more randomness than the ones previously published in the literature. We proposed three sets of values for the Hopfield neuron and two sets of values for the Hindmarsh–Rose neuron. The chaotic time series were analyzed with TISEAN to compute the Lyapunov exponents and the Kaplan–Yorke dimension, and the proposed sets of values outperformed the previously published ones in both metrics.
By applying numerical methods, we described the hardware design of both neurons and listed the corresponding FPGA resources for the Hopfield and Hindmarsh–Rose neurons. The binary strings generated by the FPGA-based implementations of both neurons were taken as PRNGs to perform the encryption of the RGB Lena, Fruits and Baboon images. The success of the encryption system has been confirmed by the results obtained from correlation, histogram, variance, entropy, and NPCR tests, demonstrating that both neurons are very useful for chaotic image encryption.
Author Contributions
Conceptualization E.T.-C., R.L., W.D.L.-S., and F.V.F.; methodology E.T.-C., J.D.D.-M., A.M.G.-Z., O.G.-F.; software E.T.-C., J.D.D.-M., A.M.G.-Z., and O.G.-F.; validation E.T.-C., R.L., and I.C.-V.; formal analysis E.T.-C., J.D.D.-M., and A.M.G.-Z.; investigation E.T.-C., J.D.D.-M., and A.M.G.-Z.; resources J.D.D.-M., A.M.G.-Z., and O.G.-F.; writing—original draft preparation E.T.-C., J.D.D.-M., and A.M.G.-Z.; writing—review and editing E.T.-C., J.D.D.-M., A.M.G.-Z., R.L., O.G.-F., I.C.-V., W.D.L.-S., and F.V.F.; supervision E.T.-C. All authors have read and agreed to the published version of the manuscript.
Funding
This research received no external funding.
Conflicts of Interest
The authors declare no conflict of interest.
References
- Kumari, M.; Gupta, S.; Sardana, P. A Survey of Image Encryption Algorithms. 3D Res. 2017, 8, 37.
- Hopfield, J.J. Neural networks and physical systems with emergent collective computational abilities. Proc. Natl. Acad. Sci. USA 1982, 79, 2554–2558.
- Nozawa, H. A neural network model as a globally coupled map and applications based on chaos. Chaos Interdiscip. J. Nonlinear Sci. 1992, 2, 377–386.
- Chen, L.; Aihara, K. Chaotic simulated annealing by a neural network model with transient chaos. Neural Netw. 1995, 8, 915–930.
- Yang, X.S.; Yuan, Q. Chaos and transient chaos in simple Hopfield neural networks. Neurocomputing 2005, 69, 232–241.
- Yu, W.; Cao, J. Cryptography based on delayed chaotic neural networks. Phys. Lett. A 2006, 356, 333–338.
- Wang, X.Y.; Li, Z.M. A color image encryption algorithm based on Hopfield chaotic neural network. Secur. Commun. Netw. 2019, 115, 107–118.
- Osipov, V.V.; Ponizovskaya, E.V. The nature of bursting noises, stochastic resonance and deterministic chaos in excitable neurons. Phys. Lett. A 1998, 238, 369–374.
- Storace, M.; Linaro, D.; de Lange, E. The Hindmarsh–Rose neuron model: Bifurcation analysis and piecewise-linear approximations. Chaos Interdiscip. J. Nonlinear Sci. 2008, 18, 033128.
- Wang, C.; He, Y.; Ma, J.; Huang, L. Parameters estimation, mixed synchronization, and antisynchronization in chaotic systems. Complexity 2014, 20, 64–73.
- Wang, T.; Wang, D.; Wu, K. Chaotic Adaptive Synchronization Control and Application in Chaotic Secure Communication for Industrial Internet of Things. IEEE Access 2018, 6, 8584–8590.
- Hegger, R.; Kantz, H.; Schreiber, T. Practical implementation of nonlinear time series methods: The TISEAN package. Chaos Interdiscip. J. Nonlinear Sci. 1999, 9, 413–435.
- Njitacke, Z.T.; Kengne, J. Nonlinear Dynamics of Three-Neurons-Based Hopfield Neural Networks (HNNs): Remerging Feigenbaum Trees, Coexisting Bifurcations and Multiple Attractors. J. Circ. Syst. Comput. 2019, 28, 1950121.
- Rajagopal, K.; Munoz-Pacheco, J.M.; Pham, V.T.; Hoang, D.V.; Alsaadi, F.E.; Alsaadi, F.E. A Hopfield neural network with multiple attractors and its FPGA design. Eur. Phys. J. Spec. Top. 2018, 227, 811–820.
- Danca, M.F.; Kuznetsov, N. Hidden chaotic sets in a Hopfield neural system. Chaos Solitons Fractals 2017, 103, 144–150.
- Bashkirtseva, I.; Ryashko, L.; Slepukhina, E. Noise-induced spiking-bursting transition in the neuron model with the blue sky catastrophe. Phys. Rev. E 2019, 99, 062408.
- Vafaei, V.; Kheiri, H.; Akbarfam, A.J. Synchronization of fractional-order chaotic systems with disturbances via novel fractional-integer integral sliding mode control and application to neuron models. Math. Methods Appl. Sci. 2019, 42, 2761–2773.
- Bao, H.; Liu, W.; Hu, A. Coexisting multiple firing patterns in two adjacent neurons coupled by memristive electromagnetic induction. Nonlinear Dyn. 2019, 95, 43–56.
- Sun, X.; Xue, T. Effects of Time Delay on Burst Synchronization Transition of Neuronal Networks. Int. J. Bifurc. Chaos 2018, 28, 1850143.
- Murillo-Escobar, M.A.; Meranza-Castillon, M.O.; Lopez-Gutierrez, R.M.; Cruz-Hernandez, C. Suggested Integral Analysis for Chaos-Based Image Cryptosystems. Entropy 2019, 21, 815.
- Nesa, N.; Ghosh, T.; Banerjee, I. Design of a chaos-based encryption scheme for sensor data using a novel logarithmic chaotic map. J. Inf. Secur. Appl. 2019, 47, 320–328.
- Ding, L.; Liu, C.; Zhang, Y.; Ding, Q. A New Lightweight Stream Cipher Based on Chaos. Symmetry 2019, 11, 853.
- Nepomuceno, E.G.; Nardo, L.G.; Arias-Garcia, J.; Butusov, D.N.; Tutueva, A. Image encryption based on the pseudo-orbits from 1D chaotic map. Chaos Interdiscip. J. Nonlinear Sci. 2019, 29, 061101.
- Zhou, Y.; Hua, Z.; Pun, C.; Chen, C.L.P. Cascade Chaotic System With Applications. IEEE Trans. Cybern. 2015, 45, 2001–2012.
- Tirdad, K.; Sadeghian, A. Hopfield neural networks as pseudo random number generators. In Proceedings of the 2010 Annual Meeting of the North American Fuzzy Information Processing Society, Toronto, ON, Canada, 12–14 July 2010; pp. 1–6.
- Li, Q.; Yang, X. Complex dynamics in a simple Hopfield-type neural network. In International Symposium on Neural Networks; Springer: Berlin, Germany, 2005; pp. 357–362.
- Huang, Y.; Yang, X.S. Chaos and bifurcation in a new class of simple Hopfield neural network. In International Symposium on Neural Networks; Springer: Berlin, Germany, 2006; pp. 316–321.
- Tsaneva-Atanasova, K.; Osinga, H.M.; Rieß, T.; Sherman, A. Full system bifurcation analysis of endocrine bursting models. J. Theor. Biol. 2010, 264, 1133–1146.
- Hindmarsh, J.L.; Rose, R. A model of neuronal bursting using three coupled first order differential equations. Proc. R. Soc. Lond. Ser. B Biol. Sci. 1984, 221, 87–102.
- Silva-Juarez, A.; Rodriguez-Gomez, G.; Fraga, L.G.d.l.; Guillen-Fernandez, O.; Tlelo-Cuautle, E. Optimizing the Kaplan–Yorke Dimension of Chaotic Oscillators Applying DE and PSO. Technologies 2019, 7, 38.
- Tlelo-Cuautle, E.; de la Fraga, L.; Rangel-Magdaleno, J. Engineering Applications of FPGAs; Springer: Berlin, Germany, 2016.
- Rukhin, A.; Soto, J.; Nechvatal, J.; Smid, M.; Barker, E. A Statistical Test Suite for Random and Pseudorandom Number Generators for Cryptographic Applications; Technical Report; Booz-Allen and Hamilton Inc.: Mclean, VA, USA, 2001.
- Bassham, L.; Rukhin, A.; Soto, J.; Nechvatal, J.; Smid, M.; Barker, E.; Leigh, S.; Levenson, M.; Vangel, M.; Banks, D.; et al. A Statistical Test Suite for Random and Pseudorandom Number Generators for Cryptographic Applications; Technical Report NIST Special Publication (SP) 800-22 Rev. 1a; National Institute of Standards and Technology: Gaithersburg, MD, USA, 2010.
- Lu, L.; Bao, C.; Ge, M.; Xu, Y.; Yang, L.; Zhan, X.; Jia, Y. Phase noise-induced coherence resonance in three dimension memristive Hindmarsh-Rose neuron model. Eur. Phys. J. Spec. Top. 2019, 228, 2101–2110.
- Zhang, W.; Zhu, Z.; Yu, H. A symmetric image encryption algorithm based on a coupled logistic–bernoulli map and cellular automata diffusion strategy. Entropy 2019, 21, 504.
- Lu, Q.; Zhu, C.; Deng, X. An Efficient Image Encryption Scheme Based on the LSS Chaotic Map and Single S-Box. Available online: https://ieeexplore.ieee.org/abstract/document/8977567 (accessed on 26 February 2020).
- Moafimadani, S.S.; Chen, Y.; Tang, C. A New Algorithm for Medical Color Images Encryption Using Chaotic Systems. Entropy 2019, 21, 577.
- Wu, Y. NPCR and UACI Randomness Tests for Image Encryption. Cyber J. J. Sel. Areas Telecommun. 2011, 31–38. Available online: http://www.cyberjournals.com/Papers/Apr2011/05.pdf (accessed on 26 February 2020).