Chaotic Image Encryption Using Hopfield and Hindmarsh–Rose Neurons Implemented on FPGA

Chaotic systems implemented by artificial neural networks are good candidates for data encryption. In this manner, this paper introduces the cryptographic application of the Hopfield and the Hindmarsh–Rose neurons. The contribution is focused on finding suitable coefficient values of the neurons to generate robust random binary sequences that can be used in image encryption. This task is performed by evaluating the bifurcation diagrams, from which one chooses appropriate coefficient values of the mathematical models that produce high positive Lyapunov exponent and Kaplan–Yorke dimension values, which are computed using TISEAN. The randomness of both the Hopfield and the Hindmarsh–Rose neurons is evaluated from chaotic time series data by performing National Institute of Standards and Technology (NIST) tests. The implementation of both neurons is done using field-programmable gate arrays, whose architectures are used to develop an encryption system for RGB images. The success of the encryption system is confirmed by performing correlation, histogram, variance, entropy, and Number of Pixel Change Rate (NPCR) tests.


Introduction
Image encryption is one of the well-known mechanisms to preserve confidentiality over a reliable unrestricted public channel. However, public channels are vulnerable to attacks, and hence efficient encryption algorithms must be developed for secure data transfer. In [1], the authors surveyed ten conventional and five chaos-based encryption techniques to encrypt three test images of different sizes based on various performance metrics, and the important conclusion was that none of the conventional schemes were designed especially for images and hence none of them has any dependence on the initial image. In this manner, the topic of image encryption remains open, and several researchers are proposing the use of chaotic systems to mask information so that it can be transmitted over a secure channel. In this direction, this paper highlights the usefulness of Hopfield and Hindmarsh-Rose neural networks to generate chaotic behavior, and their suitability to design random number generators (RNGs) that are implemented using field-programmable gate arrays (FPGAs). The rest of this paper is organized as follows: Section 2 describes the mathematical models of the Hopfield and Hindmarsh-Rose neurons. Section 3 presents the bifurcation diagrams used to select appropriate coefficient values and to verify the Lyapunov exponents. Section 4 details the FPGA-based implementation of both the Hopfield and Hindmarsh-Rose neuron models. Section 5 shows the selection of the series with the highest values of the positive Lyapunov exponent, which are used to generate binary sequences whose randomness is evaluated by performing NIST tests. Section 6 shows the application of the generated binary sequences to encrypt an image in a chaotic secure communication system, where the success of the RGB image encryption system is confirmed by performing correlation, histogram, variance, entropy, and Number of Pixel Change Rate (NPCR) tests. Finally, Section 7 summarizes the main results of this work.

Mathematical Models of Hopfield and Hindmarsh-Rose Neurons
This section describes the mathematical models of both the Hopfield and the Hindmarsh-Rose neural networks. For instance, the complex dynamics of the Hopfield-type neural network with three neurons are analyzed in [26], including the observation of stable points, limit cycles, single-scroll chaotic attractors, and double-scroll chaotic attractors. By varying the parameters, the numerical simulations performed in [27] show that simple Hopfield neural networks can display chaotic attractors and periodic orbits for different parameter values, and the authors report the associated Lyapunov exponents and bifurcation plots. The Hindmarsh-Rose neural network is analyzed in [28], where, using the polynomial model previously introduced in [29], the authors perform a detailed bifurcation analysis of the full fast-slow system for bursting patterns.

Hopfield Neuron
The Hopfield neural network can be modeled by Equation (1), where v represents the state variables, c is a proportional constant, W is the weights matrix, and f(v) is associated with the activation function [5].
Commonly, c is made equal to one, and when chaotic behavior is desired, the activation function is a hyperbolic tangent and the weights matrix W is modified, whose size depends on the number of neurons. In this work, W has size 3 × 3, meaning that the Hopfield neural network has three state variables, one per neuron. That way, Equations (2)-(4) describe the model of the three neurons, as shown in [5]. In this case, v in (1) is replaced by a three-element vector containing the state variables x, y, and z, and the control parameter p is set to 0.0997. This value is found by exploring values that maximize the positive Lyapunov exponent (LE+). Figures 1 and 2 show the chaotic time series and the attractors obtained by applying the 4th-order Runge-Kutta method. The equilibrium points are obtained by applying the Newton-Raphson method, which requires the Jacobian because the neural network contains nonlinear terms of the form tanh(·). The equilibrium points are: EP1 = (0, 0, 0), EP2 = (0.4932, 0.3658, −3.2666), and EP3 = (−0.4932, −0.3658, 3.2666). The eigenvalues are obtained for each equilibrium point by evaluating |λI − J| = 0, so that the ones associated with EP1 are λ1 = 1.9416 and λ2,3 = −0.0658 ± j1.8793, while the eigenvalues associated with EP2 and EP3 are λ1 = −0.9870 and λ2,3 = 0.5381 ± j1.2861.
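As a minimal sketch of the simulation described above, the snippet below applies 4th-order Runge-Kutta steps to a generic three-neuron Hopfield model of the form dv/dt = −v + W·tanh(v). The weight matrix W shown here is a hypothetical stand-in for illustration only, since the tuned coefficients of Equations (2)-(4) are not reproduced in this excerpt.

```python
import numpy as np

# Hypothetical 3x3 weight matrix, NOT the paper's tuned values.
W = np.array([[ 2.0,  -1.2,  0.0],
              [ 1.9,   1.71, 1.15],
              [-4.75,  0.0,  1.1]])

def hopfield_rhs(v):
    """Generic 3-neuron Hopfield model with c = 1: dv/dt = -v + W @ tanh(v)."""
    return -v + W @ np.tanh(v)

def rk4_step(f, v, h):
    """One 4th-order Runge-Kutta step with step size h."""
    k1 = f(v)
    k2 = f(v + 0.5 * h * k1)
    k3 = f(v + 0.5 * h * k2)
    k4 = f(v + h * k3)
    return v + (h / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)

def simulate(v0, h=0.01, steps=10000):
    """Integrate the model and return the trajectory as a (steps, 3) array."""
    traj = np.empty((steps, 3))
    v = np.asarray(v0, dtype=float)
    for i in range(steps):
        v = rk4_step(hopfield_rhs, v, h)
        traj[i] = v
    return traj
```

Because tanh is bounded, the trajectory stays bounded regardless of W, which makes this structure convenient for fixed-point hardware with a known dynamic range.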

Bifurcation Diagrams and Selection of the Best Values to Generate Enhanced Chaotic Time Series
Bifurcation diagrams are quite useful to find appropriate coefficient values of the mathematical models of the neurons, and in this paper they are generated to find the best Lyapunov exponents and Kaplan-Yorke dimension, which are considered appropriate metrics to enhance the generation of chaotic time series. In the case of the Hopfield neuron, the state variable x is selected to plot the bifurcation diagram with respect to the control parameter p. This process must be performed by varying all the coefficients of the mathematical model and for all the state variables over several iterations until dynamical characteristics of the chaotic system, like the positive Lyapunov exponent (LE+) and the Kaplan-Yorke dimension [30], are improved. By varying the weights matrix W, one can find better characteristics. For example, in this paper the nine elements in W were varied in the ranges and steps listed in Table 1. All these cases generated different bifurcation diagrams from which the feasible values were selected. For example, Figure 5 shows the bifurcation diagram obtained by varying W(3, 1), where it can be appreciated that the feasible values to generate chaotic behavior must be chosen lower than −4.5. In this manner, after exploring the bifurcation diagrams by varying the values in the ranges given in Table 1, three feasible sets of values are given in Table 2, where their variations with respect to the original values given in [5] can be appreciated. The chaotic time series associated with those sets of values, obtained by applying the 4th-order Runge-Kutta method, are shown in Figure 6 for the state variable x. Those chaotic time series are used to evaluate the Lyapunov exponents and Kaplan-Yorke dimension using TISEAN.
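The parameter sweep described above can be sketched as follows: for each value of the swept coefficient, integrate the system, discard the transient, and record the local maxima of the chosen state variable; plotting the (parameter, maximum) pairs yields the bifurcation diagram. The `simulate` callback and the simple peak detector below are illustrative helpers, not the paper's code.

```python
import numpy as np

def local_maxima(x):
    """Indices where the series has a strict local maximum (simple peak detector)."""
    return np.where((x[1:-1] > x[:-2]) & (x[1:-1] > x[2:]))[0] + 1

def bifurcation_points(simulate, param_values, transient=2000):
    """For each parameter value, integrate the system via the user-supplied
    `simulate(p)` (returning an array whose first column is x), drop the
    transient, and collect the local maxima of x."""
    points = []
    for p in param_values:
        traj = simulate(p)
        x = traj[transient:, 0]
        for i in local_maxima(x):
            points.append((p, x[i]))
    return points
```

A single band of maxima at a given parameter indicates a periodic orbit, several discrete bands a higher-period orbit, and a dense filled band is the usual signature of chaos.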

Table 1. Matrix element, variation range, and step used in the exploration of W.

In the case of the Hindmarsh-Rose neural network model given in Equation (5), one can count eight coefficients that can be varied. In this case, a heuristic process was performed with the goal of improving the chaotic behavior. Each coefficient ϕ, a, b, a1, k, b1, ε, and s was varied in steps of 0.001 while observing the degradation of the chaotic behavior. After performing the variations and observing the bifurcation diagrams, two sets of values were found, which are listed in Table 4. In this manner, Figure 7 shows the chaotic time series associated with these sets of values.
The simulation of the chaotic time series was performed using the initial conditions x0 = 0.1169282607, y0 = 0.03563851071, and z0 = 0.01034665217, and those series were introduced to TISEAN to evaluate the Lyapunov exponents and Kaplan-Yorke dimension given in Table 5. Since the maximum exponent is positive, chaotic behavior is guaranteed. In this case, the set of values HRNset1 is the best because its Kaplan-Yorke dimension is 3, i.e., the ideal value for a three-dimensional dynamical system. Figure 7 plots the state variable x for the sets of values in Table 4: (a) original, (b) HRNset1, and (c) HRNset2.
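Given the Lyapunov spectrum returned by TISEAN, the Kaplan-Yorke dimension follows from the standard definition D_KY = j + (λ1 + ... + λj)/|λ_{j+1}|, where j is the largest index for which the partial sum of the sorted exponents is still non-negative. A minimal sketch:

```python
def kaplan_yorke_dimension(lyap):
    """Kaplan-Yorke dimension from a Lyapunov spectrum.

    D_KY = j + (lam_1 + ... + lam_j) / |lam_{j+1}|, with the exponents
    sorted in decreasing order and j the largest index whose partial sum
    is non-negative. Returns the full phase-space dimension when even the
    total sum is non-negative (the ideal case D_KY = 3 for HRNset1)."""
    lam = sorted(lyap, reverse=True)
    s = 0.0
    for j, l in enumerate(lam):
        if s + l < 0:
            return j + s / abs(l)
        s += l
    return float(len(lam))
```

For a three-dimensional flow with exponents (+, 0, −), the sum of the spectrum being zero or positive is exactly the condition that yields the ideal value D_KY = 3 reported for HRNset1.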

FPGA-Based Implementation of the Neurons
The hardware implementation of both neurons can be performed from the discretized equations using a specific numerical method. For example, in [31] one can find the discretization of a dynamical model by applying the Forward Euler and 4th-order Runge-Kutta methods, where it can be appreciated that there is a trade-off between exactness and hardware resources. Since the Hopfield neural network is a small dynamical system, the 4th-order Runge-Kutta method is used herein to develop the FPGA-based implementation, as shown in Figure 8, which is based on Equations (2)-(4). The details of the numerical method are sketched in Figure 9, where one can appreciate the block for the hyperbolic tangent function given in Equation (3), which is implemented as already shown in [31]. The general architecture shown in Figure 8 consists of a finite state machine (FSM) that controls the iterations of the numerical method, whose data are saved in the registers (x, y, z). The block labeled Hopfield Chaotic Neuronal Network contains the hardware that evaluates the 4th-order Runge-Kutta method, receiving the data at iteration (x_i, y_i, z_i) and providing the data at the next iteration (x_{i+1}, y_{i+1}, z_{i+1}). The Mux blocks introduce the initial conditions (x_0, y_0, z_0) and select the values (x_{i+1}, y_{i+1}, z_{i+1}) for all the remaining iterations. The Reg blocks are parallel-in, parallel-out arrays that save the data being processed within the dynamical system. The output of the whole architecture provides a binary string associated with a specific state variable (x, y, z). The hyperbolic tangent function given in [31] and used in Figure 9 is described by Equations (6) and (7), with L = 2, β = 1, and θ = 0.25. Table 6 shows the hardware resources for the implementation of the four cases given in Table 2 for the Hopfield neuron, using the 4th-order Runge-Kutta method on the FPGA Cyclone IV EP4CE115F29C7.
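A software model of the signed fixed-point arithmetic used in these implementations (the 5.27 format: 5 integer bits including sign, 27 fractional bits, 32 bits total) helps in checking dynamic range before synthesis. The sketch below is an illustrative quantizer, not the paper's hardware description:

```python
def to_fixed(x, int_bits=5, frac_bits=27):
    """Quantize x to a signed two's-complement fixed-point raw integer in
    int_bits.frac_bits format (total width int_bits + frac_bits), with
    saturation at the representable range [-2^(int_bits-1), 2^(int_bits-1))."""
    scale = 1 << frac_bits
    total = int_bits + frac_bits
    lo = -(1 << (total - 1))
    hi = (1 << (total - 1)) - 1
    return max(lo, min(hi, int(round(x * scale))))

def from_fixed(raw, frac_bits=27):
    """Convert a raw fixed-point integer back to a float."""
    return raw / (1 << frac_bits)
```

With 5 integer bits the representable range is [-16, 16), which comfortably covers the bounded Hopfield state variables, while the 27 fractional bits give a resolution of 2^-27.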
In a similar way, the FPGA-based implementation of the Hindmarsh-Rose neural network given in Equation (5) is developed by applying a numerical method. In this case, applying the Forward Euler method, the hardware description is shown in Figure 10, where one can appreciate the use of a finite state machine (FSM) to control the iterations associated with the numerical method, multiplexers to process the initial conditions and afterwards the remaining iterations, registers to save the data of the state variables, and blocks to evaluate the equations discretized by Forward Euler. The whole iterative process to generate the next value of the state variables requires seven clock (CLK) cycles.
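The Forward-Euler update can be sketched using the classical textbook parameterization of the Hindmarsh-Rose model. Note that the paper's Equation (5) uses the coefficients ϕ, a, b, a1, k, b1, ε, and s, which are not reproduced in this excerpt, so the parameter names and values below are stand-ins:

```python
def hr_step(state, h=0.005, a=1.0, b=3.0, c=1.0, d=5.0,
            s=4.0, xr=-1.6, eps=0.006, I=3.2):
    """One Forward-Euler step of the classical Hindmarsh-Rose neuron:
    dx/dt = y - a*x^3 + b*x^2 - z + I   (membrane potential)
    dy/dt = c - d*x^2 - y               (fast recovery variable)
    dz/dt = eps*(s*(x - xr) - z)        (slow adaptation current)
    Textbook parameters; NOT the coefficient sets of Equation (5)."""
    x, y, z = state
    dx = y - a * x**3 + b * x**2 - z + I
    dy = c - d * x**2 - y
    dz = eps * (s * (x - xr) - z)
    return (x + h * dx, y + h * dy, z + h * dz)
```

In hardware, each of the three update lines maps to a pipeline of multipliers and adders, which is why one state-variable update fits in the seven clock cycles mentioned above for the architecture of Figure 10.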
In both FPGA-based implementations for Hopfield and Hindmarsh-Rose neural networks, the computer arithmetic is performed using fixed-point notation of 5.27 for the Hopfield and 3.29 for the Hindmarsh-Rose neural networks. The FPGA resources for the three sets of values of the Hindmarsh-Rose neural network are listed in Table 7.

Randomness Test: NIST
In this section, the results of the NIST tests [32,33] for both neural networks are shown. The four cases of the Hopfield neuron, using the state variable x with 1000 chaotic time series (binary strings) of 1 million bits each, generated the NIST test results given in Table 8. The results using the original values taken from [5], the three sets of values given in Table 2, and the results using the weight matrix from [14] can be compared. All cases passed the NIST tests with proportions around 99%, and the set of values HNNset2 generated the highest p-value average of 0.7065.
The computer arithmetic for the Hindmarsh-Rose neural network uses the 3.29 fixed-point format, but as the largest variation occurs in the least significant bits (LSBs), only the 16 LSBs of each 32-bit number were used. The NIST tests were performed using 1000 chaotic time series of 1 million bits each. The results are summarized in Table 9, including the averages of the p-values and proportions.
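As an illustration of what the NIST suite evaluates, its simplest member, the frequency (monobit) test of NIST SP 800-22, reduces to a complementary error function of the normalized bit-sum; a sequence passes at the 1% significance level when the p-value exceeds 0.01:

```python
import math

def monobit_pvalue(bits):
    """NIST SP 800-22 frequency (monobit) test: map each bit to +/-1,
    sum to S, and return p-value = erfc(|S| / sqrt(2*n))."""
    n = len(bits)
    s = sum(2 * b - 1 for b in bits)
    return math.erfc(abs(s) / math.sqrt(2 * n))
```

The full suite applies fifteen such tests to each of the 1000 one-million-bit strings and additionally checks that the proportion of passing strings stays within the expected confidence interval, which is the "proportions around 99%" criterion reported above.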

Image Encryption Application
The binary sequences tested in the previous section can be taken as pseudorandom number generators (PRNGs) and used to design a chaotic secure communication system to encrypt images, as shown in Figure 11. Those PRNGs can be implemented by either the Hopfield or the Hindmarsh-Rose neural network because both provide high randomness. Both neurons can also be implemented using memristors, as shown in [34], which constitutes another research direction in hardware security. For instance, using the four binary sequences from the FPGA-based implementation of the Hopfield neural network, we show the encryption of three images (Lena, Fruits, and Baboon) in Figure 12. The three RGB images have a resolution of 512 × 512 pixels. Figure 12 uses the hardware summarized in Table 6 and the parameters corresponding to HNNset2; the original image is shown in the left column, the encrypted image in the center, and the recovered image in the right column, for: (a) Lena, (b) Fruits, and (c) Baboon.
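A common way to apply such PRNG bit streams to image encryption, and a plausible reading of the scheme in Figure 11, is a per-channel XOR with a keystream of the same size; XOR-ing the cipher image with the same keystream recovers the plain image. The sketch below uses numpy's generator as a stand-in for the chaotic PRNG:

```python
import numpy as np

def xor_encrypt(channel, keystream):
    """Encrypt (or decrypt) one 8-bit image channel by XOR with a
    keystream of equal shape; applying it twice recovers the original."""
    return np.bitwise_xor(channel, keystream)

rng = np.random.default_rng(0)  # stand-in for the chaotic PRNG bytes
img = rng.integers(0, 256, (512, 512), dtype=np.uint8)   # one channel
key = rng.integers(0, 256, (512, 512), dtype=np.uint8)   # keystream
cipher = xor_encrypt(img, key)
recovered = xor_encrypt(cipher, key)
```

For an RGB image, either three distinct keystreams (one per channel, as in the x, y, z assignment discussed below) or a single shared keystream can be used.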
The correlation analysis is performed using Equations (8)-(10) [35], and it provides the values given in Table 10. The first row (x, y, z) means that the chaotic time series of the state variable x is used to encrypt the R (red) channel, y the G (green) channel, and z the B (blue) channel. The second row means that all of R, G, and B are encrypted using the data from x, and so on. Figure 13 shows the histograms of the Lena image before and after encryption using HNNset2 from Table 2. To describe the distribution characteristics of the histograms quantitatively, the variance of the histograms for the three images (Lena, Fruits, and Baboon) is calculated according to [36], and the results are shown in Table 11.
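The adjacent-pixel correlation metric of Equations (8)-(10) can be approximated by sampling pairs of neighboring pixels and computing their correlation coefficient; values near 1 indicate the strong redundancy of natural images, while a good cipher image yields values near 0. The sampling scheme below is an assumption for illustration:

```python
import numpy as np

def adjacent_correlation(img, axis=1, n=2000, seed=0):
    """Correlation coefficient between n randomly sampled pairs of
    horizontally (axis=1) or vertically (axis=0) adjacent pixels."""
    rng = np.random.default_rng(seed)
    h, w = img.shape
    if axis == 1:
        r = rng.integers(0, h, n); c = rng.integers(0, w - 1, n)
        a, b = img[r, c], img[r, c + 1]
    else:
        r = rng.integers(0, h - 1, n); c = rng.integers(0, w, n)
        a, b = img[r, c], img[r + 1, c]
    return np.corrcoef(a.astype(float), b.astype(float))[0, 1]
```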
The entropy is evaluated by Equation (11), where P(s i ) represents the probability of the datum s i . Using 8 bits (N = 8), Table 12 shows the entropies for the three images (Lena, Fruits and Baboon).
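Equation (11) is the Shannon entropy over the 2^N grey levels; with N = 8, the ideal value for a cipher image is 8 bits per pixel. In code:

```python
import numpy as np

def shannon_entropy(channel):
    """Information entropy H = -sum_i P(s_i) * log2(P(s_i)) over the 256
    grey levels of an 8-bit channel (Equation (11) with N = 8)."""
    counts = np.bincount(channel.ravel(), minlength=256)
    p = counts[counts > 0] / channel.size
    return float(-(p * np.log2(p)).sum())
```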
To verify the encryption capability against differential attacks, the NPCR test is evaluated by Equations (12) and (13) [37], where C1(i, j) and C2(i, j) are two cipher images encrypted from two plain images with only a one-bit difference. In this case, using HNNset2 for the Lena, Fruits, and Baboon images, the NPCR values are: 99.2672 when using the state variable x, 99.2886 when using y, and 99.3420 when using z. All NPCR results pass the criterion given in [38].
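The NPCR of Equations (12) and (13) is simply the percentage of pixel positions at which the two cipher images differ; a direct sketch:

```python
import numpy as np

def npcr(c1, c2):
    """Number of Pixel Change Rate (percent): the fraction of pixel
    positions where the two cipher images C1 and C2 differ."""
    return 100.0 * float(np.mean(c1 != c2))
```

For two independent uniformly random 8-bit cipher images, the expected NPCR is 100·(1 − 1/256) ≈ 99.61%, which is why values above roughly 99.2% are taken as evidence of resistance to differential attacks.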
The binary sequences from the FPGA implementation of the Hindmarsh-Rose neural network shown in Figure 7 were used as PRNGs to encrypt the Lena, Fruits, and Baboon images, and the results are shown in Figure 14, using the hardware summarized in Table 7; the original image is shown in the left column, the encrypted in the center, and the recovered in the right column.
The correlation analysis between the images and the sets of values from Table 7 provides the values given in Table 13. The first row (x, y, z) means that the chaotic time series of the state variable x is used to encrypt R (red), y G (green), and z B (blue). The second row means that all of R, G, and B are encrypted using the data from x, and so on. Figure 15 shows the histograms of the Lena image before and after encryption using HRNset1 from Table 4. The variance of the histograms for the three images (Lena, Fruits, and Baboon) using HRNset1 is calculated according to [36], and the results are shown in Table 14. The entropy is evaluated by Equation (11) using HRNset1 and 8 bits (N = 8); Table 15 shows the entropies for the three images (Lena, Fruits, and Baboon). The key space is equal to 160 bits because each datum is encoded using 32 bits: the initial condition of each state variable counts, and the step-size of the numerical method, which can also change, is likewise encoded using 32 bits. The NPCR test results using HRNset1 for the color images were: 99.60365 when using the state variable x, 99.49608 when using y, and 98.49586 when using z.
In both FPGA-based implementations for the Hopfield and Hindmarsh-Rose neural networks, the image is transmitted from a personal computer running MatLab to the FPGA using the serial port RS-232, as described in [31].

Conclusions
The use of two well-known neural networks, the Hopfield and the Hindmarsh-Rose, for image encryption applications has been described. With the help of bifurcation diagrams, new feasible sets of values were proposed in order to generate binary strings with more randomness than the ones previously published in the literature: three sets of values for the Hopfield neuron and two for the Hindmarsh-Rose neuron. The chaotic time series were analyzed with TISEAN to compute the Lyapunov exponents and Kaplan-Yorke dimension, and the proposed sets of values yielded higher positive Lyapunov exponents and Kaplan-Yorke dimensions than the already published ones.
By applying numerical methods, we described the hardware designs of both neurons and listed the FPGA resources for the Hopfield and Hindmarsh-Rose neurons, respectively. The binary strings generated by the FPGA-based implementations of both neurons were taken as PRNGs to perform the encryption of the RGB Lena, Fruits, and Baboon images. The success of the encryption system has been confirmed by the results obtained from the correlation, histogram, variance, entropy, and NPCR tests. This demonstrates that both neurons are very useful for chaotic image encryption.