Abstract
The time delay caused by transmission in neurons is often ignored, yet both theory and practice demonstrate that it is unavoidable. A new chaotic neuron model with time delay self-feedback is proposed based on Chen's chaotic neuron. Bifurcation diagrams and Lyapunov exponent diagrams are used to analyze the chaotic characteristics of the neuron when it receives output signals from different times. The experimental results show that the model has rich dynamic behavior. In addition, the randomness of the chaotic series generated by the chaotic neuron with time delay self-feedback is verified under different conditions. To investigate the application of this model, an image encryption scheme is proposed. Security analysis of the simulation results shows that the encryption algorithm has excellent anti-attack ability. Therefore, the study of chaotic neurons with time delay self-feedback is both necessary and practical.
1. Introduction
The Hopfield neural network is a typical dynamic neural network with abundant dynamic characteristics [1–3]. It is considered the basic structure for studying chaotic neural networks, mainly because nonlinear dynamic behavior is introduced into the Hopfield neural network and chaos is the typical embodiment of this behavior. Therefore, many chaotic neural networks are based on the Hopfield neural network. A chaotic neural network effectively solves the problem that the Hopfield neural network easily falls into local minimum or infeasible solutions during optimization [4].
Chen's chaotic neural network is a typical chaotic neural network [5, 6]. It introduces a self-feedback term into the discrete Hopfield neural network. In the initial stage, the ergodic property of chaos is used for a rough search over a large range to avoid falling into local minima [7, 8]. To improve the optimization performance of chaotic neural networks, many scholars have proposed improvements [9–15]. Based on Chen's chaotic neural network, Xu et al. proposed the wavelet chaotic neural network, which uses the Morlet wavelet function as the excitation function [10]. Ye and Yang presented the Legendre chaotic neural network, which combines a Legendre polynomial with the Sigmoid function as the excitation function [11, 14]. These chaotic neural network models have rich dynamic characteristics. The study of chaotic systems has attracted wide attention. In [16], the synchronization problem for a class of hyperchaotic systems is addressed with integral-type sliding mode control, which controls the nonlinear dynamics and chaotic behavior to achieve finite-time synchronization. In [17], an adaptive nonsingular terminal sliding mode control method based on a barrier function is proposed to solve the robust stabilization problem of disturbed nonlinear systems. However, for convenience of analysis and application, these studies do not consider the delay in signal transmission.
Time delay refers to the fact that the effect of a signal always arrives later than the physical event that produces it [18–20]. Time delay is universal and inevitable [21, 22]. Introducing time delay can enrich the dynamic characteristics of the original system and benefit applications such as encryption [23–25] and control [26]. The analysis of systems with time delay has become a research hotspot [27–33]. Some one-dimensional chaotic maps were studied in [27–29]. A projection neural network with two time delays was proposed for solving quadratic programming problems subject to linear constraints in [30]. Wang investigated the stability of fractional-order Hopfield neural networks with time delay [31]. These studies show that the introduction of time delay makes the dynamic behavior of a model more complicated.
A chaotic neuron is the basic unit of the chaotic neural network, so it is the focus of our research. Grasping the chaotic characteristics of a single chaotic neuron can also provide a necessary premise for understanding the corresponding chaotic neural network [34]. There is a delay in signal transmission within the neuron. Input signals and output signals closely cooperate and influence each other. Therefore, introducing time delay into chaotic neurons is more suitable for signal transmission in chaotic neurons.
In this paper, a chaotic neuron model with time delay self-feedback is proposed by introducing time delay self-feedback into Chen's chaotic neuron model. Bifurcation diagrams and Lyapunov exponent diagrams are used to analyze the chaotic characteristics of the time delay self-feedback chaotic neuron under different input signals and parameter configurations. Test suites such as NIST SP800-22 and TestU01 are used to evaluate the randomness of the sequences generated by the model.
Currently, encryption methods based on chaos theory are among the mainstream approaches in the field of image encryption. The chaotic time series generated by the time delay self-feedback chaotic neuron are applied to image encryption in combination with a classical permutation-diffusion encryption structure. Simulation experiments are conducted to analyze the performance of the image encryption algorithm.
The structure of the paper is as follows. Section 2 introduces the time delay self-feedback chaotic neuron model and analyzes the dynamical behavior and the randomness of its time series under different parameters and different numbers of delayed feedback connections. Section 3 proposes a color image encryption algorithm based on the time series generated by chaotic neurons with time delay self-feedback. Section 4 presents the simulation experiments and security analysis of the encryption algorithm. The conclusion is drawn in Section 5.
2. Chaotic Neuron with Time Delay Self-Feedback
The model of the chaotic neuron with time delay self-feedback is as follows, where is the output of the neuron at time , is the internal state of the neuron at time , is the steepness parameter of the Sigmoid function, represents the ability of the neuron to retain its internal state, ranging from , is the number of output signals received, is the weight of the output signals, is the self-feedback connection weight, and is the annealing speed parameter.
Hardware implementations of neurons can be derived from the discretized equations using specific numerical methods [35–37]. The chaotic neuron with time delay self-feedback proposed in this paper is a small dynamic system; its structure, based on Equations (1)–(4), is shown in Figure 1. The output of the neuron at is delayed to to provide data for the iteration of the neuron at .
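Because Equations (1)–(4) are not reproduced here, the following Python sketch only illustrates the intended iteration scheme: it assumes the standard Chen-Aihara transiently chaotic neuron (sigmoid output, damped internal state, annealed self-feedback strength) with the self-feedback term driven by delayed outputs. All parameter names and numerical values in the sketch are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def sigmoid(y, eps):
    """Sigmoid activation with steepness parameter eps."""
    return 1.0 / (1.0 + np.exp(-y / eps))

def iterate_delay_neuron(steps, delays, weights, k=0.9, z0=0.1, beta=0.003,
                         i0=0.65, eps=0.004, y0=0.283):
    """Iterate a transiently chaotic neuron whose self-feedback term is driven
    by delayed outputs x(t - d). The map below is an assumed stand-in for
    Equations (1)-(4): y is the internal state, z the annealed self-feedback
    strength, and x the sigmoid output."""
    y, z = y0, z0
    x_hist = [sigmoid(y, eps)]                     # output history for the delays
    for _ in range(steps):
        # weighted sum of delayed outputs fed back into the refractory term
        fb = sum(w * x_hist[max(0, len(x_hist) - 1 - d)]
                 for d, w in zip(delays, weights))
        y = k * y - z * (fb - i0)                  # internal-state update
        z = (1.0 - beta) * z                       # simulated-annealing decay
        x_hist.append(sigmoid(y, eps))
    return np.array(x_hist)

# Example: one-time delay self-feedback (Section 2.2), delay d = 1, weight 1.0
xs = iterate_delay_neuron(steps=2000, delays=[1], weights=[1.0])
```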

This paper studies the dynamics of the chaotic neuron with time delay self-feedback under the cases listed in Table 1, which summarizes several configurations with different time delays and influence weights.
2.1. Chaotic Neurons with No Time Delay
2.1.1. Dynamical Behaviour in Chaotic Neurons with No Time Delay
When , , the single neuron has no time delay, which is Chen’s chaotic neuron. The single neuron model can be described as follows:
The state bifurcation diagram and the maximal Lyapunov exponential time evolution diagram can reflect the chaotic searching ability of the chaotic neuron to some extent [38, 39]. According to the model, the dynamic behavior of chaotic neuron is affected by four parameters. In order to analyze the influence of each parameter on the dynamic behavior of a single neuron, the remaining parameters are fixed and the bifurcation diagrams and Lyapunov exponent diagrams of the chaotic neuron are observed by changing one parameter.
Parameter selection is a result of experience and repeated testing. At the beginning, we set the iteration step of the parameter to a larger size so that the parameter changes quickly, and the training speed is guaranteed by limiting the number of iterations of the chaotic neuron with time delay self-feedback. From the bifurcation diagram and the maximal Lyapunov exponent diagram, the approximate range of the parameter is determined. Within this rough range, the iteration step size is reduced and the selected parameter is fine-tuned. We then observe the bifurcation diagram and the maximal Lyapunov exponent diagram and select the best parameter values for the chaotic neuron with time delay self-feedback.
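The parameter sweep described above can be reproduced numerically. The sketch below, assuming the same stand-in map as the previous sketch (here in its no-delay form for Section 2.1), collects post-transient outputs for a bifurcation diagram and estimates the largest Lyapunov exponent from two nearby trajectories with renormalization; the parameter names, ranges, and initial values are illustrative.

```python
import numpy as np
import matplotlib.pyplot as plt

def neuron_step(y, z, x_prev, k, beta, i0, eps):
    """One iteration of the assumed transiently chaotic neuron map (no delay)."""
    y_new = k * y - z * (x_prev - i0)
    z_new = (1.0 - beta) * z
    x_new = 1.0 / (1.0 + np.exp(-y_new / eps))
    return y_new, z_new, x_new

def sweep_parameter_k(k_values, n_iter=1500, n_keep=200,
                      beta=0.003, i0=0.65, eps=0.004, d0=1e-9):
    """For each k: post-transient outputs (bifurcation branches) and a crude
    largest-Lyapunov estimate from a renormalized perturbed trajectory."""
    branches, lyap = [], []
    for k in k_values:
        y1, z1, x1 = 0.283, 0.1, 0.5
        y2, z2, x2 = y1 + d0, z1, x1
        log_sum, outs = 0.0, []
        for t in range(n_iter):
            y1, z1, x1 = neuron_step(y1, z1, x1, k, beta, i0, eps)
            y2, z2, x2 = neuron_step(y2, z2, x2, k, beta, i0, eps)
            d = abs(y1 - y2) + 1e-300
            log_sum += np.log(d / d0)
            y2 = y1 + d0 * (y2 - y1) / d          # renormalize the separation
            x2 = 1.0 / (1.0 + np.exp(-y2 / eps))  # keep the output consistent
            if t >= n_iter - n_keep:
                outs.append(x1)
        branches.append(outs)
        lyap.append(log_sum / n_iter)
    return branches, lyap

ks = np.linspace(0.5, 1.0, 300)
branches, lyap = sweep_parameter_k(ks)
fig, (ax1, ax2) = plt.subplots(2, 1, sharex=True)
for k, outs in zip(ks, branches):
    ax1.plot([k] * len(outs), outs, ',k')          # bifurcation diagram
ax2.plot(ks, lyap)
ax2.axhline(0.0, color='r', linewidth=0.5)         # chaos where LE > 0
ax1.set_ylabel('output x'); ax2.set_ylabel('largest LE'); ax2.set_xlabel('k')
plt.show()
```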
Set the parameters as follows: , , , . The bifurcation diagrams and Lyapunov exponent diagrams obtained by varying one parameter at a time are drawn. The simulation results for parameters r and k are shown in Figures 2 and 3, and those for parameters z and h in Figures 4 and 5. The simulation results indicate that this neuron model has transient chaotic dynamic behaviour. The bifurcation diagrams correspond clearly to the Lyapunov exponent diagrams: when the Lyapunov exponent is greater than 0, the neuron is in a chaotic state.

[Figures 2–5: bifurcation diagrams and Lyapunov exponent diagrams for parameters r, k, z, and h, respectively; panels (a) and (b).]
2.1.2. Sensitivity to Initial Condition
An important feature of chaotic neural networks is their sensitivity to initial conditions: small differences in the initial conditions generate different sequences [40]. To test the sensitivity of the chaotic neuron with no time delay to the initial value, Equation (8) is applied to the output of , expanding its range to [41]. We set the parameters as follows: , , , . We make a tiny change to each parameter or initial value and observe the output of under the different circumstances. The simulation results are shown in Figure 6. It can be seen that after 15 iterations, the output of changes greatly.
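The comparison illustrated in Figure 6 can be automated with a small helper that reports the first iteration at which two output sequences, started from slightly different initial values, separate by more than a chosen tolerance; the tolerance value is an illustrative assumption.

```python
import numpy as np

def first_divergence(seq_a, seq_b, tol=1e-3):
    """Index of the first iteration where two output sequences, produced from
    slightly different initial values, differ by more than tol (None if never)."""
    diff = np.abs(np.asarray(seq_a, dtype=float) - np.asarray(seq_b, dtype=float))
    idx = np.flatnonzero(diff > tol)
    return int(idx[0]) if idx.size else None

# e.g. first_divergence(run_a, run_b) for two hypothetical runs whose initial
# internal state differs by 1e-15
```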

[Figure 6: output of the chaotic neuron with no time delay under tiny changes to each parameter or initial value; panels (a)–(e).]
2.1.3. NIST SP800-22 Test
Random numbers play an important role in the field of image encryption. SP800-22 is used to detect the randomness of the time series generated by the chaotic neuron with no time delay. SP800 is a series of guidelines on information security issued by the National Institute of Standards and Technology (NIST) [42]. The time series generated by the chaotic neuron with no time delay is processed by Equation (8) and then by Equations (9) and (10).
The NIST suite comprises 15 randomness tests, each of which outputs a p-value. When the p-value is greater than 0.01, the data set passes the corresponding randomness test. This article tests bits of binary data, and the simulation results are shown in Table 2.
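Since Equations (8)–(10) are not reproduced here, the sketch below shows only a generic way to turn chaotic outputs in [0, 1] into a bit stream suitable as NIST SP800-22 or TestU01 input; the 8-bit quantization is an illustrative assumption, not the paper's post-processing.

```python
import numpy as np

def outputs_to_bits(xs):
    """Quantize chaotic outputs in [0, 1] to an MSB-first bit stream.
    A generic binarization, not the paper's Equations (8)-(10)."""
    xs = np.asarray(xs, dtype=np.float64)
    levels = np.minimum(np.floor(xs * 256.0), 255).astype(np.uint8)  # 8-bit samples
    return np.unpackbits(levels)                                     # bits, MSB first

# 10^8 test bits correspond to 1.25e7 samples at 8 bits per sample.
```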
2.1.4. TestU01 Test
The TestU01 test suite provides a set of utilities for testing random number generators [43]. Its test standard is stricter than that of the NIST suite. In the experiment, three predefined batteries, Rabbit, Alphabit, and BlockAlphabit, are used to evaluate the randomness of the chaotic sequences generated by the chaotic neuron with no time delay. Each battery is applied to sequences of bits. The simulation results, shown in Table 3, indicate that the random numbers generated by this neuron model have good randomness.
2.2. Chaotic Neurons with One-Time Delay Self-Feedback
2.2.1. Dynamical Behaviour in Chaotic Neurons with One-Time Delay Self-Feedback
When , , the output of the chaotic neuron at time is delayed to time , and the single neuron model can be described as follows:
We set the parameters as follows: , , , . The bifurcation diagrams and Lyapunov exponent diagrams obtained by varying one parameter are drawn. The simulation results for parameters r and k are shown in Figures 7 and 8, and those for parameters z and h in Figures 9 and 10.

[Figures 7–10: bifurcation diagrams and Lyapunov exponent diagrams for parameters r, k, z, and h, respectively; panels (a) and (b).]
Compared with the chaotic neuron with no time delay, the chaotic interval of the chaotic neuron with one-time delay self-feedback is not continuous, and the output values cover a wider range for parameters and .
2.2.2. Sensitivity to Initial Condition
We analyze the sensitivity of the model to the initial values. We set the parameters as follows: , , , . It can be seen from Figure 11 that after 12 iterations, the output of changes greatly.

[Figure 11: output of the chaotic neuron with one-time delay self-feedback under tiny changes to each parameter or initial value; panels (a)–(e).]
The sensitivity of the model to the initial value is slightly improved: compared with the chaotic neuron with no time delay, the chaotic neuron with one-time delay self-feedback needs fewer iterations to diverge when the same change is made to the initial value.
2.2.3. NIST SP800-22 Test
The time series generated by the chaotic neuron with one-time delay self-feedback is processed by Equation (8) and then by Equations (9) and (10). This article tests 100000000 bits of binary data, and the results are shown in Table 4. All p-values are greater than 0.01, indicating that the data set passes the randomness tests. Compared with the chaotic neuron with no time delay, the randomness of the time series generated by the chaotic neuron with one-time delay self-feedback is improved.
2.2.4. TestU01 Test
The TestU01 suite is the most rigorous test for evaluating the statistical properties of a generated sequence. The three predefined batteries Rabbit, Alphabit, and BlockAlphabit are used to evaluate the randomness of the sequence generated by the chaotic neuron with one-time delay self-feedback; the sequence length is bits. The results, shown in Table 5, confirm that the generated binary sequence is random.
2.3. Chaotic Neurons with Two-Time Delay Self-Feedback
2.3.1. Dynamical Behaviour in Chaotic Neurons with Two-Time Delay Self-Feedback
When , , , the outputs of the chaotic neuron at times and are delayed to time , and the weight is . The chaotic neuron with two-time delay self-feedback can be described as follows:
We set the parameters as follows: , , , . The bifurcation diagrams and Lyapunov exponent diagrams obtained by varying one parameter are drawn. The simulation results for parameters r and k are shown in Figures 12 and 13, and those for parameters z and h in Figures 14 and 15.

[Figure 12: bifurcation diagram and Lyapunov exponent diagram for parameter r; panels (a) and (b).]
The dynamics of the model are changed by increasing the duration of the time delay and decreasing the proportion of each delayed output in the feedback signal. Compared with the chaotic neuron with one-time delay self-feedback, the chaotic intervals of the parameters , , and increase obviously, and the interval over which the Lyapunov exponent is greater than 0 expands.

[Figures 13–15: bifurcation diagrams and Lyapunov exponent diagrams for parameters k, z, and h, respectively; panels (a) and (b).]
2.3.2. Sensitivity to Initial Condition
We analyze the sensitivity of the model to the initial values. We set the parameters as follows: , , , . It can be seen from Figure 16 that after 10 iterations, the output of changes greatly.

[Figure 16: output of the chaotic neuron with two-time delay self-feedback under tiny changes to each parameter or initial value; panels (a)–(e).]
2.3.3. NIST SP800-22 Test
The time series generated by the chaotic neuron with two-time delay self-feedback is processed by Equation (8) and then by Equations (9) and (10). This article tests 100000000 bits of binary data, and the results are shown in Table 6. Compared with the chaotic neuron with no time delay, the randomness of the time series generated by the chaotic neuron with two-time delay self-feedback is improved.
2.3.4. TestU01 Test
The TestU01 test is performed on the sequence generated by the chaotic neuron with two-time delay self-feedback. The results, shown in Table 7, confirm that the generated binary sequence is random.
2.4. Chaotic Neurons with Three-Time Delay Self-Feedback
2.4.1. Dynamical Behaviour in Chaotic Neurons with Three-Time Delay Self-Feedback
When , , the outputs of the chaotic neuron at times , , and are delayed to time , and the weight is . The chaotic neuron with three-time delay self-feedback can be described as follows:
We set the parameters as follows: , , , . The bifurcation diagrams and Lyapunov exponent diagrams obtained by varying one parameter are drawn. The simulation results for parameters r and k are shown in Figures 17 and 18, and those for parameters z and h in Figures 19 and 20. It can be seen from Figures 17–20 that the maximum Lyapunov exponent is increased compared with the chaotic neuron with no time delay.

[Figures 17–20: bifurcation diagrams and Lyapunov exponent diagrams for parameters r, k, z, and h, respectively; panels (a) and (b).]
2.4.2. Sensitivity to Initial Condition
We analyze the sensitivity of the model to the initial values. We set the parameters as follows: , , , . It can be seen from Figure 21 that after 11 iterations, the output of changes greatly. The sensitivity of the model to the initial value is slightly improved.

[Figure 21: output of the chaotic neuron with three-time delay self-feedback under tiny changes to each parameter or initial value; panels (a)–(e).]
2.4.3. NIST SP800-22 Test
The time series generated by the chaotic neuron with three-time delay self-feedback is processed by Equation (8) and then by Equations (9) and (10). This article tests 100000000 bits of binary data, and the results are shown in Table 8. Compared with the chaotic neuron with no time delay, the randomness of the time series generated by the chaotic neuron with three-time delay self-feedback is improved.
2.4.4. TestU01 Test
The TestU01 test is performed on the sequence generated by the chaotic neuron with three-time delay self-feedback. The results, shown in Table 9, confirm that the generated binary sequence is random.
2.5. Chaotic Neurons with Three-Time Delay and Four-Output
2.5.1. Dynamical Behaviour in Chaotic Neurons with Three-Time Delay and Four-Output
When , the outputs of the chaotic neuron at times , , and are delayed to time and, together with the output at time t, are weighted by . The chaotic neuron with three-time delay and four-output can be described as follows:
We set the parameters as follows: , , , . The bifurcation diagrams and Lyapunov exponent diagrams obtained by varying one parameter are drawn. The simulation results for parameters r and k are shown in Figures 22 and 23. The dynamic behavior of the neuron shows a distinct spiral shape, and the chaotic range is obviously expanded for parameters z and h (Figures 24 and 25).

[Figures 22–25: bifurcation diagrams and Lyapunov exponent diagrams for parameters r, k, z, and h, respectively; panels (a) and (b).]
2.5.2. Sensitivity to Initial Condition
We analyze the sensitivity of the model to the initial values. We set the parameters as follows: , , , . It can be seen from Figure 26 that after 8 iterations, the output of changes greatly. The sensitivity to initial values is significantly improved. Therefore, the chaotic neuron with three-time delay and four-output is sensitive to initial values and helps ensure the security of encryption.

[Figure 26: output of the chaotic neuron with three-time delay and four-output under tiny changes to each parameter or initial value; panels (a)–(e).]
2.5.3. NIST SP800-22 Test
The time series generated by the chaotic neuron with three-time delay and four-output is processed by Equation (8) and then by Equations (9) and (10). This article tests 100000000 bits of binary data, and the results are shown in Table 10. The results show that the time series generated by this model has good randomness and can guarantee the security of encryption.
2.5.4. TestU01 Test
The TestU01 test is performed on the sequence generated by the chaotic neuron with three-time delay and four-output. The results, shown in Table 11, confirm that the generated binary sequence is random.
3. Encryption Process
This paper proposes a color image encryption algorithm based on the chaotic neuron with one-time delay self-feedback and the chaotic neuron with three-time delay and four-output. The process is as follows (a code sketch of the permutation-diffusion structure is given below):
Step 1: Read the original color image A with the size of , where and are the numbers of rows and columns of A, respectively.
Step 2: Preprocess the image by dividing A into three individual gray images corresponding to the R, G, and B channels, each with the same size of .
Step 3: Iterate Equation (21) and times separately to produce two chaotic sequences and from the initial values.
Step 4: Process the two sequences with Equations (23) and (24) to obtain new sequences and , ranging from .
Step 5: Use these two sequences as the row addresses and column addresses of the permutation matrix, which replace the row and column positions of the image pixels. The permutation process is shown in Figure 27.
Step 6: Iterate Equation (12) times to produce two chaotic sequences and .
Step 7: Take a point in the permutated image in order. If the ordinal number of the point is odd, construct the secret key from with Equation (25); if it is even, construct the secret key from with Equation (26).
Step 8: XOR the pixel gray value in the permutated image with the secret key constructed in Step 7 to obtain the encrypted pixel value .
Step 9: Repeat Steps 7 and 8 until all pixels have been encrypted to obtain the encrypted channel image.
Step 10: Perform the same operations on the other two channels to obtain their encrypted images and .
Step 11: Combine the three encrypted channel images to produce a color encrypted image with the size of .

The encryption process is shown in Figure 28. The decryption process is the reverse of the encryption process: the original image can be restored by applying the inverse operations under the correct secret key.
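The following Python sketch captures the permutation-diffusion structure of Steps 1-11 under some simplifying assumptions: the chaotic sequences from the delayed-feedback neurons (Equations (12) and (21)) are represented by a placeholder generator chaos_seq, and Equations (23)–(26) are replaced by simple sort-based addressing and 8-bit quantization. It is not the paper's exact implementation.

```python
import numpy as np

def permute_channel(channel, row_seq, col_seq):
    """Scramble rows and columns using the rank order of two chaotic sequences."""
    row_order = np.argsort(row_seq)           # chaotic values -> row addresses
    col_order = np.argsort(col_seq)           # chaotic values -> column addresses
    return channel[row_order][:, col_order], row_order, col_order

def diffuse_channel(permuted, key_a, key_b):
    """XOR each pixel with a key byte taken alternately from two keystreams."""
    flat = permuted.ravel().astype(np.uint8)
    idx = np.arange(flat.size)
    keys = np.where(idx % 2 == 0, key_a[:flat.size], key_b[:flat.size])
    return (flat ^ keys.astype(np.uint8)).reshape(permuted.shape)

def encrypt_color_image(img, chaos_seq):
    """img: H x W x 3 uint8 array.  chaos_seq(n) returns n chaotic values in
    [0, 1) and stands in for the delayed-feedback neuron sequences."""
    h, w, _ = img.shape
    cipher = np.empty_like(img)
    for c in range(3):                                      # R, G, B channels
        permuted, _, _ = permute_channel(img[:, :, c], chaos_seq(h), chaos_seq(w))
        key_a = (chaos_seq(h * w) * 256).astype(np.uint8)   # first keystream
        key_b = (chaos_seq(h * w) * 256).astype(np.uint8)   # second keystream
        cipher[:, :, c] = diffuse_channel(permuted, key_a, key_b)
    return cipher

# Placeholder generator; replace with sequences from the chaotic neurons.
rng = np.random.default_rng(0)
cipher = encrypt_color_image(np.zeros((256, 256, 3), np.uint8), lambda n: rng.random(n))
```

Decryption reverses these steps: XOR the cipher pixels with the same keystreams, then invert the row and column permutations from the stored argsort orders.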

4. Security Analysis
4.1. Simulation Result
In the experiment, three different images are used for simulation in the MATLAB R2018b environment. The obtained encrypted images are shown in Figure 29. As can be seen from Figure 29, after the encryption process the outlines of the plain images cannot be seen in the encrypted images, which achieves the goal of encryption.

[Figure 29: simulation results for the three test images; panels (a)–(i).]
4.2. Information Entropy
The uncertainty of digital images is analyzed by the information entropy [44]. The larger the information entropy is, the more uniform the pixel value distribution of the image is, and the stronger the anti-attack ability of the image is. The calculation formula of information entropy is as follows.
Here, represents the gray level of the image and represents the probability of the gray value . For a completely random image with 256 gray levels, the theoretical value of the entropy is 8. The closer the information entropy is to the theoretical value, the less useful information the image expresses.
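A minimal sketch of this measurement for one 8-bit channel, using the standard Shannon entropy H = -sum p(i) log2 p(i); the channel array is assumed to hold integer gray values in 0–255.

```python
import numpy as np

def information_entropy(channel):
    """Shannon entropy of an 8-bit gray image (theoretical maximum is 8)."""
    hist = np.bincount(channel.ravel().astype(np.int64), minlength=256)
    p = hist / hist.sum()
    p = p[p > 0]                          # skip gray levels that never occur
    return float(-np.sum(p * np.log2(p)))
```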
Table 12 records the information entropy of plain images and encrypted images and compares it with other algorithms. The results show that the image encrypted by the scheme proposed in this paper can better conceal information and resist statistical analysis.
4.3. Histogram Analysis
An image histogram shows the distribution of pixel values in an image: the abscissa represents the pixel value, and the ordinate represents the number of times each pixel value appears in the image [51, 52]. The histogram of the encrypted image should be uniformly distributed, so that it reveals little statistical information about the pixel distribution of the original image.
Figure 30 shows that the histograms of the RGB channels of the plain images fluctuate strongly, while the histograms of the RGB channels of the encrypted images are relatively flat and the pixel values are uniformly distributed. Therefore, this encryption system can resist statistical attacks.

[Figure 30: histograms of the RGB channels of the plain and encrypted images; panels (a)–(f).]
4.4. Correlation Analysis
The correlation of adjacent pixels reflects the degree of correlation between neighbouring pixels in an image. The higher the correlation between adjacent pixels, the easier it is to obtain image information. Therefore, to resist statistical attacks, the correlation between adjacent pixels must be reduced [53, 54]. The correlation is calculated as follows, where and are the gray values of adjacent pixels and denotes the correlation value between them. The correlation among adjacent pixels in the horizontal, vertical, and diagonal directions of the G channel of the Lena image is shown in Figure 31, and the correlation coefficients for each direction are listed in Table 13.
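As a sketch, the coefficient can be estimated from randomly sampled adjacent pixel pairs in a chosen direction; the number of sampled pairs is an illustrative choice.

```python
import numpy as np

def adjacent_correlation(channel, direction="horizontal", n_pairs=5000, seed=0):
    """Correlation coefficient of randomly sampled adjacent pixel pairs."""
    rng = np.random.default_rng(seed)
    h, w = channel.shape
    dy, dx = {"horizontal": (0, 1), "vertical": (1, 0), "diagonal": (1, 1)}[direction]
    ys = rng.integers(0, h - dy, n_pairs)
    xs = rng.integers(0, w - dx, n_pairs)
    a = channel[ys, xs].astype(np.float64)
    b = channel[ys + dy, xs + dx].astype(np.float64)
    return float(np.corrcoef(a, b)[0, 1])
```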

[Figure 31: correlation of adjacent pixels in the horizontal, vertical, and diagonal directions of the G channel of the Lena image; panels (a)–(f).]
4.5. Differential Attack
The number of pixels change rate (NPCR) and the unified average changing intensity (UACI) are used to analyze the ability of the encryption algorithm to resist differential attacks. One pixel of the original image is changed at random, the changed image is encrypted, and the two encrypted images are compared to analyze the sensitivity of the algorithm. NPCR and UACI are defined as follows [57, 58], where and are the width and length of the image, respectively, and and are the encrypted images obtained by encrypting two original images that differ in only one pixel. The theoretical values of NPCR and UACI are 99.6094% and 33.4635%, respectively. It can be seen from Table 14 that the NPCR and UACI of the images are close to the theoretical values, indicating that the algorithm can resist differential attacks well.
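A minimal sketch of the two measures, following their standard definitions for 8-bit cipher images:

```python
import numpy as np

def npcr_uaci(c1, c2):
    """NPCR and UACI (in %) between two cipher images of the same shape (uint8)."""
    c1 = c1.astype(np.float64)
    c2 = c2.astype(np.float64)
    npcr = (c1 != c2).mean() * 100.0                 # % of differing pixels
    uaci = (np.abs(c1 - c2) / 255.0).mean() * 100.0  # mean normalized intensity change
    return npcr, uaci
```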
4.6. Secret Key Sensitivity
Secret key sensitivity refers to decrypting the same encrypted image with two secret keys that differ only very slightly [63]; the plain image can be retrieved only with the correct secret key. The secret keys of the algorithm in this paper include , , , , and . The initial value of the chaotic neuron with one-time delay self-feedback is set to and , respectively, while the remaining secret keys are kept unchanged, and Figure 29(g) is decrypted. Two completely different decrypted images are obtained, as shown in Figure 32. Therefore, the encryption algorithm in this paper is very sensitive to changes in the secret key: subtle changes result in an unrecognizable decrypted image.

[Figure 32: decryption results with slightly different secret keys; panels (a) and (b).]
4.7. Data Loss
During storage and transmission, the encrypted image is vulnerable to interception and attack, which may cause loss of information in the encrypted image so that the decrypted image cannot be obtained [64]. To analyze the anticlipping ability of the algorithm, a block of size is selected from the encrypted image to simulate data loss: its pixel values are set to 0, with β equal to 15%, 25%, and 50%, respectively. The processed images are decrypted as shown in Figure 33.
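A simple way to simulate this test; this sketch assumes β is the fraction of the cipher image that is zeroed and that the damaged region is a single square block, which are assumptions rather than the paper's exact setup.

```python
import numpy as np

def crop_attack(cipher, beta, seed=0):
    """Zero out a square block covering roughly a fraction beta of the image."""
    rng = np.random.default_rng(seed)
    h, w = cipher.shape[:2]
    side = int(min(np.sqrt(beta * h * w), h, w))
    y0 = rng.integers(0, h - side + 1)
    x0 = rng.integers(0, w - side + 1)
    damaged = cipher.copy()
    damaged[y0:y0 + side, x0:x0 + side] = 0
    return damaged
```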

[Figure 33: decryption results under different degrees of data loss; panels (a)–(h).]
As can be seen from Figure 33, most of the original image information can be recovered under different degrees of data loss in the encrypted image. As the data loss increases, the decrypted image gradually becomes blurred. However, even if half of the image information is lost, some features of the original image can still be recognized, which shows that the algorithm has a good ability to resist cropping attacks.
4.8. Noise Attack
During transmission, the encrypted image will inevitably be affected by various factors that interfere with its decryption, so an image encryption algorithm must be sufficiently robust [65, 66]. In the simulation experiment, salt-and-pepper noise of different densities is added to the encrypted image, and the noisy encrypted image is then decrypted. The decryption results are shown in Figure 34.
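A minimal sketch of the noise model used in this test; the density parameter and the even split between salt and pepper are the usual convention and are assumed here.

```python
import numpy as np

def add_salt_pepper(img, density, seed=0):
    """Set a fraction `density` of pixels to 0 or 255 (salt-and-pepper noise)."""
    rng = np.random.default_rng(seed)
    noisy = img.copy()
    hit = rng.random(img.shape[:2]) < density      # pixels affected by noise
    salt = rng.random(img.shape[:2]) < 0.5         # half salt, half pepper
    noisy[hit & salt] = 255
    noisy[hit & ~salt] = 0
    return noisy
```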

[Figure 34: decryption results under different densities of salt-and-pepper noise; panels (a)–(f).]
As can be seen from Figure 34, most of the original image information can be recovered from the encrypted image under different levels of noise attack. As the noise level increases, the decrypted image gradually becomes blurred, but some features of the original image can still be recognized, which shows that the algorithm has good antinoise capability.
4.9. Speed Analysis
Encryption efficiency reflects the practicality of an image encryption algorithm. Table 15 shows the average time, over ten runs, to encrypt and decrypt images of different sizes. The execution time of the encryption algorithm in this paper is short, which gives it an advantage in speed.
Time complexity is an important performance index to evaluate encryption algorithms. Assuming that the size of the image is , the time complexity of the algorithm mainly depends on the permutation and diffusion operation of the image. After calculation, it can be concluded that the time complexity of the encryption algorithm used in this paper is . Table 16 shows the time complexity of the encryption algorithm used in this paper and compares it with other schemes.
4.10. Secret Key Space
The secret key space directly affects the feasibility of an exhaustive attack [69]: a large key space increases the difficulty of cracking the encryption algorithm and improves security. In the proposed encryption algorithm, the secret key space is composed of 10 parameters: 5 parameters of the chaotic neuron with one-time delay self-feedback and 5 parameters of the chaotic neuron with three-time delay and four-output. The value range of is , , , , and . Because a double-precision floating-point number can accurately represent 14 digits after the decimal point, the secret key space is as follows:
This secret key space is large enough to resist exhaustive attacks.
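As an illustrative estimate (not the paper's elided expression), if each of the 10 double-precision keys contributes about 10^14 distinguishable values, the key space is roughly S ≈ (10^14)^10 = 10^140 ≈ 2^465, far larger than the 2^100 level commonly regarded as sufficient against brute-force attacks.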
5. Conclusions
In this paper, time delay self-feedback is introduced into Chen's chaotic neuron, and a chaotic neuron model with time delay self-feedback is proposed. The model delays the output signals before time to time , and the chaotic neuron is analyzed under different delay times. The results show that the sensitivity of the chaotic neuron with time delay self-feedback to the initial value and the randomness of the generated time series are improved obviously. Time delay can make the chaotic neuron exit and re-enter chaos periodically, which makes the chaos control process more flexible. To verify the practicability of the chaotic neuron model with time delay self-feedback, we design a color image encryption algorithm based on the model. Simulation results show that the color image encryption algorithm has high security and can resist attacks effectively.
Data Availability
The data and code used to support the findings of this study are available from the corresponding author upon request.
Conflicts of Interest
The authors declare that they have no conflicts of interest.
Acknowledgments
This work was funded by the Natural Science Foundation of Heilongjiang Province (grant number LH2021F035).