# Difference-Based Mutation Operation for Neuroevolution of Augmented Topologies

## Abstract


## 1. Introduction

## 2. Materials and Methods

#### 2.1. Neuroevolution

#### 2.2. Encoding Scheme Used in NEAT

#### 2.3. Crossover and Mutation Operators Used in NEAT

- Add connection: two nodes are selected randomly and connected to each other. The connection weight is assigned randomly as follows: first, a random value from [0,1] is generated; if it is smaller than 0.5, the weight is set as a normally distributed random number with mean 0 and standard deviation 0.1 ($normrand(0,0.1)$). Otherwise, the weight is set close to 1, i.e., $normrand(1,0.1)$. A new innovation number is assigned to the connection.
- Mutating random node: the type of operation performed in a randomly chosen node is changed to another one, and a new innovation number is assigned to the node. Only hidden nodes are mutated.
- Connection removal: if the structure of the network is not minimal, a randomly chosen connection is deleted. Connections between inputs and outputs are never chosen.
- Connections merging: if the network structure contains at least two connections following the same path, i.e., having the same source and destination, these connections are merged together, and the resulting weight is the sum of their weights. The new connection receives the innovation number of one of the merged connections.
- Adding a node to a connection: one of the connections is randomly selected and divided into two, with a node placed in between. The new node receives a new innovation number and an operation; one of the two weights is set to 1, while the other keeps the previous value. The connections receive new innovation numbers.
- Assigning random weights: every connection is mutated. With probability $1/NumberOfConnections$, the connection weight is replaced by a value drawn from either $normrand(0,0.1)$ or $normrand(1,0.1)$; otherwise, the current weight is used as the mean to generate a new one: $w=normrand(w,0.01)$, where $w$ is the current weight value. In addition, with probability $1/NumberOfConnections$, each connection is either activated or deactivated.
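The weight-assignment rules shared by these operators can be sketched in Python. This is a minimal illustration, not the authors' code: `random_weight` and `mutate_weights` are hypothetical helper names, and `random.gauss(mu, sigma)` stands in for $normrand$.

```python
import random

def random_weight():
    # As in the "add connection" operator: with probability 0.5 the weight
    # is drawn near 0, otherwise near 1.
    if random.random() < 0.5:
        return random.gauss(0.0, 0.1)   # normrand(0, 0.1)
    return random.gauss(1.0, 0.1)       # normrand(1, 0.1)

def mutate_weights(weights):
    # "Assigning random weights": each connection weight is either reset
    # (with probability 1/NumberOfConnections) or jittered around its
    # current value with a small standard deviation.
    n = len(weights)
    out = []
    for w in weights:
        if random.random() < 1.0 / n:
            out.append(random_weight())
        else:
            out.append(random.gauss(w, 0.01))  # w = normrand(w, 0.01)
    return out
```

The activation/deactivation flag flip is omitted here; it would follow the same per-connection probability.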

#### 2.4. The General Scheme of the NEAT Algorithm

```
Algorithm 1: NEAT

1:  Initialize population P with minimal solutions, calculate fitness fit_i, i = 1, ..., N
2:  for gen = 1 to NG do
3:      Assign species numbers to every individual
4:      for i = 1 to N do
5:          Perform tournament-based selection (t = 2) to get index tr
6:          Perform crossover between P_i and P_tr, save offspring to O_i
7:          Select the mutation operator to be used
8:          Perform mutation on O_i and calculate its fitness
9:      end for
10:     Perform speciation of the combined parent P and offspring O populations
11:     Create the new population from the representatives of every species
12: end for
13: Return the best found solution
```
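The generational loop of Algorithm 1 can be sketched in Python, assuming every problem-specific step (initialization, fitness, crossover, the mutation operators, speciation, and representative selection) is supplied as a placeholder callable; none of the names below come from the authors' implementation.

```python
import random

def neat_loop(init_pop, fitness, crossover, mutations, speciate, select_reps, NG):
    # Minimal sketch of Algorithm 1. All callables are user-supplied stubs.
    P = init_pop()                               # initialize with minimal solutions
    fit = [fitness(p) for p in P]
    N = len(P)
    for _gen in range(NG):                       # for gen = 1 to NG
        speciate(P)                              # assign species numbers
        O = []
        for i in range(N):
            a, b = random.randrange(N), random.randrange(N)
            tr = a if fit[a] >= fit[b] else b    # tournament selection, t = 2
            child = crossover(P[i], P[tr])       # crossover P_i with P_tr
            child = random.choice(mutations)(child)  # select and apply a mutation
            O.append(child)
        # speciate combined parents and offspring, keep representatives
        P = select_reps(speciate(P + O), N)
        fit = [fitness(p) for p in P]
    return P[fit.index(max(fit))]                # best found solution
```

With individuals reduced to plain numbers, a toy run looks like `neat_loop(lambda: [0.1, 0.2, 0.3, 0.4], lambda x: x, lambda a, b: (a + b) / 2, [lambda x: x + 0.01], lambda P: P, lambda P, N: sorted(P)[-N:], 5)`.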

- Linear: $x$
- Unsigned step function: $1$ if $x>0$, $0$ otherwise
- Sine: $\sin(\pi x)$
- Gaussian: $\exp(-x^2/2)$
- Hyperbolic tangent: $\tanh(x)$
- Sigmoid: $(\tanh(x/2)+1)/2$
- Inverse: $-x$
- Absolute value: $|x|$
- ReLU: $\max(0,x)$
- Cosine: $\cos(\pi x)$
- Squared: $x^2$
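The activation set above maps directly onto plain Python callables; the dictionary name and keys below are illustrative, not taken from the paper's code.

```python
import math

# The node activation functions listed above, keyed by an illustrative name.
ACTIVATIONS = {
    "linear":  lambda x: x,
    "step":    lambda x: 1.0 if x > 0 else 0.0,   # unsigned step
    "sine":    lambda x: math.sin(math.pi * x),
    "gauss":   lambda x: math.exp(-x * x / 2),
    "tanh":    math.tanh,
    "sigmoid": lambda x: (math.tanh(x / 2) + 1) / 2,
    "inverse": lambda x: -x,
    "abs":     abs,
    "relu":    lambda x: max(0.0, x),
    "cosine":  lambda x: math.cos(math.pi * x),
    "squared": lambda x: x * x,
}
```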

#### 2.5. Proposed Mutation Operation

## 3. Results

## 4. Conclusions

## Author Contributions

## Funding

## Institutional Review Board Statement

## Informed Consent Statement

## Data Availability Statement

## Conflicts of Interest

## Abbreviations

Abbreviation | Meaning
---|---
EA | Evolutionary Algorithm
NN | Neural Network
NEAT | NeuroEvolution of Augmented Topologies
DBM | Difference-Based Mutation
RIP | Rotary Inverted Pendulum
HEFCA | Hybrid Evolutionary Fuzzy Classification Algorithm

## References


**Figure 3.** Visualization of the DBM example from Figure 2.

Mutation Type | Probability to Use
---|---
Add connection | 0.1
Mutating random node | 0.15
Connection removal | 0.15
Connections merging | 0.2
Adding node to connection | 0.1
Assigning random weights | 0.2
No mutation/Proposed approach | 0.1
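Given the usage probabilities in the table above, selecting which mutation operator to apply to an offspring reduces to a weighted draw. A minimal sketch with `random.choices`; `OPERATORS` and `pick_operator` are hypothetical names:

```python
import random

# Operator usage probabilities from the table above (they sum to 1).
OPERATORS = {
    "add_connection": 0.1,
    "mutate_random_node": 0.15,
    "connection_removal": 0.15,
    "connections_merging": 0.2,
    "add_node_to_connection": 0.1,
    "assign_random_weights": 0.2,
    "no_mutation_or_proposed": 0.1,
}

def pick_operator(rng=random):
    # Weighted roulette-wheel draw of one operator name.
    names = list(OPERATORS)
    return rng.choices(names, weights=[OPERATORS[n] for n in names], k=1)[0]
```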

Dataset | Number of Instances | Number of Features | Number of Classes
---|---|---|---
Australian credit | 690 | 14 | 2
German credit | 1000 | 24 | 2
Segment | 2310 | 19 | 7
Phoneme | 5404 | 5 | 2
Page-blocks | 5472 | 10 | 5
Twonorm | 7400 | 20 | 2
Ring | 7400 | 20 | 2
Magic | 19,020 | 10 | 2

Dataset | $\mathit{NEAT}$ | ${\mathit{NEAT}}_{\mathit{DBM}0.1}$ | ${\mathit{NEAT}}_{\mathit{DBM}0.3}$ | ${\mathit{NEAT}}_{\mathit{DBM}0.5}$ | ${\mathit{NEAT}}_{\mathit{DBM}0.7}$ | ${\mathit{NEAT}}_{\mathit{DBM}0.9}$
---|---|---|---|---|---
Australian credit | 0.8884 | 0.8862 | 0.8857 | 0.8878 | 0.8797 | 0.8886
German credit | 0.7900 | 0.7868 | 0.7900 | 0.7882 | 0.7923 | 0.7929
Segment | 0.7811 | 0.7973 | 0.8155 | 0.8179 | 0.8220 | 0.8033
Phoneme | 0.8083 | 0.7914 | 0.7955 | 0.7983 | 0.7954 | 0.8042
Page-blocks | 0.9606 | 0.9608 | 0.9620 | 0.9593 | 0.9608 | 0.9609
Twonorm | 0.9758 | 0.9749 | 0.9755 | 0.9749 | 0.9757 | 0.9767
Ring | 0.9026 | 0.8946 | 0.8802 | 0.9020 | 0.9003 | 0.9018
Magic | 0.8290 | 0.8350 | 0.8359 | 0.8303 | 0.8391 | 0.8341

Dataset | $\mathit{NEAT}$ | ${\mathit{NEAT}}_{\mathit{DBM}0.1}$ | ${\mathit{NEAT}}_{\mathit{DBM}0.3}$ | ${\mathit{NEAT}}_{\mathit{DBM}0.5}$ | ${\mathit{NEAT}}_{\mathit{DBM}0.7}$ | ${\mathit{NEAT}}_{\mathit{DBM}0.9}$
---|---|---|---|---|---
Australian credit | 0.8603 | 0.8647 | 0.8765 | 0.8824 | 0.8971 | 0.8441
German credit | 0.7500 | 0.7470 | 0.7480 | 0.7660 | 0.7410 | 0.7420
Segment | 0.7874 | 0.8082 | 0.8199 | 0.8087 | 0.8264 | 0.7944
Phoneme | 0.8074 | 0.7918 | 0.7918 | 0.7950 | 0.7911 | 0.8171
Page-blocks | 0.9607 | 0.9653 | 0.9594 | 0.9631 | 0.9603 | 0.9618
Twonorm | 0.9751 | 0.9758 | 0.9732 | 0.9751 | 0.9739 | 0.9729
Ring | 0.9022 | 0.8959 | 0.8733 | 0.9011 | 0.9005 | 0.8992
Magic | 0.8271 | 0.8342 | 0.8386 | 0.8305 | 0.8331 | 0.8326

Dataset | ${\mathit{NEAT}}_{\mathit{DBM}0.1}$ | ${\mathit{NEAT}}_{\mathit{DBM}0.3}$ | ${\mathit{NEAT}}_{\mathit{DBM}0.5}$ | ${\mathit{NEAT}}_{\mathit{DBM}0.7}$ | ${\mathit{NEAT}}_{\mathit{DBM}0.9}$
---|---|---|---|---
Australian credit | 0, Z = −0.53 | 0, Z = −0.80 | 0, Z = −0.42 | 0, Z = −1.48 | 0, Z = −0.19
German credit | 0, Z = −0.95 | 0, Z = −0.08 | 0, Z = −0.72 | 0, Z = 0.15 | 0, Z = 0.53
Segment | 0, Z = 1.36 | 1, Z = 2.19 | 1, Z = 2.42 | 1, Z = 2.50 | 0, Z = 1.02
Phoneme | −1, Z = −2.38 | 0, Z = −1.74 | 0, Z = −0.72 | −1, Z = −2.12 | 0, Z = −0.98
Page-blocks | 0, Z = 0.45 | 0, Z = 0.79 | 0, Z = −0.42 | 0, Z = 0.30 | 0, Z = 0.61
Twonorm | 0, Z = −0.87 | 0, Z = −1.14 | 0, Z = −0.68 | 0, Z = −0.80 | 0, Z = 1.67
Ring | 0, Z = −1.36 | 0, Z = −1.17 | 0, Z = 0.08 | 0, Z = −0.15 | 0, Z = 0.76
Magic | 0, Z = 1.81 | 0, Z = 1.55 | 0, Z = 1.13 | 1, Z = 2.42 | 0, Z = 1.36
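The "outcome, Z" entries in these tables are Mann–Whitney comparisons: outcome 1 when the DBM variant is significantly better, −1 when significantly worse, 0 otherwise. A generic normal-approximation sketch (pairwise U statistic, no tie correction; the paper's exact test configuration may differ) is:

```python
from math import sqrt

def mann_whitney_z(x, y, alpha_z=1.96):
    # U via pairwise comparisons: 1 if a > b, 0.5 on ties, 0 otherwise.
    nx, ny = len(x), len(y)
    U = sum(1.0 if a > b else 0.5 if a == b else 0.0 for a in x for b in y)
    mu = nx * ny / 2.0                              # mean of U under H0
    sigma = sqrt(nx * ny * (nx + ny + 1) / 12.0)    # std of U under H0
    z = (U - mu) / sigma
    outcome = 1 if z > alpha_z else -1 if z < -alpha_z else 0
    return outcome, z
```

For example, two samples of 10 values where every value of the first exceeds every value of the second give the saturated Z of about 3.78, matching the extreme entries seen later in the comparison tables.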

Dataset | ${\mathit{NEAT}}_{\mathit{DBM}0.1}$ | ${\mathit{NEAT}}_{\mathit{DBM}0.3}$ | ${\mathit{NEAT}}_{\mathit{DBM}0.5}$ | ${\mathit{NEAT}}_{\mathit{DBM}0.7}$ | ${\mathit{NEAT}}_{\mathit{DBM}0.9}$
---|---|---|---|---
Australian credit | 0, Z = −0.04 | 0, Z = 0.57 | 0, Z = 0.99 | 0, Z = 1.68 | 0, Z = −0.84
German credit | 0, Z = −0.61 | 0, Z = −0.42 | 0, Z = 1.12 | 0, Z = −0.89 | 0, Z = −0.95
Segment | 0, Z = 1.03 | 0, Z = 1.75 | 0, Z = 1.40 | 0, Z = 1.93 | 0, Z = 0.45
Phoneme | 0, Z = −1.40 | 0, Z = −1.33 | 0, Z = −1.06 | 0, Z = −1.51 | 0, Z = 0.76
Page-blocks | 0, Z = 1.21 | 0, Z = −0.27 | 0, Z = 0.38 | 0, Z = −0.11 | 0, Z = 0.19
Twonorm | 0, Z = 0.11 | 0, Z = −0.27 | 0, Z = 0.00 | 0, Z = −0.49 | 0, Z = −0.92
Ring | 0, Z = −0.83 | 0, Z = −1.66 | 0, Z = −0.04 | 0, Z = 0.00 | 0, Z = 0.04
Magic | 0, Z = 1.32 | 1, Z = 2.27 | 0, Z = 0.83 | 0, Z = 1.06 | 0, Z = 1.13

Dataset | HEFCA | NN (Keras) | DT | RF | K-NN | SVM | ${\mathbf{NEAT}}_{\mathbf{DBM}0.5}$
---|---|---|---|---|---|---
Australian credit | 0.8537 | 0.8552 | 0.8202 | 0.8724 | 0.8579 | 0.8507 | 0.8824
German credit | 0.7280 | 0.7530 | 0.6930 | 0.7610 | 0.6960 | 0.7590 | 0.7660
Segment | 0.9100 | 0.9476 | 0.9619 | 0.9801 | 0.9472 | 0.8662 | 0.8087
Phoneme | 0.8260 | 0.7957 | 0.8760 | 0.9197 | 0.8862 | 0.8442 | 0.7950
Page-blocks | 0.9406 | 0.9616 | 0.9613 | 0.9741 | 0.9550 | 0.9055 | 0.9631
Twonorm | 0.9043 | 0.9792 | 0.8455 | 0.9736 | 0.9728 | 0.9803 | 0.9751
Ring | 0.9237 | 0.8262 | 0.8782 | 0.9500 | 0.6888 | 0.9781 | 0.9011
Magic | 0.8415 | 0.8274 | 0.8157 | 0.8790 | 0.8072 | 0.8198 | 0.8369

Dataset | HEFCA | NN (Keras) | DT | RF | K-NN | SVM
---|---|---|---|---|---
Australian credit | 0, Z = 0.91 | 0, Z = 0.98 | 1, Z = 2.27 | 0, Z = 0.08 | 0, Z = 0.76 | 0, Z = 1.44
German credit | 1, Z = 2.56 | 0, Z = 0.57 | 1, Z = 2.81 | 0, Z = 0.04 | 1, Z = 3.07 | 0, Z = 0.53
Segment | −1, Z = −3.79 | −1, Z = −3.79 | −1, Z = −3.80 | −1, Z = −3.81 | −1, Z = −3.79 | −1, Z = −3.56
Phoneme | −1, Z = −2.99 | 0, Z = 0.00 | −1, Z = −3.78 | −1, Z = −3.79 | −1, Z = −3.79 | −1, Z = −3.63
Page-blocks | 1, Z = 3.67 | 0, Z = 0.45 | 0, Z = 0.30 | −1, Z = −3.18 | 1, Z = 2.35 | 1, Z = 3.79
Twonorm | 1, Z = 3.80 | 0, Z = −1.44 | 1, Z = 3.80 | 0, Z = 0.68 | 0, Z = 1.14 | −1, Z = −2.81
Ring | −1, Z = −3.48 | 1, Z = 3.78 | 1, Z = 3.10 | −1, Z = −3.78 | 1, Z = 3.78 | −1, Z = −3.78
Magic | 0, Z = −1.21 | 1, Z = 2.19 | 1, Z = 3.78 | −1, Z = −3.78 | 1, Z = 3.78 | 1, Z = 3.78

Name | Description | Value
---|---|---
${m}_{1}$ | Mass of arm | 0.056 kg
${l}_{1}$ | Length of arm | 0.16 m
${c}_{1}$ | Distance to arm center of mass | 0.08 m
${J}_{1}$ | Inertia of arm | 0.00215058 kg·m${}^{2}$
${m}_{2}$ | Mass of pendulum | 0.022 kg
${l}_{2}$ | Length of pendulum | 0.16 m
${c}_{2}$ | Distance to pendulum center of mass | 0.08 m
${J}_{2}$ | Inertia of pendulum | 0.00018773 kg·m${}^{2}$
${R}_{m}$ | Armature resistance | 2.5604 $\mathsf{\Omega}$
${K}_{b}$ | Back-emf constant | 0.01826 V·s/rad
${K}_{t}$ | Torque constant | 0.01826 N·m/A
${\theta}_{1}$ | Angular displacement of arm | -
${\theta}_{1}^{\prime}$ | Angular velocity of arm | -
${\theta}_{2}$ | Angular displacement of pendulum | -
${\theta}_{2}^{\prime}$ | Angular velocity of pendulum | -
$\tau$ | Applied torque | -

Starting Position | x1 | x2 | x3 | x4
---|---|---|---|---
1 | 0 | −0.1 | 0 | 0
2 | 3 | −0.1 | 0 | 0
3 | 0 | 1.41 | 0 | 0
4 | 3 | 1.41 | 0 | 0

Performance Value | NEAT | ${\mathit{NEAT}}_{\mathit{DBM}0.1}$ | ${\mathit{NEAT}}_{\mathit{DBM}0.3}$ | ${\mathit{NEAT}}_{\mathit{DBM}0.5}$ | ${\mathit{NEAT}}_{\mathit{DBM}0.7}$ | ${\mathit{NEAT}}_{\mathit{DBM}0.9}$
---|---|---|---|---|---
Mean | 416.68 | 422.35 | 383.37 | 372.25 | 408.57 | 454.24
Median | 459.96 | 397.20 | 424.15 | 398.76 | 416.74 | 458.92
Std | 104.60 | 54.98 | 106.66 | 122.46 | 61.34 | 30.69
Min | 186.87 | 361.63 | 132.62 | 65.67 | 288.62 | 403.04
M-W test | - | 0, Z = 0.45 | 0, Z = 1.36 | 0, Z = 1.06 | 0, Z = 1.36 | 0, Z = −0.15


© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Stanovov, V.; Akhmedova, S.; Semenkin, E.
Difference-Based Mutation Operation for Neuroevolution of Augmented Topologies. *Algorithms* **2021**, *14*, 127.
https://doi.org/10.3390/a14050127
