# Election Algorithm for Random k Satisfiability in the Hopfield Neural Network


## Abstract


## 1. Introduction

## 2. The Proposed Random k Satisfiability (RANkSAT)

## 3. RAN2SAT in a Hopfield Neural Network

Hence, the quality of the final neuron state can be properly examined by checking the following condition:

$$\left| H_{P_{RAN2SAT}} - H_{P_{RAN2SAT}}^{min} \right| \le \xi$$

where $\xi$ is the pre-defined tolerance value.
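
As an illustration, this check can be sketched programmatically. The snippet below assumes the standard Lyapunov energy test used in HNN satisfiability models, with the tolerance value $\xi = 0.001$ from Table 1; the function names and the dense second-order weight representation are illustrative, not the paper's exact implementation.

```python
# Hedged sketch: final-state quality check for an HNN-SAT model.
# Assumes the standard test |E_final - E_min| <= xi (tolerance from Table 1);
# function names and the dense second-order weight matrix are illustrative.

def lyapunov_energy(state, w2, bias):
    """Standard second-order Hopfield (Lyapunov) energy of a bipolar state."""
    n = len(state)
    quadratic = sum(w2[i][j] * state[i] * state[j]
                    for i in range(n) for j in range(n))
    linear = sum(bias[i] * state[i] for i in range(n))
    return -0.5 * quadratic - linear

def reaches_global_minimum(state, w2, bias, e_min, xi=0.001):
    """Accept the final neuron state only if its energy is within xi of E_min."""
    return abs(lyapunov_energy(state, w2, bias) - e_min) <= xi
```

A state whose final energy differs from the expected minimum by more than $\xi$ is treated as trapped in a local minimum rather than a global solution.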

## 4. Learning Model for HNN-RAN2SAT

#### 4.1. Election Algorithm (EA)

#### 4.1.1. Initialization

#### 4.1.2. Forming Initial Parties

#### 4.1.3. Positive Advertisement

#### 4.1.4. Negative Advertisement

#### 4.1.5. Coalition

#### 4.1.6. Election Day

**Algorithm 1.** Detailed procedure of the proposed HNN-RAN2SATEA.

Step | Operation
---|---
1 | Initialize the population $N_{POP}$, consisting of $S_i \in \left[S_1, S_2, S_3, \dots, S_{N_{POP}}\right]$;
2 | **while** $\left(g \le Ir\right)$ or $f_{L_j} = f_{m+n}$ **do**
3 | &emsp;Form the initial parties by using Equation (16);
4 | &emsp;**for** $j \in \left\{1, 2, 3, \dots, N_{party}\right\}$ **do**
5 | &emsp;&emsp;Calculate the similarity between the voter and the candidate by using Equation (17);
6 | &emsp;**end for**
7 | &emsp;{Positive Advertisement}
8 | &emsp;**for** $S_i \in \left\{1, 2, 3, \dots, N_{S_j}\right\}$ **do**
9 | &emsp;&emsp;Evaluate the number of voters $N_{S_j}$ by using Equation (18);
10 | &emsp;&emsp;Evaluate the reasonable effect from the candidate $\omega_{v_i^j}$ by using Equation (19);
11 | &emsp;&emsp;Update the neuron state according to Equation (20);
12 | &emsp;&emsp;**if** $f_{v_i^j} > f_{L_j}$ **then**
13 | &emsp;&emsp;&emsp;Assign $v_i^j$ as the new $L_j$;
14 | &emsp;&emsp;**else**
15 | &emsp;&emsp;&emsp;Retain $L_j$;
16 | &emsp;**end for**
17 | &emsp;{Negative Advertisement}
18 | &emsp;**for** $S_i \in \left\{1, 2, 3, \dots, N_{v_i^{\ast}}\right\}$ **do**
19 | &emsp;&emsp;Evaluate the similarity between the voter from the other party and the candidate by using Equation (22);
20 | &emsp;&emsp;Evaluate the reasonable effect from the candidate $\omega_{v_i^{\ast}}$ by using Equation (23);
21 | &emsp;&emsp;Update the neuron state according to Equation (24);
22 | &emsp;&emsp;**if** $f_{v_i^{\ast}} > f_{L_j}$ **then**
23 | &emsp;&emsp;&emsp;Assign $v_i^{\ast}$ as the new $L_j$;
24 | &emsp;&emsp;**else**
25 | &emsp;&emsp;&emsp;Retain $L_j$;
26 | &emsp;**end for**
27 | &emsp;{Coalition}
28 | &emsp;**for** $S_i \in \left\{1, 2, 3, \dots, N_{v_i^{\ast}}\right\}$ **do**
29 | &emsp;&emsp;Evaluate the similarity between the voter from the other party and the candidate by using Equation (22);
30 | &emsp;&emsp;Evaluate the reasonable effect from the candidate $\omega_{v_i^{\ast}}$ by using Equation (23);
31 | &emsp;&emsp;Update the neuron state according to Equation (24);
32 | &emsp;&emsp;**if** $f_{v_i^{\ast}} > f_{L_j}$ **then**
33 | &emsp;&emsp;&emsp;Assign $v_i^{\ast}$ as the new $L_j$;
34 | &emsp;&emsp;**else**
35 | &emsp;&emsp;&emsp;Retain $L_j$;
36 | &emsp;**end for**
37 | **end while**
38 | **return** the final neuron state
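
The control flow of Algorithm 1 can be sketched as follows. Since Equations (16)–(24) are not reproduced in this section, the party-forming, similarity, and advertisement update rules below are generic placeholders; only the loop structure, the leader-replacement test, and the stopping condition $f = f_{m+n}$ mirror the pseudocode.

```python
import random

# Structural sketch of Algorithm 1 (HNN-RAN2SATEA).  The paper's Equations
# (16)-(24) are not available here, so the update rules are placeholders;
# only the control flow mirrors the pseudocode.

def sat_fitness(state, clauses):
    """Number of satisfied clauses; literals are signed 1-based variable indices."""
    return sum(any(state[abs(lit) - 1] == (1 if lit > 0 else -1) for lit in clause)
               for clause in clauses)

def election_algorithm(clauses, n_vars, n_pop=120, n_party=4, max_iter=100, seed=0):
    rng = random.Random(seed)
    f_max = len(clauses)  # f_(m+n): every clause satisfied
    pop = [[rng.choice([-1, 1]) for _ in range(n_vars)] for _ in range(n_pop)]
    # Forming initial parties: split the population evenly (placeholder rule).
    parties = [pop[p::n_party] for p in range(n_party)]
    leaders = [max(party, key=lambda s: sat_fitness(s, clauses)) for party in parties]
    for _ in range(max_iter):
        for j, party in enumerate(parties):
            for voter in party:
                # Advertisement phase (placeholder): move the voter one neuron
                # closer to its party leader, then replace the leader if the
                # voter now has higher fitness (lines 12-15 / 22-25).
                diff = [i for i in range(n_vars) if voter[i] != leaders[j][i]]
                if diff:
                    k = rng.choice(diff)
                    voter[k] = leaders[j][k]
                if sat_fitness(voter, clauses) > sat_fitness(leaders[j], clauses):
                    leaders[j] = voter[:]
        if max(sat_fitness(ldr, clauses) for ldr in leaders) == f_max:
            break  # the condition f_(L_j) = f_(m+n) in line 2 is met
    # Election day: the fittest leader is taken as the final neuron state.
    return max(leaders, key=lambda s: sat_fitness(s, clauses))
```

For a tiny RAN2SAT instance such as $(A \lor B) \land \neg A \land C$, encoded as `[[1, 2], [-1], [3]]`, the sketch returns a bipolar state satisfying all $m+n$ clauses.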

#### 4.2. Genetic Algorithm (GA)

#### 4.2.1. Initialization

#### 4.2.2. Fitness Evaluation

#### 4.2.3. Selection

#### 4.2.4. Crossover

**Before Crossover**

| String | 1 | 2 | 3 | 4 | 5 | 6 |
|---|---|---|---|---|---|---|
| $S_1$ | −1 | 1 | 1 | −1 | 1 | 1 |
| $S_2$ | 1 | −1 | 1 | 1 | 1 | −1 |

**After Crossover**

| String | 1 | 2 | 3 | 4 | 5 | 6 |
|---|---|---|---|---|---|---|
| $S_1$ | 1 | 1 | −1 | −1 | 1 | 1 |
| $S_2$ | 1 | −1 | 1 | −1 | 1 | 1 |
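
The exchange illustrated above can be sketched with the textbook single-point crossover operator; this is an assumption, since the paper's exact variant and cut point are not specified in this section.

```python
import random

# Minimal sketch of single-point crossover on bipolar strings, as in the
# Before/After example of Section 4.2.4.  The cut point is drawn at random;
# the paper's exact operator may differ in detail.

def single_point_crossover(parent1, parent2, rng=None):
    """Exchange the tails of two equal-length parents after a random cut."""
    rng = rng or random.Random()
    point = rng.randrange(1, len(parent1))  # cut strictly inside the string
    child1 = parent1[:point] + parent2[point:]
    child2 = parent2[:point] + parent1[point:]
    return child1, child2

s1 = [-1, 1, 1, -1, 1, 1]
s2 = [1, -1, 1, 1, 1, -1]
c1, c2 = single_point_crossover(s1, s2, random.Random(3))
```

Whatever the cut point, each position of the two children holds exactly the pair of states the two parents held there.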

#### 4.2.5. Mutation

**Algorithm 2.** Detailed procedure of the proposed HNN-RAN2SATGA.

Step | Operation
---|---
1 | Initialize the population of $N_{POP}$ chromosomes, consisting of $S_i \in \left[S_1, S_2, \dots, S_{N_{POP}}\right]$;
2 | **while** $g \le Gen$ or $f_{S_i} = f_{m+n}$ **do**
3 | &emsp;Initialize $N_{POP} - N_D$ random $S_i$;
4 | &emsp;{Selection}
5 | &emsp;**for** $i \in \left\{1, 2, 3, \dots, N_{POP}\right\}$ **do**
6 | &emsp;&emsp;Calculate the fitness of each $S_i$ by using Equation (25);
7 | &emsp;&emsp;Evaluate $N_D$ by using Equation (29);
8 | &emsp;**end for**
9 | &emsp;{Crossover}
10 | &emsp;**for** $S_i \in \left\{1, 2, 3, \dots, N_D\right\}$ **do**
11 | &emsp;&emsp;Exchange the states of the two selected $S_i$ at a random point;
12 | &emsp;**end for**
13 | &emsp;{Mutation}
14 | &emsp;**for** $S_i \in \left\{1, 2, 3, \dots, N_D\right\}$ **do**
15 | &emsp;&emsp;Flip the states of $S_i$ at a random location;
16 | &emsp;&emsp;Evaluate the fitness of $S_i$ according to Equation (25);
17 | &emsp;**end for**
18 | **end while**
19 | **return** the final $S_i$ state
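
Algorithm 2 can likewise be sketched structurally. Equations (25) and (29) are not reproduced in this section, so fitness is taken here as the number of satisfied clauses and $N_D$ as a fixed elite fraction set by the selection rate $\lambda = 0.1$ from Table 2; both are assumptions.

```python
import random

# Structural sketch of Algorithm 2 (HNN-RAN2SATGA).  Fitness as clause count
# and N_D as a fixed elite fraction (selection rate 0.1, Table 2) are
# assumptions standing in for the paper's Equations (25) and (29).

def clause_fitness(state, clauses):
    """Number of satisfied clauses; literals are signed 1-based variable indices."""
    return sum(any(state[abs(lit) - 1] == (1 if lit > 0 else -1) for lit in clause)
               for clause in clauses)

def genetic_algorithm(clauses, n_vars, n_pop=120, gen=1000,
                      select_rate=0.1, mutation_rate=0.01, seed=0):
    rng = random.Random(seed)
    f_max = len(clauses)                    # f_(m+n): every clause satisfied
    n_d = max(2, int(select_rate * n_pop))  # assumed meaning of N_D
    pop = [[rng.choice([-1, 1]) for _ in range(n_vars)] for _ in range(n_pop)]
    for _ in range(gen):
        pop.sort(key=lambda s: clause_fitness(s, clauses), reverse=True)
        if clause_fitness(pop[0], clauses) == f_max:
            break                           # stopping test f_(S_i) = f_(m+n)
        # Selection: keep the N_D fittest chromosomes, refill the rest at random.
        pop = pop[:n_d] + [[rng.choice([-1, 1]) for _ in range(n_vars)]
                           for _ in range(n_pop - n_d)]
        # Crossover: exchange the states of paired elites at a random point.
        for i in range(0, n_d - 1, 2):
            pt = rng.randrange(1, n_vars)
            pop[i][pt:], pop[i + 1][pt:] = pop[i + 1][pt:], pop[i][pt:]
        # Mutation: flip states at random locations (rate 0.01, Table 2).
        for chrom in pop[:n_d]:
            for k in range(n_vars):
                if rng.random() < mutation_rate:
                    chrom[k] = -chrom[k]
    return max(pop, key=lambda s: clause_fitness(s, clauses))
```

On the same tiny instance `[[1, 2], [-1], [3]]`, the loop terminates as soon as a chromosome reaches the maximum fitness $f_{m+n}$.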

## 5. HNN Model Experimental Setup

All HNN models were executed on an Intel® Celeron® CPU B800 @ 2 GHz processor with 4 GB RAM running Windows 8.1. Table 1, Table 2 and Table 3 indicate the parameters used during each HNN model execution.

#### 5.1. Performance Metric for HNN-RAN2SAT Models

#### 5.1.1. Root Mean Square Error (RMSE)

#### 5.1.2. Mean Absolute Error (MAE)

#### 5.1.3. Sum of Squared Error (SSE)

#### 5.1.4. Mean Absolute Percentage Error (MAPE)
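
The four metrics in Sections 5.1.1 to 5.1.4 have conventional definitions, sketched below over paired lists of target and obtained values. The operands the paper feeds into these metrics come from its own equations, which are not reproduced here.

```python
import math

# Conventional definitions of RMSE, MAE, SSE and MAPE.  The quantities the
# paper substitutes into these formulas are given by its own equations.

def rmse(actual, predicted):
    """Root mean square error: square root of the mean squared deviation."""
    n = len(actual)
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)

def mae(actual, predicted):
    """Mean absolute error."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def sse(actual, predicted):
    """Sum of squared errors."""
    return sum((a - p) ** 2 for a, p in zip(actual, predicted))

def mape(actual, predicted):
    """Mean absolute percentage error (actual values must be nonzero)."""
    n = len(actual)
    return 100.0 / n * sum(abs((a - p) / a) for a, p in zip(actual, predicted))
```

RMSE and SSE penalize large deviations quadratically, while MAE and MAPE weight all deviations linearly, which is why the paper reports all four.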

#### 5.2. Implementation of HNN-RAN2SAT Models

## 6. Results and Discussion

## 7. Conclusions

## Author Contributions

## Funding

## Acknowledgments

## Conflicts of Interest

## References

- Zhu, X.; Fu, B.; Yang, Y.; Ma, Y.; Hao, J.; Chen, S.; Liao, Z. Attention-based recurrent neural network for influenza epidemic prediction. BMC Bioinform.
**2019**, 20, 1–10. [Google Scholar] [CrossRef] [PubMed] - D’Addona, D.M.; Ullah, A.S.; Matarazzo, D. Tool-wear prediction and pattern-recognition using artificial neural network and DNA-based computing. J. Intell. Manuf.
**2017**, 28, 1285–1301. [Google Scholar] [CrossRef] - Kho, L.C.; Kasihmuddin, M.S.M.; Mansor, M.; Sathasivam, S. Logic mining in league of legends. Pertanika J. Sci. Technol.
**2020**, 28, 211–225. [Google Scholar] - Pang, G.; Yang, L.; Karniadakis, G.E. Neural-net-induced Gaussian process regression for function approximation and PDE solution. J. Comput. Phys.
**2019**, 384, 270–288. [Google Scholar] [CrossRef] [Green Version] - Kobayashi, M. Hopfield neural networks using Klein four-group. Neurocomputing
**2020**, 387, 123–128. [Google Scholar] [CrossRef] - Hopfield, J.J.; Tank, D.W. “Neural” computation of decisions in optimization problems. Biol. Cybern.
**1985**, 52, 141–152. [Google Scholar] - Fung, C.H.; Wong, M.S.; Chan, P.W. Spatio-temporal data fusion for satellite images using Hopfield neural network. Remote Sens.
**2019**, 11, 2077. [Google Scholar] [CrossRef] [Green Version] - Pan, J.; Pottimurthy, Y.; Wang, D.; Hwang, S.; Patil, S.; Fan, L.S. Recurrent neural network based detection of faults caused by particle attrition in chemical looping systems. Powder Technol.
**2020**, 367, 266–276. [Google Scholar] [CrossRef] - Tao, Q. Evaluation of scientific research ability in colleges and universities based on discrete Hopfield neural network. Acad. J. Comput. Inf. Sci.
**2019**, 2, 1–8. [Google Scholar] - Kasihmuddin, M.S.M.; Mansor, M.A.; Sathasivam, S. Hybrid genetic algorithm in the Hopfield network for logic satisfiability problem. Pertanika J. Sci. Technol.
**2017**, 25, 139–152. [Google Scholar] - Abdullah, W.A.T.W. Logic programming on a neural network. Int. J. Intell. Syst.
**1992**, 7, 513–519. [Google Scholar] [CrossRef] - Sathasivam, S. Upgrading logic programming in Hopfield network. Sains Malays.
**2010**, 39, 115–118. [Google Scholar] - Mansor, M.A.; Kasihmuddin, M.S.M.; Sathasivam, S. Artificial immune system paradigm in the Hopfield network for 3-satisfiability problem. Pertanika J. Sci. Technol.
**2017**, 25, 1173–1188. [Google Scholar] - Kasihmuddin, M.S.M.; Mansor, M.A.; Sathasivam, S. Discrete Hopfield Neural Network in Restricted Maximum k-Satisfiability Logic Programming. Sains Malays.
**2018**, 47, 1327–1335. [Google Scholar] [CrossRef] - Kasihmuddin, M.S.M.; Mansor, M.A.; Sathasivam, S. Bezier curves satisfiability model in enhanced Hopfield network. Int. J. Intell. Syst. Appl.
**2016**, 8, 9–17. [Google Scholar] [CrossRef] [Green Version] - Mansor, M.A.; Kasihmuddin, M.S.M.; Sathasivam, S. Enhanced Hopfield network for pattern satisfiability optimization. Int. J. Intell. Syst. Appl.
**2016**, 8, 27–33. [Google Scholar] [CrossRef] - Mansor, M.A.; Kasihmuddin, M.S.M.; Sathasivam, S. VLSI circuit configuration using satisfiability logic in Hopfield network. Int. J. Intell. Syst. Appl.
**2016**, 8, 22–29. [Google Scholar] [CrossRef] - Hamadneh, N.; Sathasivam, S.; Tilahun, S.L.; Choon, O.H. Learning logic programming in radial basis function network via genetic algorithm. J. Appl. Sci.
**2012**, 12, 840–847. [Google Scholar] [CrossRef] - Alzaeemi, S.; Mansor, M.A.; Kasihmuddin, M.S.M.; Sathasivam, S.; Mamat, M. Radial basis function neural network for 2 satisfiability programming. Indones. J. Electr. Eng. Comput. Sci.
**2020**, 18, 459–469. [Google Scholar] [CrossRef] - Mansor, M.A.; Jamaludin, S.Z.M.; Kasihmuddin, M.S.M.; Alzaeemi, S.A.; Basir, M.F.M.; Sathasivam, S. Systematic boolean satisfiability programming in radial basis function neural network. Processes
**2020**, 8, 214. [Google Scholar] [CrossRef] [Green Version] - Zaji, A.H.; Bonakdari, H.; Khameneh, H.Z.; Khodashenas, S.R. Application of optimized Artificial and Radial Basis neural networks by using modified Genetic Algorithm on discharge coefficient prediction of modified labyrinth side weir with two and four cycles. Measurement
**2020**, 152, 107291. [Google Scholar] [CrossRef] - Bahiraei, M.; Nazari, S.; Moayedi, H.; Safarzadeh, H. Using neural network optimized by imperialist competition method and genetic algorithm to predict water productivity of a nanofluid-based solar still equipped with thermoelectric modules. Powder Technol.
**2020**, 366, 571–586. [Google Scholar] [CrossRef] - Prado, F.; Minutolo, M.C.; Kristjanpoller, W. Forecasting Based on an Ensemble Autoregressive Moving Average-Adaptive Neuro-Fuzzy Inference System–Neural Network-Genetic Algorithm Framework. Energy
**2020**, 197, 117159. [Google Scholar] [CrossRef] - Kasihmuddin, M.S.M.; Mansor, M.A.; Sathasivam, S. Robust artificial bee colony in the Hopfield network for 2-satisfiability problem. Pertanika J. Sci. Technol.
**2017**, 25, 453–468. [Google Scholar] - Mansor, M.A.B.; Kasihmuddin, M.S.B.M.; Sathasivam, S. Robust Artificial Immune System in the Hopfield network for Maximum k-Satisfiability. Int. J. Interact. Multimed. Artif. Intell.
**2017**, 4, 63–71. [Google Scholar] [CrossRef] [Green Version] - Kumar, M.; Kulkarni, A.J. Socio-inspired optimization metaheuristics: A review. In Socio-Cultural Inspired Metaheuristics; Singh, P., Satapathy, S., Kashan, A.H., Tai, K., Eds.; Springer: Singapore, 2019; Volume 828, pp. 241–265. [Google Scholar]
- Emami, H.; Derakhshan, F. Election algorithm: A new socio-politically inspired strategy. AI Commun.
**2015**, 28, 591–603. [Google Scholar] [CrossRef] - Lv, W.; He, C.; Li, D.; Cheng, S.; Luo, S.; Zhang, X. Election campaign optimization algorithm. Procedia Comput. Sci.
**2010**, 1, 1377–1386. [Google Scholar] [CrossRef] [Green Version] - Emami, H. Chaotic election algorithm. Comput. Inf.
**2020**, 38, 1444–1478. [Google Scholar] [CrossRef] - Kasihmuddin, M.S.M.; Mansor, M.A.; Basir, M.F.M.; Sathasivam, S. Discrete mutation Hopfield neural network in propositional satisfiability. Mathematics
**2019**, 7, 1133. [Google Scholar] [CrossRef] [Green Version] - Hopfield, J.J.; Tank, D.W. Computing with neural circuits: A model. Science
**1986**, 233, 625–633. [Google Scholar] [CrossRef] [Green Version] - Barra, A.; Beccaria, M.; Fachechi, A. A new mechanical approach to handle generalized Hopfield neural networks. Neural Netw.
**2018**, 106, 205–222. [Google Scholar] [CrossRef] [PubMed] - Abdullah, W.A.T.W. Logic programming in neural networks. Malays. J. Comput. Sci.
**1996**, 9, 1–5. [Google Scholar] [CrossRef] - Kasihmuddin, M.S.B.M.; Mansor, M.A.B.; Sathasivam, S. Genetic algorithm for restricted maximum k-satisfiability in the Hopfield network. Int. J. Interact. Multimed. Artif. Intell.
**2016**, 4, 52–60. [Google Scholar] - Goldberg, D.E.; Holland, J.H. Genetic algorithms and machine learning. Mach. Learn.
**1988**, 3, 95–99. [Google Scholar] [CrossRef] - Goldberg, D.E.; Korb, B.; Deb, K. Messy genetic algorithms: Motivation, analysis, and first results. Complex Syst.
**1989**, 3, 493–530. [Google Scholar] - Sathasivam, S. Learning in the Recurrent Hopfield Network. In Proceedings of the 2008 Fifth International Conference on Computer Graphics, Imaging and Visualisation (IEEE), Penang, Malaysia, 26–28 August 2008; p. 10234772. [Google Scholar]
- Stone, R.J. Improved statistical procedure for the evaluation of solar radiation estimation models. Sol. Energy
**1993**, 51, 289–291. [Google Scholar] [CrossRef] - Chai, T.; Draxler, R.R. Root mean square error (RMSE) or mean absolute error (MAE)?–Arguments against avoiding RMSE in the literature. Geosci. Model Dev.
**2014**, 7, 1247–1250. [Google Scholar] [CrossRef] [Green Version] - Willmott, C.J.; Matsuura, K. Advantages of the mean absolute error (MAE) over the root mean square error (RMSE) in assessing average model performance. Clim. Res.
**2015**, 30, 79–82. [Google Scholar] [CrossRef] - Zeng, B.; Neuvo, Y. Optimal parallel stack filtering under the mean absolute error criterion. IEEE Trans. Image Process.
**1994**, 3, 324–327. [Google Scholar] [CrossRef] - Adeney, K.M.; Korenberg, M.J. Target Adaptation to Improve the Performance of Least-Squared Classifiers. In Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks. IJCNN 2000. Neural Computing: New Challenges and Perspectives for the New Millennium (IEEE), Como, Italy, 27 July 2000; pp. 100–105. [Google Scholar]
- Armstrong, J.S.; Collopy, F. Error measures for generalizing about forecasting methods: Empirical comparisons. Int. J. Forecast.
**1992**, 8, 69–80. [Google Scholar] [CrossRef] [Green Version] - Sudha, K.; Kumar, N.; Khetarpal, P. GA-ANN hybrid approach for load forecasting. J. Stat. Manag. Syst.
**2020**, 23, 135–144. [Google Scholar] [CrossRef] - Lam, K.F.; Mui, H.W.; Yuen, H.K. A note on minimizing absolute percentage error in combined forecasts. Comput. Oper. Res.
**2001**, 28, 1141–1147. [Google Scholar] [CrossRef] - Velavan, M.; Yahya, Z.R.; Halif, M.N.A.; Sathasivam, S. Mean field theory in doing logic programming using Hopfield network. Mod. Appl. Sci.
**2016**, 10, 154–160. [Google Scholar] [CrossRef] - Sathasivam, S. Boltzmann machine and new activation function comparison. Appl. Math. Sci.
**2011**, 5, 3853–3860. [Google Scholar] - Alzaeemi, S.A.; Sathasivam, S. Linear Kernel Hopfield Neural Network approach in Horn Clause Programming. In Proceedings of the 25th National Symposium on Mathematical Sciences (SKSM25): Mathematical Sciences as the Core of Intellectual Excellence (AIP), Pahang, Malaysia, 27–29 August 2017; p. 020107. [Google Scholar]
- Bag, S.; Kumar, S.K.; Tiwari, M.K. An efficient recommendation generation using relevant Jaccard similarity. Inf. Sci.
**2019**, 483, 53–64. [Google Scholar] [CrossRef] - Kasihmuddin, M.S.M.; Mansor, M.A.; Alzaeemi, S.; Basir, M.F.M.; Sathasivam, S. Quality Solution of Logic Programming in Hopfield Neural Network. In Proceedings of the 2nd International Conference on Applied & Industrial Mathematics and Statistics, Pahang, Malaysia, 23–25 July 2019; IOP Publishing: Bristol, UK, 2019; p. 012094. [Google Scholar]
- Goodwin, P.; Lawton, R. On the asymmetry of the symmetric MAPE. Int. J. Forecast.
**1999**, 15, 405–408. [Google Scholar] [CrossRef] - Kasihmuddin, M.S.M.; Mansor, M.A.; Sathasivam, S. Artificial Bee Colony in the Hopfield Network for Maximum k-Satisfiability Problem. J. Inform. Math. Sci.
**2016**, 8, 317–334. [Google Scholar] - Mansor, M.A.; Sathasivam, S.; Kasihmuddin, M.S.M. Artificial immune system algorithm with neural network approach for social media performance. In Proceedings of the 25th National Symposium on Mathematical Sciences (SKSM25): Mathematical Sciences as the Core of Intellectual Excellence (AIP), Pahang, Malaysia, 27–29 August 2017; p. 020072. [Google Scholar]
- Goodman, J.S.; Wood, R.E.; Hendrickx, M. Feedback specificity, exploration, and learning. J. Appl. Psychol.
**2004**, 89, 248. [Google Scholar] [CrossRef] [Green Version]

**Table 1.** List of parameters used in Hopfield Neural Network-Random 2 Satisfiability Exhaustive Search (HNN-RAN2SATES) [37].

Parameter | Value
---|---
Neuron Combination | 100
Number of Trials | 100
Tolerance Value $\left(\xi\right)$ | 0.001
Number of Strings | 100
Selection Rate $\left(\lambda\right)$ | 0.1

**Table 2.** List of parameters used in Hopfield Neural Network-Random 2 Satisfiability Genetic Algorithm (HNN-RAN2SATGA) [10].

Parameter | Value
---|---
Neuron Combination | 100
Number of Trials | 100
Tolerance Value $\left(\xi\right)$ | 0.001
Number of Generations $\left(Gen\right)$ | 1000
Number of Chromosomes $\left(N_{POP}\right)$ | 120
Selection Rate $\left(\lambda\right)$ | 0.1
Crossover Rate | 0.9
Mutation Rate | 0.01

**Table 3.** List of parameters used in Hopfield Neural Network-Random 2 Satisfiability Election Algorithm (HNN-RAN2SATEA).

Parameter | Value
---|---
Neuron Combination | 100
Number of Trials | 100
Tolerance Value $\left(\xi\right)$ | 0.001
Number of Learning | 100
Number of Candidates $\left(N_{POP}\right)$ | 120
Number of Parties $\left(N_{party}\right)$ | 4
Positive Advertisement Rate $\left({\sigma}^{p}\right)$ | 0.5
Negative Advertisement Rate $\left({\sigma}^{n}\right)$ | 0.5
Maximum Iterations $\left(Ir\right)$ | 100

© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Sathasivam, S.; Mansor, M.A.; Kasihmuddin, M.S.M.; Abubakar, H.
Election Algorithm for Random *k* Satisfiability in the Hopfield Neural Network. *Processes* **2020**, *8*, 568.
https://doi.org/10.3390/pr8050568
