# Hybridised Network of Fuzzy Logic and a Genetic Algorithm in Solving 3-Satisfiability Hopfield Neural Networks


## Abstract


## 1. Introduction

## 2. 3-Satisfiability

## 3. 3-Satisfiability in the Hopfield Neural Network

## 4. Fuzzy Logic

#### 4.1. Fuzzy Set

#### 4.2. Fuzzy Logic System

## 5. The Genetic Algorithm

## 6. Fuzzy Logic System and Genetic Algorithm in 3-Satisfiability Logic Programming

**Algorithm 1:** Pseudocode for FuzzyGA

Begin the first stage
- Step 1 (Initialisation): Initialise the inputs ${\omega}_{i}$, outputs ${\beta}_{i}$, membership function ${\mu}_{\tau}\left({\omega}_{i}\right)$, and rule list $RL$.
- Step 2 (Fuzzification): Assign each literal ${\omega}_{i}$ to its membership function, where $0\le {\mu}_{\tau}\left({\omega}_{i}\right)\le 1$.
- Step 3 (Fuzzy rules): Apply the rule list $RL$ using the first clause block ${\beta}_{i}$, e.g., ${\beta}_{1}={\omega}_{1}\vee \neg {\omega}_{2}\vee \neg {\omega}_{3}$.
- Step 4 (Defuzzification): Implement the defuzzification step and the estimation model.

End

Begin the second stage
- Step 1 (Initialisation): Initialise random populations of the instances ${\beta}_{i}$.
- Step 2 (Fitness evaluation): Evaluate the fitness of the population of instances ${\beta}_{i}$.
- Step 3 (Selection): Select the best ${S}_{i}$ and save it as ${S}_{i}^{\prime}$.
- Step 4 (Crossover): Randomly select two bits ${x}_{i}$ from ${\beta}_{i}$; generate new solutions and save them as ${\beta}_{i}^{\prime\prime}$.
- Step 5 (Mutation): Select a solution ${x}_{i}$ from ${\beta}_{i}^{\prime\prime}$; mutate it by flipping bits randomly, generating ${x}_{i}^{\prime}$. If ${x}_{i}^{\prime}<criterion$, update ${x}_{i}^{\prime}$. Update ${x}_{i}={x}_{i}^{\prime}$.

End
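The two-stage procedure in Algorithm 1 can be sketched in Python as follows. This is a minimal illustration under stated assumptions: the clause encoding, the linear membership mapping, and the GA operators (top-half selection, one-point crossover, single-bit mutation) are our own stand-ins, not the paper's exact implementation.

```python
import random

def fuzzify(bits):
    """Fuzzification: map bipolar literals {-1, 1} to memberships in [0, 1]."""
    return [(b + 1) / 2.0 for b in bits]

def clause_satisfied(bits, clause):
    """A clause is a list of (index, negated) pairs; true if any literal holds."""
    return any((bits[i] == -1) if neg else (bits[i] == 1) for i, neg in clause)

def fitness(bits, clauses):
    """Number of satisfied clauses (higher is better)."""
    return sum(clause_satisfied(bits, c) for c in clauses)

def genetic_stage(clauses, n_vars, pop_size=20, generations=50, seed=0):
    """Second stage: evolve bipolar assignments toward full clause satisfaction."""
    rng = random.Random(seed)
    pop = [[rng.choice([-1, 1]) for _ in range(n_vars)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda s: fitness(s, clauses), reverse=True)
        if fitness(pop[0], clauses) == len(clauses):
            break                                   # all clauses satisfied
        parents = pop[: pop_size // 2]              # selection of the best half
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_vars)          # one-point crossover
            child = a[:cut] + b[cut:]
            j = rng.randrange(n_vars)               # mutation: flip one bit
            child[j] = -child[j]
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda s: fitness(s, clauses))
```

For a small satisfiable 3SAT instance, `genetic_stage` quickly recovers an assignment that satisfies every clause, mirroring the role the second stage plays in the hybrid network.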

## 7. Experimental Setup

#### 7.1. Performance Metric

- Training phase—The evaluation covers how well the suggested model locates the most satisfied clauses during the learning phase of the 3SAT assignments. In this phase, the computation involves error analysis to verify the performance of each model, and it also assesses the efficiency of the optimisation method in the training phase.
- Testing phase—The evaluation in the testing phase comprises computing the error analysis to validate the satisfied clauses. Besides that, it calculates the similarity degree to which the final neuron state differs from or resembles the benchmark states. It also verifies the robustness of the suggested model through energy analysis and computational time.
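The error measures used in both phases can be computed with their standard textbook definitions. The sketch below assumes the conventional formulas for RMSE, MAE, SSE, and MAPE rather than reproducing the paper's exact equations:

```python
import math

def error_metrics(targets, predictions):
    """Standard RMSE, MAE, SSE and MAPE over paired target/prediction lists."""
    n = len(targets)
    errors = [t - p for t, p in zip(targets, predictions)]
    sse = sum(e * e for e in errors)                 # sum of squared errors
    rmse = math.sqrt(sse / n)                        # root mean squared error
    mae = sum(abs(e) for e in errors) / n            # mean absolute error
    # mean absolute percentage error (targets assumed non-zero)
    mape = 100.0 * sum(abs(e / t) for e, t in zip(errors, targets)) / n
    return {"RMSE": rmse, "MAE": mae, "SSE": sse, "MAPE": mape}
```

Lower values on all four measures indicate that a network's solutions deviate less from the benchmark, which is how the models are compared in Section 8.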

#### 7.1.1. Training Phase Metric

#### 7.1.2. Testing Phase Metric

## 8. Results and Discussion

Figure 3 shows RMSE$_{train}$, MAE$_{train}$, SSE$_{train}$, and MAPE$_{train}$ during the training stage for the three methods: HNN-3SAT, HNN-3SATFuzzy, and HNN-3SATFuzzyGA, respectively. According to Figure 3a,b, RMSE$_{train}$ and MAE$_{train}$ for HNN-3SATFuzzyGA outperformed those of the other networks during the training phase. Despite the increase in the number of neurons (NN), the RMSE$_{train}$ and MAE$_{train}$ values for the HNN-3SATFuzzyGA network remained the lowest; its solutions deviated least from the potential solutions. Initially, the outcomes for all networks differed little during $9\le \mathrm{NN}\le 45$. However, once $\mathrm{NN}=54$ was reached, the RMSE$_{train}$ and MAE$_{train}$ results began to diverge, with the performance of HNN-3SATFuzzyGA remaining low, while the other two methods increased considerably toward the end of the simulations, especially HNN-3SAT. Based on the RMSE$_{train}$ and MAE$_{train}$ calculations, the suggested method, HNN-3SATFuzzyGA, achieved lower ${E}_{{\theta}_{3SAT}}$ results. The leading cause is that fuzzy logic and the GA resolve a better solution in the training phase, enabling ${E}_{{\theta}_{3SAT}}=0$ to be reached in a smaller number of cycles: the fuzzy logic system widens the search during training through fuzzification for precise interpretations, HNN-3SATFuzzyGA then applies a systematic defuzzification procedure, and the GA further optimises the solutions. Additionally, HNN-3SATFuzzyGA could efficiently verify the proper interpretation and handle more restrictions than the other networks. In related work, Abdullahi et al. [32] tested a method for extracting logical rules in the form of 3SAT to characterise the behaviour of a particular medical data set. The goal was a reliable algorithm for extracting data and insights from a set of medical records by integrating 3SAT logic and the HNN into a single data mining paradigm. Medical data sets such as Statlog Heart (ST) and Breast Cancer Coimbra (BCC) were trained and tested with the proposed method, and the network recorded good performance on evaluation metrics such as RMSE and MAE across various numbers of neurons. The results of Abdullahi et al. [32] indicate that the HNN-3SAT models have considerably lower sensitivity in the error analysis results, and the optimal CPU time measured at various levels of complexity shows that the networks can also accomplish flexibility.
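The cost function ${E}_{{\theta}_{3SAT}}$ discussed above penalises each unsatisfied clause. As a hedged sketch of the usual Wan Abdullah-style formulation (the clause encoding is our own illustration): each literal contributes a factor $\frac{1}{2}(1\mp S_i)$ that vanishes when the literal is satisfied, so a clause adds zero cost as soon as any one of its literals holds, and ${E}_{{\theta}_{3SAT}}=0$ exactly when every clause is satisfied.

```python
def cost_3sat(states, clauses):
    """3SAT cost: each clause contributes the product of its literal factors.

    A clause is a list of (index, negated) pairs over bipolar states {-1, 1}.
    A satisfied literal zeroes its factor, hence the whole clause term.
    """
    total = 0.0
    for clause in clauses:
        term = 1.0
        for i, negated in clause:
            term *= 0.5 * (1 + states[i]) if negated else 0.5 * (1 - states[i])
        total += term
    return total
```

A satisfying assignment drives the cost to zero, while each fully violated clause contributes exactly one unit, which is what the training phase minimises.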

Figure 3c,d show the SSE$_{train}$ and MAPE$_{train}$ values. Since HNN-3SATFuzzyGA has a lower SSE$_{train}$ value, it has a more reliable ability to train the simulated data set. With the lowest SSE$_{train}$ value across all neuron counts, HNN-3SATFuzzyGA produced evidently good-quality results. Even though the SSE$_{train}$ results during $9\le \mathrm{NN}\le 45$ did not deviate far from one another, HNN-3SATFuzzyGA remained low until the end of the simulations compared with the other methods, whereas, compared with HNN-3SATFuzzyGA, the outcomes of HNN-3SAT increased drastically at the final NN. The MAPE$_{train}$ values for all networks provide further compelling evidence of the compatibility of fuzzy logic and the GA in 3SAT: the results for HNN-3SATFuzzyGA stayed at a significantly low level even after $\mathrm{NN}=54$, while the results of HNN-3SAT converged higher starting from $\mathrm{NN}=54$. As a result, the HNN-3SATFuzzyGA approach performs noticeably better than the other two models. This effectiveness arises because the fuzzy logic and GA operators in the training phase boosted the compatibility of the solutions; compared with the other networks, HNN-3SATFuzzyGA can recover a more precise final state.

Figure 5 shows RMSE$_{test}$, MAE$_{test}$, SSE$_{test}$, and MAPE$_{test}$ in the testing error analyses. The significance of the testing error analysis is that it examines HNN-3SAT behaviour with respect to global or local minima solutions. The WA approach constructs the synaptic weights after HNN-3SAT clause satisfaction (minimisation of the cost function) is completed. Figure 5a,b show that the testing errors RMSE$_{test}$ and MAE$_{test}$ are highest for HNN-3SAT, indicating that the model cannot induce global minima solutions; the model overfits the test sets due to the unsatisfied clauses. Therefore, an optimiser is needed, such as fuzzy logic to widen the search space and the GA to optimise the solutions better. Based on the RMSE$_{test}$ and MAE$_{test}$ calculations, HNN-3SAT attains higher ${E}_{{\theta}_{3SAT}}$ results, whereas the two methods with an optimiser maintain the error at a lower level. Additionally, in Figure 5c,d, we consider SSE$_{test}$ and MAPE$_{test}$ as two performance indicators in the testing phase when evaluating the error analysis. Notably, lower SSE$_{test}$ and MAPE$_{test}$ values correspond to an increased likelihood of optimal solutions and denote a higher accuracy (fitness) level. The SSE$_{test}$ and MAPE$_{test}$ values for the HNN-3SAT model rise noticeably as the number of neurons increases; consequently, the HNN-3SAT model, with greater SSE$_{test}$ and MAPE$_{test}$ values, cannot optimise the required fitness value during the testing phase. After $\mathrm{NN}=9$, the SSE$_{test}$ and MAPE$_{test}$ values for the HNN-3SAT model begin to rise. This occurred because of the HNN-3SAT trial-and-error approach, which resulted in a less ideal testing phase.

## 9. Conclusions

## Author Contributions

## Funding

## Data Availability Statement

## Conflicts of Interest

## References

1. Hopfield, J.J.; Tank, D.W. "Neural" Computation of Decisions in Optimization Problems. *Biol. Cybern.* **1985**, *52*, 141–152.
2. Abdullah, W.A.T.W. Logic Programming on a Neural Network. *Int. J. Intell. Syst.* **1992**, *7*, 513–519.
3. Sathasivam, S. Upgrading Logic Programming in Hopfield Network. *Sains Malays.* **2010**, *39*, 115–118.
4. Kasihmuddin, M.S.M.; Mansor, M.A.; Sathasivam, S. Robust Artificial Bee Colony in the Hopfield Network for 2-Satisfiability Problem. *Pertanika J. Sci. Technol.* **2017**, *25*, 453–468.
5. Mansor, M.A.; Kasihmuddin, M.S.M.; Sathasivam, S. Artificial Immune System Paradigm in the Hopfield Network for 3-Satisfiability Problem. *Pertanika J. Sci. Technol.* **2017**, *25*, 1173–1188.
6. Agrawal, H.; Talwariya, A.; Gill, A.; Singh, A.; Alyami, H.; Alosaimi, W.; Ortega-Mansilla, A. A Fuzzy-Genetic-Based Integration of Renewable Energy Sources and E-Vehicles. *Energies* **2022**, *15*, 3300.
7. Nasir, M.; Sadollah, A.; Grzegorzewski, P.; Yoon, J.H.; Geem, Z.W. Harmony Search Algorithm and Fuzzy Logic Theory: An Extensive Review from Theory to Applications. *Mathematics* **2021**, *9*, 2665.
8. El Midaoui, M.; Qbadou, M.; Mansouri, K. A Fuzzy-Based Prediction Approach for Blood Delivery Using Machine Learning and Genetic Algorithm. *Int. J. Electr. Comput. Eng.* **2022**, *12*, 1056.
9. de Campos Souza, P.V. Fuzzy Neural Networks and Neuro-Fuzzy Networks: A Review the Main Techniques and Applications Used in the Literature. *Appl. Soft Comput.* **2020**, *92*, 106275.
10. Nordin, N.S.; Ismail, M.A.; Sutikno, T.; Kasim, S.; Hassan, R.; Zakaria, Z.; Mohamad, M.S. A Comparative Analysis of Metaheuristic Algorithms in Fuzzy Modelling for Phishing Attack Detection. *Indones. J. Electr. Eng. Comput. Sci.* **2021**, *23*, 1146.
11. Scaranti, G.F.; Carvalho, L.F.; Barbon, S.; Proenca, M.L. Artificial Immune Systems and Fuzzy Logic to Detect Flooding Attacks in Software-Defined Networks. *IEEE Access* **2020**, *8*, 100172–100184.
12. Torres-Salinas, H.; Rodríguez-Reséndiz, J.; Cruz-Miguel, E.E.; Ángeles-Hurtado, L.A. Fuzzy Logic and Genetic-Based Algorithm for a Servo Control System. *Micromachines* **2022**, *13*, 586.
13. Holland, J.H. *Adaptation in Natural and Artificial Systems*; University of Michigan Press: Ann Arbor, MI, USA, 1975.
14. Sathasivam, S.; Mamat, M.; Kasihmuddin, M.S.M.; Mansor, M.A. Metaheuristics Approach for Maximum k Satisfiability in Restricted Neural Symbolic Integration. *Pertanika J. Sci. Technol.* **2020**, *28*, 545–564.
15. Zadeh, L.A. Fuzzy Sets. *Inf. Control* **1965**, *8*, 338–353.
16. Mamdani, E.H. Application of Fuzzy Algorithms for Control of Simple Dynamic Plant. *Proc. Inst. Electr. Eng.* **1974**, *121*, 1585–1588.
17. Pourabdollah, A.; Mendel, J.M.; John, R.I. Alpha-cut representation used for defuzzification in rule-based systems. *Fuzzy Sets Syst.* **2020**, *399*, 110–132.
18. Zamri, N.E.; Azhar, S.A.; Sidik, S.S.M.; Mansor, M.A.; Kasihmuddin, M.S.M.; Pakruddin, S.P.A.; Pauzi, N.A.; Nawi, S.N.M. Multi-discrete genetic algorithm in Hopfield neural network with weighted random k satisfiability. *Neural Comput. Appl.* **2022**, *34*, 19283–19311.
19. Kaveh, M.; Kaveh, M.; Mesgari, M.S.; Paland, R.S. Multiple Criteria Decision-Making for Hospital Location-Allocation Based on Improved Genetic Algorithm. *Appl. Geomat.* **2020**, *12*, 291–306.
20. Zhang, W.; He, H.; Zhang, S. A novel multi-stage hybrid model with enhanced multi-population niche genetic algorithm: An application in credit scoring. *Expert Syst. Appl.* **2019**, *121*, 221–232.
21. Katoch, S.; Chauhan, S.S.; Kumar, V. A Review on Genetic Algorithm: Past, Present, and Future. *Multimed. Tools Appl.* **2021**, *80*, 8091–8126.
22. Khan, B.; Naseem, R.; Muhammad, F.; Abbas, G.; Kim, S. An Empirical Evaluation of Machine Learning Techniques for Chronic Kidney Disease Prophecy. *IEEE Access* **2020**, *8*, 55012–55022.
23. Ofori-Ntow, E.J.; Ziggah, Y.Y.; Rodrigues, M.J.; Relvas, S. A hybrid chaotic-based discrete wavelet transform and Aquila optimisation tuned-artificial neural network approach for wind speed prediction. *Results Eng.* **2022**, *14*, 100399.
24. Guo, Y.; Kasihmuddin, M.S.M.; Gao, Y.; Mansor, M.A.; Wahab, H.A.; Zamri, N.E.; Chen, J. YRAN2SAT: A Novel Flexible Random Satisfiability Logical Rule in Discrete Hopfield Neural Network. *Adv. Eng. Softw.* **2022**, *171*, 103169.
25. Bilal, M.; Masud, S.; Athar, S. FPGA Design for Statistics-Inspired Approximate Sum-of-Squared-Error Computation in Multimedia Applications. *IEEE Trans. Circuits Syst. II Express Briefs* **2012**, *59*, 506–510.
26. Qin, F.; Liu, P.; Niu, H.; Song, H.; Yousefi, N. Parameter estimation of PEMFC based on Improved Fluid Search Optimization Algorithm. *Energy Rep.* **2020**, *6*, 1224–1232.
27. De Myttenaere, A.; Golden, B.; Le Grand, B.; Rossi, F. Mean Absolute Percentage Error for Regression Models. *Neurocomputing* **2016**, *192*, 38–48.
28. Djebedjian, B.; Abdel-Gawad, H.A.A.; Ezzeldin, R.M. Global Performance of Metaheuristic Optimization Tools for Water Distribution Networks. *Ain Shams Eng. J.* **2021**, *12*, 223–239.
29. Karim, S.A.; Zamri, N.E.; Alway, A.; Kasihmuddin, M.S.M.; Ismail, A.I.M.; Mansor, M.A.; Hassan, N.F.A. Random satisfiability: A higher-order logical approach in discrete Hopfield neural network. *IEEE Access* **2021**, *9*, 50831–50845.
30. de Santis, E.; Martino, A.; Rizzi, A. On component-wise dissimilarity measures and metric properties in pattern recognition. *PeerJ Comput. Sci.* **2022**, *8*, e1106.
31. Bazuhair, M.M.; Mohd Jamaludin, S.Z.; Zamri, N.E.; Mohd Kasihmuddin, M.S.; Mansor, M.A.; Alway, A.; Karim, S.A. Novel Hopfield Neural Network Model with Election Algorithm for Random 3 Satisfiability. *Processes* **2021**, *9*, 1292.
32. Abdullahi, S.; Mansor, M.A.; Sathasivam, S.; Kasihmuddin, M.S.M.; Zamri, N.E. 3-satisfiability reverse analysis method with Hopfield neural network for medical data set. In *AIP Conference Proceedings 2266*; American Institute of Physics: College Park, MD, USA, 2020.
33. Meyer, A.d.S.; Garcia, A.A.F.; de Souza, A.P.; de Souza, C.L., Jr. Comparison of Similarity Coefficients Used for Cluster Analysis with Dominant Markers in Maize. *Genet. Mol. Biol.* **2004**, *27*, 83–91.
34. Gao, Y.; Guo, Y.; Romli, N.A.; Kasihmuddin, M.S.M.; Chen, W.; Mansor, M.A.; Chen, J. GRAN3SAT: Creating Flexible Higher-Order Logic Satisfiability in the Discrete Hopfield Neural Network. *Mathematics* **2022**, *11*, 1899.
35. Mansor, M.A.; Kasihmuddin, M.S.M.; Sathasivam, S. Enhanced Hopfield Network for Pattern Satisfiability Optimization. *Int. J. Intell. Syst. Appl.* **2016**, *11*, 27–33.
36. Mansor, M.A.; Kasihmuddin, M.S.M.; Sathasivam, S. VLSI Circuit Configuration Using Satisfiability Logic in Hopfield Network. *Int. J. Intell. Syst. Appl.* **2016**, *9*, 22–29.

**Figure 3.** The error analysis metrics in the training phase: (**a**) RMSE$_{train}$; (**b**) MAE$_{train}$; (**c**) SSE$_{train}$; (**d**) MAPE$_{train}$.

**Figure 4.** The efficiency performance metrics in the training phase: (**a**) iteration efficiency; (**b**) evaluation efficiency.

**Figure 5.** The error analysis metrics in the testing phase: (**a**) RMSE$_{test}$; (**b**) MAE$_{test}$; (**c**) SSE$_{test}$; (**d**) MAPE$_{test}$.

| Parameter | Parameter Value |
|---|---|
| Number of neurons | $9\le \mathrm{NN}\le 180$ |
| Total of combinations | 100 |
| Tolerance measurement | 0.001 |
| CPU time threshold | 24 h |
| Activation function | HTAF |
| Synaptic weight method | Wan Abdullah (WA) method |
| Final neuron states, $S_i$ | $\left\{-1,1\right\}$ |
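The HTAF (hyperbolic tangent activation function) and bipolar final states listed above can be illustrated with a small sketch. The local-field computation is the usual Hopfield update; the function names and the second-order weight layout are our own illustration, not the paper's code:

```python
import math

def local_field(weights, states, i):
    """Usual Hopfield local field: weighted sum of the other neuron states."""
    return sum(weights[i][j] * states[j]
               for j in range(len(states)) if j != i)

def update_state(field):
    """HTAF squashes the field to (-1, 1); thresholding yields a bipolar state."""
    return 1 if math.tanh(field) >= 0 else -1
```

Thresholding the squashed field keeps every retrieved state in $\{-1, 1\}$, matching the final-neuron-state row of the table.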

| Parameter | Parameter Value |
|---|---|
| Fuzzy membership neuron, $\mu_{\tau}$ | $\left[0,1\right]$ |
| Input parameter, $\omega_i$ | $\left[0,1\right]$ |
| Output parameter, $\beta_i$ | $\left[0,1\right]$ |
| Number of inputs | 3 |
| Number of outputs | 1 |

| Parameter | Parameter Value |
|---|---|
| Populations, $pop$ | 100 |
| Selection rate, ${\mathsf{\Gamma}}_{S}$ | 0.1 |
| Crossover rate, ${\mathsf{\Gamma}}_{C}$ | 1 |
| Mutation rate, ${\mathsf{\Gamma}}_{M}$ | 1 |

| Parameter | $S_i^{ideal}$ | $S_i$ |
|---|---|---|
| $\rho\rho$ | $1$ | $1$ |
| $\rho\eta$ | $1$ | $-1$ |
| $\eta\rho$ | $-1$ | $1$ |
| $\eta\eta$ | $-1$ | $-1$ |
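The four cases in the table above can be counted directly from the benchmark state $S_i^{ideal}$ and the retrieved state $S_i$; similarity measures are then ratios of these counts. The Jaccard coefficient below is the standard definition and is assumed here for illustration rather than taken from the paper:

```python
def comparison_counts(ideal, final):
    """Count the (rho rho, rho eta, eta rho, eta eta) agreement cases."""
    counts = {"pp": 0, "pn": 0, "np": 0, "nn": 0}
    for a, b in zip(ideal, final):
        counts[("p" if a == 1 else "n") + ("p" if b == 1 else "n")] += 1
    return counts

def jaccard(ideal, final):
    """Jaccard similarity: +1 agreements over all pairs involving a +1."""
    c = comparison_counts(ideal, final)
    return c["pp"] / (c["pp"] + c["pn"] + c["np"])
```

A coefficient near 1 means the final neuron state closely resembles the benchmark state, which is how the similarity degree in the testing phase is interpreted.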

| Parameter | Parameter Notion |
|---|---|
| $I_{highest}$ | Maximum fitness |
| $I_x$ | Current fitness |
| $n$ | Number of learning |
| $N_{iter}$ | Number of iterations |
| $N_{eval}$ | Number of successful iterations |
| $\gamma$ | Number of combinations |
| $\epsilon$ | Number of trials |
| $\varphi_i$ | Number of total solutions |
| $z_i$ | Number of global minima |
| $y_i$ | Number of local minima |


© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Azizan, F.L.; Sathasivam, S.; Ali, M.K.M.; Roslan, N.; Feng, C.
Hybridised Network of Fuzzy Logic and a Genetic Algorithm in Solving 3-Satisfiability Hopfield Neural Networks. *Axioms* **2023**, *12*, 250.
https://doi.org/10.3390/axioms12030250
