# Similarity Measures for Learning in Lattice Based Biomimetic Neural Networks


## Abstract


## 1. Introduction

## 2. Lattice Theory Background Material

**Definition 1.** A relation $\preccurlyeq$ on a set $P$ is a *partial order*, and $(P,\preccurlyeq)$ a *poset*, if for all $x,y,z\in P$:

- $x\preccurlyeq x$ (reflexivity),
- $x\preccurlyeq y$ and $y\preccurlyeq x$ $\Rightarrow$ $x=y$ (antisymmetry), and
- $x\preccurlyeq y$ and $y\preccurlyeq z$ $\Rightarrow$ $x\preccurlyeq z$ (transitivity).
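The three axioms can be verified mechanically on a finite set. The following helper is not from the paper; it is a brute-force check included for illustration.

```python
# Hypothetical helper (not from the paper): brute-force check that a binary
# relation on a finite set satisfies the three partial-order axioms above.
def is_partial_order(elements, leq):
    """leq(x, y) returns True iff x precedes-or-equals y."""
    for x in elements:
        if not leq(x, x):                                      # reflexivity
            return False
    for x in elements:
        for y in elements:
            if leq(x, y) and leq(y, x) and x != y:             # antisymmetry
                return False
            for z in elements:
                if leq(x, y) and leq(y, z) and not leq(x, z):  # transitivity
                    return False
    return True

# Divisibility on {1, 2, 3, 6} is a classic partial order:
print(is_partial_order([1, 2, 3, 6], lambda a, b: b % a == 0))  # True
```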

**Theorem 1.** Let $(P,\preccurlyeq)$ be a poset and write $x\prec y$ whenever $x\preccurlyeq y$ and $x\ne y$. Then:

- if $Q\subset P$, then $(Q,\preccurlyeq)$ is also a poset,
- there is no $x\in P$ such that $x\prec x$, and
- if $x\prec y$ and $y\prec z$, then $x\prec z$, where $x,y,z\in P$.

**Definition 2.**

**Theorem 2.** The distance function $d$ satisfies, for all lattice elements $a,x,y,z$:

- $d(x,y)\ge 0$ and $d(x,x)=0$,
- $d(x,y)=d(y,x)$,
- $d(x,y)\le d(x,z)+d(z,y)$, and
- $d(x,y)\ge d(a\vee x,a\vee y)+d(a\wedge x,a\wedge y)$.
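A concrete instance may help. In the lattice $(\mathbb{R}^n,\vee,\wedge)$ with componentwise maximum and minimum, the classical valuation construction (cf. Birkhoff) with $v(\mathbf{x})=\sum_i x_i$ yields the distance $d(\mathbf{x},\mathbf{y})=v(\mathbf{x}\vee\mathbf{y})-v(\mathbf{x}\wedge\mathbf{y})$, which coincides with the $L_1$ metric. The sketch below uses this standard construction; it is an illustration, not the paper's specific metric.

```python
import numpy as np

# Standard valuation metric on the lattice (R^n, max, min):
# d(x, y) = v(x v y) - v(x ^ y) with v(x) = sum(x); this equals the L1 metric.
def v(x):
    return float(np.sum(x))

def d(x, y):
    return v(np.maximum(x, y)) - v(np.minimum(x, y))

x = np.array([1.0, 4.0, 2.0])
y = np.array([3.0, 1.0, 2.0])
a = np.array([2.0, 2.0, 2.0])

print(d(x, y))  # equals the L1 distance: 2 + 3 + 0 = 5
# The valuation inequality of Theorem 2 (an equality in this max/min lattice):
print(d(np.maximum(a, x), np.maximum(a, y)) + d(np.minimum(a, x), np.minimum(a, y)))  # 5.0
```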

**Corollary 1.**

**Theorem 3.**

**Proof.**

**Definition 3.** A *similarity measure* $s$ on a lattice $L$ with least element $O$ satisfies:

- $s(x,O)=0$ for all $x\ne O$,
- $s(x,x)=1$ for all $x\in L$, and
- $s(x,y)<1$ for all $x\ne y$.
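One common way to obtain such a measure on the lattice of nonnegative vectors (least element $O=\mathbf{0}$) is the Jaccard-like ratio $s(\mathbf{x},\mathbf{y})=v(\mathbf{x}\wedge\mathbf{y})/v(\mathbf{x}\vee\mathbf{y})$ with valuation $v(\mathbf{x})=\sum_i x_i$. This particular formula is an assumption for illustration, not necessarily the measure used in the paper, but it satisfies all three properties above.

```python
import numpy as np

# Illustrative similarity on nonnegative vectors with least element O = 0:
# s(x, y) = v(x ^ y) / v(x v y), a Jaccard-like ratio of the sum valuation.
def s(x, y):
    num = np.sum(np.minimum(x, y))
    den = np.sum(np.maximum(x, y))
    return 1.0 if den == 0 else float(num / den)  # s(O, O) = 1 by convention

x = np.array([1.0, 4.0, 2.0])
O = np.zeros(3)
print(s(x, O))                          # 0.0  (property 1)
print(s(x, x))                          # 1.0  (property 2)
print(s(x, np.array([3.0, 1.0, 2.0])))  # < 1  (property 3)
```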

## 3. Lattice Biomimetic Neural Networks

- $(j,k,h,\ell )$ if $n=1$ and set $N={N}_{1}$ (single input neuron),
- $(i,k,h,\ell )$ if $m=1$, set $M={M}_{1}$ (single output neuron) and denote its dendritic branches by ${\tau}^{1},\dots ,{\tau}^{K}$ (multiple dendrites) or simply $\tau $ if $K=1$ (single dendrite), and
- $(i,j,k,\ell )$ if $\rho =1$ (at most one synapse per dendrite).

- The use of dendrites and their synapses.
- A presynaptic neuron ${N}_{i}$ can have more than one terminal branch on the dendrites of a postsynaptic neuron ${M}_{j}$.
- If the axon of a presynaptic neuron ${N}_{i}$ has two or more terminal branches that synapse on different dendritic locations of the postsynaptic neuron ${M}_{j}$, then it is possible that some of the synapses are excitatory and others are inhibitory to the same information received from ${N}_{i}$.
- The basic computations resulting from the information received from the presynaptic neurons take place in the dendritic tree of ${M}_{j}$.
- As in standard ANNs, the number of input and output neurons is problem dependent. However, in contrast to standard ANNs, where the number of neurons in a hidden layer and the number of hidden layers are pre-set by the user or by an optimization process, hidden-layer neurons, dendrites, synaptic sites and weights, and axonal structures are grown during the learning process.
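The dendritic computation described above can be sketched in a few lines. The following follows the single-neuron lattice model that this architecture builds on: each dendrite takes a minimum over its terminal fibers $(i,\ell)$, with $\ell=1$ excitatory and $\ell=0$ inhibitory, and the soma aggregates over dendrites. The concrete weights are made up for illustration.

```python
# Minimal sketch of dendritic computation in a lattice neuron. A dendrite k
# with synapses (i, l, w) computes p_k * min over (-1)^(1-l) * (x_i + w);
# the soma takes the min over all dendrites and applies a hard limiter.
def dendrite(x, synapses, p_k=1):
    """synapses: list of (input index i, type l in {0, 1}, weight w)."""
    return p_k * min((-1) ** (1 - l) * (x[i] + w) for i, l, w in synapses)

def neuron(x, dendrites, p=1, hard_limit=True):
    tau = p * min(dendrite(x, syn, pk) for syn, pk in dendrites)
    return (1 if tau >= 0 else 0) if hard_limit else tau

# One excitatory (w = -2) and one inhibitory (w = -5) synapse on input 0 make
# this single dendrite fire exactly when x_0 lies in the interval [2, 5]:
dends = [([(0, 1, -2.0), (0, 0, -5.0)], 1)]
print(neuron([3.0], dends))  # 1
print(neuron([6.0], dends))  # 0
```

This interval-recognition behavior of a single dendrite is what lets such networks carve the input space into boxes, rather than the half-spaces of a classical perceptron.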

## 4. Similarity Measure Based Learning for LBNNs

## 5. Recognition Capability of Similarity Measure Based LNNs

#### 5.1. Classification Performance on Artificial Datasets

#### 5.2. Classification Performance on Real-World Application Datasets

**Example 1.**

**Example 2.**

**Example 3.**

## 6. Conclusions

## Author Contributions

## Funding

## Conflicts of Interest


**Figure 1.**Illustration of neural axons and branches from the presynaptic neurons ${N}_{i}$ to the postsynaptic neuron ${M}_{j}$. An inhibitory synaptic weight is shown as an open circle (∘), whereas an excitatory synapse is represented with a solid circle (•). The information value ${x}_{i}$ is transferred from neuron ${N}_{i}$ to the synaptic sites of the output neuron ${M}_{j}$. Stemming from presynaptic neurons, boutons of axonal fibers communicate with the synaptic sites on dendritic branches ${\tau}_{k}^{j}$ of ${M}_{j}$.

**Figure 2.**The neural architecture of an LBNN that learns using a similarity measure. Different pathways are shown between the input layer neurons ${N}_{i}$ and the output neurons ${M}_{j}$. The value ${x}_{i}$ denotes the information transferred from neuron ${N}_{i}$ to the synaptic sites of the first-hidden-layer neurons ${A}_{j},{B}_{j}$; terminal branches of axonal fibers originating in the $AB$ neuron layer make contact with synaptic sites on dendritic branches of the second-hidden-layer neurons ${C}_{j}$.

**Figure 3.**Data set “X-shape” has 55 points, 2 features (coordinates x, y), and 2 classes. Class ${c}_{1}$ has 28 points (olive green dots) and class ${c}_{2}$ has 27 points (red dots).

**Figure 4.**Data set “Hemisphere-sphere” has 618 samples, 3 features (coordinates x, y, z), and 2 classes. Class ${c}_{1}$ has 499 points (blue dots) and class ${c}_{2}$ has 119 points (red dots). Point projections onto the $xy$, $xz$, and $yz$ planes are drawn as circles (purple) for ${c}_{1}$ data and as small dots (orange) for ${c}_{2}$ data.

**Figure 5.**X-shaped dataset with 55 samples. Training points in class ${c}_{1}$ are marked with a cross (×), training points in class ${c}_{2}$ with a plus sign (+), and selected test points ${\mathbf{x}}^{\zeta}\in {Q}_{p}$, for $\zeta =5,19,34,50$, are shown as filled colored dots. Each test point is assigned, respectively, the class of the training point ${\mathbf{q}}^{j}\in {P}_{p}$ for $j=5,11,18,28$.

**Figure 6.**Similarity measure curves for ${\mathbf{x}}^{\zeta}\in {Q}_{p}$ where $\zeta =5,19,34,50$. Each test point is assigned the class of the training point, ${\mathbf{q}}^{j}\in {P}_{p}$ for $j=5,11,18,28$, at which the maximum similarity value occurs. Here, $\triangledown =0.846$, $0.928$, $0.896$, and $0.906$, respectively, for ${\mathbf{x}}^{5}$, ${\mathbf{x}}^{19}$, ${\mathbf{x}}^{34}$, and ${\mathbf{x}}^{50}$.
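The decision rule described for Figures 5 and 6 is a maximum-similarity nearest-prototype rule: a test point takes the class of the training point with the largest similarity value. A minimal sketch, using the Jaccard-like min-sum over max-sum similarity as an illustrative stand-in for the paper's measure (it assumes nonnegative features):

```python
import numpy as np

# Maximum-similarity classification: assign to x the class label of the
# training prototype q^j maximizing s(x, q^j). The similarity below is an
# illustrative choice, valid for nonnegative feature vectors.
def similarity(x, q):
    return float(np.sum(np.minimum(x, q)) / np.sum(np.maximum(x, q)))

def classify(x, prototypes, labels):
    sims = [similarity(x, q) for q in prototypes]
    return labels[int(np.argmax(sims))]

P = [np.array([1.0, 1.0]), np.array([4.0, 4.0])]
y = ["c1", "c2"]
print(classify(np.array([1.2, 0.9]), P, y))  # c1
print(classify(np.array([3.8, 4.1]), P, y))  # c2
```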

**Table 1.**Similarity valuation LBNN classification performance for the “X-shaped” (X-s) and “hemisphere-sphere” (H-s) datasets.

| Q | p | $|P_p|$ | $|Q_p|$ | $\lfloor \mu_p \rfloor$ | $f_p^{\text{hits}}$ |
|---|---|---|---|---|---|
| X-s | $50\%$ | 28 | 27 | 0 | 0.994 |
| $k=55$ | $60\%$ | 33 | 22 | 0 | 0.998 |
| $m=2$ | $70\%$ | 39 | 16 | 0 | 0.999 |
| $n=2$ | $80\%$ | 44 | 11 | 0 | 1.000 |
| | $90\%$ | 50 | 5 | 0 | 1.000 |
| H-s | $50\%$ | 309 | 309 | 13 | 0.978 |
| $k=618$ | $60\%$ | 370 | 248 | 11 | 0.982 |
| $m=2$ | $70\%$ | 432 | 186 | 8 | 0.987 |
| $n=3$ | $80\%$ | 494 | 124 | 5 | 0.992 |
| | $90\%$ | 556 | 62 | 2 | 0.996 |

**Table 2.**Similarity valuation LBNN classification performance for the Iris, Vertebral Column (Column), and Wine datasets.

| Q | p | $|P_p|$ | $|Q_p|$ | $\lfloor \mu_p \rfloor$ | $f_p^{\text{hits}}$ |
|---|---|---|---|---|---|
| Iris | $50\%$ | 75 | 75 | 3 | 0.975 |
| $k=150$ | $60\%$ | 90 | 60 | 2 | 0.980 |
| $m=3$ | $70\%$ | 105 | 45 | 2 | 0.987 |
| $n=4$ | $80\%$ | 120 | 30 | 1 | 0.991 |
| | $90\%$ | 135 | 15 | 0 | 0.997 |
| Column | $50\%$ | 155 | 155 | 33 | 0.893 |
| $k=310$ | $60\%$ | 186 | 124 | 25 | 0.917 |
| $m=3$ | $70\%$ | 217 | 93 | 19 | 0.939 |
| $n=6$ | $80\%$ | 248 | 62 | 13 | 0.958 |
| | $90\%$ | 279 | 31 | 6 | 0.981 |
| Wine | $50\%$ | 89 | 89 | 23 | 0.871 |
| $k=178$ | $60\%$ | 107 | 71 | 17 | 0.902 |
| $m=3$ | $70\%$ | 125 | 53 | 12 | 0.930 |
| $n=13$ | $80\%$ | 142 | 36 | 8 | 0.956 |
| | $90\%$ | 160 | 18 | 4 | 0.978 |

© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Ritter, G.X.; Urcid, G.; Lara-Rodríguez, L.-D. Similarity Measures for Learning in Lattice Based Biomimetic Neural Networks. *Mathematics* **2020**, *8*, 1439.
https://doi.org/10.3390/math8091439
