Introduction to the E-Sense Artificial Intelligence System
Abstract
1. Introduction
2. The Original Cognitive Architecture
3. Related Work
3.1. State of the Art in AI
- Symbolic interpretation: AI struggles to assign semantic significance to symbols, instead focusing on shallow comparisons like symmetry checks.
- Compositional reasoning: AI falters when it needs to apply multiple interacting rules simultaneously.
- Contextual rule application: Systems fail to apply rules differently based on complex contexts, often fixating on surface-level patterns.
3.2. Alternative Models
4. The Memory Model
4.1. Memory Model Levels
- (1) The lowest level is an n-gram structure consisting only of sets of links between every source concept that has been stored. The links describe every possible route through the source-concept sequences but are unweighted.
- (2) The middle level is an ontology that aggregates the source data through three phases, converting it from set-based sequences into type-based clusters.
- (3) The upper level combines the functional properties of the brain, with whatever input and resulting conversions they produce being stored in the same memory substrate:
  - Experience to knowledge.
  - Knowledge to knowledge.
  - Knowledge to experience.
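The lowest, n-gram level described above might be sketched as a store of unweighted links between consecutive source concepts, from which possible routes can be enumerated. This is an illustrative reading of the description, not the paper's implementation; the class and method names are assumptions.

```python
from collections import defaultdict

class NGramLinkStore:
    """Sketch of the lowest memory level: unweighted link sets
    between consecutive source concepts. Names are illustrative."""

    def __init__(self):
        # concept -> set of concepts that have followed it
        self.links = defaultdict(set)

    def store(self, sequence):
        # Record a link for every adjacent pair in the sequence.
        for a, b in zip(sequence, sequence[1:]):
            self.links[a].add(b)

    def routes_from(self, concept, depth=2):
        # Enumerate possible routes of the given depth; links are
        # unweighted, so every stored continuation is equally valid.
        paths = [[concept]]
        for _ in range(depth):
            paths = [p + [nxt] for p in paths for nxt in self.links[p[-1]]]
        return paths

store = NGramLinkStore()
store.store(["dog", "cat", "horse", "zebra"])
store.store(["dog", "cat", "giraffe", "lion"])
print(sorted(p[-1] for p in store.routes_from("dog", depth=2)))  # → ['giraffe', 'horse']
```

Because the links are unweighted, both stored continuations from "cat" are returned with equal standing; weighting, by contrast, would be a property of the higher levels.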
4.2. Lower Memory Level
4.3. Middle Ontology Level
4.4. Unit of Work
5. The Neural Level
5.1. Function Identity
5.2. Function Structure
5.3. Index Types
- Unipolar Type: This has a list of index terms that is generally somewhat longer and is unordered. It can be matched with any sequence in the input set, but with only one sequence.
- Bipolar Type: This has a list of index terms and a related feature set. The index terms should be matched to only one sequence, and some of the feature values should also match with that sequence. This part of the matching must be ordered: the order of the values in the feature must be repeated in the sequence. The remaining feature values can then match with any other sequence, in any order.
- Pyramidal Type: This has a list of index terms and a related feature set, but the index terms are split over two specific sequences. Both the index terms and the related feature set should match with those two sequences, and the matching must be ordered in both.
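The simplest of the three, the unipolar type, could be sketched as follows: the unordered index terms must all be found in exactly one input sequence. The function name and return convention here are assumptions, not the paper's API.

```python
def unipolar_match(index_terms, sequences):
    """Sketch of the Unipolar index type: the (unordered) index terms
    must all appear in exactly one of the input sequences.
    Returns the index of that sequence, or None if the match is
    absent or ambiguous. Illustrative only."""
    terms = set(index_terms)
    # Order does not matter for the unipolar type, so a subset
    # test against each sequence's word set is sufficient.
    matches = [i for i, seq in enumerate(sequences) if terms <= set(seq)]
    # The index may bind to only one sequence.
    return matches[0] if len(matches) == 1 else None

seqs = [["place", "the", "eggs", "into", "a", "saucepan"],
        ["stir", "until", "the", "gelatine", "has", "dissolved"]]
print(unipolar_match(["gelatine", "stir"], seqs))  # → 1
```

The bipolar and pyramidal types would extend this with ordered feature matching over one or two sequences respectively.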
6. Ordinal Learning
7. Algorithm Testing
7.1. Ontology Tests
7.1.1. Statistical Counting
- dog cat horse zebra.
- dog cat giraffe lion wolf elephant.
- dog cat horse zebra.
- dog cat giraffe lion wolf elephant. …
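The statistical-counting idea behind these test sequences can be illustrated by grouping words that always co-occur across the source sequences. This is a deliberate simplification of the ontology's three-phase aggregation; the function is a hypothetical stand-in.

```python
from collections import Counter
from itertools import combinations

def cooccurrence_clusters(sequences):
    """Pair up words that co-occur in every sequence in which either
    appears. A simplified sketch of statistical counting, not the
    paper's aggregation phases."""
    pair_counts = Counter()
    appearances = Counter()
    for seq in sequences:
        for w in set(seq):
            appearances[w] += 1
        for a, b in combinations(sorted(set(seq)), 2):
            pair_counts[(a, b)] += 1
    # A pair qualifies when its co-occurrence count equals each
    # word's total appearance count.
    return sorted((a, b) for (a, b), c in pair_counts.items()
                  if c == appearances[a] == appearances[b])

seqs = [["dog", "cat", "horse", "zebra"],
        ["dog", "cat", "giraffe", "lion", "wolf", "elephant"],
        ["dog", "cat", "horse", "zebra"],
        ["dog", "cat", "giraffe", "lion", "wolf", "elephant"]]
# ('cat', 'dog') and ('horse', 'zebra') are among the returned pairs
print(cooccurrence_clusters(seqs))
```

On the test sequences above, "dog" and "cat" cluster together because they appear in every sequence, while "horse"/"zebra" and the "giraffe"/"lion"/"wolf"/"elephant" group each cluster within their own sequence type.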
7.1.2. Classic Texts
7.2. Ordinal Tests
8. Some Biological Comparisons
8.1. Gestalt Psychology
- The cat sat on the mat and drank some milk.
- The dog barked at the moon and chased its tail.
- The dog barked at the moon and chased its tail, or
- The dog barked at the moon and drank some milk.
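The completion behaviour suggested by these sentences might be sketched as retrieving whichever stored sequence best overlaps a partial phrase. This is only an illustration of the Gestalt-style process; the scoring and function name are assumptions.

```python
def gestalt_complete(prefix, memory):
    """Complete a partial phrase from the stored sequence with the
    greatest word overlap. A toy sketch of Gestalt-style completion;
    blended completions (as in the text) are not modelled here."""
    words = prefix.split()
    best, best_overlap = None, 0
    for seq in memory:
        seq_words = seq.split()
        overlap = sum(1 for w in words if w in seq_words)
        if overlap > best_overlap:
            best, best_overlap = seq, overlap
    return best

memory = ["The cat sat on the mat and drank some milk",
          "The dog barked at the moon and chased its tail"]
# → The dog barked at the moon and chased its tail
print(gestalt_complete("The dog barked at", memory))
```

A blended answer such as "The dog barked at the moon and drank some milk" would require mixing clause-level fragments from more than one stored sequence, which this single-winner sketch deliberately omits.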
Numbers in the Gestalt Process
8.2. Cortical Columns
8.3. Animal Brain Function
Theory of Small Changes
9. Conclusions and Future Work
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Appendix A. Upper Ontology Trees for Book Texts
| Clusters |
|---|
| dorothy |
| scarecrow |
| lion, woodman |
| great, oz |
| city, emerald |
| asked, came, see |
| little, out, tin |
| again, answered, away, before, down, made, now, shall, toto, up |
| back, come, girl, go, green, head, heart, man, one, over, upon, very, witch |
| Clusters |
|---|
| romeo, shall |
| thou |
| love, o, thy |
| death, eye, hath |
| come, thee |
| go, good, here, ill, night, now |
| day, give, lady, make, one, out, up, well |
| man, more, tybalt |
| Clusters |
|---|
| back, before, came |
| down, know |
| more, room, think, well |
| day, eye, face, found, matter, tell |
| upon |
| holmes, very |
| little, man, now |
| one |
| away, case, good, heard, house, much, nothing, quite, street, such, through, two, ye |
| go, here |
| come, hand, over, shall, time |
| asked, never |
| door, saw |
| mr, see |
| out, up |
| made, way |
| Clusters |
|---|
| answer, computer, man, question, think |
| machine |
| one |
| such |
Appendix B. Documents and Test Results for the Neural-Level Sorting
| Train File—Hard-Boiled Egg |
|---|
| Place eggs at the bottom of a pot and cover them with cold water. Bring the water to a boil, then remove the pot from the heat. Let the eggs sit in the hot water until hard-boiled. Remove the eggs from the pot and crack them against the counter and peel them with your fingers. |
| Train File—Panna Cotta |
|---|
| For the panna cotta, soak the gelatine leaves in a little cold water until soft. Place the milk, cream, vanilla pod and seeds and sugar into a pan and bring to a simmer. Remove the vanilla pod and discard. Squeeze the water out of the gelatine leaves, then add to the pan and take off the heat. Stir until the gelatine has dissolved. Divide the mixture among four ramekins and leave to cool. Place into the fridge for at least an hour, until set. For the sauce, place the sugar, water and cherry liqueur into a pan and bring to the boil. Reduce the heat and simmer until the sugar has dissolved. Take the pan off the heat and add half the raspberries. Using a hand blender, blend the sauce until smooth. Pass the sauce through a sieve into a bowl and stir in the remaining fruit. To serve, turn each panna cotta out onto a serving plate. Spoon over the sauce and garnish with a sprig of mint. Dust with icing sugar. |

| Test File—Hard-Boiled Egg and Panna Cotta |
|---|
| Remove the vanilla pod and discard. For the panna cotta, soak the gelatine leaves in a little cold water until soft. As soon as they are cooked drain off the hot water, then leave them in cold water until they are cool enough to handle. Squeeze the water out of the gelatine leaves, then add to the pan and take off the heat. Spoon over the sauce and garnish with a sprig of mint. Stir until the gelatine has dissolved. Place the eggs into a saucepan and add enough cold water to cover them by about 1cm. Pass the sauce through a sieve into a bowl and stir in the remaining fruit. Divide the mixture among four ramekins and leave to cool. Place into the fridge for at least an hour, until set. To peel them crack the shells all over on a hard surface, then peel the shell off starting at the wide end. For the sauce, place the sugar, water and cherry liqueur into a pan and bring to the boil. Place the milk, cream, vanilla pod and seeds and sugar into a pan and bring to a simmer. Reduce the heat and simmer until the sugar has dissolved. Take the pan off the heat and add half the raspberries. Using a hand blender, blend the sauce until smooth. Bring the water up to boil then turn to a simmer. To serve, turn each panna cotta out onto a serving plate. Dust with icing sugar. |
| Selected Sequences from the Hard-Boiled Egg Function |
|---|
| [place, the, eggs, into, a, saucepan, and, add, enough, cold, water, to, cover, them, by, about] [bring, the, water, up, to, boil, then, turn, to, a, simmer] [as, soon, as, they, are, cooked, drain, off, the, hot, water, then, leave, them, in, cold, water, until, they, are, cool, enough, to, handle] [to, peel, them, crack, the, shells, all, over, on, a, hard, surface, then, peel, the, shell, off, starting, at, the, wide, end] |

| Selected Sequences from the Panna Cotta Function |
|---|
| [for, the, panna, cotta, soak, the, gelatine, leaves, in, a, little, cold, water, until, soft] [place, the, milk, cream, vanilla, pod, and, seeds, and, sugar, into, a, pan, and, bring, to, the, boil] [remove, the, vanilla, pod, and, discard] [squeeze, the, water, out, of, the, gelatine, leaves, then, add, to, the, pan, and, take, off, the, heat] [stir, until, the, gelatine, has, dissolved] [divide, the, mixture, among, four, ramekins, and, leave, to, cool] [place, into, the, fridge, for, at, least, an, hour, until, set] [for, the, sauce, place, the, sugar, water, and, cherry, liqueur, into, a, pan, and, bring, to, the, boil] [reduce, the, heat, and, simmer, until, the, sugar, has, dissolved] [take, the, pan, off, the, heat, and, add, half, the, raspberries] [using, a, hand, blender, blend, the, sauce, until, smooth] [pass, the, sauce, through, a, sieve, into, a, bowl, and, stir, in, the, remaining, fruit] [to, serve, turn, each, panna, cotta, out, onto, a, serving, plate] [spoon, over, the, sauce, and, garnish, with, a, sprig, of, mint] [dust, with, icing, sugar] |
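The sorting task above, in which sentences from the shuffled test file are assigned back to their source recipes, might be sketched with a plain bag-of-words overlap. This is a stand-in for the neural-level function indexes, not the paper's method; all names here are illustrative.

```python
def sort_sentences(test_sentences, train_docs):
    """Assign each test sentence to the training document it shares
    the most words with. A bag-of-words sketch of the Appendix B
    sorting test; names are illustrative."""
    def words(text):
        return set(text.lower().replace(",", " ").replace(".", " ").split())
    vocab = {name: words(text) for name, text in train_docs.items()}
    assigned = {name: [] for name in train_docs}
    for sentence in test_sentences:
        # Pick the document with the largest word overlap.
        best = max(vocab, key=lambda name: len(words(sentence) & vocab[name]))
        assigned[best].append(sentence)
    return assigned

train = {
    "egg": "Place eggs at the bottom of a pot and cover them with cold water. "
           "Bring the water to a boil, then remove the pot from the heat.",
    "panna cotta": "Soak the gelatine leaves in a little cold water until soft. "
                   "Stir until the gelatine has dissolved.",
}
mixed = ["Squeeze the water out of the gelatine leaves",
         "Remove the eggs from the pot and peel them"]
result = sort_sentences(mixed, train)
print(result["panna cotta"])  # → ['Squeeze the water out of the gelatine leaves']
```

A bag-of-words score will misplace sentences whose vocabulary is shared across recipes ("bring to a simmer", for example), which is exactly where the ordered, index-based matching of the neural level would be expected to do better.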



E-Sense Typing

| Type Clusters |
|---|
| dorothy |
| asked, came, see |
| city, emerald |
| great, oz |

Word2Vec

| Type Word | Associations |
|---|---|
| dorothy | dorothy, back, over, scarecrow, girl |
| asked | asked, cowardly, sorrowfully, promised, courage |
| came | came, next, morning, flew, carried |
| see | see, lived, away, many, people |
| city | city, emerald, brick, streets, led |
| emerald | emerald, brick, city, streets, gates |
| great | great, room, throne, head, terrible |
| oz | oz, throne, room, heart, tell |
E-Sense Typing

| Type Clusters |
|---|
| love, o, thy |
| romeo, shall |
| death, eye, hath |

Word2Vec

| Type Word | Associations |
|---|---|
| thou | thou, art, wilt, thy, hast |
| love | love, art, wit, thou, fortunes |
| thy | thy, thou, tybalt, dead, husband |
| romeo | romeo, slain, thou, hes, art |
| shall | shall, thou, till, art, such |
| death | death, romeo, slain, tybalt, thy |
| eye | eye, hoar, sometime, dreams, ladies |
| hath | hath, husband, slain, tybalt, dead |
Share and Cite
Greer, K. Introduction to the E-Sense Artificial Intelligence System. AI 2025, 6, 122. https://doi.org/10.3390/ai6060122