Machines Imitating Human Thinking Using Bayesian Learning and Bootstrap
Abstract
1. Introduction
2. Review of the Literature
2.1. Bayesian Inference and Learning
2.2. Bayesian Learning for Human Behaviors
3. Proposed Methodology
3.1. Bayesian Learning
3.2. Bayesian Bootstrap
4. Simulation Study
5. Discussion
6. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- Bermudez, J.L. Cognitive Science: An Introduction to the Science of the Mind, 2nd ed.; Cambridge University Press: Cambridge, UK, 2014.
- Duch, W. Towards artificial minds. In Proceedings of the I National Conference on Neural Networks and Applications, Kule, Poland, 12–15 April 1994; pp. 17–28.
- Hurwitz, J.S.; Kaufman, M.; Bowles, A. Cognitive Computing and Big Data Analysis; Wiley: Indianapolis, IN, USA, 2015.
- Lake, B.M.; Ullman, T.D.; Tenenbaum, J.B.; Gershman, S.J. Building Machines That Learn and Think Like People. Behav. Brain Sci. 2017, 40, e253.
- Turing, A.M. Computing Machinery and Intelligence. Mind 1950, 59, 433–460.
- Bates, A. Augmented Mind: AI, Humans and the Superhuman Revolution; Neocortex Ventures: San Diego, CA, USA, 2019.
- Burr, C.; Cristianini, N. Can Machines Read our Minds? Minds Mach. 2019, 29, 461–494.
- Franklin, S.; Wolpert, S.; McKay, S.R.; Christian, W. Artificial minds. Comput. Phys. 1997, 11, 258–259.
- Franklin, S. Artificial Minds; MIT Press: Cambridge, MA, USA, 2001.
- Silver, D.; Huang, A.; Maddison, C.J.; Guez, A.; Sifre, L.; Driessche, G.; Schrittwieser, J.; Antonoglou, I.; Panneershelvam, V.; Lanctot, M.; et al. Mastering the game of Go with deep neural networks and tree search. Nature 2016, 529, 484–489.
- Walsh, T. Machines That Think; Prometheus Books: New York, NY, USA, 2018.
- LeCun, Y.; Bengio, Y.; Hinton, G. Deep Learning. Nature 2015, 521, 436–444.
- Neisser, U.; Boodoo, G.; Bouchard, T.J., Jr.; Boykin, A.W.; Brody, N.; Ceci, S.J.; Halpern, D.F.; Loehlin, J.C.; Perloff, R.; Sternberg, R.J.; et al. Intelligence: Knowns and unknowns. Am. Psychol. 1996, 51, 77–101.
- Russell, S.; Norvig, P. Artificial Intelligence: A Modern Approach, 3rd ed.; Pearson: Essex, UK, 2014.
- Shi, Z. Mind Computation; World Scientific Publishing: Singapore, 2017.
- Minsky, M. The Society of Mind; Simon & Schuster Paperbacks: New York, NY, USA, 1985.
- Minsky, M. The Emotion Machine; Simon & Schuster Paperbacks: New York, NY, USA, 2006.
- Hoya, T. Artificial Mind System: Kernel Memory Approach; Springer: New York, NY, USA, 2005.
- Gelman, A.; Carlin, J.B.; Stern, H.S.; Dunson, D.B.; Vehtari, A.; Rubin, D.B. Bayesian Data Analysis, 3rd ed.; Chapman & Hall/CRC Press: Boca Raton, FL, USA, 2013.
- Kruschke, J.K. Doing Bayesian Data Analysis, 2nd ed.; Elsevier: Waltham, MA, USA, 2015.
- Neal, R.M. Bayesian Learning for Neural Networks; Springer: New York, NY, USA, 1996.
- McGrayne, S.B. The Theory That Would Not Die: How Bayes’ Rule Cracked the Enigma Code, Hunted Down Russian Submarines & Emerged Triumphant from Two Centuries of Controversy; Yale University Press: New Haven, CT, USA, 2011.
- Wainer, H.; Savage, S. The Theory That Would Not Die: How Bayes’ Rule Cracked the Enigma Code, Hunted Down Russian Submarines and Emerged Triumphant from Two Centuries of Controversy by Sharon Bertsch McGrayne. J. Educ. Meas. 2012, 49, 214–219.
- Donovan, T.M.; Mickey, R.M. Bayesian Statistics for Beginners; Oxford University Press: Oxford, UK, 2019.
- Jun, S. Frequentist and Bayesian Learning Approaches to Artificial Intelligence. Int. J. Fuzzy Logic Intell. Syst. 2016, 16, 111–118.
- Meng, X.; Bachmann, R.; Khan, M.E. Training binary neural networks using the Bayesian learning rule. In Proceedings of the International Conference on Machine Learning, Vienna, Austria, 12–18 July 2020; pp. 6852–6861.
- Liu, S.; Huang, Y.; Wu, H.; Tan, C.; Jia, J. Efficient multitask structure-aware sparse Bayesian learning for frequency-difference electrical impedance tomography. IEEE Trans. Ind. Inform. 2020, 17, 463–472.
- Chen, Z.; Zhang, R.; Zheng, J.; Sun, H. Sparse Bayesian learning for structural damage identification. Mech. Syst. Signal Process. 2020, 140, 106689.
- Walmsley, J. Mind and Machine; Palgrave Macmillan: Hampshire, UK, 2012.
- Kurzweil, R. The Singularity Is Near: When Humans Transcend Biology; Penguin Books: New York, NY, USA, 2006.
- Kurzweil, R. How to Create a Mind; Penguin Books: New York, NY, USA, 2013.
- Lee, J.J.; Sha, F.; Breazeal, C. A Bayesian Theory of Mind Approach to Nonverbal Communication. In Proceedings of the 14th ACM/IEEE International Conference on Human-Robot Interaction, Daegu, Korea, 11–14 March 2019; pp. 487–496.
- Kang, X.; Ren, F. Understanding Blog author’s emotions with hierarchical Bayesian models. In Proceedings of the IEEE 13th International Conference on Networking, Sensing, and Control, Mexico City, Mexico, 28–30 April 2016; pp. 1–6.
- Mousas, C.; Newbury, P.; Anagnostopoulos, C.N. Evaluating the covariance matrix constraints for data-driven statistical human motion reconstruction. In Proceedings of the Spring Conference on Computer Graphics, Smolenice, Slovakia, 28–30 May 2014; pp. 99–106.
- Casella, G. An introduction to empirical Bayes data analysis. Am. Stat. 1985, 39, 83–87.
- Žilinskas, A. On the worst-case optimal multi-objective global optimization. Optim. Lett. 2013, 7, 1921–1928.
- Bruce, P.; Bruce, A.; Gedeck, P. Practical Statistics for Data Scientists; O’Reilly Media: Sebastopol, CA, USA, 2020.
- Rubin, D.B. The Bayesian Bootstrap. Ann. Stat. 1981, 9, 130–134.
- Rohekar, R.Y.; Gurwicz, Y.; Nisimov, S.; Koren, G.; Novik, G. Bayesian Structure Learning by Recursive Bootstrap. In Proceedings of the 32nd International Conference on Neural Information Processing Systems, Montréal, Canada, 3–8 December 2018; pp. 10546–10556.
Notation | Description
---|---
X | Observed data
θ | Parameter or hypothesis
P(θ) | Prior distribution of the parameter
P(X \| θ) | Likelihood distribution of X given θ
P(θ \| X) | Posterior distribution of the parameter given the observed data
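The quantities in the notation table are related through Bayes’ theorem, written here in the same notation:

$$P(\theta \mid X) = \frac{P(X \mid \theta)\,P(\theta)}{P(X)} \propto P(X \mid \theta)\,P(\theta)$$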
Learning Phase | Data | Result for Decision
---|---|---
1 | n = 100, x = 35 | 0.3529
2 | n = 100, x = 41 | 0.3812
3 | n = 100, x = 47 | 0.4106
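The three decision values above are consistent with sequential conjugate Beta–Binomial updating that starts from a uniform Beta(1, 1) prior and reuses each phase’s posterior as the next phase’s prior (36/102 ≈ 0.3529, 77/202 ≈ 0.3812, 124/302 ≈ 0.4106). The snippet below is a minimal sketch under that assumed prior, not the paper’s own code:

```python
# Sequential Beta-Binomial updating: each phase's posterior becomes the
# next phase's prior. An initial Beta(1, 1) prior is assumed here; under
# that assumption the loop reproduces the means 0.3529, 0.3812, 0.4106.

phases = [(100, 35), (100, 41), (100, 47)]  # (n, x) for each learning phase

a, b = 1.0, 1.0  # Beta(1, 1) prior
for i, (n, x) in enumerate(phases, start=1):
    a += x        # successes update the first shape parameter
    b += n - x    # failures update the second shape parameter
    print(f"Phase {i}: posterior Beta({a:.0f}, {b:.0f}), mean = {a / (a + b):.4f}")
```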
Learning Phase | Approach | Result for Decision
---|---|---
1 | Optimal | 0.3529
  | Multiple points | 0.2952, 0.3507, 0.3420, 0.3404, 0.3377
  | Interval (95%) | (0.3479, 0.3545)
2 | Optimal | 0.3812
  | Multiple points | 0.4315, 0.3960, 0.3348, 0.3966, 0.4331
  | Interval (95%) | (0.3804, 0.3851)
3 | Optimal | 0.4106
  | Multiple points | 0.3633, 0.3827, 0.4077, 0.3919, 0.3790
  | Interval (95%) | (0.4094, 0.4133)
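The “Multiple points” and interval rows summarize Bayesian bootstrap replicates in the sense of Rubin (1981): instead of resampling observations with integer counts, each replicate draws Dirichlet(1, …, 1) weights over the observations and recomputes the weighted estimate. The sketch below illustrates the mechanics on a reconstructed phase-1 sample (35 successes out of 100); the sample layout, number of replicates, random seed, and the percentile form of the interval are illustrative assumptions, so the values will differ from the table.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative reconstruction of the phase-1 data: 35 successes out of n = 100.
data = np.array([1] * 35 + [0] * 65, dtype=float)

# Bayesian bootstrap (Rubin, 1981): draw Dirichlet(1, ..., 1) weights over the
# n observations and compute the weighted mean for each replicate.
B = 5000
weights = rng.dirichlet(np.ones(len(data)), size=B)  # shape (B, n)
replicates = weights @ data                          # B weighted means

print("a few point estimates:", np.round(replicates[:5], 4))
print("95% percentile interval:", np.round(np.percentile(replicates, [2.5, 97.5]), 4))
```

The same weighting scheme carries over unchanged to the continuous sample summarized in the later tables; only the data vector changes.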
Learning Phase | Data | Result for Decision
---|---|---
1 | n = 100, x̄ = 5.2, s = 3.2 | 5.1978
2 | n = 100, x̄ = 5.7, s = 3.8 | 5.4048
3 | n = 100, x̄ = 6.3, s = 3.5 | 5.6975
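The results for the continuous case are close to what sequential conjugate updating of a normal mean produces when the sampling variance is treated as known at the reported s² and each phase’s posterior is reused as the next prior. The exact figures depend on the initial prior, which is not shown here, so the sketch below (with an assumed vague N(0, 100²) starting prior) only approximates the reported values:

```python
# Sequential updating of a normal mean with "known" variance taken as s**2.
# The initial prior N(0, 100**2) is an assumption; with it the loop gives
# roughly 5.20, 5.41, 5.70, close to (but not exactly) the reported results.

phases = [(100, 5.2, 3.2), (100, 5.7, 3.8), (100, 6.3, 3.5)]  # (n, mean, s)

mu, tau2 = 0.0, 100.0 ** 2  # assumed vague initial prior N(mu, tau2)
for i, (n, xbar, s) in enumerate(phases, start=1):
    prec = 1.0 / tau2 + n / s ** 2               # posterior precision
    mu = (mu / tau2 + n * xbar / s ** 2) / prec  # posterior mean
    tau2 = 1.0 / prec                            # posterior variance -> next prior
    print(f"Phase {i}: posterior mean = {mu:.4f}, sd = {tau2 ** 0.5:.4f}")
```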
Learning Phase | Approach | Result for Decision
---|---|---
1 | Optimal | 5.1978
  | Multiple points | 4.8567, 5.1846, 4.3943, 5.1323, 5.8597
  | Interval (95%) | (5.1790, 5.2241)
2 | Optimal | 5.4048
  | Multiple points | 5.2426, 5.3921, 5.4529, 5.4134, 5.1331
  | Interval (95%) | (5.4034, 5.4373)
3 | Optimal | 5.6975
  | Multiple points | 5.8345, 6.1208, 5.5106, 5.2896, 5.6123
  | Interval (95%) | (5.6860, 5.7129)
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).