Abstract
A natural-like artificial general intelligence (AGI) is defined as an AGI that incorporates mammalian-like mechanisms such as core use of navigation maps, spatial and temporal binding, predictive coding, lifelong learning, and innate knowledge procedures. If it also includes core mechanisms that allow full causal and analogical processing, it is considered a human-like AGI. An AGI that is not a natural-like AGI is termed an alien AGI. As an example of a natural-like AGI, we consider a largely conceptual cognitive architecture inspired by the mammalian brain (the Causal Cognitive Architecture 5); as an example of an alien AGI, we consider the large language model ChatGPT. For a simple non-numeric example, we show that the natural-like AGI solves the problem through its automatic core mechanisms, whereas the alien AGI has difficulty arriving at a solution. It may be that alien AGIs' understanding of the world is so different from a human understanding that allowing alien AGIs to take over tasks originally done by humans will eventually invite strange failures in those tasks.
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Schneider, H., Bołtuć, P. (2023). Alien Versus Natural-Like Artificial General Intelligences. In: Hammer, P., Alirezaie, M., Strannegård, C. (eds) Artificial General Intelligence. AGI 2023. Lecture Notes in Computer Science, vol 13921. Springer, Cham. https://doi.org/10.1007/978-3-031-33469-6_24