Article

From Human Past to Human Future

by Robert G. Bednarik
International Federation of Rock Art Organizations (IFRAO), P.O. Box 216, Caulfield South, VIC 3162, Australia
Humanities 2013, 2(1), 20-55; https://doi.org/10.3390/h2010020
Submission received: 22 October 2012 / Revised: 28 November 2012 / Accepted: 28 November 2012 / Published: 9 January 2013
(This article belongs to the Special Issue Humanity’s Future)

Abstract:
This paper begins with a refutation of the orthodox model of final Pleistocene human evolution, presenting an alternative, better-supported account of this crucial phase. According to this version, the transition from robust to gracile humans during that period is attributable to selective breeding rather than natural selection, rendered possible by the exponential rise of culturally guided volitional choices. The rapid human neotenization coincides with the development of numerous somatic and neural detriments and pathologies. Uniformitarian reasoning based on ontogenic homology suggests that the cognitive abilities of hominins are consistently underrated in the unstable orthodoxies of Pleistocene archaeology. A scientifically guided review establishes developmental trajectories defining recent changes in the human genome and its expressions, which then form the basis of attempts to extrapolate from them into the future. It is suggested that continuing and perhaps accelerating unfavorable genetic changes to the human species, rather than existential threats such as massive disasters, pandemics, or astrophysical events, may become the ultimate peril of humanity.

1. Introduction

Among the most profound questions humans can ask themselves is the issue of humanity’s future: what is to become of our species? For centuries this subject has been at the core of many enquiries, most of which seem to have been framed with specific predictions in mind, and these were often concerned with aspects of technology. In more recent times, especially over the course of the second half of the 20th century, rising environmental concerns prompted a gradual increase in the preoccupation with the ecological sustainability of humanity. With the early 21st century, this anxiety is taking on a new urgency, as society’s unease with humanly caused climate change and several other threats to the environment is becoming universal. Some of these issues have been investigated and modeled in countless scientific studies, and are the subject of ongoing work by numerous think-tanks around the world.
Efforts to predict the course of humanity over the longer term have been considerably less prominent, even though this would clearly be just as important. Nevertheless, there have been many such initiatives, and some of their main protagonists are contributing to this special issue of Humanities. Most of these studies deal with the effects of advances in technology and science, while the perhaps more fundamental aspects of the human future receive somewhat less attention: concerns with exponentially increasing population size; whether the human species will evolve into another; whether life on planet Earth will become extinct; or whether artificial intelligence will surpass biological intelligence. As Bostrom [1] poignantly observes, “it is relatively rare for humanity’s future to be taken seriously as a subject matter on which it is important to try to have factually correct beliefs.” He notes that, apart from climate change, “national and international security, economic development, nuclear waste disposal, biodiversity, natural resource conservation, population policy, and scientific and technological research funding are examples of policy areas that involve long time-horizons.”
But how does one acquire factually correct beliefs about the future, and is it even possible to do so? As it happens there are numerous aspects of the future, including the quite distant future, about which we can formulate perfectly plausible predictions. We can be fairly certain that two million years from now, Pioneer 10, which was launched in 1972, will pass near Aldebaran, some 69 light-years away. We can be fairly certain that in 50 million years, the northward drift of Africa and Australia will have connected them with the landmasses of Europe and South-East Asia respectively, just as California will have slid up the coast to Alaska. These and many other certainties are available to us because they can be deduced from known trajectories that, in the greater context, are not expected to change in any significant way. Numerous aspects of humanity’s future are also determined by fairly well understood uniformitarian trajectories. For instance any survey of the technological development of hominins shows that, overall, there has been an exponential increase in the complexity and sophistication of technology. From a level perhaps similar to that of today’s chimpanzees some 5 to 7 million years ago, there was only minimal change over the first half of that period. From the first appearance of formal stone implements it took another couple of million years to reach the perfectionist template of the Acheulian handaxe. A million years ago hominins mastered seafaring colonization of several islands [2] and began to produce exograms (external memory records of ideas, symbols). Although the pace of development quickened throughout, it needs to be remembered that societies remained excruciatingly conservative for most of their remaining histories. Even toward the end of the Ice Ages, a millennium’s progress seemed insignificant, yet the last millennium of human history delivered the species from the Middle Ages to the “Space Age.” Moreover, the last century, for much of mankind, involved almost as many technological changes as the previous millennium, a fair indication of the exponential nature of the process. Today this phenomenon is clear for anyone to observe from one decade to the next. Unless one were to expect a permanent abatement of this development, it would need to be assumed that technological and scientific progress will continue at a breathtaking, indeed exponentially increasing, rate. The technological future of humanity, therefore, appears to be fairly predictable, in the overall sense.
However, technology alone cannot define the future of the species; several other factors are at least as important. For instance the extinction risk and the existential risk both override the technological aspects. A number of authors have offered rather pessimistic estimates and scenarios of the probability of humanity surviving another century, or a few centuries [3,4,5,6]. Replacement of the present human species by another, obviously, would be extinction, but not an existential risk; whereas the establishment of a permanent global tyranny would be an existential disaster, although not an extinction [1]. Typical existential risks arising from natural conditions are major volcanic eruptions (leading to massive environmental degradation), pandemics, astrophysical events destructive to life, and major meteor or asteroid impact. Bostrom [7] rates the existential risks from anthropogenic causes as more important, specifically those “from present or anticipated future technological developments. Destructive uses of advanced molecular nanotechnology, designer pathogens, future nuclear arms races, high-energy physics experiments, and self-enhancing AI with an ill-conceived goal system are among the worrisome prospects that could cause the human world to end in a bang.”
Bostrom singles out those potential developments that would enable humans to alter their biology through technological means [8,9]. Controlling the biochemical processes of aging could dramatically prolong human life; drugs and neurotechnologies could be used to modify people and their behavior [10]. The development of ultraintelligence is often invoked in this context, although the prediction that it will occur in the 20th century [11] has not materialized. Less vague is the notion of “whole brain emulation” or “uploading”, which refers to the technology of transferring a human mind to a computer [12]. The idea is to create a highly detailed scan of a human brain, such as by feeding vitrified brain tissue into powerful microscopes for automatic slicing and scanning. Automatic image processing would then be applied to the scanned data to reconstruct a complete model of the brain, mapping the different types of neurons and their entire network. Next, the whole computational structure would be emulated on an appropriately powerful computer, hopefully reproducing the original “mind” qualitatively. Memory and personality intact, the mind would then exist as software on the computer, and it could inhabit a robotic machine or simply exist in virtual reality.
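To make the claimed workflow explicit, here is a deliberately schematic sketch (in Python) of the three stages just described: destructive scanning, image-based reconstruction, and emulation. Every type and function in it is hypothetical; no such scanner, reconstruction routine, or emulator exists, and the sketch merely restates the proposal of [12] in procedural form.

```python
from dataclasses import dataclass, field

@dataclass
class ConnectomeModel:
    """Hypothetical container for a reconstructed brain model."""
    neurons: list = field(default_factory=list)   # cell morphologies
    synapses: list = field(default_factory=list)  # connectivity inferred from geometry

def scan_tissue(vitrified_brain):
    """Stage 1: slice the vitrified tissue and image every section,
    yielding a stack of raw micrographs (destructive scanning)."""
    raise NotImplementedError("no such scanner exists")

def reconstruct(image_stack) -> ConnectomeModel:
    """Stage 2: segment neurons and support cells and trace their
    entire network via automatic image processing."""
    raise NotImplementedError("no such reconstruction routine exists")

def emulate(model: ConnectomeModel):
    """Stage 3: run the reconstructed network on sufficiently powerful
    hardware, hopefully reproducing the original 'mind' qualitatively."""
    raise NotImplementedError("no such emulator exists")
```

Even in this idealized form, the pipeline captures only a single static state of the brain; the objections that follow concern precisely what such a snapshot omits.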
However, as the main proponents of this idea note, “it remains unclear how much information about synaptic strength and neuromodulator type can be inferred from pure geometry at a given level of resolution” ([12], p. 40). Whether the technology of “nanodisassembly” [13,14,15,16] is feasible or not, this process would only result in a dead brain at best. Neural connections are being forged in their millions per second, and every second the brain is changing at a molecular level. Taking an instantaneous snapshot of the dead whole brain is completely nonsensical from that point of view alone, in that one would emulate only a static state. Moreover, the brain is not a separate entity; it is connected to numerous other parts of the body (e.g., retina, other sensory recorders, spinal cord, proprioceptors), being simply the main part of an entire neural system. For instance the neural circuits contained in the spinal cord can independently control numerous reflexes and central pattern generators. The brain functions as part of that body, storing myriad information about the body and its past experiences (Figure 1). To emulate this effectively one would have to determine at the molecular level the structure of every neuron and support cell in the brain, their connections, data transmission rates, etc., and replicate all of this in electronic form. The most promising approach would be to scan the target brain from birth onwards, with some implanted device, recording all brain activity of every neuron and support cell in real time, and then replay the entire experience into the artificial brain. Next one would have to embed that artificial brain in an artificial body cloned from the original. None of this inspires much confidence that such emulation will ever be achieved, or even attempted; in the end it seems like a rather pointless exercise.
Figure 1. Brain signal traffic along white matter fibers in the left hemisphere, as recorded by diffusion magnetic resonance imaging (courtesy of the MGH-UCLA Human Connectome Project).
As can be seen from this example, some of the futuristic scenarios contemplated are perhaps science fiction rather than science, while others can be regarded as soundly based, or as deserving serious consideration. In this paper the principle of realistic trajectories will be employed to flesh out some key predictions about the human future. The emphasis will be on first creating a sound empirical base defining such trajectories of the human past, from which viable considerations of the future should evolve quite naturally and logically.

2. The Human Ascent

However, this is where the difficulties begin. The principal disciplines providing information about previous states of humans and human societies, Pleistocene archaeology and paleoanthropology, are not sciences (i.e., not based on falsification and testability) and have an error-prone history. Archaeology has strenuously spurned every major innovation since the Pleistocene antiquity of humanity was rejected by it from the 1830s to the late 1850s. This includes the subsequent reports of fossil humans, of Ice Age cave art, of Homo erectus, of australopithecines, and of practically every major methodological improvement proposed ever since. It took three to four decades to accept the existence of the “Neanderthals,” of the erectines, and of the australopithecines, just as it took that long to unmask “Piltdown man.” On present indications it will take as long to establish the status of the Flores “hobbits,” one of many examples indicating that neither Pleistocene (and Pliocene) archaeology nor paleoanthropology has learned much from its long list of blunders [17]. This pattern of denial and much later grudging acceptance of any major innovation has not only continued to the present time, it has even intensified in recent decades. Faddish interpretations dominate the disciplines and distort academic perceptions of the hominin past in much the same measure as they did more than a century ago. Clearly this is not a good starting point from which to begin establishing trajectories of human development, be they cultural, technological, or cognitive. Before this is realistically possible, these disciplines need to be purged of their current falsities.
Some of the most consequential fallacies concern the model of “cultural evolution” archaeology provides. The notion of such an evolution is itself flawed, because evolution, as a biological concept, is an entirely dysteleological process; it has no ultimate purpose and it is not a development toward increased complexity. The concept of cultural evolution, however, involves the teleologically guided assumption of progress toward greater sophistication—ultimately, in the archaeological mind, resulting in that glorious crown of evolution, Homo sapiens sapiens. This fantasy (the modern human is a neotenous form of ape, susceptible to countless neuropathologies, as will be shown below) implies that archaeology is guided by a species-centric delusion of grandeur. Moreover, its definition of culture is itself erroneous, being based on invented tool types (in the Pleistocene usually of stone implements). Culture, obviously, is not defined by tools or technologies, but by cultural factors. Some of these are available from very early periods, but archaeology has categorically excluded them from delineating the cultures it perceives. Indeed, when it does consider cultural elements such as undated rock art it strenuously tries to insert them into its invented cultures based on stone tools, rather than try to create a cultural history from them. Archaeology goes even further in its obsessive taxonomization by then assuming that these imagined cultures were the work of specific human societies. So for instance certain combinations of invented tool types found in discrete layers of sediments are called the “Aurignacian culture,” and this imaginary culture is seen as the signature of a people called the “Aurignacians.” Although archaeologists lack any significant knowledge of who these imaginary Aurignacians were [18,19,20], they regard these as real, identifiable entities, when in fact there is not one iota of evidence that all the people that produced the tools in question were in any way related, be it ethnically, linguistically, genetically, politically, or even culturally.
This is a fair indication of the misinformation Pleistocene archaeology has inflicted on modern society, and it is greatly attributable to the complete lack of internal falsifiability of the discipline. Many other examples could be cited, but one that is of particular relevance in the context of properly understanding the human past relates to a major archaeological fad of recent decades. The replacement hypothesis, termed the “African Eve” model by the media, derives from an academic fraud begun in the 1970s [21,22,23,24,25,26], which by the late 1980s suddenly gained almost universal acceptance and has since been the de facto dogma of the discipline, especially in the Anglo-American sphere of influence [27,28,29,30,31,32]. This unlikely hypothesis proposes that all extant humans derive from a small population—indeed, from one single female—at an unspecified location in sub-Saharan Africa, a population that miraculously became unable to interbreed with all other hominins. Its members then expanded across Africa and into the Middle East, colonizing all of Eurasia and wiping out all other people in their wake. Reaching South-East Asia, they promptly invented seafaring to sail for Australia.
This origins myth is contradicted by so much empirical evidence that one must question the academic competence of its protagonists. Based originally on false claims about numerous fossils and their ages, it then resorted to genetics, misusing numerous genetic findings to underpin its demographically and archaeologically naive reasoning. The computer modeling of Cann et al. [27] was botched, and its haplotype trees were fantasies that could not be provided with time depth even if they were real. Based on 136 extant mitochondrial DNA samples, it arbitrarily selected one of 10^267 alternative and equally credible haplotype trees (very many more than the number of elementary particles in the entire universe, about 10^70). Maddison [33] then demonstrated that a re-analysis of the Cann et al. model could produce 10,000 haplotype trees that were actually more parsimonious than the single one chosen by these authors. Yet no method can even guarantee that the most parsimonious tree is the correct one [34]. Cann et al. had also mis-estimated the diversity per nucleotide (single locus on a string of DNA), incorrectly using the method developed by Ewens [35] and thereby falsely claiming greater genetic diversity of Africans, compared to Asians and Europeans (they are in fact very similar: 0.0046 for both Africans and Asians, and 0.0044 for Europeans). Even the premise that greater genetic diversity indicates ancestral status is false: diversity is greater in African farming people than in African hunter-foragers [36], yet the latter are not assumed to be descended from the former (see e.g., [37]). Cann et al.’s assumption of exclusive maternal transference of mitochondria was also false, and the constancy of mutation rates of mtDNA was similarly a myth [38,39]. As Gibbons [40] noted, by using the modified putative genetic clock, Eve would not have lived 200,000 years ago, as Cann et al. had claimed, but only 6,000 years ago. The various genetic hypotheses about the origins of “Moderns” that have appeared over the past few decades placed the hypothetical split between these and other humans at times ranging from 17,000 to 889,000 years BP. They are all contingent upon purported models of human demography, but these and the timing or number of colonization events are practically fictional: there are no sound data available for most of these variables. This applies to the contentions concerning mitochondrial DNA (“African Eve”) as much as to those citing Y-chromosomes (“African Adam” [41]). The divergence times projected from the diversity found in nuclear DNA, mtDNA, and DNA on the non-recombining part of the Y-chromosome differ so much that a time regression of any type is extremely problematic. Contamination of mtDNA with paternal DNA has been demonstrated in extant species [42,43,44,45], in one recorded case amounting to 90% [46]. Interestingly, when this same “genetic clock” is applied to the dog and implies that its split from the wolf occurred 135,000 years ago, archaeologists reject it because there is no paleontological evidence for dogs prior to about 15,000 years ago ([47], but see [48]). The issues of base substitution [49] and fragmentation of DNA [50] have long been known, and the point is demonstrated, for instance, by the erroneous results obtained from the DNA of insects embedded in amber [51].
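The scale of the arbitrariness in the tree selection mentioned above is easily checked. Assuming the cited figure refers to unrooted bifurcating trees, the number of possible trees for n taxa is (2n − 5)!!, and the following minimal Python sketch (only the sample count of 136 comes from the study; everything else is illustrative) reproduces the order of magnitude:

```python
import math

def count_unrooted_trees(n_taxa: int) -> int:
    """Number of distinct unrooted bifurcating trees for n labeled
    taxa: (2n - 5)!! = 3 * 5 * ... * (2n - 5), for n_taxa >= 3."""
    trees = 1
    for k in range(3, 2 * n_taxa - 4, 2):
        trees *= k
    return trees

trees = count_unrooted_trees(136)  # the 136 mtDNA samples of Cann et al.
print(f"possible haplotype trees: ~10^{math.log10(trees):.0f}")  # ~10^267
```

Selecting one tree out of roughly 10^267 candidates, without any criterion that guarantees correctness, is precisely the point of Maddison’s critique.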
Other problems with interpreting or conducting analyses of paleogenetic materials are alterations or distortions through the adsorption of DNA by a mineral matrix, its chemical rearrangement, degradation by microbial or lysosomal enzymes, and lesions caused by free radicals and oxidation [52,53].
Since 1987 the genetic distances in nuclear DNA (the distances created by allele frequencies) proposed by different researchers or research teams have produced conflicting results [54,55,56,57], and some geneticists concede that the models rest on untested assumptions; others even oppose them [31,55,57,58,59,60,61,62,63]. The key claim of the replacement theory (the “Eve” model), that the “Neanderthals” were genetically so different from the “Moderns” that the two were separate species, has been under severe strain since Gutierrez et al. [64] demonstrated that the pairwise genetic distance distributions of the two human groups overlap more than claimed, once the high substitution-rate variation observed in the mitochondrial D-loop region [65,66,67] and the absence of parameter estimates for the nucleotide substitution model are taken into account. The more reliable genetic studies of living humans have shown that both Europeans and Africans have retained significant alleles from multiple populations of Robusts ([68,69]; cf. [63]). After the Neanderthal genome yielded results that seemed to include an excess of Gracile single nucleotide polymorphisms [70], more recent analyses confirmed that “Neanderthal” genes persist in recent Europeans, Asians, and even Papuans [71]. “Neanderthals” are said to have interbred with the ancestors of Europeans and Asians, but not with those of Africans ([72]; cf. [73]). The African alleles occur at a frequency averaging only 13% in non-Africans, whereas those of other regions match the Neanderthaloids in ten of twelve cases. “Neanderthal genetic difference to [modern] humans must therefore be interpreted within the context of human diversity” ([70], p. 334). This suggests that gracile Europeans and Asians evolved largely from local robust populations, and the replacement model has thus been decisively refuted. While this may surprise those who subscribed to Protsch’s “African hoax,” it had long been obvious from previously available evidence. For instance Alan Mann’s finding that tooth enamel cellular traits show a close link between Neanderthaloids and present Europeans, both of which differ from those of Africans [74], had been ignored by the Eve protagonists, as has much other empirical evidence (e.g., [75,76]). In response to the initial refutations of the Eve model, Cann [77] made no attempt to argue against the alternative proposals of long-term, multiregional evolution.
But faulty genetics are only one aspect of the significant shortcomings of the replacement model; it also lacks any supporting archaeological, paleoanthropological, technological or cultural evidence [17,18,20,78,79,80,81,82]. Nothing suggests that Upper Paleolithic culture or technology originated in sub-Saharan Africa, or that such traditions moved north through Africa into Eurasia. The early traditions of Mode 4 (“Upper Paleolithic”) technocomplexes evolved in all cases in situ, and the Graciles of Australia, Asia, and Europe emerged locally from Robusts, as they did in Africa. By the end of the Middle Pleistocene, 135,000 years ago, all habitable regions of the Old World continents can be safely assumed to have been occupied by hominins. At that time, even extremely inhospitable parts, such as the Arctic [83,84,85], were inhabited by highly adapted Robusts. Therefore the notion that African immigrants from the tropics could have displaced these populations with identical technologies is demographically absurd. Wherever robust and gracile populations coexisted, from the Iberian Peninsula to Australia, they shared technologies, cultures, even ornaments. Moreover, the established resident populations in many climatic regions would have genetically swamped any intrusive population bringing with it a much smaller number of adaptive alleles. Introgressive hybridization [86], allele drift based on generational mating site distance [87], and genetic drift [88] through episodic genetic isolation during climatically unfavorable events (e.g., the Campanian Ignimbrite event or Heinrich Event 4 [89,90,91,92]) account for the mosaic of hominin forms found.
Mode 4 technocomplexes [93] first appear across Eurasia between 45,000 and 40,000 years ago, perhaps even earlier [94], at which time they existed neither in Africa nor in Australia. In fact, right across northern Africa, Mode 3 traditions continued for more than twenty millennia, which renders it rather difficult to explain how Eve’s progeny managed to cross this zone without leaving a trace. None of the many tool traditions of the early Mode 4 which archaeologists have “identified” across Europe has any precursors to the south [17]. Some of these “cultures” have provided skeletal human remains of Robusts, including “Neanderthals” [32,95,96,97,98], but there are no unambiguous associations between “anatomically modern human” remains (Graciles) and “early Upper Paleolithic” assemblages [99]. This is another massive blow to the replacement proponents, who, having fallen victim to Protsch’s hoax, relied on the unassailability of their belief that some of these traditions, especially the Aurignacian, were the work of Graciles. Moreover, these “cultures,” as they are called, are merely etic constructs, “observer-relative or institutional facts” [100]; as “archaeofacts” they have no real, emic existence, as noted above. They are entirely made up of invented (etic) tool types and based on the fundamental misunderstanding of Pleistocene archaeology that tools are diagnostic for identifying cultures.
Pleistocene archaeology as conducted is thus incapable of providing a cultural history, as it relegates the cultural information available (such as rock art) to marginal rather than central status, forcing it into the false technological framework it has created. Instead of beginning with a chronological skeleton of paleoart traditions and then placing tool assemblages into it, invented tool types forming invented cultures of invented ethnic and even genetic groups form the temporal backbone of the academic narrative. The result is a collection of origins myths for human groups, nations, and for “modern humans” generally. It is of little relevance to considering the human ascent, which calls for a far more credible approach. This applies most specifically if the purpose is to establish various trajectories of human development in order to extrapolate from them to the future.
The most central issue is clearly the origin of human modernity [82]: at what point in hominin history can it be detected with reasonable reliability, and how did it come about? The replacement model offers a simplistic answer: modern human behavior was introduced together with the advent of “anatomically modern” humans, which in Europe is thought to date from between 40,000 and 30,000 years ago. Unfortunately the issue is not as simple as that. First of all, no humans of the past were either anatomically or cognitively fully modern [101]. Archaeological notions of modernity, expressed for instance in the semi-naturalistic paleoart first appearing with the “Aurignacian,” are therefore false, irrespective of how modernity is defined. Conflating the literate minds of modern Westerners with the oral minds that inhabited the human past, which “cognitive archaeology” does without realizing it, is the result of one of the many epistemological impairments of orthodox archaeology. Even people of the Middle Ages existed in realities profoundly different from those experienced today, as do many extant peoples. For instance the general introduction of writing in recent centuries has dramatically changed the brain of adult humans: although they start out as infants with brains similar to those of non-literate peoples, these brains are gradually reorganized as demanded by the thinking implicit in literacy, which is totally different from the thought patterns found in oral societies [102]. The use of all symbol systems (be they computer languages, conventions for diagrams, or styles of painting) influences perception and thought [103]. Cultural activity modifies the chemistry and structure of the brain by affecting the flow of neurotransmitters and hormones [104] and the quantity of gray matter [105,106,107].
“Modern behavior” does not refer to the behavior of modern Westerners, or to that of any other extant human group. It is defined by the state of the neural structures that are involved in moderating behavioral patterns, which ultimately are determined by inhibitory and excitatory stimuli in the brain. So the question is, what types of evidence would suggest that these neural structures had been established, and at what time does such evidence first occur? When archaeologists refer to human modernity they tend to cite a list of cultural or technological variables, such as projectile weapons, blade tools, bone artifacts, hafting (composite artifacts), “elaborate” fire use, clothing, exploitation of marine resources and large game, even seasonality in the exploitation of resources. This implies that they are not adequately informed about the archaeological evidence, because these and other forms of proof cited in this context are available from much earlier periods, including the “Lower Paleolithic.” Similarly, paleoart of various forms can be traced back to the same period [18,79,81,108]; in fact most of the surviving rock art of the Pleistocene is not even of the “Upper Paleolithic,” but of “Middle Paleolithic” (Mode 3) industries [109]. Furthermore, if the cave art and portable paleoart of the “Aurignacian” and other early “Upper Paleolithic” traditions are the work of “Neanderthals” or intermediate humans, as the available record tends to suggest [19], the entire case for conflating perceived anatomical modernity with purported cognitive modernity collapses.
Preferred indicators that the human brain had acquired the kind of structures that underwrite behavior and cognition today are exograms, signs that hominins were capable of storing symbolic information outside their brains. Middle Pleistocene examples of exograms have been classified into beads, petroglyphs, portable engravings, proto-sculptures, pigments, and manuports [79,108]. Most of these classes of evidence offer no utilitarian explanations whatsoever. They first appear in the archaeological record with Mode 1 and Mode 2 technocomplexes (handaxe-free and handaxe Lower Paleolithic tool traditions), i.e., at about the same time as the first evidence of pelagic crossings, up to one million years ago [2]. Seafaring colonization, according to the data currently available, began in what is today Indonesia, where the presence of thriving human populations during Middle Pleistocene times has so far been demonstrated on Flores, Timor, and Roti. None of these and other Wallacean islands has ever been connected to either the Asian or the Australian landmass. Viable island populations can only be established by deliberate travel of significant numbers of people providing an adequate gene pool, and none of the sea narrows can be crossed without propellant power. Not only did this necessitate the use of watercraft, almost certainly in the form of bamboo rafts; such missions also demanded a certain level of forward planning and the use of effective communication. Although communication is possible by various means, it seems unlikely that maritime colonization is possible without an appropriate form of “reflective” language [2,110].
This provides an important anchor point for a realistic timeline of growing human competence in volitionally driven behavior, one of the quintessential aspects of humanness. Modernity in human behavior had begun, in the sense that the same neural structures and processes that determine this quality today were essentially in place. If this occurred around a million years ago, as appears to be the case, then the archaeological beliefs are hopelessly mired in falsehoods. If, on the other hand, only fully modern behavior qualifies for modernity, then it arose only in recent centuries, and it does not apply to illiterates or to extant traditional societies, or even to non-Western societies. It then becomes such a narrow definition that it is useless as a marker of human development. Either way orthodox archaeology got it completely wrong.
Clearly, then, the current paradigm of Pleistocene archaeology is not a basis on which to expound predictions of the kind to be attempted here. For this it is essential to begin by first purging the record of elements introduced to support dogmas, at the same time considering empirical evidence that has been neglected. Some of the underlying features of hominin evolution have not been adequately factored into the narratives of mainstream archaeology. For instance hominins have been subjected to relentless encephalization for the entire duration of their existence. This astonishing growth of one organ, unprecedented on this planet, has come at massive costs for the species. Not only is the brain the most energy-hungry part of the body, its enlargement has rendered it essential that infants be born at an increasingly earlier fetal stage. They remain helpless for several years, needing to be carried and reducing the mothers’ reproduction rates dramatically. The large brain of the human fetus is a huge cost to the mother, the band, and the species as a whole. And yet archaeologists believe that for 98% of their existence, humans barely used these brains in any meaningful way. In a biological sense this is most unlikely, and it is not the only flawed conviction in Pleistocene archaeology. For instance archaeologists and paleoanthropologists have invented numerous explanations for why hominins adopted upright walk, waxing lyrical about the benefits of this adaptation. They never asked the obvious: if it is so advantageous, why did other primates fail to adopt it? The same, of course, applies to encephalization or any other development endemic to humans. Similarly, it has not been asked why it was essential for human fetuses to have such large brains; why could the brain not have grown so large after birth, as so many other organs or body parts do in countless species? The answer, apparently, is that the brain has a limited capacity to expand after birth.
If the enlargement of the brain through evolutionary time is taken as an approximate indicator of the cognitive and intellectual complexity of the brain, which is a realistic working hypothesis, all of archaeology’s beliefs about non-somatic development become redundant. Given that natural selection can only select expressed characteristics, not latent ones, the indices Pleistocene archaeology fields in its speculations about behavior, cognition, or even technology are inevitably flawed. For instance, there is no reason to exclude from consideration the notion that upright walk was not selected for, but was imposed on human ancestors. This possibility was never even contemplated, and yet it is obvious that hominins have undergone massive neotenization, and that the human foot closely resembles that of a fetal chimpanzee [17,20,111,112]. Therefore it would be more logical to assume that upright walk was facilitated by somatic changes to the foot and became an “expressed characteristic” available for selection. Indeed, the obvious influence neotenization has had on the development of hominins [99] is never considered by the gatekeepers of the human past. And yet it is among the most important factors of humanization, second only to encephalization.
And then there is the strange fact that, after up to three million years of encephalization, the process suddenly ceased, and the human brain began to shrink much more rapidly than it had previously expanded. Archaeology and paleoanthropology offer no explanations for this sudden reversal of a process that had become the hallmark of human evolution. In fact it is largely ignored by these fields, and the same applies to the other significant reductions in human fitness. As the brain shrank by about 13% in the course of the last few tens of millennia, skeletal robusticity also declined rapidly, especially that of the cranium, the mastoid processes, and the skull generally. Not only that, sheer physical strength, as indicated by muscle attachments, waned profoundly. And all that has been offered as an explanation is a story about a new species from Africa that was so superior in every way that it outcompeted or exterminated all other humans across the globe.
Similarly, the most dramatic neurological developments in hominin history have been completely ignored by both archaeology and paleoanthropology, which has resulted in an unbridgeable chasm between these gatekeepers and the sciences. For instance the greatest conundrum of neuroscience is its inability to explain why natural selection has not suppressed the thousands of neuropathologies, genetic disorders, and neurodegenerative conditions afflicting modern humans (the paradox of Keller and Miller [112]). Within the replacement hoax, which explains the change from Robusts to Graciles by a combination of natural selection and genetic drift [88], this paradox is indeed unsolvable. That should already have prompted caution, because the sciences are inevitably much better equipped to deal with these issues. Unfortunately, in matters of human evolution, the sciences have relied too much on the myths of the gatekeepers of the human past, but once this paradigmatic obstacle is overcome the uptake of the alternative explanation should swiftly induce a better scientific understanding of the human ascent.

3. Archaeology vs. Science

The rough outline of such an improved understanding is easily sketched out. Hominins evolved initially during the Pliocene period, which lasted from 5.2 to 1.7 million years (Ma) ago. Earlier contenders for human ancestry, such as Sahelanthropus tchadensis (7 Ma) and Orrorin tugenensis (6 Ma), are known, but it is the Pliocene’s australopithecines that are more plausibly thought to include a human ancestor. The position of Ardipithecus ramidus and Ardipithecus kadabba (4.4 Ma) remains controversial. The gracile australopithecines (4.2 Ma to 2.0 Ma) were certainly bipedal, and at least some of them (Australopithecus garhi) used stone tools to butcher animal remains. Bearing in mind the common use of tools by modern chimpanzees [112,113] and Brazilian capuchin monkeys, it is realistic to credit the australopithecines and Kenyanthropus platyops (3.5 to 3.3 Ma) with better tool-making skills than these extant primates. But what level of cognitive ability might be attributable to them?
Homology can provide some preliminary indications from reviewing ontogenic development. It is roughly at the age of forty months that the human child surpasses the ToM (theory of mind) level of the great apes. Thus the executive control over cognition unique to humans, together with metarepresentation and recursion, would be expected to have developed during the last 5 or 6 million years. Although the brain areas accounting for the latter two faculties remain unidentified, executive control resides in the frontal lobes. Since the frontal and temporal areas have experienced the greatest degree of enlargement in humans [114,115], uniquely human abilities would most likely be found there, although inter-connectivity rather than discrete loci may be the main driving force of cognitive evolution. It is precisely the expansion of association cortices that has made the human brain disproportionately large. Intentional behavior can be detected by infants 5–9 months old [116], while at 15 months infants can classify actions according to their goals [117]. The same abilities are available to chimpanzees and orang-utans [118], but apparently not to monkeys [119]. Between 18 and 24 months, the child establishes joint attention [120], engages in pretend-play, and develops an ability to understand desires [121,122,123]. Again, apes use gaze monitoring to detect joint attention [124], but monkeys apparently do not. It is with the appearance of “metarepresentation,” the ability to explicitly represent representations as representations [125,126,127], and with recursion that fully developed human ToM emerges, as these faculties are lacking in the great apes [128,129]. Similarly, the apes have so far provided no evidence of episodic memory or future planning [130]. Episodic memory, which is identified with autonoetic consciousness, can be impaired in humans, e.g., in amnesia, Asperger’s syndrome, or in older adults [131]. It can be attributed to differential activity in the medial prefrontal and medial parietal cortices, as imaging studies of episodic retrieval have shown [132].
Theory of mind defines the ability of any animal to attribute mental states to itself and others, and to understand that conspecifics have beliefs, desires, and intentions that may be different from one’s own [17,133,134,135,136,137,138,139,140,141]. Each organism can only prove the existence of its own “mind” through introspection, and has no direct access to others’ “minds.” Although ToM is present in numerous species, at greatly differing levels, it has perhaps attracted most attention in the study of two groups, children and great apes, and the levels at which they conceive of mental activity in others, attribute intentions, and predict the behavior of others [118]. It is thought to be largely the observation of behavior that can prompt a ToM. The discovery of mirror neurons in macaques [142,143] has provided much impetus to the exploration of how a ToM is formed [144,145]. Mirror neurons are activated both when specific actions are executed and when identical actions are observed, providing a neural mechanism for the common coding between perception and action (but see [146]). One of the competing models to explain ToM, simulation theory [147,148,149], is said to derive much support from the mirror neurons, although it precedes their discovery by a decade. These neurons are seen as the mechanism by which individuals simulate others in order to better understand them. However, mirror neurons have so far not been shown to produce actual behavior [150]. Motor command neurons in the prefrontal complex send out signals that orchestrate body movements, but some of them, the mirror neurons, also fire when the individual merely watches another individual (not necessarily a conspecific) perform a similar act. It appears that the visual input prompts a “virtual reality” simulation of the other individual’s actions. However, ToM and “simulation,” though related, may have different phylogenic histories [151,152]. Ramachandran [153] has speculated about the roles of mirror neurons in cognitive evolution [154], in empathy, imitation [155], and language acquisition [156].
Since it is safe to assume that relatively advanced forms of ToM were available to all hominins and hominids, it may be preferable to turn to consciousness and self-awareness for a better resolution in defining their cognitive status. Consciousness focuses attention on the organism’s environment, merely processing incoming external stimuli [157,158], whereas self-awareness focuses on the self, processing both private and “public” information about selfhood. The capacity of being the object of one’s own attention defines self-awareness, in which the individual is a reflective observer of its internal milieu and experiences its own mental events [159,160,161]. What is regarded as the “self” is inherently a social construct [162], shaped by the individual’s culture and immediate conspecifics [163]. But the self is not the same as consciousness [164], as shown by the observation that many attributes seen as inherent in the self are not available to conscious scrutiny. People construct the neurological computation of the boundaries of personhood from their own behavior and from the narratives they form, which also determine their future behavior. Thus it needs to be established how the chain of events from sensory input unfolds, and how behavior is initiated, controlled, and produced [165,166,167,168]. It appears that the subcortical white matter, brainstem, and thalamus are implicated in consciousness [169,170], although the thalamus is not believed to drive consciousness. Ultimately consciousness is self-referential awareness, the self’s sense of its own existence, which may explain why its etiology remains unsolved [171].
Self-awareness, ToM, and episodic memory derive from the default mode network, which is considered to be a functionally homogeneous system [172]. Conscious self-awareness, the sentience of one’s own knowledge, attitudes, opinions, and existence, is even less understood and accounted for ontologically than ToM. It may derive primarily from a neural network of the prefrontal, posterior temporal, and inferior parietal cortices of the right hemisphere ([173,174,175]; but see critiques in [176,177,178]). In humans, a diminished state of self-awareness occurs for instance in dementia, in sleep, or when focusing upon strong stimuli [179]. Basic awareness of the self is determined by the mirror test ([180,181,182,183,184,185,186]; but see [187,188,189] for critical reviews). Some of the great apes, elephants, and bottlenose dolphins are among the species that have passed the mirror test, and interestingly these are much the same species shown to possess von Economo neurons [190,191,192], which seem to be limited to large mammals with sophisticated social systems [17]. As in ToM, various levels apply, probably correlated with the social complexity of the species concerned. It is difficult to see how such social systems could have developed much beyond those of social insects without some level of self-awareness. Self-awareness is the result of the interplay of numerous variables, ranging from the proprioceptors to distal-type bimodal neurons (moderating anticipation and execution [193]), and the engagement of several brain regions.
Like ToM, self-awareness can safely be attributed to all hominoids and hominins, and there is a reasonable expectation that it became progressively more established with time. Archaeologically, self-awareness can be demonstrated by body adornment, but unfortunately most forms of such behavior (body painting, tattoos, cicatrices) leave no archaeological evidence. Beads and pendants, however, may in rare circumstances survive taphonomy. An incipient form of body decoration may have been observed by McGrew and Marchant ([194]; cf. [195,196]) in 1996. Following the killing and eating of a red colobus monkey by a group of chimpanzees, a young adult female was observed wearing a strip of its skin draped around her neck, tied in a single overhand knot as if forming a simple necklace. Perhaps the young adult female sought to enhance her appearance and/or status by adorning herself with the remains of a highly valued kill; or perhaps it was just a mistaken observation or an accidental occurrence. However, given the much enhanced self-awareness of the early hominins, more pronounced cultural expressions of such behavior should be expected in the late Pliocene. Yet the first beads in the archaeological record appear only during the Middle Pleistocene [79,108,197,198,199], and even these are met with dogmatic rejection by most archaeologists. From a biological perspective it is rather surprising that such artifacts appear so late in the available record (during the Middle Acheulian technological traditions [198]). As in so many other issues, biological, empirical, and scientific perceptions clash irreconcilably with the unstable orthodoxies of Pleistocene archaeology [2,17,18,19,20,79,80,81,82,99].
The first report of early beads dates from the initial demonstration that humans lived in the Pleistocene [200], which was categorically rejected by archaeology for decades. These beads remained almost ignored for over 150 years [198], until some 325 Cretaceous fossils of Porosphaera globularis collected at Acheulian sites in France and England were shown to have been modified. Many of them show extensive wear facets where they rubbed against other beads while worn on a string for long periods of time. Other Lower Paleolithic beads had in the meantime been recovered from Repolust Cave, Austria [201]; from Gesher Ya’aqov, Israel [202]; and from El Greifa site E, Libya [203,204]. The latter case involves more than 40 disc beads carefully crafted from ostrich eggshell, and yet advocates of the replacement hoax have sought to discredit the case of Acheulian beads [205,206]. For instance they claimed that a perforated wolf’s tooth must be the result of animal chewing, but omitted to clarify why similarly perforated teeth of the Upper Paleolithic were not caused by the same process. These double standards are routinely applied; for instance any tubular bone fragment with regularly spaced, circular holes from an Upper Paleolithic deposit is inevitably presented as a flute, but when an identical object (one with a compass of two and a half octaves, extending to over three octaves by over-blowing) is recovered from a final Mousterian layer [207,208], it is explained away as the result of carnivore chewing [209]. The purpose is simply to preserve the dogma, according to which the Mousterian “Neanderthals” were primitive and incapable of such cultural sophistication. Archaeologists fail to appreciate that such finds as beads should be expected from hominins that must be assumed to have developed an advanced level of self-awareness, that most beads of the Pleistocene would have been made from perishable materials, and that even most of those that were not have either not survived or not been recovered. Recalling that the magnificent cave art of the Aurignacian [210] and the delicate carvings of Vogelherd and other Swabian sites [211,212] must be the work of these same final “Neanderthals,” the full absurdity of regressive opinions such as those concerning the Mousterian flute becomes apparent.
Beads and cupules (hemispherical cup marks on rock), which are among the exograms that managed to survive from the Middle Pleistocene, derive their only significance from their cultural context. They are social constructions, which, as Plotkin [213] reminds us, means that they cannot exist in a single mind. These invented qualities—for that is what they represent—have been turned into reality by culture. A string of beads around the neck of a person may signify certain “memes” to people sharing the wearer’s social construct or culture; it may well evoke a quite different response in people not sharing that reality; but it means absolutely nothing to any other animal. However, the symbolic roles of beads and pendants extend beyond the semiotic message encoded by cultural convention; they also present a “readable” message. They are even proof of a quest for perfection, and in fact their excellence of execution is part of their message; they are aspects of a “costly display” strategy [214,215,216].
Another benchmark in establishing the cognitive complexity of hominins, also completely ignored by mainstream archaeology, is maritime colonization, even though it demonstrates a whole raft of human capacities. The gracile australopithecines apparently evolved into robust forms, now subsumed under the genus Paranthropus. P. robustus is credited with using both “advanced” tools and fire, which interestingly has been explained away as evidence of imitation of human behavior, an unlikely and awkward explanation encountered elsewhere in hominin history. The two other Paranthropus “species” currently distinguished are P. boisei (2.3–1.4 Ma) and P. aethiopicus (2.7–2.3 Ma). They co-existed with fully human species, Homo habilis (2.3–1.6 Ma), H. rudolfensis (2.5–1.9 Ma), and H. ergaster (1.8–1.3 Ma). The latter and H. erectus were the first to develop templates for stone implements, especially handaxes, and they colonized much of Asia. By around one million years ago, a coastal H. erectus population in Java, which together with Bali was then part of the Asian continent, had developed seafaring ability to the point of being able to colonize Lombok from Bali, eventually expanding further eastward into the islands of Wallacea. Via Sumbawa they reached Flores [217,218], establishing a thriving colony there by 840,000 years ago at the latest, eventually crossing Ombai Strait to also occupy Timor and Roti [2,110]. During the Middle Pleistocene, at least two or three Mediterranean islands were also reached by hominins: Sardinia [110,219], Crete [2,220,221,222,223], and presumably Corsica. It is also possible that Europe was first occupied via the Strait of Gibraltar rather than from the east [224].
Maritime colonization involves a number of prerequisites. The technological minimum requirements have been established through a series of experiments begun in 1997 and are now well understood [2,17,80,110,218,224,225]. More important here are the cognitive and cultural criteria essential for these quests. Reflective language capable of displacement [226] can realistically be assumed to have been in use at that stage, as can the effective use of language to convey abstract concepts and refer to future conditions. The successful operation of a collective consciousness (sensu Émile Durkheim) can be assumed, as well as an advanced ToM. To be successful (i.e., to result in archaeologically detectable evidence), seafaring colonization also demands a certain minimum level of social organization and the capacity for long-term forward planning. Just as the author’s First Mariners project has determined minimal technological conditions for successful crossings of the sea by Lower Paleolithic means, the minimum cognitive or linguistic requirements for such quests could also be established within a falsifiable format. In contrast to the déformation professionnelle governing archaeological scenarios, this exercise is strictly scientific and readily testable.
Many other examples show that scientific approaches to the course of the human ascent are possible, and all of them seem to lead in the same direction: this ascent was gradual, over the course of several million years. Communication ability, social complexity, development of ToM and self-awareness, and cultural sophistication can all safely be assumed to have been functions of the gradual growth of the frontal, temporal, and parietal areas of the brain [115]. The apparent reason for the extreme position of Pleistocene archaeologists, believing in a sudden explosion of all these and other developments at the very end of the time span involved, is perhaps a subliminal conviction that the greatest possible intellectual, cognitive, and cultural distance should be maintained between “us” and those savage ancestors, because it is that distance that seemingly justifies the existence of archaeology as well as certain religious beliefs. There is simply no better explanation for this fervent belief in the primitiveness of all humans up to the final Pleistocene, or for the vehement rejections of each and every reasonable contention contradicting that belief. A second reason is that most archaeologists still do not understand taphonomic logic, which decrees that there must be a “taphonomic threshold” for every phenomenon category in archaeology, and that it must always be later (in most cases very much later) than the first appearance of the category in question [227]. This impediment to understanding archaeological evidence, which increases in severity linearly with the age of the evidence, remains the single most effective reason for misinterpreting archaeology. There even seems to be a culture within Pleistocene archaeology of deliberately restraining excessive ideas about hominin advancement, almost as if to leave adequate scope for future generations of researchers to progressively offer sensational revelations. This extreme conservatism or obscurantism is the antithesis of the realistic search for a balanced representation of what might really have happened in the human past that is advocated here.
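Taphonomic logic lends itself to a simple illustration. The following minimal Monte Carlo sketch in Python (every parameter, such as the production rate, survival half-life, and duration, is an invented placeholder rather than an empirical value) shows why the oldest surviving trace of a practice must be vastly younger than the practice itself:

```python
import random

def oldest_surviving_trace(duration=1_000_000, n_produced=100_000,
                           half_life=20_000, seed=42):
    """Artifacts are produced uniformly over `duration` years; each one
    survives to the present with probability 0.5 ** (age / half_life).
    The age of the oldest survivor is the apparent 'first appearance'
    of the category, i.e., its taphonomic threshold."""
    rng = random.Random(seed)
    survivors = []
    for _ in range(n_produced):
        age = rng.uniform(0.0, duration)
        if rng.random() < 0.5 ** (age / half_life):
            survivors.append(age)
    return max(survivors)

print("practice began:         1,000,000 years ago")
print(f"oldest surviving trace: {oldest_surviving_trace():,.0f} years ago")
```

With these illustrative numbers the oldest surviving artifact is typically on the order of a quarter of a million years old, even though the practice is four times as old; reading the threshold as the origin understates antiquity in exactly the way described above.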

4. Reviewing Trajectories

The issue is really quite simple. At the time the hominin lineage split from that resulting in the great apes, the human ancestor can safely be assumed to have possessed cognitive and intellectual abilities matching theirs—roughly those of a child of almost forty months. Since then the expansion of the frontal insular cortex, dorsolateral prefrontal cortex, anterior cingulate cortex, temporal and parietal lobes, limbic system, and basal ganglia accounts for the uniquely developed hominin cognition and intellect, which must be assumed to have increased at roughly the same rate as encephalization occurred; otherwise they would not have been selected for. Ontogenic homology provides a rough guide for the development to be expected; for instance metarepresentation, recursion, and basic language skills need to be attributed to the first representatives of Homo. By the time Homo erectus appeared, the means and volition of colonizing cold climate regions (skilled fire use and probably the use of animal furs) had become available. One million years ago, the complex means of seafaring colonization had become accessible, and culture had developed to such intricacy that the use of exograms to store memory externally became established. The structures and functions of the human brain had become essentially modern. Any archaeological explanation that cannot accommodate these fundamentals is obliged to provide clearly stated justifications, because it is at odds with a credible predictive outline provided by the sciences, including paleophysiology and linguistics [226,228,229,230,231]. Absence of perceived archaeological evidence of any phenomenon category is not evidence of absence; it is simply a reflection of a contingent state of an embryonic discipline—an “underdeveloped discipline” [232]. Pleistocene archaeology is in no position to attempt definitive determination of any past human states and to disallow any contradictory evidence just because it conflicts with its entirely provisional notions of the past, which are merely unstable orthodoxies. The discipline’s history demonstrates this amply. Moreover, it possesses no knowledge whatsoever about the no doubt more sedentary and more developed societies occupying the world’s coastal regions, because the vast sea level fluctuations of the Pleistocene have obliterated all traces of them. Therefore the premature narratives of archaeological explanations refer purely to the more mobile hunters of the hinterland that followed the herds, while the achievements of the Lower Paleolithic seafarers seem inconceivable to this mode of thinking.
Not only does the expansion of the brain areas just listed account for human cognition and intellect; brain illnesses also predominantly affect these very same areas [17,20,233]. The phenomenal rise of these and thousands of other genetically based disorders and syndromes that has accompanied the most recent development of humans can be explained by the ability of final Pleistocene and Holocene people to suppress natural selection by replacing it with sexual selection and domestication [99]. Thus the future of humanity can no longer be determined by the traditional processes of nature. The effects of the process of human domestication first become evident in the archaeological record between 40,000 and 20,000 years ago, with a distinctive acceleration of the formerly gradual neotenization, expressed as the rapid gracilization of robust populations on four continents [17,19,20,83,99]. This gracilization has continued to the present time: 10,000 years ago humans were significantly more robust than today (as expressed by numerous indices), 20,000 years ago they were twice as robust, and so on. This is the first of a number of trajectories of human development that, given the right data, can be used to extrapolate into the future.
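To make concrete what such an extrapolation involves, the gracilization figures just cited can be recast as a toy model. The following sketch is purely illustrative and not part of any cited study: the index values are assumptions (present robusticity normalized to 1.0, the value for 20,000 years ago set to 2.0 as per the text, and 1.5 interpolated for 10,000 years ago), and the linear model is only one of several plausible choices.

```python
# Illustrative extrapolation of a gracilization trajectory.
# All values are assumptions for demonstration, not measurements.
import numpy as np

years_bp = np.array([20_000, 10_000, 0])   # years before present
robusticity = np.array([2.0, 1.5, 1.0])    # normalized index (assumed)

# Fit the index as a linear function of time, with the future positive.
slope, intercept = np.polyfit(-years_bp, robusticity, 1)

for future in (5_000, 10_000, 15_000):
    projected = slope * future + intercept
    print(f"{future:>6} years hence: projected index {projected:.2f}")
```

Even this toy model makes a methodological point: the functional form matters as much as the data, since a linear fit drives the index to zero within a further twenty millennia, whereas an exponential or logistic decay would not. Any serious extrapolation must therefore justify its choice of curve.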
Domestication by sexual selection in favor of neotenous features has had numerous effects in recent human history. The obvious somatic changes are a rapid loss of skeletal robusticity, a thinning of the cranium, and reductions of physical strength and cranial volume. As noted above, these are typical features of domestication. So is the abolition of estrus, which has often accompanied the domestication of various mammals, and which of course is strongly established in humans. Even exclusive homosexuality, which, as Miller [214] observes, is biologically inexplicable in a sexually reproducing species, may need to be attributed to domestication. Humanity has actively contributed to self-domestication, at least in Holocene times, by systematically eradicating the genetic traits of “noncompliant” individuals. Those who were independently minded, especially gifted or enterprising, recalcitrant or rebellious have for millennia been selectively persecuted, culled, exiled, or burned at the stake. The cumulative effect of the systematic selection for compliant personality traits that the human species has practiced at least throughout known history is akin to the domesticating selection for compliant traits in other animals, the obvious exception being animals specifically bred for fighting (bulls, bulldogs, roosters). During known history, latent and sometimes real caste systems of various types have been developed and refined, which, as Bickerton [226] notes, resemble those of social insects. For the participating individuals these may be difficult to detect, but class systems, academic apartheid, professionalism, rigid economic affiliations, religious pigeonholes, political allegiances and so forth, even sporting loyalties, can all resemble castes: adherences that are in some sense pre-ordained. Present-day humans may see these as reassuring, as signs of belonging, and may find certain forms of them quite advantageous. They may also perceive the modern welfare state as expedient, even though most members of such societies would be unable to subsist without the continued support of these structures. Reference to this dependency is not intended as criticism; it merely clarifies the state of modern society in order to consider the direction of the human genome. Just as domestication has had a profound effect on that genome, these other currents may do so also; they therefore need to be appreciated in the present context.
In considering the continued neotenization of humans, which cannot be assumed to cease any time soon, it is important to be aware that this process occurs entirely outside of natural selection; it is determined culturally. Although the selection of neotenous features was apparently limited initially to females [99], it prompted universal genetic changes, and a female preference for neotenous males appears to be developing (consider the Bollywood phenomenon). In very recent history a cult of youth has arisen, and in the Western world young people are increasingly shunning the responsibilities of adult life and parenthood. It would of course be premature to interpret these short-term trends as a long-range change; they are more likely a temporary social phenomenon, but in combination with other recent developments they do deserve attention.
Of particular importance are the rapid changes in the neuropathological and neurodegenerative domains. During most of human history, determined as it was by environment and selective pressures, neuropsychiatric disorders could not establish themselves effectively. In other primates they are practically absent [115,234,235,236,237,238,239]. The lack of social and survival skills inherent in these conditions selected strongly against them, socially as well as genetically, and genetic evidence for them is lacking in hominins up to the final Pleistocene. This holds even though such conditions tend to be polygenic, so that single genes could not prove their existence in any case. For instance the schizophrenia susceptibility genes so far identified (NRG1, NRG3, DTNBP1, COMT, CHRNA-7, SLC6A4, IMPA2, HOPA12bp, DISC1, TCF4, MBP, MOBP, NCAM1, NRCAM, NDUFV2, RAB18, ADCYAP1, BDNF, CNR1, DRD2, GAD1, GRIA1, GRIA4, GRIN2B, HTR2A, RELN, SNAP-25, TNIK, HSPA1B, ALDH1A1, ANK3, CD9, CPLX2, FABP7, GABRB3, GNB1L, GRMS, GSN, HINT1, KALRN, KIF2A, NR4A2, PDE4B, PRKCA, RGS4, SLC1A2, SYN2; e.g., [240,241,242,243,244,245]) are individually of small or non-detrimental effect. Similarly, the genetic basis of bipolar disorder, although unresolved, involves many regions of interest identified in linkage studies, such as chromosomes 18, 4p16, 12q23–q24, 16p13, 21q22, and Xq24–q26 [246,247,248], as well as the genes DRD4, SYNJ1, and MAOA, which have so far been implicated [242,249,250,251,252,253]. Again, the illness is clearly polygenic. Nevertheless, when the absence of such schizophrenia susceptibility alleles as NRG3 is demonstrated in ancestral robust humans, it confirms the suspected absence of the condition in these populations [254]. Selective sweeps tend to yield relatively recent etiologies, of less than 20,000 years, for all neuropathologies. Indeed, schizophrenia has been suggested to be of very recent etiology [115] and may have appeared only a few centuries ago [255]. Numerous deleterious conditions derive from recent neoteny: cleidocranial dysplasia, with its delayed closure of the cranial sutures, malformed clavicles, and dental abnormalities (genes RUNX2 and CBFA1); type 2 diabetes (gene THADA); the microcephalin D allele, introduced in the final Pleistocene [256]; and the ASPM allele, another contributor to microcephaly, which appeared around 5,800 years ago [257].
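The polygenic character of these conditions, with each susceptibility allele contributing only a small effect, can be illustrated with a toy additive model. In the sketch below the effect sizes and genotypes are invented for demonstration; they do not reproduce any published association results for the loci listed above.

```python
# Toy additive polygenic score: many loci, each of small effect.
# Effect sizes and genotypes are invented for illustration only.
import random

random.seed(1)
N_LOCI = 40  # of the order of the susceptibility loci listed above
effect_sizes = [random.uniform(0.001, 0.02) for _ in range(N_LOCI)]

def polygenic_score(genotype):
    """Sum per-locus effects; each entry counts 0, 1 or 2 risk alleles."""
    return sum(alleles * beta for alleles, beta in zip(genotype, effect_sizes))

genotype = [random.choice((0, 1, 2)) for _ in range(N_LOCI)]
print(f"aggregate score: {polygenic_score(genotype):.3f}")
print(f"largest single-locus contribution: {2 * max(effect_sizes):.3f}")
```

The exercise shows why single genes, considered in isolation, could neither prove nor disprove such a condition: the aggregate burden can be substantial even though no individual locus contributes more than a marginal effect.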
A further example of the recent advent of detrimental genes is that of CADPS2 and AUTS2, involved in autism, which are absent in robust humans, i.e., those predating 28,000 years in Europe. Autism spectrum disorder (ASD) [115,233,258,259,260,261,262,263,264,265,266,267,268,269] also seems to have become notably more prominent in recent centuries. Most recently ASD has developed into a very common illness, reported as affecting one in 5,000 children in 1975, one in 150 by 2002, one in 110 in 2006 [270], and one in 88 U.S. children in 2008. Moreover, these figures very probably underestimate autism’s U.S. prevalence, because they rely on reviews of school and medical records rather than in-person screening. A more thorough study of a large population of South Korean children recently found that one in 38 had the disorder [271]. It has been emphasized that the epidemic increase in these diagnoses cannot be entirely attributed to changing diagnostic criteria [272].
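The prevalence figures quoted above suffice for a rough estimate of the rate of increase. The sketch below fits an exponential (log-linear) trend to the four U.S. figures; it is illustrative only, since the underlying surveys used different methods and case definitions and, as noted, probably understate the true prevalence.

```python
# Log-linear fit to the U.S. autism prevalence figures quoted above.
# Illustrative only: the surveys used heterogeneous methods.
import math
import numpy as np

years = np.array([1975, 2002, 2006, 2008])
prevalence = np.array([1 / 5000, 1 / 150, 1 / 110, 1 / 88])

# Fit log(prevalence) = a * year + b, i.e., exponential growth.
a, b = np.polyfit(years, np.log(prevalence), 1)

print(f"fitted annual growth rate: {100 * (math.exp(a) - 1):.1f}%")
print(f"implied doubling time: {math.log(2) / a:.1f} years")
```

On these numbers the fitted doubling time is of the order of five to six years, which is one reason why reporting artifacts alone are an implausible full explanation, as reference [272] also argues.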
It is therefore not at all alarmist to speak of an epidemic of detrimental neuropsychiatric conditions. The incidence of various other conditions, such as obsessive-compulsive disorder (OCD), obsessive-compulsive personality disorder (OCPD), and chronic fatigue syndrome (CFS), also appears to be increasing. Similarly, notable increases are observed in the neurodegenerative diseases, although here the recent surge in human longevity probably accounts for much of the rise. For instance the incidence of Alzheimer’s disease begins to increase significantly at age 80 (from 9 to 23 per 1000 [273]), and a similar pattern applies to Parkinson’s disease [274]. However, about 5% of Alzheimer’s patients experience what is termed early onset, and even Parkinson’s can commence before the age of 40 [115]. A juvenile form of Huntington’s disease, with onset before 20 years of age, accounts for about 5–10% of all affected patients, and the most common age of onset is 35–44 years. While rising life expectancy is therefore rightly considered a contributing factor in the statistics of neurodegenerative conditions, it does not fully account for the epidemiological patterns. Within the overall picture the longevity factor is also more than compensated for by the fact that most mental illnesses, and most conditions deriving from demyelination or dysmyelination, manifest in younger cohorts, in some cases even in infants. The rise in neuropathology therefore cannot be brushed aside as not affecting reproduction, because it affects young people as well. In contrast to many other health issues, which modern medicine tends to deal with fairly successfully, neuropathologies have so far proved relatively resistant to palliative treatment and management.
The relentless increase in neuropathologies is one of the many effects of human self-domestication and the replacement of natural selection with culturally guided sexual selection. Like gracilization and neotenization, it can provide empirically determined, testable, and cogent trajectories. Unless there were to be unexpected extraordinary changes to the causal variables, these trajectories permit relatively reliable extrapolation into the future. In the absence of the arbitrating force of natural selection, none of the deleterious predispositions can be suppressed in the human genome. This concerns in particular neuropathologies, which arose essentially because the most recent encephalization occurred outside the constraining controls imposed by evolutionary selection. Short of a complete collapse of human culture it should be expected that these encumbrances will continue to rise, and new ones will very probably appear. All things considered, this may well turn out to be the greatest challenge in the future of humanity.
In the case of neotenization, which is perhaps a more protracted and convoluted process, only highly controversial and universally scorned measures could impede or arrest it. It can therefore safely be expected to continue into the future, and it seems perfectly reasonable to predict some of the directions in which it will take the human species. For one thing, an increase in the volume of the human brain is unlikely, considering the rapid decline of that index over the past forty millennia. This decline has been more than compensated for by two factors: the growth in the brain’s interconnectivity and the massive proliferation of its reliance on exograms, i.e., the external storage of symbolic information. Both these influences can be expected to facilitate future reorganization of the human brain, because the ontogenic plasticity noted above will progressively translate into phylogenic changes. To the extent that external storage is now technologically driven, e.g., through digitization, it should be apparent that this will lead to permanent changes in the operation of the human brain, much as the introduction of writing did [102]. Something of this kind was already predicted by Plato, when he noted that learning to write “will implant forgetfulness in (people’s) souls: they will cease to exercise memory because they will rely on that which is written, calling things to remembrance no longer from within themselves but from external marks” [275]. As so often, Plato was right.
Other effects of continued neotenization and of general changes to the human genome can also be speculated about with some credibility. However, the ultimate question concerning the future of humanity is that of speciation: when can the present species be expected to have diverged so much that its individuals would no longer be able to breed viably (i.e., produce fertile offspring) with Homo sapiens sapiens? Before addressing this issue it is expedient to note the lack of consensus among paleoanthropologists concerning the taxonomy of human species of the present and past. The current majority view, that H.s.s. could not effectively interbreed with any robust predecessor, beginning with the so-called Neanderthals, is in all probability false. Some scholars extend the species designation of H.s. as far back as H. erectus, thus implying that a modern person would be able to produce fertile offspring with a H. erectus partner [276]. Others take an intermediate position, either including the robust H.s. subspecies, or also including the preceding H. heidelbergensis and H. antecessor as subspecies of H. sapiens. Since the extreme view, despite being the majority view at present, is presumably erroneous [17,20,82,99], a conservative taxonomy would include all robusts of the last several hundred thousand years in H.s. However, the model of the longer duration of the species, of over a million years, cannot be excluded from consideration at this stage.
This implies that the likelihood of a fully new species emerging from present humanity within, say, the next hundred millennia is rather low. Unless there were to be a very incisive bottleneck event, during which the world population might plummet to less than one millionth of its present size (i.e., to fewer than about 7,000 people at today’s population of roughly seven billion), speciation would be as far off as a time when the majority of humans live beyond Earth, or roughly after the end of the next Ice Age. By that time humans would have long replaced traditional sustenance with intravenous liquid food, and it is possible that technologically driven changes of this kind will also affect the rate of speciation. Nevertheless, it would be entirely unrealistic to expect speciation to occur over the next few millennia, during which time technology can safely be predicted to develop rapidly, if not exponentially.
This raises the specter of the “posthuman condition” [277,278,279]. Bostrom [1] characterizes this condition by predicting that at least one of the following will apply:
  • Population greater than 1 trillion persons
  • Life expectancy greater than 500 years
  • Large fraction of the population has cognitive capacities more than two standard deviations above the current human maximum
  • Near-complete control over the sensory input, for the majority of people for most of the time
  • Human psychological suffering becoming a rare occurrence
  • Any change of magnitude or profundity comparable to that of one of the above
Since such hypothetical people would still be humans, the term “post-sapienoid condition” might be preferable to describe them. Captivated by their tantalizing notions, the promoters of these futuristic scenarios tend to overlook that most of humanity currently lives a subsistence existence, and that some of its members still survive with essentially “Paleolithic” to “Neolithic” technologies. Progress in technology might be exponential, but it does not necessarily follow that “a hand-picked portfolio of hot technologies” [1] will determine the overall direction of humanity in the foreseeable future. The pace of “progress” is extremely uneven across different areas. Superintelligence will no doubt be developed at some stage, but before speculating about a “posthuman” condition it needs to be considered that humanity may find itself facing some rather different but more pressing matters well before reaching such a stage. The cumulative probability of extinction increases monotonically over time [1], but what are the most potent threats?
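The monotonicity claim is elementary to make precise. Assume, purely as a modeling convenience (the hazard figures are not Bostrom’s), that each period carries an extinction hazard p_t between 0 and 1; then:

```latex
% Survival to period T requires surviving every period, hence
P(\text{extinct by } T) \;=\; 1 - \prod_{t=1}^{T} \left(1 - p_t\right).
% Since each factor satisfies 0 \le 1 - p_t \le 1, the product is
% non-increasing in T, so the cumulative extinction probability is
% non-decreasing, and strictly increasing whenever p_t > 0.
```

The practical corollary is that small but persistent hazards dominate over long horizons, which is why the question of the most potent threats is best posed in terms of persistent rates rather than single events.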
The issues associated with the future of humanity can be divided into two types: the “external” issues, relating to the environment (natural or artificial), and the “internal” issues, relating to the human condition [17,280,281]. “External” existential risks, such as massive disasters, pandemics, and astrophysical events, are always present but of relatively low probability, whereas the perhaps more insidious “internal” threats have attracted much less attention. Moreover, the external threats may be reduced significantly by the future establishment of extraterrestrial human colonies across multiple planets and solar systems. The internal threats, based on inevitable genetic changes and societal defects (e.g., the tensions Bostrom [7] perceives between “eudaemonic agents” and “fitness-maximizers”), seem much more difficult to avert by future developments; indeed they will very probably be fostered by them. As shown in this paper, one of the most important is humanity’s trajectory of acquiring significant genetic burdens, which first appeared in the final Pleistocene but has apparently gained momentum ever since [17,20,82,88,99,282]. There are credible indications that this process is accelerating exponentially, an issue likely to be clarified within the next few decades.
This trajectory suggests two things: that the greatest threat to the human species derives from its self-domestication and the genetic changes it engenders, and that this threat is for all practical purposes an inevitability. While external existential risks are ever present but statistically modest, the threat of genetic unsustainability is not just real; it is inescapable. Those who trust in human ingenuity to solve every issue by technology will probably argue that ways will be found to reengineer the human genome to counter these inexorable developments, and perhaps that will be possible. However, such solutions may need to be applied sooner rather than later, because the recent and present trajectories of human development suggest that the processes they would need to counter are well under way, possibly at an accelerating rate. In the final analysis it may be futile to try to escape the basic nature of human beings: they are still apes, still subject to the canons of nature, and their escape from the confines of natural selection may already have sealed the long-term prospects of the species.

5. Conclusions

This review of the long-term prospects of the human species arrives at findings that are not very encouraging. There is a range of existential threats to the human future, essentially of two types. “External” risks are those deriving from ever-present environmental hazards, such as planet-wide disasters, astrophysical events elsewhere in the universe, or pandemics. They are, however, relatively low risks. Of course the possibility of a fatal asteroid collision is always present, but the statistical probability of its occurring in the foreseeable future is exceedingly low. Moreover, some of these external threats may become manageable with the help of future technology, in various ways. For instance a collision of Earth with another object may be averted in the distant future, or extraterrestrial human colonies may preserve humanity should the Earth become uninhabitable. The risks from “internal” or anthropogenic causes are considered the more imminent. These include conditions created by humans themselves, whether intentionally or not. For instance dangers inherent in future technologies, whether applied maliciously or not, are statistically more likely to materialize in due course, and may be difficult to respond to. They could include self-enhancing artificial intelligence of an ill-conceived type, man-made pathogens, or the unintended or unforeseeable consequences of the ability to manipulate genetics, molecular structures, and the matter life is made of.
While these risks have all long been considered and discussed, this paper has adopted a different approach to the issue. It begins with a review of what is currently known about the trajectory of human development, evolutionary or otherwise, for the past several million years. It then adopts a uniformitarian method, by assuming that especially recent developments in these trajectories are likely to continue for some time into the future, all other things being equal. The specific trend then focused on is the neotenization of humans through self-domestication, and the attendant rapid neurological deterioration. The latter, attributable to neural proliferation under conditions of decreasing natural selection pressures, is leading to an explosion in neuropathologies and numerous other deleterious genetic traits. Whereas many of the external threats can perhaps be mitigated or averted in the distant future, and the dangers from technology should be manageable in many or most cases, the deterioration of the human genome is not only inescapable, it has already progressed for a few tens of millennia. This is a process that cannot be forestalled; it is entirely inevitable, and it is already happening. In other words, the human species will perhaps not end with a big bang, but with a whimper. This may not be a future one looks forward to—although, from the perspective of planet Earth, it could not occur soon enough, bearing in mind Nietzsche’s apt characterization of humanity as a kind of skin disease of the planet. However, it is perhaps preferable that humanity understands its most likely destiny. The benighted ignorance of approaching it blindly, as clueless as humanity has been for eons, would not be worthy of a species that has been credited with rationality and intelligence. Only an unintelligent organism would rather not know where it is heading.

References

  1. Nick Bostrom. “The future of humanity.” In New Waves in Philosophy of Technology. Edited by Jan Kyrre Berg Olsen, Evan Selinger and Søren Riis. New York: Palgrave Macmillan, 2009, pp. 186–216. [Google Scholar]
  2. Robert G. Bednarik. “Seafaring in the Pleistocene.” Cambridge Archaeological Journal 13 (2003): 41–66. [Google Scholar] [CrossRef]
  3. John Leslie. The End of the World: The Science and Ethics of Human Extinction. London: Routledge, 1996. [Google Scholar]
  4. Nick Bostrom. “Existential risks: Analyzing human extinction scenarios and related hazards.” Journal of Evolution and Technology 9 (2002): 1–36. [Google Scholar]
  5. Martin Rees. Our Final Hour: A Scientist's Warning: How Terror, Error, and Environmental Disaster Threaten Humankind's Future in This Century—On Earth and Beyond. New York: Basic Books, 2003. [Google Scholar]
  6. Richard A. Posner. Catastrophe: Risk and Response. Oxford: Oxford University Press, 2004. [Google Scholar]
  7. Nick Bostrom. “The future of human evolution.” In Death and Anti-Death: Two Hundred Years After Kant, Fifty Years After Turing. Edited by C. Tandy. Palo Alto, CA: Ria University Press, 2004, pp. 339–71. [Google Scholar]
  8. Nick Bostrom. “Transhumanist values.” Review of Contemporary Philosophy 4 (2005): 87–101. [Google Scholar] [CrossRef]
  9. Nick Bostrom. “Why I want to be a posthuman when I grow up.” In Medical Enhancement and Posthumanity. Edited by B. Gordijn and R. Chadwick. New York: Springer, 2007. [Google Scholar]
  10. David Pearce. The Hedonistic Imperative. Available online: http://www.hedweb.com/hedab.htm (accessed on 28 November 2012).
  11. I. J. Good. “Speculations concerning the first ultraintelligent machine.” Advances in Computers 6 (1965): 31–88. [Google Scholar]
  12. Anders Sandberg, and Nick Bostrom. Whole Brain Emulation: A Roadmap. Technical Report #2008-3, Future of Humanity Institute, Oxford University. Available online: www.fhi.ox.ac.uk/reports/2008-3.pdf (accessed on 28 November 2012).
  13. K. Eric Drexler. Engines of Creation: The Coming Era of Nanotechnology. New York: Anchor Books, 1986. [Google Scholar]
  14. K. Eric Drexler. Nanosystems: Molecular Machinery, Manufacturing and Computation. New York: Wiley, 1992. [Google Scholar]
  15. Hans Moravec. Mind Children: The Future of Robot and Human Intelligence. Cambridge, MA: Harvard University Press, 1988. [Google Scholar]
  16. Ralf C. Merkle. “The molecular repair of the brain.” Cryonics 15 (1994): 16–31. [Google Scholar]
  17. Robert G. Bednarik. The Human Condition. New York: Springer, 2011. [Google Scholar]
  18. Robert G. Bednarik. “Concept-mediated marking in the Lower Palaeolithic.” Current Anthropology 36 (1995): 605–34. [Google Scholar] [CrossRef]
  19. Robert G. Bednarik. “Antiquity and authorship of the Chauvet rock art.” Rock Art Research 24 (2007): 21–34. [Google Scholar]
  20. Robert G. Bednarik. “The mythical Moderns.” Journal of World Prehistory 21 (2008): 85–102. [Google Scholar] [CrossRef]
  21. Reiner Rudolf Robert Protsch. “The Dating of Upper-Pleistocene Subsaharan fossil Hominids and their Place in Human Evolution: With Morphological and Archaeological Implications.” PhD Thesis, University of California, Los Angeles, 1973. [Google Scholar]
  22. Reiner Rudolf Robert Protsch. “The absolute dating of Upper Pleistocene sub-Saharan fossil hominids and their place in human evolution.” Journal of Human Evolution 4 (1975): 297–322. [Google Scholar] [CrossRef]
  23. Reiner Rudolf Robert Protsch, and H. Glowatzki. “Das absolute Alter des paläolithischen Skeletts aus der Mittleren Klause bei Neuessing, Kreis Kelheim, Bayern.” Anthropologischer Anzeiger 34 (1974): 140–44. [Google Scholar]
  24. Reiner Rudolf Robert Protsch, and A. Semmel. “Zur Chronologie des Kelsterbach-Hominiden.” Eiszeitalter und Gegenwart 28 (1978): 200–10. [Google Scholar]
  25. W. Henke, and Reiner Rudolf Robert Protsch. “Die Paderborner Calvaria—ein diluvialer Homo sapiens.” Anthropologischer Anzeiger 36 (1978): 85–108. [Google Scholar]
  26. G. Bräuer. “Präsapiens-hypothese oder Afro-europäische sapiens-hypothese? ” Zeitschrift für Morphologie und Anthropologie 75 (1984): 1–25. [Google Scholar] [PubMed]
  27. R. L. Cann, M. Stoneking, and A. C. Wilson. “Mitochondrial DNA and human evolution.” Nature 325 (1987): 31–36. [Google Scholar] [CrossRef] [PubMed]
  28. C. B. Stringer, and P. Andrews. “Genetic and fossil evidence for the origin of modern humans.” Science 239 (1988): 1263–68. [Google Scholar] [CrossRef] [PubMed]
  29. Elizabeth Pennisi. “Genetic study shakes up Out of Africa Theory.” Science 283 (1999): 1828. [Google Scholar] [CrossRef] [PubMed]
  30. Vinayak Eswaran. “A diffusion wave out of Africa.” Current Anthropology 43 (2002): 749–74. [Google Scholar] [CrossRef]
  31. Alan Templeton. “Out of Africa again and again.” Nature 416 (2002): 45–51. [Google Scholar] [CrossRef] [PubMed]
  32. Fred H. Smith, Ivor Janković, and Ivor Karavanić. “The assimilation model, modern human origins in Europe, and the extinction of Neandertals.” Quaternary International 137 (2005): 7–19. [Google Scholar] [CrossRef]
  33. D. R. Maddison. “African origin of human MtDNA re-examined.” Systematic Zoology 40 (1991): 355. [Google Scholar] [CrossRef]
  34. Daniel L. Hartl, and Andrew G. Clark. Principles of Population Genetics. Sunderland, MA: Sinauer, 1997. [Google Scholar]
  35. W. J. Ewans. “The role of models in the analysis of molecular genetic data, with particular reference to restriction fragment data.” In Statistical Analysis of DNA Sequence Data. Edited by B. S. Weir. New York: Marcel Dekker, 1983, pp. 45–73. [Google Scholar]
  36. E. Watson, K. Bauer, R. Aman, G. Weiss, A. von Haeseler, and S. Pääbo. “MtDNA sequence diversity in Africa.” The American Journal of Human Genetics 59 (1996): 437–44. [Google Scholar] [PubMed]
  37. R. H. Ward, B. L. Frazier, K. Dew-Jager, and S. Pääbo. “Extensive mitochondrial diversity within a single Amerindian tribe.” Proceedings of the National Academy of Sciences 88 (1991): 8720–24. [Google Scholar] [CrossRef]
  38. Francisco Rodriguez-Trelles, Rosa Tarrio, and Francisco J. Ayala. “Erratic overdispersion of three molecular clocks: GPDH, SOD, and XDH.” Proceedings of the National Academy of Sciences 98 (2001): 11405–10. [Google Scholar] [CrossRef] [PubMed]
  39. Francisco Rodriguez-Trelles, Rosa Tarrio, and Francisco J. Ayala. “A methodological bias toward overestimation of molecular evolutionary time scales.” Proceedings of the National Academy of Sciences 99 (2002): 8112–115. [Google Scholar] [CrossRef] [PubMed]
  40. Ann Gibbons. “Calibrating the mitochondrial clock.” Science 279 (1998): 28–29. [Google Scholar] [CrossRef] [PubMed]
  41. M. F. Hammer. “A recent common ancestry for human Y chromosomes.” Nature 378 (1995): 376–378. [Google Scholar] [CrossRef] [PubMed]
  42. U. Gyllensten, D. Wharton, A. Josefsson, and A. C. Wilson. “Paternal inheritance of mitochondrial DNA in mice.” Nature 352 (1991): 255–57. [Google Scholar] [CrossRef] [PubMed]
  43. P. Awadalla, A. Eyre-Walker, and J. Maynard Smith. “Linkage disequilibrium and recombination in hominid mitochondrial DNA.” Science 286 (1999): 2524–25. [Google Scholar] [CrossRef] [PubMed]
  44. Andrew A. M. Morris, and Robert N. Lightowlers. “Can paternal mtDNA be inherited? ” The Lancet 355 (2000): 1290–91. [Google Scholar] [CrossRef]
  45. R. S. Williams. “Another surprise from the mitochondrial genome.” New England Journal of Medicine 347 (2002): 609–11. [Google Scholar] [CrossRef] [PubMed]
  46. M. Schwartz, and J. Vissing. “Paternal inheritance of mitochondrial DNA.” New England Journal of Medicine 347 (2002): 576–80. [Google Scholar] [CrossRef] [PubMed]
  47. Hannes Napierala, and Hans-Peter Uerpmann. “A ‘new’ Palaeolithic dog from central Europe.” International Journal of Osteoarchaeology. [CrossRef]
  48. Mietje Germonpré, Mikhail V. Sablin, Rhiannon E. Stevens, Robert E. M. Hedges, Michael Hofreiter, Mathias Stiller, and Viviane R. Després. “Fossil dogs and wolves from Palaeolithic sites in Belgium, the Ukraine and Russia: Osteometry, ancient DNA and stable isotopes.” Journal of Archaeological Science 36 (2009): 473–490. [Google Scholar] [CrossRef]
  49. T. Lindahl, and B. Nyberg. “Rate of depurination of native deoxyribonucleic acid.” Biochemistry 11 (1972): 3610–18. [Google Scholar] [CrossRef]
  50. Edward M. Golenberg, Ann Bickel, and Paul Weihs. “Effect of highly fragmented DNA on PCR.” Nucleic Acids Research 24 (1996): 5026–33. [Google Scholar] [CrossRef] [PubMed]
  51. G. Gutierrez, and A. Marin. “The most ancient DNA recovered from amber-preserved specimen may not be as ancient as it seems.” Molecular Biology and Evolution 15 (1998): 926–29. [Google Scholar] [CrossRef] [PubMed]
  52. Eva-Maria Geigl. “Why ancient DNA research needs taphonomy.” Paper presented at Conférence ICAZ, Biosphere to Lithosphere. 2002. [Google Scholar]
  53. Ludovic Carlier, Joel Couprie, Albane le Maire, Laure Guilhaudis, Isabelle Milazzo, Marie Gondry, Daniel Davoust, Bernard Gilquin, and Sophie Zinn-Justin. “Solution structure of the region 51-160 of human KIN17 reveals an atypical winged helix domain.” Protein Science 16 (2007): 2750–55. [Google Scholar] [CrossRef] [PubMed]
  54. L. Vigilant, M. Stoneking, H. Harpending, K. Hawkes, and A. C. Wilson. “African populations and the evolution of human mitochondrial DNA.” Science 253 (1991): 1503–07. [Google Scholar] [CrossRef] [PubMed]
  55. M. Barinaga. “‘African Eve’ backers beat a retreat.” Science 255 (1992): 686–87. [Google Scholar] [CrossRef] [PubMed]
  56. F. J. Ayala. “Response to Templeton.” Science 272 (1996): 1363–1364. [Google Scholar] [CrossRef] [PubMed]
  57. J. F. Y. Brookfield. “Importance of ancestral DNA ages.” Nature 388 (1997): 134. [Google Scholar] [CrossRef] [PubMed]
  58. S. B. Hedges, S. Kumar, K. Tamura, and M. Stoneking. “Human origins and analysis of mitochondrial DNA sequences.” Science 255 (1992): 737–739. [Google Scholar] [CrossRef] [PubMed]
  59. David R. Maddison, Maryellen Ruvolo, and David L. Swofford. “Geographic origins of human mitochondrial DNA: Phylogenetic evidence from control region sequences.” Systematic Biology 41 (1992): 111–24. [Google Scholar] [CrossRef]
  60. S. Blair Hedges, Sudhir Kumar, Koichiro Tamura, and Mark Stoneking. “Human origins and analysis of mitochondrial DNA sequences.” Science 255 (1992): 737–739. [Google Scholar] [CrossRef] [PubMed]
  61. Alan R. Templeton. “The ‘Eve’ hypothesis: A genetic critique and re-analysis.” American Anthropologist 95 (1993): 51–72. [Google Scholar] [CrossRef]
  62. Alan R. Templeton. “Gene lineages and human evolution.” Science 272 (1996): 1363. [Google Scholar] [CrossRef] [PubMed]
  63. Alan R. Templeton. “Haplotype trees and modern human origins.” Yearbook of Physical Anthropology 48 (2005): 33–59. [Google Scholar] [CrossRef] [PubMed]
  64. Gabriel Gutierrez, Diego Sanchez, and Antonio Marin. “A reanalysis of the ancient mitochondrial DNA sequences recovered from Neandertal bones.” Molecular Biology and Evolution 19 (2002): 1359–66. [Google Scholar] [CrossRef] [PubMed]
  65. M. W. Walberg, and D. A. Clayton. “Sequence and properties of the human KB cell and mouse L cell D-loop regions of mitochondrial DNA.” Nucleic Acids Research 9 (1981): 5411–21. [Google Scholar] [CrossRef] [PubMed]
  66. A. Torroni, M. T. Lott, M. F. Cabell, Y.-S. Chen, L. Lavergne, and D. C. Wallace. “MtDNA and the origin of Caucasians: Identification of ancient Caucasian-specific haplogroups, one of which is prone to a recurrent somatic duplication in the D-loop region.” The American Journal of Human Genetics 55 (1994): 760–76. [Google Scholar] [PubMed]
  67. H. Zischler, H. Geisert, A. von Haeseler, and S. Pääbo. “A nuclear ‘fossil’ of the mitochondrial D-loop and the origin of modern humans.” Nature 378 (1995): 489–92. [Google Scholar] [CrossRef] [PubMed]
  68. J. Hardy, A. Pittman, A. Myers, K. Gwinn-Hardy, H. C. Fung, R. de Silva, M. Hutton, and J. Duckworth. “Evidence suggesting that Homo neanderthalensis contributed the H2 MAPT haplotype to Homo sapiens.” Biochemical Society Transactions 33 (2005): 582–85. [Google Scholar] [CrossRef] [PubMed]
  69. Daniel Garrigan, Zahra Mobasher, Tesa Severson, Jason A. Wilder, and Michael F. Hammer. “Evidence for archaic Asian ancestry on the human X chromosome.” Molecular Biology and Evolution 22 (2005): 189–92. [Google Scholar] [CrossRef] [PubMed]
  70. Richard E. Green, Johannes Krause, Susan E. Ptak, Adrian W. Briggs, Michael T. Ronan, Jan F. Simons, Lei Du, Michael Egholm, Jonathan M. Rothberg, Maja Paunovic, et al. “Analysis of one million base pairs of Neanderthal DNA.” Nature 444 (2006): 330–36. [Google Scholar] [CrossRef] [PubMed]
  71. Richard E. Green, Johannes Krause, Adrian W. Briggs, Tomislav Maricic, Udo Stenzel, Martin Kircher, Nick Patterson, Heng Li, Weiwei W. Zhai, Markus H.-Y. Fritz, et al. “A draft sequence of the Neandertal genome.” Science 328 (2010): 710–22. [Google Scholar] [CrossRef] [PubMed]
  72. Ann Gibbons. “Close encounters of the prehistoric kind.” Science 328 (2010): 680–84. [Google Scholar] [CrossRef] [PubMed]
  73. M. Krings, A. Stone, R. W. Schmitz, H. Krainitzki, M. Stoneking, and S. Pääbo. “Neandertal DNA sequences and the origin of modern humans.” Cell 90 (1997): 19–30. [Google Scholar] [CrossRef]
  74. M. L. Weiss, and A. E. Mann. Human Biology and Behavior: An Anthropological Perspective. Boston: Little, Brown and Co., 1978. [Google Scholar]
  75. Ya. Ya. Roginsky, M. M. Gerasimov, S. N. Zamyatnin, and A. A. Formozov. “Zaklyuchenie po nakhodke iskopaemogo cheloveka v peshchernoi stoyanke Starosel’e bliz g. Bakhchisary.” Sovetskaya Etnografiya 1954 (1954): 39–41. [Google Scholar]
  76. V. P. Yakimov. “New materials of skeletal remains of ancient peoples in the territory of the Soviet Union.” In Current Argument on Early Man: Report from a Nobel Symposium. Oxford: Pergamon Press, 1980, pp. 152–169. [Google Scholar]
  77. Rebecca L. Cann. “Tangled genetic routes.” Nature 416 (2002): 32–33. [Google Scholar] [CrossRef] [PubMed]
  78. Robert G. Bednarik. “‘African Eve’ a computer bungle.” The Artefact 14 (1991): 34–35. [Google Scholar]
  79. Robert G. Bednarik. “Palaeoart and archaeological myths.” Cambridge Archaeological Journal 2 (1992): 27–43. [Google Scholar] [CrossRef]
  80. Robert G. Bednarik. “The origins of navigation and language.” The Artefact 20 (1997): 16–56. [Google Scholar]
  81. Robert G. Bednarik. “The global evidence of early human symboling behaviour.” Human Evolution 12 (1997): 147–68. [Google Scholar] [CrossRef]
  82. Robert G. Bednarik. “The origins of human modernity.” Humanities 1 (2011): 1–53. [Google Scholar] [CrossRef]
  83. Hans-Peter Schulz. “The lithic industry from layers IV–V, Susiluola Cave, western Finland, dated to the Eemian interglacial.” Préhistoire Européenne 16–17 (2002): 7–23. [Google Scholar]
  84. Hans-Peter Schulz, B. Eriksson, H. Hirvas, P. Huhta, H. Jungner, P. Purhonen, P. Ukkonen, and T. Rankama. “Excavations at Susiluola Cave.” Suomen Museo 2002 (2002): 5–45. [Google Scholar]
  85. P. Pavlov, J. I. Svendsen, and S. Indrelid. “Human presence in the European Arctic nearly 40,000 years ago.” Nature 413 (2001): 64–67. [Google Scholar] [CrossRef] [PubMed]
  86. E. Anderson. Introgressive Hybridization. New York: John Wiley and Sons, 1949. [Google Scholar]
  87. H. C. Harpending, M. A. Batzer, M. Gurven, L. B. Jorde, A. R. Rogers, and S. T. Sherry. “Genetic traces of ancient demography.” Proceedings of the National Academy of Sciences 95 (1998): 1961–67. [Google Scholar] [CrossRef]
  88. Robert G. Bednarik. “Genetic drift in recent human evolution? ” In Advances in Genetics Research, Volume 6. Edited by K. V. Urbano. New York: Nova Science Publishers, 2011, pp. 109–60. [Google Scholar]
  89. F. Barberi, F. Innocenti, L. Lirer, R. Munno, T. S. Pescatore, and R. Santacroce. “The Campanian Ignimbrite: A major prehistoric eruption in the Neapolitan area (Italy).” Bulletin Volcanologique 41 (1978): 10–22. [Google Scholar] [CrossRef]
  90. Francesco G. Fedele, and Biagio Giaccio. “Paleolithic cultural change in western Eurasia across the 40,000 BP timeline: Continuities and environmental forcing.” In Exploring the Mind of Ancient Man. Festschrift to Robert G. Bednarik. Edited by P. Chenna Reddy. New Delhi: Research India Press, 2007, pp. 292–316. [Google Scholar]
  91. Francesco G. Fedele, Biagio Giaccio, Roberto Isaia, and Giovanni Orsi. “Ecosystem impact of the Campanian Ignimbrite eruption in Late Pleistocene Europe.” Quaternary Research 57 (2002): 420–24. [Google Scholar] [CrossRef]
  92. Francesco G. Fedele, Biagio Giaccio, Roberto Isaia, and Giovanni Orsi. “The Campanian Ignimbrite Eruption, Heinrich Event 4, and Palaeolithic change in Europe: A high-resolution investigation.” In Volcanism and the Earth’s Atmosphere. Geophysical Monograph 139; Washington, DC: American Geophysical Union, 2003, pp. 301–25. [Google Scholar]
  93. R. Foley, and M. M. Lahr. “Mode 3 technologies and the evolution of modern humans.” Cambridge Archaeological Journal 7 (1997): 3–36. [Google Scholar] [CrossRef]
  94. F. Felgenhauer. “Das Paläolithikum von Willendorf in der Wachau, Niederösterreich. Vorbericht über die monographische Bearbeitung.” Forschungen und Fortschritte 33 (1959): 152–55. [Google Scholar]
  95. V. Gábori-Csánk. Le Jankovichien: Une civilisation paléolithique en Hongrie. Liège: ERAUL 53, 1993. [Google Scholar]
  96. O. N. Bader. Sungir: Verkhnepaleoliticheskaya stoyanka. Moscow: Izdatel’stvo “Nauka”, 1978. [Google Scholar]
  97. F. H. Smith, E. Trinkaus, P. B. Pettitt, I. Karavanić, and M. Paunović. “Direct radiocarbon dates for Vindija G1 and Velika Pećina Late Pleistocene hominid remains.” Proceedings of the National Academy of Sciences 96 (1999): 12281–86. [Google Scholar] [CrossRef]
  98. James C. M. Ahern, Ivor Karavanic, Maja Paunović, Ivor Janković, and Fred H. Smith. “New discoveries and interpretations of fossil hominids and artifacts from Vindija Cave, Croatia.” Journal of Human Evolution 46 (2004): 25–65. [Google Scholar] [CrossRef]
  99. Robert G. Bednarik. “The domestication of humans.” Anthropologie 46 (2008): 1–17. [Google Scholar]
  100. John R. Searle. The Construction of Social Reality. London: Allen Lane, 1995. [Google Scholar]
  101. Bruno Latour. We Have Never Been Modern. Cambridge, MA: Harvard University Press, 1993. [Google Scholar]
  102. P. A. Helvenston. “Differences between oral and literate cultures: What we can know about Upper Paleolithic minds.” In The Psychology of Human Behavior. Edited by Robert G. Bednarik. New York: Nova Science Publishers, 2013. [Google Scholar]
  103. Nelson Goodman. Ways of Worldmaking. Indianapolis, IN: Hackett Publishing Company, 1978. [Google Scholar]
  104. Daniel L. Smail. On Deep History and the Brain. Berkeley: University of California Press, 2007. [Google Scholar]
  105. Eleanor A. Maguire, David G. Gadian, Ingrid S. Johnsrude, Catriona D. Good, John Ashburner, Richard S. J. Frackowiak, and Christopher D. Frith. “Navigation-related structural change in the hippocampi of taxi drivers.” Proceedings of the National Academy of Sciences 97 (2000): 4398–403. [Google Scholar] [CrossRef] [PubMed]
  106. Bogdan Draganski, Christian Gaser, Volker Busch, Gerhard Schuierer, Ulrich Bogdahn, and Arne May. “Changes in grey matter induced by training.” Nature 427 (2004): 311–12. [Google Scholar] [CrossRef] [PubMed]
  107. Lambros Malafouris. “Beads for a plastic mind: The ‘blind man stick’ (BMS) hypothesis and the active nature of material culture.” Cambridge Archaeological Journal 18 (2008): 401–14. [Google Scholar] [CrossRef]
  108. Robert G. Bednarik. “The earliest evidence of palaeoart.” Rock Art Research 20 (2003): 89–135. [Google Scholar]
  109. Robert G. Bednarik. “Australian rock art of the Pleistocene.” Rock Art Research 27 (2010): 95–120. [Google Scholar]
  110. Robert G. Bednarik. “Maritime navigation in the Lower and Middle Palaeolithic.” Comptes Rendus de l’Académie des Sciences Paris, Earth and Planetary Sciences 328 (1999): 559–63. [Google Scholar] [CrossRef]
  111. Matthew C. Keller, and Geoffrey Miller. “Resolving the paradox of common, harmful, heritable mental disorders: Which evolutionary genetic models work best? ” Behavioral and Brain Sciences 29 (2006): 385–452. [Google Scholar] [CrossRef] [PubMed]
  112. W. C. McGrew. Chimpanzee Material Culture: Implications for Human Evolution. Cambridge, UK: Cambridge University Press, 1992. [Google Scholar]
  113. W. C. McGrew. “Brain, hands, and minds: Puzzling incongruities in ape tool use.” In The Use of Tools by Human and Non-human Primates. Edited by A. Berthelet and J. Chavaillon. Oxford: Clarendon Press, 1993, pp. 143–57. [Google Scholar]
  114. Katerina Semendeferi, Este Armstrong, Axel Schleicher, Karl Zilles, and Gary W. Van Hoesen. “Prefrontal cortex in humans and apes: A comparative study of area 10.” American Journal of Physical Anthropology 114 (2001): 224–241. [Google Scholar]
  115. Robert G. Bednarik, and Patricia A. Helvenston. “The nexus between neurodegeneration and advanced cognitive abilities.” Anthropos 107 (2012): 511–27. [Google Scholar]
  116. Amanda L. Woodward. “Infants’ ability to distinguish between purposeful and non-purposeful behaviors.” Infant Behavior and Development 22 (1999): 145–60. [Google Scholar] [CrossRef]
  117. Gergely Csibra, Szilvia Biro, Orsolya Koos, and Gyorgy Gergely. “One-year-old infants use teleological representations of actions productively.” Cognitive Science 27 (2003): 111–133. [Google Scholar] [CrossRef]
  118. Josep Call, and Michael Tomasello. “Distinguishing intentional from accidental actions in orangutans (Pongo pygmaeus), chimpanzees (Pan troglodytes), and human children (Homo sapiens).” Journal of Comparative Psychology 112 (1998): 192–206. [Google Scholar] [CrossRef] [PubMed]
  119. T. Jellema, C. I. Baker, B. Wicker, and D. I. Perrett. “Neural representation for the perception of the intentionality of actions.” Brain and Cognition 44 (2000): 280–302. [Google Scholar] [CrossRef] [PubMed]
  120. Fabia Franco, and George Butterworth. “Pointing and social awareness: Declaring and requesting in the second year.” Journal of Child Language 23 (1996): 307–36. [Google Scholar] [CrossRef] [PubMed]
  121. H. Wellman, and J. Wooley. “From simple desires to ordinary beliefs: The early development of everyday psychology.” Cognition 35 (1990): 245–275. [Google Scholar] [CrossRef]
  122. Betty M. Repacholi, and Alison Gopnik. “Early reasoning about desires: Evidence from 14- and 18-month-olds.” Developmental Psychology 33 (1997): 12–21. [Google Scholar] [CrossRef]
  123. H. Wellman, and D. Liu. “Scaling theory of mind tasks.” Child Development 75 (2004): 523–41. [Google Scholar] [CrossRef] [PubMed]
  124. Brian Hare, Josep Call, B. Agnetta, and Michael Tomasello. “Chimpanzees know what conspecifics do and do not see.” Animal Behaviour 59 (2000): 771–85. [Google Scholar] [CrossRef] [PubMed]
  125. A. M. Leslie. “Pretending and believing: Issues in the theory of ToMM.” Cognition 50 (1994): 211–38. [Google Scholar]
  126. S. Baron-Cohen. Mindblindness: An Essay of Autism and Theory of Mind. Cambridge, MA: MIT Press, 1995. [Google Scholar]
  127. Josef Perner, and Wendy A. Garnham. “Actions really do speak louder than words—but only implicitly. Young children’s understanding of false belief in action.” British Journal of Developmental Psychology 19 (2001): 413–32. [Google Scholar]
  128. T. Suddendorf. “The rise of the metamind.” In The Descent of Mind: Psychological Perspectives on Hominid Evolution. Edited by M. C. Corballis and S. Lea. London: Oxford University Press, 1999, pp. 218–260. [Google Scholar]
  129. Josep Call, and Michael Tomasello. “A nonverbal false belief task: The performance of children and great apes.” Child Development 70 (1999): 381–95. [Google Scholar] [CrossRef] [PubMed]
  130. Thomas Suddendorf, and Janie Busby. “Mental time travel in animals? ” Trends in Cognitive Sciences 7 (2003): 391–96. [Google Scholar] [CrossRef]
  131. John M. Gardiner. “Episodic memory and autonoetic consciousness: A first-person approach.” Philosophical Transactions of The Royal Society: Biological Sciences 356 (2001): 1351–61. [Google Scholar] [CrossRef] [PubMed]
  132. H. C. Lou, B. Luber, M. Crupain, J. P. Keenan, M. Nowak, T. W. Kjaer, H. A. Sackeim, and S. H. Lisanby. “Parietal cortex and representation of the mental self.” Proceedings of the National Academy of Sciences 101 (2004): 6827–32. [Google Scholar] [CrossRef] [PubMed]
  133. David G. Premack, and Guy Woodruff. “Does the chimpanzee have a theory of mind? ” Behavioral and Brain Sciences 1 (1978): 515–526. [Google Scholar] [CrossRef]
  134. Simon Baron-Cohen. “Precursors to a theory of mind: Understanding attention in others.” In Natural Theories of Mind: Evolution, Development and Simulation of Everyday Mindreading. Edited by A. Whiten. Oxford: Basil Blackwell, 1991, pp. 233–51. [Google Scholar]
  135. U. Frith, and F. G. E. Happé. “Autism: Beyond ‘theory of mind’.” Cognition 50 (1994): 115–132. [Google Scholar] [CrossRef]
  136. Sally Ozonoff, and Judith N. Miller. “Teaching theory of mind—a new approach in social skills training for individuals with autism.” Journal of Autism and Developmental Disorders 25 (1995): 415–433. [Google Scholar] [CrossRef]
  137. F. Happé, S. Ehlers, P. Fletcher, U. Frith, M. Johansson, C. Gillberg, R. Dolan, R. Frackowiak, and C. Frith. “Theory of mind in the brain: Evidence from a PET scan study of Asperger syndrome.” Neuroreport 8 (1996): 197–201. [Google Scholar] [CrossRef] [PubMed]
  138. F. G. E. Happé. “Central coherence and theory of mind in autism: Reading homographs in context.” British Journal of Developmental Psychology 15 (1997): 1–12. [Google Scholar] [CrossRef]
  139. Simon Baron-Cohen, Therese Jolliffe, C. Mortimore, and M. Robertson. “Another advanced test of theory of mind: Evidence from very high functioning adults with autism or Asperger syndrome.” Journal of Child Psychology and Psychiatry 38 (1997): 813–22. [Google Scholar] [CrossRef]
  140. Christopher Jarrold, David W. Butler, Emily M. Coltington, and Flora Jimenez. “Linking theory of mind and central coherence bias in autism and the general population.” Developmental Psychology 36 (2000): 126–138. [Google Scholar] [CrossRef]
  141. S. Jacques, and P. D. Zelazo. “Language and the development of cognitive flexibility: Implications for theory of mind.” In Why Language Matters for Theory of Mind. Edited by J. W. Astington and J. A. Baird. Toronto: Oxford University Press, 2005, pp. 144–62. [Google Scholar]
  142. G. Di Pellegrino, L. Fadiga, L. Fogassi, V. Gallese, and G. Rizzolatti. “Understanding motor events: A neurophysiological study.” Experimental Brain Research 91 (1992): 176–80. [Google Scholar] [CrossRef] [PubMed]
  143. G. Rizzolatti, L. Fadiga, V. Gallese, and L. Fogassi. “Premotor cortex and the recognition of motor actions.” Cognitive Brain Research 3 (1996): 131–41. [Google Scholar] [CrossRef]
  144. V. Gallese, and A. Goldman. “Mirror neurons and the simulation theory of mind-reading.” Trends in Cognitive Sciences 2 (1998): 493–501. [Google Scholar] [CrossRef]
  145. Marco Iacoboni, Istvan Molnar-Szakacs, Vittorio Gallese, Giovanni Buccino, and John C. Mazziotta. “Grasping the intentions of others with one’s own mirror neuron system.” PLoS Biology 3 (2005): 529–535. [Google Scholar] [CrossRef] [PubMed]
  146. Gregory Hickok. “Eight problems for the mirror neuron theory of action understanding in monkeys and humans.” Journal of Cognitive Neuroscience 21 (2009): 1229–1243. [Google Scholar] [CrossRef] [PubMed]
  147. Robert M. Gordon. “Folk psychology as simulation.” Mind and Language 1 (1986): 158–71. [Google Scholar]
  148. Robert M. Gordon. “‘Radical’ simulationism.” In Theories of Theories of Mind. Edited by P. Carruthers and P. K. Smith. Cambridge, UK: Cambridge University Press, 1996, pp. 11–21. [Google Scholar]
  149. Stephanie D. Preston, and Frans B. M. de Waal. “Empathy: Its ultimate and proximate bases.” Behavioral and Brain Sciences 25 (2002): 1–72. [Google Scholar] [CrossRef] [PubMed]
  150. Robert Provine. “Self and other: A ticklish solution.” Edge, 2009. Available online: http://www.edge.org/3rd_culture/rama08/rama08_index.html (accessed on 14 January 2012). [Google Scholar]
  151. Jessica A. Somerville, and Jean Decety. “Weaving the fabric of social interaction: Articulating developmental psychology and cognitive neuroscience in the domain of motor cognition.” Psychonomic Bulletin & Review 13 (2006): 179–200. [Google Scholar] [CrossRef]
  152. Christian Keysers, and Valeria Gazzola. “Integrating simulation and theory of mind: From self to social cognition.” Trends in Cognitive Sciences 11 (2007): 194–196. [Google Scholar] [CrossRef] [PubMed]
  153. Vilayanur S. Ramachandran. “Mirror neurons and imitation learning as the driving force behind ‘the great leap forward’ in human evolution.” Edge, 2009. Available online: http://www.edge.org/3rd_culture/ramachandran/ramachandran_index.html (accessed on 14 January 2012). [Google Scholar]
  154. Lindsay Oberman, and Vilayanur S. Ramachandran. “Reflections on the mirror neuron system: Their evolutionary functions beyond motor representation.” In Mirror Neuron Systems: The Role of Mirroring Processes in Social Cognition. Edited by J. A. Pineda. New York: Humana Press, 2009, pp. 39–62. [Google Scholar]
  155. P. F. Ferrari, L. Bonini, and L. Fogassi. “From monkey mirror neurons to mirror-related behaviours: Possible direct and indirect pathways.” Philosophical Transactions of The Royal Society B 364 (2009): 2311–23. [Google Scholar] [CrossRef] [PubMed]
  156. G. Rizzolatti, and M. A. Arbib. “Language within our grasp.” Trends in Neurosciences 21 (1998): 188–194. [Google Scholar] [CrossRef]
  157. D. C. Dennett. Consciousness Explained. Boston, MA: Little, Brown, 1991. [Google Scholar]
  158. G. W. Farthing. The Psychology of Consciousness. New Jersey: Prentice Hall, 1992. [Google Scholar]
  159. Gordon G. Gallup Jr. “Self-awareness and the evolution of social intelligence.” Behavioural Processes 42 (1998): 239–247. [Google Scholar]
  160. Gordon G. Gallup Jr., and Steven M. Platek. “Cognitive empathy presupposes self-awareness: Evidence from phylogeny, ontogeny, neuropsychology, and mental illness.” Behavioral and Brain Sciences 25 (2002): 36–37. [Google Scholar]
  161. C. S. Carver. “Self-awareness.” In Handbook of Self and Identity. Edited by M. R. Leary and J. P. Tangney. New York: Guilford Press, 2002, pp. 179–196. [Google Scholar]
  162. Robert M. Seyfarth, and Dorothy L. Cheney. “Social awareness in monkeys.” Integrative and Comparative Biology 40 (2000): 902–909. [Google Scholar] [CrossRef]
  163. Mark R. Leary, and Nicole R. Buttermore. “The evolution of the human self: Tracing the natural history of self-awareness.” Journal For the Theory of Social Behaviour 33 (2003): 365–404. [Google Scholar] [CrossRef]
  164. T. Natsoulas. “Consciousness and self-awareness.” In Self-awareness: Its Nature and Development. Edited by M. D. Ferrari and R. J. Sternberg. New York: Guilford Press, 1998, pp. 12–33. [Google Scholar]
  165. P. Carruthers. “The cognitive functions of language.” Behavioral and Brain Sciences 25 (2002): 657–74. [Google Scholar] [CrossRef] [PubMed]
  166. C. Koch. The Quest for Consciousness. Greenwood Village, CO: Roberts & Company Publishers, 2004. [Google Scholar]
  167. K. Nelson. “Emerging levels of consciousness in early human development.” In The Missing Link in Cognition: Origins of Self-reflective Consciousness. Edited by H. S. Terrace and J. Metcalfe. Oxford: Oxford University Press, 2005, pp. 116–41. [Google Scholar]
  168. Robert A. Clowes. “Self-regulation model of inner speech and its role in the organisation of human conscious experience.” Journal of Consciousness Studies 14 (2007): 59–71. [Google Scholar]
  169. D. Fernández-Espejo, T. Bekinschtein, M. M. Monti, J. D. Pickard, C. Junque, M. R. Coleman, and A. M. Owen. “Diffusion weighted imaging distinguishes the vegetative state from the minimally conscious state.” NeuroImage 54 (2011): 103–12. [Google Scholar] [CrossRef] [PubMed]
  170. L. Velly, M. F. Rey, N. J. Bruder, F. A. Gouvitsos, T. Witjas, J. M. Regis, J. C. Peragut, and F. M. Gouin. “Differential dynamic of action on cortical and subcortical structures of anesthetic agents during induction of anesthesia.” Anesthesiology 107 (2007): 202–12. [Google Scholar] [CrossRef] [PubMed]
  171. D. Hofstadter. I am a Strange Loop. New York: Basic Books, 2007. [Google Scholar]
  172. Carlo Sestieri, Maurizio Corbetta, Gian L. Romani, and Gordon L. Shulman. “Episodic memory retrieval, parietal cortex, and the default mode network: Functional and topographic analyses.” The Journal of Neuroscience 31 (2011): 4407–4420. [Google Scholar] [CrossRef] [PubMed]
  173. D. T. Stuss, T. W. Picton, and M. P. Alexander. “Consciousness, self-awareness and the frontal lobes.” In The Frontal Lobes and Neuropsychiatric Illness. Edited by S. Salloway, P. Malloy and J. Duffy. Washington, DC: American Psychiatric Press, 2001, pp. 101–109. [Google Scholar]
  174. J. Decety, and J. A. Somerville. “Shared representations between self and others: A social cognitive neuroscience view.” Trends in Cognitive Sciences 7 (2003): 527–533. [Google Scholar] [CrossRef]
  175. Debra A. Gusnard. “Being a self: Considerations from functional imaging.” Consciousness and Cognition 14 (2005): 679–97. [Google Scholar] [CrossRef] [PubMed]
  176. Alain Morin. “Right hemispheric self-awareness: A critical assessment.” Consciousness and Cognition 11 (2002): 396–401. [Google Scholar] [CrossRef]
  177. Alain Morin. “A neurocognitive and socioecological model of self-awareness.” Genetic, Social, and General Psychology Monographs 130 (2004): 197–222. [Google Scholar] [CrossRef] [PubMed]
  178. Alain Morin, and Jayson Michaud. “Self-awareness and the left inferior frontal gyrus: Inner speech use during self-related processing.” Brain Research Bulletin 74 (2007): 387–96. [Google Scholar] [CrossRef] [PubMed]
  179. Andrea E. Cavanna, and Michael R. Trimble. “The precuneus: A review of its functional anatomy and behavioral correlates.” Brain 129 (2006): 564–83. [Google Scholar] [CrossRef] [PubMed]
  180. G. G. Gallup Jr. “Chimpanzees: Self recognition.” Science 167 (1970): 86–87. [Google Scholar]
  181. Robert W. Mitchell. “Mental models of mirror-self-recognition: Two theories.” New Ideas in Psychology 11 (1993): 295–325. [Google Scholar] [CrossRef]
  182. Robert W. Mitchell. “Kinesthetic-visual matching and the self-concept as explanations of mirror-self-recognition.” Journal for the Theory of Social Behaviour 27 (1997): 18–39. [Google Scholar] [CrossRef]
  183. R. W. Mitchell. “Subjectivity and self-recognition in animals.” In Handbook of Self and Identity. Edited by M. R. Leary and J. P. Tangney. New York: Guilford Press, 2002, pp. 567–95. [Google Scholar]
  184. C. M. Heyes. “Theory of mind in nonhuman primates.” Behavioral and Brain Sciences 21 (1998): 101–134. [Google Scholar] [CrossRef] [PubMed]
  185. Gordon G. Gallup Jr., J. L. Anderson, and D. P. Shillito. “The mirror test.” In The Cognitive Animal: Empirical and Theoretical Perspectives on Animal Cognition. Edited by M. Bekoff, C. Allen and G. M. Burghardt. Chicago: University of Chicago Press, 2002, pp. 325–33. [Google Scholar]
  186. Julian P. Keenan, Dean Falk, and Gordon G. Gallup Jr. The Face in the Mirror: The Search for the Origins of Consciousness. New York: Harper Collins Publishers, 2003. [Google Scholar]
  187. K. B. Swartz. “What is mirror self-recognition in nonhuman primates, and what is it not?” In The Self Across Psychology: Self-recognition, Self-awareness, and the Self-concept. Edited by J. G. Snodgrass and R. L. Thompson. New York: New York Academy of Sciences, 1997, pp. 65–71. [Google Scholar]
  188. M. W. De Veer, and R. Van Den Bos. “A critical review of methodology and interpretation of mirror self-recognition research in nonhuman primates.” Animal Behaviour 58 (1999): 459–68. [Google Scholar] [CrossRef] [PubMed]
  189. Alain Morin. “Let’s face it.” Evolutionary Psychology 1 (2003): 177–87. [Google Scholar] [CrossRef]
  190. W. W. Seeley, D. A. Carlin, and J. M. Allman. “Early frontotemporal dementia targets neurons unique to apes and humans.” Annals of Neurology 60 (2006): 660–67. [Google Scholar] [CrossRef] [PubMed]
  191. Camilla Butti, Chet C. Sherwood, Atiya Y. Hakeem, and John M. Allman. “Total number and volume of von Economo neurons in the cerebral cortex of cetaceans.” The Journal of Comparative Neurology 515 (2009): 243–59. [Google Scholar] [CrossRef] [PubMed]
  192. Atiya Y. Hakeem, Chet C. Sherwood, Christopher J. Bonar, Camilla Butti, Patrick R. Hof, and John M. Allman. “Von Economo neurons in the elephant brain.” Anatomical Record 292 (2009): 242–48. [Google Scholar] [CrossRef] [PubMed]
  193. Angelo Maravita, Charles Spence, and Jon Driver. “Multisensory integration and the body schema: Close to hand and within reach.” Current Biology 13 (2003): R531–39. [Google Scholar] [CrossRef]
  194. W. C. McGrew, and L. F. Marchant. “Chimpanzee wears a knotted skin ‘necklace’.” Pan Africa News 5 (1998): 8–9. [Google Scholar]
  195. William C. McGrew. The Cultured Chimpanzee. Cambridge, UK: Cambridge University Press, 2004. [Google Scholar]
  196. T. Nishida, T. Matsusaka, and W. C. McGrew. “Emergence, propagation or disappearance of novel behavioral patterns in the habituated chimpanzees of Mahale: A review.” Primates 50 (2009): 23–36. [Google Scholar] [CrossRef] [PubMed]
  197. Robert G. Bednarik. “The role of Pleistocene beads in documenting hominid cognition.” Rock Art Research 14 (1997): 27–41. [Google Scholar]
  198. Robert G. Bednarik. “Middle Pleistocene beads and symbolism.” Anthropos 100 (2005): 537–52. [Google Scholar]
  199. Robert G. Bednarik. “Beads and the origins of symbolism.” Time and Mind 1 (2008): 285–318. [Google Scholar] [CrossRef]
  200. J. Boucher de Perthes. Antiquités Celtiques et Antédiluviennes (Celtic and Antediluvian Antiquities). Paris: Treuttel et Wurtz, 1846. [Google Scholar]
  201. M. Mottl. “Die Repolust-Höhle bei Peggau (Steiermark) und ihre eiszeitlichen Bewohner (The Repolust Cave near Peggau, Styria, and its Ice Age inhabitants).” Archaeologia Austriaca 8 (1951): 1–78. [Google Scholar]
  202. N. Goren-Inbar, Z. Lewy, and M. E. Kislev. “Bead-like fossils from an Acheulian occupation site, Israel.” Rock Art Research 8 (1991): 133–36. [Google Scholar]
  203. H. Ziegert. “Das neue Bild des Urmenschen (The new image of early humans).” Uni hh forschung 30 (1995): 9–15. [Google Scholar]
  204. H. Ziegert. “A new dawn for humanity: Lower Palaeolithic village life in Libya and Ethiopia.” Minerva 18 (2007): 8–9. [Google Scholar]
  205. F. d’Errico, and P. Villa. “Holes and grooves: The contribution of microscopy and taphonomy to the problem of art origins.” Journal of Human Evolution 33 (1997): 1–31. [Google Scholar] [CrossRef] [PubMed]
  206. S. Rigaud, F. d’Errico, M. Vanhaeren, and C. Neumann. “Critical reassessment of putative Acheulean Prosphaere globularis beads.” Journal of Archaeological Science 36 (2009): 25–34. [Google Scholar] [CrossRef]
  207. I. Turk, J. Dirjec, and B. Kavur. “Ali so v Sloveniji našli najstarejše glasbilo v Evropi? (The oldest musical instrument in Europe discovered in Slovenia?).” Razprave IV, razreda SAZU 36 (1995): 287–93. [Google Scholar]
  208. M. Turk, and L. Dimkaroski. “Neanderthal flute from Divje babe I: Old and new findings.” In Fragments of Ice Age Environments. Proceedings in Honour of Ivan Turk’s Jubilee. Edited by B. Toškan. Ljubljana: ZRC SAZU, 2011, pp. 251–65. [Google Scholar]
  209. Francesco d’Errico, Paola Villa, Ana C. Pinto Llona, and Rosa R. Idarraga. “A Middle Palaeolithic origin of music? Using cave-bear bone accumulations to assess the Divje babe I bone ‘flute’.” Antiquity 72 (1998): 65–79. [Google Scholar]
  210. J. Clottes, J.-M. Chauvet, E. Brunel-Deschamps, C. Hillaire, J.-P. Daugas, M. Arnold, H. Cachier, J. Evin, P. Fortin, C. Oberlin, and et al. “Les peintures paléolithiques de la Grotte Chauvet-Pont d’Arc, à Vallon-Pont-d’Arc (Ardèche, France): datations directes et indirectes par la méthode du radiocarbone (The Palaeolithic paintings of the Chauvet-Pont d’Arc Cave at Vallon-Pont-d’Arc, Ardèche, France: direct and indirect radiocarbon dating).” Comptes Rendus de l’Académie des Sciences de Paris 320, Ser. II (1995): 1133–40. [Google Scholar]
  211. A. Marshack. The Roots of Civilization. New York: McGraw-Hill; London: Weidenfeld and Nicolson, 1972. [Google Scholar]
  212. N. J. Conard. “A female figurine from the basal Aurignacian of Hohle Fels Cave in southwestern Germany.” Nature 459 (2009): 248–52. [Google Scholar] [CrossRef] [PubMed]
  213. Henry Plotkin. The Imagined World Made Real: Towards a Natural Science of Culture. London: Penguin Books, 2002. [Google Scholar]
  214. Geoffrey F. Miller. The Mating Mind: How Mate Choice Shaped the Evolution of Human Nature. New York: Doubleday, 2000. [Google Scholar]
  215. Geoffrey F. Miller. “Aesthetic fitness: How sexual selection shaped artistic virtuosity as a fitness indicator and aesthetic preference as mate choice criteria.” Bulletin of Psychology and the Arts 2 (2001): 20–25. [Google Scholar]
  216. M. A. C. Varella, A. A. L. de Souza, and J. H. B. P. Ferreira. “Evolutionary aesthetics and sexual selection in the evolution of rock art aesthetics.” Rock Art Research 28 (2011): 153–86. [Google Scholar]
  217. T. Verhoeven. “Pleistozäne Funde in Flores (Pleistocene finds in Flores).” Anthropos 53 (1958): 264–65. [Google Scholar]
  218. R. G. Bednarik. “The origins of navigation and language.” The Artefact 20 (1997): 16–56. [Google Scholar]
  219. S. Ginesu, S. Sias, and J. M. Cordy. “Morphological evolution of the Nurighe Cave (Logudoro, northern Sardinia, Italy) and the presence of man: First results.” Geografia Fisica e Dinamica Quaternaria 26 (2003): 41–48. [Google Scholar]
  220. P. Mortensen. “Lower to Middle Palaeolithic artefacts from Loutró on the south coast of Crete.” Antiquity 82 (2008): 1–6. [Google Scholar]
  221. K. Kopaka, and C. Matzanas. “Palaeolithic industries from the island of Gavdos, near neighbour to Crete in Greece.” Antiquity 83 (2009). Available online: http://www.antiquity.ac.uk/projgall/kopaka321/. [Google Scholar]
  222. T. F. Strasser, E. Panagopoulou, C. N. Runnels, P. M. Murray, N. Thompson, P. Karkanas, F. W. McCoy, and K. W. Wegmann. “Stone Age seafaring in the Mediterranean: Evidence from the Plakias region for Lower Palaeolithic and Mesolithic habitation of Crete.” Hesperia 79 (2010): 145–90. [Google Scholar] [CrossRef]
  223. Thomas F. Strasser, Curtis Runnels, Karl Wegmann, Eleni Panagopoulou, F. McCoy, Chad Digregorio, Panagiotis Karkanas, and Nick Thompson. “Dating Palaeolithic sites in southwestern Crete, Greece.” Journal of Quaternary Science, 2011. [Google Scholar] [CrossRef]
  224. Robert G. Bednarik. “Pleistocene seafaring in the Mediterranean.” Anthropologie 37 (1999): 275–82. [Google Scholar]
  225. Robert G. Bednarik. “The earliest evidence of ocean navigation.” International Journal of Nautical Archaeology 26 (1997): 183–91. [Google Scholar] [CrossRef]
  226. Derek Bickerton. Adam’s Tongue: How Humans Made Language, How Language Made Humans. New York: Hill and Wang, 2010. [Google Scholar]
  227. Robert G. Bednarik. “A taphonomy of palaeoart.” Antiquity 68 (1994): 68–74. [Google Scholar]
  228. D. Falk. “Comparative anatomy of the larynx in man and chimpanzee: Implications for language in Neanderthal.” American Journal of Physical Anthropology 43 (1975): 123–32. [Google Scholar] [CrossRef] [PubMed]
  229. D. Falk. “Cerebral cortices of east African early hominids.” Science 221 (1983): 1072–74. [Google Scholar] [CrossRef] [PubMed]
  230. Dean Falk. “Hominid paleoneurology.” Annual Review of Anthropology 16 (1987): 13–30. [Google Scholar] [CrossRef]
  231. Dean Falk. Finding Our Tongues: Mothers, Infants and the Origins of Language. New York: Basic Books, 2009. [Google Scholar]
  232. Thomas S. Kuhn. The Structure of Scientific Revolutions. Chicago: University of Chicago Press, 1962. [Google Scholar]
  233. P. A. Helvenston, and Robert G. Bednarik. “Evolutionary origins of brain disorders in Homo sapiens sapiens.” Brain Research Journal 3 (2011): 113–39. [Google Scholar]
  234. D. C. Rubinsztein, W. Amos, J. Leggo, S. Goodburn, R. S. Ramesar, J. Old, R. Bontrop, R. McMahon, D. E. Barton, and M. A. Ferguson-Smith. “Mutational bias provides a model for the evolution of Huntington’s disease and predicts a general increase in disease prevalence.” Nature Genetics 7 (1994): 525–30. [Google Scholar] [CrossRef] [PubMed]
  235. L. C. Walker, and L. C. Cork. “The neurobiology of aging in nonhuman primates.” In Alzheimer’s Disease, 2nd ed. Edited by R. D. Terry, R. Katzman, K. L. Bick and S. S. Sisodia. New York: Lippincott Williams & Wilkins, 1999, pp. 233–43. [Google Scholar]
  236. W. Enard, P. Khaitovich, J. Klose, F. Heissig, S. Zöllner, P. Giavalisco, K. Nieselt-Struwe, E. Muchmore, A. Varki, R. Ravid, and et al. “Intra- and interspecific variation in primate gene expression patterns.” Science 296 (2002): 340–43. [Google Scholar] [CrossRef] [PubMed]
  237. M. V. Olson, and A. Varki. “Sequencing the chimpanzee genome: Insights into human evolution and disease.” Nature Reviews Genetics 4 (2003): 20–28. [Google Scholar] [CrossRef] [PubMed]
  238. M. Marvanová, J. Ménager, E. Bezard, R. E. Bontrop, L. Pradier, and G. Wong. “Microarray analysis of nonhuman primates: Validation of experimental models in neurological disorders.” The FASEB Journal 17 (2003): 929–31. [Google Scholar] [CrossRef] [PubMed]
  239. C. C. Sherwood, A. D. Gordon, J. S. Allen, K. A. Phillips, J. M. Erwin, P. R. Hof, and W. D. Hopkins. “Aging of the cerebral cortex differs between humans and chimpanzees.” Proceedings of the National Academy of Sciences, 2011. [Google Scholar] [CrossRef]
  240. T. Yoshikawa, M. Kikuchi, K. Saito, A. Watanabe, K. Yamada, H. Shibuya, M. Nankai, A. Kurumaji, E. Hattori, H. Ishiguro, and et al. “Evidence for association of the myo-inositol monophosphatase 2 (IMPA2) gene with schizophrenia in Japanese samples.” Molecular Psychiatry 6 (2001): 202–10. [Google Scholar] [CrossRef] [PubMed]
  241. R. Spinks, H. K. Sandhu, N. C. Andreasen, and R. A. Philibert. “Association of the HOPA12bp allele with a large X-chromosome haplotype and positive symptom schizophrenia.” American Journal of Medical Genetics Part B: Neuropsychiatric Genetics 127 (2004): 20–27. [Google Scholar] [CrossRef] [PubMed]
  242. H. J. Cho, I. Meira-Lima, Q. Cordeiro, L. Michelon, P. C. Sham, H. Vallada, and D. A. Collier. “Population-based and family-based studies on the serotonin transporter gene polymorphisms and bipolar disorder: A systematic review and meta-analysis.” Molecular Psychiatry 10 (2005): 771–81. [Google Scholar] [CrossRef] [PubMed]
  243. Dawei Li, David A. Collier, and Lin He. “Meta-analysis shows strong positive association of the neuregulin 1 (NRG1) gene with schizophrenia.” Human Molecular Genetics 15 (2006): 1995–2002. [Google Scholar] [CrossRef] [PubMed]
  244. Ming-Qing Xu, David St Clair, and Lin He. “Meta-analysis of association between ApoE epsilon4 allele and schizophrenia.” Schizophrenia Research 84 (2006): 228–35. [Google Scholar] [CrossRef] [PubMed]
  245. M. Ayalew, H. Le-Niculescu, D. F. Levey, N. Jain, B. Changala, S. D. Patel, E. Winiger, A. Breier, A. Shekhar, R. Amdur, and et al. “Convergent functional genomics of schizophrenia: From comprehensive understanding to genetic risk prediction.” Molecular Psychiatry 17 (2012): 887–905. [Google Scholar] [CrossRef] [PubMed]
  246. Nick Craddock, and Ian Jones. “Genetics of bipolar disorder: Review article.” Journal of Medical Genetics 36 (1999): 585–94. [Google Scholar]
  247. N. Craddock, M. C. O’Donovan, and M. J. Owen. “The genetics of schizophrenia and bipolar disorder: Dissecting psychosis.” Journal of Medical Genetics 42 (2005): 193–204. [Google Scholar] [CrossRef] [PubMed]
  248. T. Saito, F. Guan, D. F. Papolos, S. Lau, M. Klein, C. S. Fann, and H. M. Lachman. “Mutation analysis of SYNJ1: A possible candidate gene for chromosome 21q22-linked bipolar disorder.” Molecular Psychiatry 6 (2001): 387–95. [Google Scholar] [CrossRef] [PubMed]
  249. P. Muglia, A. Petronis, E. Mundo, S. Lander, T. Cate, and J. L. Kennedy. “Dopamine D4 receptor and tyrosine hydroxylase genes in bipolar disorder: Evidence for a role of DRD4.” Molecular Psychiatry 7 (2002): 860–66. [Google Scholar] [CrossRef] [PubMed]
  250. P. Stopkova, J. Vevera, I. Paclt, I. Zukov, and H. M. Lachman. “Analysis of SYNJ1, a candidate gene for 21q22 linked bipolar disorder: A replication study.” Psychiatry Research 127 (2004): 157–61. [Google Scholar] [CrossRef] [PubMed]
  251. Aida M. Andres, Marta Soldevila, Arcadi Navarro, Kenneth K. Kidd, Baldomero Oliva, and Jaume Bertranpetit. “Positive selection in MAOA gene is human exclusive: Determination of the putative amino acid change selected in the human lineage.” Human Genetics 115 (2004): 377–86. [Google Scholar] [CrossRef] [PubMed]
  252. Martin Preisig, Francois Ferrero, and Alain Malafosse. “Monoamine oxidase A and tryptophan hydroxylase gene polymorphisms: Are they associated in bipolar disorder?” American Journal of PharmacoGenomics 5 (2005): 45–52. [Google Scholar] [CrossRef] [PubMed]
  253. M. Jansson, S. McCarthy, P. F. Sullivan, P. Dickman, B. Andersson, L. Oreland, M. Schalling, and N. L. Pedersen. “MAOA haplotypes associated with thrombocyte-MAO activity.” BMC Genetics 6 (2005): 46. [Google Scholar] [CrossRef] [PubMed]
  254. Benjamin F. Voight, Sridhar Kudaravalli, Xiaoquan Wen, and Jonathan K. Pritchard. “A map of recent positive selection in the human genome.” PLoS Biology 4 (2006): e72. [Google Scholar] [CrossRef] [PubMed]
  255. E. H. Hare. “Schizophrenia as a recent disease.” The British Journal of Psychiatry 153 (1988): 521–31. [Google Scholar] [CrossRef] [PubMed]
  256. P. D. Evans, S. L. Gilbert, N. Mekel-Bobrov, E. J. Vallender, J. R. Anderson, L. M. Vaez-Azizi, S. A. Tishkoff, R. R. Hudson, and B. T. Lahn. “Microcephalin, a gene regulating brain size, continues to evolve adaptively in humans.” Science 309 (2005): 1717–20. [Google Scholar]
  257. N. Mekel-Bobrov, S. L. Gilbert, P. D. Evans, E. J. Vallender, J. R. Anderson, S. A. Tishkoff, and B. T. Lahn. “Ongoing adaptive evolution of ASPM, a brain size determinant in Homo sapiens.” Science 309 (2005): 1720–22. [Google Scholar] [CrossRef] [PubMed]
  258. Beate Hermelin, and Neil O’Connor. Psychological Experiments with Autistic Children. Oxford & New York: Pergamon Press, 1970. [Google Scholar]
  259. Uta Frith. Autism: Explaining the Enigma. Oxford: Blackwell, 1989. [Google Scholar]
  260. C. Hughes, I. Soares-Boucaud, J. Hochmann, and U. Frith. “Social behaviour in pervasive developmental disorders: Effects of informant, group and ‘theory-of-mind’.” European Child & Adolescent Psychiatry 6 (1997): 191–98. [Google Scholar] [CrossRef]
  261. S. Baron-Cohen. “The extreme male brain theory of autism.” Trends in Cognitive Sciences 6 (2002): 248–54. [Google Scholar] [CrossRef]
  262. Simon Baron-Cohen. “Two new theories of autism: Hyper-systemizing and assortative mating.” Archives of Disease in Childhood 91 (2006): 2–5. [Google Scholar] [CrossRef] [PubMed]
  263. J. M. Allman, K. K. Watson, N. A. Tetreault, and A. Y. Hakeem. “Intuition and autism: A possible role for von Economo neurons.” Trends in Cognitive Sciences 9 (2005): 367–73. [Google Scholar] [CrossRef] [PubMed]
  264. Richard R. Grinker. Unstrange Minds: Remapping the World of Autism. New York: Basic Books, 2007. [Google Scholar]
  265. J. R. Brasic. “Autism.” eMedicine Medscape, 2009. Available online: http://emedicine.medscape.com/article/912781-print (accessed on 8 January 2012). [Google Scholar]
  266. J. R. Brasic. “Asperger’s syndrome.” eMedicine Medscape, 2009. Available online: http://emedicine.medscape.com/article/912296-print (accessed on 8 January 2012). [Google Scholar]
  267. J. R. Brasic. “PET scanning in autism spectrum disorders.” eMedicine Medscape, 2010. Available online: http://emedicine.medscape.com/article/1155568-print (accessed on 8 January 2012). [Google Scholar]
  268. M. Balter. “A mind for sociability.” Science Now Daily News 27 (2007): 1. [Google Scholar]
  269. Jacob A. Burack, T. Charman, N. Yirmiya, and P. R. Zelazo, eds. The Development of Autism: Perspectives from Theory and Research. London: Taylor & Francis/Routledge, 2009.
  270. K. Weintraub. “Autism counts.” Nature 479 (2011): 22–24. [Google Scholar]
  271. Young S. Kim, Bennett Leventhal, Yoo-Joo Koh, Eric Fombonne, Eugene Laska, Eun-Chung Lim, Keun-Ah Chun, Soo-Jeong Kim, Young-Key Kim, Hyunkyung Lee, and et al. “Prevalence of autism spectrum disorders in a total population sample.” The American Journal of Psychiatry 168 (2011): 904–12. [Google Scholar] [CrossRef] [PubMed]
  272. L. Buchen. “When geeks meet.” Nature 479 (2011): 25–27. [Google Scholar] [CrossRef] [PubMed]
  273. F. Bermejo-Pareja, J. Benito-León, S. Vega, M. J. Medrano, and G. C. Román. “Incidence and subtypes of dementia in three elderly populations of central Spain.” Journal of the Neurological Sciences 264 (2008): 63–72. [Google Scholar] [CrossRef] [PubMed]
  274. Lonneke M. L. de Lau, and Monique M. Breteler. “Epidemiology of Parkinson’s disease.” The Lancet Neurology 5 (2006): 525–35. [Google Scholar] [CrossRef]
  275. Plato. Phaedrus, ca. 370 BCE, 274e–275a.
  276. Milford Wolpoff. Paleoanthropology, 2nd ed. New York: McGraw-Hill, 1999. [Google Scholar]
  277. Francis Fukuyama. Our Posthuman Future: Consequences of the Biotechnology Revolution. New York: Farrar, Straus and Giroux, 2002. [Google Scholar]
  278. Ray Kurzweil. The Singularity Is Near: When Humans Transcend Biology. New York: Viking, 2005. [Google Scholar]
  279. Bert Gordijn, and Ruth Chadwick, eds. Medical Enhancement and Posthumanity. Dordrecht: Springer, 2008.
  280. Robert L. Heilbroner. Visions of the Future: The Distant Past, Yesterday, Today, Tomorrow. New York: Oxford University Press, 1995. [Google Scholar]
  281. Francis Fukuyama. The End of History and the Last Man. New York: Free Press, 1992. [Google Scholar]
  282. David M. Raup. Extinction: Bad Genes or Bad Luck? New York: W. W. Norton, 1991. [Google Scholar]
