The Original Industrial Revolution: Did Cold Winters Select for Cognitive Ability?

Rushton and Jensen argued that cognitive ability differs between human populations. But why are such differences expectable? Their answer: as modern humans spread out of Africa and into northern Eurasia, they entered colder and more seasonal climates that selected for the ability to plan ahead, in order to store food, make clothes, and build shelters for winter. This cold winter theory is supported by research on Paleolithic humans and recent hunter-gatherers. Tools become more diverse and complex as effective temperature decreases, apparently because food has to be obtained during limited periods and over large areas. There is also more storage of food and fuel and greater use of untended traps and snares. Finally, shelters have to be sturdier, and clothing more cold-resistant. The resulting cognitive demands are met primarily by women because the lack of opportunities for food gathering pushes them into more cognitively demanding tasks, like garment making, needlework, weaving, leatherworking, pottery, and kiln operation. The northern tier of Paleolithic Eurasia thus produced the "Original Industrial Revolution"—an explosion of creativity that preadapted its inhabitants for later developments, i.e., farming, more complex technology and social organization, and an increasingly future-oriented culture. Over time, these humans would spread south, replacing earlier populations that could less easily exploit the possibilities of the new cultural environment. As this environment developed further, it selected for further increases in cognitive ability. Indeed, mean intelligence seems to have risen during recorded history at temperate latitudes in Europe and East Asia. There is thus no single explanation for the evolution of human intelligence: a key stage was adaptation to cold winters during the Paleolithic, but much happened later.


Introduction
It has been proposed that the farther north the populations migrated out of Africa, the more they encountered the cognitively demanding problems of gathering and storing food, gaining shelter, making clothes, and raising children successfully during prolonged winters. [1] (p. 266)

In their joint article, J. Philippe Rushton and Arthur Jensen argued that cognitive ability differs between human populations, specifically between black and white Americans. But why are such differences expectable? This question was answered briefly: as Paleolithic humans spread out of Africa and into the northern latitudes of Eurasia, they entered colder and more seasonal climates that imposed higher cognitive demands. This cold winter theory is described at length in Rushton's book Race, Evolution, and Behavior, where he makes the following points:

• Colder and longer winters increased the need for planning, particularly for the storage of food and fuel.

• Plant foods were unavailable during winter and spring, making meat crucial for survival. A greater range of tools had to be developed for hunting and for butchering meat.

• Game animals were pursued in open environments. Hunters had to strategize and coordinate with other hunters, e.g., to drive animals over cliffs or into ravines.

• To keep warm, people had to find novel ways to make fire, clothing, and shelter. Fire, for instance, had to be made by friction or percussion in a treeless environment with little wood [2] (pp. 225-226).
The above points are taken from earlier work by Edward Miller and Richard Lynn [3][4][5][6]. Lynn came up with the general argument, and Miller developed it further:

As I remember things Lynn first mentioned it in general terms but did not provide details as to why high intelligence may be advantageous. I think I filled the argument in with details like a need to plan (how long a stored food supply will last), the importance of being able to start a fire and keep it going in adverse conditions, and the added complexity of making warm clothes. [...] What I felt was an original contribution related to the importance of male-female cooperation during winters. Virtually all food has to be from large hunted game since digging for tubers will not work in frozen ground, eggs, berries, and fruits are absent, cold blooded reptiles and slugs are absent, and any fallen nuts are hidden by snow. Only large mammals remain. [7]

The point on male-female cooperation was inspired by an article of mine on another subject [8]. In any case, few ideas are original, and other authors had already argued that cold climates select for cognitive ability. Miller says as much in his review of the literature:

Lynn is not the first to argue that the benefits of intelligence were greatest for those populations living in cold climates during the Ice Ages (for an early exposition see Huntington 1924, Chapter IV [9]). It is common to argue that the impetus for the enlargement of the brain size "during the era of Homo erectus was due to an expansion out of the tropics and into cool regions where ingenuity and flexibility of behavior were more necessary for survival" (Campbell 1976, p. 324) [10]. More recently, Calvin (1991) [11] has argued that human intelligence was strongly selected for in cold areas during the Ice Ages, although without recognizing that this hypothesis had implications for the geographical distribution of intelligence. [3] (p. 127)
This argument was made even earlier, in 1864, by Darwin's colleague Alfred Russel Wallace:

So when a glacial epoch comes on, some animals must acquire warmer fur, or a covering of fat, or else die of cold. Those best clothed by nature are, therefore, preserved by natural selection. Man, under the same circumstances, will make himself warmer clothing, and build better houses; and the necessity of doing this will react upon his mental organisation and social condition [...].

[...] a hardier, a more provident, and a more social race would be developed, than in those regions where the earth produces a perennial supply of vegetable food, and where neither foresight nor ingenuity are required to prepare for the rigours of winter. And is it not the fact that in all ages, and in every quarter of the globe, the inhabitants of temperate have been superior to those of tropical countries? All the great invasions and displacements of races have been from North to South, rather than the reverse. [12]

One can go back earlier still, to the German philosopher Arthur Schopenhauer (1788-1860):

All this is due to the fact that necessity is the mother of invention, because those tribes that emigrated early to the north, and there gradually became white, had to develop all their intellectual powers, and invent and perfect all the arts in their struggle with need, want and misery, which in their many forms, were brought about by the climate. This they had to do in order to make up for the parsimony of nature, and out of it came their high civilization. [13] (p. 159)

Neither Miller nor Lynn was aware of the above passage when I brought it to their attention. It seems that over the past two centuries different people have independently come up with the same idea.
A different idea, however, prevailed farther back in time: medieval scholars thought intelligence to be highest between the cold north and the hot south. This was the view of Sa'id al-Andalusi (1029-1070), a historian and philosopher in Toledo, Spain:

For those who live furthest to the north between the last of the seven climates and the limits of the inhabited world, the excessive distance of the sun in relation to the zenith line makes the air cold and the atmosphere thick. Their temperaments are therefore frigid, their humors raw, their bellies gross, their color pale, their hair long and lank. Thus they lack keenness of understanding and clarity of intelligence, and are overcome by ignorance and dullness, lack of discernment, and stupidity. Such are the Slavs, the Bulgars, and their neighbors. For those peoples on the other hand who live near and beyond the equinoctial line to the limit of the inhabited world in the south, the long presence of the sun at the zenith makes the air hot and the atmosphere thin. Because of this their temperaments become hot and their humors fiery, their color black, and their hair woolly. Thus they lack self-control and steadiness of mind and are overcome by fickleness, foolishness, and ignorance. Such are the blacks, who live at the extreme of the land of Ethiopia, the Nubians, the Zanj and the like. [14] (pp. 47-48)

Similar views were expressed by Maimonides (1135-1204), Ibn Khaldun (1332-1406), and others [15] (supplementary file A). This idea also appears in the writings of some classical authors, notably Aristotle (384-322 BC): "The nations inhabiting the cold places and those of Europe are full of spirit but somewhat deficient in intelligence and skill. [...] The peoples of Asia on the other hand are intelligent and skillful in temperament, but lack spirit. [...] But the Greek race participates in both characters, just as it occupies the middle position geographically, for it is both spirited and intelligent" (Politics 7.6.1) [16] (pp. 97, 110-111).

A Closer Look at the Cold Winter Theory: Paleolithic Hunter-Gatherers
So how true is the cold winter theory? We know that modern humans began spreading into northern Eurasia some forty thousand years ago, when the climate was cooler than it is today, a time called the Last Glacial Period or, more commonly, the last Ice Age [17] (p. 141). There, they encountered an environment unlike any that currently exists: vast expanses of steppe-tundra that were remarkably rich in food resources. This world is described by Fredric Weizmann and others in a rejoinder to Rushton:

However, that earlier world was not simply a colder version of today's. V. Geist (1978) [18] describes the periglacial climate as something quite different from Baffin Island or Northeastern Siberia in January, or from any other existing environment. It was not particularly harsh, but was rather benign and ecologically rich: "... between the Urals and the Altai in the Pleistocene, early man would have had as mixed a bag as camel, moose, reindeer, yak and roe deer" (Geist, 1978, p. 201) [18]. "The periglacial ecosystem approached some African ecosystems in diversity, and was apparently far more productive than the climax ecosystems at comparable latitudes and altitudes" (Geist, 1978, p. 188) [18]. [19] (p. 48)

Actually, this environment was both harsh and rich. The apparent contradiction lies in the fact that the ice sheets had pushed the tundra zone to the south, particularly in Europe, where the stronger sunlight spurred the growth of grasses, mosses, lichens, and low shrubs. This plant growth supported large herds of herbivores and, in turn, a large human population. John Hoffecker writes: "Despite estimated winter temperatures of between −20 and −30 degrees C, warm, continental summers probably raised effective temperature above 10.5 degrees C [...]. Primary productivity must have been significantly higher than that of modern arctic tundra environments, and closer to that of a temperate grassland [...]" [17] (p. 23).
In short, humans could thrive in this environment if they could cope with the yearly cycle of warm summers and frigid winters. To this end they developed new forms of technology and organization:

The success of modern human technology and organization is most dramatically apparent after the beginning of the Last Glacial Maximum (OIS 2), when modern humans occupied the central East European Plain under full glacial conditions. They created a remarkable adaptation to this bizarre environment that employed ingenious technology and complex organizational structures. [17] (p. 254)

The period of full glacial conditions saw these humans occupy large areas of northern Eurasia that Neanderthals had never occupied [17] (p. 2). They succeeded through ingenuity: "Modern humans quickly developed an array of novel technological solutions to the challenges of northern habitats that more than compensated for their warm-climate morphology" [17] (p. 2). Northern habitats posed three challenges:

• Keeping the cold out and the warmth in. Clothing had to be tailored to retain body heat. This required development of bone awls and eyed needles of bone and ivory, as well as stone end-scrapers and burnishers to prepare hides [17] (pp. 160, 225). For the same reason, shelters had to be sturdier and more cold-resistant, being generally made of mammoth bone and heated with bone-fuel hearths [17] (p. 192). By contrast, tropical hunter-gatherers did not correspondingly develop heat-removal technologies (rotary fans, ice creation or storage, air conditioning, etc.), probably because heat removal is technologically more difficult than heat retention. Adaptations to heat were thus much more biological (having a higher surface area to volume ratio) or behavioral (remaining in the shade at midday).

• Managing time. There was a greater need to plan and schedule because food and fuel were available for limited periods [17] (p. 135), [20]. As Marek Zvelebil puts it: "In such areas, one or two seasonally abundant resources may be relied on to produce the critical storable surplus for the lean seasons. This would require short periods of intensive harvest and precise scheduling during those times of the year when these resources were available" [21] (p. 170). This time constraint was solved through (1) time budgeting, (2) preparation of specific tools in advance for specific tasks, (3) use of untended devices, such as pits, traps, weirs, and nets, and (4) digging of storage pits down to the permafrost layer to refrigerate perishables, either meat for year-round consumption or bones for winter fuel [17] (pp. 161, 192, 229), [21] (p. 170). Zvelebil dubs these solutions "the original industrial revolution." By contrast, tropical hunter-gatherers had less need to manage time. The heat rapidly spoiled meat and other foodstuffs, thus putting a premium on immediate use or exchange. Moreover, food and fuel sources were distributed more evenly over space and time than in the Arctic and sub-Arctic.

• Changing the sexual division of labor. In the tropics, men hunted game animals while women gathered fruits, berries, nuts, seeds, roots, and other small food items. This division of labor became less feasible as humans spread north into environments with colder and longer winters, where women had few opportunities for food gathering. This was especially so on open steppe-tundra.
In such environments, women took on tasks unrelated to food procurement: garment making, shelter building, fire making, pottery, and ornamentation. These tasks required new tools, like hand-powered rotary drills for manufacture of ornamental objects and perhaps for fire making [17] (pp. 161, 169), [22]. There was also development of ceramic technology, including kilns heated to between 500 and 800 degrees C [23]. Three sites in Moravia have yielded more than 10,000 ceramic artefacts, mostly figurines [24]. There were also advances in weaving, notably manufacture of ropes for rafts and nets [25]. Indeed, the Paleolithic saw northern Eurasians develop a wide range of woven goods: "The Eurasian inventory includes diverse cordage, knotted netting, plaited wicker-style basketry, and textiles, including simple and diagonal twined pieces and plain woven and twilled objects. Furthermore, some of these pieces show conjoining of two pieces of fabric by a whipping stitch to produce a seam, or sewing" [26]. By contrast, women in tropical hunter-gatherer societies had less opportunity to develop new tasks, apparently because they had to devote so much time to food gathering.

Recent Hunter-Gatherers
We see the same relationship between climate and technological complexity in hunter-gatherers of recent times. Tools become more diverse and complex as effective temperature decreases, apparently because food has to be obtained during shorter periods [17] (p. 10), [27] (pp. 181-195), [20] (pp. 17-20). This time constraint also leads to food and fuel storage, which is practiced only where growing seasons are shorter than about 200 days. Finally, untended traps and snares are used because resources are unpredictable and widely dispersed [28,29], [17] (p. 10), [20].
As in the Paleolithic, this technological complexity is created primarily by women. Because men provide most of the food through hunting, women are free to perform tasks unrelated to food procurement. This is what Nicole Waguespack [30] concludes in her cross-cultural study of the sexual division of labor:

Specific tasks that become increasingly dominated by female labor as meat dependence increases include house building, leatherworking, and burden carrying. [...] The increased involvement in house building and burden carrying suggests women's labor is linked to moving and establishing new residential camps. Measures of hunter-gatherer mobility, both in terms of the number of residential camps established and the distances between moves, are known to be positively associated with reliance on hunting (Binford 2001:269-280 [31]; Kelly 1995:111-160 [32]). Women not only perform these roles more commonly than men when the majority of the diet is derived through hunting but also are likely to perform these tasks more frequently as well. [30] (p. 671)

Wendell Oswalt studied material culture in two Inuit groups and found that technological complexity was highest in garment making and shelter building, both of which were either wholly or largely women's work [33]. Cold winters thus changed the sexual division of labor in a crucial way, with men remaining as food providers and women pioneering the development of new skills. "In an unforgiving climate and terrain, men needed women's skills as processors of food and skin as much as women needed the raw materials from which they created nutritious meals and protective clothing" [34] (p. 37).
Waguespack concludes that it was this sexual division of labor in the Arctic, and not farming or population growth at lower latitudes, that pushed humans to go beyond the basics of food procurement and develop new skills. "A major shift in the partitioning of labor is generally associated with the advent of plant and animal domestication and increases in hunter-gatherer population density [...] However, my analysis of hunter-gatherer economy-and the work of others [...] on this topic-suggests that the partitioning of labor between food procurement and other often-related tasks is, in part, related to the degree of dependence on hunted foods" [30] (p. 674).

The Original Industrial Revolution: Preadaptation for Later Developments
The hunting peoples of northern Eurasia thus became mentally preadapted for later developments, particularly farming and the need for seasonal planting, harvesting, and food storage [35]. Those developments happened not only later but also farther south, in temperate and even tropical environments. There was thus a southward expansion of these northern peoples, a vast movement that would eventually cover almost all of Eurasia, North Africa, Oceania, and the Americas.
This process started even before the last Ice Age ended some 10,000 years ago. Around the time of the glacial maximum, northern Eurasians began to differentiate into western and eastern groups [36]. The two groups then expanded southward, the western one spreading throughout Europe and into the Middle East and South Asia, and the eastern one into the Americas, East Asia, Southeast Asia, and Oceania. This expansion took place at the expense of humans who had earlier spread into Asia from Africa along a southern route, i.e., via the Indian Ocean and Pacific coastlines [37][38][39][40][41], [42] (p. 243). Today, those southern Eurasians survive as a few relic groups: the Vedda of Sri Lanka, the Andaman Islanders, the Semang of the Malayan Peninsula, and the Aeta of the Philippines. To varying degrees, their genetic legacy also survives in the gene pool of present-day South and Southeast Asians [43,44].
Thus, East Asians are today largely the product of an expansion that began in northern Asia during the Paleolithic and progressively spread south during the Neolithic [45]. This demographic expansion is described in a recent summary of the archaeological evidence:

Further, but much later in time, populations with a (proto?) Northeast Asian morphology, potentially originating in eastern Siberia, would appear to have migrated into northern and central China, eventually becoming associated with the rise of the Neolithic in central China. These cranio-dentally modern East Asian populations expanded and moved southwards, eventually meeting with pottery-using forager communities (that were cranio-dentally Australo-Melanesian), which had relatively high-density settlements, throughout southern China and northern Vietnam. [46]

This is the most widely accepted version of the "Two-Layer" (TL) hypothesis: (1) anatomically modern humans (AMH) first spread into Asia through a northern route and a southern route; (2) the southerners were then replaced to varying degrees by northerners who spread southward from northern Asia and successively occupied northern China, southern China, and Southeast Asia:

Under this model, extant "Negrito" populations-small bodied hunter-gatherer populations scattered in the Philippines, Malaysia, and Andaman Islands-are interpreted as "relic" descendants of the initial AMH late Pleistocene entry [...] The most parsimonious version of the TL hypothesis (TL1) considers that the early Holocene source population (=southeastern Chinese homeland) of the Neolithic dispersal into SEA is descended from the initial AMH Late Pleistocene expansion, implying a population continuity from the pre-Neolithic to Neolithic cultural periods in China, as recently reported in northeast Asia [...] Nevertheless, this model is challenged by the accumulating genetic, paleoanthropological, and archaeological evidence pointing to at least two late Pleistocene AMH dispersals into Asia. [...] The only way to accommodate the TL hypothesis (TL2) with recent genetic evidence is to consider that the source population (=southeastern Chinese homeland) of the Neolithic dispersal into SEA is descendant of a migration from northern China between 9 and 10 ka [...] that derived from a northern late Pleistocene expansion into East Asia. [47] (pp. 42-43)

[...] our results fit better with a demic diffusion model involving an early Holocene migration from northern China into southeastern China, from which the Neolithic dispersal into SEA (TL2 model) took place. [47] (p. 53)

A similar process of replacement seems to have taken place farther west. Fifteen to twelve thousand years ago, the Middle East was home to the Natufians, who were as anatomically similar to sub-Saharan Africans as they were to present-day Eurasians [48,49]. Likewise, many African-looking skulls and skeletons have been found in an arc of territory stretching from Brittany, through Switzerland and northern Italy, and into the Balkans. Most are from the Neolithic, but some are as recent as the Bronze Age and the early Iron Age [50] (pp. 291-292). Despite their anatomical resemblance to present-day sub-Saharan Africans, the Natufians were a branch of the Out-of-Africa expansion into Eurasia, with around half of their ancestry corresponding to "Basal Eurasians" [51]. They were Eurasians who had not been modified by the selection pressures that prevailed farther north.
How could northern Eurasians replace humans who had long been adapted to the climate, vegetation, and wildlife of southern latitudes? What happened to the home team advantage? To answer this question, we must remember that humans adapt much more to their cultural environment than to their natural environment. The pace of genetic evolution in our species sped up more than a hundredfold some 10,000 years ago, even though humans had already colonized the earth's surface from the equator to the Arctic Circle. They were no longer coping with new forms of climate, vegetation, and wildlife. They were now adapting to new forms of social complexity, such as new tools and technologies, means of subsistence, social structures, and belief systems [52].
The temperate and tropical zones offered more possibilities for building larger and more complex societies, and those possibilities were exploited more effectively by northern hunting peoples. This can be seen in variation at COMT, a gene linked to executive function, working memory, and intelligence. At this gene, the Met allele is positively correlated with population IQ (r = 0.57) and is more frequent in farming societies than in hunter-gatherers, who have very low frequencies of the Met allele and a disproportionate predominance of the Val allele. This correlation has one exception: "hunter-gatherers living at high latitudes (Inuit) show high frequencies of the Met allele, possibly due to the higher pressure on technological skills and planning abilities posed by the adverse climatic conditions near the North Pole" [53].
Northern hunting peoples thus broke free of the cognitive straitjacket imposed by hunting and gathering. Their descendants would rise to the challenges of social complexity, thus creating selection pressures for further increases in cognitive ability. This was especially so in the advanced societies that would develop in Europe and East Asia.

Brain Size and Latitude in Humans
Until recently, the figure of Stephen Jay Gould dominated mainstream thinking on this subject. He authored a paper on Samuel George Morton in which he accused this leading physical anthropologist of fudging the data to make Europeans look bigger-brained than sub-Saharan Africans [54]. These findings were carried over into The Mismeasure of Man (1981), a bestseller and, eventually, required reading in many undergraduate courses [55]. It was hailed by his colleague Richard Lewontin in a posthumous tribute:

Despite the myth of detached objectivity that scientists propagate, their motivations are as messy as everyone else's. In particular, they have political, social, and personal concerns that may influence what they do, how they do it, and what they say about it. Putting aside deliberate fraud, of which we have an embarrassment of examples, the gathering of data, their statistical representation, and their interpretation offer many opportunities for unconscious bias toward conclusions that we already "knew" to be true. [56]

In 2011, several physical anthropologists led by Jason Lewis located almost half of the skulls that Morton had measured more than a century and a half ago. They remeasured them and found very few errors in Morton's measurements. Furthermore, the errors were distributed randomly, there being in fact a non-significant tendency to overestimate African skull size [57].
Gould had never measured any of the skulls. His paper was at best a flawed reanalysis of Morton's published data. More disturbingly, the flaws had been pointed out much earlier in a paper that remained ignored despite being published in Current Anthropology [58]. Gould owed his reputation not so much to the quality of his work as to an academic milieu that covered for him and acted more like cheerleaders than responsible critics.

Does brain size differ among human populations? This question was briefly addressed by Lewis and his colleagues: "[...] cranial capacity variation in human populations appears to be largely a function of climate, so, for example, the full range of average capacities is seen in Native American groups, as they historically occupied the full range of latitudes" [57]. Readers were referred to an earlier study by Kenneth Beals and others, who plotted brain size from 122 human populations as a function of latitude. The correlation was high: 0.76 for the Old World and 0.44 for the New World. The authors concluded that higher latitudes favor people with broader and more globular heads, which they explained as a means to retain body heat. This explanation, however, is challenged by their own data:

The evidence suggests that thermoregulation has more effect upon the cranium than upon the body as a whole. The highest correlations occur with the coefficient of cranial morphology, absolute volume, and capacity relative to stature. Lower correlations are observed with surface area:mass ratio, cephalic index, nasal index, and ponderal index. Lower yet (but still significant) are the correlations with weight and body surface area. [59] (p. 312)

If heads are larger at higher latitudes in order to retain body heat, why would such thermoregulation be less crucial for the body as a whole, as measured by the surface area to mass ratio?
This explanation was also challenged by Iwatoro Morimoto, who pointed out that "in recent centuries, brachycranic skulls show a considerable increase in frequency in Eurasian populations, including the Japanese, that live in warm climates" [59] (p. 322).
Brain size correlates with latitude not only in humans but also in hominids collectively, from Australopithecus afarensis to Homo sapiens [60]; however, this correlation may be due to the overall increase in brain size over time and the parallel expansion of ancestral hominids into higher latitudes.
In sum, brains are bigger at higher latitudes, but the reason is still a matter of debate. Was it to retain more body heat during the last Ice Age? Or was it to provide more cognitive capacity? Whatever the reason, we must also explain why brain size has increased since Ice Age times in Eurasians at lower latitudes.

Brain Size and Latitude in Other Species
Do cold winters select for cognitive ability in the rest of the animal kingdom? In some species, yes, but not in most. It seems that most animals resolve the cold winter problem by hibernating, by going south for winter, by dying before winter comes, or by living under water. Only a minority of animals, particularly birds, confront the winter cold as fully active and conscious beings [61].
If we look at passerine birds, we see that the ratio of brain size to body size is higher in those that overwinter in the Arctic than in those that go south. Their behavior is also more flexible and innovative [62][63][64]. The black-capped chickadee (Poecile atricapillus) has a larger hippocampus and more hippocampal neurons in high-latitude populations from Alaska than in low-latitude ones from Kansas [65]. These geographic differences within a single bird species remain even among captive chickadees raised under the same conditions:

We found that birds from the harsh northern population, where selection for cognitive abilities is expected to be high, significantly outperformed conspecifics from the mild southern population. Our results imply differences in cognitive abilities that may be inherited, as individuals from both populations were raised in and had experienced identical environmental conditions from 10 days of age. [66] (p. 3187)

Cognitive differences between northern and southern chickadees are nonetheless greater in the wild than in captivity [67,68]. It seems that northern birds push the envelope of their phenotype to achieve higher cognitive performance. Selection then shifts their mean genotype in the same direction.
The picture is very different if we look at animals that remain inactive during winter. Brain size is smaller in amphibians that have longer seasons of inactivity [69]. It is likewise smaller in hibernating mammals: "Using a comparative phylogenetic approach on brain size estimates of 1104 mammalian species, and controlling for possible confounding variables, we indeed found that the presence of hibernation in mammals is correlated with decreased relative brain size" [70] (p. 1582). This is the case with vertebrates in general: brain size is positively correlated with mean temperature when body size is controlled [71]. Again, it seems that most animals adapt to winter by avoiding it in one way or another. Humans are among the few that confront it head-on.

Rushton, the Cold Winter Theory, and the Problem of Population Differences in IQ
Cold winters seem to select for cognitive ability, at least in humans. This theory has in fact been respectable and even mainstream among archaeologists, paleontologists, and biologists, if only because few of them have explored its implications. It is controversial only in the context of Rushton's work on race and IQ. Admittedly, that is the context of this paper.
Mean IQ differs among human populations, and these differences have sparked much debate, often bitter debate. Is the cause genetic? Is it environmental (nutrition, education, etc.)? Could it be differences in methodology or in attitudes to test taking? For example, how real was the IQ gap between West Germany and East Germany? Most likely the cause was sampling bias: in the West, troublesome individuals could more easily skip school, evade the draft, and thus be unavailable for measurement of IQ [72].
New light has been shed on this question by genomic analyses, specifically studies of SNPs (single nucleotide polymorphisms) whose variants are associated with differences in educational attainment. Such SNPs appear to be numerous, numbering at least in the thousands. Even a few, however, can show the strength of natural selection for intelligence, just as a weathervane can show the direction of the wind. Six years ago, these weathervanes caught the interest of Davide Piffer, and he began examining data on ten of them from various human populations. For each population he estimated its genetic capacity for intelligence by calculating a "polygenic score", i.e., the number of SNP variants associated with high educational attainment, out of a maximum of ten. The score correlated with population IQ (r = 0.90) and PISA scores (r = 0.84), being highest in East Asians and lowest in sub-Saharan Africans [73]. These results were preliminary: with only ten SNPs, the geographic pattern might have been due to chance alone. Over the following years, as other researchers discovered additional SNPs associated with educational attainment, Piffer repeated his study with ever-larger datasets. His latest study (published in this issue) uses 2411 SNPs, and the polygenic score correlates even more strongly with population IQ (r = 0.98). The geographic pattern is the same [74]. If this pattern is due to chance, why do we keep getting the same one?
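The calculation behind such a population-level polygenic score can be sketched as follows. This is a minimal illustration, not Piffer's actual pipeline: all allele frequencies and IQ values below are hypothetical placeholders, and the score is computed here simply as the mean frequency of the trait-associated variants across SNPs.

```python
# Minimal sketch of a population-level polygenic score and its correlation
# with population IQ. All numbers are hypothetical illustrations, not real
# allele frequencies or measured IQs.

def polygenic_score(freqs):
    """Mean frequency of the trait-increasing allele across SNPs."""
    return sum(freqs) / len(freqs)

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical frequencies of "education-increasing" variants at 5 SNPs
populations = {
    "A": [0.62, 0.55, 0.70, 0.48, 0.66],
    "B": [0.50, 0.47, 0.58, 0.41, 0.52],
    "C": [0.38, 0.35, 0.44, 0.30, 0.40],
}
iq = {"A": 105, "B": 99, "C": 85}  # hypothetical population means

scores = {p: polygenic_score(f) for p, f in populations.items()}
r = pearson_r([scores[p] for p in populations], [iq[p] for p in populations])
print(f"scores: {scores}")
print(f"r(score, IQ) = {r:.2f}")
```

With only a handful of SNPs, as in the toy data above, a high correlation could indeed arise by chance; the force of the argument in the text comes from the pattern persisting as the SNP count grows into the thousands.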
It is doubtful whether this finding, or any one finding, will settle the debate over population differences in IQ. In any case, I wish to resolve a less contentious question: Is Rushton's position in this debate really supported by the cold winter theory? For the sake of argument, let us assume that mean IQ is higher where, in past generations, the environment has imposed higher cognitive demands. To what degree can this legacy be explained by the challenges of Paleolithic winters? To what degree by later challenges?

The Problem of Recent Selection
Humans did not stop evolving with the end of the last ice age. In fact, their genetic evolution accelerated, particularly with the transition from hunting and gathering to farming [52]. This finding by John Hawks and others came five years before Rushton's death, perhaps too late to influence his thinking. He nonetheless had other reasons for downplaying post-Paleolithic evolution:
• He was already ridiculed for thinking that human evolution continued as late as the last ice age. It was widely thought in academia that humans had essentially stopped evolving much earlier, long before they spread from Africa to the other continents. This was the general thinking not only among Rushton's fierce opponents but also among his friendly critics, particularly those in evolutionary psychology. As someone who believed the contrary, he turned to the cold winter theory for support, but that theory would go only so far in broadening the time frame of human evolution and extending the environment of evolutionary adaptedness beyond the savannah of Africa.
• He wished to keep his argument simple, probably fearing that any exceptions, inconsistencies, or secondary explanations would be turned against him.

• He also preferred simplicity for aesthetic reasons. If an explanation could not fit into a single unified theory, he would set it aside. We see this in his tendency to unify the cold winter theory with other ideas. Previously, he held a progressionist view of human evolution, seeing "younger" races as more advanced than "older" races. This earlier thinking then merged in his mind with the cold winter theory. Because Northeast Asians are "younger" than Europeans, they must have been more innovative under glacial conditions. He thus wrote:
"Survival in Northeast Asia about 40,000 years ago also required warm clothing. Archeologists have found needles, cave paintings of parkas, and grave ornaments marking the outlines of shirts and trousers. We know that warm furs were worn. Fox and wolf skeletons missing their paws tell us that these animals were skinned to make fur clothes. Houses were dug into the ground to provide insulation. These large dwellings were marked by post holes and had walls made from mammoth bones. Fireplaces and stone lamps were used to light the long Arctic winter night" [75] (p. 87).
The above description is just as true for Europeans during the same time period. Indeed, more so. Europe has many Paleolithic sites with cave paintings, but Northeast Asia has only one, and that site has no depictions of human figures [76].

Recent Selection in East Asia
Here we come to an IQ difference that the cold winter theory cannot explain. Mean IQs are indeed higher in East Asians than in Europeans, but this cognitive advantage is confined to Chinese, Koreans, and Japanese. It is absent from the indigenous peoples of northern Asia (Altai, Evenks, Mongols, Yakuts) [77-80]. Mean IQ must therefore have increased in East Asians after the last ice age, probably during the time of recorded history. This increase is attributed by Ron Unz to a scarcity of women in the lower classes: "[...] only the wealthier families of a Chinese village could afford the costs associated with obtaining wives for their sons, with female infanticide and other factors regularly ensuring up to a 15 percent shortfall in the number of available women. Thus, the poorest village strata usually failed to reproduce at all, while poverty and malnourishment also tended to lower fertility and raise infant mortality as one moved downward along the economic gradient. At the same time, the wealthiest villagers sometimes could afford multiple wives or concubines and regularly produced much larger numbers of surviving offspring. Each generation, the poorest disappeared, the less affluent failed to replenish their numbers, and all those lower rungs on the economic ladder were filled by the downwardly mobile children of the fecund wealthy" [81] (pp. 22-23). Unz sees this wife scarcity as the main factor that drove the increase in mean IQ. There was also the imperial examination: "in China the proud family traditions would boast generations of top-scoring test-takers, along with the important government positions that they had received as a result" [81] (p. 19). This may have been a very secondary factor, since less than one percent of the population reached the top rank of chin-shih or the lesser rank of chu-jen [78].
Nonetheless, success at lower district or provincial levels also brought benefits, if only prestige, and the beneficiaries were much more numerous [82] (see also supplementary file B).
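The demographic mechanism described above, in which a heritable trait correlates with reproductive success, can be illustrated with a toy simulation. This is not Unz's model; it is a generic quantitative-genetics sketch using the breeder's equation, and the heritability and fertility gradient below are assumptions chosen only to show how modest per-generation selection compounds over historical time.

```python
# Toy illustration: if fertility rises with a heritable trait, the
# population mean drifts upward across generations. Parameters
# (heritability, fertility gradient) are assumptions for illustration.
import random

random.seed(42)

H2 = 0.4          # assumed narrow-sense heritability of the trait
N = 10_000        # individuals sampled per generation
GENERATIONS = 30  # roughly 750 years at ~25 years per generation

mean = 0.0  # trait mean, in SD units of the starting population
for _ in range(GENERATIONS):
    traits = [random.gauss(mean, 1.0) for _ in range(N)]
    # Assumed fertility gradient: an individual 1 SD above the mean
    # leaves ~10% more surviving offspring than one at the mean.
    weights = [max(0.0, 1.0 + 0.1 * (t - mean)) for t in traits]
    parent_mean = sum(t * w for t, w in zip(traits, weights)) / sum(weights)
    # Breeder's equation: response = heritability x selection differential
    mean += H2 * (parent_mean - mean)

print(f"Cumulative shift after {GENERATIONS} generations: {mean:.2f} SD")
```

Under these assumed parameters the selection differential is about 0.1 SD per generation, giving a response of roughly 0.04 SD per generation and a cumulative shift on the order of one standard deviation over 30 generations, which shows why selection over recorded history is long enough to matter.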

Recent Selection in Europe
Mean IQ seems to have likewise risen in Europe during historical times. This is the conclusion of several studies on Europeans as a whole or on certain European groups:
Europeans. A team led by Michael A. Woodley of Menie examined data from ancient DNA retrieved at sites in Europe and central Asia, specifically the frequency of alleles associated with high educational attainment. They found that these alleles gradually increased in frequency over the time frame of the DNA samples, i.e., 4560 to 1210 years ago. A more definitive increase was found when the ancient genomes were compared with modern ones from the 1000 Genomes Project [83].
English. Gregory Clark used demographic data to chart the growth of the English middle class. He found that it grew steadily from the twelfth century onward, its descendants not only growing in number but also replacing the lower classes through downward mobility. By the 1800s its lineages accounted for most of the English population. Parallel to this demographic growth, English society became more and more middle class in its values: "Thrift, prudence, negotiation, and hard work were becoming values for communities that previously had been spendthrift, impulsive, violent, and leisure loving" [84] (p. 166). Using a variant of the cold winter theory, Woodley of Menie attributes this change to the unusually cold weather of the sixteenth and seventeenth centuries, i.e., the Little Ice Age [85]. Clark ascribes it to the consolidation of the State during the High Middle Ages, particularly the State's monopoly on violence, and the ensuing development of settled communities [84].
Western Europeans. On the basis of soft data, Georg W. Oesterdiekhoff argued that mean IQ steadily rose throughout Western Europe during late medieval and early modern times. Previously, most people failed to develop beyond the stage of preoperational thinking. They could learn language and social norms, but their ability to reason was hindered by cognitive egocentrism, anthropomorphism, finalism, and animism. From the sixteenth century onward, more and more people reached the stage of operational thinking. They could better understand probability, cause and effect, and another person's perspective, whether real or hypothetical [86,87] (pp. 49, 86-87). As this "smart fraction" grew in size, it eventually reached a threshold where intellectuals were no longer isolated individuals but rather communities of people who could exchange ideas. This was one of the hallmarks of the Enlightenment: intellectuals were sufficiently numerous to meet in clubs, salons, coffeehouses, and debating societies.
Ashkenazim. Gregory Cochran, Jason Hardy, and Henry Harpending presented evidence that the mean IQ of Ashkenazi Jews exceeds not only that of non-Jewish Europeans but also that of other Jewish groups. This cognitive advantage seems to be relatively recent, originating probably in the Middle Ages. The most striking piece of evidence is the high incidence among Ashkenazim of four genetic disorders: Tay-Sachs, Gaucher, Niemann-Pick, and mucolipidosis type IV (MLIV). All four affect the capacity to store sphingolipid compounds that promote the growth and branching of axons in the brain. These disorders are caused by alleles that are harmful in the homozygous state but beneficial in the much more common heterozygous state, i.e., the brain receives higher levels of sphingolipids without the adverse health effects [88,89].
On a related note, a team led by Curtis Dunkel examined genetic data from the Wisconsin Longitudinal Study and found that Ashkenazi Jews have a higher percentage of alleles associated with high educational attainment [90]. This finding has been corroborated by Davide Piffer in his latest study [74].

Parting Thoughts
Taken together, these findings may settle a question from the introduction to this paper. If mean IQ increased in Europe during late medieval and early modern times, earlier observers like Sa'id al-Andalusi would have been right in thinking intelligence to be highest in the "middle zone" between the cold north and the hot south, i.e., the Mediterranean, the Middle East, India, and China. This was where cognitive demands were highest and selection for cognitive ability strongest. This was also where hunting and gathering had first given way to farming and an increasingly complex way of life. That new cultural environment, however, was most ably exploited by incoming groups of northern origin who already possessed not only relevant technologies but also the right mental toolkit. Admittedly, they were not necessarily the ones who started the transition to farming. In southern China, others took the first steps toward sedentism and population growth [46]. In the Middle East, cereals were first cultivated by humans who were anatomically equidistant between sub-Saharan Africans and present-day Eurasians [48,49]. It seems that groups of northern origin often arrived after the fact, being better at exploiting opportunities that others had created.
For a long time thereafter, the middle latitudes were conducive to the evolution of intelligence. Conditions were favorable not only for farming, sedentism, and population growth but also for increasing social complexity, i.e., State formation, codification of laws, advent of literacy and numeracy, differentiation of language (formal vs. casual speech, prose, poetry, humor, rhetoric, legal writing), expansion of the built environment (villages, towns, cities), growth of music, literature, and the fine arts, development of religious beliefs and practices, construction of roads, canals, harbors, forts, and other infrastructures, and so on. The resulting cognitive demands further increased selection for cognitive ability, thus setting in motion a process of gene-culture coevolution that pushed mean IQs higher and higher in the middle zone.
If we were living a millennium ago, like Sa'id al-Andalusi, we would probably not be debating the cold winter theory. Everyone would agree that intelligence was highest where the climate is neither too hot nor too cold. The last millennium, however, has seen the geographical center of human development move north into Europe, particularly northwest Europe. The reasons cannot be fully discussed here. In the Middle East and elsewhere, farming was suffering from the cumulative effects of erosion, salinization, and overgrazing. There were also ideological constraints. Both Christianity and Islam sought to limit free expression of thought, but Christianity ultimately failed in this effort. Finally, and most importantly, a true market economy did not develop in the middle zone. The concept of trade was widely understood, but most goods and services were still produced within one's household for one's household, i.e., by family members and servants, or within one's network of relatives and in-laws. The market simply could not replace kinship as the main organizing principle of society.
The end of the first millennium thus marks the limit of what the middle zone could achieve with its mode of production. In northwest Europe, where kinship was weaker and individualism stronger, the market could go much farther in reorganizing economic and social relations, thereby creating a new environment that would favor people who had not only more cognitive ability but also a higher degree of future orientation, less willingness to settle personal disputes with violence, and a better ability to process numerical and textual data: in short, a middle-class mindset [84,91].
In conclusion, there is no unified evolutionary theory of human intelligence, other than the general theory of evolution by natural selection. Undoubtedly, a key stage was adaptation to cold winters during the Paleolithic. Yet much would happen later, particularly in Europe and East Asia.