David Reich – Why the Bronze Age was an inflection point in human evolution
For decades, the scientific consensus held that natural selection had been largely quiescent in humans over the past several hundred thousand years. Ancient DNA was revolutionizing our understanding of migration and ancestry, but revealing almost nothing about biological adaptation. Now, analyzing over 16,000 ancient genomes from Europe and the Middle East, David Reich's lab has uncovered something startling: not only has selection been widespread and continuous, but it intensified dramatically during the Bronze Age, reshaping immune function, metabolism, and cognitive traits. The very period when humans invented writing and built the first cities may have exerted the strongest evolutionary pressures in recent human history—stronger even than the transition to agriculture itself.
Key Points
Natural selection has been far more active in recent human history than previously believed—hundreds of genetic variants have been under strong directional selection over the past 18,000 years in Europe and the Middle East, contradicting the long-held view that selection has been quiescent.
The Bronze Age (roughly 5,000–2,000 years ago) represents a biological inflection point: selection on immune and metabolic traits intensified during this period more than during the initial transition to agriculture, likely driven by urbanization, animal domestication, and infectious disease.
Cognitive traits show strong evidence of selection over the past 10,000 years, with the genetic predictor of intelligence increasing by approximately one standard deviation—but this selection peaked during the Bronze Age and has been undetectable over the past 2,000 years.
Migration and drift account for 98% of allele frequency change, dwarfing the 2% driven by selection; yet that 2% has reshaped the genome: immune traits show four- to five-fold enrichment in selection signals, while behavioral traits show weaker but real signals masked by their polygenic architecture.
The genetic toolkit for farming, symbolic behavior, and complex cognition was likely in place at least 50,000—and possibly 300,000—years ago, yet agriculture arose independently in multiple regions only after 12,000 years ago, suggesting climate stability, not genetic change, was the limiting factor.
In Brief
The Bronze Age was not just a cultural revolution but a biological inflection point: selection for traits affecting immunity, metabolism, and cognition intensified dramatically between 5,000 and 2,000 years ago, suggesting that the shift to high-density urban life imposed stronger adaptive pressures on the human genome than the Agricultural Revolution itself.
The Bronze Age Inflection Point
Selection on immune and metabolic traits intensified dramatically 5,000–2,000 years ago.
The Bronze Age was not merely a technological shift but a biological crucible. Reich's analysis reveals that selection on immune traits—already the most selected category across the 18,000-year study period—reached peak intensity between 5,000 and 2,000 years ago, well after the initial Agricultural Revolution. Metabolic traits followed the same pattern. A tuberculosis risk variant (TYK2) rocketed upward in frequency before 3,000 years ago, then reversed sharply as TB became endemic. Multiple sclerosis and hemochromatosis variants show similar inflections.
This timing is counterintuitive. The textbook narrative places the great adaptive watershed at the onset of farming 10,000–12,000 years ago, when humans transitioned from mobile foraging to sedentary crop cultivation. But the genetic data suggest otherwise: the genome's selective response was far stronger to the urbanism, animal proximity, and population density of the Bronze Age than to the initial domestication of plants. Humans were "wrenched into a way of living that was so different from how their hunter-gatherer ancestors lived that the organism had to adapt very strongly," Reich observes, and that wrenching appears to have peaked not at farming's dawn but millennia later.
Why Previous Methods Failed to Detect Selection
Migrations cause 98% of allele frequency change, drowning out selection's 2% signal.
Selection on Cognitive and Behavioral Traits
Genetic predictors of intelligence rose a full standard deviation, peaking in the Bronze Age.
The genetic variants that today predict performance on IQ tests and years of schooling have increased systematically in frequency over the past 10,000 years in Europe and the Middle East, by approximately one standard deviation on the scale of modern variation. This is not a small effect: one standard deviation separates the 50th percentile from the 85th. Nor was the selection uniform. When Reich's team slid a 2,000-year window through the data, they found peak selection intensity between 5,000 and 2,000 years ago. After 2,000 years ago, there is essentially no detectable selection on these traits.
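The sliding-window scan described here can be sketched on simulated data. This is a toy reconstruction under stated assumptions, not the lab's actual analysis: the scores, sample ages, and the shape of the trend (flat before 5,000 years ago, rising between 5,000 and 2,000 years ago, flat after) are invented to mirror the pattern described in the text.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data: each "ancient individual" has an age (years before present)
# and a polygenic score on the modern scale. All values are simulated.
n = 3000
ages = rng.uniform(0, 10_000, n)

# Simulate the pattern in the text: scores flat before ~5,000 BP, rising
# between 5,000 and 2,000 BP, flat again after 2,000 BP.
trend = np.clip((5_000 - ages) / 3_000, 0, 1)
scores = trend + rng.normal(0, 0.5, n)

def window_slope(ages, scores, center, width=2_000):
    """OLS slope of score on time (per millennium) within one window."""
    mask = np.abs(ages - center) <= width / 2
    if mask.sum() < 50:          # too few samples to estimate a slope
        return np.nan
    t = -ages[mask] / 1_000      # time flowing toward the present
    return np.polyfit(t, scores[mask], 1)[0]

# Drag the 2,000-year window across the full time range.
centers = np.arange(1_000, 9_001, 500)
slopes = [window_slope(ages, scores, c) for c in centers]
peak = centers[int(np.nanargmax(slopes))]
print(f"peak change in window centered ~{peak} years BP")
```

Where the slope is largest marks the window of strongest directional change; in this simulation it falls inside the 5,000–2,000 year interval by construction.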
This result is puzzling on multiple levels. First, why would intelligence not have been maximized long ago? Reich speculates that what modern GWAS identify as "intelligence" may actually capture a broader syndrome (executive function, delayed gratification, planning) that happens to correlate with schooling, BMI, walking pace, and even age at first childbirth. Perhaps different adaptive optima favor different points on this spectrum at different times. Second, why the Bronze Age peak? One possibility: early states, literacy, trade networks, and specialized labor created new cognitive niches. Another: what is being selected is not raw problem-solving ability but a package of traits (compliance, future orientation) adaptive in stratified, agricultural societies but irrelevant or even maladaptive in small-scale foraging bands.
What Changed 12,000 Years Ago?
Farming arose independently worldwide after the Ice Age despite no new mutations.
“Farming was invented for the first time anywhere in the world in the Middle East 11,000 or 12,000 years ago. The people who invented farming exploded into Europe after 8,500 years ago, spread across the continent, and expanded rapidly. In the Bronze Age, there was an intensification of how people lived, with much higher population densities. People were living more and more next to their animals and getting their diseases, and exchanging their diseases with the animals and with each other. This is a period of rapid change in how people are living, resulting in different biological needs of this population. It's not surprising, perhaps, that in the context of these dramatic changes, the biology of the population might not be ideally adapted.”
Key Genetic Findings Across Traits
7,200 loci show selection; immune and metabolic traits dominate.
Why Hunter-Gatherers Score Low on Modern Cognitive Predictors
Ancient foragers score three standard deviations below the modern mean on IQ polygenic scores.
European hunter-gatherers from 10,000+ years ago have a genetic profile that, if transplanted to today, would predict IQ scores roughly three standard deviations below the modern mean. This does not mean they were "less intelligent" in any meaningful sense: survival in Paleolithic Europe demanded immense skill, knowledge, and adaptability. Rather, it suggests that the genetic architecture underlying performance on modern IQ tests was simply not optimized in foraging populations. The traits favored then (perhaps spatial reasoning, social cognition, improvisational problem-solving) may be orthogonal to, or even trade off against, the traits (compliance, patience, verbal abstraction) rewarded in agricultural and industrial societies.
The Methodological Breakthrough: Genetic Relatedness Matrices
New technique predicts genotypes from relatedness, isolating selection from drift and migration.
Build a relatedness matrix: Compute how closely each of 22,000 individuals (16,000 ancient, 6,000 modern) is related to every other individual across the genome.
Predict genotypes from relatedness alone: For each of 10 million variable sites, predict an individual's genotype based solely on their pattern of relatedness to others. This captures drift, migration, and population structure.
Test whether selection improves predictions: Ask whether adding a constant selection coefficient, assuming the allele has moved consistently upward or downward over time, improves the model fit beyond what relatedness alone provides.
Calibrate confidence using GWAS: Cross-reference selection signals with UK Biobank genome-wide association studies. As the selection statistic rises, enrichment for known trait-associated variants increases, plateauing at ~5 standard deviations. This plateau indicates that ~100% of signals above that threshold are real.
Partition by time and space: Divide the dataset into an "archipelago" of small demes (e.g., Britain 4,000–3,500 years ago) where ancestry is stable. Within each deme, test whether alleles move consistently in the same direction: that is selection, not drift.
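The core model comparison above can be sketched on toy data. This is a minimal illustration, not the lab's actual pipeline: it simulates a kinship matrix and a single variant, uses the leading eigenvectors of the kinship matrix as structure covariates in place of a full relatedness-based predictor, and compares a relatedness-only fit against one that also allows a constant time trend (the selection term). All names, scales, and parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: n individuals with sample ages (years before present) and a
# genotype (0/1/2) at one simulated variant.
n = 200
ages = rng.uniform(0, 10_000, n)

# Crude stand-in for a genome-wide kinship matrix: two demes, with higher
# relatedness within a deme than between demes.
deme = (rng.random(n) < 0.5).astype(float)
K = 0.5 * np.outer(deme, deme) + 0.5 * np.outer(1 - deme, 1 - deme)
np.fill_diagonal(K, 1.0)

# Simulate an allele whose frequency differs by deme (structure) and
# rises toward the present (selection).
freq = np.clip(0.2 + 0.3 * deme + 0.3 * (1 - ages / 10_000), 0, 1)
geno = rng.binomial(2, freq).astype(float)

def rss(X, y):
    """Residual sum of squares of an ordinary least-squares fit."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(np.sum((y - X @ beta) ** 2))

# Model 1: predict the genotype from relatedness alone. Here the top
# eigenvectors of K stand in for the relatedness-based prediction.
_, vecs = np.linalg.eigh(K)
X0 = np.column_stack([np.ones(n), vecs[:, -5:]])
rss0 = rss(X0, geno)

# Model 2: additionally allow a constant per-year trend, i.e. the allele
# moving consistently in one direction over time (the selection term).
X1 = np.column_stack([X0, -ages])
rss1 = rss(X1, geno)

# Likelihood-ratio-style statistic: a large value means the time trend
# explains variation that relatedness alone cannot.
stat = n * np.log(rss0 / rss1)
print(f"selection statistic: {stat:.1f}")
```

In the real analysis a comparison of this kind is run at each of the 10 million sites, and the resulting statistic is calibrated against GWAS enrichment as the step list describes.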
A Speculative Model: Neanderthals as Culturally Modern
Neanderthals may share Middle Stone Age culture and recent common ancestry with modern humans.
In an unscripted whiteboard session after the recording, Reich sketched a radical reinterpretation of archaic human relationships. The standard model holds that Neanderthals and Denisovans are sister lineages, both splitting from modern humans 700,000–800,000 years ago. But this creates puzzles: Neanderthals share mitochondrial DNA and Y chromosomes with modern humans (common ancestor ~300,000–450,000 years ago), not with Denisovans. They also share Middle Paleolithic (Levallois) stone tool technology with modern humans and early anatomically modern Africans, but not with Denisovans in East Asia.
Reich proposes that a population in the Caucasus or Northeast Africa invented Levallois technology and Middle Stone Age culture around 300,000–400,000 years ago. This group expanded in two directions. Moving into Europe, they mixed with local archaics (proto-Neanderthals), and because the groups were only ~400,000 years diverged, gene flow was relatively permissive. The pioneers were swamped by local archaic DNA (ending up 95% archaic, 5% modern) but retained their cultural toolkit and, crucially, their mitochondrial and Y chromosome lineages, perhaps through matrilineal or patrilineal social structures. This would produce "Neanderthals": genetically mostly archaic, but culturally, and in some genetic elements, modern.
Simultaneously, the same population expanded into Africa, mixing with a deeply diverged archaic African lineage (~1.5 million years separation). Here, greater genetic incompatibility limited gene flow to ~20%, producing the ancestors of all living humans. If true, this model unifies the Middle Stone Age Revolution in Africa and the Middle Paleolithic in Europe as a single event, explains the mitochondrial and Y chromosome anomalies, and repositions Neanderthals as cousins rather than distant relatives. Reich stresses this is speculative ("probably wrong") but argues it is more parsimonious than the current model, which has accreted epicycles to accommodate contradictory data.
People
Glossary
Disclaimer: This is an AI-generated summary of a YouTube video for educational and reference purposes. It does not constitute investment, financial, or legal advice. Always verify information against the original sources before making decisions. TubeReads is not affiliated with the content creator.