To the extent that evolutionary biologist and sociobiologist Robert Trivers has been in the news over the last decade, it has been for his entanglement with and highly questionable defense of Jeffrey Epstein. But as Lionel Page reminds us, decades before that– well before he could have known the execrable “financier”– Trivers made hugely important contributions to his field…
Trivers was one of the most—perhaps the most—influential evolutionary biologists of the 20th century. His work should be much more widely known in social and behavioural sciences, in particular in economics, as Trivers’ intellectual approach is very much in line with a game theoretic understanding of social interactions.
It is hard to overstate the importance of his work. Einstein famously published four groundbreaking papers in 1905, a year often referred to as his “Annus mirabilis”, during which he revolutionised physics. Trivers might be said to have had a “Quinquennium Mirabile” for the five years between 1971 and 1976, during which he produced a series of ideas that revolutionised evolutionary biology…
[Page unpacks four of those contributions: Reciprocal Altruism, Parental Investment, Parent-Offspring Conflict, and Self-Deception, each fascinating…]
… Trivers has been one of the most influential evolutionary biologists, and his papers are still worth reading today. His insights, published more than 50 years ago, are fascinating. They often align very well with economic theories of behaviour, and it is therefore regrettable that his ideas are not better known in economics, and in particular in behavioural economics.
A key feature of Trivers’ take across these contributions was to see that beneath the world of social interactions we observe, there are deep structures in terms of incentives that shape the game we play. Understanding these games and their structures helps us make sense of the seemingly endless complexity of human psychology and social dynamics. In several key contributions, Trivers helped lift the veil on the underlying logic of human behaviour…
For more on Trivers and the controversies in his life (Epstein, but also the Black Panthers and a Rutgers set-to), all of which followed the burst of productivity described above, see here.
* Bertolt Brecht (through the mouth of Galileo, in The Life of Galileo)
###
As we linger over legacies, we might send material birthday greetings to a man who helped lay the groundwork for the field to which Trivers contributed, Ludwig Büchner; he was born on this date in 1824. A philosopher, physiologist, and physician, he became one of the leading exponents of 19th-century scientific materialism. Büchner was an early champion of Darwin’s theory of evolution, endorsing it within a decade of its first issuance, then did much to spread it by citing and building on it in his own books.
As far as we know, Büchner’s life was free of the scandal and conflict that plagued Trivers. He taught at the University of Tübingen and published dozens of books and papers. Later in his life he founded the “German Freethinkers League” (“Deutscher Freidenkerbund”) and served as a member of the second chamber of the Landstände of the Grand Duchy of Hesse as a representative of the German Free-minded Party from 1884 to 1890. He was the younger brother of Georg Büchner, a famous revolutionary playwright, and Luise Büchner, a women’s rights advocate; and he was the uncle of Ernst Büchner, inventor of the Büchner flask.
Mark Twain (the author of the observation above) was more correct than he may have understood. Alex Wakeman explains that, while most other plants have a single “most useful” element, wild cabbage has many. This makes it perfect for breeding….
Every crop we consume came from a wild ancestor. Through breeding, people selected for bigger grains, juicier fruit, more branches, or shorter stems – gradually turning wild plants into improved yet recognizable versions of their originals. The rare exception is Brassica oleracea, wild cabbage: the origin of cabbage, bok choy, collard greens, broccoli, Brussels sprouts, cauliflower, and much else.
Wild cabbage is unassuming: some untidy leaves and a few thick, coarse stems on the browner side of purple that poke out from the soil. Nothing about it looks appetizing.
Wild cabbage (Brassica oleracea) growing in Northumberland. Source
Nevertheless, many cultures have recognized something special in this plant. By selecting plants with denser layers of leaves, ancient people created modern cabbage and kale. Others bred for the inflorescence, a dense bundle of small flowers that forms the head of cauliflower and broccoli. By favoring large, edible buds, thirteenth-century farmers living around modern-day Belgium created Brussels sprouts. Under different selection pressures, Brassica oleracea has become German kohlrabi, or Chinese gai lan, or East African collard greens.
This level of morphological diversity is unusual. Modern tomatoes, for example, vary in size, shape, and color, but are all recognizably tomatoes. Since the 1920s, scientists have worked to understand how Brassica oleracea was domesticated and to deepen our knowledge of evolution and artificial selection.
By combining modern genetics, genomics, and molecular biology with linguistic, historical, and sociological sources, researchers are now beginning to develop conclusive answers…
A man takes a train from London to the coast. He’s visiting a town called Wulfleet. It’s small and old, the kind of place with a pub that’s been pouring pints since the Battle of Bosworth Field. He’s going to write about it for his blog. He’s excited.
He arrives, he checks in. He walks to the cute B&B he’d picked out online. And he writes it all up like any good travel blogger would: in that breezy LiveJournal style from 25 years ago, perhaps, in his case, trying a little too hard.
But as his post goes on, his language gets older. A hundred years older with each jump. The spelling changes. The grammar changes. Words you know are replaced by unfamiliar words, and his attitude gets older too, as the blogger’s voice is replaced by that of a Georgian diarist, an Elizabethan pamphleteer, a medieval chronicler.
By the middle of his post, he’s writing in what might as well be a foreign language.
But it’s not a foreign language. It’s all English.
None of the story is real: not the blogger, not the town. But the language is real, or at least realistic. I constructed the passages myself, working from what we know about how English was written in each period.
It’s a thousand years of the English language, compressed into a single blog post.
Read it and notice where you start to struggle. Notice where you give up entirely. Then meet me on the other side and I’ll tell you what happened to the language (and the blogger)…
As we travel through time, we might note that not every new emergence becomes sedimented into the evolutionary path, as we recall that on this date in 1980 the National Academy of Recording Arts and Sciences awarded the first– and last– Grammy for Best Disco Recording. By the time that the Academy got around to it, disco was pretty much dead.
“I Will Survive” by Gloria Gaynor was the big winner that night. The other nominees were: Earth, Wind & Fire for “Boogie Wonderland,” Michael Jackson for “Don’t Stop ’Til You Get Enough,” Donna Summer for “Bad Girls,” and Rod Stewart for “Do Ya Think I’m Sexy?” (In January 2020, Gaynor won the second Grammy Award of her career, for her gospel album Testimony.)
The estimable Brad DeLong ponders the Discovery Channel series Naked and Afraid, concluding that “the Scarecrow in The Wizard of Oz had a greatly exaggerated view of what he would have been able to do if he only had a brain”…
There is a schlock TV show, on the Discovery Channel, called “Naked & Afraid”.
In it, two humans are dropped into a wilderness somewhere, naked, with one and only one piece of technology each (usually something like a knife, a fire starter, or a fishing line). All around them are other mammals doing their mammal thing: living their lives, reproducing their populations, evolving to fit whatever niche they have found where they are. But the two humans dropped by themselves (well, they are surrounded by cameramen, sound technicians, drivers, logistical support, and such who do not help and who stay out of the field of view) do not. Instead, the humans proceed, not too slowly, to start starving to death.
I am not being figurative or metaphorical…
[DeLong details the altogether dire deterioration and resulting ailments of two recent “contestants”…]
… Perhaps you just shrug your shoulders and say: “humans are relatively inept”… The other mammals out in the Amazon have been equipped by Darwin’s Daemon with teeth, claws, instincts, and brains that allow them to get into daily caloric balance. We don’t have much in the way of teeth and claws. We do have opposable thumbs. We do have big brains. They are supposed to compensate. But perhaps you shrug your shoulders and say: “they do not compensate very well”. For, out in the wilderness, Melissa Miller’s brain and thumbs failed at the one job for which Darwin’s Daemon gave them to us, for which other mammals’ teeth, claws, instincts, sprinting speed, dodging quickness, and much smaller and thus less energetically expensive brains largely suffice.
The rule: a smart, knowledgeable human (or two) in the wilderness naked should be afraid: they are highly likely to start starving to death.
And yet: Somehow we are here. We have not all yet been eaten. We have been evolved. Our ancestors survived, and reproduced.
Our ancestors started to come down from the trees about seven million years ago. That was when we left the ancestors of our chimpanzee cousins still up in the forest canopy.
By five million years ago, the ardipitheci were walking upright when they had to, with much smaller and less sexually-dimorphic canines, but as of then with no signs of fire or stone‑tool use or indeed of semi-systematic butchery. Their brain cases were only 350cc, only 350 cubic centimeters. By 3.5 million years ago, the australopitheci afarenses were habitually walking on two legs with their 450cc brain-cases. By 2.5 million years ago, the homines habiles with their Oldowan stone toolkit and 650cc brain-cases were around. And paleontologists judge they deserve our genus name: homo. By 1.8 million years ago, there were the homines erecti spreading out across the world, with their Acheulean handaxes, their endurance walking/running, and their 950cc brain-cases. When we look back 600,000 years ago, the world was then populated by the likes of the homines heidelbergenses: widely-controlled fire; complex hunting with tools like spears.
These people were not yet us: Their brain-cases were only 3/4 of the size of our brain-cases of 1350cc. They did not have organized big‑game hunting with spears, complex prepared‑core toolmaking techniques, long‑distance mobility, or evidence of our sustained and cumulative symbolic culture—cave art and engravings, personal ornaments, ritual burials, complex language‑supported planning, long‑distance exchange networks, composite tools made with adhesives, tailored clothing, or shelters. They did not have the final brain expansion, the globular skull, the reduced brow, or the chin.
And between 300,000 and 200,000 years ago there emerged people we definitely call us: homines sapientes, albeit “archaic”, with our brain-case size of 1350cc, but without the fully globular skull, the reduced brow, or the chin.
From a chimpanzee-sized brain one-quarter the size of ours five million years ago to our current state, our ancestors and then we have been evolved. And now we are here. So how can there have been so much selection pressure for larger brains when, even today, out in the wilderness they are insufficient to keep us, as naked individuals, from being hungry and afraid?
You know where I am going here. The answer, of course, is simple: What is smart—what the brain is good for—is not each of our brains, but all of our brains thinking together. And the tools that we, and those who came before us, have made—tools that no one individual could make in a lifetime, and that embody all of that thinking-together. Melissa Miller is an expert on knives, how to use them, and what to use them for. She could not make one from scratch.
From long-ago Acheulean handaxes to contemporary hunger in the Amazon, the throughline is simple: selection favored group knowledge and group production through a specialized division of labor, not solo genius. Our edge was not, and is not, claws or speed; nor was it the ability to think up clever solutions to problems on the fly. Instead, it was pooled memory and anthology thinking-power, plus the division of labor that allows us to carve tools that contain the results of that collective thinking-power…
As we band together, we might recall that it was on this date in 1926 that Winnie-the-Pooh was first published…
The origin of the name of the bear that was stuffed with fluff began years before the book Winnie-the-Pooh by A.A. Milne [here] was published on this day in 1926. During the First World War, Canadian Lieutenant Harry Colebourn bought a bear cub and named her “Winnie” after his adopted hometown of Winnipeg, Manitoba. She was later brought to the London Zoo, where Milne’s son, Christopher Robin, would visit her. [See also here.]
Christopher re-named his own teddy bear, Edward Bear, to Winnie-the-Pooh. His father named the characters in his book after Christopher’s stuffed animals, including Piglet, Eeyore, Kanga, Roo, and Tigger. (Mr. Milne added Owl and Rabbit.)
In 1961, Walt Disney Productions bought the rights to the stories and went on to create a series of cartoon shorts, beginning with Winnie the Pooh and the Honey Tree, which debuted in 1966. Disney’s most recent full-length animated feature, simply titled Winnie the Pooh, came to theaters in 2011. Goodbye Christopher Robin, Fox Searchlight’s live-action movie about the inspiration for the stories, arrived in theaters in October of 2017; Disney followed up in 2018 with a live-action/CGI film about an adult Christopher Robin returning to the 100 Acre Wood.
A new mathematical model showed that evolutionary bursts led to the emergence of almost all characteristic cephalopod traits such as tentacles.
Indeed. Scientists have accepted this precept since Charles Darwin‘s publication of On the Origin of Species. But how– and at what pace– does that adaptation happen? From those earliest days, the assumption was that change/adaptation happened slowly, roughly evenly– gradually– over time.
But in 1972, paleontologists Niles Eldredge and Stephen Jay Gould published a landmark paper developing a theory they called punctuated equilibria. Their paper built upon Ernst Mayr‘s model of geographic speciation, I. M. Lerner‘s theories of developmental and genetic homeostasis, and their own empirical research. Eldredge and Gould proposed that the degree of gradualism commonly attributed to Darwin is virtually nonexistent in the fossil record, and that stasis dominates the history of most fossil species. Rather, they argued, when significant evolutionary change occurs, it is generally restricted to rare and geologically rapid events of branching speciation called cladogenesis (the process by which a species splits into two distinct species, rather than one species gradually transforming into another).
Jake Buhler reports on recent work that confirms the punctuated equilibrium theory and adds more detail…
Over the last half-billion years, squid, octopuses and their kin have evolved much like a fireworks display, with long, anticipatory pauses interspersed with intense, explosive changes. The many-armed diversity of cephalopods is the result of the evolutionary rubber hitting the road right after lineages split into new species, and precious little of their evolution has been the slow accumulation of gradual change.
They aren’t alone. Sudden accelerations spring from the crooks of branches in evolutionary trees, across many scales of life — seemingly wherever there’s a branching system of inherited modifications — in a dynamic not examined in traditional evolutionary models.
That’s the perspective emerging from a new mathematical framework published in Proceedings of the Royal Society B that describes the pace of evolutionary change. The new model, part of a roughly 50-year-long reimagining of evolution’s tempo, is rooted in the concept of punctuated equilibrium, which was introduced by the paleontologists Niles Eldredge and Stephen Jay Gould in 1972.
“Species would just sit still in the fossil record for millions of years, and then all of a sudden — bang! — they would turn into something else,” explained Mark Pagel, an evolutionary biologist at the University of Reading in the United Kingdom.
Punctuated equilibrium was initially a controversial proposal. The theory diverged from the dominant, century-long view that evolution adhered to a slow, steady pace of Darwinian gradualism, in which species incrementally and almost imperceptibly developed into new ones. It opened the confounding possibility that there was a discontinuity between the selection processes behind the microevolutionary changes that occur within a population and those driving the long-term, broad-scale changes that take place higher than the species level, known as macroevolution.
In the decades since, researchers have continued to debate these views as they’ve gathered more data: Paleontologists have accumulated fossil datasets tracing macroevolutionary changes in ancient lineages, while molecular biologists have reconstructed microevolution on a more compressed timescale — in DNA and the proteins they encode.
Now there are enough datasets to more fully test the theories of evolutionary change. Recently, a team of scientists blended insights from several evolutionary models with new methods to build a mathematical framework that better captures real evolutionary processes. When the team applied their tools to a selection of evolutionary datasets (including their own data from research into an ancient protein family), they found that evolutionary spikes weren’t just common, but somewhat predictably clustered at the forks in the evolutionary tree.
Their model showed that proteins contort themselves into new iterations more rapidly around the time they diverge from each other. Human languages twist and recast themselves at the bifurcations in their own family tree. Cephalopods’ soft bodies sprout arms and bloom with suckers at these same splits.
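The signature at stake here can be illustrated with a quick back-of-the-envelope simulation (a toy sketch of my own, not the authors’ actual framework): under pure gradualism, how much a lineage diverges depends only on elapsed time; under punctuation, divergence also tracks how many speciation events the lineage has passed through. Holding time fixed and varying only the number of splits makes the difference visible.

```python
import random

random.seed(42)

def divergence(total_steps, n_splits, drift_sd, jump_sd):
    """Net trait change along one root-to-tip path: gradual drift
    accumulates every time step, plus a burst at each speciation event."""
    trait = 0.0
    for _ in range(total_steps):
        trait += random.gauss(0.0, drift_sd)   # slow, steady anagenetic change
    for _ in range(n_splits):
        trait += random.gauss(0.0, jump_sd)    # punctuational burst at a split
    return trait

def mean_abs(total_steps, n_splits, drift_sd, jump_sd, trials=2000):
    """Average absolute divergence over many simulated lineages."""
    return sum(abs(divergence(total_steps, n_splits, drift_sd, jump_sd))
               for _ in range(trials)) / trials

# Hold elapsed time fixed (200 steps); vary only the number of splits.
gradual_few  = mean_abs(200, 1,  drift_sd=0.05, jump_sd=0.0)
gradual_many = mean_abs(200, 16, drift_sd=0.05, jump_sd=0.0)
punct_few    = mean_abs(200, 1,  drift_sd=0.05, jump_sd=1.0)
punct_many   = mean_abs(200, 16, drift_sd=0.05, jump_sd=1.0)

print(f"gradualism : {gradual_few:.2f} vs {gradual_many:.2f} (splits don't matter)")
print(f"punctuated : {punct_few:.2f} vs {punct_many:.2f} (divergence tracks splits)")
```

In the gradual case the two averages come out essentially equal; in the punctuated case the many-split lineages diverge several times further. Comparing branch counts against divergence on real trees is, very roughly, the kind of test the new framework formalizes.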
The new study adds to previous support for the punctuated equilibrium phenomenon, said Pagel, who wasn’t involved in the project. However, the rapid evolutionary behavior isn’t a unique process separate from natural selection, as Eldredge and Gould suggested, but rather the result of periods of extremely rapid adaptation propelling evolutionary change.
“This is really a rather beautiful story in the philosophy of science,” Pagel said…
Given the strains that the Anthropocene is putting on our environment, this could be timely…
* Charles Darwin
###
As we dissect development, we might spare a thought for Barbara McClintock; she died on this date in 1992. A cytogeneticist, she is regarded as one of the most important figures in the history of genetics. In the 1940s and 50s, McClintock’s work on the cytogenetics of maize led her to theorize that genes are transposable – they can move around – on and between chromosomes. McClintock drew this inference by observing changing patterns of coloration in maize kernels over generations of controlled crosses. The idea that genes could move did not seem to fit with what was then known about genes, but improved molecular techniques of the late 1970s and early 1980s allowed other scientists to confirm her discovery. She was awarded the 1983 Nobel Prize in Physiology or Medicine, the first American woman to win an unshared Nobel Prize.
For more on McClintock’s work and its legacy, see here and here.