Posts Tagged ‘scientific revolution’
“The bigger, the better”*…
Thea Applebaum Licht with a reminder that, when it comes to size, Texas has got nothing on California…
Between about 1905 and 1915, the United States entered a golden age of postcards. Cheaper and faster mail service, the advent of “divided back” cards (freeing the entire front for images), and improved commercial printing all drove a new mass market for collectible communication. It was at this same moment that a craze for “tall-tale” or “exaggeration” postcards reached its peak. By cutting, collaging, and re-photographing images, artists created out-of-proportion illusions. One of the most popular genres was agricultural goods of fantastic dimensions.
Nowhere were such postcards more popular than in the western states. There, in the heart of the tough business of agriculture, illustrations of folkloric American abundance were understandable favorites. Pride and place were tied up with the prodigious crops. Supersized fruits and vegetables were often accompanied by brief captions: “How We Do Things at Attica, Wis.”, “The Kind We Raise in Our State”, or “The Kind We Grow in Texas”. Photographers like William “Dad” H. Martin and Alfred Stanley Johnson Jr. captured farmers harvesting furniture-sized onions and stacking corn cobs like timber, fishermen reeling in leviathans, and children sharing canoe-like slices of watermelon.
In the series of exaggeration postcards [produced in the run-up to the postcard boom, then published during it] collected [here], it is California that takes center stage. Produced by the prolific San Francisco–based publisher Edward H. Mitchell, each card features a single rail car rolling through lush farmland. Aboard are gargantuan, luminous fruits and vegetables: dimpled navel oranges, a dusky bunch of grapes, and mottled walnuts. Placed end-to-end, the cards would make a colorful train crossing California’s fertile valleys. Unlike other, more action-packed “tall-tale” cards — filled with farmers, fishermen, and children for scale — Mitchell’s series is restrained. Sharply illuminated, the colossal cargo leans toward artwork rather than gag. “A Carload of Mammoth Apples” [here], green-yellow and gleaming, could have been plucked from René Magritte’s The Son of Man [here].
Fabulous fruit and vegetables: “Calicornication: Postcards of Giant Produce (1909),” from @publicdomainrev.bsky.social.
In other art-related news: (very) long-term readers might recall that, back in 2008, (R)D reported that London’s Daily Mail believed that it had tracked him down, and that he is Robin Gunningham. Now as Boing Boing reports:
Anyone reading Banksy’s Wikipedia article at any point since a famous Mail on Sunday exposé in 2008 would likely get the impression the secretive stenciler is probably Robin Gunningham or Robert Del Naja, artists who came from the Bristol Underground. Reuters, having conducted extensive research into their movements, finds both men present at critical moments, but only one at all of them: an arrest report from New York City puts Gunningham firmly in the frame, and recent public records from Ukraine put it beyond doubt.
We later unearthed previously undisclosed U.S. court records and police reports. These included a hand-written confession by the artist to a long-ago misdemeanor charge of disorderly conduct – a document that revealed, beyond dispute, Banksy’s true identity. … Reuters presented that man with its findings about his identity and detailed questions about his work and career. He didn’t reply. Banksy’s company, Pest Control, said the artist “has decided to say nothing.”
His long-time lawyer, Mark Stephens, wrote to Reuters that Banksy “does not accept that many of the details contained within your enquiry are correct.” He didn’t elaborate. Without confirming or denying Banksy’s identity, Stephens urged us not to publish this report, saying doing so would violate the artist’s privacy, interfere with his art and put him in danger.
Del Naja (better known for other work) evidently participates in painting the murals and is perhaps the stencil draftsman (Banksy: “he can actually draw”). Banksy’s former manager, Steve Lazarides, organized a legal name change for Gunningham after the Mail on Sunday item, which successfully ended records of Banksy’s movements under his birth name and stymied researchers—until Reuters figured out the new one by poring through Ukrainian public records on days Del Naja was there. Gunningham used the name David Jones, among the most common in the U.K. If it rings a bell, you might be thinking of another famous British artist who was obliged by his record company to find something more distinctive.
* common idiom
###
As we live large, we might spare a thought for Isaac Newton; he died on this date (O.S.) in 1727. A polymath who was a key figure in the Scientific Revolution and the Enlightenment that followed, Newton was a mathematician, physicist, astronomer, alchemist, theologian, author, and inventor. He contributed to and refined the scientific method, and his work is considered the most influential in bringing forth modern science. His book Philosophiæ Naturalis Principia Mathematica (Mathematical Principles of Natural Philosophy), first published in 1687, achieved the first great unification in physics and established classical mechanics. He also made seminal contributions to optics, and shares credit with the German mathematician Gottfried Wilhelm Leibniz for formulating infinitesimal calculus. (Newton developed calculus a couple of years before Leibniz, but published a couple of years after.) Newton spent the last three decades of his life in London, serving as Warden (1696–1699) and Master (1699–1727) of the Royal Mint, a role in which he increased the trustworthiness/accuracy and security of British coinage in a way crucial to the rise of Great Britain as a commercial and colonial power.
Newton, of course, had a famous relationship with fruit:
Newton often told the story that he was inspired to formulate his theory of gravitation by watching the fall of an apple from a tree. The story is believed to have passed into popular knowledge after being related by Catherine Barton, Newton’s niece, to Voltaire. Voltaire then wrote in his Essay on Epic Poetry (1727), “Sir Isaac Newton walking in his gardens, had the first thought of his system of gravitation, upon seeing an apple falling from a tree.” – source
Newton’s apple is thought to have been the green-skinned ‘Flower of Kent’ variety.

“Truth is ever to be found in the simplicity, and not in the multiplicity and confusion of things”*…
From Kim (Scott) Morrison‘s and Dror Bar-Natan‘s The Knot Atlas, “a complete user-editable knot atlas, in the wiki spirit of Wikipedia”– a marvelous example of a widespread urge in mathematics to find order through classification. As Joseph Howlett explains, that quest continues, even as it proves vexatious…
Biology in the 18th century was all about taxonomy. The staggering diversity of life made it hard to draw conclusions about how it came to be. Scientists first had to put things in their proper order, grouping species according to shared characteristics — no easy task. Since then, they’ve used these grand catalogs to understand the differences among organisms and to infer their evolutionary histories. Chemists built the periodic table for the same purpose — to classify the elements and understand their behaviors. And physicists made the Standard Model to explain how the fundamental particles of the universe interact.
In his book The Order of Things, the philosopher Michel Foucault describes this preoccupation with sorting as a formative step for the sciences. “A knowledge of empirical individuals,” he wrote, “can be acquired only from the continuous, ordered and universal tabulation of all possible differences.”
Mathematicians never got past this obsession. That’s because the menagerie of mathematics makes the biological catalog look like a petting zoo. Its inhabitants aren’t limited by physical reality. Any conceivable possibility, whether it lives in our universe or in some hypothetical 200-dimensional one, needs to be accounted for. There are tons of different classifications to try — groups, knots, manifolds and so on — and infinitely many objects to sort in each of those classifications. Classification is how mathematicians come to know the strange, abstract world they’re studying, and how they prove major theorems about it.

Take groups, a central object of study in math. The classification of “finite simple groups” — the building blocks of all groups — was one of the grandest mathematical accomplishments of the 20th century. It took dozens of mathematicians nearly 100 years to finish. In the end, they figured out that all finite simple groups fall into three buckets, except for 26 itemized outliers. A dedicated crew of mathematicians has been working on a “condensed” proof of the classification since 1994 — it currently comprises 10 volumes and several thousand pages, and still isn’t finished. But the gargantuan undertaking continues to bear fruit, recently helping to prove a decades-old conjecture that you can infer a lot about a group by examining one small part of it.
Mathematics, unfettered by the typical constraints of reality, is all about possibility. Classification gives mathematicians a way to start exploring that limitless potential…[Howlett reviews attempts to classify numbers by “type” (positive/negative, rational/irrational), and mathematical objects by “equivalency” (shapes that can be stretched or squeezed into one another without breaking or tearing, like a doughnut and a coffee cup; see here)…]
… Similarly, classification has played an important role in knot theory. Tie a knot in a piece of string, then glue the string’s ends together — that’s a mathematical knot. Knots are equivalent if one can be tangled or untangled, without cutting the string, to match the other. This mundane-sounding task has lots of mathematical uses. In 2023, five mathematicians made progress on a key conjecture in knot theory that stated that all knots with a certain property (being “slice”) must also have another (being “ribbon”), with the proof ruling out a suspected counterexample. (As an aside, I’ve often wondered why knot theorists insist on using nouns as adjectives.)
Classifications can also get more meta. Both theoretical computer scientists and mathematicians classify problems about classification based on how “hard” they are.
All these classifications turn math’s disarrayed infinitude into accessible order. It’s a first step toward reining in the deluge that pours forth from mathematical imaginings…
“The Never-Ending Struggle to Classify All Math,” from @quantamagazine.bsky.social.
* Isaac Newton
###
As we sort, we might spare a thought for the author of our title quote, Sir Isaac Newton; he died on this date in 1727. A polymath, Newton excelled in– and advanced– mathematics, physics, and astronomy; he was a theologian and a government official (Master of the Mint)… and a dedicated alchemist. He was key to the Scientific Revolution and the Enlightenment that followed.
Newton’s book Philosophiæ Naturalis Principia Mathematica (Mathematical Principles of Natural Philosophy), first published in 1687, achieved the first great unification in physics and established classical mechanics (e.g., the Laws of Motion and the principle of universal gravitation). He also made seminal contributions to optics, and shares credit with German mathematician Gottfried Wilhelm Leibniz for formulating infinitesimal calculus. Indeed, Newton contributed to and refined the scientific method to such an extent that his work is considered the most influential in the development of modern science.
“A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it”*…
It’s very hard, historian of science Benjamin Breen explains, to understand the implications of a scientific revolution as one is living through it…
2023 is shaping up to be an important year in the history of science. And no, I’m not talking about the reputed room-temperature superconductor LK-99, which seems increasingly likely to be a dud.
Instead, I’m talking about the discoveries you’ll find in Wikipedia’s list of scientific advances for 2023. Here are some examples:
• January: Positive results from a clinical trial of a vaccine for RSV; OpenAI’s ChatGPT enters wide use.
• February: A major breakthrough in quantum computing; announcement of a tiny robot that can clean blood vessels; more evidence for the ability of psychedelics to enhance neuroplasticity; major developments in biocomputers.
• March: OpenAI rolls out GPT-4; continued progress on mRNA vaccines for cancer.
• April: NASA announces astronaut crew who will orbit the moon next year; promising evidence for gene therapy to fight Alzheimer’s.
• May: Scientists use AI to translate brain activity into written words; promising results for a different Alzheimer’s drug; human pangenome sequenced (largely by a team of UCSC researchers — go Banana Slugs!); more good news about the potential of mRNA vaccines for fighting cancer.
And skipping ahead to just the past two weeks:
• nuclear fusion ignition with net energy gain was achieved for the second time
• a radical new approach to attacking cancer tumors entered Phase 1 trials in humans
• and — announced just as I was writing this [in August, 2023] — one of the new crop of weight loss drugs was reported to cut rates of heart attack and stroke in high-risk individuals by 20% (!).
Also in January of 2023: the New York Times asked “What Happened to All of Science’s Big Breakthroughs?”
The headline refers to an article published in Nature which argues that there has been a steady drop in “disruptive” scientific and technological breakthroughs between the years of 1945 and 2010. Basically, it’s a restatement of the concept of a “Great Stagnation” which was proposed by the economist Tyler Cowen in 2011. Though the paper cites everyone from Cowen to Albert Einstein and Isaac Newton, it’s worth noting that it doesn’t cite a single historian of science or technology (unless Alexandre Koyré counts)…
Naturally, as a historian of science and medicine, I think that there really are important things to learn from the history of science and medicine! And what I want to argue for the rest of this post boils down to two specific lessons from that history:
- People living through scientific revolutions are usually unaware of them — and, if they are, they don’t think about them in the same way that later generations do.
- An apparent slowdown in the rate of scientific innovation doesn’t always mean a slowdown in the impacts of science. The history of the first scientific revolution — the one that began in the famously terrible seventeenth century — suggests that the positive impacts of scientific innovation, in particular, are not always felt by the people living through the period of innovation. Periods when the pace of innovation appears to slow down may also be eras when society becomes more capable of benefitting from scientific advances by learning how to mitigate previously unforeseen risks.
[… There follows a fascinating look back at the 1660s– the “original” scientific revolution– at Boyle and Newton, at what they hoped/expected, and at how that differed from what their work and that of their colleagues actually yielded. Then the cautionary tale of Thomas Midgley…]
As we appear to be entering a new era of rapid scientific innovation in the 2020s, it is worth remembering that it often takes decades before the lasting social value of a technical innovation is understood — and decades more before we understand its downsides.
In the meantime, I’m pretty psyched about the cancer drugs…
As Thomas Kuhn observed, “The historian of science may be tempted to exclaim that when paradigms change, the world itself changes with them.”
On the difficulty of knowing the outcomes of a scientific revolution from within it: “Experiencing scientific revolutions: the 1660s and the 2020s,” from @ResObscura.
* Max Planck
###
As we try to see, we might spare a thought for William Seward Burroughs; he died on this date in 1898. An inventor who had worked in a bank, he invented the world’s first commercially viable recording adding machine and pioneered its manufacture. The very successful company that he founded went on to become Unisys, which was instrumental in the development of computing… the implications of which we’re still discovering– and which Burroughs surely never saw.
Nor, one reckons, did he imagine that his grandson, William Seward Burroughs II, would become the cultural figure that he did.
“Two polar groups: at one pole we have the literary intellectuals, at the other scientists… Between the two a gulf of mutual incomprehension.”*…

A contempt for science is neither new, lowbrow, nor confined to the political right. In his famous 1959 lecture “The Two Cultures and the Scientific Revolution,” C.P. Snow commented on the disdain for science among educated Britons and called for a greater integration of science into intellectual life. In response to this overture, the literary critic F.R. Leavis wrote a rebuttal in 1962 that was so vituperative The Spectator had to ask Snow to promise not to sue for libel if they published the work.
The highbrow war on science continues to this day, with flak not just from fossil-fuel-funded politicians and religious fundamentalists but also from our most adored intellectuals and in our most august institutions of higher learning. Magazines that are ostensibly dedicated to ideas confine themselves to those arising in politics and the arts, with scant attention to new ideas emerging from science, with the exception of politicized issues like climate change (and regular attacks on a sin called “scientism”). Just as pernicious is the treatment of science in the liberal-arts curricula of many universities. Students can graduate with only a trifling exposure to science, and what they do learn is often designed to poison them against it.
The most frequently assigned book on science in universities (aside from a popular biology textbook) is Thomas Kuhn’s The Structure of Scientific Revolutions. That 1962 classic is commonly interpreted as showing that science does not converge on the truth but merely busies itself with solving puzzles before lurching to some new paradigm that renders its previous theories obsolete; indeed, unintelligible. Though Kuhn himself disavowed that nihilist interpretation, it has become the conventional wisdom among many intellectuals. A critic from a major magazine once explained to me that the art world no longer considers whether works of art are “beautiful” for the same reason that scientists no longer consider whether theories are “true.” He seemed genuinely surprised when I corrected him…
The usually extremely optimistic Steven Pinker (see here, e.g.) waxes concerned– if not, indeed, pessimistic– about the place of science in today’s society: “The Intellectual War on Science.”
* C.P. Snow, The Two Cultures and the Scientific Revolution (1959)
###
As we rein in our relativism, we might send heavenly birthday greetings to the scientist who inspired Thomas Kuhn (see here and here), Nicolaus Copernicus; he was born on this date in 1473. A Renaissance polyglot and polymath– he was a canon lawyer, a mathematician, a physician, a classics scholar, a translator, a governor, a diplomat, and an economist– he is best remembered as an astronomer. Copernicus’ De revolutionibus orbium coelestium (On the Revolutions of the Celestial Spheres; published just before his death in 1543), with its heliocentric account of the solar system, is often regarded as the beginning both of modern astronomy and of the scientific revolution.
Of all discoveries and opinions, none may have exerted a greater effect on the human spirit than the doctrine of Copernicus. The world had scarcely become known as round and complete in itself when it was asked to waive the tremendous privilege of being the center of the universe. Never, perhaps, was a greater demand made on mankind – for by this admission so many things vanished in mist and smoke! What became of our Eden, our world of innocence, piety and poetry; the testimony of the senses; the conviction of a poetic – religious faith? No wonder his contemporaries did not wish to let all this go and offered every possible resistance to a doctrine which in its converts authorized and demanded a freedom of view and greatness of thought so far unknown, indeed not even dreamed of.
– Goethe
The life of trees…

Bristlecone pine rings: they are so fine and dense that over a century of life can be embodied in a single inch of wood
The oldest of the living bristlecone pines were saplings when the pyramids were raised. The most ancient, called Methuselah, is estimated to be more than 4,800 years old. As Ross Andersen explains in Aeon, their rings tell tales of climates past… and hold portents of climates to come.
The burning of books and libraries has perhaps fallen out of fashion, but if you look closely, you will find its spirit survives in another distinctly human activity, one as old as civilisation itself: the destruction of forests. Trees and forests are repositories of time; to destroy them is to destroy an irreplaceable record of the Earth’s past. Over this past century of unprecedented deforestation, a tiny cadre of scientists has roamed the world’s remaining woodlands, searching for trees with long memories, trees that promise science a new window into antiquity. To find a tree’s memories, you have to look past its leaves and even its bark; you have to go deep into its trunk, where the chronicles of its long life lie, secreted away like a library’s lost scrolls. This spring, I journeyed to the high, dry mountains of California to visit an ancient forest, a place as dense with history as Alexandria. A place where the heat of a dangerous fire is starting to rise…
Deforestation began in prehistoric times, but it was not always as brutal or efficient as it is today. Our primate ancestors practised a kind of deforestation by migration, trading the treetops for terra firma and the forests for open plains. Humans are a different story. Anthropologists suspect we have been cutting down trees for as long as we have been around, mostly to harvest raw material for shelter and fire, but also to construct crude bridges to cross rivers into new landscapes. For a time, our tree-felling was no match for the regenerative power of forests. Indeed, today’s indigenous forest peoples demonstrate the human capacity to live within a forest’s natural limits. But over the past 5,000-10,000 years, our fast-growing civilisations have developed the technology to clear trees faster than they can grow back. In that short time — a slim fraction of the forests’ tenure on Earth — we have managed to destroy more than half of them. And we are getting better at it…
Mankind’s newest deforestation “tool”– climate change:
In 2005, a researcher from Arizona’s tree-ring lab named Matthew Salzer noticed an unusual trend in the most recent stretch of bristlecone tree rings. Over the past half century, bristlecones near the tree line have grown faster than in any 50-year period of the past 3,700 years, a shift that portends ‘an environmental change unprecedented in millennia,’ according to Salzer. As temperatures along the peaks warm, the bristlecones are fattening up, adding thick rings in every spring season. Initially there was hope that the trend was local to the White Mountains, but Salzer and his colleagues have found the same string of fat rings — the same warming — in three separate bristlecone habitats in the western US. This might sound like good news for the trees, but it most assuredly is not. Indeed, the thick new rings might be a prophecy of sorts, a foretelling of the trees’ extinction…
Over the course of 400 million years, trees built up a fertile new layer on the Earth’s surface, a layer that incubated entirely new ecologies, including those that gave rise to our ancestors. But now it is humans who spread out over the planet, coating its surface in cities and farms, clearing away the very trees that enabled our origins. This forest, like so many others, has become an intersection in time, a place where narratives of geologic grandeur are colliding. A place to put your ear to the ground, a place to confirm that even here, in the most ancient of groves, if you listen closely, you can hear the roar of the coming Anthropocene.
Read this moving story in its entirety at Aeon.
###
As we ponder pruning our purchases, we might say both hello and goodbye to Sir Thomas Browne; he was born on this date in 1605, and died on this date in 1682. A devout Christian doctor (author of Religio Medici [The Religion of a Physician]), Browne was also a follower of Francis Bacon, an adherent of the Baconian dedication to enquiry, and as a consequence, a keen observer of (and writer about) the natural world… thus an early partisan in what we now call the Scientific Revolution.




