“A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it”*…
It’s very hard, historian of science Benjamin Breen explains, to understand the implications of a scientific revolution as one is living through it…
2023 is shaping up to be an important year in the history of science. And no, I’m not talking about the reputed room-temperature superconductor LK-99, which seems increasingly likely to be a dud.
Instead, I’m talking about the discoveries you’ll find in Wikipedia’s list of scientific advances for 2023. Here are some examples:
• January: Positive results from a clinical trial of a vaccine for RSV; OpenAI’s ChatGPT enters wide use.
• February: A major breakthrough in quantum computing; announcement of a tiny robot that can clean blood vessels; more evidence for the ability of psychedelics to enhance neuroplasticity; major developments in biocomputers.
• March: OpenAI rolls out GPT-4; continued progress on mRNA vaccines for cancer.
• April: NASA announces astronaut crew who will orbit the moon next year; promising evidence for gene therapy to fight Alzheimer’s.
• May: Scientists use AI to translate brain activity into written words; promising results for a different Alzheimer’s drug; human pangenome sequenced (largely by a team of UCSC researchers — go Banana Slugs!); more good news about the potential of mRNA vaccines for fighting cancer.
And skipping ahead to just the past two weeks:
• nuclear fusion ignition with net energy gain was achieved for the second time
• a radical new approach to attacking cancer tumors entered Phase 1 trials in humans
• and — announced just as I was writing this [in August, 2023] — one of the new crop of weight loss drugs was reported to cut rates of heart attack and stroke in high-risk individuals by 20% (!).
Also in January of 2023: the New York Times asked “What Happened to All of Science’s Big Breakthroughs?”
The headline refers to an article published in Nature which argues that there has been a steady drop in “disruptive” scientific and technological breakthroughs between 1945 and 2010. Basically, it’s a restatement of the concept of a “Great Stagnation,” which the economist Tyler Cowen proposed in 2011. Though the paper cites everyone from Cowen to Albert Einstein and Isaac Newton, it’s worth noting that it doesn’t cite a single historian of science or technology (unless Alexandre Koyré counts)…
Naturally, as a historian of science and medicine, I think that there really are important things to learn from the history of science and medicine! And what I want to argue for the rest of this post boils down to two specific lessons from that history:
- People living through scientific revolutions are usually unaware of them — and, if they are aware, they don’t think about them in the same way that later generations do.
- An apparent slowdown in the rate of scientific innovation doesn’t always mean a slowdown in the impacts of science. The history of the first scientific revolution — the one that began in the famously terrible seventeenth century — suggests that the positive impacts of scientific innovation, in particular, are not always felt by the people living through the period of innovation. Periods when the pace of innovation appears to slow down may also be eras when society becomes more capable of benefitting from scientific advances by learning how to mitigate previously unforeseen risks.
[… There follows a fascinating look back at the 1660s– the “original” scientific revolution– at Boyle and Newton, at what they hoped/expected, and at how that differed from what their work and that of their colleagues actually yielded. Then the cautionary tale of Thomas Midgley…]
As we appear to be entering a new era of rapid scientific innovation in the 2020s, it is worth remembering that it often takes decades before the lasting social value of a technical innovation is understood — and decades more before we understand its downsides.
In the meantime, I’m pretty psyched about the cancer drugs…
As Thomas Kuhn observed, “The historian of science may be tempted to exclaim that when paradigms change, the world itself changes with them.”
On the difficulty of knowing the outcomes of a scientific revolution from within it: “Experiencing scientific revolutions: the 1660s and the 2020s,” from @ResObscura.
* Max Planck
###
As we try to see, we might spare a thought for William Seward Burroughs; he died on this date in 1898. An inventor who had worked in a bank, he invented the world’s first commercially viable recording adding machine and pioneered its manufacture. The very successful company that he founded went on to become Unisys, which was instrumental in the development of computing… the implications of which we’re still discovering– and which Burroughs surely never saw.
Nor, one reckons, did he imagine that his grandson, William Seward Burroughs II, would become the cultural figure that he did.
“If we knew what it was we were doing, it would not be called research, would it?”*…

The first issue of the first volume of the first scientific journal, the Philosophical Transactions of the Royal Society of London
Scientific papers, at the very dawn of that writing form, hadn’t yet evolved the conventions we’re so familiar with today. As a result, the contents of that first volume (and those that followed) are a fascinating mix of the groundbreaking, the banal, and the bizarre. Some are written as letters, some take the form of essays, some are abstracts or reviews of separately published books, and some are just plain inscrutable…
For example, this contribution from Robert Boyle, the father of modern chemistry and a pioneer of the scientific method:
A New Frigorifick Experiment Shewing, How a Considerable Degree of Cold May be Suddenly Produced without the Help of Snow, Ice, Haile, Wind, or Niter, and That at Any Time of the Year – Robert Boyle (again!) (Phil Trans 1:255-261). The word “frigorific”, which Boyle apparently coined for this title, meant “producing cold”, and Boyle’s claim was that simply mixing ammonium chloride into water would cool the solution down. This doesn’t seem to actually be true (saltpetre is frigorific; straight ammonium chloride can keep water liquid below normal freezing point, but isn’t actually frigorific). But although Boyle’s title is a bit hyperbolic, and he does go on a bit, he describes his experiments quite lucidly, so it’s probably unfair to call this one a weird paper. Whether Boyle was right or wrong, here he was doing modern science…
Stephen Heard observes…
Boyle’s Frigorifick paper raises an important point: not every paper in the early Philosophical Transactions was weird, even if in a few cases it takes a close reading to realize that. The oddities are interspersed with important observations (like those of Jupiter’s Great Red Spot) and descriptions of major advances (like Robert Hooke’s microscopic observations of cells). But the oddities are there by the dozen, and they give the impression of a freewheeling, chaotic, and perhaps somewhat credulous period at the birth of modern science. It was not yet quite clear where the boundaries of science were – where to draw the lines between science and engineering, or architecture, or alchemy, or wild speculation…
Sound familiar?
See more examples and learn more at “The Golden Age of Weird Papers.”
* Albert Einstein
###
As we scratch our chins, we might spare a thought for Max Born; he died on this date in 1970. A German physicist and Nobel Laureate, he coined the phrase “quantum mechanics” to describe the field in which he made his greatest contributions. But beyond his accomplishments as a practitioner, he was a master teacher whose students included Enrico Fermi and Werner Heisenberg– both of whom became Nobel Laureates before their mentor– and J. Robert Oppenheimer.
Less well-known is that Born was the grandfather of Australian phenom and definitive Sandy-portrayer Olivia Newton-John.