(Roughly) Daily

Posts Tagged ‘scientific revolution’

“A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it”*…

It’s very hard, historian of science Benjamin Breen explains, to understand the implications of a scientific revolution as one is living through it…

2023 is shaping up to be an important year in the history of science. And no, I’m not talking about the reputed room-temperature superconductor LK-99, which seems increasingly likely to be a dud.

Instead, I’m talking about the discoveries you’ll find in Wikipedia’s list of scientific advances for 2023. Here are some examples:

• January: Positive results from a clinical trial of a vaccine for RSV; OpenAI’s ChatGPT enters wide use.

• February: A major breakthrough in quantum computing; announcement of a tiny robot that can clean blood vessels; more evidence for the ability of psychedelics to enhance neuroplasticity; major developments in biocomputers.

• March: OpenAI rolls out GPT-4; continued progress on mRNA vaccines for cancer.

• April: NASA announces astronaut crew who will orbit the moon next year; promising evidence for gene therapy to fight Alzheimer’s.

• May: Scientists use AI to translate brain activity into written words; promising results for a different Alzheimer’s drug; human pangenome sequenced (largely by a team of UCSC researchers — go Banana Slugs!); more good news about the potential of mRNA vaccines for fighting cancer.

And skipping ahead to just the past two weeks:

• nuclear fusion ignition with net energy gain was achieved for the second time

• a radical new approach to attacking cancer tumors entered Phase 1 trials in humans

• and — announced just as I was writing this [in August, 2023] — one of the new crop of weight loss drugs was reported to cut rates of heart attack and stroke in high-risk individuals by 20% (!).

Also in January of 2023: the New York Times asked “What Happened to All of Science’s Big Breakthroughs?”

The headline refers to an article published in Nature which argues that there has been a steady drop in “disruptive” scientific and technological breakthroughs between the years of 1945 and 2010. Basically, it’s a restatement of the concept of a “Great Stagnation” which was proposed by the economist Tyler Cowen in 2011. Though the paper cites everyone from Cowen to Albert Einstein and Isaac Newton, it’s worth noting that it doesn’t cite a single historian of science or technology (unless Alexandre Koyré counts)…

Naturally, as a historian of science and medicine, I think that there really are important things to learn from the history of science and medicine! And what I want to argue for the rest of this post boils down to two specific lessons from that history:

  1. People living through scientific revolutions are usually unaware of them — and, if they are, they don’t think about them in the same way that later generations do.
  2. An apparent slowdown in the rate of scientific innovation doesn’t always mean a slowdown in the impacts of science. The history of the first scientific revolution — the one that began in the famously terrible seventeenth century — suggests that the positive impacts of scientific innovation, in particular, are not always felt by the people living through the period of innovation. Periods when the pace of innovation appears to slow down may also be eras when society becomes more capable of benefitting from scientific advances by learning how to mitigate previously unforeseen risks.

[… There follows a fascinating look back at the 1660s– the “original” scientific revolution– at Boyle, Newton, at what they hoped/expected, and at how that differed from what their work and that of their colleagues actually yielded. Then the cautionary tale of Thomas Midgley…]

As we appear to be entering a new era of rapid scientific innovation in the 2020s, it is worth remembering that it often takes decades before the lasting social value of a technical innovation is understood — and decades more before we understand its downsides.

In the meantime, I’m pretty psyched about the cancer drugs…

As Thomas Kuhn observed, “The historian of science may be tempted to exclaim that when paradigms change, the world itself changes with them.”

On the difficulty of knowing the outcomes of a scientific revolution from within it: “Experiencing scientific revolutions: the 1660s and the 2020s,” from @ResObscura.

* Max Planck

###

As we try to see, we might spare a thought for William Seward Burroughs; he died on this date in 1898. An inventor who had worked in a bank, he created the world’s first commercially viable recording adding machine and pioneered its manufacture. The very successful company that he founded went on to become Unisys, which was instrumental in the development of computing… the implications of which we’re still discovering– and which Burroughs surely never foresaw.

Nor, one reckons, did he imagine that his grandson, William Seward Burroughs II, would become the cultural figure that he did.

source

“Two polar groups: at one pole we have the literary intellectuals, at the other scientists… Between the two a gulf of mutual incomprehension.”*…

 

A contempt for science is neither new, lowbrow, nor confined to the political right. In his famous 1959 lecture “The Two Cultures and the Scientific Revolution,” C.P. Snow commented on the disdain for science among educated Britons and called for a greater integration of science into intellectual life. In response to this overture, the literary critic F.R. Leavis wrote a rebuttal in 1962 that was so vituperative The Spectator had to ask Snow to promise not to sue for libel if they published the work.

The highbrow war on science continues to this day, with flak not just from fossil-fuel-funded politicians and religious fundamentalists but also from our most adored intellectuals and in our most august institutions of higher learning. Magazines that are ostensibly dedicated to ideas confine themselves to those arising in politics and the arts, with scant attention to new ideas emerging from science, with the exception of politicized issues like climate change (and regular attacks on a sin called “scientism”). Just as pernicious is the treatment of science in the liberal-arts curricula of many universities. Students can graduate with only a trifling exposure to science, and what they do learn is often designed to poison them against it.

The most frequently assigned book on science in universities (aside from a popular biology textbook) is Thomas Kuhn’s The Structure of Scientific Revolutions. That 1962 classic is commonly interpreted as showing that science does not converge on the truth but merely busies itself with solving puzzles before lurching to some new paradigm that renders its previous theories obsolete; indeed, unintelligible. Though Kuhn himself disavowed that nihilist interpretation, it has become the conventional wisdom among many intellectuals. A critic from a major magazine once explained to me that the art world no longer considers whether works of art are “beautiful” for the same reason that scientists no longer consider whether theories are “true.” He seemed genuinely surprised when I corrected him…

The usually extremely optimistic Steven Pinker (see here, e.g.) waxes concerned– if not, indeed, pessimistic– about the place of science in today’s society: “The Intellectual War on Science.”

* C.P. Snow, The Two Cultures and the Scientific Revolution (1959)

###

As we rein in our relativism, we might send heavenly birthday greetings to the scientist who inspired Thomas Kuhn (see here and here), Nicolaus Copernicus; he was born on this date in 1473.  A Renaissance polyglot and polymath– he was a canon lawyer, a mathematician, a physician,  a classics scholar, a translator, a governor, a diplomat, and an economist– he is best remembered as an astronomer.  Copernicus’ De revolutionibus orbium coelestium (On the Revolutions of the Celestial Spheres; published just before his death in 1543), with its heliocentric account of the solar system, is often regarded as the beginning both of modern astronomy and of the scientific revolution.

Of all discoveries and opinions, none may have exerted a greater effect on the human spirit than the doctrine of Copernicus. The world had scarcely become known as round and complete in itself when it was asked to waive the tremendous privilege of being the center of the universe. Never, perhaps, was a greater demand made on mankind – for by this admission so many things vanished in mist and smoke! What became of our Eden, our world of innocence, piety and poetry; the testimony of the senses; the conviction of a poetic – religious faith? No wonder his contemporaries did not wish to let all this go and offered every possible resistance to a doctrine which in its converts authorized and demanded a freedom of view and greatness of thought so far unknown, indeed not even dreamed of.

– Goethe

 source

Written by (Roughly) Daily

February 19, 2018 at 1:01 am

The life of trees…

 

Bristlecone pine rings: they are so fine and dense that over a century of life can be embodied in a single inch of wood

The oldest of the living bristlecone pines were saplings when the pyramids were raised.  The most ancient, called Methuselah, is estimated to be more than 4,800 years old.  As Ross Andersen explains in Aeon, their rings tell tales of climates past… and hold portents of climates to come.

The burning of books and libraries has perhaps fallen out of fashion, but if you look closely, you will find its spirit survives in another distinctly human activity, one as old as civilisation itself: the destruction of forests. Trees and forests are repositories of time; to destroy them is to destroy an irreplaceable record of the Earth’s past. Over this past century of unprecedented deforestation, a tiny cadre of scientists has roamed the world’s remaining woodlands, searching for trees with long memories, trees that promise science a new window into antiquity. To find a tree’s memories, you have to look past its leaves and even its bark; you have to go deep into its trunk, where the chronicles of its long life lie, secreted away like a library’s lost scrolls. This spring, I journeyed to the high, dry mountains of California to visit an ancient forest, a place as dense with history as Alexandria. A place where the heat of a dangerous fire is starting to rise…

Deforestation began in prehistoric times, but it was not always as brutal or efficient as it is today. Our primate ancestors practised a kind of deforestation by migration, trading the treetops for terra firma and the forests for open plains. Humans are a different story. Anthropologists suspect we have been cutting down trees for as long as we have been around, mostly to harvest raw material for shelter and fire, but also to construct crude bridges to cross rivers into new landscapes. For a time, our tree-felling was no match for the regenerative power of forests. Indeed, today’s indigenous forest peoples demonstrate the human capacity to live within a forest’s natural limits. But over the past 5,000-10,000 years, our fast-growing civilisations have developed the technology to clear trees faster than they can grow back. In that short time — a slim fraction of the forests’ tenure on Earth — we have managed to destroy more than half of them. And we are getting better at it…

Mankind’s newest deforestation “tool”– climate change:

In 2005, a researcher from Arizona’s tree-ring lab named Matthew Salzer noticed an unusual trend in the most recent stretch of bristlecone tree rings. Over the past half century, bristlecones near the tree line have grown faster than in any 50-year period of the past 3,700 years, a shift that portends ‘an environmental change unprecedented in millennia,’ according to Salzer. As temperatures along the peaks warm, the bristlecones are fattening up, adding thick rings in every spring season. Initially there was hope that the trend was local to the White Mountains, but Salzer and his colleagues have found the same string of fat rings — the same warming — in three separate bristlecone habitats in the western US. This might sound like good news for the trees, but it most assuredly is not. Indeed, the thick new rings might be a prophecy of sorts, a foretelling of the trees’ extinction…

Over the course of 400 million years, trees built up a fertile new layer on the Earth’s surface, a layer that incubated entirely new ecologies, including those that gave rise to our ancestors. But now it is humans who spread out over the planet, coating its surface in cities and farms, clearing away the very trees that enabled our origins. This forest, like so many others, has become an intersection in time, a place where narratives of geologic grandeur are colliding. A place to put your ear to the ground, a place to confirm that even here, in the most ancient of groves, if you listen closely, you can hear the roar of the coming Anthropocene.

Read this moving story in its entirety at Aeon.

###

As we ponder pruning our purchases, we might say both hello and goodbye to Sir Thomas Browne; he was born on this date in 1605, and died on this date in 1682.  A devout Christian doctor (author of Religio Medici [The Religion of a Physician]), Browne was also a follower of Francis Bacon, an adherent of the Baconian dedication to enquiry, and as a consequence, a keen observer of (and writer about) the natural world… thus an early partisan in what we now call the Scientific Revolution.

 source

 

Written by (Roughly) Daily

October 19, 2012 at 1:01 am

Meet a Beatle…

 click here for video

As a service to bewildered younger viewers of the recent Grammy Awards show, the History Channel and Twitter combined (under the auspices of Funny or Die) to produce the helpful documentary, Who is Paul McCartney?

As we say, “oh yeah (yeah yeah),” we might send heavenly birthday greetings to Renaissance astronomer Nicolaus Copernicus; he was born on this date in 1473.  Copernicus’ De revolutionibus orbium coelestium (On the Revolutions of the Celestial Spheres; published just before his death in 1543), with its heliocentric account of the solar system, is often regarded as the beginning both of modern astronomy and of the scientific revolution.

Of all discoveries and opinions, none may have exerted a greater effect on the human spirit than the doctrine of Copernicus. The world had scarcely become known as round and complete in itself when it was asked to waive the tremendous privilege of being the center of the universe. Never, perhaps, was a greater demand made on mankind – for by this admission so many things vanished in mist and smoke! What became of our Eden, our world of innocence, piety and poetry; the testimony of the senses; the conviction of a poetic – religious faith? No wonder his contemporaries did not wish to let all this go and offered every possible resistance to a doctrine which in its converts authorized and demanded a freedom of view and greatness of thought so far unknown, indeed not even dreamed of.

– Goethe

 Copernicus (source)

Written by (Roughly) Daily

February 19, 2012 at 1:01 am

Leading horses to water…

… and making them drink:

from Spiked Math.

On a more serious note… many are skeptical of “the Singularity”– the hypothetical point at which technological progress will have accelerated so much that the future becomes fundamentally unpredictable and qualitatively different from what’s gone before (click here for a transcript of the talk by Vernor Vinge that launched the concept, and here for a peek at what’s become of Vernor’s initial thinking).  But even those with doubts (among whom your correspondent numbers) acknowledge that technology is re-weaving the very fabric of life.  Readers interested in a better understanding of what’s afoot and where it might lead will appreciate Kevin Kelly’s What Technology Wants (and the continuing discussion on Kevin’s site).

As we re-set our multiplication tables, we might recall that it was on this date in 1664 that natural philosopher, architect and pioneer of the Scientific Revolution Robert Hooke showed an advance copy of his book Micrographia— a chronicle of Hooke’s observations through various lenses– to members of the Royal Society.  The volume (which coined the word “cell” in a biological context) went on to become the first scientific best-seller, and inspired broad interest in the new science of microscopy.

source: Cal Tech

UPDATE: Reader JR notes that the image above is of an edition of Micrographia dated 1665.  Indeed, while (per the almanac entry above) the text was previewed to the Royal Society in 1664 (to wit the letter, verso), the book wasn’t published until September, 1665.  JR remarks as well that Micrographia is in English (while most scientific books of that time were still in Latin)– a fact that no doubt contributed to its best-seller status.