(Roughly) Daily

Posts Tagged ‘history of science’

“Any fool can know. The point is to understand.”*…

A corridor in King’s College, Cambridge, England, dating from the 15th century

… and, Rachael Scarborough King and Seth Rudy argue, to serve a clear purpose…

Right now, many forms of knowledge production seem to be facing their end. The crisis of the humanities has reached a tipping point of financial and popular disinvestment, while technological advances such as new artificial intelligence programmes may outstrip human ingenuity. As news outlets disappear, extreme political movements question the concept of objectivity and the scientific process. Many of our systems for producing and certifying knowledge have ended or are ending.

We want to offer a new perspective by arguing that it is salutary – or even desirable – for knowledge projects to confront their ends. With humanities scholars, social scientists and natural scientists all forced to defend their work, from accusations of the ‘hoax’ of climate change to assumptions of the ‘uselessness’ of a humanities degree, knowledge producers within and without academia are challenged to articulate why they do what they do and, we suggest, when they might be done. The prospect of an artificially or externally imposed end can help clarify both the purpose and endpoint of our scholarship.

We believe the time has come for scholars across fields to reorient their work around the question of ‘ends’. This need not mean acquiescence to the logics of either economic utilitarianism or partisan fealty that have already proved so damaging to 21st-century institutions. But avoiding the question will not solve the problem. If we want the university to remain a viable space for knowledge production, then scholars across disciplines must be able to identify the goal of their work – in part to advance the Enlightenment project of ‘useful knowledge’ and in part to defend themselves from public and political mischaracterisation.

Our volume The Ends of Knowledge: Outcomes and Endpoints Across the Arts and Sciences (2023) asks how we should understand the ends of knowledge today. What is the relationship between an individual knowledge project – say, an experiment on a fruit fly, a reading of a poem, or the creation of a Large Language Model – and the aim of a discipline or field? In areas ranging from physics to literary studies to activism to climate science, we asked practitioners to consider the ends of their work – its purpose – as well as its end: the point at which it might be complete. The responses showed surprising points of commonality in identifying the ends of knowledge, as well as the value of having the end in sight…

Read on for a provocative case that academics need to think harder about the purpose of their disciplines and a consideration of whether some of those should come to an end: “The Ends of Knowledge,” in @aeonmag.

* Albert Einstein

###

As we contemplate conclusions, we might recall that it was on this date in 1869 that the first issue of the journal Nature was published.  Taking its title from a line of Wordsworth’s (“To the solid ground of nature trusts the Mind that builds for aye”), its aim was to “provide cultivated readers with an accessible forum for reading about advances in scientific knowledge.”  It remains a weekly, international, interdisciplinary journal of science, one of the few remaining that publish across a wide array of fields.  It is consistently ranked the world’s most cited scientific journal and is ascribed an impact factor of approximately 64.8, making it one of the world’s top academic journals.

Nature‘s first front page (source)

“The structure of the universe– I mean, of the heavens and the earth and the whole world– was arranged by one harmony through the blending of the most opposite principles”*…

Two diagrams from Agrippa’s De occulta philosophia (1533) demonstrating the proportion, measure, and harmony of human bodies — Source: left, right

… And as we undertake to understand that structure, we use the lens– the mental models and language– that we have. The redoubtable Anthony Grafton considers an early 16th century attempt: Heinrich Cornelius Agrippa‘s De Occulta Philosophia libri III, Agrippa’s encyclopedic study of magic that was, at the same time, an attempt to describe the structure of the universe, sketching a path that leads both upward and downward: up toward complete knowledge of God, and down into every order of being on earth…

Heinrich Cornelius Agrippa’s manual of learned magic, De occulta philosophia (1533), explicated the ways in which magicians understood and manipulated the cosmos more systematically than any of his predecessors. It was here that he mapped the entire network of forces that passed from angels and demons, stars and planets, downward into the world of matter. Agrippa laid his work out in three books, on the elementary, astrological, and celestial worlds. But he saw all of them as connected, weaving complex spider webs of influence that passed from high to low and low to high. With the zeal and learning of an encyclopedist imagined by Borges, Agrippa catalogued the parts of the soul and body, animals, minerals, and plants that came under the influence of any given planet or daemon. He then offered his readers a plethora of ways for averting evil influences and enhancing good ones. Some of these were originally simple remedies, many of them passed down from Roman times in the great encyclopedic work of Pliny the Elder and less respectable sources, and lacked any deep connection to learned magic.

[Grafton describes the many dimensions of Agrippa’s compilation of the then-current state of magic…]

But few of the dozens of manuscript compilations that transmitted magic through the Middle Ages reflected any effort to impose a system on the whole range of magical practices, as Agrippa’s book did. He made clear that each of the separate arts of magic, from the simplest form of herbal remedy to the highest forms of communication with angels, fitted into a single, lucid structure with three levels: the elementary or terrestrial realm, ruled by medicine and natural magic; the celestial realm, ruled by astrology; and the intellectual realm, ruled by angelic magic. Long tendrils of celestial and magical influence stitched these disparate realms into something like a single great being…

Agrippa offered, in other words, both a grand, schematic plan of the cosmos, rather like that of the London Underground, which laid out its structure as a whole, and a clutch of minutely detailed local Ordnance Survey maps, which made it possible to navigate through any specific part of the cosmos. Readers rapidly saw what Agrippa had to offer. The owner of a copy of On Occult Philosophy, now in Munich, made clear in his only annotation that he appreciated Agrippa’s systematic presentation of a universe in which physical forms revealed the natures of beings and their relations to one another: “Physiognomy, metoposcopy [the interpretation of faces], and chiromancy, and the arts of divination from the appearance and gestures of the human body work through signs.” Agrippa’s book not only became the manual of magical practice, but it also made the formal claim that magic was a kind of philosophy in its own right…

A 16th century attempt to understand the structure of the universe: “Marked by Stars– Agrippa’s Occult Philosophy,” from @scaliger in @PublicDomainRev.

* Aristotle

###

As we take in the totality, we might send more modern birthday greetings to a rough contemporary of Agrippa’s, Evangelista Torricelli; he was born on this date in 1608. Even as Agrippa was trying to understand the world via magic, Torricelli, a student of Galileo, was using observation and reason to fuel the same quest. A physicist and mathematician, he is best known for his invention of the barometer, but is also known for his advances in optics, his work on the method of indivisibles, and “Torricelli’s Trumpet.” The torr, a unit of pressure, is named after him.

source

“Oh dark, dark, dark, amid the blaze of noon, irrevocably dark, total eclipse without all hope of day”*…

Today is the occasion of an annular eclipse, which will pass through eight U.S. states before crossing the Gulf of Mexico to transit Mexico, Guatemala, Belize, Honduras, Nicaragua, Costa Rica, Panama, Colombia, and Brazil. While some people in the Western Hemisphere will witness a “ring of fire” during the eclipse, many more will experience the phenomenon of crescent sunlight. Rebecca Boyle has advice on how we might approach it…

… This Saturday, for some people in the Western Hemisphere, the Sun will disappear for a few minutes and appear to leave a flaming hole in the sky. Instead of a ball of fire, the Sun will transform into a ring of fire, a strange and wondrous sight. This is an annular solar eclipse, and it happens because the Moon is right smack in front of the Sun.

A solar eclipse only happens during new Moon phases, when we otherwise wouldn’t be able to see our nearest celestial companion. Though we get a new Moon every month, we do not get solar eclipses as often because of our satellite’s oddball path around the planet. Sometimes the Moon casts a shadow just above Earth, and sometimes just below. This weekend, the Moon’s shadow will fall onto Earth, just right for people in parts of the Western Hemisphere to see it.

The annular eclipse is a preview of a more incredible, rarer event next April, when a total solar eclipse will cross the continental United States. There is no experience on Earth like a total eclipse; make plans to see it, if you can. But this weekend’s “ring of fire” eclipse is an event you should try to see first (safely, with eclipse glasses), if you can get yourself into the western U.S. or parts of Central and South America. Here’s a map showing the eclipse path; if you can’t travel to see it in person, you can watch the eclipse online.

Eclipses happen because the Sun and Moon appear to be roughly the same diameter. The Sun is actually about 400 times larger than the Moon, but it is also about 400 times more distant, so they seem like the same size in our sky.
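Boyle’s 400-times figures are easy to sanity-check. A back-of-the-envelope sketch in Python– using standard approximate mean radii and distances, not numbers from the article:

```python
import math

# Approximate mean values in km (standard round figures, not from the article)
SUN_RADIUS = 696_000
SUN_DISTANCE = 149_600_000   # ~1 AU
MOON_RADIUS = 1_737
MOON_DISTANCE = 384_400

def angular_diameter_deg(radius_km, distance_km):
    """Apparent (angular) diameter, in degrees, of a sphere seen from a distance."""
    return math.degrees(2 * math.atan(radius_km / distance_km))

sun_deg = angular_diameter_deg(SUN_RADIUS, SUN_DISTANCE)
moon_deg = angular_diameter_deg(MOON_RADIUS, MOON_DISTANCE)

print(f"Sun:  {sun_deg:.2f} degrees across")   # ~0.53
print(f"Moon: {moon_deg:.2f} degrees across")  # ~0.52
print(f"Size ratio:     {SUN_RADIUS / SUN_DISTANCE and SUN_RADIUS / MOON_RADIUS:.0f}")   # ~401
print(f"Distance ratio: {SUN_DISTANCE / MOON_DISTANCE:.0f}")                             # ~389
```

Both bodies subtend roughly half a degree of sky, which is why the Moon can so nearly– but not always exactly– cover the Sun.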

The Moon’s shadow forms two concentric cones: an inner shadow called the umbra, where the Sun is completely obscured, and an outer, broader cone called the penumbra, where sunlight still shines but is partially blocked. The umbra can be seen in a narrow geographic ribbon across the Americas, and it’s where you will see a full eclipse; under the penumbra, which covers much of the western U.S., Central and South America, you will see a partial eclipse.
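Why a “ring of fire” rather than totality? The umbral cone tapers to a point a certain distance behind the Moon; when the Moon is near the far end of its slightly elliptical orbit, that point falls short of Earth’s surface, so observers on the center line see an annulus of Sun around the Moon. A rough similar-triangles sketch, again using standard approximate figures rather than the article’s:

```python
# Approximate values in km (standard round figures, not from the article)
SUN_RADIUS = 696_000
MOON_RADIUS = 1_737
SUN_MOON_DISTANCE = 149_600_000  # ~1 AU

# Similar triangles: the umbra tapers to a point at distance L behind the
# Moon, where L / MOON_RADIUS == (L + SUN_MOON_DISTANCE) / SUN_RADIUS.
umbra_length = SUN_MOON_DISTANCE * MOON_RADIUS / (SUN_RADIUS - MOON_RADIUS)
print(f"Umbra length: {umbra_length:,.0f} km")  # ~374,000 km

# The Moon's distance from Earth varies roughly between these extremes:
PERIGEE, APOGEE = 356_500, 406_700  # km, approximate
print(umbra_length > PERIGEE)  # True: near perigee the umbra reaches Earth (total)
print(umbra_length < APOGEE)   # True: near apogee it falls short (annular)
```

The umbral cone is about 374,000 km long– shorter than the Moon’s distance for much of its orbit– which is why annular eclipses are actually somewhat more common than total ones.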

Like the gears of a clock, a combination of precise positions and movements initiates an eclipse of the Sun. As Earth spins, day breaks. The Sun and Moon appear to trace a path across the sky. The Sun is not moving (at least not perceptibly); Earth’s rotation makes the star’s position change. The Moon is moving around us while the Earth rotates, so it seems to move too, but it appears to go slower than our star. The partial solar eclipse begins as the Sun catches up to the Moon’s position in our sky. On Saturday morning around 8:06 a.m. Pacific time, people in Eugene, Oregon, will be the first to see the Moon appear to take a bite out of the Sun. The bite will get progressively bigger until the full annular eclipse begins at 9:16 a.m. Pacific time.

The annular eclipse only lasts about four minutes (depending on your precise location under the Moon’s shadow) but the partial eclipse, which will be visible over a much wider geographic area, lasts about an hour and 15 minutes before and afterward. During this phase, shadows cast by objects on Earth will change in unusual ways. One lovely place to be during a partial solar eclipse is underneath a tree, if you can find an evergreen or a deciduous tree that has not dropped its leaves yet. Look at the ground. In the dappled light, you will see crescents everywhere: the crescent Sun.

Sunlight is the heavens reaching down to touch us right where we stand; I think about this when I step into the light. But crescent sunlight is the Moon joining this experience. Its darkness, rather than its light, reaches out to touch us, too…

An informative and lyrical guide to today’s eclipse: “During an Annular Eclipse, Look to the Shadows,” from @rboyle31 in @atlasobscura.

* John Milton, Samson Agonistes

###

As we don’t look directly, we might recall that on this date in 1609, Galileo (who has claim to the titles of father of observational astronomy, modern-era classical physics, the scientific method, and modern science) put the telescope to use in his astronomical work. Upon hearing (at age 45) that a Dutch optician had invented a glass that made distant objects appear larger, Galileo crafted his own telescope. He continued to improve his device, ultimately achieving 30X magnification, and recorded his observations of the Moon, the moons of Jupiter, the phases of Venus, sunspots, the Milky Way, and more. He published his initial telescopic astronomical observations in March 1610 in a brief treatise entitled Sidereus Nuncius (Starry Messenger).

Telescopes were also a profitable sideline for Galileo, who sold them to merchants who found them useful both at sea and as items of trade.

Galileo’s “cannocchiali” telescopes at the Museo Galileo, Florence (source)

Written by (Roughly) Daily

October 14, 2023 at 1:00 am

“Reality is merely an illusion, albeit a very persistent one”*…

In an excerpt from his new book, The Rigor of Angels: Borges, Heisenberg, Kant, and the Ultimate Nature of Reality, the estimable William Egginton explains the central mystery at the heart of one of the most important breakthroughs in physics–quantum mechanics…

For all its astonishing, mind-bending complexity– for all its blurry cats, entangled particles, buckyballs, and Bell’s inequalities– quantum mechanics ultimately boils down to one core mystery. This mystery found its best expression in the letter Heisenberg wrote to Pauli in the fevered throes of his discovery. The path a particle takes ‘only comes into existence through this, that we observe it.’ This single, stunning expression underlies all the rest: the wave/particle duality (interference patterns emerge when the particles have not yet been observed and hence their possible paths interfere with one another); the apparently absurd liminal state of Schrodinger’s cat (the cat seems to remain blurred between life and death because atoms don’t release a particle until observed); the temporal paradox (observing a particle seems to retroactively determine the path it chose to get here); and, the one that really got to Einstein: if the observation of a particle at one place and time instantaneously changes something about the rest of reality, then locality, the cornerstone of relativity and guarantee that the laws of physics are invariable through the universe, vanishes like fog on a warming windowpane.

If the act of observation somehow instantaneously conjures a particle’s path, the foundations not only of classical physics but also of what we widely regard as physical reality crumble before our eyes. This fact explains why Einstein held fast to another interpretation. The particle’s path doesn’t come into existence when we observe it. The path exists, but we just can’t see it. Like the parable of the ball in the box he described in his letter to Schrodinger, a 50 percent chance of finding a ball in any one of two boxes does not complete the description of the ball’s reality before we open the box. It merely states our lack of knowledge about the ball’s whereabouts. 

And yet, as experiment after experiment has proven, the balls simply aren’t there before the observation. We can separate entangled particles, seemingly to any conceivable distance, and by observing one simultaneously come to know something about the other–something that wasn’t the case until the exact moment of observing it. Like the beer and whiskey twins, we can maintain total randomness up to a nanosecond before one of them orders, and still what the one decides to order will determine the other’s drink, on the spot, even light-years away. 

The ineluctable fact of entanglement tells us something profound about reality and our relation to it. Imagine you are one of the twins about to order a drink (this should be more imaginable than being an entangled particle about to be observed, but the idea is the same). From your perspective you can order either a whiskey or a beer: it’s a fifty-fifty choice; nothing is forcing your hand. Unbeknownst to you, however, in a galaxy far, far away, your twin has just made the choice for you. Your twin can’t tell you this or signal it in any way, but what you perceive to be a perfectly random set of possibilities, an open choice, is entirely constrained. You have no idea if you will order beer or whiskey, but when you order it, it will be the one or the other all the same. If your twin is, say, one light-year away, the time in which you make this decision doesn’t even exist over there yet. Any signals your sibling gets from you, or any signals you send, will take another year to arrive. And still, as of this moment, you each know. Neither will get confirmation for another year, but you can be confident, you can bet your life’s savings on it–a random coin toss in another galaxy, and you already know the outcome. 

The riddles that arise from Heisenberg’s starting point would seem to constitute the most vital questions of existence. And yet one of the curious side effects of quantum mechanics’ extraordinary success has been a kind of quietism in the face of those very questions. The interpretation of quantum mechanics, deciding what all this means, has tended to go unnoticed by serious physics departments and the granting agencies that support them in favor of the ‘shut up and calculate’ school, leading the former to take hold mainly in philosophy departments, as a subfield of the philosophy of science called foundations of physics. Nevertheless, despite such siloing, a few physicists persisted in exploring possible solutions to the quantum riddles. Some of their ideas have been literally otherworldly.

In the 1950s, a small group of graduate students working with John Wheeler at Princeton University became fascinated with these problems and kept returning to them in late-night, sherry-fueled rap sessions. Chief among this group was Hugh Everett III, a young man with classic 1950s-style nerd glasses and a looming forehead. Everett found himself chafing at the growing no-question zone that proponents of the Copenhagen interpretation had built around their science. Why should we accept that in one quantum reality, observations somehow cause nature to take shape out of a probabilistic range of options, whereas on this side of some arbitrary line in the sand we inhabit a different, classical reality where observations meekly bow to the world out there? What exactly determines when this change takes place? ‘Let me mention a few more irritating features of the Copenhagen Interpretation,’ Everett would write to its proponents: ‘You talk of the massiveness of macro systems allowing one to neglect further quantum effects … but never give any justification for this flatly asserted dogma.’…

A fascinating sample of a fascinating book: “Quantum Mechanics,” from @WilliamEgginton via the invaluable @delanceyplace.

Further to which, it’s interesting to recall that, in his 1921 The Analysis of Mind, Bertrand Russell observed:

What has permanent value in the outlook of the behaviourists is the feeling that physics is the most fundamental science at present in existence. But this position cannot be called materialistic, if, as seems to be the case, physics does not assume the existence of matter…

via Robert Cottrell

See also: “Objective Reality May Not Exist, Quantum Experiment Suggests” (source of the image above).

* Albert Einstein

###

As we examine existence, we might spare a thought for Otto Frisch; he died on this date in 1979. A physicist, he was (with Otto Stern and Immanuel Estermann) the first to measure the magnetic moment of the proton. With his aunt, Lise Meitner, he advanced the first theoretical explanation of nuclear fission (coining the term) and first experimentally detected the fission by-products. Later, with his collaborator Rudolf Peierls, he designed the first theoretical mechanism for the detonation of an atomic bomb in 1940.

Otto Frisch’s wartime Los Alamos ID badge photo (source)

“A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it”*…

It’s very hard, historian of science Benjamin Breen explains, to understand the implications of a scientific revolution as one is living through it…

2023 is shaping up to be an important year in the history of science. And no, I’m not talking about the reputed room-temperature superconductor LK-99, which seems increasingly likely to be a dud.

Instead, I’m talking about the discoveries you’ll find in Wikipedia’s list of scientific advances for 2023. Here are some examples:

• January: Positive results from a clinical trial of a vaccine for RSV; OpenAI’s ChatGPT enters wide use.

• February: A major breakthrough in quantum computing; announcement of a tiny robot that can clean blood vessels; more evidence for the ability of psychedelics to enhance neuroplasticity; major developments in biocomputers.

• March: OpenAI rolls out GPT-4; continued progress on mRNA vaccines for cancer.

• April: NASA announces astronaut crew who will orbit the moon next year; promising evidence for gene therapy to fight Alzheimer’s.

• May: Scientists use AI to translate brain activity into written words; promising results for a different Alzheimer’s drug; human pangenome sequenced (largely by a team of UCSC researchers — go Banana Slugs!); more good news about the potential of mRNA vaccines for fighting cancer.

And skipping ahead to just the past two weeks:

• nuclear fusion ignition with net energy gain was achieved for the second time

• a radical new approach to attacking cancer tumors entered Phase 1 trials in humans

• and — announced just as I was writing this [in August, 2023] — one of the new crop of weight loss drugs was reported to cut rates of heart attack and stroke in high-risk individuals by 20% (!).

Also in January of 2023: the New York Times asked “What Happened to All of Science’s Big Breakthroughs?”

The headline refers to an article published in Nature which argues that there has been a steady drop in “disruptive” scientific and technological breakthroughs between the years of 1945 and 2010. Basically, it’s a restatement of the concept of a “Great Stagnation” which was proposed by the economist Tyler Cowen in 2011. Though the paper cites everyone from Cowen to Albert Einstein and Isaac Newton, it’s worth noting that it doesn’t cite a single historian of science or technology (unless Alexandre Koyré counts)…

Naturally, as a historian of science and medicine, I think that there really are important things to learn from the history of science and medicine! And what I want to argue for the rest of this post boils down to two specific lessons from that history:

  1. People living through scientific revolutions are usually unaware of them — and, if they are, they don’t think about them in the same way that later generations do.
  2. An apparent slowdown in the rate of scientific innovation doesn’t always mean a slowdown in the impacts of science. The history of the first scientific revolution — the one that began in the famously terrible seventeenth century — suggests that the positive impacts of scientific innovation, in particular, are not always felt by the people living through the period of innovation. Periods when the pace of innovation appears to slow down may also be eras when society becomes more capable of benefitting from scientific advances by learning how to mitigate previously unforeseen risks.

[… There follows a fascinating look back at the 1660s– the “original” scientific revolution– at Boyle, Newton, at what they hoped/expected, and at how that differed from what their work and that of their colleagues actually yielded. Then the cautionary tale of Thomas Midgley…]

As we appear to be entering a new era of rapid scientific innovation in the 2020s, it is worth remembering that it often takes decades before the lasting social value of a technical innovation is understood — and decades more before we understand its downsides.

In the meantime, I’m pretty psyched about the cancer drugs…

As Thomas Kuhn observed, “The historian of science may be tempted to exclaim that when paradigms change, the world itself changes with them.”

On the difficulty of knowing the outcomes of a scientific revolution from within it: “Experiencing scientific revolutions: the 1660s and the 2020s,” from @ResObscura.

* Max Planck

###

As we try to see, we might spare a thought for William Seward Burroughs; he died on this date in 1898. An inventor who had worked in a bank, he created the world’s first commercially viable recording adding machine and pioneered its manufacture. The very successful company that he founded went on to become Unisys, which was instrumental in the development of computing… the implications of which we’re still discovering– and which Burroughs surely never foresaw.

Nor, one reckons, did he imagine that his grandson, William Seward Burroughs II, would become the cultural figure that he did.

source