(Roughly) Daily

“Reality is merely an illusion, albeit a very persistent one”*…

In an excerpt from his new book, The Rigor of Angels: Borges, Heisenberg, Kant, and the Ultimate Nature of Reality, the estimable William Egginton explains the central mystery at the heart of one of the most important breakthroughs in physics–quantum mechanics…

For all its astonishing, mind-bending complexity– for all its blurry cats, entangled particles, buckyballs, and Bell’s inequalities– quantum mechanics ultimately boils down to one core mystery. This mystery found its best expression in the letter Heisenberg wrote to Pauli in the fevered throes of his discovery. The path a particle takes ‘only comes into existence through this, that we observe it.’ This single, stunning expression underlies all the rest: the wave/particle duality (interference patterns emerge when the particles have not yet been observed and hence their possible paths interfere with one another); the apparently absurd liminal state of Schrödinger’s cat (the cat seems to remain blurred between life and death because atoms don’t release a particle until observed); the temporal paradox (observing a particle seems to retroactively determine the path it chose to get here); and, the one that really got to Einstein, if the observation of a particle at one place and time instantaneously changes something about the rest of reality, then locality, the cornerstone of relativity and guarantee that the laws of physics are invariable through the universe, vanishes like fog on a warming windowpane.

If the act of observation somehow instantaneously conjures a particle’s path, the foundations not only of classical physics but also of what we widely regard as physical reality crumble before our eyes. This fact explains why Einstein held fast to another interpretation. The particle’s path doesn’t come into existence when we observe it. The path exists, but we just can’t see it. Like the parable of the ball in the box he described in his letter to Schrödinger, a 50 percent chance of finding a ball in any one of two boxes does not complete the description of the ball’s reality before we open the box. It merely states our lack of knowledge about the ball’s whereabouts.

And yet, as experiment after experiment has proven, the balls simply aren’t there before the observation. We can separate entangled particles, seemingly to any conceivable distance, and by observing one simultaneously come to know something about the other–something that wasn’t the case until the exact moment of observing it. Like the beer and whiskey twins, we can maintain total randomness up to a nanosecond before one of them orders, and still what the one decides to order will determine the other’s drink, on the spot, even light-years away. 

The ineluctable fact of entanglement tells us something profound about reality and our relation to it. Imagine you are one of the twins about to order a drink (this should be more imaginable than being an entangled particle about to be observed, but the idea is the same). From your perspective you can order either a whiskey or a beer: it’s a fifty-fifty choice; nothing is forcing your hand. Unbeknownst to you, however, in a galaxy far, far away, your twin has just made the choice for you. Your twin can’t tell you this or signal it in any way, but what you perceive to be a perfectly random set of possibilities, an open choice, is entirely constrained. You have no idea if you will order beer or whiskey, but when you order it, it will be the one or the other all the same. If your twin is, say, one light-year away, the time in which you make this decision doesn’t even exist over there yet. Any signals your sibling gets from you, or any signals you send, will take another year to arrive. And still, as of this moment, you each know. Neither will get confirmation for another year, but you can be confident, you can bet your life’s savings on it–a random coin toss in another galaxy, and you already know the outcome. 
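
An aside from your correspondent, not part of Egginton’s excerpt: the twins analogy, taken literally, is something a purely classical world could fake– just hand both twins the same sealed drink order before they part, and the correlation is perfect with no spooky action required. What the Bell inequalities mentioned above rule out is precisely that kind of pre-agreed instruction. A minimal sketch (in Python, using the textbook singlet-state prediction– an assumption of the illustration, not a formula from the book) makes the gap concrete:

```python
# Illustrative sketch only (not from Egginton's book): why the "twins" analogy
# can't be the whole story. A shared, pre-agreed instruction (a local hidden
# variable) easily explains perfect correlation at identical settings, but it
# caps the CHSH combination of correlations at 2. The standard quantum
# prediction for an entangled singlet pair, E(a, b) = -cos(a - b), exceeds
# that bound at suitably tilted settings, which is what Bell-test experiments confirm.

import math

def quantum_correlation(a: float, b: float) -> float:
    """Singlet-state correlation for measurement angles a and b (radians)."""
    return -math.cos(a - b)

def chsh(a: float, a2: float, b: float, b2: float) -> float:
    """CHSH combination |E(a,b) - E(a,b2) + E(a2,b) + E(a2,b2)|."""
    E = quantum_correlation
    return abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))

# Angle choices that maximize the quantum violation.
S = chsh(0, math.pi / 2, math.pi / 4, 3 * math.pi / 4)
print(f"Quantum CHSH value: {S:.3f} (any pre-agreed 'drink order' tops out at 2)")
```

Nothing deeper is claimed here than the arithmetic: roughly 2.83 exceeds the classical ceiling of 2, and that gap is what the “experiment after experiment” mentioned above has confirmed.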

The riddles that arise from Heisenberg’s starting point would seem to constitute the most vital questions of existence. And yet one of the curious side effects of quantum mechanics’ extraordinary success has been a kind of quietism in the face of those very questions. The interpretation of quantum mechanics, deciding what all this means, has tended to go unnoticed by serious physics departments and the granting agencies that support them in favor of the ‘shut up and calculate’ school, leading the former to take hold mainly in philosophy departments, as a subfield of the philosophy of science called foundations of physics. Nevertheless, despite such siloing, a few physicists persisted in exploring possible solutions to the quantum riddles. Some of their ideas have been literally otherworldly.

In the 1950s, a small group of graduate students working with John Wheeler at Princeton University became fascinated with these problems and kept returning to them in late-night, sherry-fueled rap sessions. Chief among this group was Hugh Everett III, a young man with classic 1950s-style nerd glasses and a looming forehead. Everett found himself chafing at the growing no-question zone that proponents of the Copenhagen interpretation had built around their science. Why should we accept that in one quantum reality, observations somehow cause nature to take shape out of a probabilistic range of options, whereas on this side of some arbitrary line in the sand we inhabit a different, classical reality where observations meekly bow to the world out there? What exactly determines when this change takes place? ‘Let me mention a few more irritating features of the Copenhagen Interpretation,’ Everett would write to its proponents: ‘You talk of the massiveness of macro systems allowing one to neglect further quantum effects … but never give any justification for this flatly asserted dogma.’…

A fascinating sample of a fascinating book: “Quantum Mechanics,” from @WilliamEgginton via the invaluable @delanceyplace.

Further to which, it’s interesting to recall that, in his 1921 The Analysis of Mind, Bertrand Russell observed:

What has permanent value in the outlook of the behaviourists is the feeling that physics is the most fundamental science at present in existence. But this position cannot be called materialistic, if, as seems to be the case, physics does not assume the existence of matter…

via Robert Cottrell

See also: “Objective Reality May Not Exist, Quantum Experiment Suggests” (source of the image above).

* Albert Einstein

###

As we examine existence, we might spare a thought for Otto Frisch; he died on this date in 1979. A physicist, he was (with Otto Stern and Immanuel Estermann) the first to measure the magnetic moment of the proton. With his aunt, Lise Meitner, he advanced the first theoretical explanation of nuclear fission (coining the term) and first experimentally detected the fission by-products. Later, with his collaborator Rudolf Peierls, he designed the first theoretical mechanism for the detonation of an atomic bomb in 1940.

Otto Frisch’s wartime Los Alamos ID badge photo (source)

“It is the same in love as in war; a fortress that parleys is half taken”*…

The AT&T Long Lines Building, designed by John Carl Warnecke at 33 Thomas Street in Manhattan, under construction ca. 1974.

Further to yesterday’s post on historic battlements, Zach Mortice on a modern fortress that’s become a go-to location for film and television thrillers…

When it was completed in Lower Manhattan in 1974, 33 Thomas Street, formerly known as the AT&T Long Lines Building, was intended as the world’s largest facility for connecting long-distance telephone calls. Standing 532 feet — roughly equivalent to a 45-story building — it’s a mugshot for Brutalism, windowless and nearly featureless. Its only apertures are a series of ventilation hoods meant to hide microwave-satellite arrays, which communicate with ground-based relay stations and satellites in space. One of several long lines buildings designed by John Carl Warnecke for the New York Telephone Company, a subsidiary of AT&T, 33 Thomas Street is perhaps the most visually striking project in the architect’s long and influential career. Embodying postwar American economic and military hegemony, the tower broadcasts inscrutability and imperviousness. It was conceived, according to the architect, to be a “skyscraper inhabited by machines.”

“No windows or unprotected openings in its radiation-proof skin can be permitted,” reads a project brief prepared by Warnecke’s office; the building’s form and dimensions were shaped not by human needs for light and air, but by the logics of ventilation, cooling, and (not least) protection from atomic blast. “As such, the design project becomes the search for a 20th-century fortress, with spears and arrows replaced by protons and neutrons laying quiet siege to an army of machines within.” The purple prose of the project brief was perhaps inspired by the client. AT&T in the 1970s still held its telecom monopoly, and was an exuberant player in the Cold War military-industrial complex. Until 2009, 33 Thomas Street was a Verizon data center. And in 2016, The Intercept revealed that the building was functioning as a hub for the National Security Agency, which has bestowed upon it the Bond-film-esque moniker Titanpointe.

Computers at Titanpointe have monitored international phone calls, faxes and voice calls routed over the internet, and more, hoovering up data from the International Monetary Fund, the World Bank, and U.S. allies including France, Germany, and Japan. 33 Thomas Street, it turns out, is exactly what it looks like: an apocalypse-proof above-ground bunker intended not only to symbolize but to guarantee national security. For those overseeing fortress operations at the time of construction, objects of fear were nuclear-armed Communists abroad and a restive youth population at home, who couldn’t be trusted to obey the diktats of a culture that had raised up some in previously inconceivable affluence; an affluence built on the exploitation and disenfranchisement of people near and far.

By the time the NSA took over, targets were likely to be insurgents rejecting liberal democracy and American hegemony, from Islamic fundamentalists to world-market competitors in China, alongside a smattering of Black Lives Matter activists. For those outside the fortress, in the Nixon era as in the present, the fearful issue was an entrenched and unaccountable fusion of corporate and governmental capability, a power that flipped the switches connecting the world. At the same time, popular culture had begun, in the 1970s, to register a paranoia that has only intensified — the fear that people no longer call the shots. In its monumental implacability, Titanpointe seems to herald a posthuman regime, run by algorithm for the sole purpose of perpetuating its own system.

It is, in other words, a building tailor-made for spy movies.

John Carl Warnecke did not realize, of course, that he was storyboarding a movie set…

How (and why) a windowless telecommunications hub in New York City embodying an architecture of surveillance and paranoia became an ideal location for conspiracy thrillers: “Apocalypse-Proof,” from @zachmortice in @PlacesJournal. Fascinating.

* Margaret of Valois

###

As we ponder impenetrability, we might recall that it was on this date in 1780, during the American Revolutionary War, that Benedict Arnold, commander of the American fort at West Point, passed plans of the bastion to the British.

Portrait by Thomas Hart, 1776 (source)

“It is often our mightiest projects that most obviously betray the degree of our insecurity– the construction of fortifications, for instance”*…

From Public Domain Review, a look at a 17th-century book that collects (beautiful) plans for forts and fortifications…

What is the peculiar appeal of military architecture? Whether Norman castle or Cold War concrete, there is a kind of sublimity that belongs to defensive design. It stems obviously from the massive scale of construction, and from the luxury of uncompromised execution that generous defence budgets afford. But there is also pleasure to be taken in the unornamented purity of style of structures that have been built solely for practical ends.

These qualities are abundant in the work of the seventeenth-century French military engineer Allain Manesson Mallet. Born in Paris in 1630, Manesson studied mathematics before becoming a soldier (he added the name Mallet in tribute to his teacher). In 1663, he was posted to Alentejo as an army engineer in the service of the Portuguese king Alfonso VI, where he fortified chateaux, until the Treaty of Lisbon in 1668. He returned to France with an appointment as mathematics instructor at the court of Louis XIV.

He recorded his military ideas in a highly successful manual, The Works of Mars (i.e. “the art of war”) in 1671. A year later came German and Dutch editions (the source of the images above), even though France was by then at war with the Netherlands.

Manesson’s book encompassed theories of fortifications from their origins in designs developed in the sixteenth century by Michelangelo and the architect Vincenzo Scamozzi, including more recent innovations of French and Dutch engineers…

More– and many more renderings of ramparts: “The Works of Mars” from @PublicDomainRev.

* W. G. Sebald, Austerlitz

###

As we build bastions, we might understand the Dutch interest in Manesson’s manual as we recall that it was on this date in 1602 that the Spanish-held city of Grave in the Netherlands was taken, at the end of a two-month siege, by a Dutch and English army led by Maurice of Orange and Francis Vere respectively.

Part of the Eighty Years’ War and the Anglo–Spanish War, the Siege of Grave and its ultimate fall were severe enough to cause a major mutiny in the Spanish army.

Siege of Grave in 1602 from a print by Simon Fokke (source)

“It is easier to build strong children than to repair broken men”*…

Your correspondent is headed into a particularly busy period of travel/work, so (Roughly) Daily will be more roughly than daily for the next few days. Regular service should resume on September 20…

Grim, but important…

Legal protections for children in the United States and in every individual state fall short of international children’s rights standards, Human Rights Watch said [in a report released this week]. Children in the US can be legally married in 41 states, physically punished by school administrators in 47 states, sentenced to life without parole in 22 states, and work in hazardous agriculture conditions in all 50 states. As the only UN member state that has failed to ratify the Convention on the Rights of the Child, the US falls far below internationally adopted standards.

One year after the release of a scorecard that measures US compliance with key international child rights standards, 11 states have enacted reforms that improve their rankings. Absent federal ratification and federal laws regarding many of the issues the convention addresses, jurisdiction is left to individual states. As a result, the protection and advancement of child rights varies from state to state…

While only seven states score higher than a “D” grade, four states shed their “F” grade, three moved up to a “C,” and several significantly improved their rankings. Alaska, Colorado, Connecticut, Illinois, Maryland, Minnesota, New Hampshire, New Mexico, New York, Vermont, and West Virginia showed improvement over the last year.

The policy changes that improved states’ grades were most frequently in the areas of banning sentencing children to life without parole, raising the minimum age of prosecuting children in the juvenile system, and limiting or prohibiting child marriage. Progress was limited on banning corporal punishment. On child labor, some states moved to roll back child labor protections…

The updated scorecard shows improvement, but many states still fail children: “No US State Meets Child Rights Standards,” from @hrw.

* Frederick Douglass

###

As we protect our progeny, we might recall that it was on this date in 1924 that the League of Nations passed the Declaration of the Rights of the Child (AKA The Geneva Declaration), a historic document drafted by Eglantyne Jebb that recognized and affirmed for the first time the existence of rights specific to children and the responsibility of adults towards children.

The U.S. was not a member of the League. But in 1959 the Declaration was adopted in an extended form by the United Nations.

Children’s day 1928 in Bulgaria. The text on the poster is the Geneva Declaration. In front are Prime Minister Andrey Lyapchev and Metropolitan Stefan of Sofia. (source)

“A new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it”*…

It’s very hard, historian of science Benjamin Breen explains, to understand the implications of a scientific revolution as one is living through it…

2023 is shaping up to be an important year in the history of science. And no, I’m not talking about the reputed room-temperature superconductor LK-99, which seems increasingly likely to be a dud.

Instead, I’m talking about the discoveries you’ll find in Wikipedia’s list of scientific advances for 2023. Here are some examples:

• January: Positive results from a clinical trial of a vaccine for RSV; OpenAI’s ChatGPT enters wide use.

• February: A major breakthrough in quantum computing; announcement of a tiny robot that can clean blood vessels; more evidence for the ability of psychedelics to enhance neuroplasticity; major developments in biocomputers.

• March: OpenAI rolls out GPT-4; continued progress on mRNA vaccines for cancer.

• April: NASA announces astronaut crew who will orbit the moon next year; promising evidence for gene therapy to fight Alzheimer’s.

• May: Scientists use AI to translate brain activity into written words; promising results for a different Alzheimer’s drug; human pangenome sequenced (largely by a team of UCSC researchers — go Banana Slugs!); more good news about the potential of mRNA vaccines for fighting cancer.

And skipping ahead to just the past two weeks:

• nuclear fusion ignition with net energy gain was achieved for the second time

• a radical new approach to attacking cancer tumors entered Phase 1 trials in humans

• and — announced just as I was writing this [in August, 2023] — one of the new crop of weight loss drugs was reported to cut rates of heart attack and stroke in high-risk individuals by 20% (!).

Also in January of 2023: the New York Times asked “What Happened to All of Science’s Big Breakthroughs?”

The headline refers to an article published in Nature which argues that there has been a steady drop in “disruptive” scientific and technological breakthroughs between the years of 1945 and 2010. Basically, it’s a restatement of the concept of a “Great Stagnation” which was proposed by the economist Tyler Cowen in 2011. Though the paper cites everyone from Cowen to Albert Einstein and Isaac Newton, it’s worth noting that it doesn’t cite a single historian of science or technology (unless Alexandre Koyré counts)…

Naturally, as a historian of science and medicine, I think that there really are important things to learn from the history of science and medicine! And what I want to argue for the rest of this post boils down to two specific lessons from that history:

  1. People living through scientific revolutions are usually unaware of them — and, if they are, they don’t think about them in the same way that later generations do.
  2. An apparent slowdown in the rate of scientific innovation doesn’t always mean a slowdown in the impacts of science. The history of the first scientific revolution — the one that began in the famously terrible seventeenth century — suggests that the positive impacts of scientific innovation, in particular, are not always felt by the people living through the period of innovation. Periods when the pace of innovation appears to slow down may also be eras when society becomes more capable of benefitting from scientific advances by learning how to mitigate previously unforeseen risks.

[… There follows a fascinating look back at the 1660s– the “original” scientific revolution– at Boyle and Newton, at what they hoped/expected, and at how that differed from what their work and that of their colleagues actually yielded. Then the cautionary tale of Thomas Midgley…]

As we appear to be entering a new era of rapid scientific innovation in the 2020s, it is worth remembering that it often takes decades before the lasting social value of a technical innovation is understood — and decades more before we understand its downsides.

In the meantime, I’m pretty psyched about the cancer drugs…

As Thomas Kuhn observed, “The historian of science may be tempted to exclaim that when paradigms change, the world itself changes with them.”

On the difficulty of knowing the outcomes of a scientific revolution from within it: “Experiencing scientific revolutions: the 1660s and the 2020s,” from @ResObscura.

* Max Planck

###

As we try to see, we might spare a thought for William Seward Burroughs; he died on this date in 1898. An inventor who had worked in a bank, he invented the world’s first commercially viable recording adding machine and pioneered its manufacture. The very successful company that he founded went on to become Unisys, which was instrumental in the development of computing… the implications of which we’re still discovering– and which Burroughs surely never saw.

Nor, one reckons, did he imagine that his grandson, William Seward Burroughs II, would become the cultural figure that he did.

source
