(Roughly) Daily


“Why, sometimes I’ve believed as many as six impossible things before breakfast”*…

Imaginary numbers were long dismissed as mathematical “bookkeeping.” But now, as Karmela Padavic-Callaghan explains, physicists are proving that they describe the hidden shape of nature…

Many science students may imagine a ball rolling down a hill or a car skidding because of friction as prototypical examples of the systems physicists care about. But much of modern physics consists of searching for objects and phenomena that are virtually invisible: the tiny electrons of quantum physics and the particles hidden within strange metals of materials science along with their highly energetic counterparts that only exist briefly within giant particle colliders.

In their quest to grasp these hidden building blocks of reality, scientists have looked to mathematical theories and formalism. Ideally, an unexpected experimental observation leads a physicist to a new mathematical theory, and then mathematical work on said theory leads them to new experiments and new observations. Some part of this process inevitably happens in the physicist’s mind, where symbols and numbers help make invisible theoretical ideas visible in the tangible, measurable physical world.

Sometimes, however, as in the case of imaginary numbers – that is, numbers whose squares are negative – mathematics manages to stay ahead of experiments for a long time. Though imaginary numbers have been integral to quantum theory since its very beginnings in the 1920s, scientists have only recently been able to find their physical signatures in experiments and empirically prove their necessity…
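As an aside (my illustration, not from the article): the defining property is easy to check in code, and Euler’s formula shows how imaginary numbers encode the rotating phase at the heart of quantum mechanics. A minimal Python sketch:

```python
import cmath

# The defining property: the imaginary unit squared is -1.
assert (1j) ** 2 == -1

# Euler's formula, e^(i*theta) = cos(theta) + i*sin(theta),
# is how complex numbers encode the rotating "phase" of a quantum state.
theta = cmath.pi / 2
z = cmath.exp(1j * theta)
print(round(z.real, 10), round(z.imag, 10))  # -> 0.0 1.0
```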

Learn more at “Imaginary numbers are real,” from @Ironmely in @aeonmag.

* The White Queen, in Lewis Carroll’s Through the Looking-Glass


As we get real, we might spare a thought for two great mathematicians…

Georg Friedrich Bernhard Riemann died on this date in 1866. A mathematician who made contributions to analysis, number theory, and differential geometry, he is remembered (among other things) for his 1859 paper on the prime-counting function, containing the original statement of the Riemann hypothesis, regarded as one of the most influential papers in analytic number theory.


Andrey (Andrei) Andreyevich Markov died on this date in 1922. A Russian mathematician, he helped to develop the theory of stochastic processes, especially those now called Markov chains: sequences of random variables in which the distribution of the future state depends only on the present state, not on the way in which the present state arose from its predecessors. (For example, the long-run probability of landing on each square in the game of Monopoly can be determined using Markov chains.) His work on the probability of mutually dependent events has been developed and widely applied to the biological, physical, and social sciences, and is widely used in Monte Carlo simulations and Bayesian analyses.
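The Markov property – the next state depends only on the current one – is easy to see in code. A minimal sketch (my toy two-state weather chain, standing in for the Monopoly board, which works the same way with forty squares):

```python
import random

random.seed(0)

# Transition probabilities: P(next state | current state).
# The chain's memory is exactly one state deep - the Markov property.
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state):
    """Draw the next state using only the current state."""
    r = random.random()
    cumulative = 0.0
    for nxt, p in transitions[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt

# The long-run fraction of sunny days converges to the chain's
# stationary distribution (here 2/3), whatever the starting state.
state, sunny = "sunny", 0
n = 100_000
for _ in range(n):
    state = step(state)
    sunny += state == "sunny"
print(round(sunny / n, 2))
```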


“‘Space-time’ – that hideous hybrid whose very hyphen looks phoney”*…

Space-time curvature [source: ESA]

Space and time seem about as basic as anything could be, even after Einstein’s theory of General Relativity threw (in) a curve. But as Steven Strogatz discusses with Sean Carroll, the reconciliation of Einstein’s work with quantum theory is seeming to suggest that space and time might actually be emergent properties of quantum reality, not fundamental parts of it…

… we’re going to be discussing the mysteries of space and time, and gravity, too. What’s so mysterious about them?

Well, it turns out they get really weird when we look at them at their deepest levels, at a super subatomic scale, where the quantum nature of gravity starts to kick in and become crucial. Of course, none of us have any direct experience with space and time and gravity at this unbelievably small scale. Up here, at the scale of everyday life, space and time seem perfectly smooth and continuous. And gravity is very well described by Isaac Newton’s classic theory, a theory that’s been around for over 300 years now.

But then, about 100 years ago, things started to get strange. Albert Einstein taught us that space and time could warp and bend like a piece of fabric. This warping of the space-time continuum is what we experience as gravity. But Einstein’s theory is mainly concerned with the largest scales of nature, the scale of stars, galaxies and the whole universe. It doesn’t really have much to say about space and time at the very smallest scales.

And that’s where the trouble really starts. Down there, nature is governed by quantum mechanics. This amazingly powerful theory has been shown to account for all the forces of nature, except gravity. When physicists try to apply quantum theory to gravity, they find that space and time become almost unrecognizable. They seem to start fluctuating wildly. It’s almost like space and time fall apart. Their smoothness breaks down completely, and that’s totally incompatible with the picture in Einstein’s theory.

As physicists try to make sense of all of this, some of them are coming to the conclusion that space and time may not be as fundamental as we always imagined. They’re starting to seem more like byproducts of something even deeper, something unfamiliar and quantum mechanical. But what could that something be?…

Find out at: “Where Do Space, Time and Gravity Come From?,” from @stevenstrogatz and @seanmcarroll in @QuantaMagazine.

* Vladimir Nabokov


As we fumble with the fundamental, we might send far-sighted birthday greetings to Jocelyn Bell Burnell; she was born on this date in 1943. An astrophysicist, she discovered the first pulsar in 1967, while working as a postgraduate research student. She then discovered the next three detected pulsars.

The discovery eventually earned the Nobel Prize in Physics in 1974; however, she was not one of the prize’s recipients. The paper announcing the discovery of pulsars had five authors. Bell’s thesis supervisor Antony Hewish was listed first, Bell second. Hewish was awarded the Nobel Prize, along with the astronomer Martin Ryle.

A pulsar – or pulsating radio star – is a highly magnetized, rotating neutron star that emits a beam of electromagnetic radiation. The precise periods of pulsars make them very useful tools. Observations of a pulsar in a binary neutron star system were used to confirm (indirectly) the existence of gravitational radiation. The first extrasolar planets were discovered around a pulsar, PSR B1257+12. And certain types of pulsars rival atomic clocks in their accuracy in keeping time.

Schematic rendering of a pulsar


Jocelyn Bell Burnell


Written by (Roughly) Daily

July 15, 2022 at 1:00 am

“Those who can imagine anything, can create the impossible”*…

As Charlie Wood explains, physicists are building neural networks out of vibrations, voltages and lasers, arguing that the future of computing lies in exploiting the universe’s complex physical behaviors…

… When it comes to conventional machine learning, computer scientists have discovered that bigger is better. Stuffing a neural network with more artificial neurons — nodes that store numerical values — improves its ability to tell a dachshund from a Dalmatian, or to succeed at myriad other pattern recognition tasks. Truly tremendous neural networks can pull off unnervingly human undertakings like composing essays and creating illustrations. With more computational muscle, even grander feats may become possible. This potential has motivated a multitude of efforts to develop more powerful and efficient methods of computation.

[Cornell’s Peter McMahon] and a band of like-minded physicists champion an unorthodox approach: Get the universe to crunch the numbers for us. “Many physical systems can naturally do some computation way more efficiently or faster than a computer can,” McMahon said. He cites wind tunnels: When engineers design a plane, they might digitize the blueprints and spend hours on a supercomputer simulating how air flows around the wings. Or they can stick the vehicle in a wind tunnel and see if it flies. From a computational perspective, the wind tunnel instantly “calculates” how wings interact with air.

A wind tunnel is a single-minded machine; it simulates aerodynamics. Researchers like McMahon are after an apparatus that can learn to do anything — a system that can adapt its behavior through trial and error to acquire any new ability, such as classifying handwritten digits or distinguishing one spoken vowel from another. Recent work has shown that physical systems like waves of light, networks of superconductors and branching streams of electrons can all learn.

“We are reinventing not just the hardware,” said Benjamin Scellier, a mathematician at the Swiss Federal Institute of Technology Zurich in Switzerland who helped design a new physical learning algorithm, but “also the whole computing paradigm.”…

Computing at the largest scale? “How to Make the Universe Think for Us,” from @walkingthedot in @QuantaMagazine.

* Alan Turing


As we think big, we might send well-connected birthday greetings to Leonard Kleinrock; he was born on this date in 1934. A computer scientist, he made several foundational contributions to the field, in particular to the theoretical foundations of data communication in computer networking. Perhaps most notably, he was central to the development of ARPANET (which essentially grew up to be the internet); his graduate students at UCLA were instrumental in developing the communication protocols for internetworking that made that possible.

Kleinrock at a meeting of the members of the Internet Hall of Fame


“Nothing in life is certain except death, taxes and the second law of thermodynamics”*…

The second law of thermodynamics– asserting that the entropy of a system increases with time– is among the most sacred in all of science, but it has always rested on 19th century arguments about probability. As Philip Ball reports, new thinking traces its true source to the flows of quantum information…

In all of physical law, there’s arguably no principle more sacrosanct than the second law of thermodynamics — the notion that entropy, a measure of disorder, will always stay the same or increase. “If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations — then so much the worse for Maxwell’s equations,” wrote the British astrophysicist Arthur Eddington in his 1928 book The Nature of the Physical World. “If it is found to be contradicted by observation — well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.” No violation of this law has ever been observed, nor is any expected.

But something about the second law troubles physicists. Some are not convinced that we understand it properly or that its foundations are firm. Although it’s called a law, it’s usually regarded as merely probabilistic: It stipulates that the outcome of any process will be the most probable one (which effectively means the outcome is inevitable given the numbers involved).
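That probabilistic reading can be made concrete with a toy model (my illustration, not from the article): scatter N coins between two boxes and count microstates. Near-even splits so overwhelmingly dominate as N grows that the “most probable” outcome becomes, in practice, inevitable – which is the sense in which entropy “always” increases:

```python
from math import comb

# N coins, each equally likely to land in the left or right box:
# 2**N equally likely microstates in total. The macrostate
# "k coins on the left" contains C(N, k) of them.
def prob_k_left(n, k):
    return comb(n, k) / 2 ** n

for n in (10, 100, 1000):
    # Probability that the split lands within 10% of perfectly even:
    lo, hi = n // 2 - n // 10, n // 2 + n // 10
    near_even = sum(prob_k_left(n, k) for k in range(lo, hi + 1))
    print(n, round(near_even, 4))
```

By N = 1000 the near-even macrostates carry essentially all the probability, which is why no violation of the second law has ever been observed at everyday scales.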

Yet physicists don’t just want descriptions of what will probably happen. “We like laws of physics to be exact,” said the physicist Chiara Marletto of the University of Oxford. Can the second law be tightened up into more than just a statement of likelihoods?

A number of independent groups appear to have done just that. They may have woven the second law out of the fundamental principles of quantum mechanics — which, some suspect, have directionality and irreversibility built into them at the deepest level. According to this view, the second law comes about not because of classical probabilities but because of quantum effects such as entanglement. It arises from the ways in which quantum systems share information, and from cornerstone quantum principles that decree what is allowed to happen and what is not. In this telling, an increase in entropy is not just the most likely outcome of change. It is a logical consequence of the most fundamental resource that we know of — the quantum resource of information…

Is that most sacrosanct of natural laws, the second law of thermodynamics, a quantum phenomenon? “Physicists Rewrite the Fundamental Law That Leads to Disorder,” from @philipcball in @QuantaMagazine.

* “Nothing in life is certain except death, taxes and the second law of thermodynamics. All three are processes in which useful or accessible forms of some quantity, such as energy or money, are transformed into useless, inaccessible forms of the same quantity. That is not to say that these three processes don’t have fringe benefits: taxes pay for roads and schools; the second law of thermodynamics drives cars, computers and metabolism; and death, at the very least, opens up tenured faculty positions.” — Seth Lloyd


As we get down with disorder, we might spare a thought for Francois-Marie Arouet, better known as Voltaire; he died on this date in 1778.  The Father of the Age of Reason, he produced works in almost every literary form: plays, poems, novels, essays, and historical and scientific works– more than 2,000 books and pamphlets (and more than 20,000 letters).  He popularized Isaac Newton’s work in France by arranging a translation of Principia Mathematica to which he added his own commentary.

A social reformer, Voltaire used satire to criticize the intolerance, religious dogma, and oligopolistic privilege of his day, perhaps nowhere more sardonically than in Candide.


“If we are to prevent megatechnics from further controlling and deforming every aspect of human culture, we shall be able to do so only with the aid of a radically different model derived directly, not from machines, but from living organisms and organic complexes (ecosystems)”*…

In a riff on Lewis Mumford, the redoubtable L. M. Sacasas addresses the unraveling of modernity…

The myth of the machine underlies a set of three related and interlocking presumptions which characterized modernity: objectivity, impartiality, and neutrality. More specifically, the presumptions that we could have objectively secured knowledge, impartial political and legal institutions, and technologies that were essentially neutral tools but which were ordinarily beneficent. The last of these appears to stand somewhat apart from the first two in that it refers to material culture rather than to what might be taken as more abstract intellectual or moral stances. In truth, however, they are closely related. The more abstract intellectual and institutional pursuits were always sustained by a material infrastructure, and, more importantly, the machine supplied a master template for the organization of human affairs.

Just as the modern story began with the quest for objectively secured knowledge, this ideal may have been the first to lose its implicit plausibility. From the late 19th century onward, philosophers, physicists, sociologists, anthropologists, psychologists, and historians have, among others, proposed a more complex picture that emphasized the subjective, limited, contingent, situated, and even irrational dimensions of how humans come to know the world. The ideal of objectively secured knowledge became increasingly questionable throughout the 20th century. Some of these trends get folded under the label “postmodernism,” but I found the term unhelpful at best a decade ago, and now find it altogether useless.

We can similarly trace a growing disillusionment with the ostensible impartiality of modern institutions. This takes at least two forms. On the one hand, we might consider the frustrating and demoralizing character of modern bureaucracies, which we can describe as rule-based machines designed to outsource judgement and enhance efficiency. On the other, we can note the heightened awareness of the actual failures of modern institutions to live up to the ideals of impartiality, which has been, in part, a function of the digital information ecosystem.

But while faith in the possibility of objectively secured knowledge and impartial institutions faltered, the myth of the machine persisted in the presumption that technology itself was fundamentally neutral. Until very recently, that is. Or so it seems. And my thesis (always for disputation) is that the collapse of this last manifestation of the myth brings the whole house down. This is in part because of how much work the presumption of technological neutrality was doing all along to hold American society together. (International readers: as always, read with a view to your own setting. I suspect there are some areas of broad overlap and other instances when my analysis won’t travel well.) Already by the late 19th century, progress had become synonymous with technological advancements, as Leo Marx argued. If social, political, or moral progress stalled, then at least the advance of technology could be counted on…

But over the last several years, the plausibility of this last and also archetypal manifestation of the myth of the machine has also waned. Not altogether, to be sure, but in important and influential segments of society and throughout a wide cross-section of society, too. One can perhaps see the shift most clearly in the public discourse about social media and smart phones, but this may be a symptom of a larger disillusionment with technology. And not only technological artifacts and systems, but also with the technocratic ethos and the public role of expertise.

If the myth of the machine in these three manifestations was, in fact, a critical element of the culture of modernity, underpinning its aspirations, then when each in turn becomes increasingly implausible, the modern world order comes apart. I’d say that this is more or less where we’re at. You could usefully analyze any number of cultural fault lines through this lens. The center, which may not in fact hold, is where you find those who still operate as if the presumptions of objectivity, impartiality, and neutrality still compelled broad cultural assent, and they are now assailed from both the left and the right by those who have grown suspicious or altogether scornful of such presumptions. Indeed, the left/right distinction may be less helpful than the distinction between those who uphold some combination of the values of objectivity, impartiality, and neutrality and those who no longer find them compelling or desirable.

What happens when the systems and strategies deployed to channel often violent clashes within a population deeply, possibly intractably divided about substantive moral goods and now even about what Arendt characterized as the publicly accessible facts upon which competing opinions could be grounded—what happens when these systems and strategies fail?

It is possible to argue that they failed long ago, but the failure was veiled by an unevenly distributed wave of material abundance. Citizens became consumers and, by and large, made peace with the exchange. After all, if the machinery of government could run of its own accord, what was there left to do but enjoy the fruits of prosperity? But what if abundance was an unsustainable solution, either because it taxed the earth at too high a rate or because it was purchased at the cost of other values such as rootedness, meaningful work and involvement in civic life, abiding friendships, personal autonomy, and participation in rich communities of mutual care and support? Perhaps in the framing of that question, I’ve tipped my hand about what might be the path forward.

At the heart of technological modernity there was the desire – sometimes veiled, often explicit – to overcome the human condition. The myth of the machine concealed an anti-human logic: if the problem is the failure of the human to conform to the pattern of the machine, then bend the human to the shape of the machine or eliminate the human altogether. The slogan of one of the high-modernist world’s fairs of the 1930s comes to mind: “Science Finds, Industry Applies, Man Conforms.” What is now being discovered in some quarters, however, is that the human is never quite eliminated, only diminished…

Eminently worth reading in full: “The Myth of the Machine,” from @LMSacasas.

For a deep dive into similar waters, see John Ralston Saul‘s (@JohnRalstonSaul) Voltaire’s Bastards.

[Image above: source]

* Lewis Mumford, The Myth of the Machine


As we rethink rudiments, we might recall that it was on this date in 1919 that Arthur Eddington confirmed Einstein’s light-bending prediction – a part of the theory of general relativity – using photos of a solar eclipse. Eddington’s paper the following year was the “debut” of Einstein’s theoretical work in most of the English-speaking world (and occasioned an urban legend: when a reporter supposedly suggested that “only three people understand relativity,” Eddington is supposed to have jokingly replied, “Oh, who’s the third?”)

One of Eddington’s photographs of the total solar eclipse of 29 May 1919, presented in his 1920 paper announcing its success, confirming Einstein’s theory that light “bends”