(Roughly) Daily

“Men knew better than they realized, when they placed the abode of the gods beyond the reach of gravity”*…

In search of a theory of everything…

Twenty-five particles and four forces. That description — the Standard Model of particle physics — constitutes physicists’ best current explanation for everything. It’s neat and it’s simple, but no one is entirely happy with it. What irritates physicists most is that one of the forces — gravity — sticks out like a sore thumb on a four-fingered hand. Gravity is different.

Unlike the electromagnetic force and the strong and weak nuclear forces, gravity is not a quantum theory. This isn’t only aesthetically displeasing; it’s also a mathematical headache. We know that particles have both quantum properties and gravitational fields, so the gravitational field should have quantum properties like the particles that cause it. But a theory of quantum gravity has been hard to come by.

In the 1960s, Richard Feynman and Bryce DeWitt set out to quantize gravity using the same techniques that had successfully transformed electromagnetism into the quantum theory called quantum electrodynamics. Unfortunately, when applied to gravity, the known techniques resulted in a theory that, when extrapolated to high energies, was plagued by an infinite number of infinities. This quantization of gravity was thought incurably sick, an approximation useful only when gravity is weak.
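
A one-line dimensional argument, standard in the field though not spelled out above, shows why the infinities multiply. In natural units Newton’s constant carries dimensions of inverse energy squared, so the effective expansion parameter for graviton scattering grows without bound:

$$ G = \frac{1}{M_{\mathrm{Pl}}^{2}}, \qquad G\,E^{2} = \left(\frac{E}{M_{\mathrm{Pl}}}\right)^{2} \to \infty \quad \text{as } E \to \infty, $$

where the Planck mass M_Pl is about 1.2 × 10¹⁹ GeV. Each order of that ever-growing expansion demands a new infinite counterterm with its own measured constant; the result is a theory with infinitely many free parameters.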

Since then, physicists have made several other attempts at quantizing gravity in the hope of finding a theory that would also work when gravity is strong. String theory, loop quantum gravity, causal dynamical triangulation and a few others have been aimed toward that goal. So far, none of these theories has experimental evidence speaking for it. Each has mathematical pros and cons, and no convergence seems in sight. But while these approaches were competing for attention, an old rival has caught up.

The theory called asymptotically (as-em-TOT-ick-lee) safe gravity was proposed in 1978 by Steven Weinberg. Weinberg, who would only a year later share the Nobel Prize with Sheldon Lee Glashow and Abdus Salam for unifying the electromagnetic and weak nuclear forces, realized that the troubles with the naive quantization of gravity are not a death knell for the theory. Even though it looks like the theory breaks down when extrapolated to high energies, this breakdown might never come to pass. But to be able to tell just what happens, researchers had to wait for new mathematical methods that have only recently become available…
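
Schematically (a sketch of the standard presentation, not a derivation from the article), the proposal is that the dimensionless version of Newton’s constant stops growing at high energies:

$$ g(k) \equiv G(k)\,k^{2}, \qquad k\,\frac{dg}{dk} = \beta\bigl(g(k)\bigr), \qquad \beta(g_{*}) = 0, \quad 0 < g_{*} < \infty. $$

If the renormalization-group flow carries g(k) to the finite fixed-point value g* as the energy scale k grows without limit, the coupling is “asymptotically safe”: the runaway behavior that sank the naive quantization never materializes, and the theory stays predictive with only a handful of free parameters.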

For decades, physicists have struggled to create a quantum theory of gravity. Now an approach that dates to the 1970s is attracting newfound attention: “Why an Old Theory of Everything Is Gaining New Life,” from @QuantaMagazine.

* Arthur C. Clarke, 2010: Odyssey Two

###

As we unify, we might pause to remember Sir Arthur Stanley Eddington, OM, FRS; he died on this date in 1944.  An astrophysicist, mathematician, and philosopher of science known for his work on the motion, distribution, evolution, and structure of stars, Eddington is probably best remembered for his relationship to Einstein:  he was, via a series of widely-published articles, the primary “explainer” of Einstein’s Theory of General Relativity to the English-speaking world; and he was, in 1919, the leader of the experimental team that used observations of a solar eclipse to confirm the theory.

“If and when all the laws governing physical phenomena are finally discovered, and all the empirical constants occurring in these laws are finally expressed through the four independent basic constants, we will be able to say that physical science has reached its end”*…

The fine-structure constant was introduced in 1916 to quantify the tiny gap between two lines in the spectrum of colors emitted by certain atoms. The closely spaced frequencies are seen here through a Fabry-Pérot interferometer.

As fundamental constants go, the speed of light, c, enjoys all the fame, yet c’s numerical value says nothing about nature; it differs depending on whether it’s measured in meters per second or miles per hour. The fine-structure constant, by contrast, has no dimensions or units. It’s a pure number that shapes the universe to an astonishing degree — “a magic number that comes to us with no understanding,” as Richard Feynman described it. Paul Dirac considered the origin of the number “the most fundamental unsolved problem of physics.”

Numerically, the fine-structure constant, denoted by the Greek letter α (alpha), comes very close to the ratio 1/137. It commonly appears in formulas governing light and matter. “It’s like in architecture, there’s the golden ratio,” said Eric Cornell, a Nobel Prize-winning physicist at the University of Colorado, Boulder and the National Institute of Standards and Technology. “In the physics of low-energy matter — atoms, molecules, chemistry, biology — there’s always a ratio” of bigger things to smaller things, he said. “Those ratios tend to be powers of the fine-structure constant.”

The constant is everywhere because it characterizes the strength of the electromagnetic force affecting charged particles such as electrons and protons. “In our everyday world, everything is either gravity or electromagnetism. And that’s why alpha is so important,” said Holger Müller, a physicist at the University of California, Berkeley. Because 1/137 is small, electromagnetism is weak; as a consequence, charged particles form airy atoms whose electrons orbit at a distance and easily hop away, enabling chemical bonds. On the other hand, the constant is also just big enough: Physicists have argued that if it were something like 1/138, stars would not be able to create carbon, and life as we know it wouldn’t exist.
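
For reference (standard textbook material rather than anything in the article), the constant is built from the electron charge e, the vacuum permittivity ε₀, the reduced Planck constant ħ, and the speed of light c:

$$ \alpha = \frac{e^{2}}{4\pi\varepsilon_{0}\hbar c} \approx \frac{1}{137.036}. $$

Every unit cancels, which is why its value, unlike the speed of light’s, doesn’t depend on a choice of meters or miles. It also underlies the ratios Cornell describes: the electron in hydrogen’s ground state moves at roughly αc, and the Bohr radius is about 1/α times the electron’s reduced Compton wavelength.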

Physicists have more or less given up on a century-old obsession over where alpha’s particular value comes from; they now acknowledge that the fundamental constants could be random, decided in cosmic dice rolls during the universe’s birth. But a new goal has taken over.

Physicists want to measure the fine-structure constant as precisely as possible. Because it’s so ubiquitous, measuring it precisely allows them to test their theory of the interrelationships between elementary particles — the majestic set of equations known as the Standard Model of particle physics. Any discrepancy between ultra-precise measurements of related quantities could point to novel particles or effects not accounted for by the standard equations. Cornell calls these kinds of precision measurements a third way of experimentally discovering the fundamental workings of the universe, along with particle colliders and telescopes…

In a new paper in the journal Nature, a team of four physicists led by Saïda Guellati-Khélifa at the Kastler Brossel Laboratory in Paris reported the most precise measurement yet of the fine-structure constant. The team measured the constant’s value to the 11th decimal place, reporting that α = 1/137.03599920611. (The last two digits are uncertain.)

With a margin of error of just 81 parts per trillion, the new measurement is nearly three times more precise than the previous best measurement, made in 2018 by Müller’s group at Berkeley, the main competition. (Guellati-Khélifa made the most precise measurement before Müller’s, in 2011.) Müller said of his rival’s new measurement of alpha, “A factor of three is a big deal. Let’s not be shy about calling this a big accomplishment”… a result that largely rules out some proposals for new particles.
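
As a quick back-of-envelope check (the arithmetic is mine, not the article’s), the 81-parts-per-trillion figure follows from reading the reported number as 1/α = 137.035999206 with an uncertainty of 11 in its last two digits, the standard parenthetical notation:

```python
# Back-of-envelope check of the quoted precision (illustrative only).
# Reading the article's figure as 1/alpha = 137.035999206(11), i.e. an
# uncertainty of 11 in the 8th and 9th decimal places.
inv_alpha = 137.035999206
sigma = 11e-9

rel = sigma / inv_alpha  # alpha and 1/alpha share the same relative error
print(f"relative uncertainty: {rel:.2e}")         # ~8.0e-11
print(f"parts per trillion:   {rel * 1e12:.0f}")  # ~80
```

The rounded digits shown here give roughly 80 parts per trillion, consistent with the quoted 81 once unshown digits of the uncertainty are included.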

A team in Paris has made the most precise measurement yet of the fine-structure constant, killing hopes for a new force of nature: “Physicists Nail Down the ‘Magic Number’ That Shapes the Universe.”

[TotH to MK]

* George Gamow

###

As we ponder precision, we might spare a thought for Persian polymath Omar Khayyam; the mathematician, philosopher, astronomer, epigrammatist, and poet died on this date in 1131.  While he’s probably best known to English-speakers as a poet, via Edward FitzGerald’s famous translation of the quatrains that comprise the Rubaiyat of Omar Khayyam, Omar was one of the major mathematicians and astronomers of the medieval period.  He is the author of one of the most important works on algebra written before modern times, the Treatise on Demonstration of Problems of Algebra (which includes a geometric method for solving cubic equations by intersecting a hyperbola with a circle).  His astronomical observations contributed to the reform of the Persian calendar.  And he made important contributions to mechanics, geography, mineralogy, music, climatology, and Islamic theology.

Adventures in Cosmology: Starting out Simply…

Why was entropy so low at the Big Bang? (source: Internet Encyclopedia of Philosophy)

Back in 2010, SUNY-Buffalo physics professor Dejan Stojkovic and colleagues made a simple – a radically simple – suggestion:  that the early universe — which exploded from a single point and was very, very small at first — was one-dimensional (like a straight line) before expanding to include two dimensions (like a plane) and then three (like the world in which we live today).

The core idea is that the dimensionality of space depends on the size of the space observed, with smaller spaces associated with fewer dimensions. That means that a fourth dimension will open up — if it hasn’t already — as the universe continues to expand.  (Interesting corollary: space has fewer dimensions at very high energies of the kind associated with the early, post-big bang universe.)

Stojkovic’s notion is challenging, but at the same time it would help address a number of fundamental problems with the Standard Model of particle physics, from the incompatibility between quantum mechanics and general relativity to the mystery of the accelerating expansion of the universe.

But is it “true”?  There’s no way to know as yet.  But Stojkovic and his colleagues have devised a test using the Laser Interferometer Space Antenna (LISA), a planned space-based gravitational-wave observatory, that could shed some definitive light on the question in just a few years.

Read the whole story in Science Daily, and read Stojkovic’s proposal for experimental proof in Physical Review Letters.

As we glance around for evidence of that fourth dimension, we might bid an indeterminate farewell to Ilya Prigogine, the Nobel Laureate whose work on dissipative structures, complex systems, and irreversibility led to the identification of self-organizing systems, and is seen by many as a bridge between the natural and social sciences.  He died at the Hospital Erasme in Brussels on this date in 2003.

Prigogine’s 1997 book, The End of Certainty, summarized his departure from the determinist thinking of Newton, Einstein, and Schrödinger in arguing for “the arrow of time” – and “complexity,” the ineluctable reality of irreversibility and instability.  “Unstable systems” like weather and biological life, he suggested, cannot be explained with standard deterministic models.  Rather, given their sensitivity to initial conditions, unstable systems can only be explained statistically, probabilistically.
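
As a toy illustration of that sensitivity (my sketch, not an example of Prigogine’s): in the chaotic logistic map, two trajectories that start a billionth apart diverge within a few dozen steps, after which only statistical statements about them remain meaningful.

```python
# Toy demonstration of sensitive dependence on initial conditions,
# using the logistic map x -> r*x*(1 - x) in its chaotic regime (r = 4).
r = 4.0
x, y = 0.400000000, 0.400000001  # starts differing by one part in a billion

for step in range(1, 51):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    if step % 10 == 0:
        print(f"step {step:2d}: |x - y| = {abs(x - y):.3e}")

# The separation roughly doubles each step on average, so by about step 40
# the two trajectories are effectively uncorrelated: point prediction fails,
# and only a probabilistic description of the system survives.
```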
