(Roughly) Daily

Posts Tagged ‘second law of thermodynamics’

“Those who are not shocked when they first come across quantum theory cannot possibly have understood it”*…

Werner Heisenberg, Erwin Schrödinger, and Niels Bohr by Tasnuva Elahi

A scheduling note: your correspondent is headed onto the road for a couple of weeks, so (Roughly) Daily will be a lot more roughly than daily until September 20th or so.

100 years ago, a circle of physicists shook the foundation of science. As Philip Ball explains, it’s still trembling…

In 1926, tensions were running high at the Institute for Theoretical Physics in Copenhagen. The institute had been established 10 years earlier by the Danish physicist Niels Bohr, who had shaped it into a hothouse for young collaborators to thrash out a new theory of atoms. In 1925, one of Bohr’s protégés, the brilliant and ambitious German physicist Werner Heisenberg, had produced such a theory. But now everyone was arguing about what it implied for the nature of physical reality itself.

To the Copenhagen group, it appeared reality had come undone…

[Ball tells the story of Niels Bohr’s building on Max Planck, of Werner Heisenberg’s wrangling of Bohr’s thought into theory, of Einstein’s objections and Erwin Schrödinger’s competing theory; then he homes in on the ontological issue at stake…]

Quantum mechanics, they said, demanded we throw away the old reality and replace it with something fuzzier, indistinct, and disturbingly subjective. No longer could scientists suppose that they were objectively probing a pre-existing world. Instead, it seemed that the experimenter’s choices determined what was seen—what, in fact, could be considered real at all.

In other words, the world is not simply sitting there, waiting for us to discover all the facts about it. Heisenberg’s uncertainty principle implied that those facts are determined only once we measure them. If we choose to measure an electron’s speed (more strictly, its momentum) precisely, then this becomes a fact about the world—but at the expense of accepting that there are simply no facts about its position. Or vice versa…

…A century later, scientists are still arguing about this issue of what quantum mechanics means for the nature of reality…

[Ball recounts subsequent attempts to reconcile quantum theory to “reality,” including Schrödinger’s wave mechanics…]

… Schrödinger’s wave mechanics didn’t restore the kind of reality he and Einstein wanted. His theory represented all that could be said about a quantum object in the form of a mathematical expression called the wave function, from which one can predict the outcomes of making measurements on the object. The wave function looks much like a regular wave, like sound waves in air or water waves on the sea. But a wave of what?

At first, Schrödinger supposed that the amplitude of the wave—think of it like the height of a water wave—at a given point in space was a measure of the density of the smeared-out quantum particle there. But Max Born argued that in fact this amplitude (more precisely, the square of the amplitude) is a measure of the probability that we will find the particle there, if we make a measurement of its position.

This so-called Born rule goes to the heart of what makes quantum mechanics so odd. Classical Newtonian mechanics allows us to calculate the trajectory of an object like a baseball or the moon, so that we can say where it will be at some given time. But Schrödinger’s quantum mechanics doesn’t give us anything equivalent to a trajectory for a quantum particle. Rather, it tells us the chance of getting a particular measurement outcome. It seems to point in the opposite direction of other scientific theories: not toward the entity it describes, but toward our observation of it. What if we don’t make a measurement of the particle at all? Does the wave function still tell us the probability of its being at a given point at a given time? No, it says nothing about that—or more properly, it permits us to say nothing about it. It speaks only to the probabilities of measurement outcomes.
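The Born rule described above can be made concrete with a small numerical sketch. This is an illustrative toy, not anything from Ball’s article: a made-up Gaussian wave packet is discretized on a grid, and the squared amplitude (suitably normalized) gives the probability of finding the particle in a chosen interval when a position measurement is made.

```python
import numpy as np

# Discretize a (hypothetical, illustrative) wave function on a grid.
x = np.linspace(-5, 5, 1001)
dx = x[1] - x[0]

# Example wave packet: a Gaussian, chosen purely for illustration.
psi = np.exp(-x**2 / 2)

# Born rule: the probability density is |psi|^2, normalized to integrate to 1.
prob_density = np.abs(psi)**2
prob_density /= np.sum(prob_density) * dx

# Probability of finding the particle between x = -1 and x = 1 on measurement.
mask = (x >= -1) & (x <= 1)
p_interval = np.sum(prob_density[mask]) * dx
print(f"P(-1 <= x <= 1) = {p_interval:.3f}")  # ≈ 0.84 for this packet
```

Note that, in keeping with the passage above, the code yields only measurement probabilities: nothing in it assigns the particle a definite position before the “measurement” is made.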

Crucially, this means that what we see depends on what and how we measure. There are situations for which quantum mechanics predicts that we will see one outcome if we measure one way, and a different outcome if we measure the same system in a different way. And this is not, as is sometimes implied (this was the cause of Heisenberg’s row with Bohr), because making a measurement disturbs the object in some physical manner, much as we might very slightly disturb the temperature of a solution in a test-tube by sticking a thermometer into it. Rather, it seems to be a fundamental property of nature that the very fact of acquiring information about it induces a change.

If, then, by reality we mean what we can observe of the world (for how can we meaningfully call something real if it can’t be seen, detected, or even inferred in any way?), it is hard to avoid the conclusion that we play an active role in determining what is real—a situation the American physicist John Archibald Wheeler called the “participatory universe”…

… Heisenberg’s “uncertainty” captured that sense of the ground shifting. It was not the ideal word—Heisenberg himself originally used the German Ungenauigkeit, meaning something closer to “inexactness,” as well as Unbestimmtheit, which might be translated as “undeterminedness.” It was not that one was uncertain about the situation of a quantum object, but that there was nothing to be certain about.

There was an even more disconcerting implication behind the uncertainty principle. The vagueness of quantum phenomena, when an electron in an atom might seem to jump from one energy state to another at a time of its own choosing, seemed to indicate the demise of causality itself. Things happened in the quantum world, but one could not necessarily adduce a reason why. In his 1927 paper on the uncertainty principle, Heisenberg challenged the idea that causes in nature lead to predictable effects. That seemed to undermine the very foundation of science, and it made the world seem like a lawless, somewhat arbitrary place….

… One of Bohr’s most provocative views was that there is a fundamental distinction between the fuzzy, probabilistic quantum world and the classical world of real objects in real places, where measurements of, say, an electron with a macroscopic instrument tell us that it is here and not there.

What Bohr meant is shocking. Reality, he implied, doesn’t consist of objects located in time and space. It consists of “quantum events,” which are obliged to be self-consistent (in the sense that quantum mechanics can describe them accurately) but not classically consistent with one another. One implication of this, as far as we can currently tell, is that two observers can see different and conflicting outcomes from an event—yet both can be right.

But this rigid distinction between the quantum and classical worlds can’t be sustained today. Scientists can now conduct experiments that probe size scales in between those where quantum and classical rules are thought to apply—neither microscopic (the atomic scale) nor macroscopic (the human scale), but mesoscopic (an intermediate size). We can look, for example, at the behavior of nanoparticles that can be seen and manipulated yet are small enough to be governed by quantum rules. Such experiments confirm the view that there is no abrupt boundary between the quantum and the classical. Quantum effects can still be observed at these intermediate scales if our devices are sensitive enough, but those effects can be harder to discern as the number of particles in the system increases.

To understand such experiments, it’s not necessary to adopt any particular interpretation of quantum mechanics, but merely to apply the standard theory—encompassed within Schrödinger’s wave mechanics, say—more expansively than Bohr and colleagues did, using it to explore what happens to a quantum object as it interacts with its surrounding environment. In this way, physicists are starting to understand how information gets out of a quantum system and into its environment, and how, as it does so, the fuzziness of quantum probabilities morphs into the sharpness of classical measurement. Thanks to such work, it is beginning to seem that our familiar world is just what quantum mechanics looks like when you are 6 feet tall.

But even if we manage to complete that project of uniting the quantum with the classical, we might end up none the wiser about what manner of stuff—what kind of reality—it all arises from. Perhaps one day another deeper theory will tell us. Or maybe the Copenhagen group was right a hundred years ago that we just have to accept a contingent, provisional reality: a world only half-formed until we decide how it will be…

Eminently worth reading in full: “When Reality Came Undone,” from @philipcball in @NautilusMag.

See also: When We Cease to Understand the World, by Benjamin Labatut.

* Niels Bohr

###

As we wrestle with reality, we might spare a thought for Ludwig Boltzmann; he died on this date in 1906. A physicist and philosopher, he is best remembered for the development of statistical mechanics, and the statistical explanation of the second law of thermodynamics (which connected entropy and probability).

Boltzmann helped pave the way for quantum theory both with his development of statistical mechanics (which is a pillar of modern physics) and with his 1877 suggestion that the energy levels of a physical system could be discrete.

source

“Visualization gives you answers to questions you didn’t know you had”*…

Reckoning before writing: Mesopotamian Clay Tokens

Physical representations of data have existed for thousands of years. The List of Physical Visualizations (and the accompanying Gallery) collect illustrative examples, e.g…

5500 BC – Mesopotamian Clay Tokens

The earliest data visualizations were likely physical: built by arranging stones or pebbles, and later, clay tokens. According to an eminent archaeologist (Schmandt-Besserat, 1999):

“Whereas words consist of immaterial sounds, the tokens were concrete, solid, tangible artifacts, which could be handled, arranged and rearranged at will. For instance, the tokens could be ordered in special columns according to types of merchandise, entries and expenditures; donors or recipients. The token system thus encouraged manipulating data by abstracting all possible variables. (Harth 1983. 19) […] No doubt patterning, the presentation of data in a particular configuration, was developed to highlight special items (Luria 1976. 20).”

Clay tokens suggest that physical objects were used to externalize information, support visual thinking and enhance cognition way before paper and writing were invented…

There are 370 entries (so far). Browse them at List of Physical Visualizations (@dataphys)

* Ben Shneiderman

###

As we celebrate the concrete, we might send carefully calculated birthday greetings to Rolf Landauer; he was born on this date in 1927. A physicist, he made a number of important contributions in a range of areas: the thermodynamics of information processing, condensed matter physics, and the conductivity of disordered media.

He is probably best remembered for “Landauer’s Principle,” which describes the energy used during a computer’s operation. Whenever the machine resets for another computation, bits are flushed from the computer’s memory, and in that electronic operation a certain amount of energy is lost (a simple logical consequence of the second law of thermodynamics). Thus, when information is erased, there is an inevitable “thermodynamic cost of forgetting,” which governs the development of more energy-efficient computers. The maximum entropy of a bounded physical system is finite, so while most engineers dealt with the practical limitations of compacting ever more circuitry onto tiny chips, Landauer considered the theoretical limit: if technology improved indefinitely, how soon would it run into the insuperable barriers set by nature?

A so-called logically reversible computation, in which no information is erased, may in principle be carried out without releasing any heat. This has led to considerable interest in the study of reversible computing. Indeed, without reversible computing, increases in the number of computations per joule of energy dissipated must eventually come to a halt. If Koomey’s law continues to hold, the limit implied by Landauer’s principle would be reached around the year 2050.
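The “thermodynamic cost of forgetting” has a simple numerical form: Landauer’s bound says erasing one bit dissipates at least k_B·T·ln 2 of heat at temperature T. A back-of-the-envelope sketch (the gigabyte figure is just for scale, not from the source):

```python
import math

# Landauer's principle: erasing one bit costs at least k_B * T * ln(2).
k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the SI)
T = 300.0           # room temperature, K

e_min = k_B * T * math.log(2)  # minimum energy per erased bit, joules
print(f"Landauer limit at {T:.0f} K: {e_min:.3e} J per bit")

# For scale: erasing a gigabyte (8e9 bits) at this theoretical minimum.
e_gigabyte = e_min * 8e9
print(f"Erasing 1 GB at the limit: {e_gigabyte:.3e} J")
```

At roughly 3 × 10⁻²¹ joules per bit, the bound sits many orders of magnitude below what present hardware dissipates, which is why Landauer framed it as the limit an indefinitely improving technology would eventually hit.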

source

“No structure, even an artificial one, enjoys the process of entropy. It is the ultimate fate of everything, and everything resists it.”*…

A 19th-century thought experiment that still motivates physicists and information scientists…

The universe bets on disorder. Imagine, for example, dropping a thimbleful of red dye into a swimming pool. All of those dye molecules are going to slowly spread throughout the water.

Physicists quantify this tendency to spread by counting the number of possible ways the dye molecules can be arranged. There’s one possible state where the molecules are crowded into the thimble. There’s another where, say, the molecules settle in a tidy clump at the pool’s bottom. But there are uncountable billions of permutations where the molecules spread out in different ways throughout the water. If the universe chooses from all the possible states at random, you can bet that it’s going to end up with one of the vast set of disordered possibilities.
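The counting argument above can be sketched with toy numbers. Here is an illustrative (and deliberately tiny, compared with a real thimbleful of dye) version: N distinguishable molecules each land in one of M equal-volume cells of the pool, and we compare the one-cell “clumped” arrangements against all possible arrangements.

```python
# Toy version of the multiplicity argument behind the second law.
# N and M are illustrative assumptions, vastly smaller than reality.
N = 20   # dye molecules (distinguishable, for simple counting)
M = 100  # equal-volume cells partitioning the pool

total_states = M ** N   # every way of distributing the molecules
clumped_states = M      # arrangements with all N molecules in a single cell

# If every arrangement is equally likely, re-clumping is absurdly unlikely:
p_clumped = clumped_states / total_states
print(f"P(all {N} molecules in one cell) = {p_clumped:.3e}")
```

Boltzmann’s entropy S = k_B ln W grows with the number of arrangements W, so the spread-out states, being unimaginably more numerous, are where the universe’s “bet on disorder” almost always lands.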

Seen in this way, the inexorable rise in entropy, or disorder, as quantified by the second law of thermodynamics, takes on an almost mathematical certainty. So of course physicists are constantly trying to break it.

One almost did. A thought experiment devised by the Scottish physicist James Clerk Maxwell in 1867 stumped scientists for 115 years. And even after a solution was found, physicists have continued to use “Maxwell’s demon” to push the laws of the universe to their limits…

A thorny thought experiment has been turned into a real experiment—one that physicists use to probe the physics of information: “How Maxwell’s Demon Continues to Startle Scientists,” from Jonathan O’Callaghan (@Astro_Jonny)

* Philip K. Dick

###

As we reconsider the random, we might send carefully calculated birthday greetings to Félix Édouard Justin Émile Borel; he was born on this date in 1871. A mathematician (and politician, who served as French Minister of the Navy), he is remembered for his foundational work in measure theory and probability. He published a number of research papers on game theory and was the first to define games of strategy.

But Borel may be best remembered for a thought experiment he introduced in one of his books, proposing that a monkey hitting keys at random on a typewriter keyboard will, almost surely, eventually type every book in the Bibliothèque Nationale de France. This is now popularly known as the infinite monkey theorem.
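The arithmetic behind Borel’s monkey is easy to sketch. Assuming a hypothetical 27-symbol keyboard (26 letters plus a space) and an illustrative target phrase, any single attempt is astronomically unlikely, yet the probability of eventual success over unlimited independent attempts tends to 1:

```python
import math

# Illustrative assumptions: a 27-key keyboard and a short target phrase.
K = 27
phrase = "to be or not to be"

# Chance of typing the phrase in one run of len(phrase) random keystrokes.
p_one_try = (1 / K) ** len(phrase)
print(f"P(typing the phrase in one go) = {p_one_try:.3e}")

# Chance of at least one success in n attempts: 1 - (1 - p)^n.
# For tiny p we use the Poisson approximation 1 - e^(-n*p) to avoid
# floating-point underflow in (1 - p).
n = 10**30
p_success = 1 - math.exp(-n * p_one_try)
print(f"P(success within {n:.0e} attempts) = {p_success:.6f}")
```

Since 1 − (1 − p)ⁿ approaches 1 as n grows no matter how small p is (so long as p > 0), an immortal monkey succeeds with probability 1, which is exactly the “almost surely” of the theorem.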

source