(Roughly) Daily

Posts Tagged ‘entropy’

“Simplicity, carried to the extreme, becomes elegance”*…

Jordana Cepelewicz on a very different approach to computing…

In 1936, the British mathematician Alan Turing came up with an idea for a universal computer. It was a simple device: an infinite strip of tape covered in zeros and ones, together with a machine that could move back and forth along the tape, changing zeros to ones and vice versa according to some set of rules. He showed that such a device could be used to perform any computation.

Turing did not intend for his idea to be practical for solving problems. Rather, it offered an invaluable way to explore the nature of computation and its limits. In the decades since that seminal idea, mathematicians have racked up a list of even less practical computing schemes. Games like Minesweeper or Magic: The Gathering could, in principle, be used as general-purpose computers. So could so-called cellular automata like John Conway’s Game of Life, a set of rules for evolving black and white squares on a two-dimensional grid.

In September 2023, Inna Zakharevich of Cornell University and Thomas Hull of Franklin & Marshall College showed that anything that can be computed can be computed by folding paper. They proved that origami is “Turing complete” — meaning that, like a Turing machine, it can solve any tractable computational problem, given enough time…
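For those who want to see the tape-and-rules model in something more concrete than prose, here is a minimal sketch of a Turing machine simulator in Python; the particular rule table (a bit-flipper that halts at the first blank) is purely illustrative and is not taken from the article or from Zakharevich and Hull’s construction.

```python
# A minimal Turing machine simulator, sketched to illustrate the
# tape-and-rules model described above. The rule table below (which
# flips every bit it reads, then halts at the first blank) is purely
# illustrative -- it is not taken from the article.

def run_turing_machine(rules, tape, state="start", head=0, max_steps=10_000):
    """rules maps (state, symbol) -> (new_symbol, move, new_state)."""
    tape = dict(enumerate(tape))          # sparse tape: position -> symbol
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, "_")      # "_" stands for a blank cell
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
    return "".join(tape[i] for i in sorted(tape))

# Illustrative rule table: invert 0s and 1s until a blank is reached.
flip_bits = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}

print(run_turing_machine(flip_bits, "0110"))  # -> 1001_
```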

Read on for more on how folding paper can, in principle, be used to perform any possible computation: “How to Build an Origami Computer” from @jordanacep in @QuantaMagazine.

* Jon Franklin

###

As we contemplate calculation, we might send entropic birthday greetings to Rolf Landauer; he was born on this date in 1927. A physicist, he made important contributions in several areas, including the thermodynamics of information processing, condensed matter physics, and the conductivity of disordered media… most of which were important to the development of computing (of the electronic variety).

He is best known for his discovery and formulation of what’s known as Landauer’s principle: that in any logically irreversible operation that manipulates information, such as erasing a bit of memory, entropy increases and an associated amount of energy is dissipated as heat– a “thermodynamic cost of forgetting,” relevant to chip design (how closely packed elements can be on a chip and still handle the heat), reversible computing, quantum information, and quantum computing… but not an issue for origami.
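Landauer’s bound has a simple closed form: erasing one bit must dissipate at least k_B·T·ln 2 of energy. A back-of-the-envelope sketch in Python (the 300 K room temperature below is just an assumed textbook figure):

```python
# Back-of-the-envelope Landauer limit: the minimum heat dissipated
# when one bit of information is erased, E >= k_B * T * ln(2).
import math

K_B = 1.380649e-23          # Boltzmann constant, J/K
T_ROOM = 300.0              # an assumed room temperature, K

energy_per_bit = K_B * T_ROOM * math.log(2)
print(f"Landauer limit at {T_ROOM} K: {energy_per_bit:.3e} J per bit")
# ~2.9e-21 J -- many orders of magnitude below what today's chips
# actually dissipate per logic operation, but a hard floor nonetheless.
```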

source

“Nothing in life is certain except death, taxes and the second law of thermodynamics”*…

The second law of thermodynamics– asserting that the entropy of a system increases with time– is among the most sacred in all of science, but it has always rested on 19th century arguments about probability. As Philip Ball reports, new thinking traces its true source to the flows of quantum information…

In all of physical law, there’s arguably no principle more sacrosanct than the second law of thermodynamics — the notion that entropy, a measure of disorder, will always stay the same or increase. “If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations — then so much the worse for Maxwell’s equations,” wrote the British astrophysicist Arthur Eddington in his 1928 book The Nature of the Physical World. “If it is found to be contradicted by observation — well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.” No violation of this law has ever been observed, nor is any expected.

But something about the second law troubles physicists. Some are not convinced that we understand it properly or that its foundations are firm. Although it’s called a law, it’s usually regarded as merely probabilistic: It stipulates that the outcome of any process will be the most probable one (which effectively means the outcome is inevitable given the numbers involved).

Yet physicists don’t just want descriptions of what will probably happen. “We like laws of physics to be exact,” said the physicist Chiara Marletto of the University of Oxford. Can the second law be tightened up into more than just a statement of likelihoods?

A number of independent groups appear to have done just that. They may have woven the second law out of the fundamental principles of quantum mechanics — which, some suspect, have directionality and irreversibility built into them at the deepest level. According to this view, the second law comes about not because of classical probabilities but because of quantum effects such as entanglement. It arises from the ways in which quantum systems share information, and from cornerstone quantum principles that decree what is allowed to happen and what is not. In this telling, an increase in entropy is not just the most likely outcome of change. It is a logical consequence of the most fundamental resource that we know of — the quantum resource of information…

Is that most sacrosanct of natural laws, the second law of thermodynamics, a quantum phenomenon? “Physicists Rewrite the Fundamental Law That Leads to Disorder,” from @philipcball in @QuantaMagazine.

* “Nothing in life is certain except death, taxes and the second law of thermodynamics. All three are processes in which useful or accessible forms of some quantity, such as energy or money, are transformed into useless, inaccessible forms of the same quantity. That is not to say that these three processes don’t have fringe benefits: taxes pay for roads and schools; the second law of thermodynamics drives cars, computers and metabolism; and death, at the very least, opens up tenured faculty positions.” — Seth Lloyd

###

As we get down with disorder, we might spare a thought for François-Marie Arouet, better known as Voltaire; he died on this date in 1778.  The Father of the Age of Reason, he produced works in almost every literary form: plays, poems, novels, essays, and historical and scientific works– more than 2,000 books and pamphlets (and more than 20,000 letters).  He popularized Isaac Newton’s work in France by arranging a translation of Principia Mathematica to which he added his own commentary.

A social reformer, Voltaire used satire to criticize the intolerance, religious dogma, and oligopolistic privilege of his day, perhaps nowhere more sardonically than in Candide.

source

“No structure, even an artificial one, enjoys the process of entropy. It is the ultimate fate of everything, and everything resists it.”*…

A 19th-century thought experiment that motivates physicists– and information scientists– still…

The universe bets on disorder. Imagine, for example, dropping a thimbleful of red dye into a swimming pool. All of those dye molecules are going to slowly spread throughout the water.

Physicists quantify this tendency to spread by counting the number of possible ways the dye molecules can be arranged. There’s one possible state where the molecules are crowded into the thimble. There’s another where, say, the molecules settle in a tidy clump at the pool’s bottom. But there are uncountable billions of permutations where the molecules spread out in different ways throughout the water. If the universe chooses from all the possible states at random, you can bet that it’s going to end up with one of the vast set of disordered possibilities.
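To put toy numbers on that counting argument, one can compare how many ways the molecules can be arranged when confined to the thimble versus spread through the pool; the molecule and site counts in this sketch are invented purely for illustration:

```python
# Toy version of the counting argument above: how many ways can N dye
# molecules be placed among M available sites? (Numbers are invented
# purely for illustration.)
from math import comb, log10

N_MOLECULES = 1_000          # dye molecules
THIMBLE_SITES = 5_000        # sites inside the thimble-sized region
POOL_SITES = 10**9           # sites in the whole pool

crowded = comb(THIMBLE_SITES, N_MOLECULES)   # all molecules stay in the thimble
spread  = comb(POOL_SITES, N_MOLECULES)      # molecules anywhere in the pool

print(f"crowded arrangements:    ~10^{log10(crowded):.0f}")
print(f"spread-out arrangements: ~10^{log10(spread):.0f}")
# The spread-out count dwarfs the crowded one, which is why a random
# "choice" of state is overwhelmingly likely to look disordered.
```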

Seen in this way, the inexorable rise in entropy, or disorder, as quantified by the second law of thermodynamics, takes on an almost mathematical certainty. So of course physicists are constantly trying to break it.

One almost did. A thought experiment devised by the Scottish physicist James Clerk Maxwell in 1867 stumped scientists for 115 years. And even after a solution was found, physicists have continued to use “Maxwell’s demon” to push the laws of the universe to their limits…

A thorny thought experiment has been turned into a real experiment—one that physicists use to probe the physics of information: “How Maxwell’s Demon Continues to Startle Scientists,” from Jonathan O’Callaghan (@Astro_Jonny)

* Philip K. Dick

###

As we reconsider the random, we might send carefully-calculated birthday greetings to Félix Édouard Justin Émile Borel; he was born on this date in 1871. A mathematician (and politician, who served as French Minister of the Navy), he is remembered for his foundational work in measure theory and probability. He published a number of research papers on game theory and was the first to define games of strategy.

But Borel may be best remembered for a thought experiment he introduced in one of his books, proposing that a monkey hitting keys at random on a typewriter keyboard will – almost surely – eventually type every book in the Bibliothèque Nationale de France. This is now popularly known as the infinite monkey theorem.
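The certainty here is certainty in the probabilistic sense: for any fixed text, the chance that a long enough run of random keystrokes fails to contain it shrinks toward zero. A toy calculation along those lines (alphabet size and target length are made-up numbers, not Borel’s):

```python
# Toy illustration of the infinite monkey theorem: the probability that
# a specific target string of length L appears somewhere in n random
# keystrokes, lower-bounded by treating the keystrokes as n // L
# independent blocks. (Alphabet size and lengths are made-up numbers.)
ALPHABET = 27        # say, 26 letters plus space
L = 10               # length of the target phrase
p_block = ALPHABET ** -L            # chance one block matches exactly

def p_at_least_one_match(n_keystrokes: int) -> float:
    blocks = n_keystrokes // L
    return 1 - (1 - p_block) ** blocks

for n in (10**6, 10**12, 10**16, 10**18):
    print(f"{n:>20,} keystrokes: P >= {p_at_least_one_match(n):.6f}")
# The bound climbs toward 1 as n grows -- "eventually" just needs to be
# a very, very long time.
```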

source

Adventures in Cosmology: Starting out Simply…

Why was entropy so low at the Big Bang? (source: Internet Encyclopedia of Philosophy)

Back in 2010, SUNY-Buffalo physics professor Dejan Stojkovic and colleagues made a simple– a radically simple– suggestion:  that the early universe — which exploded from a single point and was very, very small at first — was one-dimensional (like a straight line) before expanding to include two dimensions (like a plane) and then three (like the world in which we live today).

The core idea is that the dimensionality of space depends on the size of the space observed, with smaller spaces associated with fewer dimensions. That means that a fourth dimension will open up — if it hasn’t already — as the universe continues to expand.  (Interesting corollary: space has fewer dimensions at very high energies of the kind associated with the early, post-big bang universe.)

Stojkovic’s notion is challenging; but at the same time, it would help address a number of fundamental problems with the standard model of particle physics, from the incompatibility between quantum mechanics and general relativity to the mystery of the accelerating expansion of the universe.

But is it “true”?  There’s no way to know as yet.  But Stojkovic and his colleagues have devised a test using the Laser Interferometer Space Antenna (LISA), a planned international gravitational observatory, that could shed some definitive light on the question in just a few years.

Read the whole story in Science Daily, and read Stojkovic’s proposal for experimental proof in Physical Review Letters.

As we glance around for evidence of that fourth dimension, we might bid an indeterminate farewell to Ilya Prigogine, the Nobel Laureate whose work on dissipative structures, complex systems, and irreversibility led to the identification of self-organizing systems, and is seen by many as a bridge between the natural and social sciences.  He died at the Hospital Erasme in Brussels on this date in 2003.

Prigogine’s 1997 book, The End of Certainty, summarized his departure from the determinist thinking of Newton, Einstein, and Schrödinger in arguing for “the arrow of time”– and “complexity,” the ineluctable reality of irreversibility and instability.  “Unstable systems” like weather and biological life, he suggested, cannot be explained with standard deterministic models.  Rather, given their sensitivity to initial conditions, unstable systems can only be explained statistically, probabilistically.

source: University of Texas

Well that changes everything…

source: Imperial College, London

As all of one’s assumptions about the future (and thus the past) seem to be weakening, two dispatches from the world of science are genuinely foundation-shaking…

First, from New Scientist (and though NS doesn’t mention it, earlier from Freeman Dyson in NYRB):

Just suppose that Darwin’s ideas were only a part of the story of evolution. Suppose that a process he never wrote about, and never even imagined, has been controlling the evolution of life throughout most of the Earth’s history. It may sound preposterous, but this is exactly what microbiologist Carl Woese and physicist Nigel Goldenfeld, both at the University of Illinois at Urbana-Champaign, believe. Darwin’s explanation of evolution, they argue, even in its sophisticated modern form, applies only to a recent phase of life on Earth.

At the root of this idea is overwhelming recent evidence for horizontal gene transfer – in which organisms acquire genetic material “horizontally” from other organisms around them, rather than vertically from their parents or ancestors. The donor organisms may not even be the same species. This mechanism is already known to play a huge role in the evolution of microbial genomes, but its consequences have hardly been explored. According to Woese and Goldenfeld, they are profound, and horizontal gene transfer alters the evolutionary process itself. Since micro-organisms represented most of life on Earth for most of the time that life has existed – billions of years, in fact – the most ancient and prevalent form of evolution probably wasn’t Darwinian at all, Woese and Goldenfeld say…

Woese can’t put a date on when the transition to Darwinian evolution happened, but he suspects it occurred at different times in each of the three main branches of the tree of life, with bacteria likely to have changed first…

As we remember that what has changed can change again, we can read the whole story here.

Second, from ArXiv and ITWire, a suggestion that we might not have as long to figure this out as we have been thinking:  entropy in the universe is much higher than previously expected; thus the “heat death” of existence as we’ve known it, much closer. As Tapecutter observes in Slashdot,

In a paper soon to be published in the Astrophysical Journal [ArXiv link, above], Australian researchers have estimated the entropy of the universe is about 30 times higher than previous estimates. According to their research, super-massive black holes “are the largest contributor to the entropy of the observable universe, contributing at least an order of magnitude more entropy than previously estimated.” For those of us who like their science in the form of a car analogy, Dr. Lineweaver compared their results to a car’s gas tank. He states, ‘It’s a bit like looking at your gas gauge and saying “I thought I had half a gas tank, but I only have a quarter of a tank.”’

Happily a quarter of a tank should be good for hundreds of thousands (if not millions) of years.

As we regain our bearings, we might note that this was a bad day for revolutionaries of another stripe:  it was on this date in 1606 that Guy Fawkes was executed for his role in the Catholic Restorationist “Gunpowder Plot.”

Guy Fawkes