(Roughly) Daily


“Why, sometimes I’ve believed as many as six impossible things before breakfast”*…

Imaginary numbers were long dismissed as mathematical “bookkeeping.” But now, as Karmela Padavic-Callaghan explains, physicists are proving that they describe the hidden shape of nature…

Many science students may imagine a ball rolling down a hill or a car skidding because of friction as prototypical examples of the systems physicists care about. But much of modern physics consists of searching for objects and phenomena that are virtually invisible: the tiny electrons of quantum physics, the particles hidden within the strange metals of materials science, and their highly energetic counterparts that exist only briefly within giant particle colliders.

In their quest to grasp these hidden building blocks of reality, scientists have looked to mathematical theories and formalism. Ideally, an unexpected experimental observation leads a physicist to a new mathematical theory, and then mathematical work on said theory leads them to new experiments and new observations. Some part of this process inevitably happens in the physicist’s mind, where symbols and numbers help make invisible theoretical ideas visible in the tangible, measurable physical world.

Sometimes, however, as in the case of imaginary numbers – that is, numbers with negative square values – mathematics manages to stay ahead of experiments for a long time. Though imaginary numbers have been integral to quantum theory since its very beginnings in the 1920s, scientists have only recently been able to find their physical signatures in experiments and empirically prove their necessity…
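The point of necessity is worth making concrete. A minimal sketch in Python (mine, not the article’s; the amplitudes are invented for illustration): quantum probabilities come from squaring the magnitudes of complex amplitudes, and it is the addition of amplitudes *before* squaring that produces interference — something real-valued probabilities alone cannot do.

```python
# The defining property of the imaginary unit i: its square is -1.
assert (1j) ** 2 == -1

# Quantum probabilities are squared magnitudes of complex amplitudes.
a = (1 + 1j) / 2                           # one path's amplitude; |a|^2 = 0.5
p_classical = abs(a) ** 2 + abs(a) ** 2    # probabilities merely added: 1.0
p_constructive = abs(a + a) ** 2           # amplitudes added first: 2.0
p_destructive = abs(a - a) ** 2            # opposite phases cancel: 0.0
print(p_classical, p_constructive, p_destructive)
```

The gap between `p_classical` and the two interfering outcomes is exactly the kind of signature the experiments described above go looking for.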

Learn more at “Imaginary numbers are real,” from @Ironmely in @aeonmag.

* The White Queen, in Lewis Carroll’s Through the Looking Glass

###

As we get real, we might spare a thought for two great mathematicians…

Georg Friedrich Bernhard Riemann died on this date in 1866. A mathematician who made contributions to analysis, number theory, and differential geometry, he is remembered (among other things) for his 1859 paper on the prime-counting function, containing the original statement of the Riemann hypothesis, regarded as one of the most influential papers in analytic number theory.

source

Andrey (Andrei) Andreyevich Markov died on this date in 1922. A Russian mathematician, he helped to develop the theory of stochastic processes, especially those now called Markov chains: sequences of random variables in which the probability distribution of the next state depends only on the present state, independent of the way in which the present state arose from its predecessors. (For example, the probability of winning at the game of Monopoly can be determined using Markov chains.) His work on the study of the probability of mutually dependent events has been developed and widely applied to the biological, physical, and social sciences, and is widely used in Monte Carlo simulations and Bayesian analyses.
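The defining property — next state depends only on the current one — fits in a few lines of Python. A toy two-state weather chain (the states and probabilities are made up for illustration), iterated until it settles into its stationary distribution:

```python
# Transition probabilities P[current][next]; each row sums to 1.
P = {
    "sunny": {"sunny": 0.9, "rainy": 0.1},
    "rainy": {"sunny": 0.5, "rainy": 0.5},
}

def step(dist):
    """Advance a probability distribution over states by one step."""
    out = {s: 0.0 for s in P}
    for cur, p_cur in dist.items():
        for nxt, p in P[cur].items():
            out[nxt] += p_cur * p
    return out

dist = {"sunny": 1.0, "rainy": 0.0}
for _ in range(50):   # repeated steps forget the starting state
    dist = step(dist)
print(dist)           # → roughly {'sunny': 0.833, 'rainy': 0.167}
```

However the chain starts, it converges to the same long-run proportions — the property that makes Markov chains the engine of Monte Carlo methods.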

 source

“Nothing in life is certain except death, taxes and the second law of thermodynamics”*…

The second law of thermodynamics– asserting that the entropy of a system increases with time– is among the most sacred in all of science, but it has always rested on 19th century arguments about probability. As Philip Ball reports, new thinking traces its true source to the flows of quantum information…

In all of physical law, there’s arguably no principle more sacrosanct than the second law of thermodynamics — the notion that entropy, a measure of disorder, will always stay the same or increase. “If someone points out to you that your pet theory of the universe is in disagreement with Maxwell’s equations — then so much the worse for Maxwell’s equations,” wrote the British astrophysicist Arthur Eddington in his 1928 book The Nature of the Physical World. “If it is found to be contradicted by observation — well, these experimentalists do bungle things sometimes. But if your theory is found to be against the second law of thermodynamics I can give you no hope; there is nothing for it but to collapse in deepest humiliation.” No violation of this law has ever been observed, nor is any expected.

But something about the second law troubles physicists. Some are not convinced that we understand it properly or that its foundations are firm. Although it’s called a law, it’s usually regarded as merely probabilistic: It stipulates that the outcome of any process will be the most probable one (which effectively means the outcome is inevitable given the numbers involved).

Yet physicists don’t just want descriptions of what will probably happen. “We like laws of physics to be exact,” said the physicist Chiara Marletto of the University of Oxford. Can the second law be tightened up into more than just a statement of likelihoods?

A number of independent groups appear to have done just that. They may have woven the second law out of the fundamental principles of quantum mechanics — which, some suspect, have directionality and irreversibility built into them at the deepest level. According to this view, the second law comes about not because of classical probabilities but because of quantum effects such as entanglement. It arises from the ways in which quantum systems share information, and from cornerstone quantum principles that decree what is allowed to happen and what is not. In this telling, an increase in entropy is not just the most likely outcome of change. It is a logical consequence of the most fundamental resource that we know of — the quantum resource of information…

Is that most sacrosanct of natural laws, the second law of thermodynamics, a quantum phenomenon? “Physicists Rewrite the Fundamental Law That Leads to Disorder,” from @philipcball in @QuantaMagazine.

* “Nothing in life is certain except death, taxes and the second law of thermodynamics. All three are processes in which useful or accessible forms of some quantity, such as energy or money, are transformed into useless, inaccessible forms of the same quantity. That is not to say that these three processes don’t have fringe benefits: taxes pay for roads and schools; the second law of thermodynamics drives cars, computers and metabolism; and death, at the very least, opens up tenured faculty positions.” — Seth Lloyd

###

As we get down with disorder, we might spare a thought for François-Marie Arouet, better known as Voltaire; he died on this date in 1778. The Father of the Age of Reason, he produced works in almost every literary form: plays, poems, novels, essays, and historical and scientific works– more than 2,000 books and pamphlets (and more than 20,000 letters). He popularized Isaac Newton’s work in France by arranging a translation of Principia Mathematica to which he added his own commentary.

A social reformer, Voltaire used satire to criticize the intolerance, religious dogma, and oligopolistic privilege of his day, perhaps nowhere more sardonically than in Candide.

 source

“We do not inherit the earth from our ancestors, we borrow it from our children”*…

… and the interest rate on that loan is rising.

There’s much discussion of what’s causing the sudden-feeling spike in prices that we’re experiencing: pandemic disruptions, nativist and protectionist policies, the over-taxing of over-optimized supply chains, and others. But Robinson Meyer argues that there’s another issue, an underlying cause, that’s not getting the attention it deserves… one that will likely be even harder to address…

Over the past year, U.S. consumer prices have risen 7 percent, their fastest rate in nearly four decades, frustrating households and tanking President Joe Biden’s approval rating. And no wonder. High inflation corrodes the basic machinery of the economy, unsettling consumers, troubling companies, and preventing everyone from making sturdy plans for the future…

For years, scientists and economists have warned that climate change could cause massive shortages of major commodities, such as wine, chocolate, and cereals. Financial regulators have cautioned against a “disorderly transition,” in which the world commits only haphazardly to leaving fossil fuels, so it does not invest enough in their zero-carbon replacements. In an economy as prosperous and powerful as America’s, those problems are likely to show up—at least at first—not as empty grocery shelves or bankrupt gas stations but as price increases.

That phenomenon, long hypothesized, may be starting to actually arrive. Over the past year, unprecedented weather disasters have caused the price of key commodities to spike, and a volatile oil-and-gas market has allowed Russia and Saudi Arabia to exert geopolitical force.

“This climate-change risk to the supply chain—it’s actually real. It is happening now,” Mohamed Kande, the U.S. and global advisory leader at the accounting firm PwC, told me.

How to respond to these problems? The U.S. government has one tool to slow down the great chase of inflation: Leash up its dollars. By raising the rate at which the federal government lends money to banks, the Federal Reserve makes it more expensive for businesses or consumers to take out loans themselves. This brings demand in the economy more in line with supply. It is like the king in our thought experiment deciding to buy back some of his gold coins.

But wait—is it always appropriate to focus on dollars? What if the problem was caused by too few goods? Worse, what if the economy lost the ability to produce goods over time, throwing off the dollars-to-goods ratio? Then what was once an adequate number of dollars will, through no fault of its own, become too many...

… if the climate scars on supply continue to grow, does the Federal Reserve have the right tools to manage? Stinson Dean, the lumber trader, is doubtful. “Raising interest rates will blunt demand for housing—no doubt. But if you blunt demand enough to bring lumber prices down, you’re destroying the economy,” Dean told me. “For us to have lower lumber prices, we can only build a million homes a year. Do you really want to do that?

“Raising rates,” he said, “doesn’t grow more trees.” Nor does it grow more coffee, end a drought, or bring certainty to the energy transition. And if our new era of climate-driven inflation takes hold, America will need more than higher interest rates to bring balance to supply and demand.

A provocative look at the tangled roots of our inflation, suggesting that “The World Isn’t Ready for Climate-Change-Driven Inflation,” from @yayitsrob in @TheAtlantic. Eminently worth reading in full. Via @sentiers.

* Native American proverb

###

As we dig deeper, we might send carefully calculated birthday greetings to Frank Plumpton Ramsey; he was born on this date in 1903. A philosopher, mathematician, and economist, he made major contributions to all three fields before his death in 1930, at the age of 26.

While he is probably best remembered as a mathematician and logician and as Wittgenstein’s friend and translator, he wrote three papers in economics: on subjective probability and utility (a response to Keynes, 1926), on optimal taxation (1927, described by Joseph E. Stiglitz as “a landmark in the economics of public finance”), and on optimal economic growth (1928, hailed by Keynes as “one of the most remarkable contributions to mathematical economics ever made”). The economist Paul Samuelson described them in 1970 as “three great legacies – legacies that were for the most part mere by-products of his major interest in the foundations of mathematics and knowledge.”

For more on Ramsey and his thought, see “One of the Great Intellects of His Time,” “The Man Who Thought Too Fast,” and Ramsey’s entry in the Stanford Encyclopedia of Philosophy.

source

“No structure, even an artificial one, enjoys the process of entropy. It is the ultimate fate of everything, and everything resists it.”*…

A 19th-century thought experiment that motivates physicists– and information scientists– still…

The universe bets on disorder. Imagine, for example, dropping a thimbleful of red dye into a swimming pool. All of those dye molecules are going to slowly spread throughout the water.

Physicists quantify this tendency to spread by counting the number of possible ways the dye molecules can be arranged. There’s one possible state where the molecules are crowded into the thimble. There’s another where, say, the molecules settle in a tidy clump at the pool’s bottom. But there are uncountable billions of permutations where the molecules spread out in different ways throughout the water. If the universe chooses from all the possible states at random, you can bet that it’s going to end up with one of the vast set of disordered possibilities.
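That counting argument can be made concrete with a toy model (mine, not the article’s; the numbers are invented): drop N distinguishable “dye molecules” independently into M equal cells of the pool and count arrangements.

```python
import math

N, M = 20, 1000                    # 20 molecules, 1000 cells
total = M ** N                     # every arrangement, counted equally
crowded = 1                        # all 20 packed into one fixed cell
print(crowded / total)             # ≈ 1e-60 — vanishingly unlikely

# Splitting the pool into two halves: ways to put exactly half the
# molecules on each side, out of 2^N equally likely assignments.
even_split = math.comb(N, N // 2)
print(even_split / 2 ** N)         # ≈ 0.176 — near-even splits dominate
```

With just 20 molecules the crowded state is already a one-in-10⁶⁰ fluke; with the ~10²³ molecules of a real thimbleful, “the universe bets on disorder” stops being a metaphor.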

Seen in this way, the inexorable rise in entropy, or disorder, as quantified by the second law of thermodynamics, takes on an almost mathematical certainty. So of course physicists are constantly trying to break it.

One almost did. A thought experiment devised by the Scottish physicist James Clerk Maxwell in 1867 stumped scientists for 115 years. And even after a solution was found, physicists have continued to use “Maxwell’s demon” to push the laws of the universe to their limits…

A thorny thought experiment has been turned into a real experiment—one that physicists use to probe the physics of information: “How Maxwell’s Demon Continues to Startle Scientists,” from Jonathan O’Callaghan (@Astro_Jonny).

* Philip K. Dick

###

As we reconsider the random, we might send carefully calculated birthday greetings to Félix Édouard Justin Émile Borel; he was born on this date in 1871. A mathematician (and politician, who served as French Minister of the Navy), he is remembered for his foundational work in measure theory and probability. He published a number of research papers on game theory and was the first to define games of strategy.

But Borel may be best remembered for a thought experiment he introduced in one of his books, proposing that a monkey hitting keys at random on a typewriter keyboard will – with absolute certainty – eventually type every book in the Bibliothèque Nationale de France. This is now popularly known as the infinite monkey theorem.
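The arithmetic behind the theorem is easy to sketch (the targets and keystroke budget below are my own, for illustration): the chance that random typing ever produces a given string falls off exponentially in the string’s length.

```python
import math

def p_appears(target, keystrokes, alphabet=27):
    """P(random typing of given length ever starts the target string),
    treating starting positions as independent — a fine approximation
    for rare targets."""
    p = alphabet ** -len(target)                 # chance at any one position
    # 1 - (1 - p)^keystrokes, computed stably for tiny p:
    return -math.expm1(keystrokes * math.log1p(-p))

print(p_appears("hamlet", 10 ** 9))              # ≈ 0.92 — six letters, likely
print(p_appears("to be or not to be", 10 ** 9))  # ≈ 2e-17 — effectively never
```

A billion keystrokes will probably yield a six-letter word, yet an eighteen-character phrase is already hopeless — which is why the certainty Borel invoked requires *infinite* typing, not merely a lot of it.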

source

“Everything we care about lies somewhere in the middle, where pattern and randomness interlace”*…

True randomness (it’s lumpy)

We tend dramatically to underestimate the role of randomness in the world…

Arkansas was one out away from the 2018 College World Series championship, leading Oregon State in the series and 3-2 in the ninth inning of the game when Cadyn Grenier lofted a foul pop down the right-field line. Three Razorbacks converged on the ball and were in position to make a routine play on it, only to watch it fall untouched to the ground in the midst of them. Had any one of them made the play, Arkansas would have been the national champion.

Nobody did.

Given “another lifeline,” Grenier hit an RBI single to tie the game before Trevor Larnach launched a two-run homer to give the Beavers a 5-3 lead and, ultimately, the game. “As soon as you see the ball drop, you know you have another life,” Grenier said. “That’s a gift.” The Beavers accepted the gift eagerly and went on to win the championship the next day as Oregon State rode freshman pitcher Kevin Abel to a 5-0 win over Arkansas in the deciding game of the series. Abel threw a complete game shutout and retired the last 20 hitters he faced.

The highly unlikely happens pretty much all the time…

We readily – routinely – underestimate the power and impact of randomness in and on our lives. In his book, The Drunkard’s Walk, Caltech physicist Leonard Mlodinow employs the idea of the “drunkard’s [random] walk” to compare “the paths molecules follow as they fly through space, incessantly bumping, and being bumped by, their sister molecules,” with “our lives, our paths from college to career, from single life to family life, from first hole of golf to eighteenth.” 

Although countless random interactions seem to cancel one another out within large data sets, sometimes, “when pure luck occasionally leads to a lopsided preponderance of hits from some particular direction…a noticeable jiggle occurs.” When that happens, we notice the unlikely directional jiggle and build a carefully concocted story around it while ignoring the many, many random, counteracting collisions.

As Tversky and Kahneman have explained, “Chance is commonly viewed as a self-correcting process in which a deviation in one direction induces a deviation in the opposite direction to restore the equilibrium. In fact, deviations are not ‘corrected’ as a chance process unfolds, they are merely diluted.”
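Dilution, not correction, is easy to see in a quick simulation (mine, with an arbitrary seed): as coin flips accumulate, the *proportion* of heads converges toward one half even while the *absolute* gap between heads and tails tends to drift wider.

```python
import random

random.seed(42)
flips = [random.choice((0, 1)) for _ in range(1_000_000)]  # 1 = heads

for n in (100, 10_000, 1_000_000):
    heads = sum(flips[:n])
    gap = abs(2 * heads - n)            # |heads - tails| so far
    print(f"{n:>9} flips: proportion {heads / n:.4f}, gap {gap}")
```

The early surplus is never paid back by a compensating surplus of tails; it is simply swamped — which is all the law of large numbers ever promised.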

As Stephen Jay Gould famously argued, were we able to recreate the experiment of life on Earth a million different times, nothing would ever be the same, because evolution relies upon randomness. Indeed, the essence of history is contingency.

Randomness rules.

Luck matters. A lot. Yet, we tend dramatically to underestimate the role of randomness in the world.

The self-serving bias is our tendency to see the good stuff that happens as our doing (“we worked really hard and executed the game plan well”) while the bad stuff isn’t our fault (“It just wasn’t our night” or “we simply couldn’t catch a break” or “we would have won if the umpiring hadn’t been so awful”). Thus, desirable results are typically due to our skill and hard work — not luck — while lousy results are outside of our control and the offspring of being unlucky.

Two fine books undermine this outlook by (rightly) attributing a surprising amount of what happens to us — both good and bad — to luck. Michael Mauboussin’s The Success Equation seeks to untangle elements of luck and skill in sports, investing, and business. Ed Smith’s Luck considers a number of fields – international finance, war, sports, and even his own marriage – to examine how random chance influences the world around us. For example, Mauboussin describes the “paradox of skill” as follows: “As skill improves, performance becomes more consistent, and therefore luck becomes more important.” In investing, therefore (and for example), as the population of skilled investors has increased, the variation in skill has narrowed, making luck increasingly important to outcomes.
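The paradox of skill can be illustrated with a toy model (my construction, not Mauboussin’s): let each contest outcome be skill plus luck, and watch how often the less skilled player wins as the skill spread narrows.

```python
import random

random.seed(0)

def upset_rate(skill_spread, luck_spread=1.0, trials=100_000):
    """Fraction of head-to-head contests won by the *less* skilled player."""
    upsets = 0
    for _ in range(trials):
        s1 = random.gauss(0, skill_spread)
        s2 = random.gauss(0, skill_spread)
        o1 = s1 + random.gauss(0, luck_spread)   # outcome = skill + luck
        o2 = s2 + random.gauss(0, luck_spread)
        winner = s1 if o1 > o2 else s2
        upsets += (winner != max(s1, s2))
    return upsets / trials

print(upset_rate(skill_spread=3.0))   # wide skill gap: upsets rare (~0.10)
print(upset_rate(skill_spread=0.3))   # narrow skill gap: near coin flip (~0.41)
```

Holding luck constant while compressing skill differences pushes results toward a coin flip — exactly the dynamic Mauboussin describes in modern investing.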

On account of the growth and development of the investment industry, John Bogle could quite consistently write his senior thesis at Princeton on the successes of active fund management and then go on to found Vanguard and become the primary developer and intellectual forefather of indexing. In other words, the ever-increasing aggregate skill (supplemented by massive computing power) of the investment world has come largely to cancel itself out.

After a big or revolutionary event, we tend to see it as having been inevitable. Such is the narrative fallacy. In this paper, ESSEC Business School’s Stoyan Sgourev notes that scholars of innovation typically focus upon the usual type of case, where incremental improvements rule the day. Sgourev moves past the typical to look at the unusual type of case, where there is a radical leap forward (equivalent to Thomas Kuhn’s paradigm shifts in science), as with Picasso and Les Demoiselles d’Avignon.

As Sgourev carefully argued, the Paris art market of Picasso’s time had recently become receptive to the commercial possibilities of risk-taking. Thus, artistic innovation was becoming commercially viable. Breaking with the past was then being encouraged for the first time. It would soon be demanded.

Most significantly for our purposes, Sgourev’s analysis of Cubism suggests that having an exceptional idea isn’t enough. For radical innovation really to take hold, market conditions have to be right, making its success a function of luck and timing as much as genius. Note that Van Gogh — no less a genius than Picasso — never sold a painting in his lifetime.

As noted above, we all like to think that our successes are earned and that only our failures are due to luck – bad luck. But the old expression – it’s better to be lucky than good – is at least partly true. That said, it’s best to be lucky *and* good. As a consequence, in all probabilistic fields (which is nearly all of them), the best performers dwell on process and diversify their bets. You should do the same…

As [Nate] Silver emphasizes in The Signal and the Noise, we readily overestimate the degree of predictability in complex systems [and t]he experts we see in the media are much too sure of themselves (I wrote about this problem in our industry from a slightly different angle…). Much of what we attribute to skill is actually luck.

Plan accordingly.

Taking the unaccountable into account: “Randomness Rules,” from Bob Seawright (@RPSeawright), via @JVLast

[image above: source]

* James Gleick, The Information: A History, a Theory, a Flood

###

As we contemplate chance, we might spare a thought for Oskar Morgenstern; he died on this date in 1977. An economist who fled the Nazi annexation of Austria for Princeton, he collaborated with the mathematician John von Neumann to write Theory of Games and Economic Behavior, published in 1944, which is recognized as the first book on game theory— thus co-founding the field.

Game theory was developed extensively in the 1950s, and has become widely recognized as an important tool in many fields– perhaps especially in the study of evolution. Eleven game theorists have won the economics Nobel Prize, and John Maynard Smith was awarded the Crafoord Prize for his application of evolutionary game theory.

Game theory’s roots date back (at least) to the 1654 letters between Pascal and Fermat, which (along with work by Cardano and Huygens) marked the beginning of probability theory. (See Peter Bernstein’s marvelous Against the Gods.) The application of probability (Bayes’ rule, discrete and continuous random variables, and the computation of expectations) accounts for the utility of game theory; the role of randomness (along with the behavioral psychology of a game’s participants) explains why it’s not a perfect predictor.
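Two of those probabilistic tools in miniature (the examples and numbers are mine, chosen for illustration): an expectation computed over a discrete random variable, and a Bayes’-rule update of the kind a game player makes about an opponent.

```python
from fractions import Fraction

# 1. Expectation: a die roll that pays its face value is worth 3.5 on average,
#    so 3.5 is the fair price to play.
ev = sum(Fraction(face, 6) for face in range(1, 7))
print(ev)                     # 7/2

# 2. Bayes' rule: suppose an opponent bluffs 20% of the time; a bluff shows a
#    "tell" 60% of the time, an honest bet only 10%. Given the tell, P(bluff)?
p_bluff = Fraction(1, 5)
p_tell_given_bluff = Fraction(3, 5)
p_tell_given_honest = Fraction(1, 10)
p_tell = p_bluff * p_tell_given_bluff + (1 - p_bluff) * p_tell_given_honest
p_bluff_given_tell = p_bluff * p_tell_given_bluff / p_tell
print(p_bluff_given_tell)     # 3/5
```

The tell triples the odds of a bluff — a clean update — yet the opponent may still be honest two times in five, which is the residue of randomness (and psychology) that keeps game theory from being a perfect predictor.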

source

Written by (Roughly) Daily

July 26, 2021 at 1:00 am
