(Roughly) Daily


“Everything we care about lies somewhere in the middle, where pattern and randomness interlace”*…

6,144 colors in random order. (source: grotos on Flickr; via O’Reilly Radar)

… A French mathematician has just won the Abel Prize for his decades of work developing a set of tools now widely used for taming random processes…

Random processes take place all around us. It rains one day but not the next; stocks and bonds gain and lose value; traffic jams coalesce and disappear. Because they’re governed by numerous factors that interact with one another in complicated ways, it’s impossible to predict the exact behavior of such systems. Instead, we think about them in terms of probabilities, characterizing outcomes as likely or rare…

… the French probability theorist Michel Talagrand was awarded the Abel Prize, one of the highest honors in mathematics, for developing a deep and sophisticated understanding of such processes. The prize, presented by the king of Norway, is modeled on the Nobel and comes with 7.5 million Norwegian kroner (about $700,000). When he was told he had won, “my mind went blank,” Talagrand said. “The type of mathematics I do was not fashionable at all when I started. It was considered inferior mathematics. The fact that I was given this award is absolute proof this is not the case.”

Other mathematicians agree. Talagrand’s work “changed the way I view the world,” said Assaf Naor of Princeton University. Today, added Helge Holden, the chair of the Abel prize committee, “it is becoming very popular to describe and model real-world events by random processes. Talagrand’s toolbox comes up immediately.”

A random process is a collection of events whose outcomes vary according to chance in a way that can be modeled — like a sequence of coin flips, or the trajectories of atoms in a gas, or daily rainfall totals. Mathematicians want to understand the relationship between individual outcomes and aggregate behavior. How many times do you have to flip a coin to figure out whether it’s fair? Will a river overflow its banks?

Talagrand focused on processes whose outcomes are distributed according to a bell-shaped curve called a Gaussian. Such distributions are common in nature and have a number of desirable mathematical properties. He wanted to know what can be said with certainty about extreme outcomes in these situations. So he proved a set of inequalities that put tight upper and lower bounds on possible outcomes. “To obtain a good inequality is a piece of art,” Holden said. That art is useful: Talagrand’s methods can give an optimal estimate of, say, the highest level a river might rise to in the next 10 years, or the magnitude of the strongest potential earthquake…

Say you want to assess the risk of a river flooding — which will depend on factors like rainfall, wind and temperature. You can model the river’s height as a random process. Talagrand spent 15 years developing a technique called generic chaining that allowed him to create a high-dimensional geometric space related to such a random process. His method “gives you a way to read the maximum from the geometry,” Naor said.
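
Generic chaining itself is well beyond a blog post, but the flavor of the question it answers — bounding the expected maximum of a large collection of Gaussian outcomes — can be sketched in a few lines. The toy model below (365 independent standard-normal “daily river heights”) is an illustrative assumption, not Talagrand’s method; it simply compares a simulated expected maximum with the classical Gaussian bound of sigma times the square root of 2 ln n.

```python
import math
import random

# Toy model (an assumption for illustration): 365 daily river-height
# fluctuations, each an independent standard normal. Generic chaining handles
# correlated, far more general processes; this only shows the kind of
# quantity being bounded.
n_days = 365
n_trials = 2_000

max_heights = []
for _ in range(n_trials):
    heights = [random.gauss(0.0, 1.0) for _ in range(n_days)]
    max_heights.append(max(heights))

simulated_mean_max = sum(max_heights) / n_trials
classical_bound = math.sqrt(2 * math.log(n_days))  # E[max] <= sigma * sqrt(2 ln n)

print(f"simulated E[max] over {n_days} days: {simulated_mean_max:.3f}")
print(f"classical upper bound sqrt(2 ln n):  {classical_bound:.3f}")
```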

The technique is very general and therefore widely applicable. Say you want to analyze a massive, high-dimensional data set that depends on thousands of parameters. To draw a meaningful conclusion, you want to preserve the data set’s most important features while characterizing it in terms of just a few parameters. (For example, this is one way to analyze and compare the complicated structures of different proteins.) Many state-of-the-art methods achieve this simplification by applying a random operation that maps the high-dimensional data to a lower-dimensional space. Mathematicians can use Talagrand’s generic chaining method to determine the maximal amount of error that this process introduces — allowing them to determine the chances that some important feature isn’t preserved in the simplified data set.
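
The “random operation” described here is frequently a random linear projection in the spirit of the Johnson–Lindenstrauss lemma, and the error in question is how much pairwise distances get distorted. Here is a minimal sketch under that assumption, with made-up dimensions (20 points in 1,000 dimensions mapped down to 50); the worst-case distortion it prints is the kind of quantity that chaining-style bounds control.

```python
import math
import random

def random_projection(points, k):
    """Project d-dimensional points to k dimensions with a Gaussian matrix,
    scaled by 1/sqrt(k) so distances are preserved on average."""
    d = len(points[0])
    matrix = [[random.gauss(0.0, 1.0) / math.sqrt(k) for _ in range(d)]
              for _ in range(k)]
    return [[sum(row[i] * p[i] for i in range(d)) for row in matrix]
            for p in points]

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Illustrative sizes only: 20 random points in 1,000 dimensions, mapped to 50.
random.seed(0)
points = [[random.gauss(0.0, 1.0) for _ in range(1000)] for _ in range(20)]
projected = random_projection(points, k=50)

# Worst relative distortion of any pairwise distance after projection.
worst = max(
    abs(dist(projected[i], projected[j]) / dist(points[i], points[j]) - 1.0)
    for i in range(len(points)) for j in range(i + 1, len(points))
)
print(f"worst relative distance distortion: {worst:.2%}")
```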

Talagrand’s work wasn’t just limited to analyzing the best and worst possible outcomes of a random process. He also studied what happens in the average case.

In many processes, random individual events can, in aggregate, lead to highly deterministic outcomes. If measurements are independent, then the totals become very predictable, even if each individual event is impossible to predict. For instance, flip a fair coin. You can’t say anything in advance about what will happen. Flip it 10 times, and you’ll get four, five or six heads — close to the expected value of five heads — about 66% of the time. But flip the coin 1,000 times, and you’ll get between 450 and 550 heads 99.7% of the time, a result that’s even more concentrated around the expected value of 500. “It is exceptionally sharp around the mean,” Holden said.
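
Those figures can be checked directly from the binomial distribution (the exact probability for 1,000 flips comes out a shade above the quoted three-sigma 99.7%). A quick sketch:

```python
from math import comb

def prob_heads_between(n, lo, hi):
    """Probability that n fair-coin flips produce between lo and hi heads."""
    return sum(comb(n, k) for k in range(lo, hi + 1)) / 2 ** n

print(f"10 flips, 4-6 heads:        {prob_heads_between(10, 4, 6):.1%}")      # ~65.6%
print(f"1,000 flips, 450-550 heads: {prob_heads_between(1000, 450, 550):.2%}")  # ~99.86%
```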

“Even though something has so much randomness, the randomness cancels itself out,” Naor said. “What initially seemed like a horrible mess is actually organized.”…

“Michel Talagrand Wins Abel Prize for Work Wrangling Randomness,” from @QuantaMagazine.

* James Gleick, The Information

###

As we comprehend the constructs in chance, we might spare a thought for Caspar Wessel; he died on this date in 1818. A mathematician, he was the first person to describe the geometrical interpretation of complex numbers as points in the complex plane and as vectors.

Not coincidentally, Wessel was also a surveyor and cartographer, who contributed to the Royal Danish Academy of Sciences and Letters’ topographical survey of Denmark.

source

“For what are myths if not the imposing of order on phenomena that do not possess order in themselves? And all myths, however they differ from philosophical systems and scientific theories, share this with them, that they negate the principle of randomness in the world.”*…

And we humans are, as Kit Yates explains, myth-making animals…

Unfortunately, when it comes to understanding random phenomena, our intuition often lets us down. Take a look at the image below. Before you read the caption, see if you can pick out the data set generated using truly uniform random numbers for the coordinates of the dots (i.e., for each point, independent of the others, the horizontal coordinate is equally likely to fall anywhere along the horizontal axis and the vertical coordinate is equally likely to fall anywhere along the vertical).

Three data sets, each with 132 points. One represents the position of the nests of Patagonian seabirds, another the position of ant colony nest sites and the third represents randomly generated coordinates. Can you guess which one is which?

The truly randomly distributed points in the figure are those in the left-most image. The middle image represents the position of ants’ nests that, although distributed with some randomness, demonstrate a tendency to avoid being too close together in order not to overexploit the same resources. The territorial Patagonian seabirds’ nesting sites, in the right-most image, exhibit an even more regular and well-spaced distribution, preferring not to be too near to their neighbors when rearing their young. The computer-generated points, distributed uniformly at random in the left-hand image, have no such qualms about their close proximity.

If you chose the wrong option, you are by no means alone. Most of us tend to think of randomness as being “well spaced.” The tight clustering of dots and the frequent wide gaps of the genuinely random distribution seem to contradict our inherent ideas of what randomness should look like…
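
One way to make that “lumpiness” concrete is to compare nearest-neighbor distances: uniformly random points produce some very tight pairs and some wide gaps, while an evenly spaced arrangement does not. A small sketch, using the same count of 132 points as the figure and a regular grid as a stand-in for well-spaced nest sites:

```python
import math
import random

def nearest_neighbor_distances(points):
    """Distance from each point to its closest neighbor."""
    return [min(math.dist(p, q) for j, q in enumerate(points) if j != i)
            for i, p in enumerate(points)]

random.seed(1)
n = 132

# Uniformly random points in the unit square: each coordinate independent
# and equally likely to fall anywhere in [0, 1].
random_pts = [(random.random(), random.random()) for _ in range(n)]

# A regular 12 x 11 grid (also 132 points) as a stand-in for "well spaced".
grid_pts = [(i / 11, j / 10) for i in range(12) for j in range(11)]

for name, pts in [("uniform random", random_pts), ("regular grid", grid_pts)]:
    d = nearest_neighbor_distances(pts)
    print(f"{name:>15}: closest pair {min(d):.3f}, loneliest point {max(d):.3f}")
```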

… As a case in point, after noticing a disproportionate number of Steely Dan songs playing on his iPod shuffle, journalist Steven Levy questioned Steve Jobs directly about whether “shuffle” was truly random. Jobs assured him that it was and even got an engineer on the phone to confirm it. A follow-up article Levy wrote in Newsweek garnered a huge response from readers having similar experiences, questioning, for example, how two Bob Dylan songs shuffled to play one after the other (from among the thousands of songs in their collections) could possibly be random.

We ascribe meaning too readily to the clustering that randomness produces, and, consequently, we deduce that there is some generative force behind the pattern. We are hardwired to do this. The “evolutionary” argument holds that tens of thousands of years ago, if you were out hunting or gathering in the forest and you heard a rustle in the bushes, you’d be wise to play it safe and to run away as fast as you could. Maybe it was a predator out looking for their lunch and by running away you saved your skin. Probably, it was just the wind randomly whispering in the leaves and you ended up looking a little foolish—foolish, but alive and able to pass on your paranoid pattern-spotting genes to the next generation…

This… is just one example of the phenomenon known in the psychology literature as pareidolia, in which an observer interprets an ambiguous auditory or visual stimulus as something they are familiar with. This phenomenon, otherwise known as “patternicity,” allows people to spot shapes in the clouds and is the reason why people think they see a man in the moon. Pareidolia is itself an example of the more general phenomenon of apophenia, in which people mistakenly perceive connections between and ascribe meaning to unrelated events or objects. Apophenia’s misconstrued connections lead us to validate incorrect hypotheses and draw illogical conclusions. Consequently, the phenomenon lies at the root of many conspiracy theories—think, for example, of extraterrestrial seekers believing that any bright light in the sky is a UFO.

Apophenia sends us looking for the cause behind the effect when, in reality, there is none at all. When we hear two songs by the same artist back-to-back, we are too quick to cry foul in the belief that we have spotted a pattern, when in fact these sorts of clusters are an inherent feature of randomness. Eventually, the dissatisfaction caused by the clustering inherent to the iPod’s genuinely random shuffle algorithm led Steve Jobs to implement the new “Smart Shuffle” feature on the iPod, which meant that the next song played couldn’t be too similar to the previous song, better conforming to our misconceived ideas of what randomness looks like. As Jobs himself quipped, “We’re making it less random to make it feel more random.”…
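
A rough simulation shows why back-to-back songs from the same artist are expected, not suspicious, under a truly random shuffle. The library size and artist mix below are made-up numbers, not Apple’s; with 100 artists of 30 songs each, roughly a fifth of 25-song listening sessions contain at least one same-artist pair in a row.

```python
import random

# Hypothetical library: 100 artists with 30 songs each (3,000 songs total).
library = [artist for artist in range(100) for _ in range(30)]
n_trials = 1_000
trials_with_repeat = 0

for _ in range(n_trials):
    random.shuffle(library)
    # Did any two consecutive songs in the first 25 plays share an artist?
    first_plays = library[:25]
    if any(a == b for a, b in zip(first_plays, first_plays[1:])):
        trials_with_repeat += 1

print(f"25-song sessions with at least one back-to-back same-artist pair: "
      f"{trials_with_repeat / n_trials:.0%}")
```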

“Why Randomness Doesn’t Feel Random,” an excerpt from How to Expect the Unexpected: The Science of Making Predictions—and the Art of Knowing When Not To, by @Kit_Yates_Maths in @behscientist.

* Stanislaw Lem

###

As we ponder purported patterns, we might send carefully-discerned birthday greetings to a man who did in fact find a pattern (or at least a meaning) in what might have seemed random and meaningless: Robert Woodrow Wilson; he was born on this date in 1936. An astronomer, he detected, with Bell Labs colleague Arno Penzias, the cosmic microwave background radiation: the “relic radiation” (that’s to say, the “sound”) of the Big Bang. It is familiar to those of us old enough to remember watching an old-fashioned television after the test pattern was gone, when there was no broadcast signal: part of the “fuzz” we saw and the static we heard was that relic radiation being picked up.

Their 1964 discovery earned them the 1978 Nobel Prize in Physics.

 source

“Everything we care about lies somewhere in the middle, where pattern and randomness interlace”*…

True randomness (it’s lumpy)

We tend dramatically to underestimate the role of randomness in the world…

Arkansas was one out away from the 2018 College World Series championship, leading Oregon State in the series and 3-2 in the ninth inning of the game when Cadyn Grenier lofted a foul pop down the right-field line. Three Razorbacks converged on the ball and were in position to make a routine play on it, only to watch it fall untouched to the ground in the midst of them. Had any one of them made the play, Arkansas would have been the national champion.

Nobody did.

Given “another lifeline,” Grenier hit an RBI single to tie the game before Trevor Larnach launched a two-run homer to give the Beavers a 5-3 lead and, ultimately, the game. “As soon as you see the ball drop, you know you have another life,” Grenier said. “That’s a gift.” The Beavers accepted the gift eagerly and went on to win the championship the next day as Oregon State rode freshman pitcher Kevin Abel to a 5-0 win over Arkansas in the deciding game of the series. Abel threw a complete game shutout and retired the last 20 hitters he faced.

The highly unlikely happens pretty much all the time…

We readily – routinely – underestimate the power and impact of randomness in and on our lives. In his book, The Drunkard’s Walk, Caltech physicist Leonard Mlodinow employs the idea of the “drunkard’s [random] walk” to compare “the paths molecules follow as they fly through space, incessantly bumping, and being bumped by, their sister molecules,” with “our lives, our paths from college to career, from single life to family life, from first hole of golf to eighteenth.” 

Although countless random interactions seem to cancel one another out within large data sets, sometimes, “when pure luck occasionally leads to a lopsided preponderance of hits from some particular direction…a noticeable jiggle occurs.” When that happens, we notice the unlikely directional jiggle and build a carefully concocted story around it while ignoring the many, many random, counteracting collisions.

As Tversky and Kahneman have explained, “Chance is commonly viewed as a self-correcting process in which a deviation in one direction induces a deviation in the opposite direction to restore the equilibrium. In fact, deviations are not ‘corrected’ as a chance process unfolds, they are merely diluted.”
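
A quick simulation makes the distinction concrete: as a fair-coin sequence grows, the raw excess of heads over tails tends to get larger in absolute terms (nothing “corrects” it), while the proportion of heads still drifts toward 50% (the deviation is diluted). A minimal sketch:

```python
import random

random.seed(2)
heads = 0
flips = 0

for checkpoint in [100, 1_000, 10_000, 100_000, 1_000_000]:
    while flips < checkpoint:
        heads += random.random() < 0.5
        flips += 1
    excess = heads - flips / 2      # raw deviation from an even split
    proportion = heads / flips      # relative frequency of heads
    print(f"{flips:>9} flips: excess heads {excess:+8.0f}, proportion {proportion:.4f}")
```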

As Stephen Jay Gould famously argued, were we able to recreate the experiment of life on Earth a million different times, nothing would ever be the same, because evolution relies upon randomness. Indeed, the essence of history is contingency.

Randomness rules.

Luck matters. A lot. Yet, we tend dramatically to underestimate the role of randomness in the world.

The self-serving bias is our tendency to see the good stuff that happens as our doing (“we worked really hard and executed the game plan well”) while the bad stuff isn’t our fault (“It just wasn’t our night” or “we simply couldn’t catch a break” or “we would have won if the umpiring hadn’t been so awful”). Thus, desirable results are typically due to our skill and hard work — not luck — while lousy results are outside of our control and the offspring of being unlucky.

Two fine books undermine this outlook by (rightly) attributing a surprising amount of what happens to us — both good and bad – to luck. Michael Mauboussin’s The Success Equation seeks to untangle elements of luck and skill in sports, investing, and business. Ed Smith’s Luck considers a number of fields – international finance, war, sports, and even his own marriage – to examine how random chance influences the world around us. For example, Mauboussin describes the “paradox of skill” as follows: “As skill improves, performance becomes more consistent, and therefore luck becomes more important.” In investing, therefore (and for example), as the population of skilled investors has increased, the variation in skill has narrowed, making luck increasingly important to outcomes.
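
The paradox of skill can be illustrated with a toy model in which each competitor’s result is skill plus luck: as the spread in skill narrows, luck accounts for a growing share of the variation in outcomes. The numbers below are purely illustrative, not Mauboussin’s.

```python
import random
import statistics

def luck_share(skill_spread, luck_spread=1.0, n=100_000):
    """Fraction of outcome variance attributable to luck when
    outcome = skill + luck, both normally distributed."""
    outcomes, lucks = [], []
    for _ in range(n):
        skill = random.gauss(0.0, skill_spread)
        luck = random.gauss(0.0, luck_spread)
        outcomes.append(skill + luck)
        lucks.append(luck)
    return statistics.variance(lucks) / statistics.variance(outcomes)

# As the field gets uniformly more skilled, skill differences shrink and
# luck explains more and more of who comes out on top.
for spread in [2.0, 1.0, 0.5, 0.25]:
    print(f"skill spread {spread:>4}: luck explains ~{luck_share(spread):.0%} of outcomes")
```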

On account of the growth and development of the investment industry, John Bogle could quite consistently write his senior thesis at Princeton on the successes of active fund management and then go on to found Vanguard and become the primary developer and intellectual forefather of indexing. In other words, the ever-increasing aggregate skill (supplemented by massive computing power) of the investment world has come largely to cancel itself out.

After a big or revolutionary event, we tend to see it as having been inevitable. Such is the narrative fallacy. In this paper, ESSEC Business School’s Stoyan Sgourev notes that scholars of innovation typically focus upon the usual type of case, where incremental improvements rule the day. Sgourev moves past the typical to look at the unusual type of case, where there is a radical leap forward (equivalent to Thomas Kuhn’s paradigm shifts in science), as with Picasso and Les Demoiselles d’Avignon…

As Sgourev carefully argued, the Paris art market of Picasso’s time had recently become receptive to the commercial possibilities of risk-taking. Thus, artistic innovation was becoming commercially viable. Breaking with the past was then being encouraged for the first time. It would soon be demanded.

Most significantly for our purposes, Sgourev’s analysis of Cubism suggests that having an exceptional idea isn’t enough. For radical innovation really to take hold, market conditions have to be right, making its success a function of luck and timing as much as genius. Note that Van Gogh — no less a genius than Picasso — never sold a painting in his lifetime.

As noted above, we all like to think that our successes are earned and that only our failures are due to luck – bad luck. But the old expression – it’s better to be lucky than good – is at least partly true. That said, it’s best to be lucky *and* good. As a consequence, in all probabilistic fields (which is nearly all of them), the best performers dwell on process and diversify their bets. You should do the same…

As [Nate] Silver emphasizes in The Signal and the Noise, we readily overestimate the degree of predictability in complex systems [and t]he experts we see in the media are much too sure of themselves (I wrote about this problem in our industry from a slightly different angle…). Much of what we attribute to skill is actually luck.

Plan accordingly.

Taking the unaccountable into account: “Randomness Rules,” from Bob Seawright (@RPSeawright), via @JVLast

[image above: source]

* James Gleick, The Information: A History, a Theory, a Flood

###

As we contemplate chance, we might spare a thought for Oskar Morgenstern; he died on this date in 1977. An economist who fled the Nazi annexation of Austria for Princeton, he collaborated with the mathematician John von Neumann to write Theory of Games and Economic Behavior, published in 1944, which is recognized as the first book on game theory; the two thus co-founded the field.

Game theory was developed extensively in the 1950s, and has become widely recognized as an important tool in many fields– perhaps especially in the study of evolution. Eleven game theorists have won the economics Nobel Prize, and John Maynard Smith was awarded the Crafoord Prize for his application of evolutionary game theory.

Game theory’s roots date back (at least) to the 1654 letters between Pascal and Fermat, which (along with work by Cardano and Huygens) marked the beginning of probability theory. (See Peter Bernstein’s marvelous Against the Gods.) The application of probability (Bayes’ rule, discrete and continuous random variables, and the computation of expectations) accounts for the utility of game theory; the role of randomness (along with the behavioral psychology of a game’s participants) explains why it’s not a perfect predictor.
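
As a tiny illustration of “the computation of expectations” at work in game theory, here is matching pennies, a minimal zero-sum game: at the 50/50 mixed equilibrium neither player can expect to gain, while a predictable opponent can be exploited. This is a generic textbook sketch, not tied to any of the sources above.

```python
# Matching pennies: the row player wins 1 if the two coins match, loses 1 otherwise.
payoff = {("H", "H"): 1, ("T", "T"): 1, ("H", "T"): -1, ("T", "H"): -1}

def expected_payoff(p_row_heads, p_col_heads):
    """Row player's expected payoff when each side plays heads with the given probability."""
    total = 0.0
    for row, p_r in [("H", p_row_heads), ("T", 1 - p_row_heads)]:
        for col, p_c in [("H", p_col_heads), ("T", 1 - p_col_heads)]:
            total += p_r * p_c * payoff[(row, col)]
    return total

for p_row, p_col in [(0.5, 0.5), (0.7, 0.5), (0.7, 0.7)]:
    print(f"row plays heads {p_row:.0%}, column plays heads {p_col:.0%}: "
          f"expected payoff {expected_payoff(p_row, p_col):+.2f}")
```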

source

Written by (Roughly) Daily

July 26, 2021 at 1:00 am

“In the space between chaos and shape there was another chance”*…

Prince Hamlet spent a lot of time pondering the nature of chance and probability in William Shakespeare’s tragedy. In the famous “To be or not to be” speech, he notes that we helplessly face “the slings and arrows of outrageous fortune” — though a little earlier in the play he declares that “there’s a special providence in the fall of a sparrow,” suggesting that everything happens because God wills it to be so.

We can hardly fault the prince for holding two seemingly contradictory views about the nature of chance; after all, it is a puzzle that has vexed humankind through the ages. Why are we here? Or to give the question a slightly more modern spin, what sequence of events brought us here, and can we imagine a world in which we didn’t arrive on the scene at all?

It is to biologist Sean B. Carroll’s credit that he’s found a way of taking a puzzle that could easily fill volumes (and probably has filled volumes), and presenting it to us in a slim, non-technical, and fun little book, “A Series of Fortunate Events: Chance and the Making of the Planet, Life, and You.”

Carroll (not to be confused with physicist and writer Sean M. Carroll) gets the ball rolling with an introduction to the key concepts in probability and game theory, but quickly moves on to the issue at the heart of the book: the role of chance in evolution. Here we meet a key historical figure, the 20th-century French biochemist Jacques Monod, who won a Nobel Prize for his work on genetics. Monod understood that genetic mutations play a critical role in evolution, and he was struck by the random nature of those mutations…

Carroll quotes Monod: “Pure chance, absolutely free and blind, at the very root of the stupendous edifice of evolution: This central concept of modern biology is no longer one among other possible or even conceivable hypotheses. It is today the sole conceivable hypothesis, the only one that squares with observed and tested fact.”

“There is no scientific concept, in any of the sciences,” Monod concludes, “more destructive of anthropocentrism than this one.”

From there, it’s a short step to the realization that we humans might never have evolved in the first place…


The profound impact of randomness in determining destiny: “The Power of Chance in Shaping Life and Evolution.”

See also: “Survival of the Luckiest.”

* Jeanette Winterson, The World and Other Places

###

As we blow on the dice, we might send carefully-calculated birthday greetings to Gabrielle-Émilie Le Tonnelier de Breteuil, Marquise du Châtelet, the French mathematician and physicist who is probably (if unfairly) better known as Voltaire’s mistress; she was born on this date in 1706.  Fascinated by the work of Newton and Leibniz, she dressed as a man to frequent the cafes where the scientific discussions of the time were held.  Her major work was a translation of Newton’s Principia, for which Voltaire wrote the preface; it was published a decade after her death, and was for many years the only translation of the Principia into French.

Judge me for my own merits, or lack of them, but do not look upon me as a mere appendage to this great general or that great scholar, this star that shines at the court of France or that famed author. I am in my own right a whole person, responsible to myself alone for all that I am, all that I say, all that I do. It may be that there are metaphysicians and philosophers whose learning is greater than mine, although I have not met them. Yet, they are but frail humans, too, and have their faults; so, when I add the sum total of my graces, I confess I am inferior to no one.
– Mme du Châtelet, to Frederick the Great of Prussia

source

“Real randomness requires an infinite amount of information”*…


If you have ever tossed dice, whether in a board game or at the gambling table, you have created random numbers—a string of numbers each of which cannot be predicted from the preceding ones. People have been making random numbers in this way for millennia. Early Greeks and Romans played games of chance by tossing the heel bone of a sheep or other animal and seeing which of its four straight sides landed uppermost. Heel bones evolved into the familiar cube-shaped dice with pips that still provide random numbers for gaming and gambling today.

But now we also have more sophisticated random number generators, the latest of which required a lab full of laser equipment at the U.S. National Institute of Standards and Technology (NIST) in Boulder, CO. It relies on counterintuitive quantum behavior with an assist from relativity theory to make random numbers. This was a notable feat because the NIST team’s numbers were absolutely guaranteed to be random, a result never before achieved.

Why are random numbers worth so much effort? Random numbers are chaotic for a good cause. They are eminently useful, and not only in gambling. Since random digits appear with equal probabilities, like heads and tails in a coin toss, they guarantee fair outcomes in lotteries, such as those to buy high-value government bonds in the United Kingdom. Precisely because they are unpredictable, they provide enhanced security for the internet and for encrypted messages. And in a nod to their gambling roots, random numbers are essential for the picturesquely named “Monte Carlo” method that can solve otherwise intractable scientific problems…
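
The “Monte Carlo” method mentioned here turns random numbers into answers to deterministic questions. The classic toy example estimates pi by scattering random points in a square and counting how many fall inside the inscribed quarter-circle; a minimal sketch:

```python
import random

def estimate_pi(n_samples):
    """Monte Carlo estimate of pi: the fraction of random points in the unit
    square that land inside the quarter-circle of radius 1, times 4."""
    inside = 0
    for _ in range(n_samples):
        x, y = random.random(), random.random()
        if x * x + y * y <= 1.0:
            inside += 1
    return 4 * inside / n_samples

# More samples, better estimate -- the error shrinks like 1/sqrt(n).
for n in (1_000, 100_000, 1_000_000):
    print(f"{n:>9} samples: pi ~ {estimate_pi(n):.4f}")
```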

Using entanglement to generate true mathematical randomness – and why that matters: “The Quantum Random Number Generator.”

* Tristan Perich

###

As we leave it to chance, we might send learned birthday greetings to Athanasius Kircher; he was born on this date in 1602.  A scholar, he published over 40 works, perhaps most notably on comparative religion, geology, and medicine, but over a range so broad that he was frequently compared to Leonardo da Vinci (who died on this date in 1519) and was dubbed “Master of a Hundred Arts.”

For a look at one of his more curious works, see “Wonder is the beginning of wisdom.” And for his take on the plague (through which he lived in Italy in 1656), see here.

source

Written by (Roughly) Daily

May 2, 2020 at 8:44 am