(Roughly) Daily


“Patience’s design flaw became obvious for the first time in my life: the outcome is decided not during the course of play but when the cards are shuffled, before the game even begins. How pointless is that?”*…

[image: a young girl sitting on the floor playing cards, surrounded by cards spread out on a rug, with a fireplace in the background]

As Simone de Rochefort explains, Patience– or, as we tend to know it, solitaire– illustrates the way in which some of humanity’s oldest toys are our most complex…

… last year, I got addicted to Solitaire.

Why me.

During the dark final days of 2024, I was averaging 12 wins per day in Sawayama Solitaire, one of the Solitaires created by developer Zachtronics. Sawayama Solitaire is a variant of Klondike — the one that’s been bundled into every version of Windows since 1990.

Some games of Sawayama Solitaire felt impossible. Some were absurdly easy. Most of them were a satisfying detangling of cards that had me immediately pressing that “new game” button once I got the win.

How was the most basic card game on Earth owning my life like this?

I think it’s because we don’t understand playing cards.

In 1969, as protests raged against the Vietnam War and counterculture made waves across the nation, a magician [and dear friend of Ricky Jay] named Persi Diaconis went to college.

Diaconis had been a professional magician since age 14, and was skilled in sleight-of-hand tricks. But it was probability that fascinated him.

He went on to take a degree in statistics. He became a world-renowned mathematician. In 1992, together with fellow mathematician Dave Bayer, he proved that it takes seven riffle shuffles to truly randomize a 52-card deck. His research on card shuffling has implications for scientific fields as far-flung as the study of glass melting and the creation of magnets.
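The shuffle Bayer and Diaconis analyzed is usually modeled by the Gilbert-Shannon-Reeds procedure: cut the deck at a binomially distributed point, then drop cards from the two packets with probability proportional to their remaining sizes. Here is a minimal sketch of that model in Python (the function name and the choice to shuffle seven times are mine, for illustration):

```python
import random

def gsr_riffle(deck):
    """One riffle shuffle under the Gilbert-Shannon-Reeds model."""
    n = len(deck)
    # Cut the deck at a Binomial(n, 1/2) position.
    cut = sum(random.random() < 0.5 for _ in range(n))
    left, right = list(deck[:cut]), list(deck[cut:])
    shuffled = []
    # Drop the next card from either packet with probability
    # proportional to how many cards that packet still holds.
    while left or right:
        if random.random() < len(left) / (len(left) + len(right)):
            shuffled.append(left.pop(0))
        else:
            shuffled.append(right.pop(0))
    return shuffled

deck = list(range(52))
for _ in range(7):   # seven riffles, per Bayer and Diaconis
    deck = gsr_riffle(deck)
print(deck)
```

After seven such shuffles the distribution over orderings is close to uniform; after only a few, detectable structure from the original order still remains.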

He doesn’t know how Solitaire works.

“One of the embarrassments of applied probability is that we cannot analyze the original game of solitaire,” he wrote in the abstract for an academic talk called “The Mathematics of Solitaire,” given at the University of Washington in 1999. He has given the talk several times over the years, and it is currently viewable on YouTube. One of his most recent appearances, in 2024, reiterates that despite all the technical advances we’ve made in science and mathematics, the complexity of cards remains something of a black box.

“What’s the chance of winning, how to play well, how do various changes of rules change the answers?” Diaconis wrote. “Surely you say, the computer can do this. Not at present, not even close.”

It’s not hard to see the relationship between magic and math. Cards contain limitless possibilities. In fact, math tells us there are more possible orderings of a 52-card deck than there are atoms on Earth.

Writing for Quanta Magazine, Erica Klarreich asked mathematician Ron Graham what that means in practice. He told her, “If everyone had been shuffling decks of cards every second since the start of the Earth, you couldn’t touch 52 factorial,” the number of possible arrangements of a 52-card deck. Klarreich goes on: “Any time you shuffle a deck to the point of randomness, you have probably created an arrangement that has never existed before.”
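The arithmetic behind Graham’s point is easy to check for yourself. A back-of-the-envelope comparison (the population and Earth-age figures below are round numbers of my own, not from Klarreich’s piece):

```python
import math

arrangements = math.factorial(52)            # about 8.07e67 orderings
print(f"52! = {arrangements:.3e}")

# Suppose 8 billion people had each shuffled once per second
# since the Earth formed, roughly 4.5 billion years ago.
seconds_since_formation = 4.5e9 * 365.25 * 24 * 3600   # ~1.4e17 seconds
total_shuffles = 8e9 * seconds_since_formation          # ~1.1e27 shuffles
print(f"shuffles performed: {total_shuffles:.3e}")
print(f"fraction of 52! reached: {total_shuffles / arrangements:.3e}")  # ~1e-41
```

Even on those generous assumptions, humanity would have sampled only about one arrangement in every 10^41, which is why a well-shuffled deck has almost certainly never been seen before.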

So that’s nuts…

More amazement at “No one understands how playing cards work,” from @polygon.com.

And here:

* David Mitchell, Cloud Atlas

###

As we shuffle along, we might spare a thought for Christiaan Huygens; he died on this date in 1695. A mathematician, physicist, engineer, astronomer, and inventor, he was a key figure in the Scientific Revolution. In physics, Huygens made seminal contributions to optics and mechanics, while as an astronomer he studied the rings of Saturn and discovered its largest moon, Titan. As an engineer and inventor, he improved the design of telescopes and invented the pendulum clock, the most accurate timekeeper for almost 300 years. A talented mathematician and physicist, he produced the first idealization of a physical problem by a set of mathematical parameters and the first mathematical and mechanistic explanation of an unobservable physical phenomenon.

Relevantly to the piece above, Huygens also contributed to the development of probability theory and statistics. In 1655 he visited Paris and encountered the work of Fermat and Pascal, which led him to write what was, at the time, the most coherent presentation of a mathematical approach to games of chance, De Ratiociniis in Ludo Aleae (On reasoning in games of chance)– a work that contains early game-theoretic ideas.

[image: portrait of Christiaan Huygens, a 17th-century mathematician and physicist, with curly hair and an ornate robe with a decorative collar]

source

“Life is a Zen koan, that is, an unsolvable riddle. But the contemplation of that riddle – even though it cannot be solved – is, in itself, transformative.”*…

How hard is it to prove that problems are hard to solve? Meta-complexity theorists have been asking questions like this for decades. And as Ben Brubaker explains, a string of recent results has started to deliver answers…

… Even seasoned researchers find understanding in short supply when they confront the central open question in theoretical computer science, known as the P versus NP problem. In essence, that question asks whether many computational problems long considered extremely difficult can actually be solved easily (via a secret shortcut we haven’t discovered yet), or whether, as most researchers suspect, they truly are hard. At stake is nothing less than the nature of what’s knowable.

Despite decades of effort by researchers in the field of computational complexity theory — the study of such questions about the intrinsic difficulty of different problems — a resolution to the P versus NP question has remained elusive. And it’s not even clear where a would-be proof should start.
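The asymmetry at the heart of P versus NP can be made concrete: checking a proposed answer is quick, while finding one seems to require searching an exponentially large space. A toy illustration using subset-sum, a standard NP-complete problem (the code is a sketch of mine, not anything from Brubaker’s article):

```python
from itertools import combinations

def verify(numbers, subset, target):
    """Checking a claimed answer takes time roughly linear in its size."""
    return all(x in numbers for x in subset) and sum(subset) == target

def search(numbers, target):
    """Finding an answer by brute force may examine up to 2^n subsets."""
    for r in range(len(numbers) + 1):
        for subset in combinations(numbers, r):
            if sum(subset) == target:
                return subset
    return None

numbers = [3, 34, 4, 12, 5, 2]
witness = search(numbers, target=9)                  # exponential in the worst case
print(witness, verify(numbers, witness, target=9))   # the check itself is cheap
```

P versus NP asks, in effect, whether every problem whose answers can be verified this quickly can also be solved without the exhaustive search.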

“There’s no road map,” said Michael Sipser, a veteran complexity theorist at the Massachusetts Institute of Technology who spent years grappling with the problem in the 1980s. “It’s like you’re going into the wilderness.”

It seems that proving that computational problems are hard to solve is itself a hard task. But why is it so hard? And just how hard is it? Marco Carmosino and other researchers in the subfield of meta-complexity reformulate questions like this as computational problems, propelling the field forward by turning the lens of complexity theory back on itself.

“You might think, ‘OK, that’s kind of cool. Maybe the complexity theorists have gone crazy,’” said Rahul Ilango, a graduate student at MIT who has produced some of the most exciting recent results in the field.

By studying these inward-looking questions, researchers have learned that the hardness of proving computational hardness is intimately tied to fundamental questions that may at first seem unrelated. How hard is it to spot hidden patterns in apparently random data? And if truly hard problems do exist, how often are they hard?

“It’s become clear that meta-complexity is close to the heart of things,” said Scott Aaronson, a complexity theorist at the University of Texas, Austin.

This is the story of the long and winding trail that led researchers from the P versus NP problem to meta-complexity. It hasn’t been an easy journey — the path is littered with false turns and roadblocks, and it loops back on itself again and again. Yet for meta-complexity researchers, that journey into an uncharted landscape is its own reward. Start asking seemingly simple questions, said Valentine Kabanets, a complexity theorist at Simon Fraser University in Canada, and “you have no idea where you’re going to go.”…

Complexity theorists are confronting their most puzzling problem yet– complexity theory itself: “Complexity Theory’s 50-Year Journey to the Limits of Knowledge,” from @benbenbrubaker in @QuantaMagazine.

* Tom Robbins

###

As we limn limits, we might send thoroughly cooked birthday greetings to Denis Papin; he was born on this date in 1647. A mathematician and physicist who worked with Christiaan Huygens and Gottfried Leibniz, Papin is better remembered as the inventor of the steam digester, the forerunner of the pressure cooker and of the steam engine.

source

“Everything we care about lies somewhere in the middle, where pattern and randomness interlace”*…

True randomness (it’s lumpy)

We tend dramatically to underestimate the role of randomness in the world…

Arkansas was one out away from the 2018 College World Series championship, leading Oregon State in the series and 3-2 in the ninth inning of the game when Cadyn Grenier lofted a foul pop down the right-field line. Three Razorbacks converged on the ball and were in position to make a routine play on it, only to watch it fall untouched to the ground in the midst of them. Had any one of them made the play, Arkansas would have been the national champion.

Nobody did.

Given “another lifeline,” Grenier hit an RBI single to tie the game before Trevor Larnach launched a two-run homer to give the Beavers a 5-3 lead and, ultimately, the game. “As soon as you see the ball drop, you know you have another life,” Grenier said. “That’s a gift.” The Beavers accepted the gift eagerly and went on to win the championship the next day as Oregon State rode freshman pitcher Kevin Abel to a 5-0 win over Arkansas in the deciding game of the series. Abel threw a complete game shutout and retired the last 20 hitters he faced.

The highly unlikely happens pretty much all the time…

We readily – routinely – underestimate the power and impact of randomness in and on our lives. In his book, The Drunkard’s Walk, Caltech physicist Leonard Mlodinow employs the idea of the “drunkard’s [random] walk” to compare “the paths molecules follow as they fly through space, incessantly bumping, and being bumped by, their sister molecules,” with “our lives, our paths from college to career, from single life to family life, from first hole of golf to eighteenth.” 

Although countless random interactions seem to cancel one another out within large data sets, sometimes, “when pure luck occasionally leads to a lopsided preponderance of hits from some particular direction…a noticeable jiggle occurs.” When that happens, we notice the unlikely directional jiggle and build a carefully concocted story around it while ignoring the many, many random, counteracting collisions.

As Tversky and Kahneman have explained, “Chance is commonly viewed as a self-correcting process in which a deviation in one direction induces a deviation in the opposite direction to restore the equilibrium. In fact, deviations are not ‘corrected’ as a chance process unfolds, they are merely diluted.”
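A quick simulation makes the “diluted, not corrected” point concrete: give heads an early lead and keep flipping a fair coin. On average the absolute lead never shrinks, but the proportion of heads drifts back toward one half. (A minimal sketch; the lead, flip count, and trial count are arbitrary choices of mine.)

```python
import random

def simulate(initial_lead=10, extra_flips=10_000, trials=1_000):
    """Start with `initial_lead` straight heads, then flip a fair coin."""
    avg_lead = avg_prop = 0.0
    for _ in range(trials):
        heads, total = initial_lead, initial_lead
        for _ in range(extra_flips):
            heads += random.random() < 0.5
            total += 1
        avg_lead += (2 * heads - total) / trials   # heads minus tails
        avg_prop += (heads / total) / trials
    return avg_lead, avg_prop

lead, prop = simulate()
print(f"average lead after 10,000 more flips: {lead:+.1f}")   # stays near +10
print(f"average proportion of heads:          {prop:.4f}")    # near 0.5005
```

The early deviation is never paid back; it is simply swamped by everything that comes after.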

As Stephen Jay Gould famously argued, were we able to recreate the experiment of life on Earth a million different times, nothing would ever be the same, because evolution relies upon randomness. Indeed, the essence of history is contingency.

Randomness rules.

Luck matters. A lot. Yet, we tend dramatically to underestimate the role of randomness in the world.

The self-serving bias is our tendency to see the good stuff that happens as our doing (“we worked really hard and executed the game plan well”) while the bad stuff isn’t our fault (“It just wasn’t our night” or “we simply couldn’t catch a break” or “we would have won if the umpiring hadn’t been so awful”). Thus, desirable results are typically due to our skill and hard work — not luck — while lousy results are outside of our control and the offspring of being unlucky.

Two fine books undermine this outlook by (rightly) attributing a surprising amount of what happens to us – both good and bad – to luck. Michael Mauboussin’s The Success Equation seeks to untangle elements of luck and skill in sports, investing, and business. Ed Smith’s Luck considers a number of fields – international finance, war, sports, and even his own marriage – to examine how random chance influences the world around us. For example, Mauboussin describes the “paradox of skill” as follows: “As skill improves, performance becomes more consistent, and therefore luck becomes more important.” In investing, therefore (and for example), as the population of skilled investors has increased, the variation in skill has narrowed, making luck increasingly important to outcomes.
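The paradox of skill is easy to see in a toy model: treat each outcome as skill plus luck, then narrow the spread of skill and watch luck account for more of the result. (A sketch with made-up numbers; the standard deviations are illustrative, not estimates of any real field.)

```python
import random
import statistics

def luck_share(skill_sd, luck_sd=1.0, players=10_000):
    """Rough fraction of outcome variance attributable to luck."""
    skills = [random.gauss(0, skill_sd) for _ in range(players)]
    outcomes = [s + random.gauss(0, luck_sd) for s in skills]
    var_outcomes = statistics.pvariance(outcomes)
    return (var_outcomes - statistics.pvariance(skills)) / var_outcomes

print(f"wide skill gap:   luck explains ~{luck_share(2.0):.0%} of variance")  # ~20%
print(f"narrow skill gap: luck explains ~{luck_share(0.5):.0%} of variance")  # ~80%
```

The amount of luck is the same in both runs; it simply looms larger once the contestants are more evenly matched, which is Mauboussin’s point about modern investing.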

On account of the growth and development of the investment industry, John Bogle could quite consistently write his senior thesis at Princeton on the successes of active fund management and then go on to found Vanguard and become the primary developer and intellectual forefather of indexing. In other words, the ever-increasing aggregate skill (supplemented by massive computing power) of the investment world has come largely to cancel itself out.

After a big or revolutionary event, we tend to see it as having been inevitable. Such is the narrative fallacy. In this paper, ESSEC Business School’s Stoyan Sgourev notes that scholars of innovation typically focus upon the usual type of case, where incremental improvements rule the day. Sgourev moves past the typical to look at the unusual type of case, where there is a radical leap forward (equivalent to Thomas Kuhn’s paradigm shifts in science), as with Picasso and Les Demoiselles d’Avignon.

As Sgourev carefully argued, the Paris art market of Picasso’s time had recently become receptive to the commercial possibilities of risk-taking. Thus, artistic innovation was becoming commercially viable. Breaking with the past was then being encouraged for the first time. It would soon be demanded.

Most significantly for our purposes, Sgourev’s analysis of Cubism suggests that having an exceptional idea isn’t enough. For radical innovation really to take hold, market conditions have to be right, making its success a function of luck and timing as much as genius. Note that Van Gogh — no less a genius than Picasso — never sold a painting in his lifetime.

As noted above, we all like to think that our successes are earned and that only our failures are due to luck – bad luck. But the old expression – it’s better to be lucky than good – is at least partly true. That said, it’s best to be lucky *and* good. As a consequence, in all probabilistic fields (which is nearly all of them), the best performers dwell on process and diversify their bets. You should do the same…

As [Nate] Silver emphasizes in The Signal and the Noise, we readily overestimate the degree of predictability in complex systems [and t]he experts we see in the media are much too sure of themselves (I wrote about this problem in our industry from a slightly different angle…). Much of what we attribute to skill is actually luck.

Plan accordingly.

Taking the unaccountable into account: “Randomness Rules,” from Bob Seawright (@RPSeawright), via @JVLast

[image above: source]

* James Gleick, The Information: A History, a Theory, a Flood

###

As we contemplate chance, we might spare a thought for Oskar Morgenstern; he died on this date in 1977. An economist who fled Nazi Germany for Princeton, he collaborated with the mathematician John von Neumann to write Theory of Games and Economic Behavior, published in 1944, which is recognized as the first book on game theory— thus co-founding the field.

Game theory was developed extensively in the 1950s, and has become widely recognized as an important tool in many fields– perhaps especially in the study of evolution. Eleven game theorists have won the economics Nobel Prize, and John Maynard Smith was awarded the Crafoord Prize for his application of evolutionary game theory.

Game theory’s roots date back (at least) to the 1654 letters between Pascal and Fermat, which (along with work by Cardano and Huygens) marked the beginning of probability theory. (See Peter Bernstein’s marvelous Against the Gods.) The application of probability (Bayes’ rule, discrete and continuous random variables, and the computation of expectations) accounts for the utility of game theory; the role of randomness (along with the behavioral psychology of a game’s participants) explains why it’s not a perfect predictor.
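That “computation of expectations” goes back to exactly the problem Pascal and Fermat debated in those letters: how to split the stake of an interrupted game fairly (the “problem of points,” which Huygens also treated in De Ratiociniis). A minimal sketch of the calculation; the function name and the example numbers are mine:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fair_share(a_needs, b_needs, p=0.5):
    """Player A's fair share of the stake: the probability that A
    wins `a_needs` rounds before B wins `b_needs`, when A takes
    each round with probability p."""
    if a_needs == 0:
        return 1.0
    if b_needs == 0:
        return 0.0
    return (p * fair_share(a_needs - 1, b_needs, p)
            + (1 - p) * fair_share(a_needs, b_needs - 1, p))

# The classic case: A needs 2 more wins, B needs 3, even odds each round.
print(fair_share(2, 3))   # 0.6875, i.e. A is owed 11/16 of the stake
```

The expectation is computed by working backward from the finished positions, which is essentially the argument Pascal, Fermat, and Huygens each arrived at in their own way.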

source

Written by (Roughly) Daily

July 26, 2021 at 1:00 am