(Roughly) Daily

Posts Tagged ‘Probability’

“Never tell me the odds!”*…

How likely is it that one will be born on a Leap Day? That one will find a pearl in an oyster? That one will solve Wordle on the first guess? That one will die in a tornado? That two people will share the same fingerprint?

The good folks at R74n (@r74n.com) have these probabilities– and so many more: “What Are The Odds?”

(Image above– and tutorial on the odds ratio: source)

* Han Solo (Harrison Ford) in Star Wars: Episode V– The Empire Strikes Back

###

As we place our bets, we might spare a thought for Harvey Kurtzman; he died on this date in 1993. A cartoonist and editor, he is best known for writing and editing the parodic comic book Mad from 1952 until 1956. Kurtzman scripted every story in the first twenty-three issues. (The New York Times‘ obituary for Kurtzman in 1993, alluding to the role of publisher William Gaines, said Kurtzman had “helped found Mad Magazine.” This prompted an angry response to the newspaper from Art Spiegelman, who complained that awarding Kurtzman partial credit for starting Mad was “like saying Michelangelo helped paint the Sistine Chapel just because some Pope owned the ceiling.”)

Kurtzman, who mentored many younger cartoonists (including Terry Gilliam and Robert Crumb), is considered, with cartoonists like Will Eisner, Jack Kirby, and Carl Barks, one of the defining creators of the Golden Age of American comic books. The prestigious Harvey Awards (for achievement in comic books) are named in his honor.

source

source

Written by (Roughly) Daily

February 21, 2026 at 1:00 am

“Patience’s design flaw became obvious for the first time in my life: the outcome is decided not during the course of play but when the cards are shuffled, before the game even begins. How pointless is that?”*…

A young girl sitting on the floor playing cards, with a glass on the side, surrounded by various cards spread out on a rug, and a fireplace in the background.

As Simone de Rochefort explains, Patience– or, as we tend to know it, solitaire– illustrates the way in which some of humanity’s oldest toys are our most complex…

… last year, I got addicted to Solitaire.

Why me.

During the dark final days of 2024, I was averaging 12 wins per day in Sawayama Solitaire, one of the Solitaires created by developer Zachtronics. Sawayama Solitaire is a variant of Klondike — the one that’s been bundled into every version of Windows since 1990.

Some games of Sawayama Solitaire felt impossible. Some were absurdly easy. Most of them were a satisfying detangling of cards that had me immediately pressing that “new game” button once I got the win.

How was the most basic card game on Earth owning my life like this?

I think it’s because we don’t understand playing cards.

In 1969, as protests raged against the Vietnam War and counterculture made waves across the nation, a magician [and dear friend of Ricky Jay] named Persi Diaconis went to college.

Diaconis had been a professional magician since age 14, and was skilled in sleight-of-hand tricks. But it was probability that fascinated him.

He went on to take a degree in statistics. He became a world-renowned mathematician. In 1992, he proved that it takes seven riffle shuffles to truly randomize a 52-card deck, alongside fellow mathematician Dave Bayer. His research on card shuffling has implications for scientific fields as far-flung as the study of glass melting and the creation of magnets.
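The model Bayer and Diaconis analyzed is the Gilbert–Shannon–Reeds riffle shuffle: cut the deck at a binomially distributed point, then interleave the two packets, dropping the next card from a packet with probability proportional to its remaining size. Here is a minimal Python sketch of that model (the function name and structure are mine, not from their paper):

```python
import random

def riffle(deck, rng):
    """One Gilbert-Shannon-Reeds shuffle: binomial cut, then riffle interleave."""
    n = len(deck)
    cut = sum(rng.random() < 0.5 for _ in range(n))   # cut point ~ Binomial(n, 1/2)
    left, right = list(deck[:cut]), list(deck[cut:])
    out = []
    while left or right:
        # drop the next card from a packet with probability proportional to its size
        if rng.random() < len(left) / (len(left) + len(right)):
            out.append(left.pop(0))
        else:
            out.append(right.pop(0))
    return out

rng = random.Random(0)
deck = list(range(52))
for _ in range(7):        # seven shuffles: the Bayer-Diaconis threshold
    deck = riffle(deck, rng)
print(deck[:10])          # top of a (now well-mixed) deck
```

Their result is that fewer than about seven such shuffles leave detectable order, while further shuffles yield rapidly diminishing returns.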

He doesn’t know how Solitaire works.

“One of the embarrassments of applied probability is that we cannot analyze the original game of solitaire,” he wrote in the abstract for an academic talk called “The Mathematics of Solitaire,” given at the University of Washington in 1999. The talk has been given several times over the years, and is currently viewable on YouTube. One of his most recent appearances, in 2024, reiterates that despite all the technical advances we’ve made in science and mathematics, the complexity of cards is still somewhat a black box.

“What’s the chance of winning, how to play well, how do various changes of rules change the answers?” Diaconis wrote. “Surely you say, the computer can do this. Not at present, not even close.”

It’s not hard to see the relationship between magic and math. Cards contain limitless possibilities. In fact, math tells us there are more combinations of cards in a 52-card deck than there are atoms on Earth.

Writing for Quanta Magazine, Erica Klarreich asked mathematician Ron Graham what that means in practice. He told her, “If everyone had been shuffling decks of cards every second since the start of the Earth, you couldn’t touch 52 factorial,” the number of possible arrangements of a 52-card deck. Klarreich goes on: “Any time you shuffle a deck to the point of randomness, you have probably created an arrangement that has never existed before.”
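Graham’s claim is easy to check yourself. The atoms-in-the-Earth figure of roughly 1.33 × 10⁵⁰ is a commonly cited ballpark estimate (not a number from the article), and 52! dwarfs it:

```python
import math

deck_orderings = math.factorial(52)       # 52! = number of orderings of a deck
atoms_in_earth = 1.33e50                  # commonly cited rough estimate

print(f"{deck_orderings:.3e}")            # about 8.066e+67
print(deck_orderings > atoms_in_earth)    # True
```

52! is a 68-digit number; even a billion shufflers producing a billion novel orderings per second would need on the order of 10⁵⁰ years to exhaust it.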

So that’s nuts…

More amazement at “No one understands how playing cards work,” from @polygon.com.


* David Mitchell, Cloud Atlas

###

As we shuffle along, we might spare a thought for Christiaan Huygens; he died on this date in 1695. A mathematician, physicist, engineer, astronomer, and inventor, he was a key figure in the Scientific Revolution. In physics, Huygens made seminal contributions to optics and mechanics, while as an astronomer he studied the rings of Saturn and discovered its largest moon, Titan. As an engineer and inventor, he improved the design of telescopes and invented the pendulum clock, the most accurate timekeeper for almost 300 years. A talented mathematician and physicist, his works contain the first idealization of a physical problem by a set of mathematical parameters, and the first mathematical and mechanistic explanation of an unobservable physical phenomenon.

Relevantly to the piece above, Huygens also contributed to the development of probability theory and statistics. In 1655 he visited Paris and encountered the work of Fermat and Pascal, which led him to write what was, at the time, the most coherent presentation of a mathematical approach to games of chance in De Ratiociniis in Ludo Aleae (On reasoning in games of chance)– a work that contains early game-theoretic ideas.

Portrait of Christiaan Huygens, a 17th-century mathematician and physicist, featuring curly hair and wearing an ornate robe with a decorative collar.

source

“The number 2 is a very dangerous number: that is why the dialectic is a dangerous process”*…

In order to bridge the yawning gulf between the humanities and the sciences, Gordon Gillespie suggests, we must turn to an unexpected field: mathematics…

In 1959, the English writer and physicist C P Snow delivered the esteemed Rede Lecture at the University of Cambridge [a talk now known as “The Two Cultures,” see here]. Regaled with champagne and Marmite sandwiches, the audience had no idea that they were about to be read the riot act. Snow diagnosed a rift of mutual ignorance in the intellectual world of the West. On the one hand were the ‘literary intellectuals’ (of the humanities) and on the other the (natural) ‘scientists’: the much-discussed ‘two cultures’. Snow substantiated his diagnosis with anecdotes of respected literary intellectuals who complained about the illiteracy of the scientists but who themselves had never heard of such a fundamental statement as the second law of thermodynamics. And he told of brilliant scientific minds who might know a lot about the second law but were barely up to the task of reading Charles Dickens, let alone an ‘esoteric, tangled and dubiously rewarding writer … like Rainer Maria Rilke.’

Sixty-plus years after Snow’s diatribe, the rift has hardly narrowed. Off the record, most natural scientists still consider the humanities to be a pseudo-science that lacks elementary epistemic standards. In a 2016 talk, the renowned theoretical physicist Carlo Rovelli lamented ‘the current anti-philosophical ideology’. And he quoted eminent colleagues such as the Nobel laureate Steven Weinberg, Stephen Hawking and Neil deGrasse Tyson, who agreed that ‘philosophy is dead’ and that only the natural sciences could explain how the world works, not ‘what you can deduce from your armchair’. Meanwhile, many humanities scholars see scientists as pedantic surveyors of nature, who may produce practical and useful results, but are blind to the truly deep insights about the workings of the (cultural) world. In his best-selling book The Fate of Rome (2017), Kyle Harper convincingly showed that a changing climate and diseases were major factors contributing to the final fall of the Roman Empire. The majority of Harper’s fellow historians had simply neglected such factors up to then; they had instead focused solely on the cultural, political and socioeconomic ones…

The divide between the two cultures is not just an academic affair. It is, more importantly, about two opposing views on the fundamental connection between mind and nature. According to one view, nature is governed by an all-encompassing system of laws. This image underlies the explanatory paradigm of causal determination by elementary forces. As physics became the leading science in the 19th century, the causal paradigm was more and more seen as the universal form of explanation. Nothing real fell outside its purview. According to this view, every phenomenon can be explained by a more or less complex causal chain (or web), the links of which can, in turn, be traced back, in principle, to basic natural forces. Anything – including any aspect of the human mind – that eludes this explanatory paradigm is simply not part of the real world, just like the ‘omens’ of superstition or the ‘astral projections’ of astrology.

On the opposing view, the human mind – be it that of individuals or collectives – can very well be regarded separately from its physical foundations. Of course, it is conceded that the mind cannot work without the brain, so it is not entirely independent of natural forces and their dynamics. But events of cultural significance can be explained as effects of very different kinds of causes, namely psychological and social, that operate in a sphere quite separate from that of the natural forces.

These divergent understandings underpin the worldviews of each culture. Naive realists – primarily natural scientists – like to point out that nature existed long before humankind. Nature is ordered according to laws that operate regardless of whether or not humans are around to observe. So the natural order of the world must be predetermined independently of the human mind. Conversely, naive idealists – including social constructivists, mostly encountered in the humanities – insist that all order is conceptual order, which is based solely on individual or collective thought. As such, order is not only not independent of the human mind, it’s also ambiguous, just as the human mind is ambiguous in its diverse cultural manifestations.

The clash of cultures between the humanities and the natural sciences is reignited over and over because of two images that portray the interrelationship of mind and nature very differently. To achieve peace between the two cultures, we need to overcome both views. We must recognise that the natural and the mental order of things go hand in hand. Neither can be fully understood without the other. And neither can be traced back to the other…

… The best mediator of a conciliatory view that avoids the mistake of the naive realist and the naive idealist is mathematics. Mathematics gives us shining proof that understanding some aspect of the world does not always come down to uncovering some intricate causal web, not even in principle. Determination is not explanation. And mathematics, rightly understood, demonstrates this in a manner that lets us clearly see the mutual dependency of mind and nature.

For mathematical explanations are structural, not causal. Mathematics lets us understand aspects of the world that are just as real as the Northern Lights or people’s behaviour, but are not effects of any causes. The distinction between causal and structural forms of explanation will become clearer in due course. For a start, take this example. Think of a dying father who wants to pass on his one possession, a herd of 17 goats, evenly to his three sons. He can’t do so. This is not the case because some hidden physical or psychological forces hinder any such action. The reason is simply that 17 is a prime number, so not divisible by three…
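The structural fact in the goats example is checkable in a line or two: 3 does not divide 17, and, since 17 is prime, no number between 1 and 17 does. A trivial illustration:

```python
print(17 % 3)                                  # 2: one goat is always left over
print(all(17 % d != 0 for d in range(2, 17)))  # True: 17 is prime
```

No causal story about goats, sons, or wills enters into it; the impossibility is purely arithmetical.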

… In his ‘two cultures’ speech, Snow located mathematics clearly in the camp of the sciences. But… mathematics doesn’t adhere to the explanatory paradigm of causal determination. This distinguishes it from the natural sciences. Nevertheless, mathematics tells us a lot about nature. According to Kant, it does so because it tells us a lot about the human mind. Mind and nature are inseparable facets of the world we inhabit and conceive. So, why should the humanities not also count as a science? They can tell us just as much about that one world on a fundamental level as the natural sciences. Mathematics demonstrates this clearly…

… Mathematics undermines the causal explanatory paradigm not only in its natural scientific manifestations, but also in its uses in the humanities. We give explanations for a wide variety of phenomena by hidden causes way too often and way too fast, where the simple admission to having no explanation would not only be more honest, but also wiser. Wittgenstein spoke of the disease of wanting to explain. This disease shows itself not just in our private everyday exchanges and in the usual public debates, but also in scholarly discourse of the humanities. When confronted with individual or collective human thinking and behaviour, it is tempting to assume just a few underlying factors responsible for the thinking and behaviour. But, more often than not, there really is no such neat, analysable set of factors. Instead, there is a vast number of natural, psychological and societal factors that are all equally relevant for the emergence of the phenomenon one wants to explain. Perhaps a high-end computer could incorporate all these factors in a grand simulation. But a simulation is not an explanation. A simulation allows us to predict, but it doesn’t let us understand.

The aim of the humanities should not be to identify causes for every phenomenon they investigate. The rise and fall of empires, the economic and social ramifications of significant technological innovations, the cultural impact of great works of art are often products of irreducibly complex, chaotic processes. In such cases, trying to mimic the natural sciences by stipulating some major determining factors is a futile and misleading endeavour.

But mathematics shows that beyond the causal chaos there can be order of a different kind. The central limit theorem lets us see and explain a common regularity in a wide range of causally very different, but equally complex, natural processes. With this and many other examples of structural mathematical explanations of phenomena in the realm of the natural sciences in mind, it seems plausible that mathematical, or mathematically inspired, abstraction can also have fruitful applications in the humanities.

This is by no means meant to promote an uncritical imitation of mathematics in the humanities and social sciences. (The overabundance of simplistic econometric models, for instance, is a huge warning sign.) Rather, it is meant to motivate scholars in these fields to reflect more upon where and when causal explanations make sense. Complexity can’t always be reduced to a graspable causal explanation, or narrative. To the contrary, often the most enlightening enquiries are not those that propose new factors as the true explainers, but those that show by meticulous analysis that far more factors are crucially in play than previously thought. This, in turn, should motivate scholars to seek aspects of their subject of interest beyond causality that are both relevant and amenable to structural forms of explanation. Besides probability theory, chaos theoretical methods and game theory come to mind as mathematical sub-disciplines with potentially fruitful applications in this regard.

However, the main point of our discussion is not that mathematical applications in the humanities might bridge the gap between the natural sciences and the humanities. The point is that mathematics, not really belonging to either camp, shows them to be on an equal footing from the start. The natural scientific paradigm of explanation is not the role model any respectable form of enquiry has to follow. Mathematics shows that natural causes can’t explain every phenomenon, not even every natural phenomenon and not even in principle. So, there is no need for the humanities, the ‘sciences of the mind’, to always strive for explanations by causes that can be ‘reduced’ to more elementary, natural forces. Moreover, mathematics shows that causality, of any kind, is not the only possible basis on which any form of explanation ultimately has to stand. Take for example the semantic relationships between many of our utterances. It is not at all clear that these can be explained in terms of psychological causes, or any other causes. It is not unreasonable to believe that the world is irreducibly structured, in part, by semantic relations, just as it is structured by probabilistic relations…

… The divide between the natural sciences and the humanities does not stem from the supposed fact that only those mental phenomena are real that are explainable in natural-scientific terms. Nor is the divide due to some extra-natural mental order, determined by causal relationships of a very different kind than those studied in the natural sciences. The mental world and the physical world are one and the same world, and the respective sciences deal with different aspects of this one world. Properly understood, insofar as they deal with the same phenomena, they do not provide competing but complementary descriptions of these phenomena.

Mathematics provides the most impressive proof that a true understanding of the world goes beyond the discovery of causal relationships – whether they are constituted by natural or cultural forces. It is worth taking a closer look at this proof. For it outlines the bond that connects mind and nature in particularly bright colours. Kant understood this bond as a ‘transcendental’ one. The late Wittgenstein, on the other hand, demonstrated its anchoring in language – not in the sense of a purely verbal and written practice, but in the sense of a comprehensive practice of actions the mental and bodily elements of which cannot be neatly separated. In the words of Wittgenstein, ‘commanding, questioning, recounting, chatting are as much a part of our natural history as walking, eating, drinking, and playing.’

Mathematics too is part of this practice. As such, like every science, it is inseparably rooted in both nature and the human mind. Unlike the other sciences, this dual rootedness is obvious in the case of mathematics. One only has to see where it resides: beyond causality.

Uniting the “Two Cultures”? “Beyond Causality” in @aeon.co.

* C. P. Snow, The Two Cultures and the Scientific Revolution

###

As we come together, we might send carefully calculated birthday greetings to a man with a foot in each culture: Frank Plumpton Ramsey; he was born on this date in 1903. A philosopher, mathematician, and economist, he made major contributions to all three fields before his early death (at the age of 26) in January 1930.

While he is probably best remembered as a mathematician and logician and as Wittgenstein’s friend and translator, he wrote three papers in economics: on subjective probability and utility (a response to Keynes, 1926), on optimal taxation (1927, described by Joseph E. Stiglitz as “a landmark in the economics of public finance”), and on optimal economic growth (1928, hailed by Keynes as “one of the most remarkable contributions to mathematical economics ever made”). The economist Paul Samuelson described them in 1970 as “three great legacies – legacies that were for the most part mere by-products of his major interest in the foundations of mathematics and knowledge.”

For more on Ramsey and his thought, see “One of the Great Intellects of His Time,” “The Man Who Thought Too Fast,” and Ramsey’s entry in the Stanford Encyclopedia of Philosophy.

source

Written by (Roughly) Daily

February 22, 2025 at 1:00 am

“Those who are not shocked when they first come across quantum theory cannot possibly have understood it”*…

Werner Heisenberg, Erwin Schrödinger, and Niels Bohr by Tasnuva Elahi

A scheduling note: your correspondent is headed onto the road for a couple of weeks, so (Roughly) Daily will be a lot more roughly than daily until September 20th or so.

100 years ago, a circle of physicists shook the foundation of science. As Philip Ball explains, it’s still trembling…

In 1926, tensions were running high at the Institute for Theoretical Physics in Copenhagen. The institute was established 10 years earlier by the Danish physicist Niels Bohr, who had shaped it into a hothouse for young collaborators to thrash out a new theory of atoms. In 1925, one of Bohr’s protégés, the brilliant and ambitious German physicist Werner Heisenberg, had produced such a theory. But now everyone was arguing with each other about what it implied for the nature of physical reality itself.

To the Copenhagen group, it appeared reality had come undone…

[Ball tells the story of Niels Bohr’s building on Max Planck, of Werner Heisenberg’s wrangling of Bohr’s thought into theory, of Einstein’s objections and Erwin Schrödinger’s competing theory; then he homes in on the ontological issue at stake…]

Quantum mechanics, they said, demanded we throw away the old reality and replace it with something fuzzier, indistinct, and disturbingly subjective. No longer could scientists suppose that they were objectively probing a pre-existing world. Instead, it seemed that the experimenter’s choices determined what was seen—what, in fact, could be considered real at all.

In other words, the world is not simply sitting there, waiting for us to discover all the facts about it. Heisenberg’s uncertainty principle implied that those facts are determined only once we measure them. If we choose to measure an electron’s speed (more strictly, its momentum) precisely, then this becomes a fact about the world—but at the expense of accepting that there are simply no facts about its position. Or vice versa…

…A century later, scientists are still arguing about this issue of what quantum mechanics means for the nature of reality…

[Ball recounts subsequent attempts to reconcile quantum theory to “reality,” including Schrödinger’s wave mechanics…]

… Schrödinger’s wave mechanics didn’t restore the kind of reality he and Einstein wanted. His theory represented all that could be said about a quantum object in the form of a mathematical expression called the wave function, from which one can predict the outcomes of making measurements on the object. The wave function looks much like a regular wave, like sound waves in air or water waves on the sea. But a wave of what?

At first, Schrödinger supposed that the amplitude of the wave—think of it like the height of a water wave—at a given point in space was a measure of the density of the smeared-out quantum particle there. But Born argued that in fact this amplitude (more precisely, the square of the amplitude) is a measure of the probability that we will find the particle there, if we make a measurement of its position.

This so-called Born rule goes to the heart of what makes quantum mechanics so odd. Classical Newtonian mechanics allows us to calculate the trajectory of an object like a baseball or the moon, so that we can say where it will be at some given time. But Schrödinger’s quantum mechanics doesn’t give us anything equivalent to a trajectory for a quantum particle. Rather, it tells us the chance of getting a particular measurement outcome. It seems to point in the opposite direction of other scientific theories: not toward the entity it describes, but toward our observation of it. What if we don’t make a measurement of the particle at all? Does the wave function still tell us the probability of its being at a given point at a given time? No, it says nothing about that—or more properly, it permits us to say nothing about it. It speaks only to the probabilities of measurement outcomes.
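The Born rule itself is simple arithmetic: take the magnitude of the wave function’s amplitude at each possible outcome, square it, and you have a probability distribution over measurement results. A toy illustration with made-up amplitudes (the numbers are arbitrary, chosen only so they can be normalized; real wave functions are generally complex-valued):

```python
# Toy wave function: amplitudes at five possible positions
psi = [0.1, 0.3, 0.8, 0.5, 0.1]

norm = sum(abs(a) ** 2 for a in psi) ** 0.5
psi = [a / norm for a in psi]            # normalize the wave function

probs = [abs(a) ** 2 for a in psi]       # Born rule: probability = |amplitude|^2
print([round(p, 3) for p in probs])
print(round(sum(probs), 10))             # 1.0: the outcome probabilities sum to one
```

Note what the output is: not a position, but a distribution over positions we might measure, which is exactly the shift in subject matter the article describes.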

Crucially, this means that what we see depends on what and how we measure. There are situations for which quantum mechanics predicts that we will see one outcome if we measure one way, and a different outcome if we measure the same system in a different way. And this is not, as is sometimes implied (this was the cause of Heisenberg’s row with Bohr), because making a measurement disturbs the object in some physical manner, much as we might very slightly disturb the temperature of a solution in a test-tube by sticking a thermometer into it. Rather, it seems to be a fundamental property of nature that the very fact of acquiring information about it induces a change.

If, then, by reality we mean what we can observe of the world (for how can we meaningfully call something real if it can’t be seen, detected, or even inferred in any way?), it is hard to avoid the conclusion that we play an active role in determining what is real– a situation the American physicist John Archibald Wheeler called the “participatory universe”…

… Heisenberg’s “uncertainty” captured that sense of the ground shifting. It was not the ideal word—Heisenberg himself originally used the German Ungenauigkeit, meaning something closer to “inexactness,” as well as Unbestimmtheit, which might be translated as “undeterminedness.” It was not that one was uncertain about the situation of a quantum object, but that there was nothing to be certain about.

There was an even more disconcerting implication behind the uncertainty principle. The vagueness of quantum phenomena, when an electron in an atom might seem to jump from one energy state to another at a time of its own choosing, seemed to indicate the demise of causality itself. Things happened in the quantum world, but one could not necessarily adduce a reason why. In his 1927 paper on the uncertainty principle, Heisenberg challenged the idea that causes in nature lead to predictable effects. That seemed to undermine the very foundation of science, and it made the world seem like a lawless, somewhat arbitrary place….

… One of Bohr’s most provocative views was that there is a fundamental distinction between the fuzzy, probabilistic quantum world and the classical world of real objects in real places, where measurements of, say, an electron with a macroscopic instrument tell us that it is here and not there.

What Bohr meant is shocking. Reality, he implied, doesn’t consist of objects located in time and space. It consists of “quantum events,” which are obliged to be self-consistent (in the sense that quantum mechanics can describe them accurately) but not classically consistent with one another. One implication of this, as far as we can currently tell, is that two observers can see different and conflicting outcomes from an event—yet both can be right.

But this rigid distinction between the quantum and classical worlds can’t be sustained today. Scientists can now conduct experiments that probe size scales in between those where quantum and classical rules are thought to apply—neither microscopic (the atomic scale) nor macroscopic (the human scale), but mesoscopic (an intermediate size). We can look, for example, at the behavior of nanoparticles that can be seen and manipulated yet are small enough to be governed by quantum rules. Such experiments confirm the view that there is no abrupt boundary of quantum and classical. Quantum effects can still be observed at these intermediate scales if our devices are sensitive enough, but those effects can be harder to discern as the number of particles in the system increases.

To understand such experiments, it’s not necessary to adopt any particular interpretation of quantum mechanics, but merely to apply the standard theory—encompassed within Schrödinger’s wave mechanics, say—more expansively than Bohr and colleagues did, using it to explore what happens to a quantum object as it interacts with its surrounding environment. In this way, physicists are starting to understand how information gets out of a quantum system and into its environment, and how, as it does so, the fuzziness of quantum probabilities morphs into the sharpness of classical measurement. Thanks to such work, it is beginning to seem that our familiar world is just what quantum mechanics looks like when you are 6 feet tall.

But even if we manage to complete that project of uniting the quantum with the classical, we might end up none the wiser about what manner of stuff—what kind of reality—it all arises from. Perhaps one day another deeper theory will tell us. Or maybe the Copenhagen group was right a hundred years ago that we just have to accept a contingent, provisional reality: a world only half-formed until we decide how it will be…

Eminently worth reading in full: “When Reality Came Undone,” from @philipcball in @NautilusMag.

See also: When We Cease to Understand the World, by Benjamin Labatut.

* Niels Bohr

###

As we wrestle with reality, we might spare a thought for Ludwig Boltzmann; he died on this date in 1906. A physicist and philosopher, he is best remembered for the development of statistical mechanics, and the statistical explanation of the second law of thermodynamics (which connected entropy and probability).

Boltzmann helped pave the way for quantum theory both with his development of statistical mechanics (which is a pillar of modern physics) and with his 1877 suggestion that the energy levels of a physical system could be discrete.

source

“Chance, too, which seems to rush along with slack reins, is bridled and governed by law”*…

… though that law can sometimes be less than obvious. Erica Klarreich reports on one creative mathematician’s efforts to help us learn…

In late January, Daniel Litt [pictured above] posed an innocent probability puzzle on the social media platform X (formerly known as Twitter) — and set a corner of the Twitterverse on fire.

Imagine, he wrote, that you have an urn filled with 100 balls, some red and some green. You can’t see inside; all you know is that someone determined the number of red balls by picking a number between zero and 100 from a hat. You reach into the urn and pull out a ball. It’s red. If you now pull out a second ball, is it more likely to be red or green (or are the two colors equally likely)?

Of the tens of thousands of people who voted on an answer to Litt’s problem, only about 22% chose correctly. (We’ll reveal the solution below, in case you want to think it over first.) In the months since, Litt, a mathematician at the University of Toronto, has continued to confound Twitter users with a series of probability puzzles about urns and coin tosses.

His posts have prompted lively online discussions among research mathematicians, computer scientists and economists — as well as philosophers, financiers, sports analysts and anonymous fans. Some joked that the puzzles were distracting them from their real work — “actively slowing down economic research,” as one economist put it. Others have posted papers exploring the puzzles’ mathematical ramifications.

Litt’s online project doesn’t just highlight the enduring allure of brainteasers. It also demonstrates the limits of our mathematical intuition, and the counterintuitive nature of probabilistic reasoning. As Litt wrote, there’s “nothing more exhilarating than posing a multiple-choice problem on which 50,000 people do substantially worse than random chance.”…
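If you’d like to test your intuition before clicking through for the solution, the puzzle is easy to simulate. This Monte Carlo sketch (mine, not Litt’s) draws a uniformly random red count, conditions on the first ball being red, and tallies the color of the second:

```python
import random

def second_draw(rng, n=100):
    """One experiment: return the second ball's color, or None if the
    first ball drawn wasn't red (those runs are discarded)."""
    reds = rng.randint(0, n)            # red count chosen uniformly from 0..100
    urn = ["red"] * reds + ["green"] * (n - reds)
    first, second = rng.sample(urn, 2)  # draw two balls without replacement
    return second if first == "red" else None

rng = random.Random(2024)
draws = [second_draw(rng) for _ in range(200_000)]
kept = [d for d in draws if d is not None]
p_red = sum(d == "red" for d in kept) / len(kept)
print(p_red)    # estimate of P(second red | first red)
```

Run it and compare the estimate with your answer; the gap between the simulated figure and most people’s gut guess of one half is the whole point of the puzzle.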

The answer to this puzzle, other puzzles, and Litt on what makes a great puzzle, and why simple probability questions can be so deceptively difficult: “Perplexing the Web, One Probability Puzzle at a Time,” from @EricaKlarreich in @QuantaMagazine.

Vaguely related (but also very interesting): “The Bookmaker,” via @annfriedman, who observes: “Leif Weatherby and Ben Recht on Nate Silver and the addiction to prediction: ‘Silver insists that viewing all decisions through this lens of gambling is the underappreciated characteristic of Very Successful People,’ they write. ‘But what Silver willfully ignores is that the successful players in this world aren’t the bettors. They are the bookies and casino owners—the house that never loses.'”

* Boethius, The Consolation of Philosophy

###

As we contemplate chance, we might send confirmatory birthday greetings to Carl David Anderson; he was born on this date in 1905. An experimental physicist, he shared the 1936 Nobel Prize in Physics for his discovery (that’s to say, confirmation of the existence) of the positron, the first known particle of antimatter… which had been predicted by mathematician and physicist Paul Dirac, whose “Dirac Equation”– in part a product of its author’s application of probability theory– had predicted (among many other features of quantum theory as we know it) the existence of the particle (and antimatter).

Carl David Anderson (source)

Written by (Roughly) Daily

September 3, 2024 at 1:00 am