(Roughly) Daily


“Everything we care about lies somewhere in the middle, where pattern and randomness interlace”*…

True randomness (it’s lumpy)

We tend dramatically to underestimate the role of randomness in the world…

Arkansas was one out away from the 2018 College World Series championship, leading Oregon State in the series and 3-2 in the ninth inning of the game when Cadyn Grenier lofted a foul pop down the right-field line. Three Razorbacks converged on the ball and were in position to make a routine play on it, only to watch it fall untouched to the ground in the midst of them. Had any one of them made the play, Arkansas would have been the national champion.

Nobody did.

Given “another lifeline,” Grenier hit an RBI single to tie the game before Trevor Larnach launched a two-run homer to give the Beavers a 5-3 lead and, ultimately, the game. “As soon as you see the ball drop, you know you have another life,” Grenier said. “That’s a gift.” The Beavers accepted the gift eagerly and went on to win the championship the next day as Oregon State rode freshman pitcher Kevin Abel to a 5-0 win over Arkansas in the deciding game of the series. Abel threw a complete game shutout and retired the last 20 hitters he faced.

The highly unlikely happens pretty much all the time…

We readily – routinely – underestimate the power and impact of randomness in and on our lives. In his book, The Drunkard’s Walk, Caltech physicist Leonard Mlodinow employs the idea of the “drunkard’s [random] walk” to compare “the paths molecules follow as they fly through space, incessantly bumping, and being bumped by, their sister molecules,” with “our lives, our paths from college to career, from single life to family life, from first hole of golf to eighteenth.” 

Although countless random interactions seem to cancel each other out within large data sets, sometimes, “when pure luck occasionally leads to a lopsided preponderance of hits from some particular direction…a noticeable jiggle occurs.” When that happens, we notice the unlikely directional jiggle and build a carefully concocted story around it while ignoring the many, many random, counteracting collisions.

As Tversky and Kahneman have explained, “Chance is commonly viewed as a self-correcting process in which a deviation in one direction induces a deviation in the opposite direction to restore the equilibrium. In fact, deviations are not ‘corrected’ as a chance process unfolds, they are merely diluted.”
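Tversky and Kahneman’s point — that deviations are diluted rather than corrected — is easy to see in a quick simulation (a minimal sketch in Python, mine rather than theirs): the raw surplus of heads over tails keeps wandering, while the proportion of heads drifts toward one-half only because the sample keeps growing.

```python
import random

# Simulate fair coin flips and track two numbers:
#   - the raw surplus of heads over tails (chance does NOT pull this back toward zero)
#   - the proportion of heads (the surplus is merely diluted by the growing sample)
random.seed(42)

heads = 0
for n in range(1, 100_001):
    heads += random.random() < 0.5
    if n in (100, 1_000, 10_000, 100_000):
        surplus = heads - n / 2      # deviation in counts: not "corrected"
        share = heads / n            # deviation in proportion: diluted away
        print(f"n={n:>7,}  surplus of heads={surplus:+8.1f}  share of heads={share:.4f}")
```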

As Stephen Jay Gould famously argued, were we able to recreate the experiment of life on Earth a million different times, nothing would ever be the same, because evolution relies upon randomness. Indeed, the essence of history is contingency.

Randomness rules.

Luck matters. A lot. Yet, we tend dramatically to underestimate the role of randomness in the world.

The self-serving bias is our tendency to see the good stuff that happens as our doing (“we worked really hard and executed the game plan well”) while the bad stuff isn’t our fault (“It just wasn’t our night” or “we simply couldn’t catch a break” or “we would have won if the umpiring hadn’t been so awful”). Thus, desirable results are typically due to our skill and hard work — not luck — while lousy results are outside of our control and the offspring of being unlucky.

Two fine books undermine this outlook by (rightly) attributing a surprising amount of what happens to us — both good and bad – to luck. Michael Mauboussin’s The Success Equation seeks to untangle elements of luck and skill in sports, investing, and business. Ed Smith’s Luck considers a number of fields – international finance, war, sports, and even his own marriage – to examine how random chance influences the world around us. For example, Mauboussin describes the “paradox of skill” as follows: “As skill improves, performance becomes more consistent, and therefore luck becomes more important.” In investing, therefore (and for example), as the population of skilled investors has increased, the variation in skill has narrowed, making luck increasingly important to outcomes.
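Mauboussin’s “paradox of skill” can be made concrete with a toy model (a hedged sketch of my own, not from either book): treat each outcome as skill plus luck, then shrink the differences in skill and watch luck take over.

```python
import random
import statistics

# Toy model of the "paradox of skill": outcome = skill + luck.
# As the field gets more uniformly skilled (smaller spread in skill),
# luck accounts for a larger share of the variation in outcomes.
random.seed(0)

def skill_share_of_variance(skill_spread, luck_spread=1.0, n=50_000):
    skills = [random.gauss(0, skill_spread) for _ in range(n)]
    outcomes = [s + random.gauss(0, luck_spread) for s in skills]
    return statistics.variance(skills) / statistics.variance(outcomes)

for spread in (2.0, 1.0, 0.5, 0.25):   # shrinking differences in skill
    share = skill_share_of_variance(spread)
    print(f"skill spread {spread:>4}: skill explains ~{share:.0%} of outcome variance")
```

With a wide spread in skill, results mostly reflect skill; as the spread narrows, the same amount of luck dominates the outcomes — which is Mauboussin’s point about modern markets.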

On account of the growth and development of the investment industry, John Bogle could quite consistently write his senior thesis at Princeton on the successes of active fund management and then go on to found Vanguard and become the primary developer and intellectual forefather of indexing. In other words, the ever-increasing aggregate skill (supplemented by massive computing power) of the investment world has come largely to cancel itself out.

After a big or revolutionary event, we tend to see it as having been inevitable. Such is the narrative fallacy. In this paper, ESSEC Business School’s Stoyan Sgourev notes that scholars of innovation typically focus upon the usual type of case, where incremental improvements rule the day. Sgourev moves past the typical to look at the unusual type of case, where there is a radical leap forward (equivalent to Thomas Kuhn’s paradigm shifts in science), as with Picasso and Les Demoiselles d’Avignon.

As Sgourev carefully argued, the Paris art market of Picasso’s time had recently become receptive to the commercial possibilities of risk-taking. Thus, artistic innovation was becoming commercially viable. Breaking with the past was then being encouraged for the first time. It would soon be demanded.

Most significantly for our purposes, Sgourev’s analysis of Cubism suggests that having an exceptional idea isn’t enough. For radical innovation really to take hold, market conditions have to be right, making its success a function of luck and timing as much as genius. Note that Van Gogh — no less a genius than Picasso — never sold a painting in his lifetime.

As noted above, we all like to think that our successes are earned and that only our failures are due to luck – bad luck. But the old expression – it’s better to be lucky than good – is at least partly true. That said, it’s best to be lucky *and* good. As a consequence, in all probabilistic fields (which is nearly all of them), the best performers dwell on process and diversify their bets. You should do the same…

As [Nate] Silver emphasizes in The Signal and the Noise, we readily overestimate the degree of predictability in complex systems [and t]he experts we see in the media are much too sure of themselves (I wrote about this problem in our industry from a slightly different angle…). Much of what we attribute to skill is actually luck.

Plan accordingly.

Taking the unaccountable into account: “Randomness Rules,” from Bob Seawright (@RPSeawright), via @JVLast

[image above: source]

* James Gleick, The Information: A History, a Theory, a Flood

###

As we contemplate chance, we might spare a thought for Oskar Morgenstern; he died on this date in 1977. An economist who fled Nazi Germany for Princeton, he collaborated with the mathematician John von Neumann to write Theory of Games and Economic Behavior, published in 1944, which is recognized as the first book on game theory— thus co-founding the field.

Game theory was developed extensively in the 1950s, and has become widely recognized as an important tool in many fields– perhaps especially in the study of evolution. Eleven game theorists have won the economics Nobel Prize, and John Maynard Smith was awarded the Crafoord Prize for his application of evolutionary game theory.

Game theory’s roots date back (at least) to the 1654 letters between Pascal and Fermat, which (along with work by Cardano and Huygens) marked the beginning of probability theory. (See Peter Bernstein’s marvelous Against the Gods.) The application of probability (Bayes’ rule, discrete and continuous random variables, and the computation of expectations) accounts for the utility of game theory; the role of randomness (along with the behavioral psychology of a game’s participants) explains why it’s not a perfect predictor.
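To make the “computation of expectations” concrete, here’s a minimal sketch (my illustration, not from the post) of the kind of calculation game theory leans on: the expected payoff in matching pennies when each player mixes between Heads and Tails.

```python
# Expected payoff to the row player in matching pennies, where the row player
# plays Heads with probability p and the column player with probability q.
# Row wins +1 when the coins match and loses 1 when they don't.
def expected_payoff(p, q):
    match = p * q + (1 - p) * (1 - q)
    mismatch = p * (1 - q) + (1 - p) * q
    return match * 1 + mismatch * -1

# At the equilibrium mix (p = q = 0.5) the expectation is zero and no unilateral
# change in p or q helps -- yet any single play is still a coin flip, which is
# exactly the gap between expected value and realized outcome.
for p, q in [(0.5, 0.5), (0.7, 0.5), (1.0, 0.0)]:
    print(f"p={p}, q={q}: expected payoff to row = {expected_payoff(p, q):+.2f}")
```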

source

Written by (Roughly) Daily

July 26, 2021 at 1:00 am

“How I hate those who are dedicated to producing conformity”*…

Andy Warhol, Polaroid photographs of Truman Capote and William S. Burroughs

When one writer speaks frankly to another: William S. Burroughs’s bizarre and mean– and strangely prescient– “open letter” to Truman Capote after the publication of In Cold Blood

As Thom Robinson writes at RealityStudio, Burroughs had long been dismissive of Capote—and sometimes resentful of his success. For his part, Capote was none too impressed with Burroughs, who wasn’t yet the literary star he would become. Robinson quotes Capote telling the Chicago Daily News in 1967: “I hate pop art to death . . . Now William Burroughs. He’s what I’d call a pop writer. He gets some very interesting effects on a page. But at the cost of total lack of communication with the reader. Which is a pretty serious cost, I think.”

In the below letter, Burroughs engages in a sort of bizarre role-play, claiming (it seems) to speak for a department responsible for the cosmic fate of writers. He tells Capote that he has been following him closely, reading his works, his reviews, and his actions, even interviewing his characters, and that he has decided to withdraw the talent given to him by the department and curse him to never write anything good again—as if he were a minor god of creative action, or king of the muses. Robinson points out that Burroughs actually believed in curses at this time, and maybe he was right, because his damning words came true—Capote never wrote anything good again. Read Burroughs’s attack on Capote below. (He’s also not too keen on the New Yorker.)

July 23, 1970

My Dear Mr. Truman Capote

This is not a fan letter in the usual sense—unless you refer to ceiling fans in Panama. Rather call this a letter from “the reader”—vital statistics are not in capital letters—a selection from marginal notes on material submitted as all “writing” is submitted to this department. I have followed your literary development from its inception, conducting on behalf of the department I represent a series of inquiries as exhaustive as your own recent investigations in the sunflower state. I have interviewed all your characters beginning with Miriam—in her case withholding sugar over a period of several days proved sufficient inducement to render her quite communicative—I prefer to have all the facts at my disposal before taking action. Needless to say, I have read the recent exchange of genialities between Mr. Kenneth Tynan and yourself. I feel that he was much too lenient. Your recent appearance before a senatorial committee on which occasion you spoke in favor of continuing the present police practice of extracting confessions by denying the accused the right of consulting consul prior to making a statement also came to my attention. In effect you were speaking in approval of standard police procedure: obtaining statements through brutality and duress, whereas an intelligent police force would rely on evidence rather than enforced confessions. You further cheapened yourself by reiterating the banal argument that echoes through letters to the editor whenever the issue of capital punishment is raised: “Why all this sympathy for the murderer and none for his innocent victims?” I have in line of duty read all your published work. The early work was in some respects promising—I refer particularly to the short stories. You were granted an area for psychic development. It seemed for a while as if you would make good use of this grant. You choose instead to sell out a talent that is not yours to sell. You have written a dull unreadable book which could have been written by any staff writer on the New Yorker—(an undercover reactionary periodical dedicated to the interests of vested American wealth). You have placed your services at the disposal of interests who are turning America into a police state by the simple device of deliberately fostering the conditions that give rise to criminality and then demanding increased police powers and the retention of capital punishment to deal with the situation they have created. You have betrayed and sold out the talent that was granted you by this department. That talent is now officially withdrawn. Enjoy your dirty money. You will never have anything else. You will never write another sentence above the level of In Cold Blood. As a writer you are finished. Over and out. Are you tracking me? Know who I am? You know me, Truman. You have known me for a long time. This is my last visit.

[original in the Burroughs Archive of the New York Public Library’s Berg Collection]

Can one writer curse another for life? “William S. Burroughs’s Hate Letter to Truman Capote,” from Emily Temple (@knownemily) in @lithub.

[image above: source]

* William S. Burroughs

###

As we examine enmity, we might send apocalyptic birthday greetings to Harold Egbert Camping; he was born on this date in 1921. A Christian radio broadcaster and evangelist, he presided over Family Radio, a California-based radio station group that, at its peak, broadcast to more than 150 markets in the United States.

Camping is notorious for issuing a succession of failed predictions of dates for the End Times, which temporarily gained him a global following and millions of dollars of donations. Camping first predicted that Judgment Day would occur on or about September 6, 1994. When it failed to occur, he revised the date to September 29 and then to October 2. In 2005, Camping predicted that the Second Coming of Christ would occur on May 21, 2011, whereupon the saved would be taken up to heaven in the rapture, and that “there would follow five months of fire, brimstone and plagues on Earth, with millions of people dying each day, culminating on October 21, 2011, with the final destruction of the world.”

His prediction for May 21, 2011 was widely reported [including here], in part because of a large-scale publicity campaign by Family Radio, and prompted ridicule from atheist organizations and rebuttals from many other Christians.  After May 21 passed without the predicted events, Camping said he believed that a “spiritual” judgment had occurred on that date, and that the physical Rapture would occur on October 21, 2011, simultaneously with the final destruction of the universe by God. That, of course, also didn’t happen. But as Camping had suffered a stroke in June of 2011, he was largely silent thereafter… though in March 2012, he announced that his attempt to predict a date was “sinful,” and that his critics had been right in emphasizing the words of Matthew 24:36: “of that day and hour knoweth no man.” Family Radio is still recovering from the fallout of the failed end-times predictions.

source

“The only function of economic forecasting is to make astrology look respectable”*…

The pandemic economy has been strange and unpredictable from the get-go.

Throughout the past 14 months, the twists and turns have been surprising: The housing market boomed, the stock market soared, people got into day trading, everyone hoarded toilet paper, and lumber became a must-have. There’s been widespread disagreement about how much support from the government was needed, whether the country was doing too much or not enough, or whether help would come at all. We won’t know whether the country overshot or undershot the response for years, and there’s still uncertainty about what’s happening in the labor market, prices, and other areas. And the prevailing theme has been one that has nothing to do with the economy directly: As long as Covid-19 isn’t under control, the economy isn’t either.

“Having been a forecaster for 10 years, we were surprised all the time, because nobody has a crystal ball and particularly if you just pull out one data series, one month, there’s just no way,” said Claudia Sahm, a former Federal Reserve economist and now a senior fellow at the Jain Family Institute. “It’s going to be a wild ride; the data through the end of this year, they’re going to be tough.”

The country and the world are staring into a black box of uncertainty on the economy. It’s frustrating, but it’s also inevitable. Anyone who says they know exactly what is going on in the economy right now is lying. The same goes for anyone who says they know what’s going to happen next.

“Because of the unique nature of this crisis, there are going to be some swings,” said Mike Konczal, director of macroeconomic analysis at the Roosevelt Institute. “In a year, they’re going to be trivia questions, but right now we’re obsessing about them.”

Few people will probably remember two years from now that the price of used cars and trucks went up by 10 percent in April. 

We know that the economy is different now than it was a year ago and that it will be different a year from now. What’s not clear is exactly how. And what we need now — economists, experts, and policymakers included — is the intellectual humility to recognize that’s the case.

“At this point, most things should be presumed temporary until proven permanent,” said Jed Kolko, chief economist at the jobs website Indeed.

It’s unnerving to admit what we don’t know, and the pandemic has been a real exercise in that. But after so long of staring into the abyss, maybe it’s time we embrace it…

Anyone who says they know exactly what’s happening in the economy is lying. Emily Stewart (@EmilyStewartM) explores that uncertainty and what it might mean: “The black box economy.”

* John Kenneth Galbraith

###

As we consult the stars, we might note that today is National Be a Millionaire Day. While many sources confirm this celebratory fact, there’s no real information on its origin. The term “millionaire” was coined in France around 1719 to describe speculators in the Mississippi Bubble who earned millions of livres in weeks before the bubble burst; it seems first to have appeared in the U.S. in 1786, when Thomas Jefferson wrote about the French… so the “holiday” surely dates from sometime after that.

source

“The future is there… looking back at us. Trying to make sense of the fiction we will have become”*…

 

Octavia Butler

 

Tim Maughan, an accomplished science fiction writer himself, considers sci-fi works from the 1980s and 90s, and their predictive power. Covering Bruce Sterling, William Gibson, Rudy Rucker, Stephen King, P.D. James, an episode of Star Trek: Deep Space Nine, and Blade Runner, he reserves special attention for a most deserving subject…

When you imagine the future, what’s the first date that comes into your mind? 2050? 2070? The year that pops into your head is almost certainly related to how old you are — some point within our lifetimes yet distant enough to be mysterious, still just outside our grasp. For those of us growing up in the 1980s and ’90s — and for a large number of science fiction writers working in those decades — the 2020s felt like that future. A decade we would presumably live to see but also seemed sufficiently far away that it could be a world full of new technologies, social movements, or political changes. A dystopia or a utopia; a world both alien and familiar.

That future is, of course, now…

Two science fiction books set in the 2020s tower over everything else from that era in their terrifying prescience: Octavia Butler’s Parable of the Sower (1993) and Parable of the Talents (1998). These books by the late master kick off in 2024 Los Angeles and are set against a backdrop of a California that’s been ravaged by floods, storms, and droughts brought on by climate change. Middle- and working-class families huddle together in gated communities, attempting to escape the outside world through addictive pharmaceuticals and virtual reality headsets. New religions and conspiracy theory–chasing cults begin to emerge. A caravan of refugees head north to escape the ecological and social collapse, while a far-right extremist president backed by evangelical Christians comes to power using the chillingly familiar election slogan Make America Great Again.

Although it now feels like much of Butler’s Parable books might have been pulled straight from this afternoon’s Twitter or tonight’s evening news, some elements are more far-fetched. The second book ends with followers of the new religion founded by the central character leaving Earth in a spaceship to colonize Alpha Centauri. Butler originally planned to write a third book following the fates of these interstellar explorers but, sadly, passed away in 2006 before she had a chance. She left us with a duology that remains more grounded and scarily familiar to those of us struggling to come to terms with the everyday dystopias that the real 2020s seem to be already presenting us.

Not that this remarkable accuracy was ever her objective.

“This was not a book about prophecy; this was an if-this-goes-on story,” Butler said about the books during a talk at MIT in 1998. “This was a cautionary tale, although people have told me it was prophecy. All I have to say to that is I certainly hope not.”

In the same talk, Butler describes in detail the fears that drove her to write this warning: the debate over climate change, the eroding of workers’ rights, the rise of the private prison industry, and the media’s increasing refusal to talk about all of these in favor of focusing on soundbite propaganda and celebrity news. Again, these are fears that feel instantly familiar today…

What Blade Runner, cyberpunk– and Octavia Butler– had to say about the age we’re entering now: “How Science Fiction Imagined the 2020s.”

* William Gibson, Pattern Recognition

###

As we honor prophets, we might recall that it was on this date in 1984 that Apple aired an epoch-making commercial, “1984” (directed by Blade Runner director Ridley Scott), during Super Bowl XVIII– for the first and only time. Two days later, the first Apple Macintosh went on sale.

 

Written by (Roughly) Daily

January 22, 2020 at 1:01 am

“It’s tough to make predictions, especially about the future”*…

 


 

As astrophysicist Mario Livio recounts in Brilliant Blunders, in April 1900, the eminent physicist Lord Kelvin proclaimed that our understanding of the cosmos was complete except for two “clouds”—minor details still to be worked out. Those clouds had to do with radiation emissions and with the speed of light… and they pointed the way to two major revolutions in physics: quantum mechanics and the theory of relativity. Prediction is hard; ironically, it’s especially hard for experts attempting foresight in their own fields…

The idea for the most important study ever conducted of expert predictions was sparked in 1984, at a meeting of a National Research Council committee on American-Soviet relations. The psychologist and political scientist Philip E. Tetlock was 30 years old, by far the most junior committee member. He listened intently as other members discussed Soviet intentions and American policies. Renowned experts delivered authoritative predictions, and Tetlock was struck by how many perfectly contradicted one another and were impervious to counterarguments.

Tetlock decided to put expert political and economic predictions to the test. With the Cold War in full swing, he collected forecasts from 284 highly educated experts who averaged more than 12 years of experience in their specialties. To ensure that the predictions were concrete, experts had to give specific probabilities of future events. Tetlock had to collect enough predictions that he could separate lucky and unlucky streaks from true skill. The project lasted 20 years, and comprised 82,361 probability estimates about the future.

The result: The experts were, by and large, horrific forecasters. Their areas of specialty, years of experience, and (for some) access to classified information made no difference. They were bad at short-term forecasting and bad at long-term forecasting. They were bad at forecasting in every domain. When experts declared that future events were impossible or nearly impossible, 15 percent of them occurred nonetheless. When they declared events to be a sure thing, more than one-quarter of them failed to transpire. As the Danish proverb warns, “It is difficult to make predictions, especially about the future.”…
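Tetlock could grade those 82,361 estimates because each one was a specific probability attached to a specific event. Here’s a minimal sketch of how such probability forecasts are typically scored — the numbers below are made up for illustration, not Tetlock’s data:

```python
# Score a set of probability forecasts against what actually happened.
# These forecasts/outcomes are invented for illustration only.
forecasts = [0.0, 0.0, 0.1, 0.3, 0.5, 0.7, 0.9, 1.0, 1.0, 1.0]   # stated probabilities
outcomes  = [0,   1,   0,   0,   1,   1,   1,   1,   0,   1]     # 1 = the event occurred

# Brier score: mean squared error of the probabilities.
# 0 is perfect foresight; always answering "50%" scores 0.25.
brier = sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)
print(f"Brier score: {brier:.3f}")

# Calibration check, echoing the passage above: how often did "impossible"
# events happen, and how often did "sure things" fail to happen?
for label, p in (("declared impossible (p=0)", 0.0), ("declared certain (p=1)", 1.0)):
    hits = [o for f, o in zip(forecasts, outcomes) if f == p]
    print(f"{label}: occurred {sum(hits)} of {len(hits)} times")
```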

One subgroup of scholars, however, did manage to see more of what was coming… they were not vested in a single discipline. They took from each argument and integrated apparently contradictory worldviews…

The integrators outperformed their colleagues in pretty much every way, but especially trounced them on long-term predictions. Eventually, Tetlock bestowed nicknames (borrowed from the philosopher Isaiah Berlin) on the experts he’d observed: The highly specialized hedgehogs knew “one big thing,” while the integrator foxes knew “many little things.”…

Credentialed authorities are comically bad at predicting the future. But reliable– at least more reliable– forecasting is possible: “The Peculiar Blindness of Experts.”

See Tetlock discuss his findings at a Long Now Seminar.  Read Berlin’s riff on Archilochus, “The Hedgehog and the Fox,” here.

* Yogi Berra

###

As we ponder prediction, we might send complicating birthday greetings to Edward Norton Lorenz; he was born on this date in 1917.  A mathematician who turned to meteorology during World War II, he established the theoretical basis of weather and climate predictability, as well as the basis for computer-aided atmospheric physics and meteorology.

But he is probably better remembered as the founder of modern chaos theory, a branch of mathematics focusing on the behavior of dynamical systems that are highly sensitive to initial conditions… and thus practically impossible to predict in detail with certainty.

In 1961, Lorenz was using a simple digital computer, a Royal McBee LGP-30, to simulate weather patterns by modeling 12 variables, representing things like temperature and wind speed. He wanted to see a sequence of data again, and to save time he started the simulation in the middle of its course. He did this by entering a printout of the data that corresponded to conditions in the middle of the original simulation. To his surprise, the weather that the machine began to predict was completely different from the previous calculation. The culprit: a rounded decimal number on the computer printout. The computer worked with 6-digit precision, but the printout rounded variables off to a 3-digit number, so a value like 0.506127 printed as 0.506. This difference is tiny, and the consensus at the time would have been that it should have no practical effect. However, Lorenz discovered that small changes in initial conditions produced large changes in long-term outcome. His work on the topic culminated in the publication of his 1963 paper “Deterministic Nonperiodic Flow” in Journal of the Atmospheric Sciences, and with it, the foundation of chaos theory…
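The divergence Lorenz stumbled onto is easy to reproduce. Here’s a minimal sketch using the three-variable system from his 1963 paper (his 1961 run used a 12-variable model, so this is an illustration rather than a reconstruction), with two runs whose starting values differ only in the way the printout rounded 0.506127 to 0.506:

```python
# Integrate the Lorenz-63 equations from two nearly identical starting points
# and watch the tiny rounding difference grow into a completely different path.
def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # One crude Euler step; fine for illustrating sensitive dependence.
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dx * dt, y + dy * dt, z + dz * dt

a = (0.506127, 1.0, 1.05)   # "full precision" start
b = (0.506,    1.0, 1.05)   # rounded start, as on the printout

for step in range(1, 3001):
    a = lorenz_step(*a)
    b = lorenz_step(*b)
    if step % 500 == 0:
        print(f"t = {step * 0.01:5.1f}   |x_full - x_rounded| = {abs(a[0] - b[0]):.4f}")
```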

His description of the butterfly effect, the idea that small changes can have large consequences, followed in 1969.

source

 
