(Roughly) Daily


“People seemed to believe that technology had stripped hurricanes of their power to kill. No hurricane expert endorsed this view.”*…

Tropical Storm Bertha approaching the South Carolina coast, May 27, 2020

For six straight years, Atlantic storms have been named in May, before [hurricane] season even begins. During the past nine Atlantic hurricane seasons, seven tropical storms have formed between May 15 and the official June 1 start date. Those storms have killed at least 20 people and caused about $200 million in damage, according to the WMO.

Last year, the hurricane center issued 36 “special” tropical weather outlooks before June 1, according to center spokesman Dennis Feltgen. Tropical Storms Arthur and Bertha both formed before June 1 near the Carolinas.

Storms seem to be forming earlier because climate change is making the ocean warmer, University of Miami hurricane researcher Brian McNoldy said. Storms need warm water as fuel — at least 79 degrees Fahrenheit (26 degrees Celsius). Also, better technology and monitoring are identifying and naming weaker storms that may not have been spotted in years past, Feltgen said…

With named storms coming earlier and more often in warmer waters, the Atlantic hurricane season is going through some changes, with meteorologists ditching the Greek alphabet during busy years…

A special World Meteorological Organization committee Wednesday ended the use of Greek letters when the Atlantic runs out of its 21 storm names for the year, saying the practice was confusing and put too much focus on the Greek letter and not on the dangerous storm it represented. In 2020, the names Zeta, Eta and Theta also sounded so similar that they caused confusion.

The Greek alphabet had been used in only two seasons: 2005, and last year’s record-shattering season, when nine Greek-letter names were needed.

Starting this year, if there are more than 21 Atlantic storms, the next storms will come from a new supplemental list headed by Adria, Braylen, Caridad and Deshawn and ending with Will. There’s a new back-up list for the Eastern Pacific that runs from Aidan and Bruna to Zoe.

Meanwhile, the National Oceanic and Atmospheric Administration is recalculating just what constitutes an average hurricane season… But the Atlantic hurricane season will start this year on June 1 as traditionally scheduled, despite meteorologists discussing the idea of moving it to May 15…

With so much activity, MIT’s [Kerry] Emanuel said the current warnings are too storm-centric, and he wants them more oriented to where people live, warning of specific risks such as floods and wind. That includes changing or ditching the nearly 50-year-old Saffir-Simpson scale of rating hurricanes Category 1 to 5.

That wind-based scale is “about a storm, it’s not about you. I want to make it about you, where you are,” he said. “It is about risk. In the end, we are trying to save lives and property”… Differentiating between tropical storms, hurricanes and extratropical cyclones can be a messaging problem when a system actually has a cold core, because these weaker storms can kill with water surges rather than wind… For example, some people and officials underestimated 2012’s Sandy because it wasn’t a hurricane and lost its tropical characteristics…

Rethinking hurricanes in a time of climate change: “Bye Alpha, Eta: Greek alphabet ditched for hurricane names.”

* Erik Larson, Isaac’s Storm: A Man, a Time, and the Deadliest Hurricane in History

###

As we accommodate climate change, we might spare a thought for George Alfred Leon Sarton; he died on this date in 1956. A chemist by training, his primary interest lay in the past practices and precepts of his field…an interest that led him to found the discipline of the history of science as an independent field of study. His most influential work was the Introduction to the History of Science (three volumes totaling 4,296 pages). Sarton ultimately aimed to achieve an integrated philosophy of science that connected the sciences and the humanities– what he called “the new humanism.” His name is honored with the prestigious George Sarton Medal, awarded by the History of Science Society.

source

“Most of us spend too much time on the last twenty-four hours and too little on the last six thousand years”*…

 


“Willard’s Chronographer of American History” (1845) by Emma Willard — David Rumsey Map Collection

 

In the 21st century, infographics are everywhere. In the classroom, in the newspaper, in government reports, these concise visual representations of complicated information have changed the way we imagine our world.  Susan Schulten explores the pioneering work of Emma Willard (1787–1870), a leading feminist educator whose innovative maps of time laid the groundwork for the charts and graphics of today…

Willard’s remarkable story– and more glorious examples of her work– at “Emma Willard’s Maps of Time.”

* Will Durant

###

As we picture it all, we might recall that it was on this date in 1870 that Congress authorized the formation of the U.S. weather service (later named the Weather Bureau; later still, the National Weather Service), and placed it under the direction of the Army Signal Corps.  Cleveland Abbe,  who had started the first private weather reporting and warning service (in Cincinnati) and had been issuing weather reports or bulletins since September, 1869, was the only person in the country at the time who was experienced in drawing weather maps from telegraphic reports and forecasting from them.  He became the weather service’s inaugural chief scientist– effectively its founding head– in January, 1871.  The first U.S. meteorologist, he is known as the “father of the U.S. Weather Bureau,” where he systemized observation, trained personnel, and established scientific methods.  He went on to become one of the 33 founders of the National Geographic Society.

Cleveland Abbe

source

 

“It’s tough to make predictions, especially about the future”*…

 


 

As astrophysicist Mario Livio recounts in Brilliant Blunders, in April 1900, the eminent physicist Lord Kelvin proclaimed that our understanding of the cosmos was complete except for two “clouds”—minor details still to be worked out. Those clouds had to do with radiation emissions and with the speed of light… and they pointed the way to two major revolutions in physics: quantum mechanics and the theory of relativity.  Prediction is hard; ironically, it’s especially hard for experts attempting foresight in their own fields…

The idea for the most important study ever conducted of expert predictions was sparked in 1984, at a meeting of a National Research Council committee on American-Soviet relations. The psychologist and political scientist Philip E. Tetlock was 30 years old, by far the most junior committee member. He listened intently as other members discussed Soviet intentions and American policies. Renowned experts delivered authoritative predictions, and Tetlock was struck by how many perfectly contradicted one another and were impervious to counterarguments.

Tetlock decided to put expert political and economic predictions to the test. With the Cold War in full swing, he collected forecasts from 284 highly educated experts who averaged more than 12 years of experience in their specialties. To ensure that the predictions were concrete, experts had to give specific probabilities of future events. Tetlock had to collect enough predictions that he could separate lucky and unlucky streaks from true skill. The project lasted 20 years, and comprised 82,361 probability estimates about the future.

The result: The experts were, by and large, horrific forecasters. Their areas of specialty, years of experience, and (for some) access to classified information made no difference. They were bad at short-term forecasting and bad at long-term forecasting. They were bad at forecasting in every domain. When experts declared that future events were impossible or nearly impossible, 15 percent of them occurred nonetheless. When they declared events to be a sure thing, more than one-quarter of them failed to transpire. As the Danish proverb warns, “It is difficult to make predictions, especially about the future.”…
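The grading behind findings like these can be made concrete. A standard way to score probability forecasts (and the measure Tetlock used in his study) is the Brier score: the mean squared gap between the stated probability and what actually happened. A minimal sketch, with illustrative toy numbers rather than data from the study:

```python
def brier_score(forecasts, outcomes):
    """Mean squared difference between stated probabilities (0..1)
    and realized outcomes (0 or 1). Lower is better; 0 is perfect."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# Five hypothetical events: 1 = it happened, 0 = it didn't.
outcomes = [1, 0, 1, 0, 1]

# An overconfident expert who deals only in "impossible" and "sure thing"...
confident_expert = [0.0, 0.0, 1.0, 1.0, 1.0]
# ...versus a hedged forecaster who never strays far from the middle.
hedged_forecaster = [0.6, 0.3, 0.7, 0.4, 0.6]

print(brier_score(confident_expert, outcomes))   # 0.4 (two maximal misses)
print(brier_score(hedged_forecaster, outcomes))  # 0.132
```

In this toy example the hedged forecaster wins despite sounding far less authoritative: the Brier score punishes confident misses quadratically, which is exactly how declaring events “impossible” or “a sure thing” and being wrong 15 to 25 percent of the time ruins an expert’s record.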

One subgroup of scholars, however, did manage to see more of what was coming… they were not vested in a single discipline. They took from each argument and integrated apparently contradictory worldviews…

The integrators outperformed their colleagues in pretty much every way, but especially trounced them on long-term predictions. Eventually, Tetlock bestowed nicknames (borrowed from the philosopher Isaiah Berlin) on the experts he’d observed: The highly specialized hedgehogs knew “one big thing,” while the integrator foxes knew “many little things.”…

Credentialed authorities are comically bad at predicting the future. But reliable– at least more reliable– forecasting is possible: “The Peculiar Blindness of Experts.”

See Tetlock discuss his findings at a Long Now Seminar.  Read Berlin’s riff on Archilochus, “The Hedgehog and the Fox,” here.

* Yogi Berra

###

As we ponder prediction, we might send complicating birthday greetings to Edward Norton Lorenz; he was born on this date in 1917.  A mathematician who turned to meteorology during World War II, he established the theoretical basis of weather and climate predictability, as well as the basis for computer-aided atmospheric physics and meteorology.

But he is probably better remembered as the founder of modern chaos theory, a branch of mathematics focusing on the behavior of dynamical systems that are highly sensitive to initial conditions… and thus practically impossible to predict in detail with certainty.

In 1961, Lorenz was using a simple digital computer, a Royal McBee LGP-30, to simulate weather patterns by modeling 12 variables, representing things like temperature and wind speed. He wanted to see a sequence of data again, and to save time he started the simulation in the middle of its course. He did this by entering a printout of the data that corresponded to conditions in the middle of the original simulation. To his surprise, the weather that the machine began to predict was completely different from the previous calculation. The culprit: a rounded decimal number on the computer printout. The computer worked with 6-digit precision, but the printout rounded variables off to a 3-digit number, so a value like 0.506127 printed as 0.506. This difference is tiny, and the consensus at the time would have been that it should have no practical effect. However, Lorenz discovered that small changes in initial conditions produced large changes in long-term outcome. His work on the topic culminated in the publication of his 1963 paper “Deterministic Nonperiodic Flow” in Journal of the Atmospheric Sciences, and with it, the foundation of chaos theory…
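Lorenz’s accident is easy to reproduce. The sketch below uses the three-variable system from his 1963 paper (not the 12-variable model he ran on the LGP-30) and simple forward-Euler integration; the idea is the same: start once from a full-precision value and once from the same value rounded to three digits, and watch the trajectories part ways.

```python
import numpy as np

def lorenz_trajectory(x0, y0, z0, steps=5000, dt=0.01,
                      sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Integrate the Lorenz '63 system with forward-Euler steps,
    returning the full (steps+1, 3) trajectory."""
    traj = np.empty((steps + 1, 3))
    traj[0] = (x0, y0, z0)
    for i in range(steps):
        x, y, z = traj[i]
        traj[i + 1] = (x + dt * sigma * (y - x),
                       y + dt * (x * (rho - z) - y),
                       z + dt * (x * y - beta * z))
    return traj

# Full-precision start vs. the same start rounded to three digits,
# mimicking the truncation on Lorenz's printout.
a = lorenz_trajectory(0.506127, 1.0, 1.05)
b = lorenz_trajectory(0.506, 1.0, 1.05)

gap = np.abs(a - b).max(axis=1)   # largest coordinate gap at each step
print(gap[0], gap.max())          # starts around 1e-4, later grows to attractor size
```

The initial gap of about 0.0001 grows roughly exponentially until the two runs are no more alike than two randomly chosen states of the system, which is precisely what Lorenz saw on his printout.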

His description of the butterfly effect, the idea that small changes can have large consequences, followed in 1969.

source

 

“Memories, you’re talking about memories”*…

 


It’s natural, here at the lip of a new year, to wonder what 2019 might hold.  And it’s bracing to note that Blade Runner (released in 1982) is one of 14 films set in a future that is this very year, the one on which we’re embarking.

But lest we dwell on the dark futures they tend to portray, we might take heart from Jill Lepore’s wonderfully-entertaining review of predictions: “What 2018 looked like fifty years ago” and recent honoree Isaac Asimov’s 1983 response to the Toronto Star‘s request for a look at the world of 2019.

Niels Bohr was surely right when he observed that “prediction is difficult, especially about the future.”

* Rick Deckard (Harrison Ford), Blade Runner

###

As we contend with the contemporary, we might spend a memorial moment honoring two extraordinary explorers who died on this date.  Marco Polo, whose coda to his remarkable travelogue was “I did not tell half of what I saw,” passed away on this date in 1324.

A page from “Il Milione” (aka “Le Livre des Merveilles,” or “The Book of Wonders”)… and, in English, “The Travels of Marco Polo”

And Galileo Galilei, the Italian physicist, philosopher, and pioneering astronomer, rose to his beloved heavens on this date in 1642.  Galileo (who, readers will recall, had his share of trouble with authorities displeased with his challenge to Aristotelean cosmology), died insisting “still, it [the Earth] moves.”

Draft of Galileo’s letter to Leonardo Donato, Doge of Venice, in which he first recorded the movement of the moons of Jupiter– an observation that upset the notion that all celestial bodies must revolve around the Earth.

Written by LW

January 8, 2019 at 1:01 am

“We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run”*…

 

We are surrounded by hysteria about the future of artificial intelligence and robotics—hysteria about how powerful they will become, how quickly, and what they will do to jobs.

I recently saw a story in ­MarketWatch that said robots will take half of today’s jobs in 10 to 20 years. It even had a graphic to prove the numbers.

The claims are ludicrous. (I try to maintain professional language, but sometimes …) For instance, the story appears to say that we will go from one million grounds and maintenance workers in the U.S. to only 50,000 in 10 to 20 years, because robots will take over those jobs. How many robots are currently operational in those jobs? Zero. How many realistic demonstrations have there been of robots working in this arena? Zero. Similar stories apply to all the other categories where it is suggested that we will see the end of more than 90 percent of jobs that currently require physical presence at some particular site.

Mistaken predictions lead to fears of things that are not going to happen, whether it’s the wide-scale destruction of jobs, the Singularity, or the advent of AI that has values different from ours and might try to destroy us. We need to push back on these mistakes. But why are people making them? I see seven common reasons…

Mistaken extrapolations, limited imagination, and other common mistakes that distract us from thinking more productively about the future: Rodney Brooks on “The Seven Deadly Sins of AI Predictions.”

* Roy Amara, co-founder of The Institute for the Future

###

As we sharpen our analyses, we might recall that it was on this date in 1995 that The Media Lab at the Massachusetts Institute of Technology chronicled the World Wide Web in its A Day in the Life of Cyberspace project.

To celebrate its 10th anniversary, the Media Lab had invited submissions for the days leading up to October 10, 1995, on a variety of issues related to technology and the Internet, including privacy, expression, age, wealth, faith, body, place, languages, and the environment.  Then on October 10, a team at MIT collected, edited, and published the contributions to “create a mosaic of life at the dawn of the digital revolution that is transforming our planet.”

source

 

 

Written by LW

October 10, 2017 at 1:01 am
