(Roughly) Daily


“I’m having a magenta day. Not just red, but magenta!”*…

Your correspondent is still on the road; regular service resumes on or around May 6. Meantime, a colorful update…

Forget about red hot. A new color-coded heat warning system relies on magenta to alert Americans to the most dangerous conditions they may see this summer.

The National Weather Service and the Centers for Disease Control and Prevention on Monday — Earth Day — presented a new online heat risk system that combines meteorological and medical risk factors with a seven-day forecast that’s simplified and color-coded for a warming world of worsening heat waves.

“For the first time we’ll be able to know how hot is too hot for health and not just for today but for coming weeks,” Dr. Ari Bernstein, director of the National Center for Environmental Health, said at a joint news conference by government health and weather agencies.

Magenta is the worst and deadliest of five heat threat categories, hitting everybody with what the agencies are calling “rare and/or long-duration extreme heat with little to no overnight relief.” It’s a step higher than red, considered a major risk, which hurts anyone without adequate cooling and hydration and has impacts reverberating through the health care system and some industries. Red is used when a day falls within the top 5% hottest in a particular location for a particular date; when other factors come into play, the alert level may bump even higher to magenta, weather service officials said.

On the other hand, pale green is little to no risk. Yellow is minor risk, mostly to the very young, old, sick and pregnant. Orange is moderate risk, mostly hurting people who are sensitive to heat, especially those without cooling, such as the homeless.
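The five-tier scheme described above can be sketched as a small lookup. Note that the only numeric rule the article actually gives is red's "top 5% hottest for this location and date" test; every other threshold below, and the exact magenta trigger, are illustrative placeholders, not the agencies' official criteria (the real HeatRisk tool blends meteorological and health data).

```python
# Illustrative sketch of the five-level heat-risk scheme. Only red's top-5%
# rule comes from the article; the other cutoffs are placeholder assumptions.

def heat_risk(percentile, long_duration=False, no_overnight_relief=False):
    """Map how unusually hot a day is (plus aggravating factors) to a color."""
    if percentile >= 95:
        if long_duration or no_overnight_relief:
            return "magenta"   # rare/long-duration extreme heat, little overnight relief
        return "red"           # top 5% hottest for this place and date
    if percentile >= 85:       # placeholder threshold
        return "orange"        # moderate risk: heat-sensitive people, esp. without cooling
    if percentile >= 70:       # placeholder threshold
        return "yellow"        # minor risk: the very young, old, sick, and pregnant
    return "pale green"        # little to no risk

print(heat_risk(97))                            # red
print(heat_risk(97, no_overnight_relief=True))  # magenta
```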

“When red-hot isn’t enough: New government heat risk tool sets magenta as most dangerous level,” from @AP.

See also: here and here

* Stephen King, Needful Things

###

As we reassess risk, we might recall that it was on this date in 1986 that the Soviet Union announced the Chernobyl nuclear disaster, two days after it happened.

A view of the facility three days after the incident (source)

“The greatest value of a picture is when it forces us to notice what we never expected to see”*…

Joseph Priestley’s 1765 A Chart of Biography, Library Company of Philadelphia / University of Oregon 

The breathtakingly broadly talented Joseph Priestley left us much– not least, Alyson Foster explains, a then-new way of understanding history…

It’s a testament to the wide-ranging and unconventional nature of Joseph Priestley’s mind that no one has settled on a term to sum up exactly what he was. The eighteenth-century British polymath has been described as, among other things, a historian, a chemist, an educator, a philosopher, a theologian, and a political radical who became, for a period of time, the most despised person in England. Priestley’s many contradictions—as a rationalist Unitarian millenarian, as a mild-mannered controversialist, as a thinker who was both ahead of his time and behind it—have provided endless fodder for the historians who have debated the precise nature of his legacy and his place among his fellow Enlightenment intellectuals. But his contributions—however they are categorized—have continued to live on in subtle and surprisingly enduring ways, more than two hundred years after his death, at the age of seventy, in rural Pennsylvania.

Take, for example, A Chart of Biography, which is considered to be the first modern timeline. This unusual, and unusually beautiful, pedagogical tool, which was published by Priestley in 1765, while he was in his thirties and working as a tutor at an academy in Warrington, England, tends to get lost in the shuffle of Priestley’s more notable achievements—his seminal 1761 textbook on language, The Rudiments of English Grammar, say, or his discovery of nine gases, including oxygen, 13 years later. But the chart, along with its companion, A New Chart of History, which Priestley published four years later, has become a curious subject of interest among data visualization aficionados who have analyzed its revolutionary design in academic papers and added it to Internet lists of notable infographics. Recently, both charts have become the focus of an NEH-supported digital humanities project, Chronographics: The Time Charts of Joseph Priestley, produced by scholars at the University of Oregon. 

Even those of us ignorant of (or uninterested in) infographics can look at the painstakingly detailed Chart of Biography for a moment or two and appreciate how it has become a source of fascination. The two-foot-by-three-foot, pastel-striped paper scroll—which contains the meticulously inscribed names of approximately 2,000 poets, artists, statesmen, and other famous historical figures dating back three millennia—is visually striking, combining a formal, somewhat ornate eighteenth-century aesthetic with the precise organization of a schematic. Every single one of the chart’s subjects is grouped vertically into one of six occupational categories, then plotted out chronologically along a horizontal line divided into ten-year increments. Despite the huge quantity of information it contains, it is extremely user-friendly. Any one of Priestley’s history students could run his eye across the chart and immediately gain a sense of the temporal lay of the land. Who came first: Copernicus or Newton? How many centuries separate Genghis Khan from Joan of Arc? Which artists were working during the reign of Henry VIII? The chart was a masterful blend of form and function…

The most significant design feature of Priestley’s chart—as historians point out—was the way in which he linked units of time to units of distance on the page, similar to the way a cartographer uses scale when creating a map. (The artist Pietro Lorenzetti lived two hundred years before Titian and thus is situated twice as far from Titian as Jan van Eyck, who predated Titian by about a century.) If this innovation is hard for contemporary viewers to fully appreciate, it’s probably because Priestley’s representation of time has become a convention that’s used everywhere in visual design and seems so obvious it’s now taken for granted.
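Priestley's innovation amounts to one line of arithmetic: position on the page is a linear function of date. A minimal sketch of the principle (not Priestley's own procedure; the span below is the chart's rough extent of 1200 BC to AD 1800 across about three feet, and the artists' dates are rounded to the nearest decade for illustration):

```python
# Sketch of Priestley's central design move: years map linearly onto
# horizontal distance, the way a cartographer maps miles onto inches.
# Span and dates are rounded illustrative assumptions.

def year_to_position(year, start=-1200, end=1800, width_inches=36.0):
    """Linear time scale: every year occupies the same horizontal distance."""
    return (year - start) / (end - start) * width_inches

lorenzetti = year_to_position(1290)  # Pietro Lorenzetti, c. 1290
van_eyck = year_to_position(1390)    # Jan van Eyck, c. 1390
titian = year_to_position(1490)      # Titian, c. 1490

# Two centuries of separation plot exactly twice as far as one century:
ratio = (titian - lorenzetti) / (titian - van_eyck)
print(round(ratio, 6))  # 2.0
```

This is exactly the convention the paragraph above calls now-invisible: every modern timeline, Gantt chart, and time-series axis assumes it.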

To Priestley’s contemporaries, though, who were accustomed to cumbersome Eusebian-style [see here] chronological tables or the visually striking but often obscure “stream charts” created by the era’s chronographers, Priestley’s method of capturing time on the page revealed something revelatory and new—a way of seeing historical patterns and connections that would have otherwise remained hidden. “To many readers,” wrote Daniel Rosenberg and Anthony Grafton in their book, Cartographies of Time, Priestley’s Chart of Biography offered a never-before-seen “picture of time itself.”  

It was no wonder, then, that eighteenth-century readers found themselves drawn to it. A Chart of Biography sold well in both England and the United States, accruing many fans along the way. Along with the New Chart of History, it would go on to be printed in at least 19 editions and spawn numerous imitations, including one by Priestley’s future friend Thomas Jefferson, who developed his own “time chart” of market seasons in Washington, and another by the historian David Ramsay, who acknowledged Priestley’s influence in his Historical and Biographical Chart of the United States. The time charts marked Priestley’s first major commercial success and played a key role in establishing his reputation as a serious intellectual, earning him an honorary degree from the University of Edinburgh, and helping him secure a fellowship nomination to the Royal Society of London.

As much as anything he published, and he published a staggering amount—somewhere between 150 and 200 books, articles, papers, and pamphlets—Priestley’s time charts encapsulate his uniqueness as a thinker. Of his many intellectual gifts, his gift for synthesis—for knitting together the seemingly disparate things that caught his attention—might have been his greatest… 

Read on for how Priestley went on to become the most controversial man in England: “Joseph Priestley Created Revolutionary ‘Maps’ of Time,” by @alysonafoster in @humanitiesmag from @NEHgov.

More info on the Chart– and magnified views– here.

* John Tukey

###

As we celebrate constructive charts, we might spare a thought for Edward Lorenz, a mathematician and meteorologist, best remembered as a pioneer of Chaos Theory; he died on this date in 2008. Having noticed that his computer weather simulation gave wildly different results from even tiny changes in the input data, he began investigating a phenomenon that he famously outlined in a 1963 paper— and that came to be known as the “butterfly effect,” that the flap of a butterfly’s wings could ultimately determine the weather thousands of miles away and days later… generalized in Chaos Theory to state that “slightly differing initial states can evolve into considerably different [later] states.”
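Lorenz's observation is easy to reproduce. Below is a minimal sketch (plain Python with a simple Runge–Kutta integrator, not Lorenz's original computation) of the three-variable system from his 1963 paper, run twice from initial states differing by one part in a million: by the end of the run, the two trajectories bear no resemblance to each other.

```python
# Integrate Lorenz's 1963 three-variable convection model twice, from initial
# states that differ by one part in a million, and measure how far apart the
# trajectories end up. Parameter values are Lorenz's originals.

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz system one step via classical 4th-order Runge-Kutta."""
    def deriv(s):
        x, y, z = s
        return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

    def nudge(s, k, h):
        return tuple(si + h * ki for si, ki in zip(s, k))

    k1 = deriv(state)
    k2 = deriv(nudge(state, k1, dt / 2))
    k3 = deriv(nudge(state, k2, dt / 2))
    k4 = deriv(nudge(state, k3, dt))
    return tuple(
        s + dt / 6 * (a + 2 * b + 2 * c + d)
        for s, a, b, c, d in zip(state, k1, k2, k3, k4)
    )

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-6, 1.0, 1.0)   # the "butterfly": a one-in-a-million nudge
for _ in range(3000):         # 30 time units
    a, b = lorenz_step(a), lorenz_step(b)

separation = sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
print(separation)  # many orders of magnitude larger than the initial 1e-6
```

The exact final separation depends on the step size and run length (that is the point), but it reliably grows from microscopic to the size of the attractor itself.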

source

Written by (Roughly) Daily

April 16, 2024 at 1:00 am

“One of the first conditions of happiness is that the link between man and nature shall not be broken”*…

Flash floods in the Libyan city of Derna were the most deadly climate disaster of 2023, killing 11,300 people

The estimable Jonathan Watts on the (painfully) pivotal year just ended…

As historically high temperatures continued to be registered in many parts of the world in late December, the former Nasa scientist James Hansen told the Guardian that 2023 would be remembered as the moment when failures became apparent.

“When our children and grandchildren look back at the history of human-made climate change, this year and next will be seen as the turning point at which the futility of governments in dealing with climate change was finally exposed,” he said.

“Not only did governments fail to stem global warming, the rate of global warming actually accelerated.”…

“The bright side of this clear dichotomy is that young people may realise that they must take charge of their future. The turbulent status of today’s politics may provide opportunity,” he said.

His comments are a reflection of the dismay among experts at the enormous gulf between scientific warnings and political action. It has taken almost 30 years for world leaders to acknowledge that fossil fuels are to blame for the climate crisis, yet this year’s United Nations Cop28 summit in Dubai ended with a limp and vague call for a “transition away” from them, even as evidence grows that the world is already heating to dangerous levels…

Veteran climate watchers have been horrified at the pace of change. “The climate year 2023 is nothing but shocking, in terms of the strength of climate occurrences, from heatwaves, droughts, floods and fires, to rate of ice melt and temperature anomalies particularly in the ocean,” Prof Johan Rockström, the joint director of the Potsdam Institute for Climate Impact Research in Germany, said.

Aquino [Francisco Eliseu Aquino, a professor of climatology and oceanography at the Federal University of Rio Grande do Sul and the deputy director of Brazil’s polar and climatic centre] said human influence – through the burning of fossil fuels – had also created “frightening” dynamics between the poles and the tropics. Cold wet fronts from the Antarctic had interacted with record heat and drought in the Amazon to create unprecedented storms in between. Floods in southern Brazil killed 51 people in early September and then returned with similarly devastating force in mid-November.

Aquino said this “record record” was a taste of what was to come as the world entered dangerous levels of warming. “From this year onwards, we will understand concretely what it means to flirt with 1.5C [of heating] in the global average temperature and new records for disasters,” he said.

This is already happening. This year’s deadliest climate disaster was the flood in Libya that killed more than 11,300 people in the coastal city of Derna. In a single day, Storm Daniel unleashed 200 times as much rain as usually falls on the city in the entire month of September. Human-induced climate change made this up to 50 times more likely.

Forest fires burned a record area in Canada and Europe and, in August, killed about 100 people in Lahaina on Maui island, the deadliest wildfire in recent US history. For those who prefer to calculate catastrophe in economic terms, the US broke its annual record of billion-dollar disasters by August, by which time there had already been 23.

…as science has proved beyond any doubt, global temperatures would continue to rise as long as humanity continued to burn fossil fuels and forests.

In the years ahead, the heat “anomaly” and catastrophes of 2023 would first become the new norm, and then be looked back on as one of the cooler, more stable years in people’s lives. As Hansen warned, unless there is radical and rapid change, failure will be built into the climate system…

Bracing, but critically important– eminently worth reading in full: “World will look back at 2023 as year humanity exposed its inability to tackle climate crisis, scientists say,” from @jonathanwatts in @guardian.

* Leo Tolstoy

###

As we face facts, we might recall that it was on this date in 1870 that Congress authorized the formation of the U.S. weather service (later named the Weather Bureau; later still, the National Weather Service), and placed it under the direction of the Army Signal Corps.  Cleveland Abbe, who had started the first private weather reporting and warning service (in Cincinnati) and had been issuing weather reports or bulletins since 1869, was the only person in the country at the time who was experienced in drawing weather maps from telegraphic reports and forecasting from them.  He became the weather service’s inaugural chief scientist– effectively its founding head– in January, 1871.  The first U.S. meteorologist, he is known as the “father of the U.S. Weather Bureau,” where he systemized observation, trained personnel, and established scientific methods.  He went on to become one of the 33 founders of the National Geographic Society.

Cleveland Abbe

source

“Criticism may not be agreeable, but it is necessary. It fulfills the same function as pain in the human body. It calls attention to an unhealthy state of things.”*…

The estimable Henry Farrell on why, on average, we’re better at criticizing others than thinking originally ourselves…

… our individual reasoning processes are biased in ways that are really hard for us (individually) to correct. We have a strong tendency to believe our own bullshit. The upside is that if we are far better at detecting bullshit in others than in ourselves, and if we have some minimal good faith commitment to making good criticisms, and entertaining good criticisms when we get them, we can harness our individual cognitive biases through appropriate group processes to produce socially beneficial ends. Our ability to see the motes in others’ eyes while ignoring the beams in our own can be put to good work, when we criticize others and force them to improve their arguments. There are strong benefits to collective institutions that underpin a cognitive division of labor.

This superficially looks to resemble the ‘overcoming bias’/‘less wrong’ approaches to self-improvement that are popular on the Internet. But it ends up going in a very different direction: collective processes of improvement rather than individual efforts to remedy the irremediable. The ideal of the individual seeking to eliminate all sources of bias so that he (it is, usually, a he) can calmly consider everything from a neutral and dispassionate perspective is replaced by a Humean recognition that reason cannot readily be separated from the desires of the reasoner. We need negative criticisms from others, since they lead us to understand weaknesses in our arguments that we are incapable of coming at ourselves, unless they are pointed out to us…

… It’s not about a radical individual virtuosity, but a radical individual humility. Your most truthful contributions to collective reasoning are unlikely to be your own individual arguments, but your useful criticisms of others’ rationales. Even more pungently, you are on average best able to contribute to collective understanding through your criticisms of those whose perspectives are most different to your own, and hence very likely those you most strongly disagree with. The very best thing that you may do in your life is create a speck of intense irritation for someone whose views you vigorously dispute, around which a pearl of new intelligence may then accrete…

… One of my favourite passages from anywhere is the closing of Middlemarch, where Eliot says of Dorothea:

“Her full nature, like that river of which Cyrus broke the strength, spent itself in channels which had no great name on the earth. But the effect of her being on those around her was incalculably diffusive: for the growing good of the world is partly dependent on unhistoric acts; and that things are not so ill with you and me as they might have been, is half owing to the number who lived faithfully a hidden life, and rest in unvisited tombs.”

Striving to be a Dorothea is a noble vocation, and likely the best we can hope for in any event; sooner or later, we will all be forgotten. In the long course of time, all of our arguments and ideas will be broken down and decomposed. At best we may hope, if we are very lucky, that they will contribute in some minute way to a rich humus, from which plants that we will never see or understand might spring.

Eminently worth reading in full: “In praise of negativity,” from @henryfarrell.

* Winston Churchill

###

As we contemplate the constructive, we might recall that it was on this date in 1871 that a discipline wholly dependent on incorporating corrective critique into its methods was founded:  Cleveland Abbe became the founding chief scientist– effectively the head– of the newly formed U.S. Weather Service (later named the Weather Bureau; later still, the National Weather Service). 

Abbe had started the first private weather reporting and warning service (in Cincinnati) and had been issuing weather reports or bulletins since 1869 and was the only person in the country at the time who was experienced in drawing weather maps from telegraphic reports and forecasting from them. The first U.S. meteorologist, he is known as the “father of the U.S. Weather Bureau,” where he systemized observation, trained personnel, and established scientific methods.  He went on to become one of the 33 founders of the National Geographic Society.

source

“Dust to dust”*…

Back in 2016, we visited Jay Owens and her fascinating newsletter on dust… which went silent a couple of years later. Those of us who missed it, and were worried about its author, were relieved to learn that she’d pulled back in order to turn her thinking into a book. That book is now here: Dust: The Modern World in a Trillion Particles, and The Guardian is here with an excerpt…

… Nobody normally thinks about dust, what it might be doing or where it should go: it is so tiny, so totally, absolutely, mundane, that it slips beneath the limits of vision. But if we pay attention, we can see the world within it.

Before we go any further, I should define my terms. What do I mean by dust? I want to say everything: almost everything can become dust, given time. The orange haze in the sky over Europe in the spring, the pale fur that accumulates on my writing desk and the black grime I wipe from my face in the evening after a day traversing the city. Dust gains its identity not from a singular material origin, but instead through its form (tiny solid particles), its mode of transport (airborne) and, perhaps, a certain loss of context, an inherent formlessness. If we knew precisely what it was made of, we might not call it dust, but instead dander or cement or pollen. “Tiny flying particles,” though, might suffice as a practical starting definition…

Dust is simultaneously a symbol of time, decay and death – and also the residue of life. Its meaning is never black or white, but grey and somewhat fuzzy. Living with dust – as we must – is a slow lesson in embracing contradiction: to clean, but not identify with cleanliness; to respect the material need for hygiene while distrusting it profoundly as a social metaphor…

A fascinating sample of a fascinating book: “Empire of dust: what the tiniest specks reveal about the world,” from @hautepop in @guardian.

Pair with: “Nothing is built on stone; all is built on sand” and “In every grain of sand there is the story of the earth.”

* from the burial service in the Book of Common Prayer

###

As we examine the elemental, we might send exploratory birthday greetings to Friedrich Wilhelm Heinrich Alexander von Humboldt; he was born on this date in 1769.  The younger brother of the Prussian minister, philosopher, and linguist Wilhelm von Humboldt, Alexander was a geographer, geologist, archaeologist, naturalist, explorer, and champion of Romantic philosophy.  Among many other contributions to human knowledge, his quantitative work on botanical geography laid the foundation for the field of biogeography; he surveyed and collected geological, zoological, botanical, and ethnographic specimens, including over 60,000 rare or new tropical plants.

As a geologist, Humboldt made pioneering observations of geological stratigraphy, structure (he named the Jurassic System), and geomorphology; and he understood the connections between volcanism and earthquakes. His advocacy of long-term systematic geophysical measurement laid the foundation for modern geomagnetic and meteorological monitoring.

For more, see: The Invention of Nature: Alexander Von Humboldt’s New World.

source