(Roughly) Daily


“Managed retreat is not just a last resort. It is not a failure to adapt at all. It is actually an active decision to adapt.”*…

Community High School, Valmeyer, Illinois

The town of Valmeyer, Illinois relocated decades ago after devastating floods. It may have lessons for communities forced to consider a managed retreat from climate impacts today…

In the summer of 1993, the southwestern Illinois town of Valmeyer took the brunt of a massive flood when, not once but twice in a month, the swollen Mississippi River topped its levee system. The village was engulfed in up to 16ft (5m) of floodwater that lingered for months, damaging some 90% of buildings.

Faced with either rebuilding the town and risking yet another disaster, or simply scattering to other towns or states by themselves, the 900 residents of this tight-knit farming community made a bold choice: to pack up everything and start over on new ground.

In the years that followed, hundreds of people moved out of the floodplain as the entire town was rebuilt from scratch on a bluff a mile uphill. In doing so, the town has become an early example of one of the most radical ways a community can adapt to a warming world: moving people and assets out of harm’s way.

Known as managed retreat, or planned relocation, the approach is often framed as a last resort to be pursued only when no other alternatives exist. But as the effects of climate change intensify, exposing more and more people across the globe to the risk of catastrophic flooding, devastating fires and other calamitous natural hazards, the concept is increasingly making its way into the mainstream as a viable – and necessary – adaptation strategy…

When one can’t resist the effects of climate change (e.g., with a sea wall to hold back rising water levels), or accommodate them (e.g., using air cooling and “greening” to combat rising temperatures), the remaining option is retreat: “The Illinois town that got up and left,” from @BBC_Future.

See also: “Managed Retreat in the United States.”

* Miyuki Hino

###

As we rethink relocation, we might recall that it was on this date in 1913 that rain storms led to floodwaters from the Great Miami River reaching Dayton, Ohio– causing the Great Dayton Flood, which lasted another five days. The volume of water that passed through Dayton during this storm equaled the monthly flow over Niagara Falls; downtown Dayton was submerged up to 20 feet.

More than 360 people died; 65,000 were displaced; and nearly 1,400 horses and 2,000 other domestic animals were killed. Some 20,000 homes were destroyed, and buildings were moved off their foundations. Property damage to homes, businesses, factories, and railroads was estimated at more than $100 million in 1913 dollars (more than $2 billion in today’s dollars).

source

“The highly motivated people in society are the ones causing all the trouble. It’s not the lazy unmotivated folks sitting in front of a TV eating potato chips who bother anyone.”*…

Americans consume about 1.85 billion pounds of potato chips annually– around 6.6 pounds per person. A history of our favorite snack…

When Covid-19 forced people to stay home, many of us found solace in a snack: potato chips. The crispy treats enjoyed around a $350 million increase in sales from 2019 to 2020. When the chips are down, it seems, Americans gobble them up.

Any search for the origins of this signature finger food must lead to George Crum (born George Speck), a 19th-century chef of Native and African American descent who made his name at Moon’s Lake House in the resort town of Saratoga Springs, New York. As the story goes, one day in 1853, the railroad and shipping magnate Cornelius Vanderbilt was eating at Moon’s when he ordered his fried potatoes be returned to the kitchen because they were too thick. Furious with such a fussy eater, Crum sliced some potatoes as slenderly as he could, fried them to a crisp and sent them out to Vanderbilt as a prank. Rather than take the gesture as an insult, Vanderbilt was overjoyed.

Other patrons began asking for Crum’s “Saratoga Chips,” which soon became a hit far beyond Upstate New York. In 1860, Crum opened his own restaurant near Saratoga known as Crum’s House or Crum’s Place, where a basket of potato chips sat invitingly on every table. Crum oversaw the restaurant until retiring over 30 years later; in 1889, a New York Herald writer called him “the best cook in America.” Crum died in 1914, but today’s astounding variety of potato chips, from cinnamon-and-sugar Pringles to flamin’ hot dill pickle Lay’s, is a tribute to the man American Heritage magazine called “the Edison of grease”…

George Crum, whose exasperation with Cornelius Vanderbilt reputedly helped spark America’s craze for potato chips

A fussy magnate, a miffed chef and the curious roots of the comfort food we hate to love: “How the Potato Chip Took Over America,” from @SmithsonianMag.

* George Carlin

###

As we try to eat just one, we might recall that it was on this date in 1965 that a storage tank ruptured and three million gallons of soybean oil flooded the streets of Mankato, Minnesota, then flowed into the Minnesota River.

Stan Eichten was working in the office of Honeymead Soybean Products [the largest oil-processing facility in the world at the time]… when one of the biggest environmental disasters in state history hit.

“There was a roar, like an explosion,” said Eichten of the rupture of a soy oil storage tank that sent millions of gallons into Mankato streets and the Minnesota River. “It was almost like a tsunami. There was oil 2 or 3 feet deep all over.”

source

source

“Speed is carrying us along, but we have yet to master it”*…

Kitchen #26 (2021) by Samuel Richardson

A call to contemplate the potential negative effects of internet technology along with its promise…

Conversations about technology tend to be dominated by an optimistic faith in technological progress, and headlines about new technologies tend to be peppered with deterministic language assuring readers of all the wonderful things these nascent technologies “will” do once they arrive. There is endless encouragement to think about all of the exciting benefits to be enjoyed if everything goes right, but significantly less attention is usually paid to the ways things might go spectacularly wrong.

In the estimation of philosopher Paul Virilio, the refusal to seriously contemplate the chance of failure can have calamitous effects. As he evocatively put it in 1997’s Open Sky, “Unless we are deliberately forgetting the invention of the shipwreck in the invention of the ship or the rail accident in the advent of the train, we need to examine the hidden face of new technologies, before that face reveals itself in spite of us.” Virilio’s formulation is a reminder that along with new technologies come new types of dangerous technological failures. It may seem obvious today that there had never been a car crash before the car was invented, but what future wrecks are being overlooked today amidst the excited chatter about AI, the metaverse, and all things crypto?

Virilio’s attention to accidents is a provocation to look at technology differently. To foreground the dangers instead of the benefits, and to see ourselves as the potential victims instead of as the smiling beneficiaries. As he put it in Pure War, first published in 1983, “Every technology produces, provokes, programs a specific accident.” Thus, the challenge becomes looking for the “accident” behind the technophilic light show — and what’s more, to find it before the wreckage starts to pile up. 

Undoubtedly, this is not the most enjoyable way to look at technology. It is far more fun to envision yourself enjoying the perfect meal prepared for you by your AI butler than to imagine yourself caught up in a Kafkaesque nightmare after the AI system denies your loan application. Nevertheless, if Virilio was right to observe that “the invention of the highway was the invention of 300 cars colliding in five minutes,” it would be wise to start thinking seriously about the crashes that await us as we accelerate down the information superhighway… 

The work of Paul Virilio urges us to ask: What future disasters inhere in today’s technologies? “Inventing the Shipwreck” from Zachary Loeb (@libshipwreck) in @_reallifemag. Eminently worth reading in full.

For a look at those who don’t just brush aside Virilio’s caution, but actively embrace speed and the chaos that it can cause:

Accelerationism holds that the modern, Western democratic state is so mired in corruption and ineptitude that true patriots should instigate a violent insurrection to hasten its destruction and allow a new, white-dominated order to emerge. Indeed, some of the foremost exponents of accelerationism today were at the U.S. Capitol on January 6. They included the Oath Keepers, whose grab-bag ideology of paranoid anti-federalism envisions a restoration of “self-government” and “natural rights”; QAnon adherents, who remain convinced that the 2020 presidential election was stolen and that former President Donald Trump was thwarted from saving the world from a Satan-worshipping pedophilia ring run by Democrats, Jews, and other agents of the deep state; and, of course, Trump’s own die-hard “Stop the Steal” minions, who, against all reason and legal proof, seek to restore the former president to office.

The objective of accelerationism is to foment divisiveness and polarization that will induce the collapse of the existing order and spark a second civil war…

Read the full piece: “A Year After January 6, Is Accelerationism the New Terrorist Threat?”

* Paul Virilio

###

As we practice prudence, we might recall that it was on this date in 1854 that Anthony Fass, a Philadelphia piano maker, was awarded the first U.S. patent (#11062) for an accordion.  (An older patent existed in Europe, issued in Vienna in 1829 to Cyrill Demian.)

“Music helps set a romantic mood. Imagine her surprise when you say, ‘We don’t need a stereo – I have an accordion’.”  – Martin Mull

“A gentleman is someone who can play the accordion, but doesn’t.”  – Tom Waits

Fass’s accordion patent

source

“History is humankind trying to get a grip. Obviously it’s not easy. But it could go better if you would pay a little more attention to certain details, like for instance your planet.”*…

A blast from the past…

In 1938, 20-year-old filmmaker Richard H. Lyford directed and starred in As the Earth Turns, a science-fiction silent movie about a mad scientist who purposely induces climate change as a way to end world violence.

But the 45-minute film became “lost,” only to resurface 80 years later, in 2018, when Lyford’s grandniece, Kim Lyford Bishop, discovered it. (After creating the film, Lyford went on to work at Disney and earn an Oscar for the 1950 documentary “The Titan: Story of Michelangelo.”)

Bishop then asked music composer Ed Hartman, who was her daughter’s percussion teacher, to score it.

Although “As the Earth Turns” was released in 2019 and screened at 123 film festivals, it will finally premiere on television on Halloween night, this Sunday, on Turner Classic Movies at 9pm PST…

From The Seattle Times:

… “As the Earth Turns is the work of an exuberant, ambitious young man: Lyford wrote, directed and shot the film, and managed to corral a stable of actors and crew to capture his vision. You can see his fascination with the craft of filmmaking: Lyford experiments with miniatures and models (then used in Hollywood films, and a remarkable accomplishment for a barely-out-of-his-teens hobbyist), explosions, earthquakes and special makeup effects, all on a budget of next to nothing.”

“A 1938 sci-fi film about climate change was lost. It’s making its TV debut 83 years later,” from Carla Sinclair (@Carla_Sinclair) and @BoingBoing.

* Kim Stanley Robinson, New York 2140

###

As we ponder prescience, we might recall that it was on this date in 2012 that Hurricane Sandy (AKA Superstorm Sandy) hit the east coast of the United States, killing 148 directly and 138 indirectly, wreaking nearly $70 billion in damages, and causing major power outages. In New York City, streets, tunnels, and subway lines were flooded.

source

“People in any organization are always attached to the obsolete – the things that should have worked but did not, the things that once were productive and no longer are”*…

Ed Zitron argues that America has too many managers – and misbehaving ones at that…

In a 2016 Harvard Business Review analysis, two writers calculated the annual cost of excess corporate bureaucracy as about $3 trillion, with an average of one manager per every 4.7 workers. Their story mentioned several case studies—a successful GE plant with 300 technicians and a single supervisor, a Swedish bank with 12,000 workers and three levels of hierarchy—that showed that reducing the number of managers usually led to more productivity and profit. And yet, at the time of the story, 17.6 percent of the U.S. workforce (and 30 percent of the workforce’s compensation) was made up of managers and administrators—an alarming statistic that shows how bloated America’s management ranks had become.

The United States, more than anywhere else in the world, is addicted to the concept of management. As I’ve written before, management has become a title rather than a discipline. We have a glut of people in management who were never evaluated on their ability to manage before being promoted to their role. We have built corporate America around the idea that if you work hard enough, one day you might become a manager, someone who makes rather than takes orders. While this is not the only form of management, based on the response to my previous article and my newsletters on the subject, this appears to be how many white-collar employees feel. Across disparate industries, an overwhelming portion of management personnel is focused more on taking credit and placing blame rather than actually managing people, with dire consequences.

This type of “hall monitor” management, as a practice, is extremely difficult to execute remotely, and thus the coming shift toward permanent all- or part-remote work will lead to a dramatic rethinking of corporate structure. Many office workers—particularly those in industries that rely on the skill or creativity of day-to-day employees—are entering a new world where bureaucracy will be reduced not because executives have magically become empathetic during the pandemic, but because slowing down progress is bad business. In my eyes, that looks like a world in which the power dynamics of the office are inverted. With large swaths of people working from home some or all of the time, managers will be assessed not on their ability to intimidate other people into doing things, but on their ability to provide their workers with the tools they need to measurably succeed at their job.

In order to survive, managers, in other words, will need to start proving that they actually do something. What makes this shift all the more complicated is that many 21st-century, white-collar employees don’t necessarily need a hands-on manager to make sure they get their work done…

The pandemic has laid bare that corporate America disrespects entry-level workers. At many large companies, the early years of your career are a proving ground with little mentorship and training. Too many companies hand out enormous sums to poach people trained elsewhere, while ignoring the way that the best sports teams tend to develop stars—by taking young, energetic people and investing in their future (“trust the process,” etc.). This goes beyond investing in education and courses; it involves taking rising stars in your profession and working to make them as good as your top performer.

In a mostly remote world, a strong manager is someone who gets the best out of the people they’re managing, and sees the forest from the trees—directing workers in a way that’s informed by both experience and respect. Unfortunately, the traditional worker-to-manager pipeline often sets people up for inefficiency and failure. It’s the equivalent of taking a pitcher in their prime and making them a coach—being good at one thing doesn’t mean you can make other people good at the same thing. This is known as the Peter principle, a management concept developed by Laurence J. Peter in the late ’60s that posits that a person who’s good at their job in a hierarchical organization will invariably be promoted to a position that requires different skills, until they’re eventually promoted to something they can’t do, at which point they’ve reached their “maximum incompetence.” Consistent evidence shows that the principle is real: A study of sales workers at 214 firms by the National Bureau of Economic Research found that firms prioritize current job performance in promotion decisions over whether the person can actually do the job for which they’re being considered. In doing so, they’re placing higher value on offering the incentive of promotion to get more out of their workers, at the cost of potentially injecting bad management into their organization.

What I’m talking about here is a fundamental shift in how we view talent in the workplace. Usually, when someone is good at their job, they are given a soft remit to mentor people, but rarely is that formalized into something that is mutually beneficial. A lack of focus on fostering talent is counterintuitive, and likely based on a level of fear that one could train one’s own replacement, or that a business could foster its own competition. This is a problem that could be solved by paying people more money for being better at their job. Growing talent is also a more sustainable form of business—one that harkens back to the days of apprenticeships—where you’re fostering and locking up talent so that it doesn’t go elsewhere, and doesn’t cost you time and money to have to recruit it (or onboard it, which costs, on average, more than $4,000 a person). Philosophically, it changes organizations from a defensive position (having to recruit to keep up) to an offensive position (building an organization from within), and also greatly expands an organization’s ability to scale affordably…

The problem is that modern American capitalism has equated “getting the most out of someone” with “getting the most hours out of them,” rather than getting the most value out of them. “Success,” as I’ve discussed before, is worryingly disconnected from actually succeeding in business.

Reducing bureaucracy is also a net positive for the labor market, especially for young people. Entry-level corporate work is extremely competitive and painful, a years-long process in which you’re finding your footing in an industry and an organization. If we can change the lens through which we view those new to the workforce—as the potential hotshots of the future, rather than people who have to prove themselves—we’ll have stronger organizations that waste less money. We should be trying to distill and export the talents of our best performers, and give them what they need to keep doing great things for our companies while also making their colleagues better too.

All of this seems inevitable, to me, because a remote future naturally reconfigures the scaffolding of how work is done and how workers are organized. The internet makes the world a much smaller place, which means that simple things such as keeping people on task don’t justify an entire position—but mentorship and coaching that can get the best out of each worker do.

Hopefully we can move beyond management as a means of control, and toward a culture that appreciates a manager who fosters and grows the greatness in others.

The pandemic has exposed a fundamental weakness in the system: “Say Goodbye to Your Manager,” from @edzitron.

* Peter Drucker

###

As we reorganize, we might recall that it was on this date in 1852 that Henri Giffard made the first powered and controlled flight of an airship, traveling 27 km from Paris to Élancourt in his “Giffard dirigible.”

Airships were the first aircraft capable of controlled powered flight, and were most commonly used before the 1940s, most of them filled with highly flammable hydrogen gas. Their use declined as their capabilities were surpassed by those of airplanes – and then plummeted after a series of high-profile accidents, including the 1930 crash and burning of the British R101 in France; the 1933 and 1935 storm-related crashes of the U.S. Navy’s twin helium-filled rigid airborne aircraft carriers, the USS Akron and USS Macon, respectively; and – most famously – the 1937 burning of the German hydrogen-filled Hindenburg.

The Giffard dirigible [source]

Written by (Roughly) Daily

September 24, 2021 at 1:00 am
