(Roughly) Daily

“With cities, it is as with dreams: everything imaginable can be dreamed”*…

 

The Transect

In 2012, the San Francisco Planning and Urban Research Association mounted an exhibition, “Grand Reductions: Ten diagrams that changed urban planning.”

The exhibition’s title – Grand Reductions – suggests the simple illustration’s power to encapsulate complex ideas. And for that reason the medium has always been suited to the city, an intricate organism that has been re-imagined (with satellite towns! in rural grids! in megaregions!) by generations of architects, planners and idealists. In the urban context, diagrams can be powerful precisely because they make weighty questions of land use and design digestible in a single sweep of the eye. But… they can also seductively oversimplify the problems of cities…

“The diagram can cut both ways: It can either be a distillation in the best sense of really taking a very complex set of issues and providing us with a very elegant communication of the solution,” [curator Benjamin] Grant says. “Or it can artificially simplify something that actually needs to be complex.”…

The high concepts that have informed the design of cities over the last century: “The Evolution of Urban Planning in 10 Diagrams.”

See also: “The cities and mansions that people dream of are those in which they finally live”*… and of course, Jane Jacobs’ The Death and Life of Great American Cities and Christopher Alexander’s A New Theory of Urban Design.

* Italo Calvino

###

As we muse on metropoles, we might send exploratory birthday greetings to Eugene Fodor; he was born on this date in 1905.  Noting that the travel guides of his time were boring, he wrote a guide to Europe, On the Continent—The Entertaining Travel Annual, which was published in 1936– and became the cornerstone of a travel publishing empire– the Fodor’s Guides.  He was elected to the American Society of Travel Agents (ASTA) World Travel Congress Hall of Fame, the only travel editor ever to be so honored.

In 1974, it was revealed that Fodor, a Hungarian-American who had joined the U.S. Army during World War II, had transferred to the Office of Strategic Services (the forerunner of the CIA) and served as a spy behind Nazi lines in occupied Hungary, Czechoslovakia, and Poland.

 source

 

Written by LW

October 14, 2017 at 1:01 am

“It’s the end of the world as we know it”*…

 

“The probability of global catastrophe is very high,” the Bulletin of the Atomic Scientists warned in setting the Doomsday Clock 2.5 minutes before midnight earlier this year. On nuclear weapons and climate change, “humanity’s most pressing existential threats,” the Bulletin’s scientists found that “inaction and brinkmanship have continued, endangering every person, everywhere on Earth.”

Every day, it seems, brings with it fresh new horrors. Mass murder. Catastrophic climate change. Nuclear annihilation.

It’s all enough to make a reasonable person ask: How much longer can things go on this way?

A Princeton University astrophysicist named J. Richard Gott has a surprisingly precise answer to that question…

Gott applies straightforward logic and the laws of probability to setting our exit date.  “Calculations” haven’t worked out so well for Mayan seers or the likes of Harold Camping; but as you’ll read, Gott has tested his method and done remarkably well… so: “We have a pretty good idea of when humans will go extinct.”
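For the curious, here is a minimal sketch of the “delta-t” reasoning behind Gott’s estimate, under stated assumptions: the roughly 200,000-year figure for our species’ past existence is the one usually quoted, and the function name is mine, not Gott’s.

```python
# A minimal sketch of J. Richard Gott's "delta-t" (Copernican) argument.
# Assumption: Homo sapiens has existed for roughly 200,000 years.
# If we observe our species at a random moment in its total lifetime, the
# elapsed fraction r = t_past / t_total is effectively uniform on (0, 1).
# With confidence c, r lies between (1 - c)/2 and (1 + c)/2, so the future
# duration t_future = t_past * (1 - r) / r falls in the interval below.

def gott_interval(t_past: float, confidence: float = 0.95) -> tuple[float, float]:
    """Return (lower, upper) bounds on the remaining duration, in the same units as t_past."""
    r_hi = (1 + confidence) / 2   # elapsed fraction near its upper bound -> little time left
    r_lo = (1 - confidence) / 2   # elapsed fraction near its lower bound -> lots of time left
    lower = t_past * (1 - r_hi) / r_hi
    upper = t_past * (1 - r_lo) / r_lo
    return lower, upper

if __name__ == "__main__":
    low, high = gott_interval(200_000)   # assumed years of past human existence
    print(f"95% confidence: between {low:,.0f} and {high:,.0f} more years")
```

Run as written, the sketch reproduces the figures commonly cited for Gott’s method: somewhere between about 5,100 and 7.8 million more years for humanity.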

* REM

###

As we plan our parties, we might recall that it was on this date in (what we now call) 46 BCE that the final year of the pre-Julian Roman calendar began.  The Romans had added a leap month every few years to keep their lunar calendar in sync with the solar year, but had missed a few amid the chaos of the civil wars of the late Republic. Julius Caesar added two extra leap months to recalibrate the calendar in preparation for his calendar reform, which went into effect in (what we now know as) 45 BCE.  The year, which had 445 days, was thus known as annus confusionis (“year of confusion”).

Fragmentary fresco of a pre-Julian Roman calendar

source

 

 

Written by LW

October 13, 2017 at 1:01 am

“I just saw some idiot at the gym put a water bottle in the Pringles holder on the treadmill”*…

 

 source

Like millions of other people, I put a fair amount of effort into “being healthy.”  I don’t smoke, try to eat a reasonable diet, and so forth.  I do all of this with the backing of a strong scientific consensus that such behaviors are likely to be very good for my health and longevity.  None of this makes me special in any way; I am trying to follow what one might call the medical truth of health.

What I want to suggest here is that there is a dark underside to all that healthy behavior. The underside is that healthy behavior encourages the view that individuals are largely responsible for their own health outcomes, and that if people end up unhealthy or diseased, it’s their fault for not having engaged in sufficiently healthy behaviors.  Call this a “social truth” of health.  This social truth has real consequences. On the one hand, if individuals are to blame for their poor health, then they should bear a lot of the cost of their disease.  After all, there is a sense in which they “chose” to be sick because of their unhealthy lifestyle.  On the other hand, policies designed to create healthier environments or to reduce structural factors associated with poor health outcomes, like poverty, start to seem less important.

“Healthism,” as a prescient article from 1980 called it, has been a growing part of the American social landscape since the 1970s, when jogging emerged as a fitness trend.  The rise of healthism coincides with the rise of neoliberalism, a loosely grouped set of policies that aim at analyzing all parts of society in economic terms, expanding the reach of actual markets, encouraging competitive behavior between individuals, and encouraging people to view their lives in entrepreneurial terms (for example, treating education as an investment the value of which is measured in terms of its probable future returns in the form of higher income). Because of the focus on individuals and market behaviors, neoliberal governance tends not to see systemic or public problems except insofar as they can be reduced to the problems of individuals…

It is in this context that we need to see our healthy lifestyles and the dilemma they pose.  It is obvious that those who have the good fortune and the means can and should want to be healthy, for its own sake.  On the other hand, the effort to be healthy directly feeds a narrative that says that poor health is the product of poor management, in the way that poor returns on financial investments might be.  No one is ever simply “healthy;” even health today may hide illness to come, future illness that must be detected and prevented.

The problem is that the wellness narrative causes us to over-estimate the degree to which it is fair to blame individuals for their health outcomes…

The importance of separating what is an individually healthy behavior from good health policy: “Is Your Healthy Lifestyle Bad For You?”

* meme

###

As we emphasize empathy, we might recall that it was on this date in 1850 that the first classes were held at The Female Medical College of Pennsylvania, the second medical school in the U.S. exclusively for women.  (The New England Female Medical College had been established two years earlier.)  It soon changed its name to The Woman’s Medical College of Pennsylvania, then, much later, was renamed The Medical College of Pennsylvania after opening its doors to men in 1970.

The school’s first building

source

 

Written by LW

October 12, 2017 at 1:01 am

“Don’t ever forget two things I’m going to tell you. One, don’t believe everything that’s written about you. Two, don’t pick up too many checks.”*…

 

Ruth stepped out of the box after strike one, then stepped out again after strike two. Tired of being heckled, he pointed two fingers, which is where the controversy begins. In the legend, he was pointing to the center-field seats, four-hundred-plus feet away, calling his shot in the way of Minnesota Fats saying, “Eight ball, corner pocket.” Root’s third pitch was a curve—the deuce. Off the edge of the plate, down, but Ruth swung anyway, sending it into deep afternoon. It landed exactly where he’d pointed, that’s what they said, beside the flagpole in back of the bleachers—490 feet from home. Lou Gehrig followed with another home run. The Yankees won 7 to 5 and went on to sweep the Series.

Ruth’s “Called Shot” is among the most famous plays in baseball history. Drawings show the penultimate moment: Babe, Bambino, the Sultan of Swat, arm outstretched, two fingers raised like the Pope giving a benediction. There’s a statue, movies. But it was disputed from the start. Did Ruth really call his shot, or did it just look that way?

Grantland Rice and Westbrook Pegler, among the most famous sportswriters of the day, had been watching from the press box behind home. Both claimed to have seen Ruth point to center, calling his shot. Franklin Roosevelt, then candidate for president, was at the game—he threw out the first pitch—and he saw it, too. Ditto Chicago Mayor Anton Cermak. Among the last living witnesses is retired Supreme Court Justice John Paul Stevens, who, then a twelve-year-old Cubs fan, was at the game with his father. The Cubs pitcher “Guy Bush was razzing Ruth,” Stevens told the writer Ed Sherman. “He and Ruth were in some kind of discussion back and forth. I heard years later it was over the Cubs being tightfisted and not giving a full share to Mark Koenig. I do remember Bush came out of the dugout and engaged in a colloquy with him … My interpretation was that he was responding to what Bush was saying. He definitely pointed toward center field. My interpretation always was, ‘I’m going to knock you to the moon.’ ”…

The most iconic event ever to occur in Wrigley Field did not star the Cubs—it unfolded in 1932, and starred the New York Yankees, with the home team serving merely as foil: The story of Babe Ruth’s most famous homer, “The Called Shot.”

* Babe Ruth

###

As we beckon to the bleachers, we might recall that it was on this date in 1911 that Ty Cobb was awarded the Chalmers Prize (an automobile), the equivalent of today’s MVP Award.  The “Georgia Peach” had achieved a 40-game hitting streak and a .420 batting average, the highest in the league and a record at the time; he led the league that year in numerous other categories as well, including 248 hits, 147 runs scored, 127 RBI, 83 stolen bases, 47 doubles, 24 triples, and a .621 slugging percentage. Cobb hit eight home runs but finished second in that category to Frank Baker, who hit eleven.

Ty Cobb, left, and Joe Jackson, whom he bested for the 1911 batting title

source

Written by LW

October 11, 2017 at 1:01 am

“We tend to overestimate the effect of a technology in the short run and underestimate the effect in the long run”*…

 

We are surrounded by hysteria about the future of artificial intelligence and robotics—hysteria about how powerful they will become, how quickly, and what they will do to jobs.

I recently saw a story in MarketWatch that said robots will take half of today’s jobs in 10 to 20 years. It even had a graphic to prove the numbers.

The claims are ludicrous. (I try to maintain professional language, but sometimes …) For instance, the story appears to say that we will go from one million grounds and maintenance workers in the U.S. to only 50,000 in 10 to 20 years, because robots will take over those jobs. How many robots are currently operational in those jobs? Zero. How many realistic demonstrations have there been of robots working in this arena? Zero. Similar stories apply to all the other categories where it is suggested that we will see the end of more than 90 percent of jobs that currently require physical presence at some particular site.

Mistaken predictions lead to fears of things that are not going to happen, whether it’s the wide-scale destruction of jobs, the Singularity, or the advent of AI that has values different from ours and might try to destroy us. We need to push back on these mistakes. But why are people making them? I see seven common reasons…

Mistaken extrapolations, limited imagination, and other common mistakes that distract us from thinking more productively about the future: Rodney Brooks on “The Seven Deadly Sins of AI Predictions.”

* Roy Amara, co-founder of The Institute for the Future

###

As we sharpen our analyses, we might recall that it was on this date in 1995 that The Media Lab at the Massachusetts Institute of Technology chronicled the World Wide Web in its A Day in the Life of Cyberspace project.

To celebrate its 10th anniversary, the Media Lab had invited submissions for the days leading up to October 10, 1995, on a variety of issues related to technology and the Internet, including privacy, expression, age, wealth, faith, body, place, languages, and the environment.  Then on October 10, a team at MIT collected, edited, and published the contributions to “create a mosaic of life at the dawn of the digital revolution that is transforming our planet.”

source

 

 

Written by LW

October 10, 2017 at 1:01 am

“I must begin, not with hypothesis, but with specific instances, no matter how minute”*…

 

Paul Klee’s notebooks (notes for the classes he taught at the Bauhaus)– 3,900 pages of them– have been digitized and made available online by the Zentrum Paul Klee in Bern.

* Paul Klee

###

As we get specific, we might recall that this was a bad day for inclusiveness in Massachusetts in 1635: the General Court of the then-Colony banished Roger Williams for speaking out for the separation of church and state and against the right of civil authorities to punish religious dissension and to confiscate Indian land.  Williams moved out to the edge of Narragansett Bay, where, with the assistance of the Narragansett tribe, he established a settlement at the junction of two rivers, in (what is now) Rhode Island. He declared the settlement open to all those seeking freedom of conscience and the removal of the church from civil matters– and many dissatisfied Puritans came. Taking the success of the venture as a sign from God, Williams named the community “Providence.”

Williams stayed close to the Narragansett Indians and continued to protect them from the land greed of European settlers. His respect for the Indians, his fair treatment of them, and his knowledge of their language enabled him to carry on peace negotiations between natives and Europeans, until the eventual outbreak of King Philip’s War in the 1670s.  And although Williams preached to the Narragansett, he practiced his principle of religious freedom by refraining from attempts to convert them.

Roger Williams statue, Roger Williams Park, Providence, R.I.

source

 

 

Written by LW

October 9, 2017 at 1:01 am

“I went to a restaurant that serves ‘breakfast at any time.’ So I ordered French Toast during the Renaissance.”*…

 

Casual dining chains — industry parlance for economical sit-down restaurants like Fridays, Applebee’s, Chili’s, and Buffalo Wild Wings — have subsisted in a dismal and persistent state of decline for about a decade. But in the last two years, things have gotten worse, with the number of people eating at casual dining chains overall falling every single month since June 2015; they are now the worst-performing segment of the entire restaurant industry. In recent months, Applebee’s has said it will close 135 locations this year; Buffalo Wild Wings will shed at least 60. Ruby Tuesday closed 109 restaurants last year, and put the whole company up for sale in March. Friendly’s, Bennigan’s, Joe’s Crab Shack, and Logan’s Roadhouse have all filed for bankruptcy.

Whatever your feelings about casual dining chains, they have been a vital part of the way that many Americans eat since the 1930s, when Howard Johnson began blanketing the highways with his trademark orange-and-teal restaurants — temples to affordable, quality fare in a wholesome setting. After plodding along for some 50 years, the genre exploded during the 1980s, as America entered a period of sustained economic growth and chains like Fridays, Olive Garden, and Applebee’s saturated suburban landscapes with their bland, softly corporate vision of good times and good food. While the brands and the fads have changed — RIP fried-clam sandwich, hello baby back ribs and buffalo sliders — the formula has remained more or less unchanged over the decades: middlebrow menu, solid value, and friendly service, consistently executed, from Pasadena to Tallahassee. Until recently, it was a formula that worked across cuisines, state lines, and demographics…

TGI Fridays and Applebee’s and their ilk are struggling as the American middle class and its enormous purchasing power withers away in real time, with the country’s population dividing into a vast class of low-wage earners who cannot afford the indulgence of a sit-down meal of Chili’s Mix & Match Fajitas and a Coke, and a smaller cluster of high-income households for whom a Jack Daniel’s sampler platter at Fridays is no longer good enough. At the same time, the rise of the internet, smartphones, and streaming media has changed the ways that consumers across the income spectrum choose to allocate their leisure time — and, by association, their mealtimes. In-home (and in-hand) entertainment has altered how we consume casual meals, making the Applebee’s and Red Lobsters of the world less and less relevant to the way America eats.

As casual dining restaurants collapse in on themselves, TGI Fridays remains — unfortunately for it — an emblem for the entire category: In 2014, after years of slipping sales, the chain was sold to a pair of private equity firms, Sentinel Capital Partners and TriArtisan Capital Advisors, which swiftly began offloading company-owned restaurants to franchisees, essentially stripping the business for parts. Meanwhile, the chain’s beleaguered management has attempted to turn things around with a series of highly publicized initiatives, like delivering booze. Most notably, last year, Fridays unveiled a new concept restaurant in Texas — a stunning reversal from the tchotchke-laden image savagely memorialized in Mike Judge’s 1999 cult classic Office Space — that’s heavy on neutral tones, pale wood, brick walls, and exceedingly mellow, indistinct furniture; it looks like a neglected airport lounge in Helsinki…

A fascinating consideration of a restaurant that is both an avatar and a bellwether of the American middle class: “As Goes the Middle Class, So Goes TGI Fridays.”

See also: “Applebee’s Deserves To Die,” which explores the millennial dimension of this phenomenon:

The media-created meme that’s arisen about millennials killing things — beer, napkins, Hooters, cereal, casual dining establishments, and motorcycles, and golf, to name a few — is fascinating, again, because of what it reveals. Young people’s generally decreased standard of living and the preferences they have developed as a result are destroying established industries, and older people don’t like it. But these are rational responses to economic anxiety. Everything from high rates of homeownership to Hooters came out of a middle-class prosperity that doesn’t really exist anymore, because the middle class doesn’t really exist in America anymore, especially not for the millennials who had to grow up without the comfort of the American Dream. Chains united America, but things were different then, and for millennials at least, they’re irreparably broken now…

* Steven Wright

###

As we avail ourselves of the Endless Appetizers, we might recall that it was on this date in 1945 that a self-taught engineer named Percy Spencer applied for a patent for a “microwave cooking oven”; he had been working in a lab testing magnetrons, the high-powered vacuum tubes inside radars.  One day while working near the magnetrons– which produced microwaves– Spencer noticed that a peanut butter candy bar in his pocket had begun to melt– shortly after, the microwave oven was born.

In 1947, Raytheon introduced Spencer’s invention, the world’s first microwave oven, the “Radarange”: a refrigerator-sized appliance that cost $2,000-$3,000.  It found some applications in commercial food settings and on Navy ships, but no consumer market.  Then Raytheon licensed the technology to the Tappan Stove Company, which introduced a wall-mounted version with two cooking speeds (500 and 800 watts), stainless steel exterior, glass shelf, top-browning element and a recipe card drawer.  It sold for $1,295 (figure $10,500 today).

Later Litton entered the business and developed the short, wide shape of the microwave that we’re familiar with today. As Wired reports, this opened the market:

Prices began to fall rapidly. Raytheon, which had acquired a company called Amana, introduced the first popular home model in 1967, the countertop Radarange. It cost $495 (about $3,200 today).

Consumer interest in microwave ovens began to grow. About 40,000 units were sold in the United States in 1970. Five years later, that number hit a million.

The addition of electronic controls made microwaves easier to use, and they became a fixture in most kitchens. Roughly 25 percent of U.S. households owned a microwave oven by 1986. Today, almost 90 percent of American households have a microwave oven.

Today, Percy Spencer’s invention and research into microwave technology are still being used as a jumping-off point for further research in radar and magnetron technologies.  Different wavelengths of microwaves are being used to keep an eye on weather conditions and even rain structures via satellites, and are able to penetrate clouds, rain, and snow, according to NASA.  Other radar technologies use microwaves to monitor sea levels to within a few centimeters.

Police are also known to use radar guns to monitor a vehicle’s speed; the guns continually transmit microwaves and measure the waves’ reflections to determine how fast one is driving.
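For a sense of the physics at work, here is a back-of-the-envelope sketch of the Doppler relation such a gun relies on; the 24.15 GHz carrier frequency and the helper functions are illustrative assumptions, not a description of any particular device.

```python
# A reflected microwave returns shifted in frequency by roughly 2 * v * f0 / c
# (for speeds far below the speed of light), so the gun can recover the target's
# speed from that measured shift.  The 24.15 GHz K-band carrier is an assumed,
# typical value used here only for illustration.

C = 299_792_458.0   # speed of light, m/s
F0 = 24.15e9        # assumed K-band carrier frequency, Hz

def doppler_shift(speed_mps: float) -> float:
    """Frequency shift (Hz) of a signal reflected off a target moving at speed_mps."""
    return 2 * speed_mps * F0 / C

def speed_from_shift(shift_hz: float) -> float:
    """Invert the relation: recover the target's speed (m/s) from the measured shift."""
    return shift_hz * C / (2 * F0)

if __name__ == "__main__":
    v = 30.0                      # ~108 km/h, about 67 mph
    df = doppler_shift(v)
    print(f"{v:.0f} m/s -> shift of about {df:,.0f} Hz")   # roughly 4.8 kHz
    print(f"recovered speed: {speed_from_shift(df):.1f} m/s")
```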

 source

 

Written by LW

October 8, 2017 at 1:01 am
