(Roughly) Daily

Posts Tagged ‘middle class’

“The most perfect political community is one in which the middle class is in control, and outnumbers both of the other classes”*…

For some, the prospect of further advances in AI and related tech (robotics, connectivity, et al.) conjures a future of existential risk, a Terminator-like dystopian future in which humans fight with “machines” for primacy. For others (among whom your correspondent numbers himself), AI (better understood as “augmented” than “artificial” intelligence) has real promise– but also dangers of a different (and very human) sort. Those technologies, dependent as they are on capital and specific/rare expertise, could fuel further concentration of wealth and power, could usher in an era of even greater inequality. Noah Smith is here to argue that my fears may be misplaced, that augmentation may narrow the skills gap and help reduce economic polarization…

On the app formerly known as Twitter, I’m known for occasionally going on rants about how it’s good to be normal and average and middle-class. To some degree this is because I believe that the only successful society is an egalitarian one where people don’t have to be exceptional in order to live good and comfortable and fulfilling lives. But some of it is also a reaction against the messages I was inundated with growing up. It seemed like every movie and book and TV show was telling me that nerds like me were special — that because we could do physics or program computers or even just play video games, we were destined to be exceptional. In the late 80s and 90s, it felt like we were on the cusp of a great shift, where the back-slapping jocks who had dominated American society in earlier times were on the verge of losing power and status to the bespectacled freaks and geeks. The Revenge of the Nerds was coming.

It wasn’t just fantasy, either. Over the next thirty years, the nerds really did win the economic competition. The U.S. shifted from manufacturing to knowledge industries like IT, finance, bio, and so on, effectively going from the world’s workshop to the world’s research park. This meant that simply being able to cut deals and manage large workforces was no longer the only important skill you needed to succeed at the highest levels of business. Bespectacled programmers and math nerds became our richest men. From the early 80s to the 2000s, the college earnings premium rose relentlessly, and a degree went from optional to almost mandatory for financial success.

The age of human capital was in full swing, and the general consensus was that “Average Is Over”. And with increased earnings came increased social status and personal confidence; by the time I moved out to San Francisco in 2016, tech people were clearly the masters of the Universe.

The widening gap in the performance of the nerds versus everyone else wasn’t the only cause of the rise in inequality in the U.S. — financialization, globalization, tax changes, the decline of unions, and other factors all probably played a role. But the increasing premium on human capital was impossible to ignore.

That trend lasted so long that most Americans can no longer remember anything else. We’ve become used to the idea that technology brings inequality, by delivering outsized benefits to the 20% of society who are smart and educated enough to take full advantage of it. It’s gotten to the point where we tacitly assume that this is just what technology does, period, so that when a new technology like generative AI comes along, people leap to predict that economic inequality will widen as a result of a new digital divide.

And it’s possible that will happen. I can’t rule it out. But I also have a more optimistic take here — I think it’s possible that the wave of new technologies now arriving in our economy will decrease much of the skills gap that opened up in the decades since 1980…

An optimistic take on technology and inequality: “Is it time for the Revenge of the Normies?” from @Noahpinion. Eminently worth reading in full.

* Aristotle

###

As we contemplate consequences, we might spare a thought for Joseph Glidden; he died on this date in 1906. An Illinois farmer, he developed and patented the design of the first commercially-feasible barbed wire in 1874 (an earlier, less successful patent preceded his)– a product that would transform the West. Before his innovation, settlers on the treeless plains had no easy way to fence livestock away from cropland, and ranchers had no way to prevent their herds from roaming far and wide. Glidden’s barbed wire opened the plains to large-scale farming, and closed the open range, bringing the era of the cowboy and the round-up to an end. With his partner, Isaac L. Ellwood, Glidden formed the Barb Fence Company of De Kalb, Illinois, and quickly became one of the wealthiest men in the nation.

source

“He, indeed, who gave fewest pledges to Fortune, has yet suffered her heaviest visitations”*…

As Zachary Crockett explains, taking the kids to a baseball game, a movie, or Disneyland is a bigger financial commitment than it used to be for middle-class families… a much bigger commitment…

In the 1950s and ’60s — the so-called Golden Age of American capitalism — family outings were within the realm of affordability for most median income earners. Many blue-collar workers could afford new homes and cars and still take their kids to Disneyland.

Despite rising wages, many of those same activities are now out of reach for everyday Americans.

The Hustle analyzed the cost of three family activities in 1960 vs. 2022:

1. A baseball game

2. A movie at a theater

3. A one-day Disneyland visit

We found that these family outings have increased in cost at 2-3x the rate of inflation — and that, in order to afford them, today’s American families have to work up to 2x as many hours as they did 60 years ago…

The painful details at: “America’s favorite family outings are increasingly out of reach,” from @zzcrockett in @TheHustle.
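The arithmetic behind those two comparisons is easy to reproduce. Below is a minimal sketch in Python; the outing costs, hourly wages, and CPI figures are illustrative placeholders, not The Hustle’s data.

```python
# A sketch of the two calculations described above, using placeholder numbers.
# The outing costs, wages, and CPI values here are illustrative stand-ins,
# not The Hustle's data.

def inflation_multiple(cost_then, cost_now, cpi_then, cpi_now):
    """How many times faster the outing's cost grew than prices overall."""
    return (cost_now / cost_then) / (cpi_now / cpi_then)

def hours_of_work(cost, hourly_wage):
    """Hours a worker must put in to pay for the outing."""
    return cost / hourly_wage

# Hypothetical outing: $12 in 1960, $250 in 2022.
# CPI-U annual averages are approximate; wages are placeholders.
multiple = inflation_multiple(cost_then=12.0, cost_now=250.0,
                              cpi_then=29.6, cpi_now=292.7)
hours_1960 = hours_of_work(12.0, hourly_wage=2.10)
hours_2022 = hours_of_work(250.0, hourly_wage=27.00)

print(f"Cost grew ~{multiple:.1f}x faster than inflation")
print(f"Hours of work needed: {hours_1960:.1f} in 1960 vs. {hours_2022:.1f} in 2022")
```

Plugging the article’s actual 1960 and 2022 figures for a ballgame, a movie, or a Disneyland ticket into the same two formulas is what yields its 2-3x and 2x numbers.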

* John Maynard Keynes

###

As we rethink our plans, we might recall that it was on this date in 1951 that Disney’s Alice in Wonderland had its American premiere (in New York, two days after premiering in London). The average price of a movie ticket that year was $0.47 (or $4.53, adjusted for inflation); popcorn was 5-10 cents per bag.

source

Written by (Roughly) Daily

July 28, 2022 at 1:00 am

“The most perfect political community is one in which the middle class is in control, and outnumbers both of the other classes”*…

 


 

But is there a middle class?…

Every politician defends the middle class, but none of them knows quite what it is. In August, during a town hall, Joe Biden said, “We have to rebuild the middle class, and this time we bring everyone along.” In his telling, the middle class is part memory and part aspiration, less a demographic group than a morality tale of loss and redemption. It “isn’t a number,” Biden is fond of saying. “It’s a set of values.”

For many social scientists, though, the middle class is a matter of numbers. The Pew Research Center says that anyone who earns between a mere two-thirds of the median household income and twice that amount falls within it. By that definition, just under half of all American adults are middle class. Unlike in Britain, where the category is seen as more culturally refined, the American middle class includes blue-collar workers whose consumption patterns fit the bill; they can buy a home or put their kids through college. Biden defines the middle class even more expansively. To be middle class, he said in Iowa this summer, is to know “that your kid is safe going outside to play”—something most humans, if not most large primates, would agree they want. To be middle class is to be, well, normal.

Republicans, for their part, rarely promise to rebuild the middle class; they want, as President Trump has said, to make it “bigger and more prosperous than ever before.” But liberal politicians from Biden to Barack Obama to Elizabeth Warren often vow to restore the middle class to the former glory of the three decades after World War II—a time when, they say, prosperity was shared and class conflict neutralized.

Even then, however, there was a sense that the middle class was in crisis. In his 1956 best-seller, The Organization Man, William Whyte wrote of a middle class—an implicitly white middle class—trapped in suburbs and office jobs, shorn of the entrepreneurial individualism and wartime solidarity of earlier generations. In 1969, a New York Times reporter found in Italian-American Queens a community trapped between escalating grocery bills and the expanding “ghetto.” In 1977, the middle class was “struggling uphill,” the Chicago Tribune wrote. In 1992, it felt “betrayed” and “forgotten,” according to the Times. And since 2008, Times subscribers have read of a middle class that is “sagging,” “shrinking,” “sinking,” and “limping.” In short, the middle class, as our politicians imagine it, has never really existed [in a settled, continuous way]: It is always in decline, always on the brink of being rebuilt.

To imagine the middle class, then, is to invoke a myth. Politicians use it to bind Americans together in a shared hope that they can one day return to the lost idyll of the postwar period. In that sense, the concept is remarkably optimistic, if somewhat inconsistent. As Lawrence Samuel argues in The American Middle Class: A Cultural History, the term expresses two incompatible things: It suggests that the United States is a classless society in which most citizens belong to the same social sphere, even as it hints at a rarefied class above the middle that anyone can reach if they work hard enough to ascend the ladder of opportunity. These can’t both be true—if the United States were a classless society, there would be no need for upward mobility. The metaphor gives the lie to the myth. Every ladder, after all, has a top and a bottom—and it’s the bottom that bears all the weight…

Politicians– and business people and academics– are quick to reference “the middle class.”  John Patrick Leary (@johnpatleary) explores “What We Talk About When We Talk About the Middle Class.”
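For the numerically inclined, the Pew definition quoted above boils down to a single income band. A minimal sketch follows; the median used is a placeholder rather than an official figure, and Pew’s household-size adjustment is omitted.

```python
# Pew's rule of thumb, as quoted above: "middle income" means earning between
# two-thirds and twice the median household income. The median below is a
# placeholder, and Pew's household-size adjustment is omitted.

def income_class(household_income: float, median_income: float) -> str:
    lower = (2 / 3) * median_income
    upper = 2 * median_income
    if household_income < lower:
        return "lower income"
    if household_income > upper:
        return "upper income"
    return "middle income"

MEDIAN = 70_000  # illustrative placeholder, not an official statistic
for income in (30_000, 70_000, 150_000):
    print(f"${income:,} -> {income_class(income, MEDIAN)}")
```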

* Aristotle

###

As we contemplate classification and its consequences, we might recall that it was on this date in 1792, during George Washington’s first term as president, that the first edition of The Farmer’s Almanac was published. (It became The Old Farmer’s Almanac in 1832 to distinguish itself from similarly titled competitors.) Still going strong, it is the oldest continuously published periodical in the U.S.

Almanac source

 

Written by (Roughly) Daily

November 25, 2019 at 1:01 am

“I went to a restaurant that serves ‘breakfast at any time.’ So I ordered French Toast during the Renaissance.”*…

 

Casual dining chains — industry parlance for economical sit-down restaurants like Fridays, Applebee’s, Chili’s, and Buffalo Wild Wings — have subsisted in a dismal and persistent state of decline for about a decade. But in the last two years, things have gotten worse, with the number of people eating at casual dining chains overall falling every single month since June 2015; they are now the worst-performing segment of the entire restaurant industry. In recent months, Applebee’s has said it will close 135 locations this year; Buffalo Wild Wings will shed at least 60. Ruby Tuesday closed 109 restaurants last year, and put the whole company up for sale in March. Friendly’s, Bennigan’s, Joe’s Crab Shack, and Logan’s Roadhouse have all filed for bankruptcy.

Whatever your feelings about casual dining chains, they have been a vital part of the way that many Americans eat since the 1930s, when Howard Johnson began blanketing the highways with his trademark orange-and-teal restaurants — temples to affordable, quality fare in a wholesome setting. After plodding along for some 50 years, the genre exploded during the 1980s, as America entered a period of sustained economic growth and chains like Fridays, Olive Garden, and Applebee’s saturated suburban landscapes with their bland, softly corporate vision of good times and good food. While the brands and the fads have changed — RIP fried-clam sandwich, hello baby back ribs and buffalo sliders — the formula has remained more or less unchanged over the decades: middlebrow menu, solid value, and friendly service, consistently executed, from Pasadena to Tallahassee. Until recently, it was a formula that worked across cuisines, state lines, and demographics…

TGI Fridays and Applebee’s and their ilk are struggling as the American middle class and its enormous purchasing power withers away in real time, with the country’s population dividing into a vast class of low-wage earners who cannot afford the indulgence of a sit-down meal of Chili’s Mix & Match Fajitas and a Coke, and a smaller cluster of high-income households for whom a Jack Daniel’s sampler platter at Fridays is no longer good enough. At the same time, the rise of the internet, smartphones, and streaming media has changed the ways that consumers across the income spectrum choose to allocate their leisure time — and, by association, their mealtimes. In-home (and in-hand) entertainment has altered how we consume casual meals, making the Applebee’s and Red Lobsters of the world less and less relevant to the way America eats.

As casual dining restaurants collapse in on themselves, TGI Fridays remains — unfortunately for it — an emblem for the entire category: In 2014, after years of slipping sales, the chain was sold to a pair of private equity firms, Sentinel Capital Partners and TriArtisan Capital Advisors, which swiftly began offloading company-owned restaurants to franchisees, essentially stripping the business for parts. Meanwhile, the chain’s beleaguered management has attempted to turn things around with a series of highly publicized initiatives, like delivering booze. Most notably, last year, Fridays unveiled a new concept restaurant in Texas — a stunning reversal from the tchotchke-laden image savagely memorialized in Mike Judge’s 1999 cult classic Office Space — that’s heavy on neutral tones, pale wood, brick walls, and exceedingly mellow, indistinct furniture; it looks like a neglected airport lounge in Helsinki…

A fascinating consideration of a restaurant that is both an avatar and a bellwether of the American middle class: “As Goes the Middle Class, So Goes TGI Fridays.”

See also: “Applebee’s Deserves To Die,” which explores the millennial dimension of this phenomenon:

The media-created meme that’s arisen about millennials killing things — beer, napkins, Hooters, cereal, casual dining establishments, motorcycles, and golf, to name a few — is fascinating, again, because of what it reveals. Young people’s generally decreased standard of living and the preferences they have developed as a result are destroying established industries, and older people don’t like it. But these are rational responses to economic anxiety. Everything from high rates of homeownership to Hooters came out of a middle-class prosperity that doesn’t really exist anymore, because the middle class doesn’t really exist in America anymore, especially not for the millennials who had to grow up without the comfort of the American Dream. Chains united America, but things were different then, and for millennials at least, they’re irreparably broken now…

* Steven Wright

###

As we avail ourselves of the Endless Appetizers, we might recall that it was on this date in 1945 that a self-taught engineer named Percy Spencer applied for a patent for a “microwave cooking oven”; he had been working in a lab testing magnetrons, the high-powered vacuum tubes inside radars.  One day while working near the magnetrons– which produced microwaves– Spencer noticed a peanut butter candy bar in his pocket had begun to melt — shortly after, the microwave oven was born.

In 1947, Raytheon introduced Spencer’s invention, the world’s first microwave oven, the “Radarange”: a refrigerator-sized appliance that cost $2,000-$3,000. It found some applications in commercial food settings and on Navy ships, but no consumer market. Then Raytheon licensed the technology to the Tappan Stove Company, which introduced a wall-mounted version with two cooking speeds (500 and 800 watts), a stainless steel exterior, a glass shelf, a top-browning element, and a recipe card drawer. It sold for $1,295 (about $10,500 today).

Later Litton entered the business and developed the short, wide shape of the microwave that we’re familiar with today. As Wired reports, this opened the market:

Prices began to fall rapidly. Raytheon, which had acquired a company called Amana, introduced the first popular home model in 1967, the countertop Radarange. It cost $495 (about $3,200 today).

Consumer interest in microwave ovens began to grow. About 40,000 units were sold in the United States in 1970. Five years later, that number hit a million.

The addition of electronic controls made microwaves easier to use, and they became a fixture in most kitchens. Roughly 25 percent of U.S. households owned a microwave oven by 1986. Today, almost 90 percent of American households have a microwave oven.

Today, Percy Spencer’s invention and research into microwave technology are still being used as a jumping-off point for further research in radar and magnetron technologies. Different wavelengths of microwaves are being used to keep an eye on weather conditions and even rain structures via satellites, and are able to penetrate clouds, rain, and snow, according to NASA. Other radar technologies use microwaves to monitor sea levels to within a few centimeters.

Police also use radar guns to monitor vehicle speeds; the guns continually transmit microwaves and measure the waves’ reflections to determine how fast a vehicle is traveling.
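That speed reading comes from the Doppler shift of the reflected signal: for a target moving at speed v, the round-trip shift is roughly 2·v·f0/c. A minimal sketch, using an illustrative K-band transmit frequency rather than any particular gun’s specification:

```python
# Doppler radar sketch: a radar gun compares the frequency it transmits (f0)
# with the frequency of the echo. For a target moving straight toward or away
# from the gun at speed v << c, the round-trip shift is delta_f ~ 2 * v * f0 / c.

C = 299_792_458.0  # speed of light, m/s

def speed_from_doppler(delta_f_hz: float, f0_hz: float) -> float:
    """Target speed in m/s implied by a measured Doppler shift."""
    return delta_f_hz * C / (2 * f0_hz)

# Illustrative numbers: a 24.15 GHz (K-band) gun observing a ~4.8 kHz shift.
v = speed_from_doppler(delta_f_hz=4.8e3, f0_hz=24.15e9)
print(f"{v:.1f} m/s (~{v * 2.23694:.0f} mph)")
```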

 source

 

Written by (Roughly) Daily

October 8, 2017 at 1:01 am
