(Roughly) Daily

Posts Tagged ‘sociology’

“Badger hates Society, and invitations, and dinner, and all that sort of thing”*…

Americans are now spending more time alone– mostly at home– than ever. It’s changing our personalities, our politics, and even our relationship to reality. While the pandemic certainly enforced some of that isolation, the post-COVID world remains extraordinarily atomized.

In a bracing essay, Derek Thompson explores the emergence of this widespread isolation, unpacking its drivers, enumerating its considerable (personal and civic) costs, musing on the possible impact of AI, and pondering what might lead to a return to sociability…

… “I have a view that is uncommon among social scientists, which is that moral revolutions are real and they change our culture,” Robert Putnam [author of Bowling Alone] told me. In the early 20th century, a group of liberal Christians, including the pastor Walter Rauschenbusch, urged other Christians to expand their faith from a narrow concern for personal salvation to a public concern for justice. Their movement, which became known as the Social Gospel, was instrumental in passing major political reforms, such as the abolition of child labor. It also encouraged a more communitarian approach to American life, which manifested in an array of entirely secular congregations that met in union halls and community centers and dining rooms. All of this came out of a particular alchemy of writing and thinking and organizing. No one can say precisely how to change a nation’s moral-emotional atmosphere, but what’s certain is that atmospheres do change. Our smallest actions create norms. Our norms create values. Our values drive behavior. And our behaviors cascade.

The anti-social century is the result of one such cascade, of chosen solitude, accelerated by digital-world progress and physical-world regress. But if one cascade brought us into an anti-social century, another can bring about a social century. New norms are possible; they’re being created all the time. Independent bookstores are booming—the American Booksellers Association has reported more than 50 percent growth since 2009—and in cities such as New York City and Washington, D.C., many of them have become miniature theaters, with regular standing-room-only crowds gathered for author readings. More districts and states are banning smartphones in schools, a national experiment that could, optimistically, improve children’s focus and their physical-world relationships. In the past few years, board-game cafés have flowered across the country, and their business is expected to nearly double by 2030. These cafés buck an 80-year trend. Instead of turning a previously social form of entertainment into a private one, they turn a living-room pastime into a destination activity. As sweeping as the social revolution I’ve described might seem, it’s built from the ground up by institutions and decisions that are profoundly within our control: as humble as a café, as small as a new phone locker at school…

On how we spend our time and what that yields: “The Anti-Social Century,” from @dkthomp.bsky.social in @theatlantic.com (gift article).

See also: “You’re Being Alienated From Your Own Attention,” from @chrislhayes.bsky.social (also in @theatlantic.com, also a gift article)

* Kenneth Grahame, The Wind in the Willows

###

As we call a friend, we might recall that it was on this date in 1915 that Alexander Graham Bell placed the first transcontinental phone call, from New York to San Francisco, where the Panama–Pacific International Exposition celebrations were underway and his assistant Thomas Augustus Watson stood by. Bell repeated his famous first telephonic words, “Mr. Watson, come here. I want you,” to which Watson this time replied, “It will take me five days to get there now!” Bell’s call officially initiated AT&T’s transcontinental service.

Alexander Graham Bell, about to call San Francisco from New York. (source)

And, on this date 44 years later, in 1959, the first non-stop transcontinental commercial jet trip was made by an American Airlines Boeing 707, from Los Angeles to New York. The sleek silver plane made the flight in an official airline time of 4 hours and 3 minutes, half the usual scheduled time for the prop-driven DC-7Cs then in regular use on that route.

source

“This is a cardboard universe”*…

Tina Zimmermann, Amazon Tsunami, installation at the European Cultural Center Venice, April-November 2022, Palazzo Mora, Venice, Italy. [Courtesy of the artist]

As we rid ourselves of the detritus of the Holiday season, let us pause to consider its signature component: at the turn of the 21st century, corrugated cardboard accounted for just fifteen percent of the United States recycling stream; today, it’s nearly half. Shannon Mattern on the cardboard box…

… As historian Maria Rentetzi writes (“Cardboard Box: The Politics of Materiality,” in Boxes: A Field Guide), “the cardboard box — the waste of our commercial world — is recycled in such a way as to make visible the disorder in our societies, the faults of capitalism.” It is an abject object that touches all parts of the city, from the granite kitchen island to the sewer grate. And for many of us, the cardboard box is our closest touchpoint to globalized trade, structuring our relations with people in distant places. It brings the logistics chain to our doorstep. The magnificently ripped metal freight container may get the Economist cover shot, but the plain brown box delivers messages to our homes. Its very existence in our homes, Marshall McLuhan would say, is the message. In the immortal words of Walter Paepcke, founder of the Container Corporation of America, “packages are not just commodities; they are communications.”

Let’s unpack that, shall we? Boxes are media in multiple senses of the word. They’re lithographed surfaces designed to be read, and they’re dimensional containers that mediate between outside and inside worlds. They’re “media of transport and information, shapers of public opinion and consumer desire, and means of targeting attention.” And they’re “logistical media” that “arrange people and property into time and space,” that “coordinate and control the movement of labor, people, and things situated along and within global supply chains.” The cardboard box is a minimalist form with maximalist ambitions, an arboreal apparatus made from one of the world’s most abundant renewable resources, then filled with plastic and moved around by copious quantities of oil. It doesn’t just coordinate and control landscapes; it transforms them.

Cardboard’s ubiquity rests on simple claims: I can hold that, and I can go there. The Container Corporation of America was founded in 1926, and upon those claims it built an empire with surprising reach. The CCA made collapsible shipping boxes, and it transformed packaging into a science and an art. It advanced market research, shaped mid-century taste, and altered the chromatic universe through color standards. It employed some of the best graphic designers of the period, and as national borders shifted after the Second World War, it commissioned Herbert Bayer, author of the Universal typeface, to revise the World Geo-Graphic Atlas. Even then, the CCA was remaking that new world to meet its logistical needs, rehabbing mining towns and germinating forests, and orchestrating civic discourse about all of this.

How did a packaging company get into the publishing business — into the containment and distribution of information? How were geographic imaginations changed in the process? Soon we’ll dive into the Paepcke archive, to find answers to those questions. But first I want to show you how a cardboard box is made…

By turns inspiring and horrifying– the social history of the cardboard carton: “World in a Box,” @shannonmattern.bsky.social in @placesjournal.bsky.social.

For an earlier (R)D focused on Mattern’s work: “To clarify, ADD data.”

* Philip K. Dick, The Dark-Haired Girl

###

As we tape it tight, we might recall that it was on this date in 1910 that a federal official who might have slowed the onset of the cardboard box was fired: Gifford Pinchot, the first chief of the United States Forest Service, was dismissed by President William Howard Taft. Pinchot had opposed Taft’s newly appointed Secretary of the Interior, Richard Ballinger, who favored commercial exploitation of federal reserve lands.

During President Theodore Roosevelt’s term, Pinchot had helped enable policies for the conservation of natural resources. Roosevelt had designated millions of acres for protection as National Forests. That legacy was threatened, so Pinchot pressured Taft to remove Ballinger from office. In November of 1909, Collier’s Magazine had created a scandal when it accused Ballinger of shady dealings in coal lands in Alaska. When Pinchot criticized both Ballinger and Taft, the president reacted by firing him.

source

“If you could choose only one of the following two inventions, indoor plumbing or the Internet, which would you choose?”*…

“Window to the Future,” Motorola magazine ad, circa 1961

For years, people have bemoaned the sorry state of innovation. Compared with the great inventions of the industrial era, the inventions of our own time seem pathetic. In a short essay reprised from 2012, the estimable Nicholas Carr offers a different take: We’re as innovative as ever, but the focus of innovation has shifted…

… The original inspiration for such grousing… came from Robert J. Gordon, a Northwestern University economist. His influential 2000 paper “Does the ‘New Economy’ Measure Up to the Great Inventions of the Past?” included a damning comparison of the flood of inventions of a century ago with the seeming trickle we see today. Consider the products invented in just the ten years between 1876 and 1886: the internal combustion engine, the electric lightbulb, the electric transformer, the steam turbine, the electric railroad, the automobile, the telephone, the movie camera, the phonograph, the linotype machine, the film roll for cameras, the dictaphone, the cash register, vaccines, reinforced concrete, the flush toilet. The typewriter had arrived a few years earlier, and the punch-card tabulator, the computer’s precursor, would appear a few years later. And then, in short order, came the airplane, the radio, air conditioning, the vacuum tube, jet aircraft, the television, the refrigerator, and a raft of other home appliances, as well as revolutionary advances in manufacturing processes. (And let’s not forget The Bomb.)

The conditions of life changed utterly between 1890 and 1950, observed Gordon. Between 1950 and today? Not so much.

So why is innovation less impressive today?… maybe it’s crappy education. Or a lack of corporate investment in research. Or short-sighted venture capitalists. Or monopolistic business practices. Or overaggressive lawyers. Or imagination-challenged entrepreneurs. Or maybe it’s a catastrophic loss of mojo. None of these explanations makes much sense. The aperture of science grows ever wider, after all, even as the commercial and reputational rewards for innovation grow ever larger and the ability to share ideas grows ever stronger. Any barrier to innovation should be swept away by such forces.

Let me float an alternative explanation: There has been no decline in innovation; there has just been a shift in its focus. We’re as creative as ever, but we’ve funneled our creativity into areas that produce smaller-scale, less far-reaching, less visible breakthroughs. And we’ve done that for entirely rational reasons. We’re getting precisely the kind of innovation we desire — and deserve.

My idea is that there’s a hierarchy of innovation that runs in parallel with Abraham Maslow’s famous hierarchy of needs. Maslow argued that human needs progress through five stages, with each new stage requiring the fulfillment of lower-level, or more basic, needs. So first we have to meet our most primitive Physiological needs, and that frees us to focus on our needs for Safety, and once our needs for Safety are met, we can attend to our needs for Belongingness, and then on to our needs for personal Esteem, and finally, at the peak of Maslow’s pyramid, to our needs for Self-Actualization.

If you look at Maslow’s hierarchy as an inflexible structure, with clear boundaries between its levels, it falls apart. Our needs are messy, and the boundaries between them are porous. A caveman probably pursued esteem and self-actualization, to some degree, just as we today spend time and effort fulfilling our physical needs. But if you look at the hierarchy as a map of human focus, then it makes sense — and seems to be borne out by history.

In short: The more comfortable you are, the more time you spend thinking about yourself.

If technological progress is shaped by human needs, as it surely is, then general shifts in needs would also bring shifts in the nature of innovation. The tools we invent would move through the hierarchy of needs, from tools that help safeguard our bodies or coordinate social groups on up to tools that allow us to modify our moods or express our individuality — from tools of survival to tools of the self. Here’s my crack at what the hierarchy of innovation looks like:

Innovation’s focus moves up through five stages, propelled by shifts in the needs we seek to fulfill. In the beginning come Technologies of Survival (think bow-and-arrow), then Technologies of Social Organization (think cathedral), then Technologies of Prosperity (think assembly line), then Technologies of Leisure (think TV), and finally Technologies of the Self (think Facebook, or Prozac).

As with Maslow’s hierarchy, you shouldn’t look at my hierarchy as a rigid one. Innovation today continues at all five levels. But the rewards, both monetary and reputational, are greatest at the top level (Technologies of the Self), which has the effect of shunting investment, attention, and activity in that direction. We’re already physically comfortable, so getting a little more physically comfortable doesn’t seem particularly pressing. We’ve become inward looking, and what we crave are more powerful tools for modifying our internal states or projecting those states outward to an audience. An entrepreneur today has a greater prospect of fame and riches if he or she creates a popular social-networking app than a faster, more efficient system for mass transit. Innovation, to put a dark spin on it, arcs toward decadence…

One might wonder why Carr doesn’t focus more (and more affirmatively) on information technologies– the emergence of personal computing in all of its form factors, the knitting together of the web and the advent of turbocharged connectivity, and, on its back, the emergence of social media– all of which have been hugely impactful, perhaps especially in the 12 years since this essay was written: a “TikTok/YouTube/influencer election,” a social-media-fueled mental-health concern, the proliferation of meme investing and online gambling, etc., etc… But then, one might conclude that they simply underline his point.

Why our innovations seem so small: “The Arc of Innovation Bends toward Decadence.”

Pair with Ted Gioia‘s painfully apposite “The State of the Culture, 2024.”

* Robert J. Gordon

###

As we ponder progress, we might recall that it was on this date that an acute observer of this phenomenon, The Simpsons, made its debut as a full-length show. Originally a part of The Tracey Ullman Show, The Simpsons got their own Christmas special, which aired on FOX on this day in 1989.  “Simpsons Roasting on an Open Fire” (AKA “The Simpsons Christmas Special”– the first of an annual tradition) was created by Matt Groening and written by Mimi Pond (who wrote only that one episode). It drew 13.4 million viewers, was nominated for two Emmy Awards, and hasn’t left the airwaves since.

source

“Public opinion polls are rather like children in a garden, digging things up all the time to see how they’re growing”*…

As the press continues to treat this year’s all-too-consequential election as a horse race, your correspondent is re-visiting a topic touched on a few weeks ago: the prevalence of polling data in election coverage. Rick Perlstein weighs in with a (fascinating) history of presidential election polling, then turns to its implications…

… That polls do not predict Presidential election outcomes any better now than they did a century ago is but one conclusion of this remarkable history. A second conclusion lurks more in the background—but I think it is the most important one to absorb.

For most of this century, the work was the subject of extraordinary ambivalence, even among pollsters. In 1948, George Gallup called presidential polling (as distinguished from issue polling, which has its own problems) “this Frankenstein.” In 1980, Elmo Roper admitted that “our polling techniques have gotten more and more sophisticated, yet we seem to be missing more and more elections.” All along, conventional journalists made a remarkably consistent case that they were empty calories that actively crowded out genuine civic engagement: “Instead of feeling the pulse of democracy,” as a 1949 critic put it, “Dr. Gallup listens to its baby talk.”

Critics rooted for polls to fail. Eric Sevareid, in 1964, recorded his “secret glee and relief when the polls go wrong,” which might restore “the mystery and suspense of human behavior eliminated by clinical dissection.” If they were always right, as James Reston picked up the plaint in 1970, “Who would vote?” Edward R. Murrow argued in 1952 that polling “contributed something to the dehumanization of society,” and was delighted, that year, when “the people surprised the pollsters … It restored to the individual, I suspect, some sense of his own sovereignty” over the “petty tyranny of those who assert that they can tell us what we think.”

Still and all, the practice grew like Topsy. There was an “extraordinary expansion” in polls for the 1980 election, including the first partnerships between polling and media organizations. The increase was accompanied by a measurable failure of quality, which gave birth to a new critique: news organizations “making their own news and flacking it as if it were an event over which they had no control.”

And so, after the 1980 debacle, high-minded observers began wondering whether presidential polls had “outlived their usefulness,” whether the priesthood would end up “defrocked.” In 1992, the popular columnist Mike Royko went further, proposing sabotage: Maybe if people just lied, pollsters would have to give up. In 2000, Alison Mitchell of The New York Times proposed a polling moratorium in the four weeks leading up to elections, noting the “numbing length … to which polling is consuming both politics and journalism.”

Instead, polling proliferated: a “relentless barrage,” the American Journalism Review complained, the media obsessing over each statistically insignificant blip. Then, something truly disturbing started happening: People stopped complaining.

A last gasp was 2008, when Arianna Huffington revived Royko’s call for sabotage, until, two years later, she acquired the aggregator Polling.com and renamed it HuffPost Pollster. “Polling, whether we like it or not,” the former skeptic proclaimed, “is a big part of how we communicate about politics.”

And so it is.

Even as the resources devoted to every other kind of journalism atrophied, poll-based political culture has overwhelmed us, crowding out all other ways of thinking about public life. Joshua Cohen tells the story of the time Silver, looking for a way to earn eyeballs between elections, considered making a model to predict congressional votes. But voters, he snidely remarked, “don’t care about bills being passed.”

Pollsters might not be able to tell us what we think about politics. But increasingly, they tell us how to think about politics—like them. Following polls has become our vision of what political participation is. Our therapy—headlines like the one on AlterNet last week, “Data Scientist Who Correctly Predicted 2020 Election Now Betting on ‘Landslide’ Harris Win.” Our political masochism: “Holy cow, did you hear about that Times poll.” “Don’t worry, I heard it’s an outlier …”

The Washington Post’s polling director once said, “There’s something addictive about polls and poll numbers.” He’s right. When we refer to “political junkies,” polls are pretty much the junk.

For some reason, I’ve been able to pretty much swear off the stuff, beyond mild indulgence. Maybe it’s my dime-store Buddhism. I try to stay in the present—and when it comes to the future, try to stick with things I can do. Maybe, I hereby offer myself as a role model?

As a “political expert,” friends, relatives, and even strangers are always asking me, “Who’s going to win?” I say I really have no idea. People are always a little shocked: Prediction has become what people think political expertise is for.

Afterward, the novelty of the response gets shrugged off, and we can talk. Beyond polling’s baby talk. About our common life together, about what we want to happen, and how we might make it so. But no predictions about whether this sort of thing might ever prevail. No predictions at all…

Presidential polls are no more reliable than they were a century ago. So why do they consume our political lives?

Eminently worth reading in full: “The Polling Imperilment,” from @rickperlstein in @TheProspect.

Pair with: “The Problems with Polls.”

For more on why today’s polls are so flawed, see “A public-opinion poll is no substitute for thought.”

Apposite: from the estimable James Fallows: “Election Countdown, 38 Days to Go: What Is Wrong With Our Leading Paper?”

* J. B. Priestley

###

As we pray for more consequential coverage, we might recall that it was on this date in 1936 that the (then-venerable) Literary Digest mailed out return postcards to 2,000,000 Americans, asking them to return the cards with an indication of whether they would be voting in the upcoming presidential election for the incumbent, Franklin D. Roosevelt, or the challenger, Alf Landon. The magazine published the results of its anxiously-anticipated poll in its October 31 issue: a massive victory for Landon. In the event, of course, Roosevelt defeated Landon in an unprecedented landslide.

The issue in question (source)

Written by (Roughly) Daily

September 30, 2024 at 1:00 am

“All the world is made of faith, and trust, and pixie dust”*…

The Gen Z gender divide, especially as it relates to ever-more conservative males and ever-more liberal females, is widely remarked. (E.g., see here [gift article].) Ruth Graham explores that divide in a different dimension– one that may be fueling the political divergence…

… For the first time in modern American history, young men are now more religious than their female peers. They attend services more often and are more likely to identify as religious.

“We’ve never seen it before,” Ryan Burge, an associate professor of political science at Eastern Illinois University, said of the flip.

Among Generation Z Christians, this dynamic is playing out in a stark way: The men are staying in church, while the women are leaving at a remarkable clip.

Church membership has been dropping in the United States for years. But within Gen Z, almost 40 percent of women now describe themselves as religiously unaffiliated, compared with 34 percent of men, according to a survey last year of more than 5,000 Americans by the Survey Center on American Life at the American Enterprise Institute.

In every other age group, men were more likely to be unaffiliated. That tracks with research that has shown that women have been consistently more religious than men, a finding so reliable that some scholars have characterized it as something like a universal human truth.

The men and women of Gen Z are also on divergent trajectories in almost every facet of their lives, including education, sexuality and spirituality.

Young women are still spiritual and seeking, according to surveys of religious life. But they came of age as the #MeToo movement opened a national conversation about sexual harassment and gender-based abuse, which inspired widespread exposures of abuse in church settings under the hashtag #ChurchToo. And the overturning of Roe v. Wade in 2022 compelled many of them to begin paying closer attention to reproductive rights.

Young men have different concerns. They are less educated than their female peers. In major cities, including New York and Washington, they earn less.

At the same time, they place a higher value on traditional family life. Childless young men are likelier than childless young women to say they want to become parents someday, by a margin of 12 percentage points, according to a survey last year by Pew…

… This growing gender divide has the potential to reshape the landscape of not just religion, but also of family life and politics. In a Times/Siena poll of six swing states in August, young men favored former President Donald J. Trump by 13 points, while young women favored Vice President Kamala Harris by 38 points — a 51-point gap far larger than in other generational cohorts.

It is too early to know if this new trend in churchgoing indicates a long-term realignment, said Russell Moore, the editor in chief of Christianity Today.

But he marveled at its strangeness in Christian history.

“I’m not sure what church life looks like with a decreasing presence of women,” he said, pointing out that they historically have been crucial forces in missionary work and volunteering. “We need both spiritual mothers and spiritual fathers.”…

Eminently worth reading in full: “In a First Among Christians, Young Men Are More Religious Than Young Women” (gift article) by @publicroad in @nytimes.

* J.M. Barrie, Peter Pan

###

As we ponder piety, we might spare a thought for Aimee Semple McPherson; she died on this date in 1944. A Pentecostal evangelist and media celebrity in the 1920s and 1930s, she is best known for founding the Los Angeles-based Foursquare Church and for pioneering the use of media to build her following. She broadcast on radio (spiced with popular entertainment and stage techniques) to draw in both audience and revenue for her weekly sermons at Foursquare’s Angelus Temple, an early megachurch. In her time, she was the most well-known (and publicized) Protestant evangelist, surpassing Billy Sunday and other predecessors.

While McPherson certainly undertook her own promotion, her fame was ignited in 1926, when (a la Agatha Christie) she disappeared– in McPherson’s case for five weeks. The evangelist insisted that she had been kidnapped and taken to Mexico; many believed that she had “retreated” into a tryst with a male colleague. Indeed, she was investigated and charged by L.A. authorities with fabricating a hoax. That charge was never proved, though many still believe that she was in fact the architect of her own disappearance. In any event, the turmoil was national news– and supercharged her rise.

source