(Roughly) Daily

Posts Tagged ‘sociology’

“The duty of a good Cuisinier is to transmit to the next generation everything he has learned and experienced.”*…

Five years ago, we marked the passing of Lynne Olver, a reference librarian who pretty much single-handedly created and maintained The Food Timeline: history of human eating habits for 20,000 years. Worried that her life’s work might lie fallow and spoil, her family was searching for a new host.

Happily, one was found. Later in 2020, Virginia Tech University Libraries and the College of Liberal Arts and Human Sciences (CLAHS) offered a new home at Virginia Tech for the physical book collection and the web resource– and the site lives on…

Ever wonder how the ancient Romans fed their armies? What the pioneers cooked along the Oregon Trail? Who invented the potato chip…and why? So do we!!! Food history presents a fascinating buffet of popular lore and contradictory facts. Some experts say it’s impossible to express this topic in exact timeline format. They are correct. Most foods are not invented; they evolve…

Dive into “The Food Timeline,” courtesy of @vtliberalarts.bsky.social‬.

See also (the source of the almanac entry below) chef James T. Ehler‘s marvelous FoodReference.com– “on this date” history and more.

(Image above: source)

* Fernand Point

###

As we dig in, we might send healthy birthday greetings to Gilbert Blane; he was born on this date in 1749. A Scottish physician who served on the Sick and Wounded Board of the Admiralty, he instituted health reform in the Royal Navy. Perhaps most memorably, he was largely responsible for requiring citrus juice (lemons, later limes) on all naval vessels to prevent scurvy.

Portrait of Sir Gilbert Blane, a Scottish physician known for his health reforms in the Royal Navy and prevention of scurvy.

source

Written by (Roughly) Daily

August 29, 2025 at 1:00 am

“Whoever wishes to foresee the future must consult the past; for human events ever resemble those of preceding times”*…

An advertisement for the 1910 Ford Model T touring car, showcasing its price, features, and specifications. The car is depicted in a side view with a detailed description of its engine power and equipment.

In times like these, perspective is at a premium. Here, Derek Thompson on what we might learn from our not-so-terribly-distant past…

When we hear about technological change and social crisis in the 21st century, it is easy to imagine that we are living through a special period of history. But many eras have grappled with the problems that seem to uniquely plague our own. The beginning of the 20th century was a period of speed and technological splendor (the automobile! the airplane! the bicycle!), shattered nerves, mass anxiety, and a widespread sense that the world had been forever knocked off its historical axis: a familiar stew of ideas. I think we can learn a lot about the present by studying historical periods whose challenges rhyme with our own.

My favorite period of history is the 30- to 40-year span between the end of the 19th century and the early innings of the 20th century. It was an era of incredible change. From Abundance (which Thompson co-authored with Ezra Klein):

Imagine going to sleep in 1875 in New York City and waking up thirty years later. As you shut your eyes, there is no electric lighting, Coca-Cola, basketball, or aspirin. There are no cars or “sneakers.” The tallest building in Manhattan is a church.

When you wake up in 1905, the city has been remade with towering steel-skeleton buildings called “skyscrapers.” The streets are filled with novelty: automobiles powered by new internal combustion engines, people riding bicycles in rubber-soled shoes—all recent innovations. The Sears catalog, the cardboard box, and aspirin are new arrivals. People have enjoyed their first sip of Coca-Cola and their first bite of what we now call an American hamburger. The Wright brothers have flown the first airplane. When you passed into slumber, nobody had taken a picture with a Kodak camera or used a machine that made motion pictures, or bought a device to play recorded music. By 1905, we have the first commercial versions of all three—the simple box camera, the cinematograph, and the phonograph.

No book on turn-of-the-century history has influenced me more, or brought me more joy, than The Vertigo Years: Europe 1900-1914 by Philipp Blom. I think it might be the most underrated history book ever written. In my favorite chapters focusing on the years around 1910, Blom describes how turn-of-the-century technology changed the way people thought about art and human nature and how it contributed to a nervous breakdown across the west. Disoriented by the speed of modern times, Europeans and Americans suffered from record-high rates of anxiety and a sense that our inventions had destroyed our humanity. Meanwhile, some artists channeled this disorientation to create some of the greatest art of all time.

[Thompson uses passages from Blom to unpack those issues– a world moving too fast, the anxiety occasioned by technological change, and the responses of artists and creators of culture. He concludes with a consideration of two influential new theories of human nature that arose at that point…]

… Blom closes his chapter “1910: Human Nature Changed” by considering two intellectual giants of the time: the sociologist Max Weber and the psychoanalyst Sigmund Freud, whose International Psychoanalytic Association was founded in 1910. The tension between their theories of human nature is profoundly relevant today.

In his famous work The Protestant Ethic and the Spirit of Capitalism, Weber, a German sociologist, argued that certain Protestant—especially Calvinist—traditions supported habits that aligned with the development of modern capitalism. He argued that the Protestant tradition of northern European worshippers cultivated a disciplined approach to work, savings, and investment that proved valuable in commerce, while the Calvinist doctrine of divine grace “could lead believers to read worldly success as a possible sign of God’s favor,” as Blom summarizes. Weber believed that Protestantism not only encouraged followers to pour their energies into labor (hence the allusion to Work Ethic in the book’s title) but also helped create a culture of trade and investment that supported the rise of modern capitalism.

“It is easy to see how Freud’s analysis follows on from Weber’s,” Blom writes. To Freud, human nature was at risk of being fully dissolved by capitalism and modern society, like chalk dropped in acid. Beneath the polite masks demanded by modern society, he said, there lurked a more atavistic and instinctual self. Freud saw our psyche as a tug-of-war between the id (our animal urges) and superego (the voice in our head that internalizes society’s rules), with the ego stuck in the middle trying to negotiate an authentic identity in the face of mass inauthenticity. One of Freud’s most fantastic insights was that some people can channel or redirect their most raw and unacceptable urges toward productive and acceptable work. His name for this bit of psychological alchemy was sublimation.

Modern capitalism, in Freudian terms, was the sublimation of self-interest—or, one might even say, the sublimation of greed. “The suppression of natural urges is a necessary precondition for capitalist success,” Blom writes in summary, “but while it is productive for the group and its wealth, such an approach will eventually exact its revenge on the individual.” By this interpretation, the mass anxiety of the early 1900s—whether you call it neurasthenia, American Nervousness, or Newyorkitis—was the price of modernity, technological development, and even capitalism itself.

There is little evidence that Freud and Weber ever debated one another. Yet when you set their theories side by side, it’s hard not to hear a conversation that still shapes much modern commentary. Weber wrote that modern capitalism evolved from religious doctrines that fit our nature, while Freud argued that human nature is unfit for a modern world that distorts and represses our basic urges. Are our most impressive inventions the ultimate expression of our humanity, or are they the ultimate threat to it? This is the question that every generation must answer for itself, including our own. It is a question equally worthy of the automobile and artificial intelligence. The troubling answer—for Weber and for Freud; for 1910 and for 2025—is: perhaps, both.

Learning from our past: “1910: The Year the Modern World Lost Its Mind,” from @dkthomp.bsky.social‬. Eminently worth reading in full.

Pair with another “history lesson”– a consideration of American mechanisms of voter restriction/suppression over the years, as context for the current application of the Orban playbook by the Trump Administration and states like Texas: “‘Competitive authoritarianism’ and America’s slide toward it”– moves fueled by appeals to the very anxieties (and to false nostalgia for times supposedly free of them) discussed above.

* Machiavelli

###

As we look back to look forward, we might send altitudinous birthday greetings to a man whose work figured into the tale that Thompson and Blom tell: Orville Wright; he was born on this date in 1871. An American inventor and aviator, he invented, with his elder brother Wilbur, the first powered airplane, Flyer, capable of sustained, controlled flight. In 1903, at Kitty Hawk, North Carolina, Orville made the first ever manned powered flight, staying airborne for 12 seconds. By 1905, the brothers had improved the design, building Flyer III– the first fully practical airplane, able to fly for up to 38 minutes and travel 24 miles– in which they made several long flights (though not without incident). Their Model A, produced in 1908, was capable of over two hours of flight. By 1909 their flights were the subject of wide public interest, watched by leaders (like President Taft) and by crowds of as many as a million people (in Manhattan during the Hudson-Fulton Celebration)… by 1910, flight and its future had become one of the many accelerating vectors driving the turmoil that Thompson describes.

A historical black and white portrait of Orville Wright, an American inventor and aviation pioneer, showcasing his distinguished attire and prominent mustache.

source

“I didn’t study theology out of piety. I studied it because I wanted to know.”*…

A hospital bed with a white pillow and blue blanket, alongside a pink bedside table containing a pitcher, a bowl, and bottles.

Beatrice Marovich on a discipline in decline…

People often assume that theology is only for true believers: those who want to defend the existence of God against the skepticism of secular outsiders. But there’s an old open secret in the field: theologians often have a complicated relationship with belief, and some theologians are even non-believers. I’ve always been a secular—or non-religious—person. That’s the “tradition” I was raised in. But I’m also a theologian.

I knew that it was a risk, going into the field of theology. There are conversations I’ve been shut out of because I’m not religious enough. And I’m often marked as a troubling outsider by scholars who see themselves taking a purely secular approach to the interdisciplinary study of religion. But as a graduate student, and even early in my career as a faculty member at a small liberal arts college, I believed the field of theology was opening up, and becoming more complex. It felt, to me, as if there were a creative disintegration happening that might make more room for scholars like me. But after more than a decade in the field, I’ve come to feel that something else is happening instead. It feels like the field is dying.

People are still doing theology in public (if, by doing theology we mean talking about gods, spirits, and other divine powers). But the field I was trained in as a scholar—academic theology—feels like it’s dying. It’s a field that’s often philosophical, but always theoretical. Because of this, theology can verge quickly into the abstract, and the speculative. Theologians might make use of anthropological, sociological, and historical studies of religion. But they tend not to feel beholden to any of those disciplines. Indeed, theologians are often wading into explicitly interdisciplinary conversations about science, politics, gender, and race (among other things). In its lack of clear focus, theology might be the most undisciplined discipline in the American academy today. And that undisciplined discipline feels like it’s dying. At least to me.

But is theology really dying? Or is this just the feeling I have, as I’m being squeezed out of the field? Or perhaps I’m fixated on the mortality of this collective project because I’ve been writing, thinking, and teaching about death. When I looked at enrollment numbers at seminaries and theological schools, I found they weren’t necessarily damning. At least not yet. They don’t necessarily confirm my feeling, or my mood. Neither did Sean Larsen’s 2020 State of Theology study, funded by the Templeton Religion Trust. There were people, in that study, who remained optimistic about the discipline’s prospects. And while Ted Smith’s 2023 book The End of Theological Education does acknowledge that the institutions that built theology in America are collapsing, he remains optimistic about what the church can do for the future of theology.

I needed to know if others shared my feeling, or mood. So, I decided to have a conversation with my colleagues. I reached out to people in my network, to see who felt compelled to weigh in. I had three questions for them: Is academic theology really dying? If so, how do you feel about this death? And, finally, if you could save one thing from the sinking ship that is academic theology, what would it be? This essay is a kind of report: it’s what my colleagues told me.

What you’ll read here does reflect a bias: these are voices from within my network. Nevertheless, I think their words are worth sharing. Whether or not academic theology is really dying, it may still be worth thinking about its mortality. If I’ve learned any lesson from writing and thinking about death, it’s that when we acknowledge that it’s there, when we remember that we’re always living in death’s shadows, we take what’s in front of us much more seriously. We can see the full fragility of things, and we can try—against the odds—to resist entropy and protect what we think is worth saving, inheriting, or carrying on into the future. And we can think about what we’re ready to let go of. Because all things, in time, do die. It’s only a question of when…

[Marovich examines the state of the field via a recounting of highlights from her conversations with colleagues…]

… I conducted these interviews in the spring of 2024, in what feels to me (now) like a different world. What David Kline so succinctly described as the “institutional frameworks for intellectual life” seem more fragile and threatened than ever, as the Trump administration rapidly defunds education and research, and attacks media outlets. And we can’t forget, of course, about the many threats that Artificial Intelligence—in the form of Large Language Models like ChatGPT—poses to these fragile frameworks for intellectual life. I’m aware that it may seem small-minded and naïve to worry about my own obscure little academic discipline, when the whole structure is falling apart. So, it does seem important for me to clarify that I have spent (and will continue to spend) many hours grieving, as if in anticipation, what feels like the evaporation of intellectual possibilities—intellectual life itself!—in America. I am torn up about all of this. And yet, simultaneously, I do remain concerned about the strange little ecosystem that comprises my corner of the world.

As I think over these conversations with my colleagues, I find myself torn between letting go and holding on—or, perhaps better said, trying to hold space. I agree with Hanna Reichel when they suggest that letting go of the growth mindset is painful and difficult for Americans, perhaps more than anyone else. And this contributes to so much of the damage that American life does to the planet we share with others. I recognize that this is a problem. And I am compelled by Colby Dickinson’s suggestion that perhaps learning to die—learning an ars moriendi—might be the best thing that theology could do right now. So much of what is good about theology is probably already in diaspora, as Amaryah Armstrong has suggested. I do have a certain kind of faith that much of the power of theology will live on, in some shape and form, wherever it goes.

And yet Sameer Yadav’s point about academic theology existing as a kind of “nowhere” space strikes me as so deeply true. That nowhere space has given me so much room to explore, it’s opened dimensions of life to me that I would never have seen, and it’s introduced me to so many incredible people—living and dead. I am grateful for this community, and I feel like I owe it something. I feel compelled to somehow preserve that generative and undisciplined nowhere space for others. Like Meg Mercury, I would like to see this nowhere space open up and expand, for those people who don’t feel as if they belong in traditional religious structures. And yet, I also recognize that the cash value of this sort of space—for the church and for the academy—is more or less zero. The odds that it will survive, even if (as David Congdon noted) there is some educational New Deal that revives higher education, are slim. But perhaps this is one of the reasons why I felt compelled to speak with my colleagues, and write this piece, in the first place. Perhaps it was a gesture at letting go. Or perhaps it was a little leap of faith—a little gesture towards expanding space and time for this nowhere community to find new forms of shelter in which to gather…

On doing hospice care for an academic discipline: “Is Theology Dying?” from @beamarovich.bsky.social‬ in The Other Journal.

* Mary Daly

###

As we ponder the preservation of perspicacity, we might send controversial birthday greetings to a man whose experience illustrates (one episode in) the long history of theology’s peril, Bernard Lamy; he was born on this date in 1640. A French Oratorian and mathematician, he was also an important theologian… one whose teachings were judged alternately controversial or irrelevant at the series of institutions to which he was forced continually to move throughout his career.

Engraving of Bernard Lamy, a French Oratorian and mathematician, depicting him in a traditional clerical outfit, inside an ornate frame with an inscription below.

source

Written by (Roughly) Daily

June 15, 2025 at 1:00 am

“Status is welcome, agreeable, pleasant, & hard to obtain in the world”*…

Illustration of two figures standing on a staircase made of books, with one figure holding a book above their head, symbolizing knowledge and expertise.

We live in a time when a certain kind of status– expertise– is under attack. Dan Williams suggests that by celebrating “common sense” over expert authority, populism performs a dramatic status inversion. It gifts uneducated voters the power of knowledge and deflates those who look down on them…

… As Will Storr argues in The Status Game, humiliation is the “nuclear bomb of the emotions”. When ignited, it can fuel everything from genocide to suicide, mass atrocities to self-immolation. There are few parts of human nature more chaotic, dangerous, or self-destructive. And yet, there is often a rationale underlying these reactions rooted in the strange nature of human sociality.

If humans were solitary animals, we would have evolved to approximate the behaviour of Homo economicus, the idealised rational agent imagined in much of twentieth century economics. We would act in ways that are predictable, sensible, and consistent. The characters depicted in Dostoevsky’s novels would be unintelligible to such a creature, except as victims of mental illness.

But we are not. We are social creatures, and almost everything puzzling and paradoxical about our species is downstream of this fact.

For one thing, we rely on complex networks of cooperation to achieve almost all our goals. Given this, much of human behaviour is rooted not in ordinary material self-interest but in the need to gain access to such networks—to win approval, cultivate a good reputation, and attract partners, friends, and allies. Human decision-making occurs within the confines of this social scrutiny. We evaluate almost every action, habit, and preference not just by its immediate effects but by its reputational impact.

At the same time, much of human competition is driven by the desire for prestige. In well-functioning human societies, individuals advance their interests not by bullying and dominating others but by impressing them. These high-status individuals are admired, respected, and deferred to. They win esteem and all its benefits. Their lives feel meaningful and purposeful.

In contrast, those who fail at the status game—who stack up at the bottom of the prestige hierarchy—experience shame and humiliation. If their position feels unfair, they become resentful and angry. In extreme cases, they might take vengeance on those who look down on them. Or they might take their own life. In some cases, such as mass killings by young men who “lose face” and run “amok” (a Malay word, illustrating the behaviour’s cross-cultural nature), they do both…

… The name of this newsletter, “Conspicuous Cognition”, is inspired by Veblen’s ideas about economics. Just as he sought to correct a misguided tendency to treat economics through a narrowly economic lens, my work and writings seek to correct a similarly misguided tendency to treat cognition—how we think, form beliefs, generate ideas, evaluate evidence, communicate, and so on—through a narrowly cognitive lens.

Much cognition is competitive and conspicuous. People strive to show off their intelligence, knowledge, and wisdom. They compete to win attention and recognition for making novel discoveries or producing rationalisations of what others want to believe. They often reason not to figure out the truth but to persuade and manage their reputation. They often form beliefs not to acquire knowledge but to signal their impressive qualities and loyalties.

Placed in this context of social competition and impression management, what might be called “epistemic charity”—the free offer of knowledge and expertise—takes on a different appearance. Although this charity can be driven by disinterested altruism (think of parents educating their children), it can also result from status competition and a desire to show off.

In some cases, people are happy to receive such epistemic charity and heap praise and admiration on those who provide it. The wonders of modern science emerge from a status game that celebrates those who make discoveries. However, we sometimes recoil at the thought of admitting someone has discovered something new, or—even worse—that they know better than we do. When that happens, we are not sceptical of the truth of their ideas, although we might choose to frame things that way. Rather, their offer of knowledge carries a symbolic significance we want to reject. It hurts our pride. It feels humiliating.

On a small scale, this feeling is an everyday occurrence. Few people like to be corrected, to admit they are wrong, or to acknowledge another’s superior knowledge, wisdom, or intelligence. On a larger scale, it might be implicated in some of the most significant and dangerous trends in modern politics.

Many of our most profound political problems appear to be entangled with epistemic issues. Think of our alleged crises of “disinformation”, “misinformation”, “post-truth”, and conspiracy theories. Think of the spread of viral lies and falsehoods on social media. Think of intense ideological polarisation, vicious political debates, and heated culture wars, disagreements and conflicts that ultimately concern what is true.

A critical aspect of these problems is the so-called “crisis of expertise”, the widespread populist rejection of claims advanced in institutions like science, universities, public health organisations, and mainstream media. Famously, many populists have “had enough of experts.” As Trump once put it, “The experts are terrible.”

This rejection of expertise goes beyond mere scepticism. It is actively hostile. The Trump administration’s recent attacks on Harvard and other elite universities provide one illustration of this hostility, but there are many others. Most obviously, there is the proud willingness among many populists to spread and accept falsehoods, conspiracy theories, and quack science in the face of an exasperated barrage of “fact-checks” from establishment institutions. Why are these corrections so politically impotent? Why do so many voters refuse to “follow the science” or “trust the experts”?

Experts have produced many theories. Some point to ignorance and stupidity. Some point to disinformation and mass manipulation. Some point to partisan media, echo chambers, and algorithms. And some suggest that the crisis might be related to objective failures by experts themselves.

There is likely some truth in all these explanations. Nevertheless, they share a common assumption: that the “crisis of expertise” is best understood in epistemic terms. They assume that populist hostility to the expert class reflects scepticism that their expertise is genuine—that they really know what they claim to know.

Perhaps this assumption is mistaken. Perhaps at least in some cases, the crisis of expertise is less about doubting expert knowledge than about rejecting the social hierarchy that “trust the experts” implies… some populists might sooner accept ignorance than epistemic charity from those they refuse to acknowledge as superior…

… If this analysis is correct, the populist rejection of expertise is not merely an intellectual disagreement over truth or evidence, even if it is typically presented that way. It is, in part, a proud refusal to accept epistemic charity from those who present themselves as social superiors.

In the case of populist elites and conspiracy theorists, this refusal is often driven by objectionable feelings of grandiosity and narcissism. However, for many ordinary voters, it may serve as a more understandable dignity-defence mechanism, a refusal to accept the social meanings implied by one-way deference to elites with alien values. It is less “post-truth” than anti-humiliation.

This would help to explain several features of the populist rejection of expertise.

First, there is its emotional signature. In many cases, the populist refusal to defer to experts appears to be wrapped up in intense emotions of resentment, indignation, and defiant pride, rather than simple scepticism.

Second, the rejection of expert authority often has a performative character. Experts are not merely ignored. They are actively, angrily, and proudly rejected. Like Captain Snegiryov, the populist publicly tramples on the expert’s offer of knowledge.

Third, there is the destructive aspect of many populist sentiments. If the issue were merely scepticism of experts and establishment institutions, the solution would presumably involve targeted reforms designed to make them more reliable. As recent Republican attacks on elite universities make clear, many populists prefer to take a sledgehammer to these institutions. The explosive hostility towards public health experts during the pandemic provides another telling example.

Finally, there is the fact that populists often embrace anti-intellectualism as an identity marker, a badge of pride. The valorisation of gut instincts, the proposed “revolution of common sense”, and the embrace of slogans like “do your own research” affirm the status of those who prioritise intuition over experts. The demonisation of “ivory tower academics”, “blue-haired”, “woke” professors, and the “chattering classes” are crafted to have a similar effect. This all looks more like status-inverting propaganda than intellectual disagreements over truth and trustworthiness.

To understand is not to forgive. Just as we can empathise with Snegiryov’s refusal of much-needed money whilst condemning it as short-sighted and self-destructive, we can try to understand the populist rejection of expertise without endorsing or justifying it.

To be clear, there are profound problems with our expert class and elite institutions. They routinely make errors, sometimes catastrophic ones, and often wield their social authority in ways that advance their own interests over the public good. The Iraq war, the financial crisis, and the many failures of policy and communication throughout the pandemic provide powerful illustrations of these expert failures, but there are many others.

Moreover, the social and political uniformity of experts today creates legitimate concerns about their trustworthiness. When scientific journals, public health authorities, and fact-checking organisations are obviously shaped by the values, partisan allegiances, and sensibilities of highly educated, progressive professionals, it is reasonable for those with very different values and identities to become mistrustful of them.

Nevertheless, there is no alternative to credentialed experts in complex, modern societies. To address the political challenges we confront today, we need specialised training, rigorous standards of evidence, and coordinated activity within institutions carefully engineered to produce knowledge. Although these institutions must be reformed in countless ways, they are indispensable.

Given this, the populists’ rejection of expertise does not liberate them from bias and error. It guarantees bias and error. Gut instincts, intuition, and “common sense” are fundamentally unreliable ways of producing knowledge. As we see with the MAGA media ecosystem today, the valorisation of such methods means returning to a pre-scientific, medieval worldview dominated by baseless conspiracy theories, snake oil medicine, economic illiteracy, and know-nothing punditry.

And yet, the dangers associated with this style of politics underscore the importance of understanding its causes. If the crisis of expertise is partly rooted in feelings of status threat, resentment, and humiliation, this has significant implications for how we should think about—and address—this crisis.

Most obviously, it suggests that purely epistemic solutions will have limited efficacy. You cannot fact-check your way out of status competition. And as long as the acceptance of expert guidance is experienced as an admission of social inferiority, there will be a lucrative market for demagogues and bullshitters who produce more status-affirming narratives.

Moreover, it suggests that rebuilding trust in experts means more than improving their reliability, as crucial as that is. Institutions dominated by a single social class and political tribe will inevitably face resistance and backlash in broader society, regardless of their technical competence.

We do not just need better ways of producing knowledge. We need to rethink how knowledge is offered: in ways that respect people’s pride and minimise the humiliations of one-sided epistemic charity…

Eminently worth reading in full: “Status, class, and the crisis of expertise,” from @danwphilosophy.bsky.social‬.

(Image above: source)

* Buddha (Ittha Sutta, AN 5.43)

###

As we dig dignity, we might send classy birthday greetings to George Bryan “Beau” Brummell; he was born on this date in 1778. An important figure in Regency England (a close pal of the Prince Regent, the future King George IV), he became the arbiter of men’s fashion in London and in the territories under its cultural sway.

Brummell was remembered afterwards as the preeminent example of the dandy; a whole literature was founded on his manner and witty sayings, e.g. “Fashions come and go; bad taste is timeless.”

Portrait of George Bryan 'Beau' Brummell, a prominent figure in Regency England known for his influence on men's fashion.

source

Written by (Roughly) Daily

June 7, 2025 at 1:00 am

“I know not all that may be coming, but be it what it will, I’ll go to it laughing”*…

Anthropologist Kirsten Bell explores laughter as a far more complex phenomenon than simple delight, reflecting on its surprising power to disturb and disrupt…

… As an anthropologist specializing in health and medicine, laughter isn’t really in my professional wheelhouse—unless you subscribe to the view that laughter is the best medicine. My interest in the topic is more personal, not just because of my history as a former Giggling Gertie, but because it’s a behavior that is much less straightforward than it seems.

Ideally, laughter is something we share. According to anthropologist Munro Edmonson, laughter is sociable; it ideally invites a similar response. Indeed, it has contagious qualities: When we hear someone laugh, we often laugh, or at least smile, ourselves—an effect consistently shown through psychological research. This is how we ended up with canned laughter on sitcoms. Studios realized that the sound of laughter made their shows seem funnier to their audiences, while also giving them a degree of control over when people laughed…

… According to the anthropologist Munro Edmonson, the central feature of laughter is aspiration: We release a forceful puff of air as we laugh.

But laughter is also characterized by repetition. In fact, given the extraordinary variability in the sounds people make when they laugh, repetition is what makes laughter universally recognizable. This is why writers conventionalize laughter as “he-he-he,” “ha-ha-ha,” and “ho-ho-ho” (well, at least if you’re Santa Claus). Notably, this feature isn’t exclusive to English representations. Edmonson observed that laughter is represented in Russian as xe, xe, xe; in Tzotzil—a Mayan language spoken in Mexico—it’s ‘eh ‘eh ‘eh.

We don’t fully understand why humans make this sound when we laugh. When 19th-century biologist Charles Darwin set out to explore the biology of feelings in The Expression of the Emotions in Man and Animals, he wrote, “why the sounds which man utters when he is pleased have the peculiar reiterated character of laughter we do not know.” However, the response seems to occur well before culture is embedded in our behaviors: Recognizable laughter is evident in babies from 4 months old.

Nor is laughter unique to humans. Great apes respond to being tickled in much the same way that humans do. Of course, because chimps, bonobos, et cetera have a different vocal apparatus than humans, it sounds more like a dog panting or a person having an asthma attack or energetic sex. However, these primate sounds have the same “peculiar, reiterated character” that Darwin highlighted in humans. This is why laughter is characterized by scientists as a cross-species phenomenon.

Yet, while laughter is evident in the play of other primates, it’s unclear whether they have a sense of humor. Recent research provides evidence of a capacity for teasing through nonverbal behavior. But, as the evolutionary psychologist Robert Provine noted, “there is no evidence that they respond to apparently humorous behavior, their own or that of others, with laughter.”

Giving meaning to laughter seems to be distinctively human.

While some laughter is deliberate, much of it is outside conscious control—an attribute that goes a long way toward explaining the widespread Euro-American ambivalence toward the act. According to the literary scholar Sebastian Coxon, a growing anxiety about mirth is evident in the European historical record from the late Middle Ages. For example, the 16th-century Dutch philosopher Desiderius Erasmus, better known for advising children to “replace farts with coughs,” also warned against “loud laughter and immoderate mirth.”

Notably, Erasmus singled out the “neighing sound that some people make when they laugh” for particular opprobrium—an impulse evident in the contemporary tendency to compare unrestrained laughter with the cries of animals: “howling” with laughter, “hooting” in delight, “snorting” with amusement, and so on. Indeed, while the term “guffaw” might not be borrowed from animal noises, it certainly sounds like it could be.

These characterizations reveal an attempt to draw laughter into the realm of taste and civility—categories that are strongly tied to gender and class strictures. For instance, in an 1860 etiquette guide titled The Ladies’ Book of Etiquette and Manual of Politeness: A Complete Hand Book for the Use of the Lady in Polite Society, readers are counseled to moderate their laughter during a dinner party so that it’s neither too loud nor too soft: “To laugh in a suppressed way, has the appearance of laughing at those around you, and a loud, boisterous laugh is always unlady-like.”

Social judgments abound not just in relation to how we laugh but what we laugh at—as an early 19th-century artwork attests. “Laughter,” etched by British artist and social commentator Thomas Rowlandson, depicts a man laughing at his cat adorned in a bonnet and cloak.

The caption reads: ‘Laughter is one of the most pleasing of the Passions and is with difficulty accounted for, as risibility is frequently excited from the most simple causes—as is the case with the Countryman and his Cat.’ The implication is that “unsophisticated” countrymen lack “class” and are therefore easily amused. (For the record, I am equally unsophisticated, because I will never not find cats pictured with human props funny.)

Still, despite the association between humor and taste, it’s often physical comedy that gets the most laughs. It’s not a coincidence that the first truly global hit comedy was The Gods Must Be Crazy, whose sublime “Tati-like slapstick routines” drew audiences from New York and Caracas to Tokyo and Lagos, despite being widely condemned by film reviewers as apartheid propaganda.

Indeed, screenwriters have long predicted that physical humor will become increasingly prominent in Hollywood comedies because it “transcends dialogue and even most cultural differences,” and movies must increasingly appeal to a global market to produce reliable returns. (As far as I can tell, the future of Hollywood films is basically Marvel movies and slapstick comedies.)…

As McDonald observes, laughter disrupts the notion of a stable, coherent self—reflected in terms like “cracking up” and “bursting.” Moreover, unrestrained laughter doesn’t just signify a lack of personal control; it can be politically dangerous as well. The literary historian Joseph Butwin writes of “seditious laughter” as a weapon of the oppressed that can serve to destabilize hierarchies and power relations.

In the end, it’s clear that laughter is a deeply curious thing. It’s simultaneously the most social of human expressions and the one most disruptive of social edifices and rules. Shared, sanctioned laughter might bring us together, but unsanctioned laughter shows the cracks, revealing that we’re not quite who we think…

Eminently worth reading in full: “The Strange Power of Laughter.”

* Herman Melville, Moby-Dick

###

As we muse on mirth (and lest we forget that sometimes laughter is simply a function of simple delight), we might recall that it was on this date in 1929 that Rube Goldberg’s cartoon series “The Inventions of Professor Lucifer Gorgonzola Butts, A.K.” was first published in Collier’s Weekly.

source

Written by (Roughly) Daily

January 26, 2025 at 1:00 am