(Roughly) Daily


“Status is welcome, agreeable, pleasant, & hard to obtain in the world”*…

Illustration of two figures standing on a staircase made of books, with one figure holding a book above their head, symbolizing knowledge and expertise.

We live in a time when a certain kind of status– expertise– is under attack. Dan Williams suggests that by celebrating “common sense” over expert authority, populism performs a dramatic status inversion. It gifts uneducated voters the power of knowledge and deflates those who look down on them…

… As Will Storr argues in The Status Game, humiliation is the “nuclear bomb of the emotions”. When ignited, it can fuel everything from genocide to suicide, mass atrocities to self-immolation. There are few parts of human nature more chaotic, dangerous, or self-destructive. And yet, there is often a rationale underlying these reactions rooted in the strange nature of human sociality.

If humans were solitary animals, we would have evolved to approximate the behaviour of Homo economicus, the idealised rational agent imagined in much of twentieth century economics. We would act in ways that are predictable, sensible, and consistent. The characters depicted in Dostoevsky’s novels would be unintelligible to such a creature, except as victims of mental illness.

But we are not. We are social creatures, and almost everything puzzling and paradoxical about our species is downstream of this fact.

For one thing, we rely on complex networks of cooperation to achieve almost all our goals. Given this, much of human behaviour is rooted not in ordinary material self-interest but in the need to gain access to such networks—to win approval, cultivate a good reputation, and attract partners, friends, and allies. Human decision-making occurs within the confines of this social scrutiny. We evaluate almost every action, habit, and preference not just by its immediate effects but by its reputational impact.

At the same time, much of human competition is driven by the desire for prestige. In well-functioning human societies, individuals advance their interests not by bullying and dominating others but by impressing them. These high-status individuals are admired, respected, and deferred to. They win esteem and all its benefits. Their lives feel meaningful and purposeful.

In contrast, those who fail at the status game—who stack up at the bottom of the prestige hierarchy—experience shame and humiliation. If their position feels unfair, they become resentful and angry. In extreme cases, they might take vengeance on those who look down on them. Or they might take their own life. In some cases, such as mass killings by young men who “lose face” and run “amok” (a Malay word, illustrating the behaviour’s cross-cultural nature), they do both…

… The name of this newsletter, “Conspicuous Cognition”, is inspired by Veblen’s ideas about economics. Just as he sought to correct a misguided tendency to treat economics through a narrowly economic lens, my work and writings seek to correct a similarly misguided tendency to treat cognition—how we think, form beliefs, generate ideas, evaluate evidence, communicate, and so on—through a narrowly cognitive lens.

Much cognition is competitive and conspicuous. People strive to show off their intelligence, knowledge, and wisdom. They compete to win attention and recognition for making novel discoveries or producing rationalisations of what others want to believe. They often reason not to figure out the truth but to persuade and manage their reputation. They often form beliefs not to acquire knowledge but to signal their impressive qualities and loyalties.

Placed in this context of social competition and impression management, what might be called “epistemic charity”—the free offer of knowledge and expertise—takes on a different appearance. Although this charity can be driven by disinterested altruism (think of parents educating their children), it can also result from status competition and a desire to show off.

In some cases, people are happy to receive such epistemic charity and heap praise and admiration on those who provide it. The wonders of modern science emerge from a status game that celebrates those who make discoveries. However, we sometimes recoil at the thought of admitting someone has discovered something new, or—even worse—that they know better than we do. When that happens, we are not sceptical of the truth of their ideas, although we might choose to frame things that way. Rather, their offer of knowledge carries a symbolic significance we want to reject. It hurts our pride. It feels humiliating.

On a small scale, this feeling is an everyday occurrence. Few people like to be corrected, to admit they are wrong, or to acknowledge another’s superior knowledge, wisdom, or intelligence. On a larger scale, it might be implicated in some of the most significant and dangerous trends in modern politics.

Many of our most profound political problems appear to be entangled with epistemic issues. Think of our alleged crises of “disinformation”, “misinformation”, “post-truth”, and conspiracy theories. Think of the spread of viral lies and falsehoods on social media. Think of intense ideological polarisation, vicious political debates, and heated culture wars, disagreements and conflicts that ultimately concern what is true.

A critical aspect of these problems is the so-called “crisis of expertise”, the widespread populist rejection of claims advanced in institutions like science, universities, public health organisations, and mainstream media. Famously, many populists have “had enough of experts.” As Trump once put it, “The experts are terrible.”

This rejection of expertise goes beyond mere scepticism. It is actively hostile. The Trump administration’s recent attacks on Harvard and other elite universities provide one illustration of this hostility, but there are many others. Most obviously, there is the proud willingness among many populists to spread and accept falsehoods, conspiracy theories, and quack science in the face of an exasperated barrage of “fact-checks” from establishment institutions. Why are these corrections so politically impotent? Why do so many voters refuse to “follow the science” or “trust the experts”?

Experts have produced many theories. Some point to ignorance and stupidity. Some point to disinformation and mass manipulation. Some point to partisan media, echo chambers, and algorithms. And some suggest that the crisis might be related to objective failures by experts themselves.

There is likely some truth in all these explanations. Nevertheless, they share a common assumption: that the “crisis of expertise” is best understood in epistemic terms. They assume that populist hostility to the expert class reflects scepticism that their expertise is genuine—that they really know what they claim to know.

Perhaps this assumption is mistaken. Perhaps at least in some cases, the crisis of expertise is less about doubting expert knowledge than about rejecting the social hierarchy that “trust the experts” implies… some populists might sooner accept ignorance than epistemic charity from those they refuse to acknowledge as superior…

… If this analysis is correct, the populist rejection of expertise is not merely an intellectual disagreement over truth or evidence, even if it is typically presented that way. It is, in part, a proud refusal to accept epistemic charity from those who present themselves as social superiors.

In the case of populist elites and conspiracy theorists, this refusal is often driven by objectionable feelings of grandiosity and narcissism. However, for many ordinary voters, it may serve as a more understandable dignity-defence mechanism, a refusal to accept the social meanings implied by one-way deference to elites with alien values. It is less “post-truth” than anti-humiliation.

This would help to explain several features of the populist rejection of expertise.

First, there is its emotional signature. In many cases, the populist refusal to defer to experts appears to be wrapped up in intense emotions of resentment, indignation, and defiant pride, rather than simple scepticism.

Second, the rejection of expert authority often has a performative character. Experts are not merely ignored. They are actively, angrily, and proudly rejected. Like Captain Snegiryov, the populist publicly tramples on the expert’s offer of knowledge.

Third, there is the destructive aspect of many populist sentiments. If the issue were merely scepticism of experts and establishment institutions, the solution would presumably involve targeted reforms designed to make them more reliable. As recent Republican attacks on elite universities make clear, many populists prefer to take a sledgehammer to these institutions. The explosive hostility towards public health experts during the pandemic provides another telling example.

Finally, there is the fact that populists often embrace anti-intellectualism as an identity marker, a badge of pride. The valorisation of gut instincts, the proposed “revolution of common sense”, and the embrace of slogans like “do your own research” affirm the status of those who prioritise intuition over experts. The demonisation of “ivory tower academics”, “blue-haired”, “woke” professors, and the “chattering classes” are crafted to have a similar effect. This all looks more like status-inverting propaganda than intellectual disagreements over truth and trustworthiness.

To understand is not to forgive. Just as we can empathise with Snegiryov’s refusal of much-needed money whilst condemning it as short-sighted and self-destructive, we can try to understand the populist rejection of expertise without endorsing or justifying it.

To be clear, there are profound problems with our expert class and elite institutions. They routinely make errors, sometimes catastrophic ones, and often wield their social authority in ways that advance their own interests over the public good. The Iraq war, the financial crisis, and the many failures of policy and communication throughout the pandemic provide powerful illustrations of these expert failures, but there are many others.

Moreover, the social and political uniformity of experts today creates legitimate concerns about their trustworthiness. When scientific journals, public health authorities, and fact-checking organisations are obviously shaped by the values, partisan allegiances, and sensibilities of highly educated, progressive professionals, it is reasonable for those with very different values and identities to become mistrustful of them.

Nevertheless, there is no alternative to credentialed experts in complex, modern societies. To address the political challenges we confront today, we need specialised training, rigorous standards of evidence, and coordinated activity within institutions carefully engineered to produce knowledge. Although these institutions must be reformed in countless ways, they are indispensable.

Given this, the populists’ rejection of expertise does not liberate them from bias and error. It guarantees bias and error. Gut instincts, intuition, and “common sense” are fundamentally unreliable ways of producing knowledge. As we see with the MAGA media ecosystem today, the valorisation of such methods means returning to a pre-scientific, medieval worldview dominated by baseless conspiracy theories, snake oil medicine, economic illiteracy, and know-nothing punditry.

And yet, the dangers associated with this style of politics underscore the importance of understanding its causes. If the crisis of expertise is partly rooted in feelings of status threat, resentment, and humiliation, this has significant implications for how we should think about—and address—this crisis.

Most obviously, it suggests that purely epistemic solutions will have limited efficacy. You cannot fact-check your way out of status competition. And as long as the acceptance of expert guidance is experienced as an admission of social inferiority, there will be a lucrative market for demagogues and bullshitters who produce more status-affirming narratives.

Moreover, it suggests that rebuilding trust in experts means more than improving their reliability, as crucial as that is. Institutions dominated by a single social class and political tribe will inevitably face resistance and backlash in broader society, regardless of their technical competence.

We do not just need better ways of producing knowledge. We need to rethink how knowledge is offered: in ways that respect people’s pride and minimise the humiliations of one-sided epistemic charity…

Eminently worth reading in full: “Status, class, and the crisis of expertise,” from @danwphilosophy.bsky.social.

(Image above: source)

* Buddha (Ittha Sutta, AN 5.43)

###

As we dig dignity, we might send classy birthday greetings to George Bryan “Beau” Brummell; he was born on this date in 1778. An important figure in Regency England (a close pal of the Prince Regent, the future King George IV), he became the arbiter of men’s fashion in London and in the territories under its cultural sway.

Brummell was remembered afterwards as the preeminent example of the dandy; a whole literature was founded on his manner and witty sayings, e.g. “Fashions come and go; bad taste is timeless.”

Portrait of George Bryan 'Beau' Brummell, a prominent figure in Regency England known for his influence on men's fashion.

source

Written by (Roughly) Daily

June 7, 2025 at 1:00 am

“I believe, you know, I actually, naturally think, in long, sad, singing lines”*…

Matthew Zipf on how– in the hands of a true literary stylist– the humble comma is a matter of precision, logic, individuality, and music…

… The most conspicuous mark of Renata Adler’s style is its abundance of commas. In her two novels, Speedboat (1976) and Pitch Dark (1983), there are a few sentences that edge on the absurd: “For some time, Leander had spoken, on the phone, of a woman, a painter, whom he had met, one afternoon, outside the gym, and whom he was trying to introduce, along with Simon, into his apartment and his life.” A critic tallied it up, counting “40 words and ten commas—Guinness Book of World Records?” Each of those commas had its grammatical defense, but Adler’s style did not comply with the usual standards of fluent prose. She cordoned off phrases, such as “on the phone,” that other writers would just run through. One reader, responding to a 1983 New York magazine profile of Adler, wrote in a letter to the editor, “If the examples of Renata Adler’s writing … are typical, Miss Adler will never make it to the road. The way is ‘jarringly, piece by piece, line by line, and without interruption’ blocked by commas.” The reader was quoting one of Adler’s own comma-laden critical phrases against her. The editors titled the letter “Comma Wealth.”

Adler’s comma usage differs from the balanced rhythms of 18th-century essayists, as well as from the breathless lines common among American writers after Hemingway. Her punctuation jars, and turns abruptly, like a skater’s blade stopping and sending up shards of ice. She likes to leave out conjunctions in chains of adjectives, as in her film review of “a leering, uncertain, embarrassing, protracted little comedy.” Certain lines of hers work almost entirely by carefully placed commas, which tighten the style, each one a rivet on the page: “But this I know, or think I know, that idle people are often bored and bored people, unless they sleep a lot, are cruel.” A hesitation, a stutter, and then a swing into the qualification (“unless they sleep a lot”). Interestingly, no comma in “bored and bored people”: grammar sacrificed for rhythm and speed.

Not everyone saw the elegant precision in how she pointed her sentences. After one of her books came out, a critic wrote that “virtually every sentence is peppered with enough commas to make the prose read like a series of hiccups.” Another letter writer complained of her “muddled syntax and wandering, endless sentences.”

But Adler’s style had its admirers, too. “Nobody in this country writes better prose than Renata Adler’s,” a critic wrote in a Harper’s review of her first novel. New Yorker staff writer Elif Batuman recently said that she could not think of a living stylist she admired more. But the most insightful comment might have come from a man splitting the difference. At The New York Times, where in 1968 Adler had become the daily film critic, editor Abe Rosenthal addressed a concerned colleague by first conceding that Adler was no great stylist, and then suggesting a metaphor: She’s olives. The readers will grow to like her…

… Adler sang of punctuation. In Pitch Dark, there is an associative run, typical of her novels, that reads: “And this matter of the commas. And this matter of the paragraphs. The true comma. The pause comma. The afterthought comma. The hesitation comma. The rhythm comma. The blues.” I wonder if everyone hears the music in that “riff,” as she calls it, with the rhythm of the short fragments and the satisfaction of the last monosyllable. There is also, within the musical list, a legitimate sorting of functions. The afterthought comma, Adler said, was when you wanted to add something, and there was no obvious way to do it. The true comma was the grammatical one, separating phrases or clauses. When spoken, it could function differently from other commas. To read out loud, “For some time, Leander had spoken, on the phone, of a woman …” you should not rest at each phrase. The true comma does not always require a pause.

Punctuation controls two things: logical separation and breath. In Adler’s words, “part of it is meaning, and part of it is cadence.” Writers weigh each role differently. Didion wrote that grammar was a piano she learned to play by ear, and she seems to have given priority to breath. Adler learned grammar in school, where she diagrammed sentences, and then at The New Yorker, which gave far more weight to logic. The magazine, where she began working at age 24, was alternately beloved and deplored for its commas. If a phrase was not essential to the sentence, the editors wanted it enveloped: thus “on the phone,” a wrapped-up appositive. (“May I offer you a comma?” the magazine’s editor, William Shawn, used to say to Adler in editing sessions.) Punctuation grew into a dogmatic inheritance, a passion, and a trademark of the magazine. E. B. White wrote that “commas in The New Yorker fall with the precision of knives in a circus act, outlining the victim.”

Knives are an apt metaphor. The first systematic survey of English punctuation, published in 1785, recorded that the Greek komma means “a segment, or a part cut off [from] a complete sentence.” The word comes from koptein, to cut. Precision, too, is a kind of cutting, a drawing of lines. It has a coincident etymology in Latin: praecidere means “to cut off.” Adding commas does not necessarily make your work precise, and you can write clearly without much punctuation. Precision might come, for example, in short declarative sentences. It relies on other things, too, such as diction, the mot juste. But the wish to separate accurately, to put different things into different cells, connects, at least in Adler’s case, to an actual grammatical usage. When she launches into one of her riffs, when she begins listing, or when she describes a phone call, her work suggests the old definition of thought as collecting and dividing: cutting between concepts as a good butcher slides his knife along the natural joints…

… In Writing Degree Zero, Roland Barthes described the “loneliness of style.” It is “the writer’s ‘thing,’ ” he wrote, “his glory and his prison, it is his solitude.” Adler is meticulous about her style. During her time as a film critic, she lost days to dealing with New York Times editors, who kept changing her prose, making edits that she had already thought of and decided against (“there was rarely the conception that in doing sentences a writer chooses among options,” she wrote about the experience). Her pieces for the Times sometimes appeared without her having seen the final version. She was happier at The New Yorker, where, though they might propose significant changes, the magazine’s editors made sure “not to attribute to the writer a single word, or a single cut, or a single mark of punctuation, which the writer had not seen and, in some sense, approved.” She has always cared about those things. For without the grand plot of a detective novel, without the large, vivid characters of Austen or Dickens, what Adler has is style. It is her solitude, her glory, her thing…

Eminently worth reading in full: “In the Matter of the Commas” (in The American Scholar).

* Renata Adler, Pitch Dark

###

As we punctuate with passion, we might send farsighted birthday greetings to (a somewhat more parsimonious, but still deft, user of commas) Alexander Herzen (Aleksándr Ivánovich Gértsen); he was born on this date in 1812. A Russian thinker and writer, he was a key inspiration for agrarian populism and Russian socialism. With his writings, many composed while exiled in London, he attempted to influence the situation in Russia, contributing to a political climate that led to the emancipation of the serfs in 1861. Perhaps most notably, he published the important social novel Who is to Blame? (1845–46); his autobiography, My Past and Thoughts (written 1852–1870), is considered one of the best examples of that genre in Russian literature.

source

“Lukewarm acceptance is much more bewildering than outright rejection”*…

Every week, Sam Circle reads the New Yorker– closely– and publishes a wonderful review of the contents of each issue in two parts: the primary editorial and the poetry and cartoons. In their most recent missive, the “Random Pick” (an article from the archive) was “This Year’s Model” by Michael Kelly (June 17, 1996)…

Who’d have guessed that the most blistering take I’ve read on the Democrats’ current travails would be something a centrist wrote in the ‘90s? I have a general sense of Clinton’s deal, but given that I was four when he left office (I know, I know) the details aren’t visceral for me, and it’s hard to know how literally to take leftists when they call him a social conservative. But [while] I wouldn’t exactly call [Kelly] trustworthy in general (here’s Tom Scocca with a blistering and definitive posthumous takedown), I at least grant the trust of contemporaneousness when he says Clinton is, “on social issues,” running “to the left of Pat Buchanan but to the right of, say, George Bush”. It’s sick that Kelly’s issue with Clinton claiming he’s going to gut welfare and put far more cops on the streets is that he maybe can’t be trusted to actually do so; it doesn’t matter, though, because Kelly’s analysis is still sharp, and in many ways Clinton can be seen as a predecessor of Trump: “You vote for Clinton, and who knows what you’ll get? Maybe he’ll turn again – back your way.” There are no principles, there are only deals; it’s a politics of nihilism loosely cloaked in a politics of populism. And centrists still push this “we’re just following the polls” message. This is an uneasy glimpse of the past, clarified by the horrors of the present…

The legacy of the “centrist urge” and faux populism in the 90s…

See also: Rebecca Solnit’s “Stop glorifying ‘centrism’. It is an insidious bias favoring an unjust status quo” (source of the image above)

* “I must confess that over the past few years I have been gravely disappointed with the white moderate. I have almost reached the regrettable conclusion that the Negro’s great stumbling block in his stride toward freedom is not the White Citizen’s Counciler or the Ku Klux Klanner, but the white moderate, who is more devoted to “order” than to justice; who prefers a negative peace which is the absence of tension to a positive peace which is the presence of justice; who constantly says: “I agree with you in the goal you seek, but I cannot agree with your methods of direct action”; who paternalistically believes he can set the timetable for another man’s freedom; who lives by a mythical concept of time and who constantly advises the Negro to wait for a “more convenient season.” Shallow understanding from people of good will is more frustrating than absolute misunderstanding from people of ill will. Lukewarm acceptance is much more bewildering than outright rejection.” – Martin Luther King, Jr., “Letter from Birmingham Jail”

###

As we take stock, we might recall that it was on this date in 1862 that Congress passed the Act Prohibiting the Return of Slaves, effectively nullifying the Fugitive Slave Act of 1850 and setting the stage for the Emancipation Proclamation.

source

“It turns out that we’re actually capable of something other than neoliberalism and actually we’re really capable of enjoying ourselves more than we do under neoliberalism”*…

… but the path from here to there, the estimable Brad DeLong warns, could be overcast. In notes for his lectures to his Econ 135 class at Berkeley (“The History of Economic Growth,” shared in his terrific newsletter, Grasping Reality) he begins with an explanation of neoliberalism [also explained here– source of the image above], then considers what might be next…

So what is coming after neoliberalism?

First, one thing that is coming, at least here in America, is renewed or perhaps novel attention to places. Places have never been important in American identity. American identity has, instead, long been defined by a focus on mobility and opportunity. Americans are people who have moved to new places—undertaken errands unto the Wilderness—precisely because of the mistakes being made in and the limitations circumscribing their choices where they were. Americans are people who have abandoned some Old World because of its mistakes, and have moved to a New World to remake themselves and make a new society that will at least make different mistakes. The promise of more abundant resources and the chance to build a better life has driven this pattern of migration and reinvention. Thus the advice given to those who find their birth-region constraining or insufficiently prosperous has always been “go west!”: move to opportunity.

My Richardson ancestors were farmers in the hilly, rocky terrain of New England in the 1840s. Farming the land was difficult. To say that New England soil is “stony” is to greatly understate the case, as you can see even today from the ubiquitous stone walls found throughout New England all built from rocks that had to be removed from the fields before farming could even begin.

The Richardson family decided to leave New Hampshire and traveled down the Ohio River to St. Louis, where they established a pharmaceutical company: the Richardson Drug Company. The family story is that they specialized in cocaine—legal at the time, and their cocaine products were very low concentration, nothing like lines or crack. But, still, my ancestors became the very first cocaine pushers west of the Mississippi in St. Louis. The company was quite successful for two generations. Then, one New Year’s Day, a catastrophic fire destroyed their chemical plant. The fire department was, the story goes, slow to respond, as they were recovering from New Year’s Eve. And how does a catastrophic fire start when the plant is entirely shut down for the holiday? I am suspicious of my ancestors.

Rather than rebuild the plant, the Richardsons opted to take the large insurance settlement and shift their focus to banking. The course of the Richardsons is thus a very American story: change who you are and what you are doing and where you are doing it, several times over the course of even a few generations.

The Neoliberal Order was about capitalism but it was also about freedom. And one aspect of this freedom was freedom to successfully organize to resist being dominated by the behemoths of the New Deal Order: Big Government, Big Business, Big Labor, and also Big Cultural Expectations. The assumption that your husband should get a job with a large corporation and commute by car as you moved to suburbia and that you alone should raise the children was an essential part of the New Deal Order. And it called forth a middle-class feminist rebellion. The assumption that Blacks should largely stay in their place and be happy with slow advances toward equal rights and a small share of the benefits from social-insurance programs was an essential part of the bargains in the 1930s that formed the New Deal Order. The Black Civil Rights movement was not in itself neoliberal, but was an expression of the underlying anti-system anti-bureaucracy current. And with respect to land-use planning—Big Government bureaucrats should not be able to assist Big Finance money and Big Business bulldozers to order you around and bulldoze and “renew” your community. It was individual unbureaucratic entrepreneurship that was supposed to be beautiful. Hence NIMBYism (Not In My Backyard-ism) as we know it today is an important piece of the Neoliberal Order, as it actually was on the ground.

Consider San Francisco’s Embarcadero Freeway, an 8-story, 90-foot high structure that blocked views of the ocean and bay. Residents preferred to maintain the open views rather than prioritize faster commutes for drivers from Marin County. This was seen as a victory for rational, people-centered development at the time. And the post-1989 earthquake removal of the initial parts of the Embarcadero Freeway was a huge win—it resulted in a much more pleasant and open waterfront area for residents and visitors to enjoy.

But in the long run NIMBYism has been a disaster. Berkeley houses no more people now than it did fifty years ago. So housing prices have skyrocketed, and the guy who runs the Little Farm Children’s Center in Tilden Park has to commute from beyond the Altamont Pass.

NIMBYism killed America’s tradition of moving to opportunity stone dead. This has been a very powerful if indirect cause of rage against The Neoliberal Order Machine. Thus the growing call for place-based policies to make opportunity move to where people are, instead of assuming people will move to opportunity. The Polanyian right to the land—to keep Schumpeterian creative-destruction from destroying your community as a side-effect of its pursuit of profit—is and will take a more prominent role in whatever comes after the Neoliberal Order.

Second, the “after” will include explicit industrial policies. The Neoliberal Order was about hyperglobalization. Under the Neoliberal Order it was assumed that free trade and laissez-faire policies were beneficial for all. They were beneficial for the Global North as they heightened the concentration of high-value and high-externality activities like science, engineering, and worthwhile manufacturing within itself. And they were beneficial for the Global South because only the threat that economic activity and talented people would leave could curb the predatory instincts of Global South governments. The concerns of economists like W. Arthur Lewis that trade in a globalized market on terms increasingly tilted against primary products actually developed the fact of underdevelopment were pushed to one side.

But now the assumption that free trade works to concentrate high-value and high-externality activities like science, engineering, and worthwhile manufacturing in the United States is very much in doubt. The Biden administration’s CHIPS Act signals the end of the belief that the global market is working in America’s favor, and a shift away from implicit acceptance of that market’s inequities. Instead, there is now a demand for more explicit industrial policies as an alternative.

Third, the “after” will include a strong demand for champions of the people. There is growing recognition that neoliberalism has led to an unfair domestic plutocracy. The 2008 Republican presidential ticket was almost one of individuals who collectively owned 20 houses—John McCain owned 12, and Mitt Romney owned 8. Political advisors felt that such a pairing would make the ticket look ridiculous, foreclosing the choice of Romney, and so they prevailed on McCain to choose the very odd Alaska Governor Sarah Palin instead.

What to do about plutocracy, where there is a growing belief that the system is working not for the people but for the super-rich and for their rootless cosmopolite allies and clients? Power requires countervailing power. Hence what is needed is someone powerful to vindicate the interests of the common people, rather than of some privileged élite: a strongman to disrupt the status quo and the inertia of “business as usual”.

It has never been the case that the “strongman” has to come from the people. Indeed, often in history a plutocrat, oligarch, or aristocrat has been preferred—a “class traitor” as other members of Harvard’s Porcellian Society whispered about their fellow member, New Deal President Franklin Delano Roosevelt. The idea is that only someone who has thoroughly benefitted from being in the system and knows it inside and out will know enough about its vulnerability to be able to disrupt it.

Analogously, consider Andrew Jackson. He positioned himself as a defender of the common people against the system—land speculators, Philadelphia financiers, and corrupt politicians who together made sure that the people could not prosper as America grew. Jackson presented himself as an outsider who would protect the interests of the “Kentucky frontiersmen” against the domestic élite, even though he himself was no true frontiersman.

Indeed, the earliest examples of strongman politicians overthrowing existing oligarchic systems to vindicate at least the short-run interests of a broader “people” come from the early days of Classical Hellenic civilization. Peisistratos, Tyrant of Athens in the -500s, is the prime historical example. The Tyrants abolished debt slavery, canceled the debts of the overindebted, and redistributed land more equitably—paving the way for the establishment of Hellenic democracy, which was a very attractive civilization as societies of domination went in those days.

Unfortunately for us, the champions of the people being chosen today appear more fascist than populist—more interested in telling people what to do to make them followers to burnish the glory of the leader than in lifting the burdens from the people by cancelling the debts and redistributing the land—and more kleptocrat than plutocrat, with the leader’s skills more in running a con game than in understanding the workings of the system.

Fourth, what is coming after the Neoliberal Order appears to be a politics of fear: fear of the diverse, fear of the woke, fear of the other—whatever the other is, people who seem strange and weird—and fear of the rootless cosmopolite.

In the last analysis, the Neoliberal Order fell because it did not deliver the goods. Free markets and largely ineffectual gestures at freeing-up individual autonomy from bureaucracy were not enough to create a society where people felt at home, even if there was a great expansion of individual freedom to choose elsewise than commanded by formerly-dominant social norms. But the failure of the past Order did not in itself bring a new one into existence. In this sense we are in a similar period of uncertainty to that of the late 1920s and early 1930s. Back then, before he died in Mussolini’s jail, the Marxist thinker Antonio Gramsci observed: “The Old Order is dying, and the New Order appears perhaps to be stillborn: now is a time of monsters”…

Oh, to be able to go back to school… Eminently worth reading in full: “Neoliberalism & After,” from @delong.bsky.social. See also the notes from a proximate lecture: “Post-2010 “Polycrisis”: Culture, Communications, Politics, & War.”

* “It turns out that we’re actually capable of something other than neoliberalism and actually we’re really capable of enjoying ourselves more than we do under neoliberalism. It feels that if neoliberalism is first about privatizing desire and imagination before the economy, then we’re in this process of publicizing it again.” – Rebecca Solnit

###

As we fumble with the future, we might recall that it was on this date in 1968 that 60 Minutes, which had premiered two months earlier, introduced its trademark “ticking stopwatch” opening logo/transition. 60 Minutes is, of course, the most-watched television news show in history.

Since near the show’s inception in 1968, the opening of 60 Minutes has featured a stopwatch. The Aristo (Heuer) design first appeared in 1978; a later version was used from 1992 to 2006 (its Square 721 typeface was changed in 1998). On October 29, 2006, the background changed to red, the title text color changed to white, and the stopwatch was shifted to the upright position. Source

“A republic, if you can keep it”*…

Nathan Gardels on one of the deepest issues at play in the social and political sphere in the U.S. and around the world…

It is a mark of just how deep the crisis of governance across Western democracies has become that conflict irresolvable through political competition is giving way to the reconsideration of founding constitutions and the institutions they invest with legitimacy.

At its heart, this crisis is about trust. As the political scientist Francis Fukuyama has argued, “Belief in the corruptibility of all institutions leads to the dead end of universal distrust. American democracy, all democracy, will not survive a lack of belief in the possibility of impartial institutions; instead partisan political combat will come to pervade every aspect of life.” And so it has.

The ongoing populist surge of recent years did not cause the crisis. It is a symptom of the decay of democratic institutions that, captured by the organized interests of an insider establishment, failed to address the dislocations of hyper-globalization, the disruptions of rapid technological change and the attendant creep of widening cultural cleavage. Too many were left behind and struggled while others prospered and played.

Adding danger to decay, the fevered partisans of populism are intent on throwing out the baby with the bathwater, assaulting the integrity of the very institutions which protect republics from themselves through checks and balances, or that are critical to the fair administration of complex societies. The rebellion against a moribund political class has become a revolt against governance itself and the infrastructure that goes with it.

Populists who fashion themselves as tribunes of the people have never met independent and impartial institutions they can happily abide. Believing they embody the majority will, they portray any constraint on their power as a contrivance by elites to keep the masses down. When cemented with cultural resentment against those at the top who look down on the unsophisticated rabble living in the sticks and outside the fashionable status sphere, anti-elitist sentiment has enough truth value to stick.

We’ve seen versions of this over recent years where the previous governments in Poland and Brazil, as well as the present government in Israel, have sought to politicize the top courts and limit their independence from the powers that be. Hungary under Viktor Orbán, an outright proponent of illiberal democracy, has done the same, seeking further to stifle independent media, think tanks and civil society organizations for good measure.

In Mexico, Claudia Sheinbaum, the president-in-waiting elected by a landslide earlier this year, has pledged to continue pursuing President Andrés Manuel López Obrador’s plan for popular election of that nation’s Supreme Court, thus making its slant coincide with the interests of the ruling party. Sheinbaum, like her predecessor, is also bent on disempowering the independent electoral commission that oversees the polls and certifies voting outcomes.

With the U.S. Supreme Court already dominated by ultra-conservative judges, partisans of Donald Trump have turned their attention to slashing the powers of what they call “the administrative state” — those agencies with the discretionary authority under legislative mandate to regulate private sector activities in realms from environmental impact to food and drug safety to publicly traded securities to the monopolistic conduct of large companies. Most of these agencies have been in place since the early 20th century as the Progressive Era’s response to the vast inequality, child labor, unsanitary industry, crony corruption and robber barons of the unregulated Gilded Age.

The aim of modern-day populists is to both diminish and politicize the regulatory technocracy to fit their agenda. The famous Project 2025 plan prepared by the Heritage Foundation in anticipation of another Trump presidency has gone so far as to propose the abolition of the National Oceanic and Atmospheric Administration — the key body monitoring climate change, which they don’t believe in. My colleague Nils Gilman aptly calls this endeavor “institutional vandalism.” [See here for a taste of Gilman’s sharp thinking on the more general issue at play.]

Though the Trump campaign has sought to distance itself from the details of Project 2025, which may scare sensible voters in the runup to the November election, few have any doubts that it provides the essential roadmap for action if Republicans come to power.

Following verdicts to overturn Roe v. Wade on abortion, blunt the scope of regulatory agencies and codify presidential immunity, the realization of what a stacked Supreme Court means has prompted President Joe Biden to engage the battle over institutions head on.

As his last stand after bowing out of the presidential race, the president is embarking on a quixotic quest to undo the impact of recent rulings and seek a constitutional amendment to reform how the Supreme Court works. First, arguing that “no one is above the law,” he would seek to overturn the recently granted presidential immunity and impose a “binding code of conduct” with strict ethics guidelines prohibiting political activity by justices and requiring transparent disclosure of gifts.

The core structural change of Biden’s proposal is to get rid of lifetime terms and limit them to 18 years, with staggered appointments every two years (when one of the terms expires) to avoid the enduring sway of justices chosen by one political regime and ideological persuasion. In short, a process which would perpetually unstack the highest court instead of invite and enable its stacking.

This is an uphill battle, for sure, since amending the Constitution would entail a two-thirds vote of both houses of Congress and approval by three-quarters of all state legislatures.

“Defend the institutions” is hardly a rallying cry that will stir the passions of the public in the same way as the instinctive appeal of demagogues who promise simple solutions to complex problems while blaming all misfortune on the world outside or perceived enemies within. But repairing and restoring the integrity of democracy’s infrastructure is the only path back to trust. That is a tall order in the short term.

Popular emotion is the Achilles heel of democracies. Institutions that temper emotion through the cool deliberation of disinterested reason are what make the system work to the benefit of all.

As Fukuyama rightly says, democracies can’t survive without at least a belief in the possibility of impartial platforms for the administration of justice and governance. That proposition will be tested as never before in the battle over institutions in the years ahead…

A challenge to democracy’s infrastructure: “The Battle Over Institutions,” from @NoemaMag, with @FukuyamaFrancis and @nils_gilman.

* Benjamin Franklin’s response to Elizabeth Willing Powel‘s question in 1787: “Well, Doctor, what have we got, a republic or a monarchy?”

###

As we rally around the rudiments, we might recall that it was on this date in 1935 that President Franklin D. Roosevelt signed the Social Security Act, part of his New Deal program that created a government pension system for the retired.

By 1930, the United States and Switzerland were the only modern industrial countries without any national social security system. Amid the Great Depression, the physician Francis Townsend galvanized support behind a proposal to issue direct payments to older people. Responding to that movement, Roosevelt organized a committee led by Secretary of Labor Frances Perkins to develop a major social welfare program proposal. Roosevelt presented the plan in early 1935 and signed the Social Security Act into law on August 14, 1935. The Supreme Court upheld the act in two major cases decided in 1937.

The law established the Social Security program. The old-age program is funded by payroll taxes, and over the ensuing decades, it contributed to a dramatic decline in poverty among older people, and spending on Social Security became a significant part of the federal budget. The Social Security Act also established an unemployment insurance program [only a few states had poorly-funded programs at the time] administered by the states and the Aid to Dependent Children program, which provided aid to families headed by single mothers. The law was later amended by acts such as the Social Security Amendments of 1965, which established two major healthcare programs: Medicare and Medicaid.

source

Roosevelt signs Social Security Bill (source)