Posts Tagged ‘slavery’
“I still believe that if your aim is to change the world, journalism is a more immediate short-term weapon”*…
In the most recent issue of his “No Mercy/No Malice” newsletter, “The Podcast Election,” Scott Galloway makes the case that “in each election the victor is likely to be whoever best weaponizes an emerging medium.” John Grinspan (curator of political history at the Smithsonian Institution) offers a timely historical example: Long before anyone was accused of being “woke,” the Wide Awakes used new news technology to rapidly construct a national movement…
… when I discuss the deep history of political division in our country, someone in the audience always asserts that we can’t possibly compare past divisions to the present, because our media landscape is doing unprecedented harm, unlike anything seen in the past.
I’m always struck by people’s belief in a placid media landscape in the past, a time of calm before the internet blew everything up. In fact, the most divided period in the history of U.S. democracy — the mid-1800s — coincided with a sudden boom in new communications technologies, confrontational political influencers, widespread disinformation, and nasty fights over free speech. This media landscape helped bring about the Civil War.
The point is not that 21st century media is like the 19th century’s, but that the past was hardly full of the upstanding, rational, nonpartisan journalists many like to believe it was. And at this era’s center, in the campaign that actually led to the war, was a huge, strange, forgotten movement — the Wide Awakes — born from this media landscape and fought out in the newspapers, polling places and, ultimately, battlefields of the nation.
Newspapers had been around for centuries by the 1800s, but as American rates of literacy rose, millions of ordinary citizens became daily news junkies. The number of papers jumped from a few publications in 1800 to 4,000 brawling rags by 1860, collectively printing hundreds of millions of pages each year. They ranged from the snarky, immensely popular New York Herald and the blood-drenched true crime reports in the National Police Gazette to the high-minded abolitionism of The Liberator.
Nearly everyone literate devoured them — from wealthy elites to schoolgirls to enslaved people technically banned from reading. Newspapers published scandals and rumors, riling mobs and sparking frequent attacks on editors — often by other editors. Well into the 20th century, communities were still pulling newspaper presses out of local rivers, having been hurled there by angry mobs.
Ninety-five percent of American newspapers had explicit political affiliations. Many were directly bankrolled by political parties. There was no concept of journalistic independence and nonpartisanship until the turn of the 20th century.
These partisan presses, not the government, even printed the election ballots. Readers voted by cutting ballots from their pages and bringing them to the polls. (Imagine if TikTok influencers or podcasters were responsible for administering elections.)
The telegraph may seem old-timey today, but with its introduction in the 1840s, Americans could suddenly disseminate breaking news across huge territories along electrical wires. It allowed people to argue the issues nationwide, long before the internet, television, or even radio.
Americans became a people by arguing politics in the press. When politics was local, the major parties had avoided discussing slavery, taking what Abraham Lincoln mocked as a “don’t care” attitude. But now that Maine could debate with Texas, the topic shot to the forefront. By the 1850s, Northerners were digesting its evils daily.
The National Era — an abolitionist paper in Washington — first printed Harriet Beecher Stowe’s hair-raising Uncle Tom’s Cabin, by far the most influential antislavery novel in history. Meanwhile, the radical pro-slavery magazine De Bow’s Review, based in New Orleans, spread a maximalist vision of expanding slavery far and wide. Americans living thousands of miles from each other could argue about the issue, and the only gatekeepers were editors who profited from spreading often legitimate outrage.
It’s fitting, then, that the Northern pushback to slavery’s expansion came from the 19th-century equivalent of “very online” young newspaper readers. Early in the 1860 election, a core of young clerks in Connecticut formed a club to help campaign for the antislavery Republican Party. They happened to live in the state with the highest literacy rates and huge newspaper circulations. So when a local editor wrote that the Republicans seemed “wide awake” in the campaign, the boys named their club the Wide Awakes.
With militaristic uniforms, torch-lit midnight rallies, and an open eye as their all-seeing symbol, a new movement was born, which I chronicle in my recent book, Wide Awake: The Forgotten Force that Elected Lincoln and Spurred the Civil War. Often, their chief issue was not the knotty specifics of what to do about slavery, but the fight for a “Free Press” — unsuppressed by supporters of slavery, South or North.
The Wide Awakes exploded across the national newspaper network. Within months of their founding, young Republicans were forming clubs from Connecticut to California. Most learned how to organize their companies through the papers. They built a reciprocal relationship with America’s press: cheering friendly newspaper offices and harassing pro-slavery Democratic papers’ headquarters. Friendly editors returned the favor, marching with the Wide Awakes and pushing their readers to form more clubs, like the Indiana newspaperman who nudged: “Cannot such an organization be gotten up in this town?”
None of this could be described as independent journalism, but it sure spread a movement. It only took a few months to turn the Wide Awakes into one of the largest partisan movements America had ever seen, believed to have 500,000 members — the equivalent of 5 million today, proportionally speaking.
The same network of newspapers spread fear as well. Readers in much of the South saw the clubs as a partisan paramilitary organization. Wild accounts shared accidental misinformation and deliberate disinformation, pushing the false notion that the Wide Awakes were preparing for a war, not an election. The presence of a few hundred African American Wide Awakes in Boston morphed into claims in Mississippi that “the Wide Awakes are composed mainly of Negroes” who were plotting a race war. A dispersed, partisan media exaggerated such falsehoods like a national game of telephone.
By the time Lincoln won election in November 1860, hysterical editors predicted a Wide Awake attack on the South. Secessionist newspapers used fears of Wide Awakes to help push states out of the Union…
… What began in ink was spiraling into lead and steel. Only 16 years separated the introduction of the telegraph from the Civil War. Undoubtedly, the fight over slavery caused that conflict, but the newspapers fed it, amplified it, exaggerated it.
Mid-19th-century Americans lived with an odd combination: an unprecedented ability to spread information, but also a siloed and partisan system of interpreting it. It helped the nation finally reckon with the crimes of slavery, but also spread bad faith, irrational panic, and outright lies. This history can add a needed perspective to today’s political conflicts, so often magnified by social media. In both eras, new technologies supercharged existing political tensions. Yet we can also see from this heated history that political media is less like an unstoppable, unreformable force that will consume democracy, and more like another in a succession of breathtaking, catastrophic, wild new landscapes that must be tamed…
Perspective from our past: “How a 19th-century news revolution sparked activists, influencers, disinformation, and the Civil War,” from @NiemanLab.
Apposite: the Galloway piece referenced above and “The TikTok Electorate” from Max Read… and more fundamentally, “Are Americans too ignorant and gullible to self-govern?” a consideration of a century-old debate between Walter Lippmann and John Dewey, from the estimable Howard Rheingold.
* Tom Stoppard
###
As we muse on media, we might recall that it was on this date in 1969 that journalist Seymour Hersh filed the story that (the following day) broke the news of the My Lai massacre to the American public. At least 347 (and perhaps as many as 504) Vietnamese civilians, almost all women, children, and elderly men, were murdered by U.S. soldiers — the largest massacre of civilians by U.S. forces in the 20th century.
Hersh had tried the day before to place a cautious, conservative version of the piece, but it was rejected by Life and Look magazines. He turned to his friend David Obst, who ran the anti-war Dispatch News Service and who placed a more candid version in 35 papers (including the Washington Post and the Boston Globe); it ran in those papers the following day. Initial reaction was muted, as the press was focused on a massive anti-war demonstration in Washington scheduled for November 15. But the story spread, prompting global outrage and fueling domestic opposition to U.S. involvement in the Vietnam War. Hersh’s coverage of the atrocity earned him the 1970 Pulitzer Prize for International Reporting.

“Not in utter loneliness to live / Myself at last did to the Devil give!”*…
With an excerpt from his new book, Devil’s Contract: The History of the Faustian Bargain, Ed Simon on “the most important story ever told”– the story of Humanity’s transactional relationship with evil…
The legend of the Devil’s contract is the most alluring, the most provocative, the most insightful, the most important story ever told. It concerns a humanity strung between Heaven and Hell, the saintly and the satanic; how a man could trade his soul for powers omnipotent, signing a covenant with the Devil so that he could briefly live as a god before being pulled down to Hell. Frequently associated with Christopher Marlowe’s Doctor Faustus, that Elizabethan play wasn’t the origin of the myth, but his is certainly a sterling example of that eternal script. Yet long before that Renaissance play and long afterwards, we can find the inky traces of Faust’s damned signature in a multitude of works both high and low, canonical and popular. More disturbing than that is the way that the Devil’s hoof-prints can be found across the wide swath of history, in our willingness to embrace power and engage in exploitation, to summon self-interestedness and to conjure cruelty…
…
… Tamburlaine the Great’s iconoclasm and The Jew of Malta’s irreverence aside, no work of sacred heresy in Marlowe’s oeuvre is as profound as Doctor Faustus. His quisling scholar selling his birthright for the pottage of trickery and illusion may be modernity’s operative metaphor, but Marlowe was hardly the originator of the myth. As you’ll read in the chapters ahead, Marlowe adapted the historical Johann Faust from German folkloric tradition, though the myth of a contract with Satan existed centuries before that unfortunate alchemist first crossed potassium nitrate with sulfur. Nor of course was Marlowe’s rendition the final word, as thousands of permutations of the basic story have been produced over the half-millennium, from Goethe to the musical Damn Yankees, Thomas Mann to the Dixie-fried pablum of the execrable Charlie Daniels Band number “The Devil Went Down to Georgia.” High culture like Franz Liszt’s Faust Symphony and Gustav Mahler’s Symphony No. 8; pop culture from the comic book Ghost Rider to the Jack Black flick Tenacious D in the Pick of Destiny.
“The figure of Faust is—after Christ, Mary, and the Devil—the single most popular character in the history of Western Christian culture,” writes Jeffrey Burton Russell in his classic Mephistopheles: The Devil in the Modern World. And of those characters, Faust is the most fully human to us, in his arrogance and his failure, his negotiations and his capitulations, in the whole litany of abuse which the cankered soul is capable of inflicting upon itself. Russell’s contention is far from hyperbole, and amending the word “character” to “narrative,” I’d say that there are few archetypal scripts in our culture as essential as the legend of a man selling his soul to the Devil. Thousands of works of literature and film, music and art, grapple with the bargain whereby somebody trades what’s most human for power or wealth, influence or knowledge. Only the myth of Adam and Eve being cast out of Eden competes with Faust in terms of influence, and that story is arguably an early variation on the Devil’s contract…
… though it is ostensibly a history, and this narrative moves onward rather chronologically, I prefer to think of the story it tells as being about a character who is outside of time, who lives parallel to past, present, and future. An eternal story. Because what this book is concerned with are the implications— culturally, politically, theologically—of these highly symbolically charged narratives concerning the abjuration of a soul, of the ceding of what’s intrinsic to us, of the capitulations and negotiations which make up any failed life, which is to say every life. More than a history, then, Devil’s Contract is an account of what it means to be human in all of our failings.
Increasingly an account of humanity right now. For all the legend’s archaicism, the muttered Latin and the alchemical conjuration, Faust’s story has always been estimably modern, perhaps the first modern story. Unlike Adam and Eve, with their inscrutable Bronze Age story composed in an idiom so ancient and foreign that centuries of theologians have disagreed on what the implications of each facet of the tale might mean, the details in the Faust legend are inescapably of our time. This is, after all, the story of a contract. The dénouement of most versions of the Faust story involves the signing of a legally binding document, an experience foreign to the authors of Genesis but replete in our own lives, whether interacting with human resources or clicking on an agreement with our phone company. Faust’s tale may deal in the numinous and the transcendent, but it’s also about bureaucracy and paperwork, our contemporary hell and its sacrament, respectively. We recognize Faust in a manner that no character in the Bible can ever be our contemporary…
…
… Marlowe staged his play at the very beginning of what is increasingly being called the Anthropocene, the geological epoch in which humanity was finally able to impose its will (in an almost occult manner) upon the earth. There are costs to any such contract, as the wisdom of the legend has it, so that it’s worth considering after five centuries of human domination of the planet that we might now be facing our own collective appointment at Deptford. We seem to finally be facing the final act, the apocalyptic tenor of our times, from climate change to nuclear brinkmanship making the continued survival of humanity an open question, our sad predicament the result of hubris, and greed, and vainglory. It may be appropriate to rechristen this age the Faustocene. Because whether or not the Devil is real, his effects in the world are. When it comes to “truth” and “facts,” the two words are not synonymous, and I wouldn’t at all be surprised if I could make out the smoke of some devilish chimera beyond the neon-line of the Rose Theater, deep within a darkness so all-encompassing that not a squib of light is capable of escaping…
“A Deal With the Devil: What the Age-Old Faustian Bargain Reveals About the Modern World” in @lithub.
See also: “You Are Equal To The Spirit You Understand,” Nathan Gardels‘ consideration of the lessons in Goethe’s Faust, in @NoemaMag.
* Johann Wolfgang von Goethe, Faust
###
As we reconsider our contracts, we might recall that it was on this date in 1834 that slavery was abolished in the British Empire, as the Slavery Abolition Act 1833 came into force (though it remained legal in the possessions of the East India Company until the passage of the Indian Slavery Act, 1843).

“Sometimes we drug ourselves with dreams of new ideas”*…
Further to last week’s piece on Samuel Arbesman‘s “incremental humanism,” Jennifer Banks unpacks the differences between the two leading “flavors” of humanism afoot today: one akin to Arbesman’s; the other, not so much…
In 2003, Edward Said wrote in the wake of the terrorist attacks of 11 September 2001 and in the context of the United States’ war on terror that ‘humanism is the only, and, I would go so far as saying, the final, resistance we have against the inhuman practices and injustices that disfigure human history.’ The moment, he felt, was ‘apocalyptic’, and the end was indeed near for him; he died of leukaemia later that year.
So why was it humanism that he held to so tightly as war and sickness cinched time’s horizon around him? Humanism, an intellectual and cultural movement that emerged in Renaissance Europe emphasising classical learning and affirming human potential, had been subject to decades of critique by the time Said was writing this. Among its many detractors were postcolonialists who argued that humanism’s elevation of a particular kind of human – Eurocentric, rational, empiricist, self-realising, secular and universal – had provided thin cover for the exploitation of large swaths of the world’s population.
But Said, one of the founders of postcolonial studies, hadn’t given up on the term, despite its imperialist entanglements. He imagined a humanism abused but not exhausted, an –ism more elastic and plural, more subject to critique and revision, and more acquainted with the limits of reason than many humanisms have historically been. Humanism, he argued, was more like an ‘exigent, resistant, intransigent art’ – an art that was not, for him, particularly triumphant. His humanism was defined by a ‘tragic flaw that is constitutive to it and cannot be removed’. It refused all final solutions to the irreconcilable, dialectical oppositions that are at the heart of human life – a refusal that ironically kept the world liveable and the future open.
At stake in his defence was not only the survival of the humanistic fields of study he had devoted his academic career to, but the survival, freedom and thriving of actual people, including those populations that humanisms had historically excluded. Various antihumanisms had gradually been eroding humanism’s stature within the academy, but it was humanism, he believed, with its positive ideas about liberty, learning and human agency – and not antihumanist deconstructions – that inspired people to resist unjust wars, military occupations, despotism and tyranny.
Humanism, however, fell further out of vogue in the two decades that followed. Humanities enrolments dropped dramatically at universities, and funding for departments like comparative literature, women’s studies, religion, and foreign languages got slashed. Increasingly, however, it wasn’t just the inadequacies of any –ism that were the problem. It was the subject at the heart of humanism that came under widespread attack: the human itself. Given that history could be read as a catalogue of human greed, blindness, exclusions and violence, the future seemed to belong to someone – or something – else. The humane in humanism seemed to be missing. Alternative ideologies like antihumanism, transhumanism, posthumanism and antinatalism seeped from the fringes into the mainstream, buoyed by their conviction that they might offer the planet or even the cosmos something more ethical, more humane even, than humans have ever been able to. Humanity’s time, perhaps, was simply up.
In his book The Revolt Against Humanity: Imagining a Future Without Us (2023), the American critic Adam Kirsch identifies the contested line between humanists and non-humanists as one of the defining faultlines of our political and cultural moment. The debates between them can feel merely semantic, the stuff of graduate seminars, but the revolt against humanity is likely to have major implications for our future, Kirsch argues, even if its prophecies about our imminent extinction don’t come true. ‘[D]isappointed prophecies,’ he writes, ‘have been responsible for some of the most important movements in history, from Christianity to Communism.’ Anyone committed to the prospect of a liveable future should pay close attention to what’s going on here.
…
I might have never put too much stock in a term like humanism if I had not read around in the transhumanist literature. I came to this work while researching a book on birth that explored the relationship between birth, death and the question of a human future. Does humanity have a future? Do we deserve one? What will that future look like? The answers to those questions will be determined by many forces – technological, economic, political, environmental and more – but also by how we experience and think about our own births and deaths. Despite large areas of convergence, humanists and transhumanists can end up with wildly different visions of our future, based on dramatically different understandings of birth and death, as one can see by comparing how a novelist (Toni Morrison) and a philosopher (Nick Bostrom) have explored these themes. Morrison offers us a prophetic celebration of Earthly, ongoing, biological generation and a future that allows for human freedom, while Bostrom points us toward a highly controlled surveillance world order, organised around a paranoid fear of human action, and oriented toward the pristine emptiness of outer space. Which future, we should ask ourselves, would we willingly choose?
…
Do read on for her analysis: “What awaits us?“, from @jenniferabanks in @aeonmag.
Apposite: “The Philosophy Of Co-Becoming” from @NoemaMag and “To pay attention, this is our endless and proper work,” @LMSacasas on Illich.
###
As we ponder possibility, we might spare a thought for Dandara, “the Warrior Queen” of the Quilombo dos Palmares, a settlement of Afro-Brazilian people who freed themselves from enslavement during Brazil’s colonial period. She was captured by colonial authorities on this date in 1694 and committed suicide rather than be returned to a life of slavery.
“It is easy to show that the fears of the early 1770s about the East India Company in America were unfounded; it is not easy to show that they were also unreasonable”*…

Last Saturday was the 250th anniversary of The Boston Tea Party, a protest against the Tea Act (“no taxation without representation”) and an accelerant of colonial support for the American Revolution. But as Deb Chachra and Robert Martello explain, there’s more to the story than we typically hear…
It’s a familiar story to many Americans. On the evening of December 16th, 1773, Massachusetts patriots, including some disguised as ‘Mohawk warriors’, boarded three vessels in Boston Harbor and dumped thousands of pounds of tea into the sea. This act of civil disobedience in protest of heavy-handed British colonial policies, including taxation and monopoly protections, is what we now know as “The Boston Tea Party.”
But behind this story lies another, of where that tea came from and why. For the American patriots, the tea itself was tangible evidence of the British government’s willingness to put profit and imperial control over the well-being, and even the lives, of its colonial subjects.
That tea was the property of the British East India Company which, in the years leading up to the American Revolution, was a massive, highly profitable corporation that held trading rights all over south and east Asia, including what is now India, Pakistan, Bangladesh, Myanmar, and China. As Nick Robins describes in his book The Corporation That Changed the World, those rights were acquired by systematically undermining local governance, and were enforced by the East India Company’s huge private army, which it used to seize and control territory. In 1757, Company soldiers fought and won the Battle of Plassey against the Nawab of Bengal and his French allies. In its wake, they installed a series of rulers who implemented a treaty in which the East India Company was granted the diwani, the right to collect taxes, while the puppet-Nawabs nominally remained responsible for political and judicial oversight, called the nizamat.
In the 18th century, Bengal was a prosperous textile hub, and its skilled workers were producing a wide array of some of the finest fabrics in the world. Selling these valuable goods had already generated enormous profits for the East India Company, and now taxation provided another revenue stream. Then, in 1768, a severe drought led to crop failures. Even as the Bengalis began to go hungry, company officers continued to collect taxes – at the point of a bayonet if necessary. The East India Company made virtually no provision for famine relief, and after decades of weakened local authority and with tax monies sent off to fill company coffers in London, there was little on-the-ground financial and administrative capacity to address the crisis. Worse, company agents saw hunger and starvation as money-making opportunities, and bought up grain in order to sell it at an enormous profit. Had the available food been redistributed, more residents would have survived. Instead, farms went unplanted, the drought was followed by flooding, disease spread through the weakened populace, and the situation went from dangerous to disastrous. Contemporary estimates put the death toll of the Great Bengal Famine of 1770 at between seven and ten million people – between a quarter and a third of the population.
The enormous human suffering that resulted from the actions of the East India Company, and the Company’s depraved indifference to it, were so horrifying that, as historian William Dalrymple describes, they created the first whistleblowers. Employees wrote to publications in London to detail the atrocities they had observed in Bengal. Their accounts prompted an enormous outcry and ongoing news coverage, with magazines and newspapers carrying cover-to-cover stories on the actions of the East India Company and the response of the British government. And the uproar was not limited to England – print publications routinely crossed the Atlantic… By the time of the Boston Tea Party, the Massachusetts colonists had been discussing, for years, this brutal demonstration of what can happen when a community lacks a voice in their own governance. They learned that even in times of direst need, a colony’s domestically produced resources can be extracted by outsiders in the name of greater profits. Diwani without nizamat is, quite literally, taxation without representation.
The colonists had also begun to experience the economic fallout of this crisis. Two years into the famine, and as a predictable consequence of the humanitarian disaster they were largely responsible for creating, the East India Company’s tax and trade revenues had collapsed. This precipitated a credit crisis in British banks that reverberated across the Empire, including the American colonies. But the East India Company did have some ready assets it could sell to raise much-needed cash: its warehouses in London were full of tea from China.
Rather than censure the East India Company, the British Parliament gave them a bailout. In addition to a government loan, the Tea Act of 1773 granted the struggling Company the monopoly right to sell their tea in the American colonies, cheaply and to a captive market, in order to quickly bring in some revenue and stabilize their finances. Parliament also took the opportunity to apply a three-pence tax on the tea to fund imperial oversight and control, including paying for customs inspectors, royally appointed governors, and occupying troops. If the New England colonists allowed this tea to leave the ships and enter the marketplace, this is what their labor would be paying for. No matter how cheap the tea was, it wasn’t worth this.
The Parliamentary response to the Bengali Famine demonstrated how the British Empire’s appetite for revenue could trump any amount of colonial suffering. What’s more, if it could happen in Bengal, what’s to say it couldn’t happen in Boston?…
Motivated by anger, outrage, and fear, the patriots took decisive steps on a moonlit December night in 1773, dumping the hated tea into the harbor while making a point of leaving the ships themselves and the other cargo untouched…
The wages of colonialism: “Tea and Famine,” from @debcha
###
As we commiserate with the Irish, we might recall that the American colonists’ reaction to the East India Company was not the first. Prior to the establishment of the British behemoth in 1600, “companies” were formed and funded (in England, Holland, the Italian City-States, et al.) only for the duration of a single voyage and liquidated upon the return of the fleet– a very risky, all-or-nothing proposition. The English East India Company demonstrated that pooling risk across a larger, ultimately open-ended series of voyages was a more bankable proposition.
Threatened with ruin, their Dutch competitors followed suit, forming their East India Company– United East India Company or VOC– in 1602. It was the first joint-stock company in the world; and as shares in the company could be bought by any resident of the United Provinces and then subsequently bought and sold in open-air secondary markets (one of which became the Amsterdam Stock Exchange), it is sometimes considered to have been the first multinational corporation.
Statistically, the VOC eclipsed all of its rivals in the Asia trade. Between 1602 and 1796 the VOC sent almost a million Europeans to work in the Asia trade on 4,785 ships and netted for their efforts more than 2.5 million tons of Asian trade goods and slaves. By contrast, the rest of Europe combined sent only 882,412 people from 1500 to 1795, and the fleet of the English (later British) East India Company, the VOC’s nearest competitor, was a distant second to its total traffic with 2,690 ships and a mere one-fifth the tonnage of goods carried by the VOC. The VOC enjoyed huge profits from its spice monopoly and slave trading activities through most of the 17th century. At its peak, the VOC was worth almost $8 trillion in today’s dollars.
On this date in 1603, its first fleet, under Admiral Steven van der Haghen, departed for the East Indies.
“What people these days call ‘Vibes’ is a smell, a taste of the soul”*…
Up? Down? Better? Worse? What’s actually going on in our economy? Noah Smith on the asymmetric warfare going on around that question…
As we gear up for election season, a big debate is whether the U.S. economy is doing well or not. Biden supporters point to extremely low unemployment, falling inflation, and real wages that have started rising again. Biden opponents — including both conservatives and socialists — contend that the inflation of 2021-22 left such a severe scar on Americans’ pocketbooks that low consumer confidence is perfectly justified. Biden supporters counter that since inflation has come down — and was never as severe as in the 1970s — the anger over the economy is just “vibes”.
Basically, the Biden supporters are right; the U.S. economy is truly excellent right now. Inflation looks beat, everyone has a job, incomes and wealth are rising, and so on. But on the other hand, I can’t command people to simply stop being mad about the inflation that reduced their purchasing power back in 2021-22. People care about what they care about.
At the same time, though, I think it’s possible for negative narratives about the economy to take hold among the general populace and distort people’s understanding of what’s actually going on. For example, John Burn-Murdoch of the Financial Times recently found [gift article] that consumer sentiment closely tracks real economic indicators in other countries, but has diverged in America since 2020:
Now this could be because Americans simply care about different things than Europeans; we might simply have started to really really hate interest rates since 2021, while Europeans didn’t. But a simpler explanation is that Americans’ negative sentiment is due to something other than economic indicators. And it’s possible that that “something” is a negative narrative — i.e., vibes…
“Vibes vs. data”
Indeed, as Burn-Murdoch observes in his analysis…
… It seems US consumer sentiment is becoming the latest victim of expressive responding, where people give incorrect answers to questions to signal wider tribal political or social affiliations. My advice: if you want to know what Americans really think of economic conditions, look at their spending patterns. Unlike cautious Europeans, US consumers are back on the pre-pandemic trendline and buying more stuff than ever…
“Should we believe Americans when they say the economy is bad?” (gift article)
But why? Jonathan Kirshner‘s review of Martin Wolf‘s important book The Crisis of Democratic Capitalism suggests an unsettling answer…
The Crisis of Democratic Capitalism is an essential read for its articulation of the perilous crossroads at which the future of enlightened liberal civilization now stands. Wolf argues persuasively that, for all their visible flaws and imperfections, competitive market capitalism and liberal democracy are the best bad systems available for organizing human societies. And each requires the other to thrive—“[b]ut this marriage between those complementary opposites […] is always fragile.” Capitalism has been allowed to run amok, and it has elicited a backlash that threatens democracy…
Wolf’s central argument is that capitalism and democracy are inherently interdependent, yet also often in tension with one another—and managing the balance of that indispensable relationship is akin to walking a tightrope. In traditional autocracies, the economy has been captured by those that control the state, and that control is the basis of their power (which is why they are so reluctant to let go of the reins of authority). Liberal democracies today face the inverse problem: the capture of the state by those that control the economy. This is plutocracy, and aside from the injustice it visits on societies, it is also profoundly dangerous, because in democratic plutocracies (like the United States today), the simmering frustrations of mass polities will at some point lead to the voluntary election of an autocrat: “[I]nsecurity and fear are gateways to tyranny.” Decades of stagnant incomes, rising inequality, and the erosion of high-quality jobs for the middle class and the less-educated have allowed the relationship between capitalism and democracy to become dangerously unbalanced. The Crisis of Democratic Capitalism argues that the fault lies with the failure of public policy to tame the excesses of capitalism; it warns that those excesses will unleash the forces that destroy democracy.
Economic inequality, on the rise for 50 years, has soared to ever greater extremes in recent decades. As Wolf reports, from 1993 to 2015, the real income of the top 1 percent of the population in the United States nearly doubled; for everybody else, over those same years, aggregate real income grew by 14 percent. More pointedly, as the very rich got much, much richer from 2005 to 2014, 81 percent of US households had flat or falling real income—a weighty reminder that we continue to live in a world defined by the Global Financial Crisis and its aftermath…
… the financialization of the economy, especially after the 1990s, and the fortunes amassed from that process, were part and parcel of a larger shift towards “rigged capitalism”—the emergence of which The Crisis of Democratic Capitalism places at the heart of the matter. In a remarkable (and laudable) intellectual evolution, Wolf, who welcomed and celebrated the Thatcher revolution in Britain, and not so long ago penned the book Why Globalization Works (2004), now attributes the crisis of our time to “what Adam Smith warned us against—the tendency of the powerful to rig the economic and political systems against the rest of society.” Superseding a well-ordered market society, rigged capitalism—a toxic brew of developments and practices including financialization, winner-take-all markets, reduced competition, increased rent-seeking behavior (the use of concentrated economic power to extract monopoly profits), tax avoidance and evasion, and the erosion of ethical standards—has led to a widespread loss of confidence in the legitimacy of democracy…
These pathologies run deep, and well below the headlines. The use of political power to undermine competition—which must thrive at the heart of any capitalist society—is an endemic attribute of rigged capitalism. (And it is why we pay higher prices for most things than a “free market” would levy.) Many if not most giant corporations are now monopolies or near-monopolies, a situation that, as any card-carrying professional economist of even the most conservative stripe would agree, generates inefficiencies, rent-seeking behavior, and outright exploitation. Many markets have become shielded, protections reinforced by access to the corridors of power, with wealth extracted from consumers (and workers) in consequence: consider the atrocity of unskilled workers in fast food restaurants being forced to sign “non-compete” clauses, an act of collusive wage suppression.
Rigged capitalism—which yields massive concentrations of wealth for a sliver of largely-above-the-law plutocrats, combined with stagnation and declining opportunities for the majority—leads to a basic political problem: “How, after all, does a political party dedicated to the material interests of the top 0.1 percent of the income distribution win and hold power in a universal suffrage democracy? The answer is pluto-populism.” This is where race, identity politics, and the culture wars come into play. The century-long political hammerlock held by the Democratic Party on the Old South was based on voter suppression and other devices that guaranteed, for working-class whites, greater economic opportunity, access to the legal system, and higher social status than Blacks, in exchange for their political support. Bob Dylan, at 22 years old, saw through this in his song “Only a Pawn in Their Game” (1964)—and nearly 60 years later, that game hasn’t changed much…
… rigged capitalism will nevertheless unleash forces not easily contained—and render liberal democracy unsustainable. As political scientist Rawi Abdelal has argued, “the social fact of unfairness is more important than the material fact of income and wealth distribution.” Endemic corruption, arbitrariness of justice, and fear for future prospects are poisonous to the body politic, undermining shared perceptions of the legitimacy of democratic society. In such settings, past and present, fear, despair, and frustration create the space for charismatic personalist authoritarians who peddle promises of deliverance but who, once in power, consolidate their hold on the state by undermining the institutional constraints on their authority. And so, democracy dies from within.
What is bewildering about the American case is not that it has witnessed the rise of a leader who, as Wolf describes, “not only had no idea what a liberal democracy was but despised the idea,” and who was “instinctively authoritarian”—this, after all, is what pluto-populism conjures. What remains bizarre, however, is that, of all the possible choices, a hedonistic, ethically suspect, narcissistic grifter—who for decades was a signature beneficiary of rigged capitalism—would emerge as the people’s choice. Yet Donald Trump, like the gargantuan Stay-Puft Marshmallow Man from Ghostbusters, has been summoned by a collective subconscious rage to act as a malevolent score-settling agent of destruction…
“Rigged Capitalism and the Rise of Pluto-populism: On Martin Wolf’s ‘The Crisis of Democratic Capitalism’”
All three articles– and Wolf’s book– are eminently worth reading in full.
###
As we ponder populism, we might recall that it was on this date in 1865 that Georgia became the 27th state to ratify the 13th Amendment to the U.S. Constitution, abolishing slavery and involuntary servitude (except as punishment for a crime). Proclaimed on December 18, it was the first of the three Reconstruction Amendments adopted following the American Civil War.
The Emancipation Proclamation (issued in September 1862; effective January 1, 1863) had declared free the slaves held in Confederate-controlled territory (though as a practical matter freedom took years longer). The Thirteenth Amendment abolished slavery throughout the country and assured that it would never be reinstated.