(Roughly) Daily

Posts Tagged ‘colonialism’

“It may be said in truth that man is always susceptible of improvement”*…

Portrait of Thomas Robert Malthus, an influential economist and demographer, seated with a book in his hands.

… a (possibly surprising) observation from Thomas Robert Malthus, a man whose dire predictions of paucity and doom have informed political philosophy and economics since the late 18th century… and given rise to the adjective “Malthusian.” In a review of Deborah Valenze‘s The Invention of Scarcity: Malthus and the Margins of History, Oliver Cussen explores Malthus’s (also possibly surprising) lasting impact…

For​ the late French historian Emmanuel Le Roy Ladurie, Thomas Robert Malthus was an indispensable guide to the agrarian past. Le Roy Ladurie applied Malthus’s argument that population grows faster than subsistence to the archives of Languedoc, where, in the empirical detail of parish registers, cadastral surveys, tax rolls and price series, he perceived ‘the immense respiration of a social structure’ over the course of three centuries. In the 15th century, after the Black Death, the region’s population was at a historic low. Land was left fallow, and villagers complained about the encroachment of wild animals and forests on crops and pasture. Nature was taking its revenge for the great land colonisation movement of the Middle Ages. Civilisation recovered, but growth in a world of limits was ultimately self-defeating. The prosperity of the 16th century soon gave way to famine, drought, war and plague. It was only after modern technology unlocked the productive capacities of the earth that society was able to escape this cycle of expansion, crisis and renewal. Acknowledging his intellectual debt, Le Roy Ladurie pointed out the irony that the ‘Malthusian curse’ should lift just as it was being discovered in England in 1798. ‘Malthus was a clear-headed theoretician of traditional societies,’ he conceded, ‘but he was a prophet of the past; he was born too late in a world too new.’

Yet the spectre of Malthus continued to haunt industrial modernity. No sooner had the Great Exhibition of 1851 encouraged Victorians to embrace material gratification without guilt than William Stanley Jevons began to warn of the imminent exhaustion of the nation’s coal supply. Drawing explicitly on Malthus, Jevons argued that the increased demand on resources from a growing population was forcing mines into deeper and more inaccessible seams. ‘We shall begin as it were to see the further shore of our Black Indies,’ he warned. ‘The wave of population will break upon that shore, and roll back upon itself.’ John Maynard Keynes, who made no secret of his admiration for Malthus, attributed the First World War and the Russian Revolution to overpopulation and global competition for food. The ‘great acceleration’ of the second half of the 20th century, a period of unprecedented energy consumption, economic prosperity and demographic growth, produced its own peculiar versions of Malthusian catastrophism, from the neoliberal to the cosmological (the American scientist Garrett Hardin seriously entertained ‘interstellar migration’ as a solution to ‘the population problem’). When the first edition of the Essay on the Principle of Population appeared in 1798 there were just over ten million people in Britain and life expectancy was under forty. We are a long way from Malthus’s Britain, and further still from Le Roy Ladurie’s Languedoc. Why does a theory of scarcity endure in an age of abundance?…

And at what cost? A fascinating– and all-too-relevant– story: “Prophet of the Past,” from @lrb.co.uk‬.

(Image above: source)

* Thomas Robert Malthus, An Essay on the Principle of Population

###

As we ponder the prospect of progress, we might send chilly birthday greetings to a man who helped breach some of the limits about which Malthus worried: Carl von Linde; he was born on this date in 1842. A scientist, engineer, and businessman, he discovered the refrigeration cycle, invented the first industrial-scale air separation and gas liquefaction processes, and laid the foundation for the development of the refrigerator and ultimately the cold chain— the supply chain that uses refrigeration to maintain perishable goods (meat and produce, pharmaceuticals, and other heat-sensitive goods) in transit, often across the globe.

Black and white portrait of Carl von Linde, a scientist with a beard and glasses, wearing a suit with a bow tie.

source

“What is it about maps? I could look at them all day”*…

Jonn Elledge devotes the current issue of his nifty newsletter to a series of fascinating facts from his wonderful book, A Brief History of the World in 47 Borders (as it’s known in its U.S. edition, as compared to the less qualified title of the U.K. original pictured above)…

… Back in May, the good people at the UK’s leading maps and travel specialist bookshop Stanfords were kind enough to select my new book A History of the World in 47 Borders as their book of the month. And to promote it, they asked me to make a quick video, talking about it…

My initial thought was to list a single fact from each of the book’s 47 chapters, but that, I soon realised, would go on forever and take an absurd number of takes to get right. So in the end I decided on a top 10: that took an absurd number of takes to get right too, and also features me the wrong way round, for some reason, but at least it’s only three minutes long…

… as a special Christmas treat, not to mention shameless attempt to get more of you to buy the book for yourself or a loved one, here are all 47 of the facts I originally chose…

You’ll find tidbits like these:

The oldest known international border was the one between what today we call Upper Egypt and Lower Egypt. We know about it today because, sometime around 3100BCE, someone abolished it.

The Great Wall is not really one wall, but many. And the first Chinese emperor used to kidnap young men to make them build them.

The Open Borders Policies of Genghis Khan basically created the modern world.

In 1884, the great powers agreed to divide up the entire map of Africa without ever visiting. No Africans were in attendance, and one who’d asked for an invite, the Sultan of Zanzibar, was openly laughed at.

There’s a piece of Africa which two countries, Egypt and Sudan, both aggressively claim belongs to the other.

There’s an opera house in which the US/Canada border divides audience from stage.

There’s a coral atoll a thousand miles from Japan which is technically classed as a suburb of Tokyo.

Air traffic control zones cover the entire planet except the Galapagos Islands and the bit of the Arctic where Santa lives.

So much more at “47 Facts from A History of the World in 47 Borders,” from @jonnelledge.bsky.social.

* Bill Bryson

###

As we muse on maps, we might send acutely observant birthday greetings to an astute student of the human animal, anthropologist Margaret Mead; she was born on this date in 1901.  Best-known for her studies of the nonliterate peoples of Oceania, she was 23 when she first traveled to the South Pacific to conduct research for her doctoral dissertation. The book that resulted, Coming of Age in Samoa, was– and remains– a best-seller.

source

“The good of man must be the end of the science of politics”*…

Long lines of people queuing outside the polling station in the black township of Soweto, in the southwest suburbs of Johannesburg, Wednesday, 27 April, 1994. The majority of South Africa’s 22 million voters were voting in the nation’s first all-race elections. (AP Photo/Denis Farrell)

Democracy is an easy ideal to embrace (at least for most). But the devil’s in the details. Mohamed Kheir Omer and Parselelo Kantai review the history of democracy in post-colonial Africa and wonder if it’s not time to revisit some of those “details”…

The first peaceful transfer of power in post-colonial Africa was in Somalia in 1967, when Abdirashid Ali Sharmarke defeated incumbent President Aden Abdullah Osman Daar. The second would only follow a quarter of a century later, when in November 1991 trade union leader Frederick Chiluba defeated incumbent Zambian president Kenneth Kaunda in the country’s first multiparty election since 1972, when single-party rule had been introduced.

For Africa’s Big Men, news of Kaunda’s defeat was yet another signal of what threatened to become Africa’s second Wind of Change after the one that had swept away colonial rule and brought them into power at the end of the 1950s. In the streets of the capitals, the people were in revolt. Dakar, Abidjan, Cotonou, Kinshasa, Yaounde, Nairobi, Harare and several others – all rocked by youth demanding the end of single-party rule and the return of pluralism. Having previously only worried about coups sanctioned and financed in Western metropolises, the dawning realisation that they now had to fear popular revolts – both in the street and at the ballot box – suggested, even to the least paranoid of them, that their former patrons were abandoning them.

[Those Cold War-era autocrats had been] agents of the neocolonial system that had guaranteed the expropriation of Africa’s resources since the moment of flag independence; for them, ‘democracy’ was the ultimate betrayal. Since it was their friends in Washington, London and Paris who had won the Cold War, why were they abandoning their faithful clients? Why was a new dispensation being organised without their participation?…

African governments were forced into accepting political liberalisation – that is, the re-introduction of opposition parties – as part of a set of conditions on balance of payments support, itself necessitated by the structural adjustment austerity programmes initiated in the mid-1980s following the debt crisis of circa 1982. With the Cold War over, a vernacular of “good governance”, “transparency” and “accountability” became the mediating language of relations between the rich OECD countries and their aid recipients in Africa. In many African countries, the adoption of multiparty democracy was mandated by Western creditors as a precondition for continued assistance.

Democracy, therefore, was more a creature of the market than of popular citizen aspirations. Western media commentators referred to the package of conditional aid as “market democracy”. In many cases across the continent, the original campaigners for pluralism found themselves side-lined in favour of a new set of actors with closer links to Western embassies and who espoused reformist visions in line with neoliberal orthodoxy. In time, it would dawn on even the more radical political actors that unless they toed the new line, they would lose their place on the donor gravy train…

[The authors review the history and offer some observations that point in the direction of moving from an inherited one-size-fits-all democracy towards a set of culturally-specific applications of the democratic principle…]

… Consider the Gada system, a traditional socio-political system practiced by the Oromo people in Ethiopia and parts of northern Kenya. It is a complex form of social organisation that governs the political, social, economic, and religious life of the community. This indigenous institution predates many modern forms of governance and democracy, showcasing elements of direct democracy, checks and balances, and the peaceful transition of power. Leaders are elected through a democratic process that includes term limits. The system also includes a legislative assembly and mechanisms for conflict resolution. It has been recognised by UNESCO as an Intangible Cultural Heritage of Humanity.

A number of countries in Africa, such as Rwanda, Senegal, Madagascar, Lesotho and Morocco, employ a mixed electoral system, blending elements of proportional representation with majoritarian or plurality systems. These hybrids highlight the diversity of electoral systems across Africa, with each country tailoring the mixed model to its specific political, social, and historical context.

Somalia currently uses the 4.5 model, a power-sharing arrangement among the four major clans that gives minority clans a half share to improve inclusivity. Some argue the move killed the possibility of a national identity. The system has been corrupted, has failed to reform, and, amid foreign regional interference, is struggling to perform. These mixed electoral systems offer a means to promote inclusivity and representation while striving for effective governance. However, the specific design and implementation of these systems can significantly impact their effectiveness and the extent to which they achieve these goals.

Other indigenous systems include the philosophy of Ubuntu (consensus building) in Southern Africa where its cultural and philosophical ethos indirectly influences the values foundational to democratic processes in societies where it’s integral to cultural heritage. Its emphasis on inclusivity, communal conflict resolution, collective participation, and ethical conduct shapes the spirit and objectives of governance and elections, impacting not the technical aspects of how votes are cast and counted, but the overarching principles guiding democratic engagement and policymaking.

These traditional models, which often involve direct democracy and community consensus, might offer valuable insights for creating more effective governance structures in Africa.

Thirty years since electoral democracy was re-introduced, a re-evaluation of election strategies is required – one that considers a mixed approach incorporating local traditions into modern electoral processes. This approach may better serve the interests of the African populace, addressing the endemic issues of violence, corruption, and inefficacy plaguing the current system.

This would necessitate recognition and legitimation of both systems within African cultural, historical, and political contexts. Key to this approach is engaging a broad spectrum of stakeholders to ensure the model accurately reflects Africa’s diverse societies. Utilising traditional networks for voter education and mobilization can enhance participation and reduce costs. Forming electoral committees composed of both contemporary officials and traditional leaders will ensure the electoral process is transparent, fair, and locally relevant. Incorporating traditional elements into state ceremonies related to elections can also deepen the process’s legitimacy and cultural resonance. Promoting decentralization through local governance structures that combine traditional and elected authority is crucial. Continuous dialogue for model refinement and the necessity of legal and constitutional adjustments to support this hybrid model are essential for its success. Implementing this model demands careful planning, extensive consultation, and phased introduction, aligning it legally and functionally within each country’s governance framework…

Learning from our mistakes: “Africa’s democratic dividend,” from @africaarguments.

While there are lessons here we can apply to more “mature” democracies, we should remember that getting democracy “right,” in the various ways that might be accomplished, across Africa is the primary point. See, for example, “This Will Finish Us” (“How Gulf princes, the safari industry, and conservation groups are displacing the Maasai from the last of their Serengeti homeland”).

* Aristotle

###

As we ponder political process, we might recall that it was on this date in 2012 that Puntland inaugurated its constitution, 14 years after declaring itself an autonomous region within the Somalia federation. The constitution established the Puntland Electoral Commission, which has been guiding the region’s gradual shift from a parliament-based vote system to multi-party elections.

source

“When I despair, I remember that all through history the way of truth and love have always won”*…

Aditya Narayan Sharma on how the Hindu right distorted Gandhi…

Even outside India, it can be difficult to escape the cult of Mohandas Gandhi, the lawyer, thinker, and politician who helped liberate the nation from British colonial rule in 1947. The praise ranges from the anodyne (Gandhi is a “hero not just to India but to the world,” per Barack Obama) to the ironic (“really phenomenal,” according to Burmese political prisoner turned genocide defender Aung San Suu Kyi) to the surreal (“I am Gandhi-like. I think like Gandhi. I act like Gandhi,” declared New York City Mayor Eric Adams). Seventy-six years after his death, Gandhi is not only an icon of Indian independence, but a uniquely potent international symbol of peace and nonviolence. Gandhi has been, at one point or another, as historian Vinay Lal puts it, the “patron saint” of “environmentalists, pacifists, conscientious objectors, non-violent activists, nudists, naturopaths, vegetarians, prohibitionists, social reformers, internationalists, moralists, trade union leaders, political dissidents, hunger strikers, anarchists, luddites, celibates, anti-globalisation activists, pluralists, ecumenists, walkers, and many others.” Everyone, it seems, has endorsed the honorific coined for him more than a century ago: Mahatma, Sanskrit for “great soul.”

Within India, Gandhi graces every banknote and is plastered on billboards and painted on walls alongside busy thoroughfares. His bespectacled face looms over big cities and small towns alike. Countless schools, universities, roads, and public spaces are named after him. In 2013, the government of Bihar, India’s poorest state, spent several million dollars building the world’s tallest Gandhi statue, casting him in a shimmering tower of bronze with two grateful children by his side. Public figures fight to outperform one another at Mahatma-loving, something of a national sport: in 2021, one representative viral video captured a regional party leader clinging to a bust of Gandhi and sobbing. But Gandhi’s ubiquity masks the fact that among political actors, commentators, intellectuals, and a growing swath of the general public, his reputation is far from settled. 

The lead-up to a general election this spring — in which the Hindu nationalist Bharatiya Janata Party (BJP), led by Prime Minister Narendra Modi, is likely to beat out the centrist Indian National Congress Party and be reelected for a third straight term — has brought dueling visions of Gandhi to the fore. Congress, which was helmed by Gandhi himself on the road to independence, still hopes to capitalize on its historic connections to the Mahatma, but the efforts of its increasingly ossified leadership are falling flat. Meanwhile, the BJP pays lip service to Gandhi’s brand while vigorously working to counter his core values, including, most crucially, his lifelong pursuit of Hindu-Muslim unity. The far-right fringes go even further than the official party line: in some circles, Gandhi is belittled, mocked, burned in effigy. This confused state of affairs suggests that a reckoning with the competing narratives swirling around Gandhi is long overdue. Even as he has been flattened into an ill-defined figurehead by liberals and centrists, his complex legacy is being appropriated — and at times desecrated — by India’s seemingly unstoppable right…

It is dangerous, ultimately, to cede criticism of Gandhi to the Hindu right. Many Indians, myself included, admire our founding fathers for their grand, if imperfect and patchily implemented, vision of a secular and pluralist country. Nevertheless, the kernel of truth behind the right-wing critique of Gandhi is that the republic was founded by patrician Anglophone elites, and its core institutions do reflect the worldview of a small, affluent group who were, in many crucial ways, disconnected from the material and spiritual realities of the people they governed. Contemporary India has severe socioeconomic, caste, gender, and regional inequalities, in part as a legacy of this paternalistic cohort’s work. But that’s a starting point for politics, not a dead end. Look a little deeper, and opponents of the BJP will find not only flaws but also invaluable resources in Gandhi’s writings, particularly his distinctively Indian formulation of secularism that stands a real chance of resisting Hindutva. And in an era of rising religious violence, Gandhian pacifism itself may be more relevant than ever: it’s no longer a set of bland phrases from history books, but an urgent directive. Beyond shallow paeans to the forgotten values, Gandhi’s message could be deployed against his killer’s ideological heirs, if only someone were willing to do it. No one — politician, citizen, or intellectual — can seriously claim to inherit Gandhi’s values until they take him down from his pedestal, rescue him from both the glibness of liberal idol worship and the humiliation of Hindutva slander, and re-engage with the great thinker himself. That is surely the only fate befitting the man we once called Bapu, or Dad…

Eminently worth reading in full: “Character Assassination,” from @AdityaNSharma in @thedrift_mag.

See also: “Prime Minister Modi Is Disarming the Opposition Ahead of India’s National Elections.”

* “When I despair, I remember that all through history the way of truth and love have always won. There have been tyrants and murderers, and for a time, they can seem invincible, but in the end, they always fall. Think of it–always.” – Mahatma Gandhi

###

As we resist self-rewarding revisionism, we might recall that it was on this date in 1602 that an ur-engine of the colonialism from which Gandhi helped free India was born: Vereenigde Oost-Indische Compagnie (VOC, or The Dutch East India Company, as it’s known in the Anglophone world) was incorporated. It was a response to the English (later, British) East India Company, on which it was modeled, up to a point.

Generally considered the world’s first trans-national corporation, the first to issue stocks and bonds publicly, and the first company ever listed on an official stock exchange, it began with a 21-year monopoly on the Dutch spice trade. The VOC also prefigured the mega-corporation of today in that it had quasi-governmental powers, including the ability to wage war, imprison and execute convicts, negotiate treaties, strike its own coins, and establish colonies. Considered by many to be the largest and most powerful corporation in history, the VOC eclipsed all of its rivals (including the British) in international trade (and many nations in power) for almost 200 years.

source

“Sometimes we drug ourselves with dreams of new ideas”*…

Further to last week’s piece on Samuel Arbesman‘s “incremental humanism,” Jennifer Banks unpacks the differences between the two leading “flavors” of humanism afoot today: one akin to Arbesman’s; the other, not so much…

In 2003, Edward Said wrote in the wake of the terrorist attacks of 11 September 2001 and in the context of the United States’ war on terror that ‘humanism is the only, and, I would go so far as saying, the final, resistance we have against the inhuman practices and injustices that disfigure human history.’ The moment, he felt, was ‘apocalyptic’, and the end was indeed near for him; he died of leukaemia later that year.

So why was it humanism that he held to so tightly as war and sickness cinched time’s horizon around him? Humanism, an intellectual and cultural movement that emerged in Renaissance Europe emphasising classical learning and affirming human potential, had been subject to decades of critique by the time Said was writing this. Among its many detractors were postcolonialists who argued that humanism’s elevation of a particular kind of human – Eurocentric, rational, empiricist, self-realising, secular and universal – had provided thin cover for the exploitation of large swaths of the world’s population.

But Said, one of the founders of postcolonial studies, hadn’t given up on the term, despite its imperialist entanglements. He imagined a humanism abused but not exhausted, an –ism more elastic and plural, more subject to critique and revision, and more acquainted with the limits of reason than many humanisms have historically been. Humanism, he argued, was more like an ‘exigent, resistant, intransigent art’ – an art that was not, for him, particularly triumphant. His humanism was defined by a ‘tragic flaw that is constitutive to it and cannot be removed’. It refused all final solutions to the irreconcilable, dialectical oppositions that are at the heart of human life – a refusal that ironically kept the world liveable and the future open.

At stake in his defence was not only the survival of the humanistic fields of study he had devoted his academic career to, but the survival, freedom and thriving of actual people, including those populations that humanisms had historically excluded. Various antihumanisms had gradually been eroding humanism’s stature within the academy, but it was humanism, he believed, with its positive ideas about liberty, learning and human agency – and not antihumanist deconstructions – that inspired people to resist unjust wars, military occupations, despotism and tyranny.

Humanism, however, fell further out of vogue in the two decades that followed. Humanities enrolments dropped dramatically at universities, and funding for departments like comparative literature, women’s studies, religion, and foreign languages got slashed. Increasingly, however, it wasn’t just the inadequacies of any –ism that were the problem. It was the subject at the heart of humanism that came under widespread attack: the human itself. Given that history could be read as a catalogue of human greed, blindness, exclusions and violence, the future seemed to belong to someone – or something – else. The humane in humanism seemed to be missing. Alternative ideologies like antihumanism, transhumanism, posthumanism and antinatalism seeped from the fringes into the mainstream, buoyed by their conviction that they might offer the planet or even the cosmos something more ethical, more humane even, than humans have ever been able to. Humanity’s time, perhaps, was simply up.

In his book The Revolt Against Humanity: Imagining a Future Without Us (2023), the American critic Adam Kirsch identifies the contested line between humanists and non-humanists as one of the defining faultlines of our political and cultural moment. The debates between them can feel merely semantic, the stuff of graduate seminars, but the revolt against humanity is likely to have major implications for our future, Kirsch argues, even if its prophecies about our imminent extinction don’t come true. ‘[D]isappointed prophecies,’ he writes, ‘have been responsible for some of the most important movements in history, from Christianity to Communism.’ Anyone committed to the prospect of a liveable future should pay close attention to what’s going on here.

I might have never put too much stock in a term like humanism if I had not read around in the transhumanist literature. I came to this work while researching a book on birth that explored the relationship between birth, death and the question of a human future. Does humanity have a future? Do we deserve one? What will that future look like? The answers to those questions will be determined by many forces – technological, economic, political, environmental and more – but also by how we experience and think about our own births and deaths. Despite large areas of convergence, humanists and transhumanists can end up with wildly different visions of our future, based on dramatically different understandings of birth and death, as one can see by comparing how a novelist (Toni Morrison) and a philosopher (Nick Bostrom) have explored these themes. Morrison offers us a prophetic celebration of Earthly, ongoing, biological generation and a future that allows for human freedom, while Bostrom points us toward a highly controlled surveillance world order, organised around a paranoid fear of human action, and oriented toward the pristine emptiness of outer space. Which future, we should ask ourselves, would we willingly choose?

Do read on for her analysis: “What awaits us?“, from @jenniferabanks in @aeonmag.

Apposite: “The Philosophy Of Co-Becoming” from @NoemaMag and “To pay attention, this is our endless and proper work,” @LMSacasas on Illich.

* Audre Lorde

###

As we ponder possibility, we might spare a thought for Dandara, “the Warrior Queen” of the Quilombo dos Palmares, a settlement of Afro-Brazilian people who freed themselves from enslavement during Brazil’s colonial period. She was captured by colonial authorities on this date in 1694 and committed suicide rather than be returned to a life of slavery.

source