(Roughly) Daily

Posts Tagged ‘history of ideas’

“You shouldn’t rely on what you believe to be true. You might be mistaken. Everything can be questioned, everything doubted. The best option, then, is to keep an open mind.”*…

The ancient Sceptics– often called Pyrrhonists after Pyrrho, the ancient Greek master Sceptic who lived in the 4th and 3rd centuries BCE– used doubt as a way of investigating the world. As Mahdi Ranaee explains, later thinkers undermined even that possibility…

Ask any philosopher what scepticism is, and you will receive as many different answers as people you’ve asked. Some of them take it to be showing that we cannot have any knowledge – of, say, the external world – and some of them take it to be even more radical in showing that we cannot have any reasonable beliefs. In the interests of getting a handle on the varieties of scepticism, one can locate four different milestones of sceptical thought in the history of Western philosophy. These four milestones start with the least threatening of them, Pyrrhonian scepticism, and continue through Cartesian and Kantian scepticisms to the Wittgensteinian moment in which even our intention to act is put in question…

Fascinating: “Known unknowables,” in @aeonmag.

* Pyrrho (as paraphrased by Nigel Warburton)

###

As we question questioning, we might spare a thought for a not-so-sceptical thinker, Thomas Carlyle; he died on this date in 1881.  A Victorian polymath, he was an accomplished philosopher, satirical writer, essayist, translator, historian, mathematician, and teacher.  While he was an enormously popular lecturer in his time, and his contributions to mathematics earned him eponymous fame (the Carlyle circle), he may be best remembered as a historian (and champion of the “Great Man” theory of history)… and as the coiner of phrases like “the dismal science” (to describe economics).
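[A mathematical aside of mine, not the post’s: the Carlyle circle gives a ruler-and-compass reading of a quadratic. For x² − sx + p = 0, take the circle whose diameter runs from (0, 1) to (s, p); a point (x, y) lies on that circle exactly when

\[ x(x - s) + (y - 1)(y - p) = 0 , \]

so setting y = 0 recovers x² − sx + p = 0: the circle meets the x-axis precisely at the roots, whenever real roots exist.]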

While not adhering to any formal religion, Carlyle asserted the importance of belief and developed his own philosophy of religion. He preached “Natural Supernaturalism,” the idea that all things are “Clothes” which at once reveal and conceal the divine, that “a mystic bond of brotherhood makes all men one,” and that duty, work and silence are essential. He attacked utilitarianism as mere atheism and egoism, taking instead a medievalist tack and postulating the Great Man theory, a philosophy of history which argues that history is shaped by exceptional individuals. (Indeed, his thinking, which extended to a critique of democracy and an argument for “Heroarchy (Government of Heroes),” was appropriated and perverted by Nazi thinkers in Germany.)

Carlyle’s History of the French Revolution, a three-volume work that assured his fame as a historian, was finished in 1836 but not published until 1837 because John Stuart Mill’s maid mistook the manuscript of Volume One for kindling.  The setback prompted Carlyle to compare himself to a man who has “nearly killed himself accomplishing zero.”  But he rewrote the first volume from scratch.

“A well-written Life is almost as rare as a well-spent one.”   – Thomas Carlyle

 source

Written by (Roughly) Daily

February 5, 2023 at 1:00 am

“I love to talk about nothing. It’s the only thing I know anything about.”*…

Joy Walker: Three Boxes Three Ways, 2009

Nonbeing belongs to that category of concepts that seem self-evident and self-explanatory, but as FT explains, it has perplexed philosophers for ages…

Bertrand Russell’s 1951 obituary for Ludwig Wittgenstein is only a few paragraphs long, and the second consists largely of a single pointed anecdote:

Quite at first I was in doubt as to whether he was a man of genius or a crank…. He maintained, for example, at one time that all existential propositions are meaningless. This was in a lecture room, and I invited him to consider the proposition: “There is no hippopotamus in this room at present.” When he refused to believe this, I looked under all the desks without finding one; but he remained unconvinced.

The exchange is typical of the two philosophers’ relationship: Russell’s proper British demeanor was frequently ruffled by the Austrian’s dry humor. But it also illustrates two general approaches to philosophy: one that takes pleasure in complexities, absurdities, and ironies, and one that takes pleasure in resolving them. Just as Wittgenstein surely realized that there was no hippopotamus in the room, Russell surely realized that Wittgenstein’s objection could not be dispelled empirically by looking under each desk. At stake was not a fact of perception but the epistemological status of negation—the philosophical meaning and value of assertions about nothing.

Nothing, or nonbeing, belongs to that category of concepts—like being, space, and consciousness—that seem self-evident and self-explanatory to most people most of the time, but that for philosophy have been objects of deepest perplexity and millennia-long dispute. It’s a little like breathing, which happens automatically until we stop to think about it. To most of us, Russell’s statement “There is no hippopotamus in this room” is both easily understood and easily verified. We think we know what it means, and most of us would only need a quick look around to affirm or deny the proposition.

But here our troubles begin. If you look around the room and don’t see a hippopotamus, presumably you do still see something: some kind of perception or sensory data is reaching your consciousness, which allows you to make a judgment. What is it that you do see when you see a hippopotamus not being there? Are you perceiving a nonbeing, seeing a particular thing whose nature is absence, or are you not perceiving any being, seeing no “thing” at all? When you see a hippopotamus not being there, are you also seeing a whale and a lion and a zebra not being there? Is every room full of all the things that aren’t in it?
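[A formal aside of mine rather than the essay’s: first-order logic renders Russell’s sentence as a negated existential, equivalent to a universal claim about every object in the room:

\[ \neg \exists x \, \big( \mathrm{Hippo}(x) \wedge \mathrm{InRoom}(x) \big) \;\equiv\; \forall x \, \big( \mathrm{Hippo}(x) \rightarrow \neg \mathrm{InRoom}(x) \big) \]

On this deflationary reading, verifying the claim requires no perception of an “absence,” only a survey of what is there.]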

From an evolutionary perspective, one predator not being there is just as good as any other predator not being there, but dialectics and logic are a little more particular. If every possible animal is not there at the same time, what specific truth-value can the assertion “There is no hippopotamus in this room” possibly have? Hence Wittgenstein’s insistence, facetious or not, that all existential propositions are meaningless. In this manner the complications and implications of nothing spill into every area of philosophical inquiry, and we quickly come to sympathize with Aristophanes’ brutal satire of philosophers in The Clouds:

Socrates: Have you got hold of anything?

Strepsiades: No, nothing whatever.

Socrates: Nothing at all?

Strepsiades: No, nothing except my tool, which I’ve got in my hand.

Nonbeing, through the ages: “Apropos of Nothing,” from @ft_variations in @nybooks.

* Oscar Wilde

###

As we analyze absence, we might send mindful birthday greetings to Mahasi Sayadaw; he was born on this date in 1904. A Burmese Theravada Buddhist monk and meditation master, he had a significant impact on the teaching of vipassanā (insight) meditation in the West and throughout Asia.

source

“Philosophy is a battle against the bewitchment of our intelligence by means of language”*…

Clockwise from top: Iris Murdoch, Philippa Foot, Mary Midgley, Elizabeth Anscombe

How four women defended ethical thought from the legacy of positivism…

By Michaelmas Term 1939, mere weeks after the United Kingdom had declared war on Nazi Germany, Oxford University had begun a change that would wholly transform it by the academic year’s end. Men aged twenty and twenty-one, save conscientious objectors and those deemed physically unfit, were being called up, and many others just a bit older volunteered to serve. Women had been able to matriculate and take degrees at the university since 1920, but members of the then all-male Congregation had voted to restrict the number of women to fewer than a quarter of the overall student population. Things changed rapidly after the onset of war. The proportion of women shot up, and, in many classes, there were suddenly as many women as men.

Among the women who experienced these novel conditions were several who did courses in philosophy and went on to strikingly successful intellectual careers. Elizabeth Anscombe, noted philosopher and Catholic moral thinker who would go on to occupy the chair in philosophy that Ludwig Wittgenstein had held at Cambridge, started a course in Greats—roughly, classics and philosophy—in 1937, as did Jean Austin (née Coutts), who would marry philosopher J. L. Austin and later have a long teaching career at Oxford. Iris Murdoch, admired and beloved philosopher and novelist, began to read Greats in 1938 at the same time as Mary Midgley (née Scrutton), who became a prominent public philosopher and animal ethicist. A year later Philippa Foot (née Bosanquet), distinguished moral philosopher, started to read the then relatively new course PPE—philosophy, politics and economics—and three years after that Mary Warnock (née Wilson), subsequently a high-profile educator and public intellectual, went up to read Greats.

Several of these women would go on to make groundbreaking contributions to ethics…

Oxford philosophy in the early to mid 1930s had been in upheaval. The strains of Hegel-inspired idealism that had remained influential in Britain through the first decade of the twentieth century had been definitively displaced, in the years before World War I, by realist doctrines which claimed that knowledge must be of what is independent of the knower, and which were elaborated within ethics into forms of intuitionism. By the ’30s, these schools of thought were themselves threatened by new waves of enthusiasm for the themes of logical positivism developed by a group of philosophers and scientists, led by Moritz Schlick, familiarly known as the Vienna Circle. Cambridge University’s Susan Stebbing, the first woman to be appointed to a full professorship in philosophy in the UK, had already interacted professionally with Schlick in England and had championed tenets of logical positivism in essays and public lectures when, in 1933, Oxford don Gilbert Ryle recommended that his promising tutee Freddie Ayer make a trip to Vienna. Ayer obliged, and upon his return he wrote a brief manifesto, Language, Truth and Logic (1936), in defense of some of the Vienna Circle’s views. The book became a sensation, attracting attention and debate far beyond the halls of academic philosophy. Its bombshell contention was that only two kinds of statements are meaningful: those that are true solely in virtue of the meanings of their constituent terms (such as “all bachelors are unmarried”), and those that can be verified through physical observation. The gesture seemed to consign to nonsense, at one fell swoop, the statements of metaphysics, theology, and ethics.

This turn to “verification” struck some as a fitting response to strains of European metaphysics that many people, rightly or wrongly, associated with fascist irrationalism and the gathering threat of war. But not everyone at Oxford was sympathetic. Although Ayer’s ideas weren’t universally admired, they were widely discussed, including by a group of philosophers led by Isaiah Berlin, who met regularly at All Souls College—among them, J. L. Austin, Stuart Hampshire, Donald MacKinnon, Donald MacNabb, Anthony Woozley, and Ayer himself. Oxford philosophy’s encounter with logical positivism would have a lasting impact and would substantially set the terms for subsequent research in many areas of philosophy—including, it would turn out, ethics and political theory…

A fascinating intellectual history of British moral philosophy in the second half of the 20th century: “Metaphysics and Morals,” Alice Crary in @BostonReview.

* Ludwig Wittgenstein

###

As we ponder precepts, we might recall that it was on this date in 1248 that the seat of the action described above, the University of Oxford, received its Royal Charter from King Henry III.  While it has no known date of foundation, there is evidence of teaching as far back as 1096, making it the oldest university in the English-speaking world and the world’s second-oldest university in continuous operation (after the University of Bologna).

The university operates the world’s oldest university museum, as well as the largest university press in the world and the largest academic library system in Britain.  Oxford has educated and/or employed many notables, including 72 Nobel laureates, 4 Fields Medalists, 6 Turing Award winners, 27 prime ministers of the United Kingdom, and many heads of state and government around the world.


 source

“If we are to prevent megatechnics from further controlling and deforming every aspect of human culture, we shall be able to do so only with the aid of a radically different model derived directly, not from machines, but from living organisms and organic complexes (ecosystems)”*…

In a riff on Lewis Mumford, the redoubtable L. M. Sacasas addresses the unraveling of modernity…

The myth of the machine underlies a set of three related and interlocking presumptions which characterized modernity: objectivity, impartiality, and neutrality. More specifically, the presumptions that we could have objectively secured knowledge, impartial political and legal institutions, and technologies that were essentially neutral tools but which were ordinarily beneficent. The last of these appears to stand somewhat apart from the first two in that it refers to material culture rather than to what might be taken as more abstract intellectual or moral stances. In truth, however, they are closely related. The more abstract intellectual and institutional pursuits were always sustained by a material infrastructure, and, more importantly, the machine supplied a master template for the organization of human affairs.

Just as the modern story began with the quest for objectively secured knowledge, this ideal may have been the first to lose its implicit plausibility. From the late 19th century onward, philosophers, physicists, sociologists, anthropologists, psychologists, and historians, among others, proposed a more complex picture that emphasized the subjective, limited, contingent, situated, and even irrational dimensions of how humans come to know the world. The ideal of objectively secured knowledge became increasingly questionable throughout the 20th century. Some of these trends get folded under the label “postmodernism,” but I found the term unhelpful at best a decade ago, and now find it altogether useless.

We can similarly trace a growing disillusionment with the ostensible impartiality of modern institutions. This takes at least two forms. On the one hand, we might consider the frustrating and demoralizing character of modern bureaucracies, which we can describe as rule-based machines designed to outsource judgement and enhance efficiency. On the other, we can note the heightened awareness of the actual failures of modern institutions to live up to the ideals of impartiality, which has been, in part, a function of the digital information ecosystem.

But while faith in the possibility of objectively secured knowledge and impartial institutions faltered, the myth of the machine persisted in the presumption that technology itself was fundamentally neutral. Until very recently, that is. Or so it seems. And my thesis (always for disputation) is that the collapse of this last manifestation of the myth brings the whole house down. This in part because of how much work the presumption of technological neutrality was doing all along to hold American society together. (International readers: as always read with a view to your own setting. I suspect there are some areas of broad overlap and other instances when my analysis won’t travel well). Already by the late 19th century, progress had become synonymous with technological advancements, as Leo Marx argued. If social, political, or moral progress stalled, then at least the advance of technology could be counted on…

But over the last several years, the plausibility of this last and also archetypal manifestation of the myth of the machine has also waned. Not altogether, to be sure, but in important and influential segments of society and throughout a wide cross-section of society, too. One can perhaps see the shift most clearly in the public discourse about social media and smart phones, but this may be a symptom of a larger disillusionment with technology. And not only technological artifacts and systems, but also with the technocratic ethos and the public role of expertise.

If the myth of the machine in these three manifestations was, in fact, a critical element of the culture of modernity, underpinning its aspirations, then when each in turn becomes increasingly implausible, the modern world order comes apart. I’d say that this is more or less where we’re at. You could usefully analyze any number of cultural fault lines through this lens. The center, which may not in fact hold, is where you find those who still operate as if the presumptions of objectivity, impartiality, and neutrality still compelled broad cultural assent, and they are now assailed from both the left and the right by those who have grown suspicious or altogether scornful of such presumptions. Indeed, the left/right distinction may be less helpful than the distinction between those who uphold some combination of the values of objectivity, impartiality, and neutrality and those who no longer find them compelling or desirable.

What happens when the systems and strategies deployed to channel often violent clashes within a population deeply, possibly intractably divided about substantive moral goods and now even about what Arendt characterized as the publicly accessible facts upon which competing opinions could be grounded—what happens when these systems and strategies fail?

It is possible to argue that they failed long ago, but the failure was veiled by an unevenly distributed wave of material abundance. Citizens became consumers and, by and large, made peace with the exchange. After all, if the machinery of government could run of its own accord, what was there left to do but enjoy the fruits of prosperity? But what if abundance was an unsustainable solution, either because it taxed the earth at too high a rate or because it was purchased at the cost of other values such as rootedness, meaningful work and involvement in civic life, abiding friendships, personal autonomy, and participation in rich communities of mutual care and support? Perhaps in the framing of that question, I’ve tipped my hand about what might be the path forward.

At the heart of technological modernity there was the desire—sometimes veiled, often explicit—to overcome the human condition. The myth of the machine concealed an anti-human logic: if the problem is the failure of the human to conform to the pattern of the machine, then bend the human to the shape of the machine or eliminate the human altogether. The slogan of one of the high-modernist world’s fairs of the 1930s comes to mind: “Science Finds, Industry Applies, Man Conforms.” What is now being discovered in some quarters, however, is that the human is never quite eliminated, only diminished…

Eminently worth reading in full: “The Myth of the Machine,” from @LMSacasas.

For a deep dive into similar waters, see John Ralston Saul‘s (@JohnRalstonSaul) Voltaire’s Bastards.

[Image above: source]

* Lewis Mumford, The Myth of the Machine

###

As we rethink rudiments, we might recall that it was on this date in 1919 that Arthur Eddington confirmed Einstein’s light-bending prediction– a part of The Theory of General Relativity– using photos of a solar eclipse. Eddington’s paper the following year was the “debut” of Einstein’s theoretical work in most of the English-speaking world (and occasioned an urban legend: when a reporter supposedly suggested that “only three people understand relativity,” Eddington is supposed to have jokingly replied “Oh, who’s the third?”).
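[For reference, a gloss of mine rather than the post’s: general relativity predicts that a ray of starlight grazing the Sun’s limb is deflected by

\[ \theta = \frac{4 G M_{\odot}}{c^{2} R_{\odot}} \approx 1.75'' , \]

twice the roughly 0.87″ that a naive Newtonian corpuscular calculation yields; the eclipse plates favored the larger, relativistic value.]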

One of Eddington’s photographs of the total solar eclipse of 29 May 1919, presented in his 1920 paper announcing the expedition’s success in confirming Einstein’s prediction that light “bends”

“Behind the hieroglyphic streets there would either be a transcendent meaning, or only the earth”*…

Gerardo Dottori, Explosion of Red on Green, 1910, oil on canvas. London, Tate Modern. [source]

A crop of new books attempts to explain the allure of conspiracy theories and the power of belief; Trevor Quirk considers them…

For the millions who were enraged, disgusted, and shocked by the Capitol riots of January 6, the enduring object of skepticism has been not so much the lie that provoked the riots as the believers themselves. A year out, book publishers confirmed this, releasing titles that addressed the question still addling public consciousness: How can people believe this shit? A minority of rioters at the Capitol had nefarious intentions rooted in authentic ideology, but most of them conveyed no purpose other than to announce to the world that they believed — specifically, that the 2020 election was hijacked through an international conspiracy — and that nothing could sway their confidence. This belief possessed them, not the other way around.

At first, I’d found the riots both terrifying and darkly hilarious, but those sentiments were soon overcome by a strange exasperation that has persisted ever since. It’s a feeling that has robbed me of my capacity to laugh at conspiracy theories — QAnon, chemtrails, lizardmen, whatever — and the people who espouse them. My exasperation is for lack of an explanation. I see Trump’s most devoted hellion, rampaging down the halls of power like a grade schooler after the bell, and I need to know the hidden causes of his dopey rebellion. To account for our new menagerie of conspiracy theories, I told myself, would be to reclaim the world from entropy, to snap experience neatly to the grid once again. I would use recent books as the basis for my account of conspiracy theories in the age of the internet. From their pages I would extract insights and errors like newspaper clippings, pin the marginal, bizarre, and seemingly irrelevant details to the corkboard of my mind, where I could spy eerie resonances, draw unseen connections. At last, I could reveal that our epistemic bedlam is as a Twombly canvas — messy but decipherable…

Learn with @trevorquirk: “Out There,” in @GuernicaMag.

* Thomas Pynchon, The Crying of Lot 49

###

As we tangle with truth, we might send rigorous birthday greetings to Gustav Bergmann; he was born on this date in 1906. A philosopher, he was a member of the Vienna Circle, a group of philosophers and scientists drawn from the natural and social sciences, logic and mathematics, whose values were rooted in the ideals of the Enlightenment. Their approach, logical positivism, an attempt to use logic to make philosophy “scientific,” has had immense influence on 20th-century philosophy, especially on the philosophy of science and analytic philosophy… even if it has not, in fact, eliminated the issues explored above.

source

We might also send birthday greetings in the form of logical and semantic puzzles both to the precocious protagonist of Alice’s Adventures in Wonderland and to her inspiration, Alice Liddell; they were “born” on this date in 1852.

source
