(Roughly) Daily


“Men have become the tools of their tools”*…

Visionary philosopher Bernard Stiegler argued that it’s not our technology that makes humans special; rather, it’s our relationship with that technology. Bryan Norton explains…

It has become almost impossible to separate the effects of digital technologies from our everyday experiences. Reality is parsed through glowing screens, unending data feeds, biometric feedback loops, digital prostheses and expanding networks that link our virtual selves to satellite arrays in geostationary orbit. Wristwatches interpret our physical condition by counting steps and heartbeats. Phones track how we spend our time online, map the geographic location of the places we visit and record our histories in digital archives. Social media platforms forge alliances and create new political possibilities. And vast wireless networks – connecting satellites, drones and ‘smart’ weapons – determine how the wars of our era are being waged. Our experiences of the world are soaked with digital technologies.

But for the French philosopher Bernard Stiegler, one of the earliest and foremost theorists of our digital age, understanding the world requires us to move beyond the standard view of technology. Stiegler believed that technology is not just about the effects of digital tools and the ways that they impact our lives. It is not just about how devices are created and wielded by powerful organisations, nation-states or individuals. Our relationship with technology is about something deeper and more fundamental. It is about technics.

According to Stiegler, technics – the making and use of technology, in the broadest sense – is what makes us human. Our unique way of existing in the world, as distinct from other species, is defined by the experiences and knowledge our tools make possible, whether that is a state-of-the-art brain-computer interface such as Neuralink, or a prehistoric flint axe used to clear a forest. But don’t be mistaken: ‘technics’ is not simply another word for ‘technology’. As Martin Heidegger wrote in his essay ‘The Question Concerning Technology’ (1954), which used the German term Technik instead of Technologie in the original title: the ‘essence of technology is by no means anything technological.’ This aligns with the history of the word: the etymology of ‘technics’ leads us back to something like the ancient Greek term for art – technē. The essence of technology, then, is not found in a device, such as the one you are using to read this essay. It is an open-ended creative process, a relationship with our tools and the world.

This is Stiegler’s legacy. Throughout his life, he took this idea of technics, first explored while he was imprisoned for armed robbery, further than anyone else. But his ideas have often been overlooked and misunderstood, even before he died in 2020. Today, they are more necessary than ever. How else can we learn to disentangle the effects of digital technologies from our everyday experiences? How else can we begin to grasp the history of our strange reality?…

[Norton unspools Stiegler’s remarkable life and the development of his thought…]

… Technology, for better or worse, affects every aspect of our lives. Our very sense of who we are is shaped and reshaped by the tools we have at our disposal. The problem, for Stiegler, is that when we pay too much attention to our tools, rather than to how they are developed and deployed, we fail to understand our reality. We become trapped, merely describing the technological world on its own terms and making it even harder to untangle the effects of digital technologies from our everyday experiences. By encouraging us to pay closer attention to this world-making capacity, with its potential to harm and heal, Stiegler is showing us what else is possible. There are other ways of living, of being, of evolving. It is technics, not technology, that will give the future its new face…

Eminently worth reading in full: “Our tools shape our selves,” from @br_norton in @aeonmag.

Compare and contrast: Kevin Kelly’s What Technology Wants

* Henry David Thoreau

###

As we own up, we might send phenomenological birthday greetings to Immanuel Kant; he was born on this date in 1724.  One of the central figures of modern philosophy, Kant is remembered primarily for his efforts to unite reason with experience (e.g., Critique of Pure Reason [Kritik der reinen Vernunft], 1781), and for his work on ethics (e.g., Metaphysics of Morals [Die Metaphysik der Sitten], 1797) and aesthetics (e.g., Critique of Judgment [Kritik der Urteilskraft], 1790).  

But Kant also made important contributions to mathematics and astronomy. For example: his argument that mathematical truths are a form of synthetic a priori knowledge was cited by Einstein as an important early influence on his work.  And his description of the Milky Way as a lens-shaped collection of stars, just one of many “island universes,” was later vindicated: Herschel confirmed the Galaxy’s disc-like shape, and Hubble established that other galaxies exist.

Act so as to treat humanity, whether in your own person or in that of another, at all times also as an end, and not only as a means.

Groundwork of the Metaphysics of Morals

source

“Analysis of death is not for the sake of becoming fearful but to appreciate this precious lifetime”*…

As Alex Blasdel explains, new research into the dying brain suggests the line between life and death may be less distinct than previously thought…

… For all that science has learned about the workings of life, death remains among the most intractable of mysteries. “At times I have been tempted to believe that the creator has eternally intended this department of nature to remain baffling, to prompt our curiosities and hopes and suspicions all in equal measure,” the philosopher William James wrote in 1909.

In 1976, the New York Times reported on the burgeoning scientific interest in “life after death” and the “emerging field of thanatology”. The following year, the psychiatrist Raymond Moody and several fellow thanatologists founded an organisation that became the International Association for Near-Death Studies. In 1981, they printed the inaugural issue of Vital Signs, a magazine for the general reader that was largely devoted to stories of near-death experiences. The following year they began producing the field’s first peer-reviewed journal, which became the Journal of Near-Death Studies. The field was growing, and taking on the trappings of scientific respectability. Reviewing its rise in 1988, the British Journal of Psychiatry captured the field’s animating spirit: “A grand hope has been expressed that, through NDE research, new insights can be gained into the ageless mystery of human mortality and its ultimate significance, and that, for the first time, empirical perspectives on the nature of death may be achieved.”

But near-death studies was already splitting into several schools of belief, whose tensions continue to this day. One influential camp was made up of spiritualists, some of them evangelical Christians, who were convinced that near-death experiences were genuine sojourns in the land of the dead and divine. As researchers, the spiritualists’ aim was to collect as many reports of near-death experience as possible, and to proselytise society about the reality of life after death. Moody was their most important spokesman; he eventually claimed to have had multiple past lives and built a “psychomanteum” in rural Alabama where people could attempt to summon the spirits of the dead by gazing into a dimly lit mirror.

The second, and largest, faction of near-death researchers were the parapsychologists, those interested in phenomena that seemed to undermine the scientific orthodoxy that the mind could not exist independently of the brain. These researchers, who were by and large trained scientists following well-established research methods, tended to believe that near-death experiences offered evidence that consciousness could persist after the death of the individual. Many of them were physicians and psychiatrists who had been deeply affected after hearing the near-death stories of patients they had treated in the ICU. Their aim was to find ways to test their theories of consciousness empirically, and to turn near-death studies into a legitimate scientific endeavour.

Finally, there emerged the smallest contingent of near-death researchers, who could be labelled the physicalists. These were scientists, many of whom studied the brain, who were committed to a strictly biological account of near-death experiences. Like dreams, the physicalists argued, near-death experiences might reveal psychological truths, but they did so through hallucinatory fictions that emerged from the workings of the body and the brain. (Indeed, many of the states reported by near-death experiencers can apparently be achieved by taking a hero’s dose of ketamine.) Their basic premise was: no functioning brain means no consciousness, and certainly no life after death. Their task, which the neuroscientist Jimo Borjigin took up in 2015, was to discover what was happening during near-death experiences on a fundamentally physical level.

Slowly, the spiritualists left the field of research for the loftier domains of Christian talk radio, and the parapsychologists and physicalists started bringing near-death studies closer to the scientific mainstream. Between 1975, when Moody published Life After Life, and 1984, only 17 articles in the PubMed database of scientific publications mentioned near-death experiences. In the following decade, there were 62. In the most recent 10-year span, there were 221. Those articles have appeared everywhere from the Canadian Urological Association Journal to the esteemed pages of The Lancet.

Today, there is a widespread sense throughout the community of near-death researchers that we are on the verge of great discoveries…

… Perhaps the story to be written about near-death experiences is not that they prove consciousness is radically different from what we thought it was…

… there is something that binds many of these people – the physicalists, the parapsychologists, the spiritualists – together. It is the hope that by transcending the current limits of science and of our bodies, we will achieve not a deeper understanding of death, but a longer and more profound experience of life. That, perhaps, is the real attraction of the near-death experience: it shows us what is possible not in the next world, but in this one…

Eminently worth reading in full: “The new science of death: ‘There’s something happening in the brain that makes no sense’,” from @unkowthe_again in @guardian.

* Dalai Lama

###

As we ponder passages, we might send innovative (and painless) birthday greetings to Robert Andrew Hingson; he was born on this date in 1913. An anesthesiologist and inventor, he is best known for three major inventions that continue to relieve pain and suffering worldwide today. The first is a highly portable anesthesia-gas machine and resuscitator, called the Western Reserve Midget, used to deliver a short-term general anesthetic.

The second came from extensive experiments in the use of anesthesia to prevent pain during childbirth, leading to the invention of the continuous caudal epidural anesthesia technique.

The third and best known is his “peace gun,” a pistol-shaped jet injector that enabled efficient, mass, needle-less inoculation worldwide against such diseases as smallpox, measles, tuberculosis, tetanus, leprosy, polio, and influenza. It can inoculate 1,000 persons per hour with several simultaneous vaccines.

source

“Man’s first expression, like his first dream, was an aesthetic one”*…

From the new series, “Conjectures,” in the invaluable Public Domain Review, a piece by Octavian Esanu…

What do we want from “school”? Knowledge, surely. But other things too. Experience, perhaps? — the vibrating sense of having been present as new thinking happened, of having been affected by an encounter with ideas? Certain kinds of teaching and learning, anyway, privilege that vaunted nexus of knowing and being. Early in the first session of his seminar on Theodor Adorno’s Aesthetic Theory, the American Marxist literary scholar Fredric Jameson asserts, here below, that “aesthetics” can be thought of as precisely a project that lies “halfway between the cognitive and the artistic” — which is to say, it is the enterprise of trying to understand (conceptually) that which seems to elude reduction to concepts (because we are, somehow, there in aesthetic experiences; and we are not conceptual!). By meticulously translating his recordings of Jameson’s seminars into the theatrical idiom of the stage script, the artist and scholar (and former Jameson student) Octavian Esanu doubles down, playfully and tenderly, on this deep problem. Pedagogy as performance? Teaching and learning, about art — as a work of art?

Series editor D. Graham Burnett‘s introduction

An experiment with historical form and method: “[Door creaks open. Footsteps]: Fredric Jameson’s Seminar on Aesthetic Theory,” from @PublicDomainRev.

* Barnett Newman

###

As we investigate the ineffable, we might send absolutist birthday greetings to Thomas Hobbes of Malmesbury; he was born on this date in 1588.  A father of political philosophy and political science, Hobbes developed some of the fundamentals of European liberal thought: the right of the individual; the natural equality of all men; the artificial character of the political order (which led to the later distinction between civil society and the state); the view that all legitimate political power must be “representative” and based on the consent of the people; and a liberal interpretation of law which leaves people free to do whatever the law does not explicitly forbid– all this, though Hobbes was, on rational grounds, a champion of absolutism for the sovereign.  It was that, Hobbes reasoned, or the bloody chaos of a “war of all against all.”  His 1651 book Leviathan established social contract theory, the foundation of most later Western political philosophy.

Indeed, it was in some large measure Hobbes (and his legacy) that Adorno’s Frankfurt School colleagues Max Horkheimer, Erich Fromm, Herbert Marcuse (et al.) were working to revise.

source

Written by (Roughly) Daily

April 5, 2024 at 1:00 am

“The solution to poverty is to abolish it directly”*…

The idea of a guaranteed flow of funds to allow anyone and everyone to meet basic needs– as we’re currently discussing it, a universal basic income– has been getting significant attention in recent decades. But as Karl Widerquist explains (in an excerpt from his recent book, Universal Basic Income), “UBI” dates back as a concept– and as a practice– many centuries…

Support for Universal Basic Income (UBI) has grown so rapidly over the past few years that people might think the idea appeared out of nowhere. In fact, the idea has roots going back hundreds or even thousands of years, and activists have been floating similar ideas with gradually increasing frequency for more than a century.

Since 1900, the concept of a basic income guarantee (BIG) has experienced three distinct waves of support, each larger than the last. The first, from 1910 to 1940, was followed by a down period in the 1940s and 1950s. A second and larger wave of support happened in the 1960s and 1970s, followed by another lull in most countries through about 2010. BIG’s third, most international, and by far largest wave of support began to take off in the early 2010s, and it has increased every year since then.

[But] We could trace the beginnings of UBI into prehistory, because many have observed that “prehistoric” (in the sense of nonliterate) societies had two ways of doing things that might be considered forms of unconditional income…

From prehistoric nomads, through ancient Athens, to Thomas Paine and then Henry George, Widerquist unspools the history of UBI, then walks through the “three waves” that began in the early 20th century, concluding with the current state of the debate: “The Deep and Enduring History of Universal Basic Income,” from @KarlWiderquist and @mitpress.

For more on the recent history of the UBI debate, see Widerquist’s essay, “Three Waves of Basic Income Support.”

And for a peek at the results of (small, incomplete, but encouraging) experiments in this direction, see: “Places across the U.S. are testing no-strings cash as part of the social safety net,” from @NPR.

* Dr. Martin Luther King, Jr.

###

As we ponder poverty, we might send thoughtful birthday greetings to James Tobin; he was born on this date in 1918. An economist who contributed to the development of key ideas in the Keynesian economics of his generation, he made pioneering contributions to the study of investment, monetary and fiscal policy, and financial markets– for which he shared the Nobel Memorial Prize in Economic Sciences in 1981.

Outside academia, Tobin is probably best known for his suggestion of a tax on foreign exchange transactions, now known as the “Tobin tax,” designed to reduce speculation in the international currency markets, which he saw as dangerous and unproductive.

And relevantly to the piece above, Tobin, Paul Samuelson, John Kenneth Galbraith and 1,200 other economists signed a document in 1968 calling for the U.S. Congress to introduce that year a system of income guarantees and supplements– a UBI.

source

“We are not what we know but what we are willing to learn”*…

Abigail Tulenko argues that folktales, like formal philosophy, unsettle us into thinking anew about our cherished values and views of the world…

The Hungarian folktale Pretty Maid Ibronka terrified and tantalised me as a child. In the story, the young Ibronka must tie herself to the devil with string in order to discover important truths. These days, as a PhD student in philosophy, I sometimes worry I’ve done the same. I still believe in philosophy’s capacity to seek truth, but I’m conscious that I’ve tethered myself to an academic heritage plagued by formidable demons.

The demons of academic philosophy come in familiar guises: exclusivity, hegemony and investment in the myth of individual genius. As the ethicist Jill Hernandez notes, philosophy has been slower to change than many of its sister disciplines in the humanities: ‘It may be a surprise to many … given that theology and, certainly, religious studies tend to be inclusive, but philosophy is mostly resistant toward including diverse voices.’ Simultaneously, philosophy has grown increasingly specialised due to the pressures of professionalisation. Academics zero in on narrower and narrower topics in order to establish unique niches and, in the process, what was once a discipline that sought answers to humanity’s most fundamental questions becomes a jargon-riddled puzzle for a narrow group of insiders.

In recent years, ‘canon-expansion’ has been a hot-button topic, as philosophers increasingly find the exclusivity of the field antithetical to its universal aspirations. As Jay Garfield remarks, it is as irrational ‘to ignore everything not written in the Eurosphere’ as it would be to ‘only read philosophy published on Tuesdays.’ And yet, academic philosophy largely has done just that. It is only in the past few decades that the mainstream has begun to engage seriously with the work of women and non-Western thinkers. Often, this endeavour involves looking beyond the confines of what, historically, has been called ‘philosophy’.

Expanding the canon generally isn’t so simple as resurfacing a ‘standard’ philosophical treatise in the style of white male contemporaries that happens to have been written by someone outside this demographic. Sometimes this does happen, as in the case of Margaret Cavendish (1623-73) whose work has attracted increased recognition in recent years. But Cavendish was the Duchess of Newcastle, a royalist whose political theory criticises social mobility as a threat to social order. She had access to instruction that was highly unusual for women outside her background, which lends her work a ‘standard’ style and structure. To find voices beyond this elite, we often have to look beyond this style and structure.

Texts formerly classified as squarely theological have been among the first to attract significant renewed interest. Female Catholic writers such as Teresa of Ávila or Sor Juana Inés de la Cruz, whose work had been largely ignored outside theological circles, are now being re-examined through a philosophical lens. Likewise, philosophy departments are gradually including more work by Buddhist philosophers such as Dignāga and Ratnakīrti, whose epistemological contributions have been of especial recent interest. Such thinkers may now sit on syllabi alongside Augustine or Aquinas who, despite their theological bent, have long been considered ‘worthy’ of philosophical engagement.

On the topic of ‘worthiness’, I am wary of using the term ‘philosophy’ as an honorific. It is crucial that our interest in expanding the canon does not involve the implication that the ‘philosophical’ confers a degree of rigour over the theological, literary, etc. To do so would be to engage in a myopic and uninteresting debate over academic borders. My motivating question is not what the label of ‘philosophy’ can confer upon these texts, but what these texts can bring to philosophy. If philosophy seeks insight into the nature of such universal topics as reality, morality, art and knowledge, it must seek input from those beyond a narrow few. Engaging with theology is a great start, but these authors still largely represent an elite literate demographic, and raise many of the same concerns regarding a hegemonic, exclusive and individualistic bent.

As Hernandez quips: ‘[W]e know white, Western men have not cornered the market on deeply human, philosophical questions.’ And furthermore, ‘we also know, prudentially, that philosophy as a discipline needs to (and must) undergo significant navel-gazing to survive … in an ever-increasingly difficult time for homogenous, exclusive academic disciplines.’ In light of our aforementioned demons, it appears that philosophy is in urgent need of an exorcism.

I propose that one avenue forward is to travel backward into childhood – to stories like Ibronka’s. Folklore is an overlooked repository of philosophical thinking from voices outside the traditional canon. As such, it provides a model for new approaches that are directly responsive to the problems facing academic philosophy today. If, like Ibronka, we find ourselves tied to the devil, one way to disentangle ourselves may be to spin a tale…

Wisdom is where we find it: “Folklore is philosophy,” in @aeonmag. Eminently worth reading in full.

Apposite: “Syncretic Past.”

* Mary Catherine Bateson

###

As we update our understanding of understanding, we might send thoughtful birthday greetings to Michael Sandel; he was born on this date in 1953. A philosopher and professor of government theory at Harvard University (where his course Justice was the university’s first course to be made freely available online and on television, seen so far by tens of millions of people around the world), he is probably best known for his critique of John Rawls‘ A Theory of Justice (in Sandel’s book, Liberalism and the Limits of Justice).

Sandel subscribes to a certain version of communitarianism (although he is uncomfortable with the label), and his critique of Rawls proceeds in that vein. Rawls’s argument depends on the assumption of the veil of ignorance, which Sandel argues commits Rawls to a view of people as “unencumbered selves”. Sandel’s view is that we are by nature encumbered to an extent that makes it impossible even hypothetically to have such a veil. Some examples of such ties are those with our families, which we do not make by conscious choice but are born with, already attached. Because they are not consciously acquired, it is impossible to separate oneself from such ties. Sandel believes that only a less-restrictive, looser version of the veil of ignorance should be postulated. Criticism such as Sandel’s inspired Rawls to subsequently argue that his theory of justice was not a “metaphysical” theory but a “political” one, a basis on which an overriding consensus could be formed among individuals and groups with many different moral and political views.

source
