(Roughly) Daily

Posts Tagged ‘philosophy’

“Okay, so that was trolley problem version number seven. Chidi opted to run over five William Shakespeares instead of one Santa Claus.”*…

 

Trolley Problem

 

Imagine you are standing beside some tram tracks. In the distance, you spot a runaway trolley hurtling down the tracks towards five workers who cannot hear it coming. Even if they do spot it, they won’t be able to move out of the way in time.

As this disaster looms, you glance down and see a lever connected to the tracks. You realise that if you pull the lever, the tram will be diverted down a second set of tracks away from the five unsuspecting workers.

However, down this side track is one lone worker, just as oblivious as his colleagues.

So, would you pull the lever, leading to one death but saving five?

This is the crux of the classic thought experiment known as the trolley dilemma, developed by philosopher Philippa Foot in 1967 and adapted by Judith Jarvis Thomson in 1985.

The trolley dilemma allows us to think through the consequences of an action and consider whether its moral value is determined solely by its outcome.
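The split the dilemma exposes can be made concrete in a few lines of code. A minimal sketch (the labels and numbers are purely illustrative, not drawn from any philosophical source): a pure consequentialist scores each option by its outcome alone, while a simple deontological constraint refuses any option in which you act to kill, whatever the arithmetic says.

```python
# Two rival moral rules applied to the same pair of options. The options
# and their attributes are illustrative, not from any formal source.
options = {
    "do nothing":     {"deaths": 5, "you_act_to_kill": False},
    "pull the lever": {"deaths": 1, "you_act_to_kill": True},
}

def consequentialist_choice(opts):
    # Judge by outcome alone: minimize deaths.
    return min(opts, key=lambda name: opts[name]["deaths"])

def constrained_choice(opts):
    # Deontological side-constraint: never act to kill; then minimize deaths.
    permitted = {n: v for n, v in opts.items() if not v["you_act_to_kill"]}
    return min(permitted, key=lambda name: permitted[name]["deaths"])

print(consequentialist_choice(options))  # pull the lever (1 death beats 5)
print(constrained_choice(options))       # do nothing (the act itself is barred)
```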

The trolley dilemma has since proven itself to be a remarkably flexible tool for probing our moral intuitions, and has been adapted to apply to various other scenarios, such as war, torture, drones, abortion and euthanasia.

The trolley dilemma explored: would you kill one person to save five?

See also your correspondent’s favorite examination of the issue, from The Good Place:

 

Indeed, Michael “resolves” the dilemma in a way that would make today’s Almanac honoree (below) proud:

 

For an earlier look at The Trolley Problem on (Roughly) Daily, see “Educating the mind without educating the heart is no education at all”*…

[Image at top from Devine Lu Linvega (or @neauoire@merveilles.town)]

* Michael, The Good Place, Season Two, Episode Five: “The Trolley Problem”

###

As we muse on morality, we might spare a thought for John Bordley Rawls; he died on this date in 2002 (Spinoza’s birthday and the anniversary of Darwin’s publication of On the Origin of Species).  A moral and political philosopher, Rawls argued for “justice as fairness,” recommending equal basic rights, equality of opportunity, and the promotion of the interests of the least advantaged members of society.  He made these social justice arguments using a thought experiment he called the “original position,” in which people select what kind of society they would choose to live under as if they did not know which social position they would personally occupy.

Rawls received both the Schock Prize for Logic and Philosophy and the National Humanities Medal in 1999, the latter presented by President Bill Clinton, in recognition of the way in which Rawls’ work “helped a whole generation of learned Americans revive their faith in democracy itself.”  He is widely considered the most important political philosopher of the 20th century– with the unusual distinction among contemporary political philosophers of being frequently cited by the courts of law in the United States and Canada and referenced by practicing politicians in the United States and the UK.

A concept central to the “original position” approach to moral dilemmas is Rawls’ notion of a “veil of ignorance”: we decide the outcome without knowing which character we “are” in the situation… an approach that leads Michael to his conclusion in The Good Place.  The “Golden Rule” strikes again!  [see also “The Unselfish Trolley Problem”]
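Rawls’ procedure is concrete enough to simulate. A toy sketch (the candidate societies and welfare numbers are invented for illustration, not Rawls’ own): behind the veil you don’t know which position you will occupy, and Rawls argues the rational choice is maximin, judging each society by its worst-off position, where a utilitarian would take the average.

```python
# Choosing a society from behind the veil of ignorance. The societies
# and welfare figures below are made up for illustration.
societies = {
    "laissez-faire": [1, 2, 5, 40],    # welfare of each social position
    "egalitarian":   [9, 10, 10, 11],
    "mixed":         [6, 8, 12, 20],
}

def maximin(welfares):
    # Rawls: a society is only as good as the lot of its worst-off member.
    return min(welfares)

def average(welfares):
    # The utilitarian rival: maximize expected welfare across positions.
    return sum(welfares) / len(welfares)

print(max(societies, key=lambda s: maximin(societies[s])))   # egalitarian
print(max(societies, key=lambda s: average(societies[s])))   # laissez-faire
```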

source

 

“Men of broader intellect know that there is no sharp distinction betwixt the real and the unreal”*…

 

Tesseract

Colored cubes — known as “Tesseracts” — as depicted in the frontispiece to Hinton’s The Fourth Dimension (1904)

 

During the period we now call the fin de siècle, worlds collided. Ideas were being killed off as much as being born. And in a sort of Hegelian logic of thesis/antithesis/synthesis, the most interesting ones arose as the offspring of wildly different parents. In particular, the last gasp of Victorian spirituality infused cutting-edge science with a certain sense of old-school mysticism. Theosophy was all the rage; Huysmans dragged Satan into modern Paris; and eccentric poets and scholars met in the British Museum Reading Room under the aegis of the Golden Dawn for a cup of tea and a spot of demonology. As a result of all this, certain commonly-accepted scientific terms we use today came out of quite weird and wonderful ideas being developed at the turn of the century. Such is the case with space, which fascinated mathematicians, philosophers, and artists with its unfathomable possibilities…

In April 1904, C. H. Hinton published The Fourth Dimension, a popular maths book based on concepts he had been developing since 1880 that sought to establish an additional spatial dimension to the three we know and love. This was not understood to be time as we’re so used to thinking of the fourth dimension nowadays; that idea came a bit later. Hinton was talking about an actual spatial dimension, a new geometry, physically existing, and even possible to see and experience; something that linked us all together and would result in a “New Era of Thought.”…
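The object Hinton had in mind is easy to build, if not to see. A minimal sketch (the projection constants are arbitrary choices, not Hinton’s): the sixteen vertices of a tesseract sit at every 0/1 combination of four coordinates, an edge joins any two vertices that differ in exactly one coordinate, and viewing the result at all means squashing four dimensions down to two.

```python
from itertools import product

# Every vertex of the 4-cube: (0,0,0,0), (0,0,0,1), ..., (1,1,1,1)
vertices = list(product([0, 1], repeat=4))

# An edge joins two vertices that differ in exactly one coordinate.
edges = [(a, b) for a in vertices for b in vertices
         if a < b and sum(x != y for x, y in zip(a, b)) == 1]

print(len(vertices), "vertices,", len(edges), "edges")  # 16 vertices, 32 edges

# A crude orthographic projection to 2D, offset by the two dropped
# coordinates so the "inner" and "outer" cubes remain distinguishable.
def project(v, s=0.35):
    x, y, z, w = v
    return (x + s * z + s * s * w, y + s * z - s * s * w)

for v in vertices:
    print(v, "->", project(v))
```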

Hinton’s ideas gradually pervaded the cultural milieu over the next thirty years or so — prominently filtering down to the Cubists and Duchamp. The arts were affected by two distinct interpretations of higher dimensionality: on the one hand, the idea as a spatial, geometric concept is readily apparent in early Cubism’s attempts to visualise all sides of an object at once, while on the other hand, it becomes a kind of all-encompassing mystical codeword used to justify avant-garde experimentation. “This painting doesn’t make sense? Ah, well, it does in the fourth dimension…” It becomes part of a language for artists exploring new ideas and new spaces…

By the late 1920s, Einsteinian Space-Time had more or less replaced the spatial fourth dimension in the minds of the public. It was a cold yet elegant concept that ruthlessly killed off the more romantic idea of strange dimensions and impossible directions. What had once been the playground of spiritualists and artists was all too convincingly explained. As hard science continued to rise in the early decades of the twentieth century, the fin-de-siècle’s more outré ideas continued to decline. Only the Surrealists continued to make reference to it, as an act of rebellion and vindication of the absurd. The idea of a real higher dimension linking us together as One sounded all a bit too dreamy, a bit too old-fashioned for a new century that was picking up speed, especially when such vague and multifarious explanations were trumped by the special theory of relativity. Hinton was as much hyperspace philosopher as scientist and hoped humanity would create a more peaceful and selfless society if only we recognised the unifying implications of the fourth dimension. Instead, the idea was banished to the realms of New Age con-artists, reappearing these days updated and repackaged as the fifth dimension. Its shadow side, however, proved hopelessly alluring to fantasy writers who have seen beyond the veil, and bring back visions of horror from an eldritch land outside of time and space that will haunt our nightmares with its terrible geometry, where tentacles and abominations truly horrible sleep beneath the Pacific Ocean waiting to bring darkness to our world… But still we muddle on through.

For more on hyperspace, tesseracts, ghosts, and colorful cubes, see Jon Crabb, Editor, British Library Publishing, on the work of Charles Howard Hinton and the cultural history of higher dimensions: “Notes on the Fourth Dimension.”

[TotH to MK]

* H.P. Lovecraft, The Tomb

###

As we get high(er), we might recall that it was on this date in 1946 that Al Gross went public with his invention of the walkie talkie.  Gross had developed it as a top secret project during World War II; he went on to develop the circuitry that opened the way to personal pocket paging systems, CB radio, and patented precursors of the cell phone and the cordless phone.  Sadly for him, his patents expired before they became commercially viable.  “Otherwise,” Gross said, after winning the M.I.T. lifetime achievement award, “I’d be as rich as Bill Gates.”

While Gross himself is almost unknown to the general public, he did achieve one-step-removed notoriety in 1948 when he “gifted” his friend Chester Gould the concept of miniaturized radio transceivers, which Gross had just patented.  Gould put it to use as the two-way wrist radio in his comic strip Dick Tracy.

source

 

Written by LW

November 20, 2019 at 1:01 am

“Nothing is at last sacred but the integrity of your own mind”*…

 

mind internet

 

Imagine that a person’s brain could be scanned in great detail and recreated in a computer simulation. The person’s mind and memories, emotions and personality would be duplicated. In effect, a new and equally valid version of that person would now exist, in a potentially immortal, digital form. This futuristic possibility is called mind uploading. The science of the brain and of consciousness increasingly suggests that mind uploading is possible – there are no laws of physics to prevent it. The technology is likely to be far in our future; it may be centuries before the details are fully worked out – and yet given how much interest and effort is already directed towards that goal, mind uploading seems inevitable. Of course we can’t be certain how it might affect our culture but as the technology of simulation and artificial neural networks shapes up, we can guess what that mind uploading future might be like.

Suppose one day you go into an uploading clinic to have your brain scanned. Let’s be generous and pretend the technology works perfectly. It’s been tested and debugged. It captures all your synapses in sufficient detail to recreate your unique mind. It gives that mind a standard-issue, virtual body that’s reasonably comfortable, with your face and voice attached, in a virtual environment like a high-quality video game. Let’s pretend all of this has come true.

Who is that second you?

Princeton neuroscientist, psychologist, and philosopher Michael Graziano explores: “What happens if your mind lives forever on the internet?”

* Ralph Waldo Emerson, Self-Reliance

###

As we ponder presence, we might spare a thought for William “Willy” A. Higinbotham; he died on this date in 1994.  A physicist who was a member of the team that developed the first atomic bomb, he later became a leader in the nuclear non-proliferation movement.

But Higinbotham may be better remembered as the creator of Tennis for Two— the first interactive analog computer game, one of the first electronic games to use a graphical display, and the first to be created as entertainment (as opposed to as a demonstration of a computer’s capabilities).  He built it for the 1958 visitor day at Brookhaven National Laboratory.

It used a small analogue computer with ten direct-connected operational amplifiers and output a side view of the curved flight of the tennis ball on an oscilloscope only five inches in diameter. Each player had a control knob and a button.
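Those op-amps were wired as analog integrators: constant gravitational acceleration integrated into velocity, and velocity into position, traced the ball’s arc on the scope. A digital re-imagining of the same computation (the constants are illustrative; nothing here is taken from Higinbotham’s actual circuit):

```python
# Discrete-time Euler integration of the ball's flight -- numerically
# doing what Tennis for Two's op-amp integrators did in analog.
G = -9.8          # gravity, m/s^2 (illustrative)
DT = 0.02         # time step, s

def fly(x, y, vx, vy, steps=200):
    """Integrate the ball's curved flight, bouncing off the court (y=0)."""
    points = []
    for _ in range(steps):
        vy += G * DT          # integrate acceleration into velocity
        x += vx * DT          # integrate velocity into position
        y += vy * DT
        if y < 0:             # bounce: reflect, damping vertical velocity
            y, vy = -y, -0.7 * vy
        points.append((round(x, 2), round(y, 2)))
    return points

# Serve from the left at 10 m/s horizontally, 5 m/s upward.
trajectory = fly(x=0.0, y=1.0, vx=10.0, vy=5.0)
print(trajectory[:5])
```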

 source

The 1958 Tennis for Two exhibit

source

 

Written by LW

November 10, 2019 at 1:01 am

“One must not think slightingly of the paradoxical”*…

 

Argo

The Building of the Argo, by Antoon Derkinderen, c. 1901. Rijksmuseum.

 

The thought problem known as the ship of Theseus first appears in Plutarch’s Lives, a series of biographies written in the first century. In one vignette, Theseus, founder-hero of Athens, returns victorious from Crete on a ship that the Athenians went on to preserve.

They took away the old planks as they decayed, putting in new and stronger timber in their place, insomuch that this ship became a standing example among the philosophers for the logical question of things that grow; one side holding that the ship remained the same, and the other contending that it was not the same…

Of course, the conundrum of how things change and stay the same has been with us a lot longer than Plutarch. Plato, and even pre-Socratics like Heraclitus, dealt in similar questions. “You can’t step in the same river twice,” a sentiment found on inspirational Instagram accounts, is often attributed to Heraclitus. His actual words—“Upon those who step into the same rivers, different and again different waters flow”—might not be the best Instagram fodder but, figuratively at least, provided the waters that the ship of Theseus later sailed.

Two thousand years later the ship is still bobbing along, though some of its parts have been replaced. Now known colloquially as Theseus’ paradox, in the U.S. the idea sometimes appears as “Washington’s ax.” While not as ancient as the six-thousand-year-old stone ax discovered last year at George Washington’s estate, the age-old question remains: If Washington’s ax were to have its handle and blade replaced, would it still be the same ax? The same has been asked of a motley assortment of items around the world. In Hungary, for example, there is a similar fable involving the statesman Kossuth Lajos’ knife, while in France it’s called Jeannot’s knife.

This knife, that knife, Washington’s ax—there’s even a “Lincoln’s ax.” We don’t know where these stories originated. They likely arose spontaneously and had nothing to do with the ancient Greeks and their philosophical conundrums. The only thing uniting these bits of folklore is that the same question was asked: Does a thing remain the same after all its parts are replaced? In the millennia since the ship of Theseus set sail, some notions that bear its name have less in common with the original than do the fables of random axes and knives, while other frames for this same question threaten to replace the original entirely.

One such version of this idea is attributed to Enlightenment philosopher John Locke, who proffered his sock as an example. An exhibit called Locke’s Socks at Pace University’s now-defunct Museum of Philosophy served to demonstrate. On one wall, six socks were hung: the first a cotton sports sock, the last made only of patches. A museum guide, according to a New York Times write-up, asked a room full of schoolchildren, “Assume the six socks represent a person’s sock over time. Can we say that a sock which is finally all patches, with none of the original material, is the same sock?”

The question could be asked of Theseus’ paradox itself. Can it be said that a paradox about a ship remains the same if the ship is replaced with a knife or a sock? Have we lost anything from Theseus’ paradox if instead we start calling it “the Locke’s Sock paradox”?…

Is a paradox still the same after its parts have been replaced?  A consideration: “Restoring the Ship of Theseus.”
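Programmers inherit the puzzle as the distinction between identity and equality. A minimal sketch in Python (the “planks” are of course illustrative): replace every element of a list in place, and the language endorses both ancient factions at once, "is" (same object) siding with one and "==" (same contents) with the other.

```python
# Build the ship, keep a second name for the same object, and snapshot
# the original planks before any repairs begin.
ship = ["plank-%d" % i for i in range(8)]
original = ship                 # another name for the very same object
old_contents = list(ship)       # a copy of the original planks

for i in range(len(ship)):
    ship[i] = "new-plank-%d" % i   # swap planks one by one, in place

print(ship is original)         # True:  one side -- it is the same ship
print(ship == old_contents)     # False: the other -- not one original part remains
```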

* Soren Kierkegaard

###

As we contemplate change, we might spare a reasoned thought for the Enlightenment giant (and sock darner) John Locke; the physician and philosopher died on this date in 1704.  An intellectual descendant of Francis Bacon, Locke was among the first empiricists. He spent over 20 years developing the ideas he published in his most significant work, Essay Concerning Human Understanding (1690), an analysis of the nature of human reason which promoted experimentation as the basis of knowledge.  Locke established “primary qualities” (e.g., solidity, extension, number) as distinct from “secondary qualities” (sensuous attributes like color or sound). He recognized that science is made possible when the primary qualities, as apprehended, create ideas that faithfully represent reality.

Locke is, of course, also well-remembered as a key developer (with Hobbes, and later Rousseau) of the concept of the Social Contract.  Locke’s theory of “natural rights” influenced Voltaire and Rousseau– and formed the intellectual basis of the U.S. Declaration of Independence.

source

 

 

Written by LW

October 28, 2019 at 1:01 am

“All that is solid melts into air”*…

 

misinfo

 

Ideas replaced with feelings. A radical relativism that implies truth is unknowable. Politicians who revel in lying openly, shamelessly, as if being caught out is the point of politics. The notion of the people and the many redefined ceaselessly, words unmoored from meaning, ideas of the future dissolving into nasty nostalgias with enemies everywhere, conspiracy replacing ideology, facts equated to fibs, discussion collapsing into mutual accusations, where every argument is just another smear campaign, all information warfare … and the sense that everything under one’s feet is constantly moving, inherently unstable, liquid …

Almost a decade ago I left Russia because I was exhausted by living in a system where, to quote myself invoking Hannah Arendt, “nothing is true and everything is possible.” Those were still relatively vegetarian days in Moscow — before the invasion of Ukraine — but it was already a world where terms like liberal or democracy were used to mean their opposite, where paranoia was increasingly replacing reasoned argument, and where spectacle had pushed out sense. You were left with only gut feelings to lead your way through the fog of disinformation. I returned to the thing once known as “the West,” living in London and often working in the United States, because, in the words of my naïve self, I wanted to live in a world where “words have meaning,” where facts were not dismissed as “just information war.” Russia seemed a country unable to come to terms with the loss of the Cold War, or with any of the traumas of the 20th century. It was ultimately, I thought, a sideshow, a curio pickled in its own agonies. Russians stressed this themselves: in Western Europe, America, things are “normalno” they would tell me. If you have the chance, that is where you send your wives, children, money … to “normalnost.”

Back in the West, however, I soon noticed things that reminded me of Moscow…

Peter Pomerantsev in an essay from his new book, This Is Not Propaganda: Adventures in the War Against Reality: “Normalnost.”

Pair with his essay “The Info War of All Against All” and this review of his book.

[image above: source]

* Karl Marx and Friedrich Engels, The Communist Manifesto

###

As we get down with Diogenes, we might expect little or no help from today’s birthday boy, Henri-Louis Bergson; he was born on this date in 1859.  A philosopher especially influential in the first half of the 20th Century, Bergson convinced many of the primacy of immediate experience and intuition over rationalism and science for understanding reality… many, but not the likes of Wittgenstein, Russell, Moore, and Santayana, who thought that he willfully misunderstood the scientific method in order to justify his “projection of subjectivity onto the physical world.”  Still, in 1927 Bergson won the Nobel Prize (in Literature); and in 1930, received France’s highest honor, the Grand-Croix de la Legion d’honneur.

Bergson’s influence waned mightily later in the century.  To the extent that there’s been a bit of a resurgence of interest, it’s largely the result, in philosophical circles, of Gilles Deleuze’s appropriation of Bergson’s concept of “multiplicity” and his treatment of duration, which Deleuze used in his critique of Hegel’s dialectic; and, in the religious and spiritualist studies communities, of Bergson’s seeming embrace of the concept of an overriding/underlying consciousness in which humans participate.

 source

 

 

Written by LW

October 18, 2019 at 1:01 am

“If a man will begin with certainties, he shall end in doubts; but if he will be content to begin with doubts, he shall end in certainties”*…

 

referee

 

The motivation for using video review in sports is obvious: to get more calls right. This seems like an easy enough mission to fulfill, but anyone who has spent even a little time watching sports on TV can attest to the fact that the application of video review is not so simple. In most sports where it is applied, video review has actually created more confusion and less clarity. Why is this the case? Follow me into an examination of thousands of years of philosophical discourse, and we will find the answer together, my friends.

The root problem with video review is that it is so often used to make decisions based on rules that contain an inherent level of vagueness. For example, according to the NFL’s current catch rule (Rule 8, Section 1, Article 3a) an inbounds player must secure “control of the ball in his hands or arms prior to the ball touching the ground” in order to complete a catch. The term “control” in that rule is vague. There are borderline cases of controlling a football, which means boundaries for when the term “control” can and can’t be applied are fuzzy ones.

Philosophers have been dealing with the problems posed by vagueness since at least the 4th Century BC, because the problems that vagueness causes aren’t limited to the NFL’s struggle with the catch rule. Vagueness also has important implications for metaphysics, the philosophy of language, and our understanding of the nature of truth and the foundations of logic…
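The bind is easy to reproduce. A sketch (the “control score” below is a made-up quantity, not anything the NFL measures): force a vague predicate through the crisp yes/no answer a rulebook demands, and the verdict flips on differences near the cutoff that no camera resolution can make meaningful.

```python
# A vague predicate squeezed through a crisp cutoff. Any cutoff is
# arbitrary: the vagueness lives in the rule, not in the footage.
CUTOFF = 0.5

def is_catch(control_score: float) -> bool:
    # The rulebook demands a binary verdict, so we must impose a line.
    return control_score >= CUTOFF

for score in (0.10, 0.49, 0.499, 0.501, 0.51, 0.90):
    print(f"score={score:.3f} -> catch: {is_catch(score)}")
# Clear cases (0.10, 0.90) are easy; the borderline ones change verdict
# on differences no amount of replay can make meaningful.
```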

Into the rabbit hole of certainty at “A Philosopher’s Definitive (And Slightly Maddening) Case Against Replay Review.”

* Francis Bacon, The Oxford Francis Bacon IV: The Advancement of Learning

###

As we correct our concept of correctness, we might send antiseptic birthday greetings to Earle Dickson; he was born on this date in 1892.  Dickson, concerned that his wife, Josephine Knight, often cut herself while doing housework and cooking, devised a way that she could easily apply her own dressings.  He prepared ready-made bandages by placing squares of cotton gauze at intervals along an adhesive strip and covering them with crinoline.  In the event, all his wife had to do was cut off a length of the strip and wrap it over her cut.  Dickson, who worked as a cotton buyer at Johnson & Johnson, took his idea to his employer… and the Band-Aid was born.

 source

 

 

Written by LW

October 10, 2019 at 1:01 am

“But enough about me, let’s talk about you…. what do you think of me?”*…

 

Century

 

The 21st century is the most important century in human history.

At least that’s what a number of thinkers say. Their argument is pretty simple: Mostly, it’s that there are huge challenges that we have to surmount this century to get any future at all, making this the most consequential of all centuries so far. Furthermore, a solution to those challenges would likely mean a future farther from the brink of destruction — which makes this century more pivotal than future centuries, too…

[The case for this century as most important, unpacked]

Sure, we have some pretty good arguments for the importance of our era. But … doesn’t everybody? Are the arguments for the 21st century really that much stronger than the arguments for the 1st century, or for centuries yet to come?

Under this view, sure, we have some serious challenges ahead of us. But it’s a mistake to think we’re in a unique moment in history. There’s every reason to think that the challenges faced in future centuries will be as significant…

[The cases for other periods as most important, explored]

But it’s not just an abstract philosophy argument… If this is the crucial moment in human history, foundations that’ll be around for centuries aren’t a top priority. If humanity’s biggest problems are best left to our grandchildren and their grandchildren, then it doesn’t seem so strange to try to set up enduring human institutions with the power to influence successive generations. If this is the critical moment, the balance of our efforts should probably be spent less on long-term priorities questions and more on action — like political efforts to reverse course on dangerous human activities, and research on how to mitigate the immediate dangers of present threats…

A philosophic argument over the relative importance of our present era (and why it matters): “Is this the most important century in human history?”

(Your correspondent would note that, even if this is the most important century in human history so far, it doesn’t follow that there won’t be centuries at least as important to come…  so it’s surely prudent to invest energy and resources in ensuring that foundational institutions and infrastructure, options, and resources are available to our successors…)

* variously attributed to “CC Bloom” (Bette Midler in Beaches) and NYC Mayor Ed Koch, among others

###

As we take the long view, we might recall that it was on this date in 1520 that Suleiman I (AKA Suleiman the Magnificent) was proclaimed Sultan of the Ottoman Empire.  The tenth and longest-reigning Sultan, he ruled over at least 25 million people at the apex of Ottoman economic, military, and political power, and at the broadest extent of the empire’s geographical reach.  A distinguished poet and goldsmith, he also became a great patron of culture, overseeing the Golden Age of the Ottoman Empire in its artistic, literary, and architectural development.

source

 

 

Written by LW

September 30, 2019 at 1:01 am
