(Roughly) Daily


“Ultimately, it is the desire, not the desired, that we love”*…

Or is it? The web– and the world– are awash in talk of the Mimetic Theory of Desire (or Rivalry, as its creator, René Girard, would also have it). Stanford professor (and Philosophy Talk co-host) Joshua Landy weighs in with a strong word of caution…

Here are two readings of Shakespeare’s Hamlet. Which do you think we should be teaching in our schools and universities?

Reading 1. Hamlet is unhappy because he, like all of us, has no desires of his own, and therefore has no being, properly speaking. The best he can do is to find another person to emulate, since that’s the only way anyone ever develops the motivation to do anything. Shakespeare’s genius is to show us this life-changing truth.

Reading 2. Hamlet is unhappy because he, like all of us, is full of body thetans, harmful residue of the aliens brought to Earth by Xenu seventy-five million years ago and disintegrated using nuclear bombs inside volcanoes. Since it is still some time until the practice of auditing comes into being, Hamlet has no chance of becoming “clear”; it is no wonder that he displays such melancholy and aimlessness. Shakespeare’s genius is to show us this life-changing truth.

Whatever you make of the first, I’m rather hoping that you feel at least a bit uncomfortable with the second. If so, I have a follow-up question for you: what exactly is wrong with it? Why not rewrite the textbooks so as to make it our standard understanding of Shakespeare’s play? Surely you can’t fault the logic behind it: if humans have indeed been full of body thetans since they came into existence, and Hamlet is a representation of a human being, Hamlet must be full of body thetans. What is more, if everyone is still full of body thetans, then Shakespeare is doing his contemporaries a huge favor by telling them, and the new textbooks will be doing us a huge favor by telling the world. Your worry, presumably, is that this whole body thetan business is just not true. It’s an outlandish hypothesis, with nothing whatsoever to support it. And since, as Carl Sagan once said, “extraordinary claims require extraordinary evidence,” we would do better to leave it alone.

I think you see where I’m going with this. The fact is, of course, that the first reading is just as outlandish as the second. As I’m about to show (not that it should really need showing), human beings do have desires of their own. That doesn’t mean that all our desires are genuine; it’s always possible to be suckered into buying a new pair of boots, regardless of the fact that they are uglier and shoddier than our old ones, just because they are fashionable. What it means is that some of our desires are genuine. And having some genuine desires, and being able to act on them, is sufficient for the achievement of authenticity. For all we care, Hamlet’s inky cloak could be made by Calvin Klein, his feathered hat by Diane von Furstenberg; the point is that he also has motivations (to know things, to be autonomous, to expose guilt, to have his story told accurately) that come from within, and that those are the ones that count.

To my knowledge, no one in the academy actually reads Hamlet (or anything else) the second way. But plenty read works of literature the first way. René Girard, the founder of the approach, was rewarded for doing so with membership in the Académie française, France’s elite intellectual association. People loved his system so much that they established a Colloquium on Violence and Religion, hosted by the University of Innsbruck, complete with a journal under the ironically apt name Contagion. More recently, Peter Thiel, the co-founder of PayPal, loved it so much that he sank millions of dollars into Imitatio, an institute for the dissemination of Girardian thought. And to this day, you’ll find casual references to the idea everywhere, from people who seem to think it’s a truth, one established by René Girard. (Here’s a recent instance from the New York Times opinion pages: “as we have learned from René Girard, this is precisely how desires are born: I desire something by way of imitation, because someone else already has it.”) All of which leads to an inevitable question: what’s the difference between Girardianism and Scientology? Why has the former been more successful in the academy? Why is the madness of theory so, well, contagious?…

Are we really dependent on others for our desires? Does that mechanism inevitably lead to rivalry, scapegoating, and division? @profjoshlandy suggests not: “Deceit, Desire, and the Literature Professor: Why Girardians Exist,” in @StanfordArcade. Via @UriBram in @TheBrowser. Eminently worth reading in full.

* Friedrich Nietzsche (an inspiration to Girard)

###

As we tease apart theorizing, we might spare a thought for William Whewell; he died on this date in 1866. A scientist, Anglican priest, philosopher, theologian, and historian of science, he was Master of Trinity College, Cambridge.

At a time when specialization was increasing, Whewell was renowned for the breadth of his work: he published in the disciplines of mechanics, physics, geology, astronomy, and economics, while also finding the time to compose poetry, author a Bridgewater Treatise, translate the works of Goethe, and write sermons and theological tracts. In mathematics, Whewell introduced what is now called the Whewell equation, defining the shape of a curve without reference to an arbitrarily chosen coordinate system. He founded mathematical crystallography and developed a revision of Friedrich Mohs’s classification of minerals. And he organized thousands of volunteers internationally to study ocean tides, in what is now considered one of the first citizen science projects.
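For the mathematically curious, here is the idea in a line (in modern notation, not Whewell’s own): an intrinsic– or “Whewell”– equation describes a plane curve purely as a relation between arc length s and tangential angle φ,

s = f(φ)

so that, for instance, a circle of radius a is simply s = aφ – no coordinate axes required.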

But some argue that Whewell’s greatest gift to science was his wordsmithing: he created the words scientist and physicist by analogy with the word artist; they soon replaced the older term natural philosopher. He also named linguistics, consilience, catastrophism, uniformitarianism, and astigmatism.

Other useful words were coined to help his friends: biometry for John Lubbock; Eocene, Miocene, and Pliocene for Charles Lyell; and for Michael Faraday, electrode, anode, cathode, diamagnetic, paramagnetic, and ion (whence the sundry other particle names ending in -ion).

source

“The nature of an innovation is that it will arise at a fringe”*…

[image: fringe]

Alternative media outlets of the Left and Right have become a crucial supplement to our knowledge of the world, providing those perspectives usually ignored by our mainstream media...

From the masthead of the aggressively inclusive site that means to make those views available, The Unz Review.

The collection is sufficiently vast that your correspondent cannot guarantee against any bias in its eclecticism (indeed, he notes that it is the work of Ron Unz).  Still, it’s a remarkable aggregation of theory, opinion, and reportage, from what seems a broad array of points of view.

Readers are advised to steel themselves, take a deep breath…  then dive in.

Pair with Wikipedia’s Fringe Theory page– and, perhaps more interesting still, their explanation in their editorial guidelines of how they identify and classify fringe theories.

[Image above: source]

* “The nature of an innovation is that it will arise at a fringe where it can afford to become prevalent enough to establish its usefulness without being overwhelmed by the inertia of the orthodox system.”  – Kevin Kelly

###

As we iris out, we might recall that it was on this date in 1804 that the Corps of Discovery– better known today as the Lewis and Clark Expedition– left Camp Dubois, near Wood River, Illinois, commencing a trek of more than two years on which it became the first American expedition to cross what is now the western portion of the United States.

President Thomas Jefferson had commissioned the expedition shortly after the Louisiana Purchase (in 1803) to explore and to map the newly acquired territory, to find a practical route across the western half of the continent– a Northwest Passage– and to establish an American presence in this territory before Britain and other European powers tried to claim it.

source

Meriwether Lewis and William Clark

source


Written by (Roughly) Daily

May 14, 2020 at 1:01 am

“Anyone who lives within their means suffers from a lack of imagination”*…


Modern Monetary Theory’s basic principle seems blindingly obvious: Under a fiat currency system, a government can print as much money as it likes. As long as a country can mobilize the necessary real resources of labor, machinery, and raw materials, it can provide public services. Our fear of deficits, according to MMT, comes from a profound misunderstanding of the nature of money.

Every five-year-old understands money. It’s what you give the nice lady before she hands you the ice cream cone—an object with intrinsic value that can be redeemed for goods or services. Through the lens of Modern Monetary Theory, however, a dollar is nothing but a liability issued by the US government, which promises to accept it back in payment of taxes. The dollar in your pocket represents a debt owed you by the federal government. Money isn’t a lump of gold but rather an IOU.

This mildly metaphysical distinction ends up having huge practical consequences. It means the federal government, unlike you and me, can’t run out of cash. It can run out of things money can buy—which will drive up their price and be manifest in inflation—but it can’t run out of money. As Sam Levey, a graduate student in economics who tweets under the name Deficit Owls, told me, “Macy’s can’t run out of Macy’s gift certificates.”

Especially for those who want the government to provide more services to citizens, this is a convincing argument, and one that can be understood by non-economists…
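To make the mechanism concrete, here’s a minimal toy sketch in Python– with entirely made-up numbers, not drawn from any actual MMT model– of a currency issuer that faces a real-resource ceiling rather than a financing constraint: it can always issue the money, but spending past capacity shows up as a rising price level, never as insolvency.

  # Toy model (illustrative only): a fiat issuer never runs out of its own
  # currency, but spending beyond real capacity bids up prices instead.
  def spend(desired, real_capacity, price_level):
      issued = desired                     # no financing constraint
      if desired > real_capacity:          # more demand than labor/materials
          excess = desired - real_capacity
          price_level *= 1 + excess / real_capacity
      return issued, price_level

  price = 1.0
  for outlay in (80, 100, 130):            # real capacity fixed at 100
      issued, price = spend(outlay, 100, price)
      print(f"spent {issued}, price level now {price:.2f}")

The only point of the sketch is the asymmetry: issued always equals desired, and price_level is what absorbs any overreach– Macy’s can print gift certificates without limit; redeeming more of them than there is merchandise is the real constraint.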

Everyone knows governments need to tax before they can spend. What Modern Monetary Theory presupposes is, maybe they don’t.  Offered for interest (and with no endorsement): “The Radical Theory That the Government Has Unlimited Money.”

* Oscar Wilde

###

As we crank up the printing press, we might recall that it was on this date in 2009, several months into the Great Recession, that President Barack Obama met with the CEOs of America’s 13 largest financial institutions to discuss a path out of the economic trough into which the U.S. had descended.  Finding them suspicious of his new (Democratic) administration and worried that he would be less generous to their companies than President Bush and his administration had been, Obama opened by suggesting…

My administration is the only thing between you and the pitchforks… But you need to show that you get that this is a crisis and that everyone has to make some sacrifices… I’m not out there to go after you. I’m protecting you. But if I’m going to shield you from public and congressional anger, you have to give me something to work with on these issues of compensation. Help me help you. Everybody has to pitch in. We’re all in this together.

The result was a series of compromises that survived the Obama Administration, but that are now being systematically undone under the Trump Administration.

See also “13 Bankers.”

Kenneth D. Lewis, the chief executive of Bank of America, with other bank executives outside the White House after the meeting with President Obama

source


Written by (Roughly) Daily

March 27, 2018 at 1:01 am

“A few scribbles on a blackboard… can change the course of human affairs”*…


What’s the most transformative piece of technology in U.S. classrooms? Smart boards? Laptops? In a 2000 paper on computers in education, Steven D. Krause argues that it’s one that’s been around for nearly two centuries: the blackboard. And he suggests that if we want to understand how teachers adopt technology, we might want to study its history.

To understand the impact of blackboards, Krause writes, we need to consider what schools were like before them. Around 1800, most U.S. schools were one-room log buildings with a fireplace at one end and a single window at the other. “Writing lessons” generally meant students working on their own, whittling goose-quill pens and copying out texts.

When chalkboards first arrived in the early nineteenth century, they came as a revelation to teachers and education experts. In 1841, one educator declared that the blackboard’s unknown inventor “deserves to be ranked among the best contributors to learning and science, if not among the greatest benefactors of mankind.” Around the same time, another writer praised blackboards for “reflecting the workings, character and quality of the individual mind.”

It’s important to remember that school budgets and student-teacher ratios in the early nineteenth century would seem ludicrous to a modern school district. One teacher might be responsible for hundreds of students, with very little funding for supplies.

Krause writes that one prominent way of using the blackboard to improve education under these circumstances was known as the Lancasterian method, after British educator Joseph Lancaster. Lancaster prescribed particular ways of physically arranging the classroom so that a teacher could work with a large group all at once…

The whole dusty story at “How blackboards transformed American education.”  Read Krause’s paper, “‘Among the Greatest Benefactors of Mankind’: What the Success of Chalkboards Tells Us about the Future of Computers in the Classroom,” here.

* Stanislaw Ulam

###

As we clean the erasers, we might send repetitive-but-instructive birthday greetings to Edwin Ray Guthrie; he was born on this date in 1886.  A philosopher and mathematician by training, he became a leading behavioral psychologist, specializing in the psychology of learning and, more specifically, in the role association plays in acquiring skills.  He’s probably best remembered for his belief that all learning is based on a stimulus-response association, instantiated in his Law of Contiguity, which held that “a combination of stimuli which has accompanied a movement, will on its recurrence tend to be followed by that movement.”  Movements are, he argued, small stimulus-response combinations; these movements make up an act.  Thus, a learned behavior– an act that is consolidated by the learner so that it can be repeated– is, at its root, a series of movements.  Guthrie believed that what is learned are the movements (of which the behaviors are simply a result).

source


Written by (Roughly) Daily

January 9, 2018 at 1:01 am