(Roughly) Daily

Posts Tagged ‘Psychology’

“a tale told by an idiot, full of sound and fury, signifying nothing”*…

 

nihilism

 

Nihilism, not unlike time (according to Augustine) or porn (according to the U.S. Supreme Court), is one of those concepts that we are all pretty sure we know the meaning of unless someone asks us to define it. Nihil means “nothing.” -ism means “ideology.” Yet when we try to combine these terms, the combination seems to immediately refute itself, as the idea that nihilism is the “ideology of nothing” appears to be nonsensical. To say that this means that someone “believes in nothing” is not really much more helpful, as believing in something suggests there is something to be believed in, but if that something is nothing, then there is not something to be believed in, in which case believing in nothing is again a self-refuting idea.

It is easy therefore to fall into the trap of thinking “Everything is nihilism!” which of course leads to thinking “Nothing is nihilism!” Thus in order to preserve nihilism as a meaningful concept, it is necessary to distinguish it from concepts that are often associated with it but are nevertheless different, concepts such as pessimism, cynicism, and apathy…

The varieties of negativity: “What Nihilism Is Not.”

* Shakespeare, Macbeth

###

As we dabble in the dark, we might send existentially thrilling birthday greetings to Patricia Highsmith; she was born on this date in 1921.  Dubbed “the poet of apprehension” by novelist Graham Greene, she wrote 22 novels and numerous short stories (over two dozen of which have been adapted to film) in a career that spanned five decades.

For example, her first novel, Strangers on a Train, has been adapted for stage and screen numerous times, notably by Alfred Hitchcock in 1951; her 1955 novel The Talented Mr. Ripley has been adapted several times for film, theatre, and radio.  Writing under the pseudonym “Claire Morgan”, Highsmith published the first lesbian novel with a happy ending, The Price of Salt, in 1952, republished 38 years later as Carol under her own name and later adapted into a 2015 film.

source

 

“Right now I’m having amnesia and déjà vu at the same time. I think I’ve forgotten this before”*…

 


Mnemosyne, the Greek goddess of Memory, c. 100

 

We think of memory as something internal—we remember with our minds (or, for the materialists among us, our brains). But human history is cluttered with attempts to externalize memory by encoding it onto objects and images. We have built models and systems to help us organize, keep track of, and recall information. These techniques are part of what the ancient Greeks called artificial memory. For the Greeks, natural memory encompassed those things a person happened to remember, and artificial memory consisted of recollections a person buttressed through preparation and effort. Artificial memory was a skill that could be learned and improved upon, one that had its own art: the ars memoriae, or art of memory.

The anthropologist Drew Walker reminds us that so-called mnemonic devices are not objects that stand alone but are instead “part of action.” These memory aids cannot fully store information the way writing does; they work only if you have already memorized the related material. Yet even as mere prompts or catalysts, they serve as crucial technologies for preserving and passing on histories, cultural practices, and learned wisdom.

Scholar Lynne Kelly argues that prehistoric and nonliterate cultures relied on memory technologies to preserve their oral traditions, a practice that continues to this day. Australian Aboriginal songlines record memory in short verses that are to be sung at particular places. Knowing the song helps you find your way across the territory—its melodies and rhythms describe the landscape—while its words tell the history of both the people and the land itself, describing, for example, which creator animal built that rocky outcrop or crevasse. Some songlines tell histories that trace back forty thousand years. Many are sacred and cannot be shared with outsiders. The South Australian Museum’s 2014 exhibit of the Ngiṉṯaka songline caused significant controversy because some Aṉangu felt the exhibit shared parts of the songline that were meant to be secret and that its curators had not sufficiently consulted with them. While songlines transform large expanses of land into a mnemonic device, other oral cultures have turned to smaller objects—calendar stones, ropes with knots in them, sticks marked with notches—to serve as tables of contents for important stories and information…

Jules Evans reviews mnemotechnics and the visualization of memory– the ways that we remember: “Summon Up Remembrance.”

See also “It’s a memory technique, a sort of mental map”*…

* Steven Wright

###

As we stroll down memory lane, we might recall that it was on this date in 1961 that President Dwight D. Eisenhower made his farewell address on a national television broadcast.  Perhaps most famously, Eisenhower, the only general to be elected president in the 20th century, used the speech to warn the nation against the corrupting influence of what he described as the “military-industrial complex.”

But he also used the occasion to urge a long view of our America and its citizens’ responsibilities:

As we peer into society’s future, we – you and I, and our government – must avoid the impulse to live only for today, plundering for our own ease and convenience the precious resources of tomorrow. We cannot mortgage the material assets of our grandchildren without risking the loss also of their political and spiritual heritage. We want democracy to survive for all generations to come, not to become the insolvent phantom of tomorrow.

source

 

“Talkin’ ’bout my generation”*…

 

Generations

 

“The competition between paradigms is not the sort of battle that can be resolved by proofs”  – Thomas S. Kuhn, The Structure of Scientific Revolutions

As recently as the late 1980s, most Americans thought gay sex was not only immoral but also something that ought to be illegal. Yet by 2015, when the Supreme Court legalised same-sex marriage, there were only faint murmurs of protest. Today two-thirds of Americans support it, and even those who frown on it make no serious effort to criminalise it.

This surge in tolerance illustrates how fast public opinion can shift. The change occurred because two trends reinforced each other. First, many socially conservative old people have died, and their places in the polling samples have been taken by liberal millennials. In addition, people have changed their minds. Support for gay marriage has risen by some 30 percentage points within each generation since 2004, from 20% to 49% among those born in 1928-45 and from 45% to 78% among those born after 1980.

However, this shift in opinion makes gay marriage an exception among political issues. Since 1972 the University of Chicago has run a General Social Survey every year or two, which asks Americans their views on a wide range of topics. Over time, public opinion has grown more liberal. But this is mostly the result of generational replacement, not of changes of heart.

For example, in 1972, 42% of Americans said communist books should be banned from public libraries. Views varied widely by age: 55% of people born before 1928 (who were 45 or older at the time) supported a ban, compared with 37% of people aged 27-44 and just 25% of those 26 or younger. Today, only a quarter of Americans favour this policy. However, within each of these birth cohorts, views today are almost identical to those from 47 years ago. The change was caused entirely by the share of respondents born before 1928 falling from 49% to nil, and that of millennials—who were not born until at least 1981, and staunchly oppose such a ban—rising from zero to 36%.
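The arithmetic behind this is just a share-weighted average: hold each cohort’s views fixed, change only the cohorts’ shares of the population, and the overall number moves on its own. Here is a minimal Python sketch of that decomposition; the per-cohort 1972 support rates and the 49%-to-nil and zero-to-36% share changes come from the figures quoted above, while the remaining cohort shares and the millennial support rate are illustrative assumptions.

def aggregate_support(shares, support):
    """Population-wide support as a share-weighted average of cohort views."""
    assert abs(sum(shares.values()) - 1.0) < 1e-9
    return sum(shares[cohort] * support[cohort] for cohort in shares)

# Within-cohort support for banning communist books, held fixed across both
# years (the point above is that these views barely move).
support = {
    "born pre-1928":  0.55,  # quoted above (1972 figure)
    "born 1928-45":   0.37,  # quoted above
    "born 1946-80":   0.25,  # quoted above (those 26 or younger in 1972)
    "born post-1980": 0.15,  # assumed; millennials "staunchly oppose such a ban"
}

# 1972 cohort shares: the 49% is quoted above; the split of the rest is assumed.
shares_1972 = {"born pre-1928": 0.49, "born 1928-45": 0.20,
               "born 1946-80": 0.31, "born post-1980": 0.00}

# Today's cohort shares: the nil and 36% are quoted above; the rest is assumed.
shares_today = {"born pre-1928": 0.00, "born 1928-45": 0.18,
                "born 1946-80": 0.46, "born post-1980": 0.36}

print(f"1972:  {aggregate_support(shares_1972, support):.0%}")   # roughly 42%
print(f"today: {aggregate_support(shares_today, support):.0%}")  # roughly 24%

With identical within-cohort views plugged into both years, composition alone carries overall support for a ban from roughly 42% down to roughly a quarter, which is just the pattern described above.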

Not every issue is as extreme as these two. But on six of the eight questions we examined—all save gay marriage and marijuana legalisation—demographic shifts accounted for a bigger share of overall movement in public opinion than changes in beliefs within cohorts. On average, their impact was about twice as large.

Social activists devote themselves to changing people’s views, and sometimes succeed. In general, however, battles for hearts and minds are won by grinding attrition more often than by rapid conquest.

The Economist illustrates the way in which generational change is the driver of changes in public opinion: “Societies change their minds faster than people do.”

Paul Graham has tips on how to anticipate and navigate, even to lead, this change: “What you can’t say.”

“The proliferation of competing articulations, the willingness to try anything, the expression of explicit discontent, the recourse to philosophy and to debate over fundamentals, all these are symptoms of a transition from normal to extraordinary research. It is upon their existence more than upon that of revolutions that the notion of normal science depends… though the world does not change with a change of paradigm, the scientist afterward works in a different world.”

Thomas S. Kuhn, The Structure of Scientific Revolutions

* The Who

###

As we go with the flow, we might send responsive birthday greetings to John Broadus Watson; he was born on this date in 1878.  A psychologist inspired by the (then recent) work of Ivan Pavlov, Watson established the psychological school of behaviorism, most dramatically through his address Psychology as the Behaviorist Views It, at Columbia University in 1913.  Watson studied the biology, physiology, and behavior of animals, viewing them as extremely complex machines that responded to situations according to their “wiring,” or nerve pathways, which were conditioned by experience.  When he continued with studies of the behavior of children, his conclusion was that humans, while more complicated than animals, operated on the same principles; he was particularly interested in the conditioning of emotions.  Watson’s behaviorism dominated psychology in the U.S. in the 1920s and ’30s (and got a second wind with the ascendancy of B.F. Skinner).

 source

Ironically, it is also the birthday (1886) of one of Watson’s contemporaries and antagonists, Edwin Ray Guthrie.  Guthrie was also a behaviorist, but argued against Watson’s theory of classical conditioning and Skinner’s related theory of operant conditioning.  Guthrie’s focus was the psychology of learning and the role that association plays.  In his Law of Contiguity, he held that “a combination of stimuli which has accompanied a movement, will on its recurrence tend to be followed by that movement.” He held that all learning is based on a stimulus-response association; movements are small stimulus-response combinations.  These movements make up an act.  A learned behavior is a series of movements, and it takes time for the movements to develop into an act. Which is to say that he believed that learning is incremental, and that many acquired behaviors involve repetition of movements– because what is learned are movements, not behaviors.

source

 

“Time flies like an arrow; fruit flies like a banana”*…

 

Time

 

How does now unstick itself and become the past, and when does the future morph into the present? How do these states transition, one into another, so seamlessly?

How long is right now?…

[Fascinating explorations of different explanations…]

Each theory of right now has one thing in common: It challenges the notion that the present is reliable and objective, or that it stretches out infinitely in front of us, even if we sometimes perceive it that way. This is an important reminder because the way we think about time affects the kinds of decisions we make.

We don’t just think about the past, present, or future; we think about ourselves in those places. (That’s the impetus behind something called Time Perspective Theory, which argues that there are six different ways people regard time, and that how you regard it greatly influences your perspective on life.)

Studies have found that many people think about themselves in the future not as themselves, but as other people. When asked to imagine a birthday in the far off future, people are more likely to envision it from a third-person viewpoint. When we think about ourselves in 10 years, compared to right now, it activates similar parts of our brain that think about others, not ourselves.

Our instinct is to place a lot of emphasis on the present, said Hal Hershfield, a psychologist at UCLA who has studied how perceptions of time relate to the choices people make. But if we could better relate to our future selves, we could be better off later on. Hershfield and his collaborators did a study that found that those who felt more similar to their future selves made more future-oriented decisions and had higher levels of well-being across a decade…

How long is right now?  As long as it took you to read that question?  Or shorter?  Or it might not exist at all…

* Anthony G. Oettinger

###

As we remember Ram Dass, we might recall that it was on this date in 1914 that Henry Ford announced that his company would cut its workday from nine hours to eight, and double workers’ wages to $5 per day.

Often cited as the beginning of the “living wage” revolution, the move is frequently explained as Ford’s attempt to ensure that his employees could afford the product that they were making.  But many historians argue that the real motivations were likelier an attempt to reduce employee turnover and to put economic pressure on competitors.  In any event, that’s what happened:  while Ford’s move was hugely unpopular among other automakers, they saw the increase in Ford’s productivity– and a significant increase in profits (from $30 million to $60 million in two years)– and (mostly) followed suit.


Assembly line at the Ford Motor Company’s Highland Park plant, ca. 1913

source

 

Written by LW

January 5, 2020 at 1:01 am

“Nothing is at last sacred but the integrity of your own mind”*…

 

mind internet

 

Imagine that a person’s brain could be scanned in great detail and recreated in a computer simulation. The person’s mind and memories, emotions and personality would be duplicated. In effect, a new and equally valid version of that person would now exist, in a potentially immortal, digital form. This futuristic possibility is called mind uploading. The science of the brain and of consciousness increasingly suggests that mind uploading is possible – there are no laws of physics to prevent it. The technology is likely to be far in our future; it may be centuries before the details are fully worked out – and yet given how much interest and effort is already directed towards that goal, mind uploading seems inevitable. Of course we can’t be certain how it might affect our culture, but as the technology of simulation and artificial neural networks shapes up, we can guess what that mind uploading future might be like.

Suppose one day you go into an uploading clinic to have your brain scanned. Let’s be generous and pretend the technology works perfectly. It’s been tested and debugged. It captures all your synapses in sufficient detail to recreate your unique mind. It gives that mind a standard-issue, virtual body that’s reasonably comfortable, with your face and voice attached, in a virtual environment like a high-quality video game. Let’s pretend all of this has come true.

Who is that second you?

Princeton neuroscientist, psychologist, and philosopher Michael Graziano explores: “What happens if your mind lives forever on the internet?”

* Ralph Waldo Emerson, Self-Reliance

###

As we ponder presence, we might spare a thought for William “Willy” A. Higinbotham; he died on this date in 1994.  A physicist who was a member of the team that developed the first atomic bomb, he later became a leader in the nuclear non-proliferation movement.

But Higinbotham may be better remembered as the creator of Tennis for Two— the first interactive analog computer game, one of the first electronic games to use a graphical display, and the first to be created as entertainment (rather than as a demonstration of a computer’s capabilities).  He built it for the 1958 visitor day at Brookhaven National Laboratory.

It used a small analogue computer with ten direct-connected operational amplifiers and output a side view of the curved flight of the tennis ball on an oscilloscope only five inches in diameter. Each player had a control knob and a button.

 source

The 1958 Tennis for Two exhibit

source

 

Written by LW

November 10, 2019 at 1:01 am

“The early 70s were the days when all the survivors of the Sixties went a bit nuts”*…

 

Painting of an Astronaut and Martian by Frank R. Paul

 

Stephen Paul Miller calls the seventies the uncanny decade — the “undecade.” Things were particularly weird in these years, which remain shrouded in America’s cultural memory, as if by a kind of smog. One reason for the haze is the period’s elusive placement between the highly overdetermined sixties — often considered by historians to last well into the subsequent decade — and the more garish icons that come to the fore later in the seventies, like disco and punk, Pong and Star Wars, Jonestown and the Bicentennial. Indeed, liminality is a key characteristic of the early seventies. Radical and transformative forces unleashed in the sixties mutated and dissipated into much broader segments of culture and society. One no longer needed to be an inhabitant of San Francisco, the East Village, or Ann Arbor to explore the creative maelstrom of drugs, uncorked sexual experimentation, and the alternative worldviews associated with radical politics or the occult revival. Thresholds were everywhere.

At the same time, and in stark contrast to the previous years, the horizon of individual and social possibilities abruptly narrowed. Whether left, right, or center, the nation drifted into a Slough of Despond perhaps unprecedented in American history. In polls taken at the end of the seventies, people looked back at a decade of “disillusion and cynicism, helplessness and apprehension,” a list we might as well round out with disorientation, paranoia, boredom, and frustrated rage. I suspect that one reason we find ourselves dependably amused by tacky seventies fluff like shag carpet, massive sideburns, and smiley face buttons is that we need to keep the trauma and perplexity of the era at bay. This is despite (or due to) the fact that so many of the era’s bummers resonate with our own: fears about terrorism and environmental collapse, surveillance paranoia, political cynicism, foreign war fatigue, and a pervasive apocalyptic undertow that tugs beneath an over-heated, desperately sexualized, fantastical, and often bleak popular culture…

Three psycho-spiritual “events” of the 1970s — involving Philip K. Dick, Robert Anton Wilson, and Terence and Dennis McKenna — had a strange synchronicity.  In an excerpt from his new book, High Weirdness, the ever-illuminating Erik Davis explains: “In the Age of the Psychonauts.”

* Robert Anton Wilson

###

As we push at the doors of perception, we might recall that it was on this date in 1971 that the theatrical musical Jesus Christ Superstar premiered on Broadway.  With music by Andrew Lloyd Webber and lyrics by Tim Rice, it had originated the prior year as a concept album (which was given a concert performance in Pittsburgh earlier in 1971).

source

 

“Ignorance, allied with power, is the most ferocious enemy justice can have”*…

 

Dunning-Kruger

 

The American author and aphorist William Feather once wrote that being educated means “being able to differentiate between what you know and what you don’t.” As it turns out, this simple ideal is extremely hard to achieve. Although what we know is often perceptible to us, even the broad outlines of what we don’t know are all too often completely invisible. To a great degree, we fail to recognize the frequency and scope of our ignorance.

In 1999, in the Journal of Personality and Social Psychology, my then graduate student Justin Kruger and I published a paper that documented how, in many areas of life, incompetent people do not recognize—scratch that, cannot recognize—just how incompetent they are, a phenomenon that has come to be known as the Dunning-Kruger effect. Logic itself almost demands this lack of self-insight: For poor performers to recognize their ineptitude would require them to possess the very expertise they lack. To know how skilled or unskilled you are at using the rules of grammar, for instance, you must have a good working knowledge of those rules, an impossibility among the incompetent. Poor performers—and we are all poor performers at some things—fail to see the flaws in their thinking or the answers they lack…

The trouble with ignorance is that it feels so much like expertise. A leading researcher on the psychology of human wrongness– David Dunning himself– explains the Dunning-Kruger effect: “We are all confident idiots.”

* James Baldwin

###

As we reconsider our confidence, we might recall that it was on this date in 1996 that the cable channel Fox News debuted.

source

 

Written by LW

October 7, 2019 at 1:01 am
