(Roughly) Daily


“If names be not correct, language is not in accordance with the truth of things”*…

What’s in a name?…

The goal of this article is to promote clear thinking and clear writing among students and teachers of psychological science by curbing terminological misinformation and confusion. To this end, we present a provisional list of 50 commonly used terms in psychology, psychiatry, and allied fields that should be avoided, or at most used sparingly and with explicit caveats. We provide corrective information for students, instructors, and researchers regarding these terms, which we organize for expository purposes into five categories: inaccurate or misleading terms, frequently misused terms, ambiguous terms, oxymorons, and pleonasms. For each term, we (a) explain why it is problematic, (b) delineate one or more examples of its misuse, and (c) when pertinent, offer recommendations for preferable terms. By being more judicious in their use of terminology, psychologists and psychiatrists can foster clearer thinking in their students and the field at large regarding mental phenomena…

From “a gene for” through “multiple personality disorder” and “scientific proof” to “underlying biological dysfunction”: “Fifty psychological and psychiatric terms to avoid: a list of inaccurate, misleading, misused, ambiguous, and logically confused words and phrases.”

[TotH to @BoingBoing, whence the photo above]

* Confucius, The Analects

###

As we speak clearly, we might send carefully worded birthday greetings to François-Marie Arouet, better known as Voltaire; he was born on this date in 1694. The Father of the Age of Reason, he produced works in almost every literary form: plays, poems, novels, essays, and historical and scientific works– more than 2,000 books and pamphlets (and more than 20,000 letters). He popularized Isaac Newton’s work in France by arranging a translation of Principia Mathematica, to which he added his own commentary.

A social reformer, Voltaire used satire to criticize the intolerance, religious dogma, and oligopolistic privilege of his day, perhaps nowhere more sardonically than in Candide.

source

Written by (Roughly) Daily

November 21, 2022 at 1:00 am

“The only thing that will redeem mankind is cooperation”*…

… and happily that prospect may be more likely than we’d been led to believe in works like Robert Putnam’s Bowling Alone

Despite widespread worries that the social fabric is disintegrating, research published by the American Psychological Association shows that, since the 1950s, cooperation between strangers has steadily increased in the United States.

“We were surprised by our findings that Americans became more cooperative over the last six decades because many people believe U.S. society is becoming less socially connected, less trusting, and less committed to the common good,” said lead researcher Yu Kou, Ph.D., a professor of social psychology at Beijing Normal University. “Greater cooperation within and between societies may help us tackle global challenges, such as responses to pandemics, climate change, and immigrant crises.”

The researchers analyzed 511 studies carried out in the US between 1956 and 2017, involving more than 63,000 participants in total. These studies included lab experiments that evaluated cooperation between strangers. The study was recently published in the journal Psychological Bulletin.

The study found a slight, gradual rise in cooperation over the 61-year period…

“Good News: Cooperation Among Strangers Has Increased for the Past 60 Years.” The full study is here.

* Bertrand Russell

###

As we bowl together, we might recall that any improvement in cooperation builds on a base that’s not too high: it was on this date in 1957 that nine Black students, having been denied entrance to Little Rock, Arkansas’ Central High School (in defiance of a 1954 Supreme Court ruling), were escorted to school by soldiers of the Airborne Battle Group of the U.S. Army’s 101st Airborne Division. Two days earlier, the Black students had faced an angry mob of over 1,000 Whites protesting the integration in front of Central High School; as the students were escorted inside by the Little Rock police (who had replaced the withdrawn National Guard troops), violence escalated, and they were removed from the school. President Eisenhower responded by calling in the regular Army.

Elizabeth Eckford attempts to enter Central High on September 4, 1957

source

Written by (Roughly) Daily

September 25, 2022 at 1:00 am

“The first duty in life is to be as artificial as possible. What the second duty is no one has as yet discovered.”*…

Photo by Tengyart on Unsplash

Bo Winegard is not so sure that authenticity is a virtue…

That we should not lie is generally sound advice, though few of us are able to navigate life without uttering or affirming the occasional falsehood. However, some—generally those of a romantic temperament—also strive to apply this counsel to the self. They argue that authenticity is one of humankind’s chief virtues and that betraying it is immoral and tragic—immoral, because it requires a person to lie about their underlying being; tragic, because it smothers the unique self beneath a dull blanket of conformity.

I do not share this enthusiasm for authenticity because it is based on a fundamental misunderstanding of human nature. At best, authenticity can be undesirable; at worst, it is philosophically incoherent. The word “authenticity” is sometimes useful in ordinary discourse—we may say that a person is authentically a lover of the arts or authentically cheerful or authentically kindhearted, and it’s obvious what these claims mean. Nor will I deny that lying about one’s own traits and tendencies is often a bad idea and sometimes genuinely immoral. Nevertheless, authenticity, as understood by many of its modern champions, is not a noble or even attainable ideal…

Read on for his argument– the short form of which is that to be human is to be artificial: “Against Authenticity,” from @EPoe187 in @Quillette.

Apposite (albeit from an orthogonal perspective): “After Authenticity,” from @tobyshorin.

* Oscar Wilde

###

As we settle for sincerity (?), we might recall that on this date in 1787 George Washington hosted a farewell dinner for his officers (which doubled as an advance celebration of the Constitution, signed two days later) that resulted in an epic tab, largely for drinks. The bill, at the City Tavern in Philadelphia, totaled over 89 pounds– between $15,400 and $17,253 in today’s dollars.

A replica of the historic City Tavern, where Washington ran up his tab.
Smallbones/Wikimedia Commons

Written by (Roughly) Daily

September 15, 2022 at 1:00 am

“Wouldn’t economics make a lot more sense if it were based on how people actually behave, instead of how they should behave?”*…

Behavioral economics aims to accomplish exactly that. Its approach has been to catalogue the dozens of cognitive biases that stop us from acting “rationally.” Jason Collins argues that instead of building up a messier and messier picture of human behavior, we need a new model…

From the time of Aristotle through to the 1500s, the dominant model of the universe had the sun, planets, and stars orbiting around the Earth.

This simple model, however, did not match what could be seen in the skies. Venus appears in the evening or morning. It never crosses the night sky as we would expect if it were orbiting the Earth. Jupiter moves across the night sky but will abruptly turn around and go back the other way.

To deal with these ‘anomalies’, Greek astronomers developed a model in which each planet moves on two circles. A large circle called the deferent is centered on the Earth, providing the classic geocentric orbit. A smaller circle, called the epicycle, is centered on the rim of the deferent, and the planet itself rides the rim of the epicycle. This combination of two motions allowed planets to shift back and forth across the sky.

But epicycles were still not enough to describe what could be observed. Earth needed to be offset from the center of the deferent to generate the uneven length of seasons. The deferent had to rotate at varying speeds to capture the observed planetary orbits. And so on. The result was a complicated pattern of deviations and fixes to this model of the sun, planets, and stars orbiting around the Earth.

Instead of this model of deviations and epicycles, what about an alternative model? What about a model where the Earth and the planets travel in elliptical orbits around the sun?

By adopting this new model of the solar system, a large collection of deviations was shaped into a coherent model. The retrograde movements of the planets were given a simple explanation. The act of prediction became easier as a model that otherwise allowed astronomers to muddle through became more closely linked to the reality it was trying to describe.
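The geometry behind this shift is simple enough to check numerically: for circular orbits, the view of an outer planet from a moving Earth is exactly a deferent-plus-epicycle in disguise, and retrograde motion falls out with no extra machinery. A minimal sketch (not from Collins’s article; the “Jupiter-like” radii and angular speeds are illustrative stand-ins):

```python
import numpy as np

# Deferent-plus-epicycle: the planet rides a small circle (the epicycle)
# whose center rides a large circle (the deferent) centered on the Earth.
def ptolemaic(t, R, r, w_def, w_epi, phase=0.0):
    return R * np.exp(1j * w_def * t) + r * np.exp(1j * (w_epi * t + phase))

# Heliocentric: Earth and planet each circle the Sun; the planet's apparent
# (geocentric) position is simply the difference of the two orbits.
def apparent(t, a_earth, a_planet, w_earth, w_planet):
    return a_planet * np.exp(1j * w_planet * t) - a_earth * np.exp(1j * w_earth * t)

t = np.linspace(0, 25, 2000)
a_e, a_p, w_e, w_p = 1.0, 5.2, 1.0, 0.084  # rough Jupiter-like values

# The heliocentric apparent path *is* a deferent-plus-epicycle in disguise:
# deferent = the planet's orbit, epicycle = Earth's orbit, offset half a turn.
helio = apparent(t, a_e, a_p, w_e, w_p)
ptol = ptolemaic(t, R=a_p, r=a_e, w_def=w_p, w_epi=w_e, phase=np.pi)
assert np.allclose(helio, ptol)

# Retrograde motion needs no extra machinery: the planet's apparent angular
# motion periodically reverses sign as the faster-moving Earth overtakes it.
angles = np.unwrap(np.angle(helio))
print(f"fraction of time in apparent retrograde: {np.mean(np.diff(angles) < 0):.2f}")
```

Both models trace the same apparent path; the difference is that one earns it with two tuned circles per planet, while the other gets it from a single change of reference point.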

Behavioral economics today is famous for its increasingly large collection of deviations from rationality, or, as they are often called, “biases.” While useful in applied work, it is time to shift our focus from collecting deviations from a model of rationality that we know is not true. Rather, we need to develop new theories of human decision to progress behavioral economics as a science. We need heliocentrism… 

For a thoughtful critique of current thinking and a set of four “features” that might inform a new approach: “We don’t have a hundred biases, we have the wrong model,” from @jasonacollins.

* Dan Ariely, Predictably Irrational: The Hidden Forces That Shape Our Decisions

###

As we dwell on decisions, we might spare a thought for someone who would probably have had little patience for ideas like these, Rose Friedman; she died on this date in 2009. A free-market economist, she was the wife and intellectual partner of Milton Friedman– a father of the “Chicago School” of neoclassical economic thought that underlies the neoliberalism so dominant of late [see here], of which behavioral economics is a corrective/critique– with whom she co-wrote papers and books (e.g., Free to Choose and Capitalism and Freedom) and co-founded EdChoice (formerly the Milton and Rose D. Friedman Foundation), with the aim of promoting the use of school vouchers and “freedom of choice” in education.

When her husband received his Medal of Freedom in 1988, President Ronald Reagan joked that Rose was known for being the only person to ever have won an argument against Milton.

source

“Food is not rational. Food is culture, habit, craving, and identity”*…

From the cover of sci-fi magazine If, May 1960

Your correspondent is heading several time zones away, so (Roughly) Daily will be on hiatus for a week. Meantime…

Kelly Alexander on the very real way in which we are not only what we eat, but also what we imagine eating…

Despite the faux social science of trend reports, I have always been interested in the flavours that exert a hold on our collective hearts and minds. I’m especially intrigued by how the foods we seem to fetishise in the present – the artisanal, the local, the small-batch – are never the ones we seem to associate with the tantalising prospect of ‘the future’.

How do we know what will be delicious in the future? It depends on who ‘we’ are. For Baby Boomers who didn’t grow up on a diet of Dune-style scenarios of competing for resources on a depleted planet, it was TV dinners, angel whips and Tang – the instant powdered orange drink that became a hit after NASA included it on John Glenn’s Mercury spaceflight in 1962. That same year, The Jetsons – an animated show chronicling the life and times of a family in 2062 – premiered on US television. In one episode, mom Jane ‘makes’ breakfast for son Elroy using an iPad-like device. She orders ‘the usual’: milk, cereal (‘crunchy or silent?’ Jane asks Elroy, before pre-emptively selecting ‘silent’), bacon, and one soft-boiled egg, all of which is instantly beamed to the table…

For many students in my Food Studies courses at the University of North Carolina, the ‘future delicious’ conjures readymade meal ‘solutions’ that eliminate not just the need for cooks but the need for meals. This includes Soylent, the synthesised baby formula-like smoothies, or the food substitutes slugged by software engineers coding at their desks. It includes power bars and Red Bulls to provide energy and sustenance without the fuss of a dinner table (an antiquated ceremony that takes too long). Also, meal kits that allow buyers to play at cooking by mixing a few things that arrive pre-packaged, sorted and portioned; and Impossible Burgers, a product designed to mimic the visceral and textural experience of eating red meat – down to realistic drips of ‘blood’ (beet juice enhanced with genetically modified yeast), and named to remind us that no Baby Boomer thought such a product was even possible.

Such logic makes the feminist anthropologist Donna Haraway wary. It betrays, she writes, ‘a comic faith in technofixes’ that ‘will somehow come to the rescue of its naughty but very clever children.’ To Haraway’s point, the future delicious tends to value the technological component of its manufacture over the actual food substrate, sidestepping what the material culture expert Bernie Herman described to me as ‘the fraught and negotiated concept of delicious’…

More tastiness at: “What our fantasies about futuristic food say about us,” from @howamericaeats in @aeonmag.

See also: “The perfect meal in a pill?”

How do science fiction authors imagine the food of the future? Works conceived between 1896 and 1973 addressed standardised consumers, alienated by a capitalist society in pursuit of profitability. Were these works prophecy or metaphor?

* Jonathan Safran Foer

###

As we dig in, we might recall that it was on this date in 1955 that Nathan’s Famous, on Coney Island, sold its one-millionth hot dog. The restaurant (which has, of course, grown into both a chain and a retail brand) had been founded by Nathan Handwerker in 1916.

Nathan, eating one of his own

source

Written by (Roughly) Daily

July 6, 2022 at 1:00 am
