(Roughly) Daily

Posts Tagged ‘Psychology’

“Fortune sides with him who dares”*…

Timing is everything: risk and the rhythm of the week…

The seven-day week originated in Mesopotamia among the Babylonians, and it has stuck around for millennia. However, it’s not inherently special. Egyptians once used a ten-day week, and Romans used an eight-day week before officially adopting a seven-day week in AD 321.

Still, the seven-day week is so ingrained that we may notice how days “feel.” I was recently caught off guard by a productive “Tuesday,” realizing halfway through the day that it was actually Monday. Recent research suggests that a big factor in the psychology of weeks is our willingness to take risks.

“Across a range of studies, we have found that response to risk changes systematically through the week. Specifically, willingness to take risks decreases from Monday to Thursday and rebounds on Friday. The surprising implication is that the outcome of a decision can depend on the day of the week on which it is taken.”…

“Feels like a Tuesday: research explains why days ‘feel’ certain ways,” from Annie Rauwerda @BoingBoing. The underlying research, by Dr. Rob Jenkins, is here.

* Virgil

###

As we take a chance, we might recall that it was on this date in 1908 (a Thursday) that Thomas Etholen Selfridge became the first American to die in an airplane crash. An Army lieutenant and pilot, he was a passenger on Orville Wright’s demonstration flight of the 1908 Wright Military Flyer for the US Army Signal Corps at Fort Myer, Virginia. With the two men aboard, the Flyer was carrying more weight than it ever had before…

The Flyer circled Fort Myer 4½ times at a height of 150 feet. Halfway through the fifth circuit, at 5:14 in the afternoon, the right-hand propeller broke, losing thrust. This set up a vibration, causing the split propeller to hit a guy-wire bracing the rear vertical rudder. The wire tore out of its fastening and shattered the propeller; the rudder swivelled to the horizontal and sent the Flyer into a nose-dive. Wright shut off the engine and managed to glide to about 75 feet, but the craft hit the ground nose-first. Both men were thrown forward against the remaining wires and Selfridge struck one of the wooden uprights of the framework, fracturing the base of his skull. He underwent neurosurgery but died three hours later without regaining consciousness. Wright suffered severe injuries, including a broken left thigh, several broken ribs, and a damaged hip, and was hospitalized for seven weeks…

Wikipedia

Two photographs taken of the Flyer just prior to the flight show that Selfridge was not wearing any headgear, while Wright was wearing only a cap. Given speculation that Selfridge would have survived had he worn headgear, early pilots in the US Army were instructed to wear large, heavy headgear reminiscent of early football helmets.

source

Written by (Roughly) Daily

September 17, 2021 at 1:00 am

“Consciousness was upon him before he could get out of the way”*…

Some scientists, when looking at the ladder of nature, find no clear line between mind and no-mind…

Last year, the cover of New Scientist ran the headline, “Is the Universe Conscious?” Mathematician and physicist Johannes Kleiner, at the Munich Center for Mathematical Philosophy in Germany, told author Michael Brooks that a mathematically precise definition of consciousness could mean that the cosmos is suffused with subjective experience. “This could be the beginning of a scientific revolution,” Kleiner said, referring to research he and others have been conducting. 

Kleiner and his colleagues are focused on the Integrated Information Theory of consciousness, one of the more prominent theories of consciousness today. As Kleiner notes, IIT (as the theory is known) is thoroughly panpsychist because all integrated information has at least one bit of consciousness.

You might see the rise of panpsychism as part of a Copernican trend—the idea that we’re not special. The Earth is not the center of the universe. Humans are not a treasured creation, or even the pinnacle of evolution. So why should we think that creatures with brains, like us, are the sole bearers of consciousness? In fact, panpsychism has been around for thousands of years as one of various solutions to the mind-body problem. David Skrbina’s 2007 book, Panpsychism in the West, provides an excellent history of this intellectual tradition.

While there are many versions of panpsychism, the version I find appealing is known as constitutive panpsychism. It states, to put it simply, that all matter has some associated mind or consciousness, and vice versa. Where there is mind there is matter and where there is matter there is mind. They go together. As modern panpsychists like Alfred North Whitehead, David Ray Griffin, Galen Strawson, and others have argued, all matter has some capacity for feeling, albeit highly rudimentary feeling in most configurations of matter. 

While inanimate matter doesn’t evolve like animate matter, inanimate matter does behave. It does things. It responds to forces. Electrons move in certain ways that differ under different experimental conditions. These types of behaviors have prompted respected physicists to suggest that electrons may have some type of extremely rudimentary mind. For example, the late Freeman Dyson, the well-known American physicist, stated in his 1979 book, Disturbing the Universe, that “the processes of human consciousness differ only in degree but not in kind from the processes of choice between quantum states which we call ‘chance’ when made by electrons.” Quantum chance is better framed as quantum choice—choice, not chance, at every level of nature. David Bohm, another well-known American physicist, argued similarly: “The ability of form to be active is the most characteristic feature of mind, and we have something that is mind-like already with the electron.”

Many biologists and philosophers have recognized that there is no hard line between animate and inanimate. J.B.S. Haldane, the eminent British biologist, supported the view that there is no clear demarcation line between what is alive and what is not: “We do not find obvious evidence of life or mind in so-called inert matter…; but if the scientific point of view is correct, we shall ultimately find them, at least in rudimentary form, all through the universe.”…

“Electrons May Very Well Be Conscious”: Tam Hunt (@TamHunt) explains.

* Kingsley Amis

###

As we challenge (chauvinistic?) conventions, we might spare a thought for a man who was no great respecter of consciousness, B. F. Skinner; he died on this date in 1990. A psychologist, he was the pioneer and champion of what he called “radical behaviorism,” the assumption that behavior is a consequence of environmental histories of “reinforcement” (reactions to positive and negative stimuli):

What is felt or introspectively observed is not some nonphysical world of consciousness, mind, or mental life but the observer’s own body. This does not mean, as I shall show later, that introspection is a kind of psychological research, nor does it mean (and this is the heart of the argument) that what are felt or introspectively observed are the causes of the behavior. An organism behaves as it does because of its current structure, but most of this is out of reach of introspection.

About Behaviorism

Building on the work of Ivan Pavlov and John B. Watson, Skinner used operant conditioning to strengthen behavior, considering the rate of response to be the most effective measure of response strength. To study operant conditioning, he invented the operant conditioning chamber (aka the Skinner box).

Cf. also: Thomas Pynchon’s Gravity’s Rainbow.

source

“Be a good ancestor”*…

Even though– especially because– it’s hard…

… Mental time travel is essential. In one of Aesop’s fables, ants chastise a grasshopper for not collecting food for the winter; the grasshopper, who lives in the moment, admits, “I was so busy singing that I hadn’t the time.” It’s important to find a proper balance between being in the moment and stepping out of it. We all know people who live too much in the past or worry too much about the future. At the end of their lives, people often regret most their failures to act, stemming from unrealistic worries about consequences. Others, indifferent to the future or disdainful of the past, become unwise risk-takers or jerks. Any functioning person has to live, to some extent, out of the moment. We might also think that it’s right for our consciousnesses to shift to other times—such inner mobility is part of a rich and meaningful life.

On a group level, too, we struggle to strike a balance. It’s a common complaint that, as societies, we are too fixated on the present and the immediate future. In 2019, in a speech to the United Nations about climate change, the young activist Greta Thunberg inveighed against the inaction of policymakers: “Young people are starting to understand your betrayal,” she said. “The eyes of all future generations are upon you.” But, if their inaction is a betrayal, it’s most likely not a malicious one; it’s just that our current pleasures and predicaments are much more salient in our minds than the fates of our descendants. And there are also those who worry that we are too future-biased. A typical reaction to long-range programs, such as John F. Kennedy’s Apollo program or Elon Musk’s SpaceX, is that the money would be better spent on those who need it right now. Others complain that we are too focussed on the past, or on the sentimental reconstruction of it. Past, present, future; history, this year, the decades to come. How should we balance them in our minds?

Meghan Sullivan, a philosopher at the University of Notre Dame, contemplates these questions in her book “Time Biases: A Theory of Rational Planning and Personal Persistence.” Sullivan is mainly concerned with how we relate to time as individuals, and she thinks that many of us do it poorly, because we are “time-biased”—we have unwarranted preferences about when events should happen. Maybe you have a “near bias”: you eat the popcorn as the movie is about to start, even though you would probably enjoy it more if you waited. Maybe you have a “future bias”: you are upset about an unpleasant task that you have to do tomorrow, even though you’re hardly bothered by the memory of performing an equally unpleasant task yesterday. Or maybe you have a “structural bias,” preferring your experiences to have a certain temporal shape: you plan your vacation such that the best part comes at the end.

For Sullivan, all of these time biases are mistakes. She advocates for temporal neutrality—a habit of mind that gives the past, the present, and the future equal weight. She arrives at her arguments for temporal neutrality by outlining several principles of rational decision-making. According to the principle of success, Sullivan writes, a rational person prefers that “her life going forward go as well as possible”; according to the principle of non-arbitrariness, a rational person’s preferences “are insensitive to arbitrary differences.” A commitment to being rational, Sullivan argues, will make us more time-neutral, and temporal neutrality will help us think better about everyday problems, such as how best to care for elderly parents and save for retirement.

Perhaps our biggest time error is near bias—caring too much about what’s about to happen, and too little about the future. There are occasions when this kind of near bias can be rational: if someone offers you the choice between a gift of a thousand dollars today and a year from now, you’d be justified in taking the money now, for any number of reasons. (You can put it in the bank and get interest; there’s a chance you could die in the next year; the gift giver could change her mind.) Still, it’s more often the case that, as economists say, we too steeply “discount” the value of what’s to come. This near bias pulls at us in our everyday decisions. We tend to be cool and rational when planning for the far-off future, but we lose control when temptations grow nearer in time.

If near bias is irrational, Sullivan argues, so is future bias… Sullivan shares an example invented by the philosopher Derek Parfit. Suppose that you require surgery. It’s an unpleasant procedure, for which you need to be awake, in order to coöperate with the surgeon. Afterward, you will be given a drug that wipes out your memory of the experience. On the appointed day, you wake up in the hospital bed, confused, and ask the nurse about the surgery. She says that there are two patients in the ward—one who’s already had the operation, and another who’s soon to have it; she adds that, unusually, the operation that already happened took much longer than expected. She isn’t sure which patient you are, and has to go check. You would be greatly relieved, Parfit says, if the nurse comes back and tells you that you already had the operation. That is, you would willingly consign to your past self a long and agonizing procedure to avoid a much shorter procedure to come.

There is an evolutionary logic behind this kind of bias. As Caspar Hare, a philosopher at M.I.T., puts it, “It is not an accident that we are future-biased with respect to pain. That feature of ourselves has been selected-for by evolution.” In general, Hare writes, it seems likely that animals that focussed their attention on the future survived longer and reproduced more…

In 1992, Parfit teamed up with the economist Tyler Cowen to argue, in a book chapter, that our governments are too eager to discount the fortunes of future people. Parfit and Cowen proposed that even a small bias in favor of the present over the future could have huge consequences over time. Suppose that a politician reasons that one life now is equal to 1.01 lives a year from now, and so embraces policies that favor a hundred people now over a hundred people next year. This hardly seems to matter—but this “discount rate” of one per cent per year implies that we would rather save a single life now, at the cost of a million lives in about fourteen hundred years. At a ten-per-cent discount rate, one life now would be worth a million in a mere century and a half. Although no one in power thinks in exactly these terms, many of our decisions favor the present over the future.
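The arithmetic behind Parfit and Cowen’s example is simple compound discounting. Here is a minimal sketch in Python, using the essay’s illustrative figures (the function name is mine, not theirs):

```python
# At a constant annual discount rate r, valuing one life today as
# (1 + r) lives next year compounds to (1 + r) ** years lives in the future.

def lives_equivalent(rate: float, years: int) -> float:
    """How many future lives one present life outweighs at a given annual rate."""
    return (1 + rate) ** years

print(f"{lives_equivalent(0.01, 1400):,.0f}")  # ~1,118,000: a 1% rate reaches a million in ~1,400 years
print(f"{lives_equivalent(0.10, 145):,.0f}")   # ~1,003,000: a 10% rate gets there in about 145 years
```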

In a 2018 book, “Stubborn Attachments,” Cowen expands on the idea, asking how we can fight near bias at a societal level and better further the interests of future people. There are “a variety of relevant values” that we might want to consider in our temporal rebalancing, he writes, “including human well-being, justice, fairness, beauty, the artistic peaks of human achievement, the quality of mercy,” and so on. Cowen concludes that the best way to maximize all of these things for the future is to increase economic growth. (He doesn’t go just by G.D.P.—he adds in various measures of “leisure time, household production, and environmental amenities.”)

The thing about economic growth, Cowen tells us, is that it has the potential to advance just about everything that people value. “Wealthier societies have better living standards, better medicines, and offer greater personal autonomy, greater fulfillment, and more sources of fun,” he writes. He concedes that, in recent decades, inequality has risen within wealthier nations, but also notes that, as a consequence of global economic growth, “recent world history has been an extraordinarily egalitarian time”: over all, countries are becoming more equal. In terms of happiness, Cowen shows that there is considerable evidence supporting the commonsense view that citizens of rich countries are happier than citizens of poor countries, and that, within rich countries, wealthier individuals are happier than poorer ones. The data actually understate the strength of the effect, Cowen writes, because many studies miss the happiness boost that comes from more years on the earth: “Researchers do not poll the dead.”

Cowen is sympathetic to the school of thought known as effective altruism, which holds that we should use data and research to figure out how to do the greatest good for the greatest number of people. But he worries that these sorts of altruists are too prone to think about the greatest good for people right now. An effective altruist might hold that, instead of spending money on some luxury for yourself, you should use it to help the poor. But, for Cowen, this sort of advice is too present-oriented. Even a small boost in the growth rate has enormous ramifications for years to come. “Our strongest obligations are to contribute to sustainable economic growth,” he writes, “and to support the general spread of civilization, rather than to engage in massive charitable redistribution in the narrower sense.” In general, Cowen thinks that policymakers should be more future-oriented. He suggests that we should put fewer resources into improving the lives of the elderly and devote correspondingly more resources to the young and the not-yet-born. Most politicians would balk at this suggestion, but, when they do the opposite—well, that’s a choice, too.

Cowen, to my mind, glosses over the problem of diminishing returns. Suppose that our prosperity increases a hundredfold. Life would be better, but would our happiness also increase by a multiple of a hundred? After a certain point, it might make sense to worry less about growth. Perhaps the most privileged of us are close to that point now. But these things can be hard to judge. The Babylonian kings might have thought that they were living the best possible lives, not realizing that, in the future, even everyday schmoes would be wiser and more pain-free, living longer, eating better, and traveling more.

Whether or not one agrees with Cowen’s thesis, there are clearly good reasons for adopting temporal neutrality on a societal level. It’s less clear that we have an obligation to be rigorously time-neutral as individuals. If we can indulge our own time biases without making horrible errors in judgment, why shouldn’t we? Why not distribute our pleasures and pains unevenly throughout our lives, if we believe that, for us, doing so will contribute to “life going forward as well as possible”? For many people, as Seneca wrote, “Things that were hard to bear are sweet to remember.” We undertake activities that we know to be difficult or unpleasant because we see them as part of a good life and wish to think back upon them in the future. We curate our presents to furnish our futures with the right kinds of pasts. If this benign bias encourages us to take on difficult things, isn’t it wise to indulge the bias?

Many people suspect that a good life might be one that’s ordered in a certain way. Psychologists find that people tend to prefer the idea of a wonderful life that ends abruptly to the idea of an equally wonderful one that includes some additional, mildly pleasant years—the “James Dean effect.” There’s also an appeal to starting with the worst and then seeing things improve. Andy Dufresne, the protagonist of the film “The Shawshank Redemption,” based on a novella by Stephen King, is convicted of double murder but maintains his innocence; he spends twenty-eight years in prison before stealing millions of dollars from his corrupt warden and escaping, then living out the rest of his life on a Mexican beach. It’s an exhilarating and powerful tale, but, if one flipped the order—coastal paradise, then brutal prison—it would be impossible to enjoy. Rags to riches beats riches to rags, even if the good and the bad are in precise balance. Maybe this is what Sullivan calls a structural bias—but, without structure, there’s no story, and stories are good things to have.

It’s true that time-biased thinking can mislead us. Imagine that you are listening to a symphony for a pleasurable ninety minutes—and then, at the end, someone’s cell phone goes off, to loud shushing and stifled laughter. You might say that these awful thirty seconds ruined the experience, even though the first ninety-nine per cent of it was wonderful, and think that, if the phone had rung at the start, it would have been less of a problem. But is a disruption in the finale really worse than an interruption in the overture? Sullivan’s arguments show that we should try reconsidering those kinds of intuitions—and that we should be wary, in general, of the strange places to which they can lead us. In a classic series of studies, Daniel Kahneman and his colleagues exposed volunteers to two different experiences—sixty seconds of moderate pain, and sixty seconds of moderate pain followed by thirty seconds of mild pain. When they asked people which experience they would rather repeat, most chose the second experience, just because it ended better. There is little good to be said about choosing more over-all pain just because the experience ends on the right note.
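To see how peak-end evaluation can favor more over-all pain, here is a toy model in Python; the intensity numbers are made up for illustration and are not the study’s actual data:

```python
# Toy illustration of the peak-end effect: remembered pain tracks the
# average of the worst moment and the final moment, largely ignoring duration.

def total_pain(moments):
    return sum(moments)

def peak_end(moments):
    return (max(moments) + moments[-1]) / 2

short = [6] * 60                 # sixty seconds of moderate pain
extended = [6] * 60 + [3] * 30   # the same, plus thirty seconds of milder pain

print(total_pain(short), total_pain(extended))  # 360 vs. 450: the extended trial hurts more in total
print(peak_end(short), peak_end(extended))      # 6.0 vs. 4.5: yet it is remembered as less bad
```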

And yet giving up all our time biases is a lot to ask. We are, it seems, constituted to favor the here and now, to radically discount the distant future, and to give special weight to how experiences end. We can move in the direction of temporal neutrality, fighting against certain time biases just as we resist our other unreasonable biases and preferences. This may make us more rational, more kind to others, and, at times, more happy.

How much should we value the past, the present, and the future? “Being in Time,” from Paul Bloom (@paulbloomatyale).

* “Be a good ancestor. Stand for something bigger than yourself. Add value to the Earth during your sojourn.” – Marian Wright Edelman

###

As we take the long view, we might recall that it was on this date in 356 BC that the second version of the Temple of Artemis at Ephesus (which had replaced a Bronze Age structure) was destroyed by arson (a man, Herostratus, set fire to the wooden roof-beams, seeking fame at any cost; thus the term “herostratic fame”).

Its third iteration was finished several decades later, and survived for six centuries. It was described in Antipater of Sidon’s list of the world’s Seven Wonders:

I have set eyes on the wall of lofty Babylon on which is a road for chariots, and the statue of Zeus by the Alpheus, and the hanging gardens, and the colossus of the Sun, and the huge labour of the high pyramids, and the vast tomb of Mausolus; but when I saw the house of Artemis that mounted to the clouds, those other marvels lost their brilliancy, and I said, “Lo, apart from Olympus, the Sun never looked on aught so grand”.

This model of the Temple of Artemis, at Miniatürk Park, Istanbul, Turkey, attempts to recreate the probable appearance of the third temple.

source

Written by (Roughly) Daily

July 21, 2021 at 1:00 am

“Political language has to consist largely of euphemism, question-begging and sheer cloudy vagueness…. Such phraseology is needed if one wants to name things without calling up mental pictures of them.”*…

Mental models can be helpful, but they can also obscure as much as they reveal…

“The era of big government is over,” then-US President Bill Clinton proclaimed in 1996. But President Joe Biden’s multi-trillion-dollar spending plans are suggesting precisely the opposite. Behind the politicians stand the policy gurus, eager to put their names on – as the fashionable phrase goes – a new “policy paradigm.”

Paradigm-peddlers have not yet settled on a single label for the post-pandemic era, but frothy ideas abound. Countries should “build back better,” but only after a “great reset.” Economic growth used to be a pretty good thing on its own; these days, it is unmentionable in polite company unless it is “inclusive, equitable, and sustainable.” (I can see why, but must all three adjectives always be strung together?)

Harvard University’s Dani Rodrik was right to argue recently that we should beware of economists bearing policy paradigms. Such frameworks are supposed to organize thinking, but more often than not they substitute for it.

Consider a paradigm that the pandemic is supposed to have killed: neoliberalism. Neoliberal once meant a particular approach to free-market economics. Applying the description to leaders like Margaret Thatcher and Ronald Reagan made some sense. But in current parlance, the term also applies to former UK Prime Minister Tony Blair, former German Chancellor Gerhard Schröder, and the social democrats who have governed Chile for 24 of the last 30 years – in fact, to anyone who thinks markets have some role to play in human affairs.

Through repeated, careless use, neoliberal has now become one of those words that, as George Orwell said, “are strictly meaningless, in the sense that they not only do not point to any discoverable object, but are hardly even expected to do so by the reader.”

But meaningless is not the same as useless. If a speaker at an academic seminar, policy conference, or cocktail party tars someone as a neoliberal, two messages are immediately clear: the speaker is good, and the target is bad, unconcerned with the plight of the downtrodden. Tarring someone with this particular epithet is virtue-signaling par excellence. It marks the speaker as a member of a progressive tribe concerned about the world’s poor.

The right has its own ideological identity markers. In the debate about Obamacare and health insurance in the United States, or about vouchers for school funding anywhere, anyone claiming to support “freedom of choice” is not just making a point, but also sending a signal.

Both freedom and choice have multiple meanings that philosophers have been debating at least since classical Greek times: freedom to or freedom from? Choice to do what? Is someone with little money or education really “free to choose,” as the Nobel laureate economist Milton Friedman used to say? In fact, today’s freedom-of-choice advocates probably do not want to pursue those ancient and endless debates; they are simply signaling their membership in the ideological free-market tribe.

As the world seeks to ensure recovery from the COVID-19 crisis, simplistic political and economic ideologies will not lead to effective policymaking. Rodrik rightly pines for economic thinking that is unbeholden to cliché or to narrow identity politics. As he says, “The right answer to any policy question in economics is, ‘It depends.’” Circumstances matter, and the devil is in the details. 

I want the same thing as Rodrik, but you can’t always get what you want. Because nowadays (at least outside Trumpian circles) identities based on race or religion are unacceptable, ideologies have become the last refuge of the identity-seeking and politically savvy scoundrel, and new economic paradigms the weapon of choice…

In the old joke, a man walks into a psychiatrist’s office and says, “Doctor, my brother’s crazy! He thinks he’s a chicken.” The doctor says, “Why don’t you bring him to me?” And the man replies, “I would, but I need the eggs.” 

Political ideologies can be crazy, and those who peddle them often behave like chickens. But how we crave those eggs…

Simplistic political and economic ideologies that serve as identity markers will not lead to effective policymaking; but something in human psychology makes many crave them anyway: “The Perils of Paradigm Economics,” from Andrés Velasco (@AndresVelasco).

[image above: source]

* George Orwell, Politics and the English Language

###

As we acknowledge nuance, we might send qualified birthday greetings to Sidney James Webb; he was born on this date in 1859. An economist, he was an early member of the Fabian Society (joining, like George Bernard Shaw, three months after its founding). He co-founded the London School of Economics (where Andrés Velasco is currently Dean of the School of Public Policy), and wrote the original, pro-nationalisation Clause IV for the British Labour Party.

Committed socialists, Webb and his wife Beatrice were staunch supporters of the Soviet Union and its communist program. Ignoring the mounting evidence of atrocities in the USSR in favor of their commitment to the concept of collectivism, they wrote Soviet Communism: A New Civilisation? (1935) and The Truth About Soviet Russia (1942), both positive assessments of Stalin’s regime. The Trotskyist historian Al Richardson later described Soviet Communism: A New Civilisation? as “pure Soviet propaganda at its most mendacious.”

source

“It may be roundly asserted that human ingenuity cannot concoct a cipher which human ingenuity cannot resolve”*…

But sometimes it takes lots of ingenuity… and often, a great deal of time…

The United States National Security Agency—the country’s premier signals intelligence organization—recently declassified a Cold War-era document about code-breaking.

The 1977 book, written by cryptologist Lambros Callimahos, is the last in a trilogy called Military Cryptanalytics. It’s significant in the history of cryptography, as it explains how to break all types of codes, including military codes and puzzles—which are created solely for the purpose of a challenge.

The first two parts of the trilogy were published publicly in the 1980s and covered solving well-known types of classical cipher. But in 1992, the US Justice Department claimed releasing the third book could harm national security by revealing the NSA’s “code-breaking prowess”. It was finally released in December last year.

A key part of Callimahos’s book is a chapter titled “Principles of Cryptodiagnosis,” which describes a systematic three-step approach to solving a message encrypted using an unknown method…

See how those three steps work at “Declassified Cold War code-breaking manual has lessons for solving ‘impossible’ puzzles.”
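For a flavor of the statistical diagnosis such an approach relies on, here is a minimal sketch in Python of one classic first test, the index of coincidence (a standard cryptanalytic statistic, not an excerpt from the book): monoalphabetic substitutions and transpositions preserve English’s value of roughly 0.066, while polyalphabetic ciphers flatten it toward the random-text value of about 0.038, helping to identify the unknown encryption method before any solving begins.

```python
from collections import Counter

def index_of_coincidence(text: str) -> float:
    """Probability that two letters drawn at random from the text match.
    English-like letter frequencies give ~0.066; a flat (random-looking)
    distribution gives about 1/26, or ~0.038."""
    letters = [c for c in text.upper() if c.isalpha()]
    n = len(letters)
    counts = Counter(letters)
    return sum(k * (k - 1) for k in counts.values()) / (n * (n - 1))

# A Caesar shift merely relabels letters, so the frequency "shape" survives:
ciphertext = "WKLVLVDVLPSOHFDHVDUVKLIWRISODLQHQJOLVKWHAW"
print(round(index_of_coincidence(ciphertext), 3))  # ~0.066 -> likely monoalphabetic
```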

* Edgar Allan Poe

###

As we ponder puzzles, we might send intelligent birthday greetings to Alfred Binet; he was born on this date in 1857. A psychologist, he invented the first practical IQ test, the Binet–Simon test (in response to a request from the French Ministry of Education to devise a method to identify students needing remedial help).

source
