(Roughly) Daily


“I am now no more than a pile of blood, bone, and meat that is unhappy”*…

Most of us are familiar with the placebo effect. Dr. Michael H. Bernstein explains the “nocebo”…

The term “nocebo effect” derives from the Latin word nocere, which translates roughly as “to harm” (as in the Hippocratic injunction, primum non nocere: first, do no harm). Whereas the better-known placebo effect is typically positive (the alleviation of pain or malaise through treatments that otherwise have no inherent therapeutic value), the nocebo effect is negative, often manifesting as headache, skin irritation, or nausea.

No surprise, then, that the nocebo effect has been called “the placebo effect’s evil twin.” It can be more formally summarized as “the occurrence of a harmful event that stems from conscious or subconscious expectations.” Or, more simply: When you expect to feel sick, you are more likely to feel sick.

Of course, human expectations come up in all sorts of banal, everyday contexts, such as when you tell a friend that you’re stuck in traffic and so he or she should expect your arrival in twenty minutes. But expectation is also an important term of art that academics use (sometimes interchangeably with “expectancy”), having been popularized by Dr. Irving Kirsch, who now serves as Associate Director of the Program in Placebo Studies at Harvard Medical School.

Kirsch’s work built on that of Dr. Henry Beecher, who served with the American military during World War II. While deployed in North Africa and Italy, he gave saltwater to wounded soldiers, but told them they were receiving a powerful painkiller. Beecher did not engage in this deception by choice, but by necessity: As an anesthesiologist treating a flood of battlefield injuries, he faced the difficult task of rationing his supply of morphine.

The roots of our understanding of the nocebo effect are more obscure. But we do find an early precedent involving the work of eighteenth-century German physician Franz Mesmer, best known for his interest in the eponymous proto-hypnotic therapy known as “mesmerism.” In the salons of Paris and Vienna, he promoted the idea that illnesses could be alleviated by using magnets to govern the flow of fluid in patients’ bodies. (If this sounds like obvious quackery, which it is, bear in mind that Mesmer lacked any of our modern-day tools of science. He lived in an era when bloodletting with leeches was still seen as state-of-the-art medical treatment.)

Louis XVI (yes, the French king of guillotine fame) learned of Mesmer’s claims, and (properly) regarded them with skepticism. He established a commission to investigate, led by none other than Benjamin Franklin, who was then serving as the United States Minister to France. The American polymath and Francophile performed what we would now refer to as placebo-controlled studies so as to (as the commission put it) “separate the effects of the imagination from those attributed to magnetism.”…

… The mind’s unfortunate ability to create suffering ex nihilo can sometimes affect large groups of people through a process of social contagion (or, in the more indelicate language of the past, hysterical contagion). One such example, known as “The June Bug,” occurred in a U.S. textile mill in 1962. Many employees began to feel dizzy and nauseous. Some vomited. Rumors of a mysterious bug that was biting employees began to circulate, and eventually 62 workers became ill. Yet a subsequent Centers for Disease Control and Prevention investigation determined that no bugs could be identified. Nor could investigators find any other physical cause of the illnesses. This type of phenomenon is now referred to as psychogenic illness—sickness caused by belief.

Over the course of history, there have been countless other examples of psychogenic illness, with symptoms ranging from hysterical laughter to seizures. Aldous Huxley, the famed author of Brave New World, described one such seventeenth-century example in his lesser-known historically based novel, The Devils of Loudun. In the 1630s, as Huxley documents, an entire convent of Ursuline nuns in the western French community of Loudun became convinced that they’d been demonically possessed (complete with convulsions, and other symptoms recognizable to any connoisseur of the modern exorcism-themed horror-movie genre) due to the unholy machinations of a (genuinely licentious) local priest named Urbain Grandier.

Could such a mass outbreak occur today, in an era when few believe in demonic spirits? Consider that during 2016 and 2017, no fewer than 21 American diplomats serving in Cuba reported a range of bizarre neurological symptoms that later came to be collectively described as “Havana Syndrome.” News of the outbreak spread globally through American diplomatic networks, and eventually more than 200 U.S. diplomats became ill. One leading theory was that the Russian government was attacking American embassies and consulates with microwaves.

To be clear: We do not yet know for certain the cause of these ailments. And it is conceivable that speculation concerning Russian involvement may prove correct (even if the microwave theory is far-fetched). That said, the possibility of psychogenic effects is obvious, and I regard it as concerning that this theory seems to have been rejected out of hand by American officials.

In 2021, in fact, a senior State Department official who’d been mandated to oversee the task force investigating Havana Syndrome was pushed out of her role when she refused to take psychogenic illness off the menu of potential causes. A former C.I.A. officer who claimed he’d been affected by Havana Syndrome while serving in Moscow declared that failing to rule out “mass hysteria” as a cause was “grotesquely insulting to victims and automatically disqualifying to lead the task force.”

I suspect that if Ben Franklin were alive today, he might take a different view…

When we experience pain, depression, or illness based on nothing more than negative expectations: “The Placebo Effect’s Evil Twin,” from @mh_bernstein in @Quillette.

Adapted, with permission, from the forthcoming book, The Nocebo Effect: When Words Make You Sick, by Michael H. Bernstein, Ph.D., Charlotte Blease, Ph.D., Cosima Locher, Ph. D., and Walter A. Brown, M.D. Published by Mayo Clinic Press.

* J.M. Coetzee, Waiting for the Barbarians

###

As we adjust our attitude, we might recall that it was on this date in 1867 that Joseph Lister published the first of his series of articles in The Lancet on “The Antiseptic Principle of the Practice of Surgery.”  Lister, having noticed that carbolic acid (phenol) was used to deodorize sewage, had experimented with using it to spray surgical instruments, surgical incisions, and dressings.  The result, he reported, was a substantially reduced incidence of gangrene.

source

“Right now I’m having amnesia and déjà vu at the same time. I think I’ve forgotten this before.”*…

The author, far left, as a very young child

Our first three years are usually a blur, and we don’t remember much before age seven. Kristin Ohlson wondered why…

… Freud argued that we repress our earliest memories because of sexual trauma but, until the 1980s, most researchers assumed that we retained no memories of early childhood because we created no memories – that events took place and passed without leaving a lasting imprint on our baby brains. Then in 1987, a study by the Emory University psychologist Robyn Fivush and her colleagues dispelled that misconception for good, showing that children who were just 2.5 years old could describe events from as far as six months into their past.

But what happens to those memories? Most of us assume that we can’t recall them as adults because they’re just too far back in our past to tug into the present, but this is not the case. We lose them when we’re still children…

To form long-term memories, an array of biological and psychological stars must align, and most children lack the machinery for this alignment. The raw material of memory – the sights, sounds, smells, tastes and tactile sensations of our life experiences – arrive and register across the cerebral cortex, the seat of cognition. For these to become memory, they must undergo bundling in the hippocampus, a brain structure named for its supposed resemblance to a sea horse, located under the cerebral cortex. The hippocampus not only bundles multiple inputs from our senses together into a single new memory, it also links these sights, sounds, smells, tastes, and tactile sensations to similar ones already stored in the brain. But some parts of the hippocampus aren’t fully developed until we’re adolescents, making it hard for a child’s brain to complete this process.

‘So much has to happen biologically to store a memory,’ the psychologist Patricia Bauer of Emory University told me. There’s ‘a race to get it stabilised and consolidated before you forget it. It’s like making Jell-O: you mix the stuff up, you put it in a mould, and you put it in the refrigerator to set, but your mould has a tiny hole in it. You just hope your Jell-O – your memory – gets set before it leaks out through that tiny hole.’

In addition, young children have a tenuous grip on chronology. They are years from mastering clocks and calendars, and thus have a hard time nailing an event to a specific time and place. They also don’t have the vocabulary to describe an event, and without that vocabulary, they can’t create the kind of causal narrative that’s at the root of a solid memory. And they don’t have a greatly elaborated sense of self, which would encourage them to hoard and reconsider chunks of experience as part of a growing life-narrative.

Frail as they are, children’s memories are then susceptible to a process called shredding. In our early years, we create a storm of new neurons in a part of the hippocampus called the dentate gyrus and continue to form them throughout the rest of our lives, although not at nearly the same rate. A recent study by the neuroscientists Paul Frankland and Sheena Josselyn of the Hospital for Sick Children in Toronto suggests that this process, called neurogenesis, can actually create forgetting by disrupting the circuits for existing memories.

Our memories can become distorted by other people’s memories of the same event or by new information, especially when that new information is so similar to information already in storage. For instance, you meet someone and remember their name, but later meet a second person with a similar name and become confused about the name of the first person. We can also lose our memories when the synapses that connect neurons decay from disuse. ‘If you never use that memory, those synapses can be recruited for something different,’ Bauer told me.

Memories are less vulnerable to shredding and disruptions as the child grows up. Most of the solid memories that we carry into the rest of our lives are formed during what’s called ‘the reminiscence bump’, from ages 15 to 30, when we invest a lot of energy in examining everything to try to figure out who we are. The events, culture and people of that time remain with us and can even overshadow the features of our ageing present, according to Bauer. The movies were the best back then, and so was the music, and the fashion, and the political leaders, and the friendships, and the romances. And so on…

Why we remember so little from our youngest years: “The great forgetting,” from @kristinohlson in @aeonmag.

* Steven Wright

###

As we stroll down memory lane, we might spare a thought for Benjamin McLane Spock; he died on this date in 1998.  The first pediatrician to study psychoanalysis to try to understand children’s needs and family dynamics, he collected his findings in a 1946 book, The Common Sense Book of Baby and Child Care, which was criticized in some academic circles as being too reliant on anecdotal evidence, and in some conservative circles for promoting (what Norman Vincent Peale and others called) “permissiveness” by parents.  Despite that push-back, it became one of the best-selling volumes in history, having sold more than 50 million copies in 40 languages by the time of Spock’s death.


source

“I analogize it to sex. You realize there were certain things you shouldn’t do, but the urge is there and you can’t resist.”*…

The estimable Cory Doctorow on the incursion of private equity into health care…

As someone who writes a lot of fiction about corporate crime, I naturally end up spending a lot of time being angry about corporate crime. It’s pretty goddamned enraging. But the fiction writer in me is especially upset at how cartoonishly evil the perps are – routinely doing things that I couldn’t ever get away with putting in a novel.

Beyond a doubt, the most cartoonishly evil characters are the private equity looters. And the most cartoonishly evil private equity looters are the ones who get involved in health care.

Writing for The American Prospect, Maureen Tkacik details a national scandal: the collapse of PE-backed hospital chain Steward Health, a company that bought and looted hospitals up and down the country, starving them of everything from heart valves to prescription paper, ripping off suppliers, doctors and nurses, and callously exposing patients to deadly risk…

[There follows an illuminating– and truly terrifying (backed-up sewage in the wards; bats colonizing hospital floors; stiffed employees and vendors)– unpacking of Steward’s deeds and a location of them in the larger landscape of private equity.]

… But despite Steward’s increasingly furious creditors and its decaying facilities, the company remains bullish on its ability to continue operations. Medical Properties Trust – the real estate investment trust that is nominally a separate company from Steward – recently hosted a conference call to reassure Wall Street investors that it would be a going concern. When a Bank of America analyst asked MPT’s CFO how this could possibly be, given the facilities’ dire condition and Steward’s degraded state, the CFO blithely assured him that the company would get bailouts: “We own hospitals no one wants to see closed.”

That’s the thing about PE and health-care. The looters who buy out every health-care facility in a region understand that this makes them too big to fail: no matter how dangerous the companies they drain become, local governments will continue to prop them up. Look at dialysis, a market that’s been cornered by private equity rollups. Today, if you need this lifesaving therapy, there’s a good chance that every accessible facility is owned by a private equity fund that has fired all its qualified staff and ceased sterilizing its needles. Otherwise healthy people who visit these clinics sometimes die due to operator error. But they chug along, because “no dialysis clinics” is worse than “dialysis clinics where unqualified sadists sometimes kill you with dirty needles.”

The PE sector spent more than a trillion dollars over the past decade buying up healthcare companies, and it has trillions more in “dry powder” allocated for further medical acquisitions. Why not? As the CFO of Medical Properties Trust told that Bank of America analyst last week, when you “own hospitals no one wants to see closed,” you literally can’t fail, no matter how many people you murder.

The PE sector is a reminder that the crimes people commit for money far outstrip the crimes they commit for ideology. Even the most ideological killers are horrified by the murders their profit-motivated colleagues commit.

Last year, Tkacik wrote about the history of IG Farben, the German company that built Monowitz, a private slave-labor camp up the road from Auschwitz to make the materiel it was gouging Hitler’s Wehrmacht on…

Farben bought the cheapest possible slaves from Auschwitz, preferentially sourcing women and children. These slaves were worked to death at a rate that put Auschwitz’s wholesale murder in the shade. Farben’s slaves died an average of just three months after starting work at Monowitz. The situation was so abominable, so unconscionable, that the SS officers who provided outsource guard-labor to Monowitz actually wrote to Berlin to complain about the cruelty.

The Nuremberg trials are famous for the Nazi officers who insisted that they were “just following orders” but were nonetheless executed for their crimes. 24 Farben executives were also tried at Nuremberg, where they offered a very different defense: “We had a fiduciary duty to our shareholders to maximize our profits.” 19 of the 24 were acquitted on that basis.

PE is committed to an ideology that is far worse than any form of racial animus or other bias. As a sector, it is committed to profit above all other values. As a result, its brutality knows no bounds, no decency, no compassion. Even the worst crimes we commit for hate are nothing compared to the crimes we commit for greed…

“When private equity destroys your hospital,” from @doctorow. Eminently worth reading in full– and following his newsletter (from whence this comes).

* David Rubenstein, co-founder and co-chairman of the private equity giant The Carlyle Group, at a Harvard Business School Conference

###

As we rethink returns, we might recall that it was on this date in 1944 that Louis Buchalter (AKA Lepke Buchalter, AKA Louis Lepke) was executed in the electric chair at Sing Sing.  One of the premier labor racketeers in New York City in the 1930s, he is better remembered as the creator (in 1929) and overseer (thereafter) of an efficient system for performing mob hits; while Buchalter never named it, it became known in the press as “Murder, Inc.”

The Cosa Nostra mobsters wanted to insulate themselves from any connection to these murders. Buchalter’s partner, mobster Albert Anastasia, would relay a contract request from the Cosa Nostra to Buchalter. In turn, Buchalter would assign the job to Jewish and Italian street gang members from Brooklyn.

None of these contract killers had any connections with the major crime families. If they were caught, they could not implicate their Cosa Nostra employers in the crimes. Buchalter used the same killers for his own murder contracts. The Murder, Inc., killers were soon completing jobs all over the country for their mobster bosses…

source

Murder, Inc. was believed to be responsible for as many as 1,000 contract killings before it was exposed in 1941, and Buchalter was finally charged and convicted of murder that same year.

source

“I discovered that if one looks a little closer at this beautiful world, there are always red ants underneath”*…

A polygyne population of red imported fire ants at Brackenridge Field, Austin, Texas, USA. Photo by Alexander Wild

E.O. Wilson once observed that “Ants have the most complicated social organization on earth next to humans.” John Whitfield explores the way in which, over the past four centuries, quadrillions of ants have created a strange and turbulent global society that shadows our own…

It is a familiar story: a small group of animals living in a wooded grassland begin, against all odds, to populate Earth. At first, they occupy a specific ecological place in the landscape, kept in check by other species. Then something changes. The animals find a way to travel to new places. They learn to cope with unpredictability. They adapt to new kinds of food and shelter. They are clever. And they are aggressive.

In the new places, the old limits are missing. As their population grows and their reach expands, the animals lay claim to more territories, reshaping the relationships in each new landscape by eliminating some species and nurturing others. Over time, they create the largest animal societies, in terms of numbers of individuals, that the planet has ever known. And at the borders of those societies, they fight the most destructive within-species conflicts, in terms of individual fatalities, that the planet has ever known.

This might sound like our story: the story of a hominin species, living in tropical Africa a few million years ago, becoming global. Instead, it is the story of a group of ant species, living in Central and South America a few hundred years ago, who spread across the planet by weaving themselves into European networks of exploration, trade, colonisation and war – some even stowed away on the 16th-century Spanish galleons that carried silver across the Pacific from Acapulco to Manila. During the past four centuries, these animals have globalised their societies alongside our own.

It is tempting to look for parallels with human empires. Perhaps it is impossible not to see rhymes between the natural and human worlds, and as a science journalist I’ve contributed more than my share. But just because words rhyme, it doesn’t mean their definitions align. Global ant societies are not simply echoes of human struggles for power. They are something new in the world, existing at a scale we can measure but struggle to grasp: there are roughly 200,000 times more ants on our planet than the 100 billion stars in the Milky Way…

The more I learn, the more I am struck by the ants’ strangeness rather than their similarities with human society. There is another way to be a globalised society – one that is utterly unlike our own. I am not even sure we have the language to convey, for example, a colony’s ability to take bits of information from thousands of tiny brains and turn it into a distributed, constantly updated picture of their world. Even ‘smell’ seems a feeble word to describe the ability of ants’ antennae to read chemicals on the air and on each other. How can we imagine a life where sight goes almost unused and scent forms the primary channel of information, where chemical signals show the way to food, or mobilise a response to threats, or distinguish queens from workers and the living from the dead?

As our world turns alien, trying to think like an alien will be a better route to finding the imagination and humility needed to keep up with the changes than looking for ways in which other species are like us. But trying to think like an ant, rather than thinking about how ants are like us, is not to say that I welcome our unicolonial insect underlords.

Calamities follow in the wake of globalised ant societies. Most troubling among these is the way that unicolonial species can overwhelmingly alter ecological diversity when they arrive somewhere new. Unicolonial ants can turn a patchwork of colonies created by different ant species into a landscape dominated by a single group. As a result, textured and complex ecological communities become simpler, less diverse and, crucially, less different to each other.

This is not just a process; it is an era. The current period in which a relatively small number of super-spreading animals and plants expands across Earth is sometimes called the Homogecene. It’s not a cheering word, presaging an environment that favours the most pestilential animals, plants and microbes. Unicolonial ants contribute to a more homogenous future, but they also speak to life’s ability to escape our grasp, regardless of how we might try to order and exploit the world. And there’s something hopeful about that, for the planet, if not for us.

The scale and spread of ant societies is a reminder that humans should not confuse impact with control. We may be able to change our environment, but we’re almost powerless when it comes to manipulating our world exactly how we want. The global society of ants reminds us that we cannot know how other species will respond to our reshaping of the world, only that they will…

Bracing: “Ant geopolitics,” from @gentraso in @aeonmag.

* David Lynch

###

As we investigate insects, we might spare a thought for Sara Josephine Baker; she died on this date in 1945. A physician, she was a pioneer in public health and child welfare in the United States in her roles as assistant to the Commissioner for Public Health of New York City, and later, head of the city’s Department of Health in “Hell’s Kitchen” for 25 years. Convinced of the value of well-baby care and the prevention of disease, in 1908 she founded the Bureau of Child Hygiene– and reduced infant deaths by some 1,200 from the previous year. Her work made the New York City infant mortality rate the lowest in the USA or Europe at the time. She set up free milk clinics, licensed midwives, and taught the use of silver nitrate to prevent blindness in newborns.

Baker is also remembered as the public health official who (twice) tracked down “Typhoid Mary.”

source

“Foul cankering rust the hidden treasure frets, but gold that’s put to use more gold begets.”*…

The scientific literature is vast. No individual human can fully know all the published research findings, even within a single field of science. As Ulkar Aghayeva explains, regardless of how much time a scientist spends reading the literature, there’ll always be what the information scientist Don Swanson called ‘undiscovered public knowledge’: knowledge that exists and is published somewhere, but still remains largely unknown.

Some scientific papers receive very little attention after their publication – some, indeed, receive no attention whatsoever. Others, though, can languish with few citations for years or decades, but are eventually rediscovered and become highly cited. These are the so-called ‘sleeping beauties’ of science.

The reasons for their hibernation vary. Sometimes it is because contemporaneous scientists lack the tools or practical technology to test the idea. Other times, the scientific community does not understand or appreciate what has been discovered, perhaps because of a lack of theory. Yet other times it’s a more sublunary reason: the paper is simply published somewhere obscure and it never makes its way to the right readers.

What can sleeping beauties tell us about how science works? How do we rediscover information the scientific body of knowledge already contains but that is not widely known? Is it possible that, if we could understand sleeping beauties in a more systematic way, we might be able to accelerate scientific progress?

Sleeping beauties are more common than you might expect.

The term sleeping beauties was coined by Anthony van Raan, a researcher in quantitative studies of science, in 2004. In his study, he identified sleeping beauties between 1980 and 2000 based on three criteria: first, the length of their ‘sleep’ during which they received few if any citations. Second, the depth of that sleep – the average number of citations during the sleeping period. And third, the intensity of their awakening – the number of citations that came in the four years after the sleeping period ended. Equipped with (somewhat arbitrarily chosen) thresholds for these criteria, van Raan identified sleeping beauties at a rate of about 0.01 percent of all published papers in a given year.
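Van Raan’s three criteria amount to a simple screening procedure over a paper’s annual citation counts. A minimal sketch of that logic follows; the function name and every threshold here are illustrative choices for the example, not van Raan’s published values:

```python
def is_sleeping_beauty(citations, min_sleep=10, max_depth=1.0,
                       min_wake=20, wake_threshold=5):
    """Screen a paper's annual citation counts against van Raan-style criteria.

    citations: citations per year, starting from the year of publication.
    The thresholds are illustrative placeholders, not van Raan's values.
    """
    # The "sleep" runs from publication until the first year citations
    # jump to wake_threshold or more.
    sleep = 0
    while sleep < len(citations) and citations[sleep] < wake_threshold:
        sleep += 1

    # Criterion 1: the sleep must be long enough.
    if sleep < min_sleep:
        return False

    # Criterion 2: the sleep must be deep (few citations/year on average).
    depth = sum(citations[:sleep]) / sleep
    if depth > max_depth:
        return False

    # Criterion 3: the awakening must be intense -- many citations in the
    # four years after the sleep ends.
    return sum(citations[sleep:sleep + 4]) >= min_wake
```

Run against a toy citation history – say, twelve uncited years followed by a burst – the function flags a sleeping beauty; a paper cited steadily from the start, or one that never wakes, fails criterion 1 or 3 respectively.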

Later studies hinted that sleeping beauties are even more common than that. A systematic study in 2015, using data from 384,649 papers published in American Physical Society journals, along with 22,379,244 papers from the search engine Web of Science, found a wide, continuous range of delayed recognition of papers in all scientific fields. This increases the estimate of the percentage of sleeping beauties at least 100-fold compared to van Raan’s.

Many of those papers became highly influential many decades after their publication – far longer than the typical time windows for measuring citation impact. For example, Herbert Freundlich’s paper ‘Concerning Adsorption in Solutions’ (though its original title is in German) was published in 1907, but began being regularly cited in the early 2000s due to its relevance to new water purification technologies. William Hummers and Richard Offeman’s ‘Preparation of Graphitic Oxide’, published in 1958, also didn’t ‘awaken’ until the 2000s: in this case because it was very relevant to the creation of the soon-to-be Nobel Prize–winning material graphene.

Indeed, one of the most famous physics papers, Albert Einstein, Boris Podolsky, and Nathan Rosen (EPR)’s ‘Can Quantum-Mechanical Description of Physical Reality Be Considered Complete?’ (1935) is a classic example of a sleeping beauty…

More examples, and explanation of why they slumber, and thoughts on how to awaken them sooner: “Waking up science’s sleeping beauties,” from @ulkar_aghayeva in @WorksInProgMag.

[Image above: source]

* Shakespeare, “Venus and Adonis”

###

As we dwell on discovery, we might send healing birthday greetings to a woman whose scientific work thankfully rarely napped, Gertrude Elion; she was born on this date in 1918. A pharmacologist, she shared the 1988 Nobel Prize in Physiology or Medicine with George H. Hitchings and Sir James Black for their use of innovative methods of rational drug design (focused on understanding the target of the drug rather than simply using trial-and-error) in the development of new drugs.  Her work led to the creation of the anti-retroviral drug AZT, which was the first drug widely used against AIDS. Her well-known and widely deployed creations also include the first immunosuppressive drug, azathioprine, used to fight rejection in organ transplants, the first successful antiviral drug, acyclovir (ACV), used in the treatment of herpes infection, and a number of drugs used in cancer treatment.

source