(Roughly) Daily

Posts Tagged ‘malaria’

“Alchemy. The link between the immemorial magic arts and modern science. Humankind’s first systematic effort to unlock the secrets of matter by reproducible experiment.”*…

As (AI/tech pro and writer) Dale Markowitz explains, for scientists of yore anything—from mermaids to alchemy—was on the table…

In 1936, the economist John Maynard Keynes purchased a trove of Isaac Newton’s unpublished notes. These included more than 100,000 words on the great physicist’s secret alchemical experiments. Keynes, shocked and awed, dubbed them “wholly magical and wholly devoid of scientific value.” This unexpected discovery, paired with things like Newton’s obsession with searching for encrypted messages in the Bible’s Book of Daniel, showed that Newton “was not the first of the age of reason,” Keynes concluded. “He was the last of the magicians.”

When it came to fascination with the occult, Newton was hardly alone. Many contemporary scientists may cast aspersions on spells, mythical tales, and powers of divination. Not so for many of the early modern thinkers who laid the foundations of modern science. To them, the world teemed with the uncanny: witches, unicorns, mermaids, stars that foretold the future, base metals that could be coaxed into gold or distilled into elixirs of eternal life. 

These fantastical beliefs were shared by the illiterate and the educated elite alike—including many of the forebears of contemporary science, among them chemist Robert Boyle, who gave us modern chemistry and Boyle’s law, and biologist Carl Linnaeus, who developed the taxonomic system by which scientists classify species today. Rather than stifling discovery, their now-arcane beliefs may have helped drive them and other scientists to endure hot, smoky days in the bowels of alchemical laboratories or long, frigid nights on the balconies of astronomical towers.

To understand the role of magic in spurring scientific progress, it helps to understand the state of learning in Europe in those times. Throughout the Middle Ages, many scholars were fixated on the idea that knowledge could only be gleaned from ancient texts. Universities taught from incomplete, often poorly translated copies of Aristotle, Ptolemy, and Galen. To stray from the giants was a crime: In 14th-century Oxford, scholars could be charged 5 shillings for contradicting Aristotle. Curiosity was considered a sin on par with lust. A powerful motivator was needed to shuck off ancient thinking.

One of the first influential thinkers to break with the old ways was the 16th-century Swiss-German physician Paracelsus. The father of toxicology, known for his pioneering use of chemicals in medicine, Paracelsus was among the first of his time to champion the importance of experimentation and observation—a philosophy which would set the foundations for the scientific method. Paracelsus showed the scholars what he thought of their old books by publicly burning his copies of Galen and Avicenna. 

But what led him to this experiment-first approach? Perhaps it was because, to Paracelsus, experimentation was a kind of magic. His writing fuses scientific observation with the occult. To him, medicine, astrology, and alchemy were inextricably linked—different ways of unveiling sacred truths hidden in nature by God. Paracelsus considered himself a kind of magus, as he believed Moses and Solomon had been, as Newton would view himself 150 years later. Paracelsus believed, though, that divine knowledge could be gained not just by studying scripture, but also by studying nature. The alchemical workbench, the night sky—these were even surer routes to God than any dusty old textbook…

[Markowitz recounts the stories of Tycho Brahe [almanac entry here], his patron Holy Roman Emperor Rudolf II, Robert Boyle, William Harvey, and Linnaeus [here], who, in 1749, urged the Royal Swedish Academy of Sciences to launch a hunt for mermaids…]

… To our contemporary ears, almost all of this may sound fairly ridiculous. But as Edward Dolnick puts it in The Clockwork Universe, “The world was so full of marvels, in other words, that the truly scientific approach was to reserve judgment about what was possible and what wasn’t, and to observe and experiment instead.” To the 17th-century scientist, anything was on the table, so long as it could be experimentally studied.

Today, we know how the story ends: Belief in astrology, alchemy, and witchcraft declined in places where empiricism and skepticism became cornerstones of science. But perhaps early scientists’ fascination with the occult should remind us of other tenets of discovery: open-mindedness and curiosity. Witches, mermaids, and the philosopher’s stone may not have survived modern scrutiny, but it was curiosity about them that drove real progress and allowed early thinkers to stray from established norms. In this sense, curiosity is a kind of magic…

“How the Occult Gave Birth to Science,” from @dalequark.bsky.social in @nautil.us.

See also: “The importance of experimental proof, on the other hand, does not mean that without new experimental data we cannot make advances” and “Everyone knows Newton as the great scientist. Few remember that he spent half his life muddling with alchemy, looking for the philosopher’s stone. That was the pebble by the seashore he really wanted to find.”

John Ciardi

###

As we think about transmutation, we might spare a thought for a rough contemporary (and fellow-traveler) of Newton’s, Rasmus Bartholin; he died on this date in 1698. A physician, mathematician, and physicist, he is best known for his discovery of the optical phenomenon of double refraction. In 1669, Bartholin observed that images seen through Iceland spar (calcite) were doubled and that, when the crystal was rotated, one image remained stationary while the other rotated with the crystal. Such behaviour of light could not be explained using the optical theories of the time; it was subsequently explained as an effect of the polarisation of light.

Bartholin also wrote several mathematical works and made astronomical observations (including of the comets of 1665). And he is famed for his medical work, in particular his introduction of quinine in the fight against malaria.

(Bartholin’s family was packed with pioneering scientists, 12 of whom became professors at the University of Copenhagen; perhaps most notable was his elder brother Thomas, who discovered the lymphatic system in humans and advanced the theory of “refrigeration anesthesia,” being the first to describe it scientifically.)

Rasmus Bartholin (source)

“If all insects disappeared, all life on earth would perish. If all humans disappeared, all life on earth would flourish.”

As Lars Chittka explains, insects have surprisingly rich inner lives—a revelation that has wide-ranging ethical implications…

In the early 1990s, when I was a Ph.D. student at the Free University of Berlin modeling the evolution of bee color perception, I asked a botany professor for some advice about flower pigments. I wanted to know the degrees of freedom that flowers have in producing colors to signal to bees. He replied, rather furiously, that he was not going to engage in a discussion with me, because I worked in a neurobiological laboratory where invasive procedures on live honeybees were performed. The professor was convinced that insects had the capacity to feel pain. I remember walking out of the botanist’s office shaking my head, thinking the man had lost his mind.

Back then, my views were in line with the mainstream. Pain is a conscious experience, and many scholars then thought that consciousness is unique to humans. But these days, after decades of researching the perception and intelligence of bees, I am wondering if the Berlin botany professor might have been right.

Researchers have since shown that bees and some other insects are capable of intelligent behavior that no one thought possible when I was a student. Bees, for example, can count, grasp concepts of sameness and difference, learn complex tasks by observing others, and know their own individual body dimensions, a capacity associated with consciousness in humans. They also appear to experience both pleasure and pain. In other words, it now looks like at least some species of insects—and maybe all of them—are sentient.

These discoveries raise fascinating questions about the origins of complex cognition. They also have far-reaching ethical implications for how we should treat insects in the laboratory and in the wild…

Insects are key enablers of much life on earth. They appear to exhibit intelligence, and maybe more: “Do Insects Feel Joy and Pain?” in @sciam.

Bugs are not going to inherit the earth. They own it now. So we might as well make peace with the landlord.

Thomas Eisner

Pair with this helpfully skeptical (but respectful) review of Chittka’s book, The Mind of a Bee.

* Jonas Salk

###

As we ponder our place, we might recall that it was on this date in 1897 that physician Sir Ronald Ross made a key breakthrough when he discovered malaria parasites while dissecting a mosquito. This day is now known as World Mosquito Day, in celebration of his critical discovery.

source

Written by (Roughly) Daily

August 20, 2023 at 1:00 am

“Law; an ordinance of reason for the common good, made by him who has care of the community”*…

 

vaccination

 

In the fall of 1713, measles struck the city of Boston, where Cotton Mather, a Puritan theologian and pastor, lived with his pregnant wife and numerous children. Within a month, his wife, their twin newborn babies, another child, and their maidservant had all died. On November 12, Mather wrote in his journal, “The epidemical Malady began upon this Town, is like to pass thro’ the Countrey. . . . it [might] be a service unto the public, to insert in the News-paper, a brief Direction for the managing of the sick. I will advise with a Physician or two.” On November 21, he wrote, “Lord I am oppressed; undertake for me!” On November 23, he wrote, “My poor Family is now left without any Infant in it, or any under seven Years of Age.”

Eight years later, when an explosive smallpox epidemic threatened Boston’s population of eleven thousand, Mather became an outspoken advocate for a new prophylactic against the virus: inoculation. Dr. William Douglass, one of the few doctors in town with a medical degree, rallied others to oppose Mather, claiming that the method was untested (which was true, at least in the new colony) and that it jeopardized the lives of all those who received it. In young Boston, the fight over inoculation tore at epidemic-addled nerves. In November 1721, a bomb was thrown through Mather’s window. A letter attached to it read, “Cotton Mather, you dog, dam you! I’ll inoculate you with this; with a pox to you.”

Douglass won out, quashing Mather’s plans for a systematic inoculation of the town’s population. Eight hundred and forty-four people died of the virus, accounting for 75 percent of all deaths in Boston that year. The unexploded bomb on Mather’s floorboards disabuses those of us living in 2019 of the impression—generated over the past two years by endless news stories about the current global measles outbreak—that inoculation controversies are a novel feature of our present hyper-mediated, hyper-politicized time.

Measles was considered to be eliminated in the United States in 2000. Still, the virus has regained extraordinary ground—and claimed an increasing number of lives—in recent years. Seven hundred and four cases were reported in the United States in the first four months of 2019, according to the Centers for Disease Control and Prevention. The count reached 1,077 by mid-June, with cases in twenty-eight states. More than six hundred cases have occurred in New York City alone since September of 2018.

In June, the CDC issued a warning to travelers planning to leave the country, by which point outbreaks were occurring in all the places you’d expect, countries beset by depressed economies, poor public health management, war, or extreme poverty, including Ethiopia, Madagascar, Kyrgyzstan, the Democratic Republic of Congo, the Philippines, Sudan, and Georgia (not the U.S. state, although cases have been reported there as well). But also, cases were appearing in countries where entrenched vaccination systems existed and where measles had been thought largely a disease of the past: Belgium, France, Germany, and Italy.

The grand cause for these infections—and for the 300 percent growth of reported measles cases around the world in the first quarter of the year over the same quarter the previous year—is precisely the absence of what Cotton Mather proposed for 1721-era Boston: systematic vaccination of the population. The more interesting question, beyond simple international vaccination logistics, is: What ideological and historical shifts have allowed the reemergence of a disease once believed to be under controlled decline?…

A history of the debates over vaccination that asks, can the social contract be protected from a measles outbreak?: “Herd Immunity.”

* Thomas Aquinas

###

As we pull up our sleeves, we might send healing birthday greetings to Giovanni Maria Lancisi; he was born on this date in 1654.  A doctor (he was personal physician to three popes), epidemiologist (he made the correlation between mosquitoes and malaria), and anatomist (his study of the heart resulted in the eponymous Lancisi’s sign), he is considered the first modern hygienist.

He carried out extensive anatomical and physiological studies, as well as epidemiological studies on malaria, influenza, and cattle plague.  Contrary to the then-traditional conception of “mal’aria” – literally, “bad air” – Lancisi observed that the lethal fever, malaria, disappeared when the swamps near the city were cleared.  He concluded that injurious substances transmitted by flies and mosquitoes were the origin of the disease.

source

 

 

Written by (Roughly) Daily

October 26, 2019 at 1:01 am

“If you think you are too small to make a difference, try sleeping with a mosquito”*…

 

mosquito

 

A month after the opening salvos of the American Revolution at Lexington and Concord in April 1775, the newly appointed commander in chief of the Continental Army, George Washington, had a request for his political masters in the Continental Congress. He urged them to buy up as much cinchona bark and quinine powder as possible. Given the dire financial pressures of the squabbling colonial government, and the dearth of pretty much everything needed to fight a war, his total allotment was a paltry 300 pounds. General Washington was a frequent visitor to the quinine chest, as he had suffered recurrent bouts (and reinfections) of malaria since first contracting the disease in 1749 at the age of seventeen.

Luckily for the Americans, the British were also drastically short of Spanish-supplied Peruvian quinine throughout the war. In 1778, shortly before they entered the fray in support of the American cause, the Spanish cut off this supply completely. Any available stores were sent to British troops in India and the Caribbean. At the same time, the mosquito’s merciless, unrelenting strikes on unseasoned British troops lacking quinine during the final British southern campaign — launched in 1780 with the capture of Charleston, the strategic port city and mosquito sanctuary — determined the fate of the United States of America.

As J. R. McNeill colorfully contours, ‘The argument here is straightforward: In the American Revolution the British southern campaigns ultimately led to defeat at Yorktown in October 1781 in part because their forces were much more susceptible to malaria than were the American. . . . [T]he balance tipped because Britain’s grand strategy committed a larger proportion of the army to malarial (and yellow fever) zones.’ A full 70% of the British Army that marched into this southern mosquito maelstrom in 1780 was recruited from the poorer, famished regions of Scotland and the northern counties of England, outside the malaria belt of Pip’s Fenland marshes. Those who had already served some time in the colonies had done so in the northern zone of infection and had not yet been seasoned to American malaria.

General Washington and the Continental Congress, on the other hand, had the advantage of commanding acclimated, malaria-seasoned colonial troops. American militiamen had been hardened to their surroundings during the Seven Years’ War and the turbulent decades heading toward open hostilities against their king. Washington personally recognized, albeit short of scientific affirmation or medical endorsement, that with his recurrent malarial seasonings, ‘I have been protected beyond all human probability or expectation.’ While they did not know it at the time, this might well have been the Americans’ only advantage over the British when, after twelve years of seething resentment and discontent since the passing of the Royal Proclamation [of 1763 that prohibited land sales to colonists], war suddenly and unexpectedly came.

The Americans’ secret weapon– an excerpt from Timothy C. Winegard’s The Mosquito: A Human History of Our Deadliest Predator: “George Washington, Mosquitoes, and the American Revolution.”

[via the ever-illuminating Delanceyplace.com]

* Dalai Lama XIV

###

As we douse ourselves in DEET, we might recall that it was on this date in 1781 – before the fall of Yorktown, but after a decisive week of fighting – that General George Washington wrote to the President of the Continental Congress to give an account of the recent action.  Three days later the Siege of Yorktown (as it became known) ended with the surrender of British forces under General Cornwallis.  It was the final major land battle of the Revolutionary War; the capture of Cornwallis and his army prompted the British government to negotiate an end to the conflict.


Surrender of Lord Cornwallis, by John Trumbull

source

 

Written by (Roughly) Daily

October 16, 2019 at 1:01 am

“I never quite envisioned myself a proper doctor under that white coat”*…

“The Agnew Clinic,” by Thomas Eakins, 1889

Toward the end of the 19th century, Western medicine had an image problem. Joseph Lister’s ideas about antiseptics were spreading, and John Snow had made a breakthrough in mapping the spread of cholera. But to the public, most medical “cures” were little more than quackery and mysticism, and the appearance of a physician merely presaged a painful death.

At the same time, the reputation of science was in rapid ascendancy. The Industrial Revolution was transforming the towns and cities of Europe and America, and new breakthroughs were reported on a weekly basis in more than a thousand different scientific journals.

So the medical establishment did a costume change. Doctors dropped their traditional black coats, which were worn either as a mark of formality (like a tuxedo) or to symbolize the solemnity of their profession, and instead opted for white coats like the ones worn by scientists in their laboratories…

Sartorial history at its most clinical: “Why the White Lab Coat Changed Medical History.”

* Robert Jay Lifton

###

As we battle “white coat hypertension,” we might spare a thought for Sir Ronald Ross; he died on this date in 1932.  A physician, bacteriologist, and mathematician, he located the malarial parasite in the gut of the Anopheles mosquito, identifying it as the disease vector– for which he became the first British Nobelist, awarded the 1902 Prize for Physiology or Medicine.

source

Written by (Roughly) Daily

September 16, 2016 at 1:01 am