(Roughly) Daily


“The pursuit of science is a grand adventure, driven by curiosity, fueled by passion, and guided by reason”*…

Adam Mastroianni on how science advances (and how it’s held back), with a provocative set of suggestions for how it might be accelerated…

There are two kinds of problems in the world: strong-link problems and weak-link problems.

Weak-link problems are problems where the overall quality depends on how good the worst stuff is. You fix weak-link problems by making the weakest links stronger, or by eliminating them entirely.

Food safety, for example, is a weak-link problem. You don’t want to eat anything that will kill you. That’s why it makes sense for the Food and Drug Administration to inspect processing plants, to set standards, and to ban dangerous foods…

Weak-link problems are everywhere. A car engine is a weak-link problem: it doesn’t matter how great your spark plugs are if your transmission is busted. Nuclear proliferation is a weak-link problem: it would be great if, say, France locked up their nukes even tighter, but the real danger is some rogue nation blowing up the world. Putting on too-tight pants is a weak-link problem: they’re gonna split at the seams.

It’s easy to assume that all problems are like this, but they’re not. Some problems are strong-link problems: overall quality depends on how good the best stuff is, and the bad stuff barely matters. Like music, for instance. You listen to the stuff you like the most and ignore the rest. When your favorite band releases a new album, you go “yippee!” When a band you’ve never heard of and wouldn’t like anyway releases a new album, you go…nothing at all, you don’t even know it’s happened. At worst, bad music makes it a little harder for you to find good music, or it annoys you by being played on the radio in the grocery store while you’re trying to buy your beetle-free asparagus…

Strong-link problems are everywhere; they’re just harder to spot. Winning the Olympics is a strong-link problem: all that matters is how good your country’s best athletes are. Friendships are a strong-link problem: you wouldn’t trade your ride-or-dies for better acquaintances. Venture capital is a strong-link problem: it’s fine to invest in a bunch of startups that go bust as long as one of them goes to a billion…
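One way to make the distinction concrete: weak-link and strong-link problems correspond to different aggregation rules over the quality of a system's parts. A minimal sketch in Python (my illustration, with invented function names and scores; nothing here is from the post):

```python
# Illustrative sketch: the value of a system under the two problem types.

def weak_link_value(qualities):
    # Weak-link: overall quality depends on the worst element
    # (food safety, a car engine, nuclear security).
    return min(qualities)

def strong_link_value(qualities):
    # Strong-link: overall quality depends on the best element
    # (music, Olympic teams, venture capital, science).
    return max(qualities)

scores = [1, 2, 3, 9]  # hypothetical quality scores for four "links"
print(weak_link_value(scores))   # the 1 dominates: fix or remove the worst
print(strong_link_value(scores)) # the 9 dominates: the duds barely matter
```

On this reading, gatekeeping (inspections, peer review) raises the minimum, while hands-off approaches raise the maximum; the essay's claim is that science rewards the latter.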

In the long run, the best stuff is basically all that matters, and the bad stuff doesn’t matter at all. The history of science is littered with the skulls of dead theories. No more phlogiston nor phlegm, no more luminiferous ether, no more geocentrism, no more measuring someone’s character by the bumps on their head, no more barnacles magically turning into geese, no more invisible rays shooting out of people’s eyes, no more plum pudding…

Our current scientific beliefs are not a random mix of the dumbest and smartest ideas from all of human history, and that’s because the smarter ideas stuck around while the dumber ones kind of went nowhere, on average—the hallmark of a strong-link problem. That doesn’t mean better ideas win immediately. Worse ideas can soak up resources and waste our time, and frauds can mislead us temporarily. It can take longer than a human lifetime to figure out which ideas are better, and sometimes progress only happens when old scientists die. But when a theory does a better job of explaining the world, it tends to stick around.

(Science being a strong-link problem doesn’t mean that science is currently strong. I think we’re still living in the Dark Ages, just less dark than before.)

Here’s the crazy thing: most people treat science like it’s a weak-link problem.

Peer reviewing publications and grant proposals, for example, is a massive weak-link intervention. We spend ~15,000 collective years of effort every year trying to prevent bad research from being published. We force scientists to spend huge chunks of time filling out grant applications—most of which will be unsuccessful—because we want to make sure we aren’t wasting our money…

I think there are two reasons why scientists act like science is a weak-link problem.

The first reason is fear. Competition for academic jobs, grants, and space in prestigious journals is more cutthroat than ever. When a single member of a grant panel, hiring committee, or editorial board can tank your career, you better stick to low-risk ideas. That’s fine when we’re trying to keep beetles out of asparagus, but it’s not fine when we’re trying to discover fundamental truths about the world…

The second reason is status. I’ve talked to a lot of folks since I published “The rise and fall of peer review” and got a lot of comments, and I’ve realized that when scientists tell me, “We need to prevent bad research from being published!” they often mean, “We need to prevent people from gaining academic status that they don’t deserve!” That is, to them, the problem with bad research isn’t really that it distorts the scientific record. The problem with bad research is that it’s cheating…

I get that. It’s maddening to watch someone get ahead using shady tactics, and it might seem like the solution is to tighten the rules so we catch more of the cheaters. But that’s weak-link thinking. The real solution is to care less about the hierarchy…

Here’s our reward for a generation of weak-link thinking.

The US government spends ~10x more on science today than it did in 1956, adjusted for inflation. We’ve got loads more scientists, and they publish way more papers. And yet science is less disruptive than ever, scientific productivity has been falling for decades, and scientists rate the discoveries of decades ago as worthier than the discoveries of today. (Reminder, if you want to blame this on ideas getting harder to find, I will fight you.)…

Whether we realize it or not, we’re always making calls like this. Whenever we demand certificates, credentials, inspections, professionalism, standards, and regulations, we are saying: “this is a weak-link problem; we must prevent the bad!”

Whenever we demand laissez-faire, the cutting of red tape, the letting of a thousand flowers bloom, we are saying: “this is a strong-link problem; we must promote the good!”

When we get this right, we fill the world with good things and rid the world of bad things. When we don’t, we end up stunting science for a generation. Or we end up eating a lot of asparagus beetles…

“Science is a strong-link problem,” from @a_m_mastroianni in @science_seeds.

* James Clerk Maxwell

###

As we ponder the process of progress, we might spare a thought for Sir Christopher Wren; he died on this date in 1723.  A mathematician and astronomer (who co-founded and later served as president of the Royal Society), he is better remembered as one of the most highly acclaimed English architects in history; he was given responsibility for rebuilding 52 churches in the City of London after the Great Fire in 1666, including what is regarded as his masterpiece, St. Paul’s Cathedral, on Ludgate Hill.

Wren, whose scientific work ranged broadly– e.g., he invented a “weather clock” similar to a modern barometer, devised new engraving methods, and helped develop a blood transfusion technique– was admired by Isaac Newton, as Newton noted in the Principia.


“A translator’s primary work isn’t knowing what it means (that’s a prerequisite, not the work itself). Translation is working out how to say it, how to write it. Translating is writing.”*

Further, in a fashion, to yesterday’s post: the estimable Damion Searls argues for a literary approach to translating rigorous philosophical texts…

Ludwig Wittgenstein’s [here] Tractatus Logico-Philosophicus is a book with an aura. His name, let’s admit it, is already a vibe; the title sets an extremely highbrow tone; the paragraphs are all numbered, promising a very impressive logical rigor, even if questions linger. (Is 6.2322 really exactly one level more primary than 5.47321? What does “3.001” mean since there’s no 3.0 or 3.00?) And then the text itself has a kind of cryptic grandeur, awe-inspiring opacity, Olympian disregard for normal human understanding that gives us what we expect, what we want, from such an iconic philosopher. It’s an exciting challenge. A lot of the reason why the book has been so widely read in the century since its English publication in 1922, by philosophers and philosophy students and nonphilosophers alike, is how it makes its readers feel.

Several similarly forbidding-yet-thereby-thrilling books were published in English that same year—T. S. Eliot’s The Waste Land; James Joyce’s Ulysses—but unlike those, the Tractatus was a translation, and the question arises how much of its style was a byproduct of bringing it into English. The book’s title did not come from Wittgenstein: it was an esoteric pun on Spinoza’s Tractatus Theologico-Politicus from 1670, suggested by G. E. Moore, the Cambridge philosopher who was the fourth most important figure in getting the book into English, after the credited translator C. K. Ogden, the actual translator Frank Ramsey [here], and Bertrand Russell [here]. Wittgenstein’s own German title was the far more humble and straightforward Logisch-Philosophische Abhandlung, something like Essay on Logic and Philosophy. Russell’s introduction, included in the first edition and every subsequent one until this one, firmly placed the book in the context of technical academic philosophy. And the book’s language in English was simply not at all like Wittgenstein’s forceful, earnest, fluid, subtle German.

Yet the book in English is what it is; should it just stay that way? This same debate came up around the retranslation of yet another iconic book from 1922: C. K. Scott Moncrieff’s translation of Marcel Proust’s Swann’s Way. He too completely changed the title (from Proust’s In Search of Lost Time to the Shakespeare quote Remembrance of Things Past); he too created an English-language voice, lush and purple, that wasn’t the original’s. And yet his writing was what generations of English-language Proust readers knew and loved; his translation was modified slightly over the years but largely preserved; when Lydia Davis came along with a new translation faithful to other aspects of the original, such as Proust’s analytical rigor, many readers didn’t care whether or not her version was more like the real Proust—Scott Moncrieff’s Proust was the real thing as far as they were concerned.

The situation with the Tractatus is clearer and less debatable, for two reasons. First, the earlier translations are more deeply flawed than Scott Moncrieff’s Proust ever was. Second and perhaps more important, Wittgenstein’s book is explicitly about the relationships between language and thought, between language and the world, making it imperative to get these relationships right in translation. And so I have retranslated the book, paying special attention to where the assumptions of typical academic philosophy translation would lead us away from expressing Wittgenstein’s thought in English. Implicitly, I am making the case for a certain kind of approach that is generally called “literary”—attentive to emotional nuances, subtle connotations, and expressive power—even when translating rigorous philosophical texts…

Eminently worth reading in full: “Translating Philosophy: The Case of Wittgenstein’s Tractatus,” in @wwborders.

* The equally estimable Emily Wilson, paraphrasing Searls

###

As we emphasize essence, we might spare a thought for the creator of the inspiration of the title of Wittgenstein’s work, Baruch Spinoza; he died on this date in 1677. One of the foremost thinkers of the Age of Reason, he was a philosopher who contributed to nearly every area of philosophical discourse, including metaphysics, epistemology, political philosophy, ethics, philosophy of mind, and philosophy of science. His rationalism and determinism put him in opposition to Descartes and helped lay the foundation for the Enlightenment; his pantheistic views led to his excommunication from the Jewish community in Amsterdam.

As men’s habits of mind differ, so that some more readily embrace one form of faith, some another, for what moves one to pray may move another to scoff, I conclude … that everyone should be free to choose for himself the foundations of his creed, and that faith should be judged only by its fruits; each would then obey God freely with his whole heart, while nothing would be publicly honored save justice and charity.

Tractatus Theologico-Politicus, 1670


“A prudent question is one-half of wisdom”*…

Sir Francis Bacon, portrait by Paul van Somer I, 1617

The death of Queen Elizabeth I created a career opportunity for philosopher and statesman Francis Bacon– one that, as Susan Wise Bauer explains– led him to found empiricism, to pioneer inductive reasoning, and in so doing, to advance the scientific method…

In 1603, Francis Bacon, London born, was forty-three years old: a trained lawyer and amateur philosopher, happily married, politically ambitious, perpetually in debt.

He had served Elizabeth I of England loyally at court, without a great deal of recognition in return. But now Elizabeth was dead at the age of sixty-nine, and her crown would go to her first cousin twice removed: James VI of Scotland, James I of England.

Francis Bacon hoped for better things from the new king, but at the moment he had no particular ‘in’ at the English court. Forced to be patient, he began working on a philosophical project he’d had in mind for some years–a study of human knowledge that he intended to call Of the Proficience and Advancement of Learning, Divine and Human.

Like most of Bacon’s undertakings, the project was ridiculously ambitious. He set out to classify all learning into the proper branches and lay out all of the possible impediments to understanding. Part I condemned what he called the three ‘distempers’ of learning, which included ‘vain imaginations,’ pursuits such as astrology and alchemy that had no basis in actual fact; Part II divided all knowledge into three branches and suggested that natural philosophy should occupy the prime spot. Science, the project of understanding the universe, was the most important pursuit man could undertake. The study of history (‘everything that has happened’) and poesy (imaginative writings) took definite second and third places.

For a time, Bacon didn’t expand on these ideas. The Advancement of Learning opened with a fulsome dedication to James I (‘I have been touched–yea, and possessed–with an extreme wonder at those your virtues and faculties . . . the largeness of your capacity, the faithfulness of your memory, the swiftness of your apprehension, the penetration of your judgment, and the facility and order of your elocution …. There hath not been since Christ’s time any king or temporal monarch which hath been so learned in all literature and erudition, divine and human’), and this groveling soon yielded fruit. In 1607 Bacon was appointed as solicitor general, a position he had coveted for years, and over the next decade or so he poured his energies into his government responsibilities.

He did not return to natural philosophy until after his appointment to the even higher post of chancellor in 1618. Now that he had battled his way to the top of the political dirt pile, he announced his intentions to write a work with even greater scope–a new, complete system of philosophy that would shape the minds of men and guide them into new truths. He called this masterwork the Great Instauration: the Great Establishment, a whole new way of thinking, laid out in six parts.

Part I, a survey of the existing ‘ancient arts’ of the mind, repeated the arguments of the Advancement of Learning. But Part II, published in 1620 as a stand-alone work, was something entirely different. It was a wholesale challenge to Aristotelian methods, a brand-new ‘doctrine of a more perfect use of reason.’

Aristotelian thinking relies heavily on deductive reasoning– for ancient logicians and philosophers, the highest and best road to the truth. Deductive reasoning moves from general statements (premises) to specific conclusions.

MAJOR PREMISE: All heavy matter falls toward the center of the universe.

MINOR PREMISE: The earth is made of heavy matter.

MINOR PREMISE: The earth is not falling.

CONCLUSION: The earth must already be at the center of the universe.

But Bacon had come to believe that deductive reasoning was a dead end that distorted evidence: ‘Having first determined the question according to his will,’ he objected, ‘man then resorts to experience, and bending her to conformity with his placets [expressions of assent], leads her about like a captive in a procession.’ Instead, he argued, the careful thinker must reason the other way around: starting from specifics and building toward general conclusions, beginning with particular pieces of evidence and working, inductively, toward broader assertions.

This new way of thinking–inductive reasoning–had three steps to it. The ‘true method,’ Bacon explained,

‘first lights the candle, and then by means of the candle shows the way; commencing as it does with experience duly ordered and digested, not bungling or erratic, and from it deducing axioms, and from established axioms again new experiments.’

In other words, the natural philosopher must first come up with an idea about how the world works: ‘lighting the candle.’ Second, he must test the idea against physical reality, against ‘experience duly ordered’–both observations of the world around him and carefully designed experiments. Only then, as a last step, should he ‘deduce axioms,’ coming up with a theory that could be claimed to carry truth. 

Hypothesis, experiment, conclusion: Bacon had just traced the outlines of the scientific method…

“Francis Bacon and the Scientific Method”

An excerpt from The Story of Western Science by @SusanWiseBauer, via the invaluable @delanceyplace.

* Francis Bacon

###

As we embrace empiricism, we might send carefully-transmitted birthday greetings to Augusto Righi; he was born on this date in 1850. A physicist and a pioneer in the study of electromagnetism, he showed that radio waves displayed characteristics of light wave behavior (reflection, refraction, polarization, and interference), with which they shared the electromagnetic spectrum. In 1894 Righi was the first person to generate microwaves.

Righi influenced the young Guglielmo Marconi, the inventor of radio, who visited him at his lab. Indeed, Marconi invented the first practical wireless telegraphy radio transmitters and receivers in 1894 using Righi’s four ball spark oscillator (from Righi’s microwave work) in his transmitters.


“Over the long term, symbiosis is more useful than parasitism. More fun, too.”*…

Blue-green formations of malachite form in copper deposits near the surface as they weather. But they could only arise after life raised atmospheric oxygen levels, starting about 2.5 billion years ago.

There are many more varieties of minerals on Earth than previously believed– and about half of them formed as parts or byproducts of living things…

The impact of Earth’s geology on life is easy to see, with organisms adapting to environments as different as deserts, mountains, forests, and oceans. The full impact of life on geology, however, can be easy to miss.

A comprehensive new survey of our planet’s minerals now corrects that omission. Among its findings is evidence that about half of all mineral diversity is the direct or indirect result of living things and their byproducts. It’s a discovery that could provide valuable insights to scientists piecing together Earth’s complex geological history—and also to those searching for evidence of life beyond this world.

In a pair of papers published on July 1, 2022 in American Mineralogist, researchers Robert Hazen, Shaunna Morrison, and their collaborators outline a new taxonomic system for classifying minerals, one that places importance on precisely how minerals form, not just how they look. In so doing, their system acknowledges how Earth’s geological development and the evolution of life influence each other.

Their new taxonomy, based on an algorithmic analysis of thousands of scientific papers, recognizes more than 10,500 different types of minerals. That’s almost twice as many as the roughly 5,800 mineral “species” in the classic taxonomy of the International Mineralogical Association, which focuses strictly on a mineral’s crystalline structure and chemical makeup.

Morrison and Hazen also identified 57 processes that individually or in combination created all known minerals. These processes included various types of weathering, chemical precipitations, metamorphic transformation inside the mantle, lightning strikes, radiation, oxidation, massive impacts during Earth’s formation, and even condensations in interstellar space before the planet formed. They confirmed that the biggest single factor in mineral diversity on Earth is water, which through a variety of chemical and physical processes helps to generate more than 80 percent of minerals.

But they also found that life is a key player: One-third of all mineral kinds form exclusively as parts or byproducts of living things—such as bits of bones, teeth, coral, and kidney stones (which are all rich in mineral content) or feces, wood, microbial mats, and other organic materials that over geologic time can absorb elements from their surroundings and transform into something more like rock. Thousands of minerals are shaped by life’s activity in other ways, such as germanium compounds that form in industrial coal fires. Including substances created through interactions with byproducts of life, such as the oxygen produced in photosynthesis, life’s fingerprints are on about half of all minerals.

Historically, scientists “have artificially drawn a line between what is geochemistry and what is biochemistry,” said Nita Sahai, a biomineralization specialist at the University of Akron in Ohio who was not involved in the new research. In reality, the boundary between animal, vegetable, and mineral is much more fluid.

A new origins-based system for classifying minerals reveals the huge geochemical imprint that life has left on Earth. It could help us identify other worlds with life too: “Life Helps Make Almost Half of All Minerals on Earth,” from @jojofoshosho0 in @QuantaMagazine.

* Larry Wall

###

As we muse on minerals, we might send systemic birthday greetings to Thomas Samuel Kuhn; he was born on this date in 1922.  A physicist, historian, and philosopher of science, Kuhn believed that scientific knowledge didn’t advance in a linear, continuous way, but via periodic “paradigm shifts.”  Karl Popper had approached the same territory in his development of the principle of “falsification” (to paraphrase: a theory can never be proven true; it stands only until it is proven false).  But while Popper worked as a logician, Kuhn worked as a historian.  His 1962 book The Structure of Scientific Revolutions made his case; and while he had– and has– his detractors, Kuhn’s work has been deeply influential in both academic and popular circles (indeed, the phrase “paradigm shift” has become an English-language staple).

“What man sees depends both upon what he looks at and also upon what his previous visual-conceptual experience has taught him to see.”

Thomas S. Kuhn, The Structure of Scientific Revolutions


“Behind the hieroglyphic streets there would either be a transcendent meaning, or only the earth”*…

Gerardo Dottori, Explosion of Red on Green, 1910, oil on canvas. London, Tate Modern. [source]

A crop of new books attempts to explain the allure of conspiracy theories and the power of belief; Trevor Quirk considers them…

For the millions who were enraged, disgusted, and shocked by the Capitol riots of January 6, the enduring object of skepticism has been not so much the lie that provoked the riots but the believers themselves. A year out, and book publishers confirmed this, releasing titles that addressed the question still addling public consciousness: How can people believe this shit? A minority of rioters at the Capitol had nefarious intentions rooted in authentic ideology, but most of them conveyed no purpose other than to announce to the world that they believed — specifically, that the 2020 election was hijacked through an international conspiracy — and that nothing could sway their confidence. This belief possessed them, not the other way around.

At first, I’d found the riots both terrifying and darkly hilarious, but those sentiments were soon overcome by a strange exasperation that has persisted ever since. It’s a feeling that has robbed me of my capacity to laugh at conspiracy theories — QAnon, chemtrails, lizardmen, whatever — and the people who espouse them. My exasperation is for lack of an explanation. I see Trump’s most devoted hellion, rampaging down the halls of power like a grade schooler after the bell, and I need to know the hidden causes of his dopey rebellion. To account for our new menagerie of conspiracy theories, I told myself, would be to reclaim the world from entropy, to snap experience neatly to the grid once again. I would use recent books as the basis for my account of conspiracy theories in the age of the internet. From their pages I would extract insights and errors like newspaper clippings, pin the marginal, bizarre, and seemingly irrelevant details to the corkboard of my mind, where I could spy eerie resonances, draw unseen connections. At last, I could reveal that our epistemic bedlam is as a Twombly canvas — messy but decipherable…

Learn with @trevorquirk: “Out There,” in @GuernicaMag.

* Thomas Pynchon, The Crying of Lot 49

###

As we tangle with truth, we might send rigorous birthday greetings to Gustav Bergmann; he was born on this date in 1906. A philosopher, he was a member of the Vienna Circle, a group of philosophers and scientists drawn from the natural and social sciences, logic and mathematics, whose values were rooted in the ideals of the Enlightenment. Their approach, logical positivism, an attempt to use logic to make philosophy “scientific,” has had immense influence on 20th-century philosophy, especially on the philosophy of science and analytic philosophy… even if it has not, in fact, eliminated the issues explored above.


We might also send birthday greetings in the form of logical and semantic puzzles both to the precocious protagonist of Alice’s Adventures in Wonderland and to her inspiration, Alice Liddell; they were “born” on this date in 1852.
