(Roughly) Daily

“A prudent question is one-half of wisdom”*…

Sir Francis Bacon, portrait by Paul van Somer I, 1617

The death of Queen Elizabeth I created a career opportunity for philosopher and statesman Francis Bacon– one that, as Susan Wise Bauer explains– led him to found empiricism, to pioneer inductive reasoning, and in so doing, to advance the scientific method…

In 1603, Francis Bacon, London born, was forty-three years old: a trained lawyer and amateur philosopher, happily married, politically ambitious, perpetually in debt.

He had served Elizabeth I of England loyally at court, without a great deal of recognition in return. But now Elizabeth was dead at the age of sixty-nine, and her crown would go to her first cousin twice removed: James VI of Scotland, James I of England.

Francis Bacon hoped for better things from the new king, but at the moment he had no particular ‘in’ at the English court. Forced to be patient, he began working on a philosophical project he’d had in mind for some years–a study of human knowledge that he intended to call Of the Proficience and Advancement of Learning, Divine and Human.

Like most of Bacon’s undertakings, the project was ridiculously ambitious. He set out to classify all learning into the proper branches and lay out all of the possible impediments to understanding. Part I condemned what he called the three ‘distempers’ of learning, which included ‘vain imaginations,’ pursuits such as astrology and alchemy that had no basis in actual fact; Part II divided all knowledge into three branches and suggested that natural philosophy should occupy the prime spot. Science, the project of understanding the universe, was the most important pursuit man could undertake. The study of history (‘everything that has happened’) and poesy (imaginative writings) took definite second and third places.

For a time, Bacon didn’t expand on these ideas. The Advancement of Learning opened with a fulsome dedication to James I (‘I have been touched–yea, and possessed–with an extreme wonder at those your virtues and faculties . . . the largeness of your capacity, the faithfulness of your memory, the swiftness of your apprehension, the penetration of your judgment, and the facility and order of your elocution …. There hath not been since Christ’s time any king or temporal monarch which hath been so learned in all literature and erudition, divine and human’), and this groveling soon yielded fruit. In 1607 Bacon was appointed as solicitor general, a position he had coveted for years, and over the next decade or so he poured his energies into his government responsibilities.

He did not return to natural philosophy until after his appointment to the even higher post of chancellor in 1618. Now that he had battled his way to the top of the political dirt pile, he announced his intentions to write a work with even greater scope–a new, complete system of philosophy that would shape the minds of men and guide them into new truths. He called this masterwork the Great Instauration: the Great Establishment, a whole new way of thinking, laid out in six parts.

Part I, a survey of the existing ‘ancient arts’ of the mind, repeated the arguments of the Advancement of Learning. But Part II, published in 1620 as a stand-alone work, was something entirely different. It was a wholesale challenge to Aristotelian methods, a brand-new ‘doctrine of a more perfect use of reason.’

Aristotelian thinking relies heavily on deductive reasoning: for ancient logicians and philosophers, the highest and best road to the truth. Deductive reasoning moves from general statements (premises) to specific conclusions.

MAJOR PREMISE: All heavy matter falls toward the center of the universe.
MINOR PREMISE: The earth is made of heavy matter.
MINOR PREMISE: The earth is not falling.
CONCLUSION: The earth must already be at the center of the universe.
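
The force of the argument lies entirely in its form: grant the general premises, and the particular conclusion follows mechanically. A toy formalization in Lean makes that one-way dependence explicit (the names are illustrative, and the major premise is stated in its Aristotelian reading: a heavy body that is not falling must already be at the center):

```lean
-- A toy formalization of the syllogism above; all names are hypothetical.
variable (Body : Type) (earth : Body)
variable (Heavy Falling AtCenter : Body → Prop)

-- MAJOR PREMISE (Aristotelian reading): any heavy body that is not
-- falling is already at the center of the universe.
variable (major : ∀ b, Heavy b → ¬ Falling b → AtCenter b)
-- MINOR PREMISES: the earth is heavy, and the earth is not falling.
variable (heavyEarth : Heavy earth) (notFalling : ¬ Falling earth)

-- CONCLUSION: apply the general rule to the particular case.
example : AtCenter earth := major earth heavyEarth notFalling
```

Note that the proof can only combine the premises it is given; it has no way to manufacture or test them. That is precisely the weakness Bacon seized on.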

But Bacon had come to believe that deductive reasoning was a dead end that distorted evidence: ‘Having first determined the question according to his will,’ he objected, ‘man then resorts to experience, and bending her to conformity with his placets [expressions of assent], leads her about like a captive in a procession.’ Instead, he argued, the careful thinker must reason the other way around: starting from specifics and building toward general conclusions, beginning with particular pieces of evidence and working, inductively, toward broader assertions.

This new way of thinking–inductive reasoning–had three steps to it. The ‘true method,’ Bacon explained,

‘first lights the candle, and then by means of the candle shows the way; commencing as it does with experience duly ordered and digested, not bungling or erratic, and from it deducing axioms, and from established axioms again new experiments.’

In other words, the natural philosopher must first come up with an idea about how the world works: ‘lighting the candle.’ Second, he must test the idea against physical reality, against ‘experience duly ordered’–both observations of the world around him and carefully designed experiments. Only then, as a last step, should he ‘deduce axioms,’ coming up with a theory that could be claimed to carry truth. 

Hypothesis, experiment, conclusion: Bacon had just traced the outlines of the scientific method…
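
Read as an algorithm, Bacon’s three steps have a natural shape. Here is a minimal Python sketch of the loop he describes, using an invented spring-stretching experiment as the ‘experience duly ordered’ (the data, names, and tolerance are all illustrative, not from the source):

```python
# A minimal sketch of Bacon's loop (illustrative, not his formalism):
#   1. "Light the candle": propose a hypothesis.
#   2. Test it against "experience duly ordered": systematic observations.
#   3. Only then "deduce axioms": keep the claim if no observation refutes
#      it, and let the new axiom suggest further experiments.

def survives(hypothesis, observations):
    """True if every ordered observation is consistent with the hypothesis."""
    return all(hypothesis(obs) for obs in observations)

# Invented data: (load, stretch) pairs from a spring-stretching experiment.
observations = [(1, 2.0), (2, 4.1), (3, 5.9), (4, 8.2)]

# Step 1: a candidate generalization, with tolerance for imperfect measurement.
def hypothesis(obs):
    load, stretch = obs
    return abs(stretch - 2.0 * load) < 0.3

# Steps 2 and 3: test against experience; promote to a (provisional) axiom.
if survives(hypothesis, observations):
    print("Axiom (provisional): stretch is about 2 x load; design new experiments")
else:
    print("Refuted: light a new candle")
```

The point of the sketch is the ordering: the axiom comes last, and even then only provisionally, pending the ‘new experiments’ it generates.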

Francis Bacon and the Scientific Method

An excerpt from The Story of Western Science by @SusanWiseBauer, via the invaluable @delanceyplace.

* Francis Bacon

###

As we embrace empiricism, we might send carefully-transmitted birthday greetings to Augusto Righi; he was born on this date in 1850. A physicist and a pioneer in the study of electromagnetism, he showed that radio waves displayed characteristics of light wave behavior (reflection, refraction, polarization, and interference), with which they shared the electromagnetic spectrum. In 1894 Righi was the first person to generate microwaves.

Righi influenced the young Guglielmo Marconi, the inventor of radio, who visited him at his lab. Indeed, Marconi invented the first practical wireless telegraphy radio transmitters and receivers in 1894 using Righi’s four ball spark oscillator (from Righi’s microwave work) in his transmitters.

source

“Over the long term, symbiosis is more useful than parasitism. More fun, too.”*…

Blue-green formations of malachite form in copper deposits near the surface as they weather. But they could only arise after life raised atmospheric oxygen levels, starting about 2.5 billion years ago.

There are many more varieties of minerals on earth than previously believed– and about half of them formed as parts or byproducts of living things…

The impact of Earth’s geology on life is easy to see, with organisms adapting to environments as different as deserts, mountains, forests, and oceans. The full impact of life on geology, however, can be easy to miss.

A comprehensive new survey of our planet’s minerals now corrects that omission. Among its findings is evidence that about half of all mineral diversity is the direct or indirect result of living things and their byproducts. It’s a discovery that could provide valuable insights to scientists piecing together Earth’s complex geological history—and also to those searching for evidence of life beyond this world.

In a pair of papers published on July 1, 2022 in American Mineralogist, researchers Robert Hazen, Shaunna Morrison, and their collaborators outline a new taxonomic system for classifying minerals, one that places importance on precisely how minerals form, not just how they look. In so doing, their system acknowledges how Earth’s geological development and the evolution of life influence each other.

Their new taxonomy, based on an algorithmic analysis of thousands of scientific papers, recognizes more than 10,500 different types of minerals. That’s almost twice as many as the roughly 5,800 mineral “species” in the classic taxonomy of the International Mineralogical Association, which focuses strictly on a mineral’s crystalline structure and chemical makeup.

Morrison and Hazen also identified 57 processes that individually or in combination created all known minerals. These processes included various types of weathering, chemical precipitations, metamorphic transformation inside the mantle, lightning strikes, radiation, oxidation, massive impacts during Earth’s formation, and even condensations in interstellar space before the planet formed. They confirmed that the biggest single factor in mineral diversity on Earth is water, which through a variety of chemical and physical processes helps to generate more than 80 percent of minerals.

But they also found that life is a key player: One-third of all mineral kinds form exclusively as parts or byproducts of living things—such as bits of bones, teeth, coral, and kidney stones (which are all rich in mineral content) or feces, wood, microbial mats, and other organic materials that over geologic time can absorb elements from their surroundings and transform into something more like rock. Thousands of minerals are shaped by life’s activity in other ways, such as germanium compounds that form in industrial coal fires. Including substances created through interactions with byproducts of life, such as the oxygen produced in photosynthesis, life’s fingerprints are on about half of all minerals.

Historically, scientists “have artificially drawn a line between what is geochemistry and what is biochemistry,” said Nita Sahai, a biomineralization specialist at the University of Akron in Ohio who was not involved in the new research. In reality, the boundary between animal, vegetable, and mineral is much more fluid.

A new origins-based system for classifying minerals reveals the huge geochemical imprint that life has left on Earth. It could help us identify other worlds with life too: “Life Helps Make Almost Half of All Minerals on Earth,” from @jojofoshosho0 in @QuantaMagazine.

* Larry Wall

###

As we muse on minerals, we might send systemic birthday greetings to Thomas Samuel Kuhn; he was born on this date in 1922. A physicist, historian, and philosopher of science, Kuhn believed that scientific knowledge didn’t advance in a linear, continuous way, but via periodic “paradigm shifts.” Karl Popper had approached the same territory in his development of the principle of “falsification” (to paraphrase, a theory can never be proven true; it stands only until it is proven false). But while Popper worked as a logician, Kuhn worked as a historian. His 1962 book The Structure of Scientific Revolutions made his case; and while he had– and has– his detractors, Kuhn’s work has been deeply influential in both academic and popular circles (indeed, the phrase “paradigm shift” has become an English-language staple).

“What man sees depends both upon what he looks at and also upon what his previous visual-conception experience has taught him to see.”

Thomas S. Kuhn, The Structure of Scientific Revolutions

source

“Behind the hieroglyphic streets there would either be a transcendent meaning, or only the earth”*…

Gerardo Dottori, Explosion of Red on Green, 1910, oil on canvas. London, Tate Modern. [source]

A crop of new books attempts to explain the allure of conspiracy theories and the power of belief; Trevor Quirk considers them…

For the millions who were enraged, disgusted, and shocked by the Capitol riots of January 6, the enduring object of skepticism has been not so much the lie that provoked the riots but the believers themselves. A year out, and book publishers confirmed this, releasing titles that addressed the question still addling public consciousness: How can people believe this shit? A minority of rioters at the Capitol had nefarious intentions rooted in authentic ideology, but most of them conveyed no purpose other than to announce to the world that they believed — specifically, that the 2020 election was hijacked through an international conspiracy — and that nothing could sway their confidence. This belief possessed them, not the other way around.

At first, I’d found the riots both terrifying and darkly hilarious, but those sentiments were soon overwon by a strange exasperation that has persisted ever since. It’s a feeling that has robbed me of my capacity to laugh at conspiracy theories — QAnon, chemtrails, lizardmen, whatever — and the people who espouse them. My exasperation is for lack of an explanation. I see Trump’s most devoted hellion, rampaging down the halls of power like a grade schooler after the bell, and I need to know the hidden causes of his dopey rebellion. To account for our new menagerie of conspiracy theories, I told myself, would be to reclaim the world from entropy, to snap experience neatly to the grid once again. I would use recent books as the basis for my account of conspiracy theories in the age of the internet. From their pages I would extract insights and errors like newspaper clippings, pin the marginal, bizarre, and seemingly irrelevant details to the corkboard of my mind, where I could spy eerie resonances, draw unseen connections. At last, I could reveal that our epistemic bedlam is as a Twombly canvas — messy but decipherable…

Learn with @trevorquirk: “Out There,” in @GuernicaMag.

* Thomas Pynchon, The Crying of Lot 49

###

As we tangle with truth, we might send rigorous birthday greetings to Gustav Bergmann; he was born on this date in 1906. A philosopher, he was a member of the Vienna Circle, a group of philosophers and scientists drawn from the natural and social sciences, logic, and mathematics, whose values were rooted in the ideals of the Enlightenment. Their approach, logical positivism, an attempt to use logic to make philosophy “scientific,” has had immense influence on 20th-century philosophy, especially on the philosophy of science and analytic philosophy… even if it has not, in fact, eliminated the issues explored above.

source

We might also send birthday greetings in the form of logical and semantic puzzles both to the precocious protagonist of Alice’s Adventures in Wonderland and to her inspiration, Alice Liddell; they were “born” on this date in 1852.

source

“A mind that is stretched by a new idea can never go back to its original dimensions”*…

Alex Berezow observes (in an appreciation of Peter Atkins’ Galileo’s Finger: The Ten Great Ideas of Science) that, while scientific theories are always being tested, scrutinized for flaws, and revised, there are ten concepts so durable that it is difficult to imagine them ever being replaced with something better…

In his book The Structure of Scientific Revolutions, Thomas Kuhn argued that science, instead of progressing gradually in small steps as is commonly believed, actually moves forward in awkward leaps and bounds. The reason for this is that established theories are difficult to overturn, and contradictory data is often dismissed as merely anomalous. However, at some point, the evidence against the theory becomes so overwhelming that it is forcefully displaced by a better one in a process that Kuhn refers to as a “paradigm shift.” And in science, even the most widely accepted ideas could, someday, be considered yesterday’s dogma.

Yet, there are some concepts considered so rock solid that it is difficult to imagine them ever being replaced with something better. What’s more, these concepts have fundamentally altered their fields, unifying and illuminating them in a way that no previous theory had done before…

The bedrock of modern biology, chemistry, and physics: “The ten greatest ideas in the history of science,” from @AlexBerezow in @bigthink.

* Oliver Wendell Holmes

###

As we forage for first principles, we might send carefully-calculated birthday greetings to Georgiy Antonovich Gamov; he was born on this date in 1904. Better known by the name he adopted on immigrating to the U.S., George Gamow, he was a physicist and cosmologist whose early work was instrumental in developing the Big Bang theory of the universe; he also developed the first mathematical model of the atomic nucleus. In 1954, he expanded his interests into biochemistry and his work on deoxyribonucleic acid (DNA) made a basic contribution to modern genetic theory.

But mid-career Gamow began to shift his energy to teaching and to writing popular books on science… one of which, One Two Three… Infinity, inspired legions of young scientists-to-be and kindled a life-long interest in science in an even larger number of other youngsters (including your correspondent).

source

“Real generosity towards the future lies in giving all to the present”*…

Iwan Rhys Morus suggests that we’re enthralled to a Victorian paradigm that haunts us still: the idea that inventors and entrepreneurs hold the keys to the utopian future…

Tech titans like Elon Musk and Jeff Bezos present themselves as men who could single-handedly shape the future. For their supporters, their ruthless drive toward success is their key virtue. And their showmanship — Musk sending a Tesla Roadster into space on a Falcon Heavy rocket, or Bezos sending Captain Kirk into orbit with Blue Origin — is a way of demonstrating that virtue and asserting they are in control.

We owe to the Victorians the idea that there is a firm link between virtue and technological agency. They established a powerful paradigm that continues to haunt us: that the future is (or can be) a utopia, and inventors and entrepreneurs are the ones who know how to get there.

While our notions of virtue have shifted today, we still assume that future-making is the prerogative of very specific sorts of innovators — even as their imagined identities have fractured and transformed. The assumption that innovation is the property of charismatic individuals still underlies the way we think about technology.

The seductive power of Victorian thinking about the relationship between character, technology, and the future remains pervasive, even if views about just what the proper character of the inventor should be have shifted….

With its focus on individual virtue, the Victorian vision of the future is an exclusive one. When we subscribe to this paradigm about how — and by whom — the future is made, we’re also relinquishing control over that future. We’re acknowledging that tomorrow belongs to them, not to us.

“Back To The Victorian Future,” by @irmorus1 in @NoemaMag. Eminently worth reading in full.

* Albert Camus

###

As we ponder power and its purpose, we might send inclusive birthday greetings to Jacques Lucien Monod; he was born on this date in 1910. A biochemist, he shared (with François Jacob and André Lwoff) the Nobel Prize in Physiology or Medicine in 1965, “for their discoveries concerning genetic control of enzyme and virus synthesis.”

But Monod, who became the director of the Pasteur Institute, also made significant contributions to the philosophy of science– in particular via his 1971 book (based on a series of his lectures) Chance and Necessity, in which he examined the philosophical implications of modern biology. The importance of Monod’s work as a bridge between the chance and necessity of evolution and biochemistry on the one hand, and the human realm of choice and ethics on the other, can be seen in his influence on philosophers, biologists, and computer scientists including Daniel Dennett, Douglas Hofstadter, Marvin Minsky, and Richard Dawkins.

source
