Posts Tagged ‘scientific method’
“A prudent question is one-half of wisdom”*…
The death of Queen Elizabeth I created a career opportunity for philosopher and statesman Francis Bacon– one that, as Susan Wise Bauer explains– led him to found empiricism, to pioneer inductive reasoning, and in so doing, to advance the scientific method…
In 1603, Francis Bacon, London born, was forty-three years old: a trained lawyer and amateur philosopher, happily married, politically ambitious, perpetually in debt.
He had served Elizabeth I of England loyally at court, without a great deal of recognition in return. But now Elizabeth was dead at the age of sixty-nine, and her crown would go to her first cousin twice removed: James VI of Scotland, James I of England.
Francis Bacon hoped for better things from the new king, but at the moment he had no particular ‘in’ at the English court. Forced to be patient, he began working on a philosophical project he’d had in mind for some years–a study of human knowledge that he intended to call Of the Proficience and Advancement of Learning, Divine and Human.
Like most of Bacon’s undertakings, the project was ridiculously ambitious. He set out to classify all learning into the proper branches and lay out all of the possible impediments to understanding. Part I condemned what he called the three ‘distempers’ of learning, which included ‘vain imaginations,’ pursuits such as astrology and alchemy that had no basis in actual fact; Part II divided all knowledge into three branches and suggested that natural philosophy should occupy the prime spot. Science, the project of understanding the universe, was the most important pursuit man could undertake. The study of history (‘everything that has happened’) and poesy (imaginative writings) took definite second and third places.
For a time, Bacon didn’t expand on these ideas. The Advancement of Learning opened with a fulsome dedication to James I (‘I have been touched–yea, and possessed–with an extreme wonder at those your virtues and faculties . . . the largeness of your capacity, the faithfulness of your memory, the swiftness of your apprehension, the penetration of your judgment, and the facility and order of your elocution …. There hath not been since Christ’s time any king or temporal monarch which hath been so learned in all literature and erudition, divine and human’), and this groveling soon yielded fruit. In 1607 Bacon was appointed as solicitor general, a position he had coveted for years, and over the next decade or so he poured his energies into his government responsibilities.
He did not return to natural philosophy until after his appointment to the even higher post of chancellor in 1618. Now that he had battled his way to the top of the political dirt pile, he announced his intentions to write a work with even greater scope–a new, complete system of philosophy that would shape the minds of men and guide them into new truths. He called this masterwork the Great Instauration: the Great Establishment, a whole new way of thinking, laid out in six parts.
Part I, a survey of the existing ‘ancient arts’ of the mind, repeated the arguments of the Advancement of Learning. But Part II, published in 1620 as a stand-alone work, was something entirely different. It was a wholesale challenge to Aristotelian methods, a brand-new ‘doctrine of a more perfect use of reason.’
Aristotelian thinking relies, heavily, on deductive reasoning– for ancient logicians and philosophers, the highest and best road to the truth. Deductive reasoning moves from general statements (premises) to specific conclusions.
MAJOR PREMISE: All heavy matter falls toward the center of the universe.
MINOR PREMISE: The earth is made of heavy matter.
MINOR PREMISE: The earth is not falling.
CONCLUSION: The earth must already be at the center of the universe.
But Bacon had come to believe that deductive reasoning was a dead end that distorted evidence: ‘Having first determined the question according to his will,’ he objected, ‘man then resorts to experience, and bending her to conformity with his placets [expressions of assent], leads her about like a captive in a procession.’ Instead, he argued, the careful thinker must reason the other way around: starting from specifics and building toward general conclusions, beginning with particular pieces of evidence and working, inductively, toward broader assertions.
This new way of thinking–inductive reasoning–had three steps to it. The ‘true method,’ Bacon explained,
‘first lights the candle, and then by means of the candle shows the way; commencing as it does with experience duly ordered and digested, not bungling or erratic, and from it deducing axioms, and from established axioms again new experiments.’
In other words, the natural philosopher must first come up with an idea about how the world works: ‘lighting the candle.’ Second, he must test the idea against physical reality, against ‘experience duly ordered’–both observations of the world around him and carefully designed experiments. Only then, as a last step, should he ‘deduce axioms,’ coming up with a theory that could be claimed to carry truth.
Hypothesis, experiment, conclusion: Bacon had just traced the outlines of the scientific method…
Francis Bacon and the Scientific Method
An excerpt from The Story of Western Science by @SusanWiseBauer, via the invaluable @delanceyplace.
* Francis Bacon
###
As we embrace empiricism, we might send carefully-transmitted birthday greetings to Augusto Righi; he was born on this date in 1850. A physicist and a pioneer in the study of electromagnetism, he showed that radio waves displayed characteristics of light-wave behavior (reflection, refraction, polarization, and interference), with which they shared the electromagnetic spectrum. In 1894 Righi was the first person to generate microwaves.
Righi influenced the young Guglielmo Marconi, the inventor of radio, who visited him at his lab. Indeed, Marconi invented the first practical wireless telegraphy radio transmitters and receivers in 1894 using Righi’s four-ball spark oscillator (from Righi’s microwave work) in his transmitters.
“Eventually everything connects”*…
Long-time readers will know of your correspondent’s fascination with Powers of Ten, a remarkable short film by Charles and Ray Eames, with Philip Morrison, that begins with a couple having a picnic, zooms out by “powers of ten” to the edge of the universe, then zooms in (by those same increments) to a proton.
We’ve looked before at a number of riffs on this meditation on scale: see, e.g., here, here, and here.
Now the BBC has updated the first half of Powers of Ten:
It’s a trip worth taking.
* Charles Eames
###
As we wrestle with relationships, we might light a birthday candle for Sir Francis Bacon– English Renaissance philosopher, lawyer, linguist, composer, mathematician, geometer, musician, poet, painter, astronomer, classicist, historian, theologian, architect, father of modern science (The Baconian– aka The Scientific– Method), and patron of modern democracy, who some allege was the illegitimate son of Queen Elizabeth I of England (and others, the actual author of Shakespeare’s plays)… He was in any event born on this date in 1561.
Bacon (whose Essays were, in a fashion, the first “management book” in English) was, in Alexander Pope’s words, “the greatest genius that England, or perhaps any country, ever produced.” He probably did not actually write the plays attributed to Shakespeare (as a thin, but long, line of enthusiasts, including Mark Twain and Friedrich Nietzsche, believed). But Bacon did observe, in a discussion of sedition that’s as timely today as ever, that “the remedy is worse than the disease.”

“Alchemy. The link between the immemorial magic arts and modern science. Humankind’s first systematic effort to unlock the secrets of matter by reproducible experiment.”*…
Science has entered a new era of alchemy, suggests Robbert Dijkgraaf, Director of the Institute for Advanced Study at Princeton– and, he argues, that’s a good thing…
Is artificial intelligence the new alchemy? That is, are the powerful algorithms that control so much of our lives — from internet searches to social media feeds — the modern equivalent of turning lead into gold? Moreover: Would that be such a bad thing?
According to the prominent AI researcher Ali Rahimi and others, today’s fashionable neural networks and deep learning techniques are based on a collection of tricks, topped with a good dash of optimism, rather than systematic analysis. Modern engineers, the thinking goes, assemble their codes with the same wishful thinking and misunderstanding that the ancient alchemists had when mixing their magic potions.
It’s true that we have little fundamental understanding of the inner workings of self-learning algorithms, or of the limits of their applications. These new forms of AI are very different from traditional computer codes that can be understood line by line. Instead, they operate within a black box, seemingly unknowable to humans and even to the machines themselves.
This discussion within the AI community has consequences for all the sciences. With deep learning impacting so many branches of current research — from drug discovery to the design of smart materials to the analysis of particle collisions — science itself may be at risk of being swallowed by a conceptual black box. It would be hard to have a computer program teach chemistry or physics classes. By deferring so much to machines, are we discarding the scientific method that has proved so successful, and reverting to the dark practices of alchemy?
Not so fast, says Yann LeCun, co-recipient of the 2018 Turing Award for his pioneering work on neural networks. He argues that the current state of AI research is nothing new in the history of science. It is just a necessary adolescent phase that many fields have experienced, characterized by trial and error, confusion, overconfidence and a lack of overall understanding. We have nothing to fear and much to gain from embracing this approach. It’s simply that we’re more familiar with its opposite.
After all, it’s easy to imagine knowledge flowing downstream, from the source of an abstract idea, through the twists and turns of experimentation, to a broad delta of practical applications. This is the famous “usefulness of useless knowledge,” advanced by Abraham Flexner in his seminal 1939 essay (itself a play on the very American concept of “useful knowledge” that emerged during the Enlightenment).
A canonical illustration of this flow is Albert Einstein’s general theory of relativity. It all began with the fundamental idea that the laws of physics should hold for all observers, independent of their movements. He then translated this concept into the mathematical language of curved space-time and applied it to the force of gravity and the evolution of the cosmos. Without Einstein’s theory, the GPS in our smartphones would drift off course by about 7 miles a day.
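[A rough sanity check of that figure, using commonly cited clock-drift values rather than anything in the excerpt: GPS satellite clocks run fast by roughly 45.9 microseconds per day because of the weaker gravity at orbital altitude (general relativity) and slow by roughly 7.2 microseconds per day because of their orbital speed (special relativity). Left uncorrected, the net timing drift, multiplied by the speed of light, becomes the positioning error:

\Delta t \approx 45.9\,\mu\text{s} - 7.2\,\mu\text{s} \approx 38.7\,\mu\text{s per day}

\Delta x \approx c\,\Delta t \approx (3\times 10^{8}\,\text{m/s})(38.7\times 10^{-6}\,\text{s}) \approx 11.6\ \text{km} \approx 7\ \text{miles per day} ]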
But maybe this paradigm of the usefulness of useless knowledge is what the Danish physicist Niels Bohr liked to call a “great truth” — a truth whose opposite is also a great truth. Maybe, as AI is demonstrating, knowledge can also flow uphill.
In the broad history of science, as LeCun suggested, we can spot many examples of this effect, which can perhaps be dubbed “the uselessness of useful knowledge.” An overarching and fundamentally important idea can emerge from a long series of step-by-step improvements and playful experimentation — say, from Fröbel to Nobel.
Perhaps the best illustration is the discovery of the laws of thermodynamics, a cornerstone of all branches of science. These elegant equations, describing the conservation of energy and increase of entropy, are laws of nature, obeyed by all physical phenomena. But these universal concepts only became apparent after a long, confusing period of experimentation, starting with the construction of the first steam engines in the 18th century and the gradual improvement of their design. Out of the thick mist of practical considerations, mathematical laws slowly emerged…
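[For reference, the two laws the excerpt has in mind, in their standard textbook form– the first law (conservation of energy) and the second law (the entropy of an isolated system never decreases):

dU = \delta Q - \delta W

dS \geq 0 \quad \text{(isolated system)} ]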
One could even argue that science itself has followed this uphill path. Until the birth of the methods and practices of modern research in the 17th century, scientific research consisted mostly of nonsystematic experimentation and theorizing. Long considered academic dead ends, these ancient practices have been reappraised in recent years: Alchemy is now considered to have been a useful and perhaps even necessary precursor to modern chemistry — more proto-science than hocus-pocus.
The appreciation of tinkering as a fruitful path toward grand theories and insights is particularly relevant for current research that combines advanced engineering and basic science in novel ways. Driven by breakthrough technologies, nanophysicists are tinkering away, building the modern equivalents of steam engines on the molecular level, manipulating individual atoms, electrons and photons. Genetic editing tools such as CRISPR allow us to cut and paste the code of life itself. With structures of unimaginable complexity, we are pushing nature into new corners of reality. With so many opportunities to explore new configurations of matter and information, we could enter a golden age of modern-day alchemy, in the best sense of the word.
However, we should never forget the hard-won cautionary lessons of history. Alchemy was not only a proto-science, but also a “hyper-science” that overpromised and underdelivered. Astrological predictions were taken so seriously that life had to adapt to theory, instead of the other way around. Unfortunately, modern society is not free from such magical thinking, putting too much confidence in omnipotent algorithms, without critically questioning their logical or ethical basis.
Science has always followed a natural rhythm of alternating phases of expansion and concentration. Times of unstructured exploration were followed by periods of consolidation, grounding new knowledge in fundamental concepts. We can only hope that the current period of creative tinkering in artificial intelligence, quantum devices and genetic editing, with its cornucopia of useful applications, will eventually lead to a deeper understanding of the world…
Today’s powerful but little-understood artificial intelligence breakthroughs echo past examples of unexpected scientific progress: “The Uselessness of Useful Knowledge,” from @RHDijkgraaf at @the_IAS.
Pair with: “Neuroscience’s Existential Crisis– we’re mapping the brain in amazing detail, but our brain can’t understand the picture” for a less optimistic view.
* John Ciardi
###
As we experiment, we might recall that it was on this date in 1992 that the Roman Catholic Church admitted that it had erred in condemning Galileo. For over 359 years, the Church had excoriated Galileo’s contentions (e.g., that the Earth revolves around the Sun) as anti-scriptural heresy. In 1633, at age 69, Galileo had been forced by the Roman Inquisition to repent, and spent the last eight years of his life under house arrest. After 13 years of inquiry, Pope John Paul II’s commission of historical, scientific, and theological scholars brought the pontiff a “not guilty” finding for Galileo; the Pope himself met with the Pontifical Academy of Sciences to help correct the record.

“An architect should live as little in cities as a painter. Send him to our hills, and let him study there what nature understands by a buttress, and what by a dome.”*…
We’ve misunderstood an important part of the history of urbanism– jungle cities. Patrick Roberts suggests that they have much to teach us…
Visions of “lost cities” in the jungle have consumed western imaginations since Europeans first visited the tropics of Asia, Africa and the Americas. From the Lost City of Z to El Dorado, a thirst for finding ancient civilisations and their treasures in perilous tropical forest settings has driven innumerable ill-fated expeditions. This obsession has seeped into western societies’ popular ideas of tropical forest cities, with overgrown ruins acting as the backdrop for fear, discovery and life-threatening challenges in countless films, novels and video games.
Throughout these depictions runs the idea that all ancient cities and states in tropical forests were doomed to fail. That the most resilient occupants of tropical forests are small villages of poison dart-blowing hunter-gatherers. And that vicious vines and towering trees – or, in the case of The Jungle Book, a boisterous army of monkeys – will inevitably claw any significant human achievement back into the suffocating green whence it came. This idea has been boosted by books and films that focus on the collapse of particularly enigmatic societies such as the Classic Maya. The decaying stone walls, the empty grand structures and the deserted streets of these tropical urban leftovers act as a tragic warning that our own way of life is not as secure as we would like to assume.
For a long time, western scholars took a similar view of the potential of tropical forests to sustain ancient cities. On the one hand, intensive agriculture, seen as necessary to fuel the growth of cities and powerful social elites, has been considered impossible on the wet, acidic, nutrient-poor soils of tropical forests. On the other, where the rubble of cities cannot be denied, in the drier tropics of North and Central America, south Asia and south-east Asia, ecological catastrophe has been seen as inevitable. Deforestation to make way for massive buildings and growing populations, an expansion of agriculture across marginal soils, as well as natural disasters such as mudslides, flooding and drought, must have made tropical cities a big challenge at best, and a fool’s gambit at worst.
Overhauling these stereotypes has been difficult. For one thing, the kind of large, multiyear field explorations usually undertaken on the sites of ancient cities are especially hard in tropical forests. Dense vegetation, mosquito-borne disease, poisonous plants and animals and torrential rain have made it arduous to find and excavate past urban centres. Where organic materials, rather than stone, might have been used as a construction material, the task becomes even more taxing. As a result, research into past tropical urbanism has lagged behind similar research in Mesopotamia and Egypt and the sweeping river valleys of east Asia.
Yet many tropical forest societies found immensely successful methods of food production, in even the most challenging of circumstances, which could sustain impressively large populations and social structures. The past two decades of archaeological exploration, applying the latest science from the land and the air, have stripped away canopies to provide new, more favourable assessments.
Not only did societies such as the Classic Maya and the Khmer empire of Cambodia flourish, but pre-colonial tropical cities were actually some of the most extensive urban landscapes anywhere in the pre-industrial world – far outstripping ancient Rome, Constantinople/Istanbul and the ancient cities of China.
Ancient tropical cities could be remarkably resilient, sometimes surviving many centuries longer than colonial- and industrial-period urban networks in similar environments. Although they could face immense obstacles, and often had to reinvent themselves to beat changing climates and their own exploitation of the surrounding landscape, they also developed completely new forms of what a city could be, and perhaps should be.
Extensive, interspersed with nature and combining food production with social and political function, these ancient cities are now catching the eyes of 21st-century urban planners trying to come to grips with tropical forests as sites of some of the fastest-growing human populations around the world today…
They may be vine-smothered ruins today, but the lost cities of the ancient tropics still have a lot to teach us about how to live alongside nature. Dr. Roberts (@palaeotropics) explains: “The real urban jungle: how ancient societies reimagined what cities could be,” adapted from his new book, Jungle: How Tropical Forests Shaped the World – and Us.
* John Ruskin
###
As we acclimate, we might send thoughtful birthday greetings to Sir Karl Raimund Popper; he was born on this date in 1902. One of the greatest philosophers of science of the 20th century, Popper is best known for his rejection of the classical inductivist views on the scientific method in favor of empirical falsification: a theory in the empirical sciences can never be proven, but it can be falsified, meaning that it can and should be scrutinized by decisive experiments. (Or more simply put: whereas classical induction sought to establish theories as true by accumulating confirming observations, Popper reversed the logic– a theory can never be verified, only provisionally accepted until it is proven false.)
Popper was also a powerful critic of historicism in political thought, and (in books like The Open Society and Its Enemies and The Poverty of Historicism) an enemy of authoritarianism and totalitarianism (in which role he was a mentor to George Soros).
