(Roughly) Daily


“I used to measure the skies, now I measure the shadows of Earth”*…

From ancient Egyptian cubits to fitness tracker apps, humankind has long been seeking ever more ways to measure the world – and ourselves…

The discipline of measurement developed over millennia… Around 6,000 years ago, the first standardised units were deployed in river valley civilisations such as ancient Egypt, where the cubit was defined by the length of the human arm, from elbow to the tip of the middle finger, and used to measure out the dimensions of the pyramids. In the Middle Ages, the task of regulating measurement to facilitate trade was both privilege and burden for rulers: a means of exercising power over their subjects, but a trigger for unrest if neglected. As the centuries passed, units multiplied, and in 18th-century France there were said to be some 250,000 variant units in use, leading to the revolutionary demand: “One king, one law, one weight and one measure.”

It was this abundance of measures that led to the creation of the metric system by French savants. A unit like the metre – defined originally as one ten-millionth of the distance from the equator to the north pole – was intended not only to simplify metrology, but also to embody political ideals. Its value and authority were derived not from royal bodies, but scientific calculation, and were thus, supposedly, equal and accessible to all. Then as today, units of measurement are designed to create uniformity across time, space and culture; to enable control at a distance and ensure trust between strangers. What has changed since the time of the pyramids is that now they often span the whole globe.
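Stated as a formula (your correspondent’s gloss, not the Guardian’s), the revolutionary definition pegged the unit to the planet itself:

```latex
% The 1790s definition: one metre is one ten-millionth of the meridian
% arc from the equator to the north pole (measured through Paris).
% Modern geodesy puts that arc near 10,002 km, so the prototype metre
% came out roughly 0.2 mm short of its own definition.
1\,\mathrm{m} \;\equiv\; \frac{d_{\text{equator}\to\text{pole}}}{10^{7}}
```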

Despite their abundance, international standards like those maintained by NIST and the International Organization for Standardization (ISO) are mostly invisible in our lives. Where measurement does intrude is via bureaucracies of various stripes, particularly in education and the workplace. It’s in school that we are first exposed to the harsh lessons of quantification – where we are sorted by grade and rank and number, and told that these are the measures by which our future success will be gauged…

A fascinating survey of the history of measurement, and a consideration of its consequences: “Made to measure: why we can’t stop quantifying our lives,” from James Vincent (@jjvincent) in @guardian, an excerpt from his new book Beyond Measure: The Hidden History of Measurement.

And for a look at what it takes to perfect one of the most fundamental of those measures, see Jeremy Bernstein’s “The Kilogram.”

* “I used to measure the skies, now I measure the shadows of Earth. Although my mind was sky-bound, the shadow of my body lies here.” – Epitaph Johannes Kepler composed for himself a few months before he died

###

As we get out the gauge, we might send thoughtfully-wagered birthday greetings to Blaise Pascal; he was born on this date in 1623. A French mathematician, physicist, theologian, and inventor (e.g., the first digital calculator, the barometer, the hydraulic press, and the syringe), his commitment to empiricism (“experiments are the true teachers which one must follow in physics”) pitted him against his contemporary René “cogito, ergo sum” Descartes– and was foundational in the acceleration of the scientific/rationalist commitment to measurement…

source

Happy Juneteenth!

“Eventually everything connects”*…

Long-time readers will know of your correspondent’s fascination with Powers of Ten, a remarkable short film by Charles and Ray Eames, with Philip Morrison, that begins with a couple having a picnic, zooms out by “powers of ten” to the edge of the universe, then zooms in (by those same increments) to a proton.

We’ve looked before at a number of riffs on this meditation on scale: see, e.g., here, here, and here.

Now the BBC has updated the first half of Powers of Ten:

It’s a trip worth taking.

* Charles Eames

###

As we wrestle with relationships, we might light a birthday candle for Sir Francis Bacon– English Renaissance philosopher, lawyer, linguist, composer, mathematician, geometer, musician, poet, painter, astronomer, classicist, historian, theologian, architect, father of modern science (The Baconian– aka The Scientific– Method), and patron of modern democracy, whom some allege was the illegitimate son of Queen Elizabeth I of England (and others, the actual author of Shakespeare’s plays)… He was in any event born on this date in 1561.

Bacon (whose Essays were, in a fashion, the first “management book” in English) was, in Alexander Pope’s words, “the greatest genius that England, or perhaps any country, ever produced.”  He probably did not actually write the plays attributed to Shakespeare (as a thin, but long, line of enthusiasts, including Mark Twain and Friedrich Nietzsche, believed).  But Bacon did observe, in a discussion of sedition that’s as timely today as ever, that “the remedy is worse than the disease.”

source

“Alchemy. The link between the immemorial magic arts and modern science. Humankind’s first systematic effort to unlock the secrets of matter by reproducible experiment.”*…

Science has entered a new era of alchemy, suggests Robbert Dijkgraaf, Director of the Institute for Advanced Study in Princeton– and, he argues, that’s a good thing…

Is artificial intelligence the new alchemy? That is, are the powerful algorithms that control so much of our lives — from internet searches to social media feeds — the modern equivalent of turning lead into gold? Moreover: Would that be such a bad thing?

According to the prominent AI researcher Ali Rahimi and others, today’s fashionable neural networks and deep learning techniques are based on a collection of tricks, topped with a good dash of optimism, rather than systematic analysis. Modern engineers, the thinking goes, assemble their codes with the same wishful thinking and misunderstanding that the ancient alchemists had when mixing their magic potions.

It’s true that we have little fundamental understanding of the inner workings of self-learning algorithms, or of the limits of their applications. These new forms of AI are very different from traditional computer codes that can be understood line by line. Instead, they operate within a black box, seemingly unknowable to humans and even to the machines themselves.

This discussion within the AI community has consequences for all the sciences. With deep learning impacting so many branches of current research — from drug discovery to the design of smart materials to the analysis of particle collisions — science itself may be at risk of being swallowed by a conceptual black box. It would be hard to have a computer program teach chemistry or physics classes. By deferring so much to machines, are we discarding the scientific method that has proved so successful, and reverting to the dark practices of alchemy?

Not so fast, says Yann LeCun, co-recipient of the 2018 Turing Award for his pioneering work on neural networks. He argues that the current state of AI research is nothing new in the history of science. It is just a necessary adolescent phase that many fields have experienced, characterized by trial and error, confusion, overconfidence and a lack of overall understanding. We have nothing to fear and much to gain from embracing this approach. It’s simply that we’re more familiar with its opposite.

After all, it’s easy to imagine knowledge flowing downstream, from the source of an abstract idea, through the twists and turns of experimentation, to a broad delta of practical applications. This is the famous “usefulness of useless knowledge,” advanced by Abraham Flexner in his seminal 1939 essay (itself a play on the very American concept of “useful knowledge” that emerged during the Enlightenment).

A canonical illustration of this flow is Albert Einstein’s general theory of relativity. It all began with the fundamental idea that the laws of physics should hold for all observers, independent of their movements. He then translated this concept into the mathematical language of curved space-time and applied it to the force of gravity and the evolution of the cosmos. Without Einstein’s theory, the GPS in our smartphones would drift off course by about 7 miles a day.
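That “7 miles a day” figure is easy to sanity-check; a back-of-envelope sketch from your correspondent, assuming the commonly-cited net relativistic clock offset of roughly 38 microseconds per day for GPS satellite clocks (a textbook figure, not one given in the essay):

```python
# Hedged back-of-envelope check of the "7 miles a day" GPS claim.
# The ~38 microsecond/day figure is the standard net effect of special
# relativity (satellite speed) plus general relativity (weaker gravity
# at orbital altitude) on GPS satellite clocks; it is an assumption
# here, not a number taken from Dijkgraaf's essay.
C = 299_792_458                  # speed of light, m/s
CLOCK_OFFSET_S_PER_DAY = 38e-6   # net relativistic clock drift, s/day

range_error_m = C * CLOCK_OFFSET_S_PER_DAY   # timing error -> range error
print(f"{range_error_m / 1000:.1f} km/day "
      f"(~{range_error_m / 1609.344:.1f} miles/day)")
# Output: 11.4 km/day (~7.1 miles/day), matching the essay's figure.
```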

But maybe this paradigm of the usefulness of useless knowledge is what the Danish physicist Niels Bohr liked to call a “great truth” — a truth whose opposite is also a great truth. Maybe, as AI is demonstrating, knowledge can also flow uphill.

In the broad history of science, as LeCun suggested, we can spot many examples of this effect, which can perhaps be dubbed “the uselessness of useful knowledge.” An overarching and fundamentally important idea can emerge from a long series of step-by-step improvements and playful experimentation — say, from Fröbel to Nobel.

Perhaps the best illustration is the discovery of the laws of thermodynamics, a cornerstone of all branches of science. These elegant equations, describing the conservation of energy and increase of entropy, are laws of nature, obeyed by all physical phenomena. But these universal concepts only became apparent after a long, confusing period of experimentation, starting with the construction of the first steam engines in the 18th century and the gradual improvement of their design. Out of the thick mist of practical considerations, mathematical laws slowly emerged…
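For reference, the “elegant equations” in question are compact indeed; in standard textbook form (your correspondent’s rendering, not the essay’s):

```latex
% First law: energy is conserved. The change in a system's internal
% energy equals the heat added to it minus the work it does.
\Delta U = Q - W
% Second law: the entropy of an isolated system never decreases.
\Delta S \geq 0
```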

One could even argue that science itself has followed this uphill path. Until the birth of the methods and practices of modern research in the 17th century, scientific research consisted mostly of nonsystematic experimentation and theorizing. Long considered academic dead ends, these ancient practices have been reappraised in recent years: Alchemy is now considered to have been a useful and perhaps even necessary precursor to modern chemistry — more proto-science than hocus-pocus.

The appreciation of tinkering as a fruitful path toward grand theories and insights is particularly relevant for current research that combines advanced engineering and basic science in novel ways. Driven by breakthrough technologies, nanophysicists are tinkering away, building the modern equivalents of steam engines on the molecular level, manipulating individual atoms, electrons and photons. Genetic editing tools such as CRISPR allow us to cut and paste the code of life itself. With structures of unimaginable complexity, we are pushing nature into new corners of reality. With so many opportunities to explore new configurations of matter and information, we could enter a golden age of modern-day alchemy, in the best sense of the word.

However, we should never forget the hard-won cautionary lessons of history. Alchemy was not only a proto-science, but also a “hyper-science” that overpromised and underdelivered. Astrological predictions were taken so seriously that life had to adapt to theory, instead of the other way around. Unfortunately, modern society is not free from such magical thinking, putting too much confidence in omnipotent algorithms, without critically questioning their logical or ethical basis.

Science has always followed a natural rhythm of alternating phases of expansion and concentration. Times of unstructured exploration were followed by periods of consolidation, grounding new knowledge in fundamental concepts. We can only hope that the current period of creative tinkering in artificial intelligence, quantum devices and genetic editing, with its cornucopia of useful applications, will eventually lead to a deeper understanding of the world…

Today’s powerful but little-understood artificial intelligence breakthroughs echo past examples of unexpected scientific progress: “The Uselessness of Useful Knowledge,” from @RHDijkgraaf at @the_IAS.

Pair with: “Neuroscience’s Existential Crisis– we’re mapping the brain in amazing detail—but our brain can’t understand the picture” for a less optimistic view.

* John Ciardi

###

As we experiment, we might recall that it was on this date in 1992 that the Roman Catholic Church admitted that it had erred in condemning Galileo.  For 359 years, the Church had excoriated Galileo’s contentions (e.g., that the Earth revolves around the Sun) as anti-scriptural heresy.  In 1633, at age 69, Galileo had been forced by the Roman Inquisition to repent, and spent the last eight years of his life under house arrest.  After 13 years of inquiry, Pope John Paul II’s commission of historical, scientific and theological scholars brought the pontiff a “not guilty” finding for Galileo; the Pope himself met with the Pontifical Academy of Sciences to help correct the record.

Galileo (standing; white collar, dark smock) showing the Doge of Venice (seated) how to use the telescope. From a fresco by Giuseppe Bertini

source

“An architect should live as little in cities as a painter. Send him to our hills, and let him study there what nature understands by a buttress, and what by a dome.”*…

We’ve misunderstood an important part of the history of urbanism– jungle cities. Patrick Roberts suggests that they have much to teach us…

Visions of “lost cities” in the jungle have consumed western imaginations since Europeans first visited the tropics of Asia, Africa and the Americas. From the Lost City of Z to El Dorado, a thirst for finding ancient civilisations and their treasures in perilous tropical forest settings has driven innumerable ill-fated expeditions. This obsession has seeped into western societies’ popular ideas of tropical forest cities, with overgrown ruins acting as the backdrop for fear, discovery and life-threatening challenges in countless films, novels and video games.

Throughout these depictions runs the idea that all ancient cities and states in tropical forests were doomed to fail. That the most resilient occupants of tropical forests are small villages of poison dart-blowing hunter-gatherers. And that vicious vines and towering trees – or, in the case of The Jungle Book, a boisterous army of monkeys – will inevitably claw any significant human achievement back into the suffocating green whence it came. This idea has been boosted by books and films that focus on the collapse of particularly enigmatic societies such as the Classic Maya. The decaying stone walls, the empty grand structures and the deserted streets of these tropical urban leftovers act as a tragic warning that our own way of life is not as secure as we would like to assume.

For a long time, western scholars took a similar view of the potential of tropical forests to sustain ancient cities. On the one hand, intensive agriculture, seen as necessary to fuel the growth of cities and powerful social elites, has been considered impossible on the wet, acidic, nutrient-poor soils of tropical forests. On the other, where the rubble of cities cannot be denied, in the drier tropics of North and Central America, south Asia and south-east Asia, ecological catastrophe has been seen as inevitable. Deforestation to make way for massive buildings and growing populations, an expansion of agriculture across marginal soils, as well as natural disasters such as mudslides, flooding and drought, must have made tropical cities a big challenge at best, and a fool’s gambit at worst.

Overhauling these stereotypes has been difficult. For one thing, the kind of large, multiyear field explorations usually undertaken on the sites of ancient cities are especially hard in tropical forests. Dense vegetation, mosquito-borne disease, poisonous plants and animals and torrential rain have made it arduous to find and excavate past urban centres. Where organic materials, rather than stone, might have been used as a construction material, the task becomes even more taxing. As a result, research into past tropical urbanism has lagged behind similar research in Mesopotamia and Egypt and the sweeping river valleys of east Asia.

Yet many tropical forest societies found immensely successful methods of food production, in even the most challenging of circumstances, which could sustain impressively large populations and social structures. The past two decades of archaeological exploration, applying the latest science from the land and the air, have stripped away canopies to provide new, more favourable assessments.

Not only did societies such as the Classic Maya and the Khmer empire of Cambodia flourish, but pre-colonial tropical cities were actually some of the most extensive urban landscapes anywhere in the pre-industrial world – far outstripping ancient Rome, Constantinople/Istanbul and the ancient cities of China.

Ancient tropical cities could be remarkably resilient, sometimes surviving many centuries longer than colonial- and industrial-period urban networks in similar environments. Although they could face immense obstacles, and often had to reinvent themselves to beat changing climates and their own exploitation of the surrounding landscape, they also developed completely new forms of what a city could be, and perhaps should be.

Extensive, interspersed with nature and combining food production with social and political function, these ancient cities are now catching the eyes of 21st-century urban planners trying to come to grips with tropical forests as sites of some of the fastest-growing human populations around the world today…

They may be vine-smothered ruins today, but the lost cities of the ancient tropics still have a lot to teach us about how to live alongside nature. Dr. Roberts (@palaeotropics) explains: “The real urban jungle: how ancient societies reimagined what cities could be,” adapted from his new book, Jungle: How Tropical Forests Shaped the World – and Us.

* John Ruskin

###

As we acclimate, we might send thoughtful birthday greetings to Sir Karl Raimund Popper; he was born on this date in 1902.  One of the greatest philosophers of science of the 20th century, Popper is best known for his rejection of the classical inductivist views on the scientific method, in favor of empirical falsification: a theory in the empirical sciences can never be proven, but it can be falsified, meaning that it can and should be scrutinized by decisive experiments.  (Or more simply put: whereas classical inductive approaches treated a hypothesis as unproven until confirmed, Popper reversed the logic: a theory stands only provisionally, as true-so-far, until observation proves it false.)
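The logical engine behind falsification is modus tollens; schematically (a standard textbook rendering, not Popper’s own notation):

```latex
% A theory T entails a prediction O. Observing O does not prove T
% (that would be affirming the consequent), but observing not-O
% refutes T outright; hence the asymmetry Popper built on.
\big( (T \rightarrow O) \wedge \neg O \big) \;\vdash\; \neg T
```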

Popper was also a powerful critic of historicism in political thought, and (in books like The Open Society and Its Enemies and The Poverty of Historicism) an enemy of authoritarianism and totalitarianism (in which role he was a mentor to George Soros).

source

“Facts alone, no matter how numerous or verifiable, do not automatically arrange themselves into an intelligible, or truthful, picture of the world. It is the task of the human mind to invent a theoretical framework to account for them.”*…

PPPL physicist Hong Qin in front of images of planetary orbits and computer code

… or maybe not. A couple of decades ago, your correspondent came across a short book that aimed to explain how we think we know what we think we know, Truth: A History and a Guide for the Perplexed, by Felipe Fernández-Armesto (then, a professor of history at Oxford; now, at Notre Dame)…

According to Fernández-Armesto, people throughout history have sought to get at the truth in one or more of four basic ways. The first is through feeling. Truth is a tangible entity. The third-century B.C. Chinese sage Chuang Tzu stated, “The universe is one.” Others described the universe as a unity of opposites. To the fifth-century B.C. Greek philosopher Heraclitus, the cosmos is a tension like that of the bow or the lyre. The notion of chaos comes along only later, together with uncomfortable concepts like infinity.

Then there is authoritarianism, “the truth you are told.” Divinities can tell us what is wanted, if only we can discover how to hear them. The ancient Greeks believed that Apollo would speak through the mouth of an old peasant woman in a room filled with the smoke of bay leaves; traditionalist Azande in the Nilotic Sudan depend on the response of poisoned chickens. People consult sacred books, or watch for apparitions. Others look inside themselves, for truths that were imprinted in their minds before they were born or buried in their subconscious minds.

Reasoning is the third way Fernández-Armesto cites. Since knowledge attained by divination or introspection is subject to misinterpretation, eventually people return to the use of reason, which helped thinkers like Chuang Tzu and Heraclitus describe the universe. Logical analysis was used in China and Egypt long before it was discovered in Greece and in India. If the Greeks are mistakenly credited with the invention of rational thinking, it is because of the effective ways they wrote about it. Plato illustrated his dialogues with memorable myths and brilliant metaphors. Truth, as he saw it, could be discovered only by abstract reasoning, without reliance on sense perception or observation of outside phenomena. Rather, he sought to excavate it from the recesses of the mind. The word for truth in Greek, aletheia, means “what is not forgotten.”

Plato’s pupil Aristotle developed the techniques of logical analysis that still enable us to get at the knowledge hidden within us. He examined propositions by stating possible contradictions and developed the syllogism, a method of proof based on stated premises. His methods of reasoning have influenced independent thinkers ever since. Logicians developed a system of notation, free from the associations of language, that comes close to being a kind of mathematics. The uses of pure reason have had a particular appeal to lovers of force, and have flourished in times of absolutism like the 17th and 18th centuries.
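A concrete instance of the form (the stock schoolroom example, your correspondent’s addition, not Fernández-Armesto’s):

```latex
% Aristotle's syllogism: two stated premises force the conclusion.
%   Major premise: All men are mortal.     (all M are P)
%   Minor premise: Socrates is a man.      (s is M)
%   Conclusion:    Socrates is mortal.     (s is P)
\frac{\forall x\,\big(M(x) \rightarrow P(x)\big) \qquad M(s)}{P(s)}
```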

Finally, there is sense perception. Unlike his teacher, Plato, and many of Plato’s followers, Aristotle realized that pure logic had its limits. He began with study of the natural world and used evidence gained from experience or experimentation to support his arguments. Ever since, as Fernández-Armesto puts it, science and sense have kept time together, like voices in a duet that sing different tunes. The combination of theoretical and practical gave Western thinkers an edge over purer reasoning schemes in India and China.

The scientific revolution began when European thinkers broke free from religious authoritarianism and stopped regarding this earth as the center of the universe. They used mathematics along with experimentation and reasoning and developed mechanical tools like the telescope. Fernández-Armesto’s favorite example of their empirical spirit is the grueling Arctic expedition in 1736 in which the French scientist Pierre Moreau de Maupertuis determined (rightly) that the earth was not round like a ball but rather an oblate spheroid…

source

One of Fernández-Armesto’s most basic points is that our capacity to apprehend “the truth”– to “know”– has developed throughout history. And history’s not over. So, your correspondent wondered, mightn’t there emerge a fifth source of truth, one rooted in the assessment of vast, ever-more-complete data maps of reality– a fifth way of knowing?

Well, those days may be upon us…

A novel computer algorithm, or set of rules, that accurately predicts the orbits of planets in the solar system could be adapted to better predict and control the behavior of the plasma that fuels fusion facilities designed to harvest on Earth the fusion energy that powers the sun and stars.

The algorithm, devised by a scientist at the U.S. Department of Energy’s (DOE) Princeton Plasma Physics Laboratory (PPPL), applies machine learning, the form of artificial intelligence (AI) that learns from experience, to develop the predictions. “Usually in physics, you make observations, create a theory based on those observations, and then use that theory to predict new observations,” said PPPL physicist Hong Qin, author of a paper detailing the concept in Scientific Reports. “What I’m doing is replacing this process with a type of black box that can produce accurate predictions without using a traditional theory or law.”

Qin (pronounced Chin) created a computer program into which he fed data from past observations of the orbits of Mercury, Venus, Earth, Mars, Jupiter, and the dwarf planet Ceres. This program, along with an additional program known as a ‘serving algorithm,’ then made accurate predictions of the orbits of other planets in the solar system without using Newton’s laws of motion and gravitation. “Essentially, I bypassed all the fundamental ingredients of physics. I go directly from data to data,” Qin said. “There is no law of physics in the middle.”
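Qin’s published method is more sophisticated, but the “data to data” idea he describes can be sketched in a few lines: fit a one-step map from observed states to the states that follow, then iterate that map forward, with no equation of motion anywhere. A toy illustration from your correspondent, under strong simplifying assumptions (a circular orbit and a linear least-squares map; emphatically not Qin’s code):

```python
# A toy "data to data" predictor: learn tomorrow's state from today's,
# with no law of physics in the middle. An illustrative sketch only,
# not Qin's published algorithm.
import numpy as np

# Fake "observations": a body on a circular orbit, sampled at fixed steps.
t = np.linspace(0.0, 20.0, 2001)
states = np.stack([np.cos(t), np.sin(t)], axis=1)   # (x, y) at each step

# Fit the one-step map state_k -> state_{k+1} by linear least squares.
X, Y = states[:-1], states[1:]
A, *_ = np.linalg.lstsq(X, Y, rcond=None)           # learned 2x2 transition

# Roll the learned map forward from the last observation.
s = states[-1]
for step in range(1, 6):
    s = s @ A
    print(f"step {step}: x={s[0]:+.3f}, y={s[1]:+.3f}")
```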

The process also appears in philosophical thought experiments like John Searle’s Chinese Room. In that scenario, a person who did not know Chinese could nevertheless ‘translate’ a Chinese sentence into English or any other language by using a set of instructions, or rules, that would substitute for understanding. The thought experiment raises questions about what, at root, it means to understand anything at all, and whether understanding implies that something else is happening in the mind besides following rules.

Qin was inspired in part by Oxford philosopher Nick Bostrom’s philosophical thought experiment that the universe is a computer simulation. If that were true, then fundamental physical laws should reveal that the universe consists of individual chunks of space-time, like pixels in a video game. “If we live in a simulation, our world has to be discrete,” Qin said. The black box technique Qin devised does not require that physicists believe the simulation conjecture literally, though it builds on this idea to create a program that makes accurate physical predictions.

This process opens up questions about the nature of science itself. Don’t scientists want to develop physics theories that explain the world, instead of simply amassing data? Aren’t theories fundamental to physics and necessary to explain and understand phenomena?

“I would argue that the ultimate goal of any scientist is prediction,” Qin said. “You might not necessarily need a law. For example, if I can perfectly predict a planetary orbit, I don’t need to know Newton’s laws of gravitation and motion. You could argue that by doing so you would understand less than if you knew Newton’s laws. In a sense, that is correct. But from a practical point of view, making accurate predictions is not doing anything less.”

Machine learning could also open up possibilities for more research. “It significantly broadens the scope of problems that you can tackle because all you need to get going is data,” [Qin’s collaborator Eric] Palmerduca said…

But then, as Edwin Hubble observed, “observations always involve theory,” theory that’s implicit in the particulars and the structure of the data being collected and fed to the AI. So, perhaps this is less a new way of knowing, than a new way of enhancing Fernández-Armesto’s third way– reason– as it became the scientific method…

The technique could also lead to the development of a traditional physical theory. “While in some sense this method precludes the need of such a theory, it can also be viewed as a path toward one,” Palmerduca said. “When you’re trying to deduce a theory, you’d like to have as much data at your disposal as possible. If you’re given some data, you can use machine learning to fill in gaps in that data or otherwise expand the data set.”

In either case: “New machine learning theory raises questions about nature of science.”

* Francis Bello

###

As we experiment with epistemology, we might send carefully-observed and calculated birthday greetings to Georg Joachim de Porris (better known by his professional name, Rheticus); he was born on this date in 1514. A mathematician, astronomer, cartographer, navigational-instrument maker, medical practitioner, and teacher, he was well-known in his day for his stature in all of those fields. But he is surely best-remembered as the sole pupil of Copernicus, whose work he championed– most impactfully, facilitating the publication of his master’s De revolutionibus orbium coelestium (On the Revolutions of the Heavenly Spheres)… and informing the most famous work by yesterday’s birthday boy, Galileo.

source
