(Roughly) Daily

Posts Tagged ‘empiricism’

“The world of reality has its limits; the world of imagination is boundless”*…

Still, it’s useful to know the difference… and as Yasemin Saplakoglu explains, that’s a complex process– one that science takes very seriously…

As I sit at my desk typing up this newsletter, I can see a plant to my left, a water bottle to my right and a gorilla sitting across from me. The plant and bottle are real, but the gorilla is a product of my mind — and I intuitively know that this is true. That’s because my brain, like most people’s, has the ability to distinguish reality from imagination. If it didn’t, or if I had a condition that disrupts this distinction, I’d constantly see gorillas and elephants where they don’t exist.

Imagination is sometimes described as perception in reverse. When we look at an object, electromagnetic waves enter the eyes, where they are translated into neural signals that are then sent to the visual cortex at the back of the brain. This process generates an image: “plant.” With imagination, we start with what we want to see, and the brain’s memory and semantic centers send signals to the same brain region: “gorilla.”

In both cases, the visual cortex is activated. Recalling memories can also activate some of the same regions. Yet the brain can clearly distinguish between imagination, perception and memory in most cases (though it is still possible to get confused). How does it keep everything straight?

By probing the differences between these processes, neuroscientists are untangling how the human brain creates our experience. They’re finding that even our perception of reality is in many ways imagined. “Underneath our skull, everything is made up,” Lars Muckli, a professor of visual and cognitive neurosciences at the University of Glasgow, told me. “We entirely construct the world in its richness and detail and color and sound and content and excitement. … It is created by our neurons.”

To distinguish reality and imagination, the brain might have some kind of “reality threshold,” according to one theory. Researchers recently tested this by asking people to imagine specific images against a backdrop — and then secretly projected faint outlines of those images there. Participants typically recognized when they saw a real projection versus their imagined one, and those who rated images as more vivid were also more likely to identify them as real. The study suggested that when processing images, the brain might make a judgment on reality based on signal strength. If the signal is weak, the brain takes it for imagination. If it’s strong, the brain deems it real. “The brain has this really careful balancing act that it has to perform,” Thomas Naselaris, a neuroscientist at the University of Minnesota, told me. “In some sense it is going to interpret mental imagery as literally as it does visual imagery.”

Although recalling memories is a creative and imaginative process, it activates the visual cortex as if we were seeing. “It started to raise the question of whether a memory representation is actually different from a perceptual representation at all,” Sam Ling, a neuroscientist at Boston University, told me. A recent study looked to identify how memories and perceptions are constructed differently at the neurobiological level. When we perceive something, visual cues undergo layers of processing in the visual cortex that increase in complexity. Neurons in earlier parts of this process fire more precisely than those that get involved later. In the study, researchers found that during memory recall, neurons fired in a much blurrier way through all the layers. That might explain why our memories aren’t often as crisp as what we’re seeing in front of us…

“How Do Brains Tell Reality From Imagination?” from @yaseminsaplakoglu.bsky.social in @quantamagazine.bsky.social.

* Jean-Jacques Rousseau

###

As we parse perception, we might send mindful birthday greetings to a man whose work figures into the history of science’s struggle on this issue, Franz Brentano; he was born on this date in 1838. A philosopher and psychologist, he is best known for his 1874 Psychology from an Empirical Standpoint, considered his magnum opus and credited with having reintroduced the medieval scholastic concept of intentionality into contemporary philosophy and psychology.

Brentano also studied perception, with conclusions that prefigure the discussion above…

He is also well known for claiming that Wahrnehmung ist Falschnehmung (‘perception is misconception’), that is to say, that perception is erroneous. In fact, he maintained that external, sensory perception could not tell us anything about the de facto existence of the perceived world, which could simply be illusion. However, we can be absolutely sure of our internal perception. When I hear a tone, I cannot be completely sure that there is a tone in the real world, but I am absolutely certain that I do hear. This awareness, of the fact that I hear, is called internal perception. External perception, sensory perception, can only yield hypotheses about the perceived world, but not truth. Hence he and many of his pupils (in particular Carl Stumpf and Edmund Husserl) thought that the natural sciences could only yield hypotheses and never universal, absolute truths as in pure logic or mathematics.

However, in a reprinting of his Psychologie vom Empirischen Standpunkte (Psychology from an Empirical Standpoint), he recanted this previous view. He attempted to do so without reworking the previous arguments within that work, but it has been said that he was wholly unsuccessful. The new view states that when we hear a sound, we hear something from the external world; there are no physical phenomena of internal perception… – source

source

“A prudent question is one-half of wisdom”*…

Sir Francis Bacon, portrait by Paul van Somer I, 1617

The death of Queen Elizabeth I created a career opportunity for philosopher and statesman Francis Bacon– one that, as Susan Wise Bauer explains– led him to found empiricism, to pioneer inductive reasoning, and in so doing, to advance the scientific method…

In 1603, Francis Bacon, London born, was forty-three years old: a trained lawyer and amateur philosopher, happily married, politically ambitious, perpetually in debt.

He had served Elizabeth I of England loyally at court, without a great deal of recognition in return. But now Elizabeth was dead at the age of sixty-nine, and her crown would go to her first cousin twice removed: James VI of Scotland, James I of England.

Francis Bacon hoped for better things from the new king, but at the moment he had no particular ‘in’ at the English court. Forced to be patient, he began working on a philosophical project he’d had in mind for some years–a study of human knowledge that he intended to call Of the Proficience and Advancement of Learning, Divine and Human.

Like most of Bacon’s undertakings, the project was ridiculously ambitious. He set out to classify all learning into the proper branches and lay out all of the possible impediments to understanding. Part I condemned what he called the three ‘distempers’ of learning, which included ‘vain imaginations,’ pursuits such as astrology and alchemy that had no basis in actual fact; Part II divided all knowledge into three branches and suggested that natural philosophy should occupy the prime spot. Science, the project of understanding the universe, was the most important pursuit man could undertake. The study of history (‘everything that has happened’) and poesy (imaginative writings) took definite second and third places.

For a time, Bacon didn’t expand on these ideas. The Advancement of Learning opened with a fulsome dedication to James I (‘I have been touched–yea, and possessed–with an extreme wonder at those your virtues and faculties . . . the largeness of your capacity, the faithfulness of your memory, the swiftness of your apprehension, the penetration of your judgment, and the facility and order of your elocution …. There hath not been since Christ’s time any king or temporal monarch which hath been so learned in all literature and erudition, divine and human’), and this groveling soon yielded fruit. In 1607 Bacon was appointed as solicitor general, a position he had coveted for years, and over the next decade or so he poured his energies into his government responsibilities.

He did not return to natural philosophy until after his appointment to the even higher post of chancellor in 1618. Now that he had battled his way to the top of the political dirt pile, he announced his intentions to write a work with even greater scope–a new, complete system of philosophy that would shape the minds of men and guide them into new truths. He called this masterwork the Great Instauration: the Great Establishment, a whole new way of thinking, laid out in six parts.

Part I, a survey of the existing ‘ancient arts’ of the mind, repeated the arguments of the Advancement of Learning. But Part II, published in 1620 as a stand-alone work, was something entirely different. It was a wholesale challenge to Aristotelian methods, a brand-new ‘doctrine of a more perfect use of reason.’

Aristotelian thinking relies heavily on deductive reasoning, which for ancient logicians and philosophers was the highest and best road to the truth. Deductive reasoning moves from general statements (premises) to specific conclusions.

MAJOR PREMISE: All heavy matter falls toward the center of the universe.
MINOR PREMISE: The earth is made of heavy matter.
MINOR PREMISE: The earth is not falling.
CONCLUSION: The earth must already be at the center of the universe.

But Bacon had come to believe that deductive reasoning was a dead end that distorted evidence: ‘Having first determined the question according to his will,’ he objected, ‘man then resorts to experience, and bending her to conformity with his placets [expressions of assent], leads her about like a captive in a procession.’ Instead, he argued, the careful thinker must reason the other way around: starting from specifics and building toward general conclusions, beginning with particular pieces of evidence and working, inductively, toward broader assertions.

This new way of thinking–inductive reasoning–had three steps to it. The ‘true method’ Bacon explained,

‘first lights the candle, and then by means of the candle shows the way; commencing as it does with experience duly ordered and digested, not bungling or erratic, and from it deducing axioms, and from established axioms again new experiments.’

In other words, the natural philosopher must first come up with an idea about how the world works: ‘lighting the candle.’ Second, he must test the idea against physical reality, against ‘experience duly ordered’–both observations of the world around him and carefully designed experiments. Only then, as a last step, should he ‘deduce axioms,’ coming up with a theory that could be claimed to carry truth. 

Hypothesis, experiment, conclusion: Bacon had just traced the outlines of the scientific method…

“Francis Bacon and the Scientific Method”

An excerpt from The Story of Western Science by @SusanWiseBauer, via the invaluable @delanceyplace.

* Francis Bacon

###

As we embrace empiricism, we might send carefully-transmitted birthday greetings to Augusto Righi; he was born on this date in 1850. A physicist and a pioneer in the study of electromagnetism, he showed that radio waves display the characteristic behaviors of light waves (reflection, refraction, polarization, and interference), with which they share the electromagnetic spectrum. In 1894 Righi was the first person to generate microwaves.

Righi influenced the young Guglielmo Marconi, the inventor of radio, who visited him at his lab. Indeed, Marconi built the first practical wireless telegraphy transmitters and receivers in 1894, using Righi’s four-ball spark oscillator (from Righi’s microwave work) in his transmitters.

source

“I used to measure the skies, now I measure the shadows of Earth”*…

From ancient Egyptian cubits to fitness tracker apps, humankind has long been seeking ever more ways to measure the world – and ourselves…

The discipline of measurement developed for millennia… Around 6,000 years ago, the first standardised units were deployed in river valley civilisations such as ancient Egypt, where the cubit was defined by the length of the human arm, from elbow to the tip of the middle finger, and used to measure out the dimensions of the pyramids. In the Middle Ages, the task of regulating measurement to facilitate trade was both privilege and burden for rulers: a means of exercising power over their subjects, but a trigger for unrest if neglected. As the centuries passed, units multiplied, and in 18th-century France there were said to be some 250,000 variant units in use, leading to the revolutionary demand: “One king, one law, one weight and one measure.”

It was this abundance of measures that led to the creation of the metric system by French savants. A unit like the metre – defined originally as one ten-millionth of the distance from the equator to the north pole – was intended not only to simplify metrology, but also to embody political ideals. Its value and authority were derived not from royal bodies, but scientific calculation, and were thus, supposedly, equal and accessible to all. Then as now, units of measurement are designed to create uniformity across time, space and culture; to enable control at a distance and ensure trust between strangers. What has changed since the time of the pyramids is that now they often span the whole globe.

Despite their abundance, international standards like those mandated by NIST and the International Organization for Standardization (ISO) are mostly invisible in our lives. Where measurement does intrude is via bureaucracies of various stripes, particularly in education and the workplace. It’s in school that we are first exposed to the harsh lessons of quantification – where we are sorted by grade and rank and number, and told that these are the measures by which our future success will be gauged…

A fascinating survey of the history of measurement, and a consideration of its consequences: “Made to measure: why we can’t stop quantifying our lives,” from James Vincent (@jjvincent) in @guardian, an excerpt from his new book Beyond Measure: The Hidden History of Measurement.

And for a look at what it takes to perfect one of the most fundamental of those measures, see Jeremy Bernstein’s “The Kilogram.”

* “I used to measure the skies, now I measure the shadows of Earth. Although my mind was sky-bound, the shadow of my body lies here.” – Epitaph Johannes Kepler composed for himself a few months before he died

###

As we get out the gauge, we might send thoughtfully-wagered birthday greetings to Blaise Pascal; he was born on this date in 1623. A French mathematician, physicist, theologian, and inventor (e.g., the first digital calculator, the barometer, the hydraulic press, and the syringe), his commitment to empiricism (“experiments are the true teachers which one must follow in physics”) pitted him against his contemporary René “cogito, ergo sum” Descartes– and was foundational in the acceleration of the scientific/rationalist commitment to measurement…

 source

Happy Juneteenth!

“One must not think slightingly of the paradoxical”*…


The Building of the Argo, by Antoon Derkinderen, c. 1901. Rijksmuseum.

 

The thought problem known as the ship of Theseus first appears in Plutarch’s Lives, a series of biographies written in the first century. In one vignette, Theseus, founder-hero of Athens, returns victorious from Crete on a ship that the Athenians went on to preserve.

They took away the old planks as they decayed, putting in new and stronger timber in their place, insomuch that this ship became a standing example among the philosophers for the logical question of things that grow; one side holding that the ship remained the same, and the other contending that it was not the same…

Of course, the conundrum of how things change and stay the same has been with us a lot longer than Plutarch. Plato, and even pre-Socratics like Heraclitus, dealt in similar questions. “You can’t step in the same river twice,” a sentiment found on inspirational Instagram accounts, is often attributed to Heraclitus. His actual words—“Upon those who step into the same rivers, different and again different waters flow”—might not be the best Instagram fodder but, figuratively at least, provided the waters that the ship of Theseus later sailed.

Two thousand years later the ship is still bobbing along, though some of its parts have been replaced. Now known colloquially as Theseus’ paradox, in the U.S. the idea sometimes appears as “Washington’s ax.” While not as ancient as the six-thousand-year-old stone ax discovered last year at George Washington’s estate, the age-old question remains: If Washington’s ax were to have its handle and blade replaced, would it still be the same ax? The same has been asked of a motley assortment of items around the world. In Hungary, for example, there is a similar fable involving the statesman Kossuth Lajos’ knife, while in France it’s called Jeannot’s knife.

This knife, that knife, Washington’s ax—there’s even a “Lincoln’s ax.” We don’t know where these stories originated. They likely arose spontaneously and had nothing to do with the ancient Greeks and their philosophical conundrums. The only thing uniting these bits of folklore is that the same question was asked: Does a thing remain the same after all its parts are replaced? In the millennia since the ship of Theseus set sail, some notions that bear its name have less in common with the original than do the fables of random axes and knives, while other frames for this same question threaten to replace the original entirely.

One such version of this idea is attributed to Enlightenment philosopher John Locke, proffering his sock as an example. An exhibit called Locke’s Socks at Pace University’s now-defunct Museum of Philosophy served to demonstrate. On one wall, six socks were hung: the first a cotton sports sock, the last made only of patches. A museum guide, according to a New York Times write-up, asked a room full of schoolchildren, “Assume the six socks represent a person’s sock over time. Can we say that a sock which is finally all patches, with none of the original material, is the same sock?”

The question could be asked of Theseus’ paradox itself. Can it be said that a paradox about a ship remains the same if the ship is replaced with a knife or a sock? Have we lost anything from Theseus’ paradox if instead we start calling it “the Locke’s Sock paradox”?…

Is a paradox still the same after its parts have been replaced?  A consideration: “Restoring the Ship of Theseus.”

* Soren Kierkegaard

###

As we contemplate change, we might spare a reasoned thought for the Enlightenment giant (and sock darner) John Locke; the physician and philosopher died on this date in 1704.  An intellectual descendant of Francis Bacon, Locke was among the first empiricists. He spent over 20 years developing the ideas he published in his most significant work, Essay Concerning Human Understanding (1690), an analysis of the nature of human reason which promoted experimentation as the basis of knowledge.  Locke established “primary qualities” (e.g., solidity, extension, number) as distinct from “secondary qualities” (sensuous attributes like color or sound). He recognized that science is made possible when the primary qualities, as apprehended, create ideas that faithfully represent reality.

Locke is, of course, also well-remembered as a key developer (with Hobbes, and later Rousseau) of the concept of the Social Contract. Locke’s theory of “natural rights” influenced Voltaire and Rousseau– and formed the intellectual basis of the U.S. Declaration of Independence.

source

 

 

Written by (Roughly) Daily

October 28, 2019 at 1:01 am

“Every technology, every science that tells us more about ourselves, is scary at the time”*…

 

Further to last weekend’s visit with Silicon Valley’s security robots...

Researchers led by the University of Cambridge have built a mother robot that can independently build its own children and test which one does best, then use the results to inform the design of the next generation, so that preferential traits are passed down from one generation to the next.

Without any human intervention or computer simulation beyond the initial command to build a robot capable of movement, the mother created children constructed of between one and five plastic cubes with a small motor inside.

In each of five separate experiments, the mother designed, built and tested generations of ten children, using the information gathered from one generation to inform the design of the next. The results, reported in the open access journal PLOS One, found that preferential traits were passed down through generations, so that the ‘fittest’ individuals in the last generation performed a set task twice as quickly as the fittest individuals in the first generation…


“Natural selection is basically reproduction, assessment, reproduction, assessment and so on,” said lead researcher Dr Fumiya Iida of Cambridge’s Department of Engineering, who worked in collaboration with researchers at ETH Zurich. “That’s essentially what this robot is doing – we can actually watch the improvement and diversification of the species… We want to see robots that are capable of innovation and creativity…”
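The loop Iida describes – reproduction, assessment, reproduction, assessment – is the core of any evolutionary algorithm, and it can be sketched in a few lines of code. To be clear, this is a toy illustration, not the study’s actual setup: the encoding (a list of motor parameters standing in for one-to-five motorized cubes) and the fitness function (a made-up proxy for “distance travelled”) are invented here, whereas the Cambridge team assembled and assessed physical robots.

```python
import random

random.seed(42)

def random_genome():
    # a "robot" is 1-5 cubes, each with a motor parameter in [0, 1]
    return [random.uniform(0, 1) for _ in range(random.randint(1, 5))]

def mutate(genome):
    # reproduction: copy the parent, perturbing one parameter slightly
    child = genome[:]
    i = random.randrange(len(child))
    child[i] = min(1.0, max(0.0, child[i] + random.gauss(0, 0.1)))
    return child

def fitness(genome):
    # assessment: stand-in for "distance travelled in a fixed time";
    # here, an arbitrary function rewarding parameters near 0.8
    return -sum((g - 0.8) ** 2 for g in genome) / len(genome)

def evolve(generations=5, pop_size=10):
    population = [random_genome() for _ in range(pop_size)]
    for _ in range(generations):
        # assessment: rank this generation's children by fitness
        population.sort(key=fitness, reverse=True)
        # selection + reproduction: the fittest half parents the next generation
        parents = population[: pop_size // 2]
        population = parents + [
            mutate(random.choice(parents))
            for _ in range(pop_size - len(parents))
        ]
    return max(population, key=fitness)

best = evolve()
print(round(fitness(best), 3))
```

As in the experiments, no one tells the program what a good design looks like; repeated rounds of variation and assessment alone push each generation of “children” to outperform the last.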

See and read more here (and here).

* Rodney Brooks

###

As we select naturally, we might spare a thought for Blaise Pascal; he died on this date in 1662. A French mathematician, physicist, theologian, and inventor (e.g., the first digital calculator, the barometer, the hydraulic press, and the syringe), his commitment to empiricism (“experiments are the true teachers which one must follow in physics”) pitted him against his contemporary René “cogito, ergo sum” Descartes…

 source

 

Written by (Roughly) Daily

August 19, 2015 at 1:01 am