(Roughly) Daily


“Oh how wrong we were to think immortality meant never dying”*…



Quantum simulation (Verresen et al., Nature Physics, 2019)


Further (in a fashion) to yesterday’s post…

Nothing lasts forever. Humans, planets, stars, galaxies, maybe even the Universe itself, everything has an expiration date. But things in the quantum realm don’t always follow the rules. Scientists have found that quasiparticles in quantum systems could be effectively immortal.

That doesn’t mean they don’t decay, which is reassuring. But once these quasiparticles have decayed, they are able to reorganise themselves back into existence, possibly ad infinitum.

This seemingly flies right in the face of the second law of thermodynamics, which asserts that entropy in an isolated system can only move in an increasing direction: things can only break down, not build back up again.

Of course, quantum physics can get weird with the rules; but even quantum scientists didn’t know quasiparticles were weird in this particular manner…

Maybe some things are forever.  More at “Scientists Find Evidence a Strange Group of Quantum Particles Are Basically Immortal.”

Read the underlying Nature Physics article, by physicist Ruben Verresen and his team at the Technical University of Munich and the Max Planck Institute for the Physics of Complex Systems, here.

* Gerard Way


As we ponder perpetuity, we might send carefully-deduced birthday greetings to Richard Bevan Braithwaite; he was born on this date in 1900. A Cambridge don who specialized in the philosophy of science, he focused on the logical features common to all sciences. Braithwaite was concerned with the impact of science on our beliefs about the world and the appropriate responses to that impact. He was especially interested in probability (and its applications in decision theory and game theory) and in the statistical sciences. He was president of the Aristotelian Society from 1946 to 1947, and was a Fellow of the British Academy.

It was Braithwaite’s poker that Ludwig Wittgenstein reportedly brandished at Karl Popper during their confrontation at a Moral Sciences Club meeting in Braithwaite’s rooms in King’s College. The implement subsequently disappeared. (See here.)



“Nothing happens until something moves”*…




What determines our fate? To the Stoic Greek philosophers, fate is the external product of divine will, ‘the thread of your destiny’. To transcendentalists such as Henry David Thoreau, it is an inward matter of self-determination, of ‘what a man thinks of himself’. To modern cosmologists, fate is something else entirely: a sweeping, impersonal physical process that can be boiled down into a single, momentous number known as the Hubble Constant.

The Hubble Constant can be defined simply as the rate at which the Universe is expanding, a measure of how quickly the space between galaxies is stretching apart. The slightest scrutiny, however, exposes a web of complexity encased within that seeming simplicity. Extrapolating the expansion process backward implies that all the galaxies we can observe originated together at some point in the past – emerging from a Big Bang – and that the Universe has a finite age. Extrapolating forward presents two starkly opposed futures, either an endless era of expansion and dissipation or an eventual turnabout that will wipe out the current order and begin the process anew.
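The backward extrapolation is compact enough to sketch in a few lines (my numbers, not the article's — a rough H0 of 70 km/s/Mpc is assumed, and the "constant expansion" age is the naive 1/H0 estimate, ignoring how expansion has sped up and slowed down):

```python
H0 = 70.0  # Hubble constant in km/s per megaparsec -- a commonly quoted rough value

KM_PER_MPC = 3.0857e19     # kilometres in one megaparsec
SECONDS_PER_GYR = 3.156e16  # seconds in a billion years

def recession_velocity(distance_mpc):
    """Hubble's law, v = H0 * d: recession velocity (km/s) of a galaxy
    at the given distance (megaparsecs)."""
    return H0 * distance_mpc

def hubble_time_gyr():
    """Naive age estimate for the Universe: the time, at constant expansion,
    for galaxies to reach their current separations -- simply 1/H0,
    converted into billions of years."""
    h0_per_second = H0 / KM_PER_MPC  # H0 expressed in 1/s
    return (1.0 / h0_per_second) / SECONDS_PER_GYR

# A galaxy 100 Mpc away recedes at about 7,000 km/s, and 1/H0 comes out
# near 14 billion years -- remarkably close to the measured age.
```

That one small number really does fix the scale of cosmic history: nudge H0 up or down and the implied age of everything moves with it.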

That’s a lot of emotional and intellectual weight resting on one small number…

How scientists pinned a single number on all of existence: “Fate of the Universe.”

[Readers might remember that the Big Bang wasn’t always an accepted paradigm— and that ongoing research continues to surface challenges.]

* Albert Einstein


As we center ourselves, we might spare a thought for Kurt Friedrich Gödel; he died on this date in 1978. A logician, mathematician, and philosopher, he is considered (along with Aristotle, Alfred Tarski— whose birthday this also is— and Gottlob Frege) to be one of the most important logicians in history. Gödel had an immense impact upon scientific and philosophical thinking in the 20th century. He is, perhaps, best remembered for his Incompleteness Theorems, which led to (among other important results) Alan Turing’s insights into computational theory.

Kurt Gödel’s achievement in modern logic is singular and monumental – indeed it is more than a monument, it is a landmark which will remain visible far in space and time. … The subject of logic has certainly completely changed its nature and possibilities with Gödel’s achievement.

— John von Neumann



“The science which feeds men is worth at least as much as the one which teaches how to kill them”*…


Food revolution


We are on the cusp of the biggest economic transformation, of any kind, for 200 years. While arguments rage about plant- versus meat-based diets, new technologies will soon make them irrelevant. Before long, most of our food will come neither from animals nor plants, but from unicellular life. After 12,000 years of feeding humankind, all farming except fruit and veg production is likely to be replaced by ferming: brewing microbes through precision fermentation. This means multiplying particular micro-organisms, to produce particular products, in factories.

I know some people will be horrified by this prospect. I can see some drawbacks. But I believe it comes in the nick of time…

Scientists are replacing crops and livestock with food made from microbes and water. It may save humanity’s bacon: “Lab-grown food will soon destroy farming – and save the planet.”

* French epicurean Jean Anthelme Brillat-Savarin


As we dig in, we might recall that it was on this date in 1943 that an official of the Meats Division of the wartime Office of Price Administration (OPA) announced that, for the duration of World War II, frankfurters (or ‘hot dogs’) would be replaced with “Victory Sausages,” in which a substantial proportion of the meat would be replaced with “an unspecified amount of soybean meal or some other substitute.”

Victory was achieved; but it was not, in this dimension in any case, sweet.



“Talkin’ ’bout my generation”*…




“The competition between paradigms is not the sort of battle that can be resolved by proofs”  – Thomas S. Kuhn, The Structure of Scientific Revolutions

As recently as the late 1980s, most Americans thought gay sex was not only immoral but also something that ought to be illegal. Yet by 2015, when the Supreme Court legalised same-sex marriage, there were only faint murmurs of protest. Today two-thirds of Americans support it, and even those who frown on it make no serious effort to criminalise it.

This surge in tolerance illustrates how fast public opinion can shift. The change occurred because two trends reinforced each other. First, many socially conservative old people have died, and their places in the polling samples have been taken by liberal millennials. In addition, people have changed their minds. Support for gay marriage has risen by some 30 percentage points within each generation since 2004, from 20% to 49% among those born in 1928-45 and from 45% to 78% among those born after 1980.

However, this shift in opinion makes gay marriage an exception among political issues. Since 1972 the University of Chicago has run a General Social Survey every year or two, which asks Americans their views on a wide range of topics. Over time, public opinion has grown more liberal. But this is mostly the result of generational replacement, not of changes of heart.

For example, in 1972, 42% of Americans said communist books should be banned from public libraries. Views varied widely by age: 55% of people born before 1928 (who were 45 or older at the time) supported a ban, compared with 37% of people aged 27-44 and just 25% of those 26 or younger. Today, only a quarter of Americans favour this policy. However, within each of these birth cohorts, views today are almost identical to those from 47 years ago. The change was caused entirely by the share of respondents born before 1928 falling from 49% to nil, and that of millennials—who were not born until at least 1981, and staunchly oppose such a ban—rising from zero to 36%.

Not every issue is as extreme as these two. But on six of the eight questions we examined—all save gay marriage and marijuana legalisation—demographic shifts accounted for a bigger share of overall movement in public opinion than changes in beliefs within cohorts. On average, their impact was about twice as large.
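The decomposition The Economist is running here — how much of an opinion shift comes from cohorts dying out and being replaced, versus people changing their minds — is standard shift-share arithmetic, and easy to sketch. The numbers below use the communist-books example, but note that only the 42%, 49%, 36%, and the three 1972 support figures appear in the article; the remaining shares and the millennial support level are illustrative values I've filled in:

```python
def decompose(shares_then, support_then, shares_now, support_now):
    """Split a change in aggregate opinion into two parts:
    cohort replacement (population shares shifting) and
    within-cohort change (people changing their minds).
    Midpoint weights make the two parts sum exactly to the total."""
    total = (sum(s * p for s, p in zip(shares_now, support_now))
             - sum(s * p for s, p in zip(shares_then, support_then)))
    replacement = sum((s1 - s0) * (p0 + p1) / 2 for s0, s1, p0, p1
                      in zip(shares_then, shares_now, support_then, support_now))
    within = sum((p1 - p0) * (s0 + s1) / 2 for s0, s1, p0, p1
                 in zip(shares_then, shares_now, support_then, support_now))
    return total, replacement, within

# Cohorts: born pre-1928, 1928-45, 1946-80, post-1980 (millennials).
shares_1972 = [0.49, 0.25, 0.26, 0.00]   # middle two shares are illustrative
shares_today = [0.00, 0.10, 0.54, 0.36]
support = [0.55, 0.37, 0.25, 0.12]       # views held steady within cohorts;
                                          # the 0.12 for millennials is assumed
total, replacement, within = decompose(shares_1972, support, shares_today, support)
# The aggregate falls from ~42% to ~22% -- a drop of ~21 points --
# with every point of it attributable to cohort replacement.
```

Because within-cohort views are held fixed, `within` comes out at zero and `replacement` accounts for the entire change — the "grinding attrition" pattern the article describes.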

Social activists devote themselves to changing people’s views, and sometimes succeed. In general, however, battles for hearts and minds are won by grinding attrition more often than by rapid conquest.

The Economist illustrates the way in which generational change is the driver of changes in public opinion: “Societies change their minds faster than people do.”

Paul Graham has tips on how to anticipate and navigate, even to lead, this change: “What you can’t say.”

“The proliferation of competing articulations, the willingness to try anything, the expression of explicit discontent, the recourse to philosophy and to debate over fundamentals, all these are symptoms of a transition from normal to extraordinary research. It is upon their existence more than upon that of revolutions that the notion of normal science depends… though the world does not change with a change of paradigm, the scientist afterward works in a different world.”

Thomas S. Kuhn, The Structure of Scientific Revolutions

* The Who


As we go with the flow, we might send responsive birthday greetings to John Broadus Watson; he was born on this date in 1878. A psychologist inspired by the (then recent) work of Ivan Pavlov, Watson established the psychological school of behaviorism, most dramatically through his address Psychology as the Behaviorist Views It, at Columbia University in 1913. Watson studied the biology, physiology, and behavior of animals, viewing them as extremely complex machines that responded to situations according to their “wiring,” or nerve pathways, which were conditioned by experience. When he continued with studies of the behavior of children, his conclusion was that humans, while more complicated than animals, operated on the same principles; he was particularly interested in the conditioning of emotions. Watson’s behaviorism dominated psychology in the U.S. in the 1920s and ’30s (and got a second wind with the ascendancy of B.F. Skinner).


Ironically, it is also the birthday (1886) of one of Watson’s contemporaries and antagonists, Edwin Ray Guthrie. Guthrie was also a behaviorist, but argued against Watson’s theory of classical conditioning and Skinner’s related theory of operant conditioning. Guthrie’s focus was the psychology of learning and the role that association plays. In his Law of Contiguity, he held that “a combination of stimuli which has accompanied a movement, will on its recurrence tend to be followed by that movement.” He held that all learning is based on a stimulus-response association; movements are small stimulus-response combinations, and these movements make up an act. A learned behavior is a series of movements, and it takes time for the movements to develop into an act. Which is to say that he believed that learning is incremental, and that many acquired behaviors involve repetition of movements, because what is learned are movements, not behaviors.



“Just as the twig is bent, the tree’s inclined”*…


Crown shyness


In certain forests, when you look up you will see a network of cracks formed by gaps between the outermost edges of the tree branches. It looks like a precisely engineered jigsaw puzzle, each branch growing just perfectly so it almost—but not quite—touches the neighboring tree. This beautiful phenomenon is called crown shyness.

Crown shyness doesn’t happen all the time, and scientists aren’t completely certain why it happens at all…

The forest keeps its secrets… Despite decades of study– and a profusion of postulation– no one yet fully understands “The Mysteries of Crown Shyness.”

* Alexander Pope


As we keep to ourselves, we might spare a thought for Gregor Johann Mendel; he died on this date in 1884. After a profoundly-unpromising start, Mendel became a scientist, Augustinian friar, and abbot of St. Thomas’ Abbey in Brno, Moravia (today’s Czech Republic). A botanist and plant experimenter, he was the first to lay the mathematical foundation of the science of genetics (of which he is now considered the “Father”). Over the period 1856-63, Mendel grew and analyzed over 28,000 pea plants. He carefully studied the height, pod shape, pod color, flower position, seed color, seed shape, and flower color of each, and from those observations derived two very important generalizations, known today as the Laws of Heredity.



Written by LW

January 6, 2020 at 1:01 am

“Time flies like an arrow; fruit flies like a banana”*…




How does now unstick itself and become the past, and when does the future morph into the present? How do these states transition, one into another, so seamlessly?

How long is right now?…

[Fascinating explorations of different explanations…]

Each theory of right now has one thing in common: It challenges the notion that the present is reliable and objective, or that it stretches out infinitely in front of us, even if we sometimes perceive it that way. This is an important reminder because the way we think about time affects the kinds of decisions we make.

We don’t just think about the past, present, or future, we think about ourselves in those places. (That’s the impetus behind something called Time Perspective Theory, which argues that there are six different ways people regard time, and it greatly influences your perspective on life.)

Studies have found that many people think about themselves in the future not as themselves, but as other people. When asked to imagine a birthday in the far off future, people are more likely to envision it from a third-person viewpoint. When we think about ourselves in 10 years, compared to right now, it activates similar parts of our brain that think about others, not ourselves.

It’s our instinct to place a lot of emphasis on the present, said Hal Hershfield, a psychologist at UCLA who has studied how perceptions of time relate to the choices people make. But if we could better relate to our future selves, we could be better off later on. Hershfield and his collaborators did a study that found that those who felt more similar to their future selves made more future-oriented decisions and had higher levels of well-being across a decade…

How long is right now?  As long as it took you to read that question?  Or shorter?  Or it might not exist at all…

* Anthony G. Oettinger


As we remember Ram Dass, we might recall that it was on this date in 1914 that Henry Ford announced that his company would cut its workday from nine hours to eight, and double workers’ wages to $5 per day.

Cited as the beginning of the “living wage” revolution, it is often suggested that Ford made the move so that his employees could afford the product that they were making.  But many historians argue that the real motivations were likelier an attempt to reduce employee turnover and to put economic pressure on competitors.  In any event, that’s what happened:  while Ford’s move was hugely unpopular among other automakers, they saw the increase in Ford’s productivity– and a significant increase in profit margin (from $30 million to $60 million in two years)– and (mostly) followed suit.


Assembly line at the Ford Motor Company’s Highland Park plant, ca. 1913



Written by LW

January 5, 2020 at 1:01 am

“I’ve developed a new philosophy. I only dread one day at a time.”*…




Starting [last] month, the very talented Adam Koford, the creator of Laugh-Out-Loud Cats webcomic, started posting these wonderful bootleg Peanuts comics to his Twitter account, and continued almost every day since.

Loose and sketchy, they capture the essence of Charles Schulz’s Peanuts so well: sweet and sad, combining childlike wonder and existential dread. As he went on, they started evolving a unique style of their own, distinct from the Peanuts characters but still recognizable…

Via Andy Baio‘s wonderful site Waxy.  The “Peanuts” panels are strewn through Adam’s Twitter feed; as a gift to us all, Baio collected a bunch of them into a Twitter “Moment.”

Enjoy… and don’t mention it to the Schulz estate.

* Charlie Brown


As we ruminate on reality, we might recall that today’s a relative-ly good day for it, as it was on this date in 1900 that German physicist Max Planck presented and published his study of the effect of radiation on a “black-body” substance (introducing what we’ve come to know as the Planck Postulate), and the quantum theory of modern physics– and for that matter, Twentieth Century modernity– were born.

Planck’s study demonstrated that in certain situations energy exhibits the characteristics of physical matter– something unthinkable at the time, when energy was thought to exist only in wave form– and suggested that energy exists in discrete packets, which he called “quanta”… thus laying the foundation on which he, Einstein, Bohr, Schrödinger, Dirac, and others built our modern understanding.
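By way of illustration (mine, not the post's): the size of one of those discrete packets follows from Planck's relation E = h·f, where h is the Planck constant and f the frequency of the radiation. A few lines suffice to see how tiny a single quantum is:

```python
# Planck's relation E = h * f: the energy carried by one quantum of
# radiation at frequency f.
h = 6.62607015e-34  # Planck constant, J*s (exact by the 2019 SI definition)
c = 2.99792458e8    # speed of light, m/s

def quantum_energy(wavelength_m):
    """Energy of a single quantum (photon) of the given wavelength, in joules."""
    frequency = c / wavelength_m
    return h * frequency

# One quantum of green light (~550 nm) carries only a few times 10^-19 joules,
# which is why energy looks perfectly continuous at everyday scales.
green = quantum_energy(550e-9)
```

The discreteness only becomes visible when individual quanta matter — exactly the black-body regime Planck was wrestling with.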

Max Planck


Written by LW

December 14, 2019 at 1:01 am
