(Roughly) Daily

Archive for October 2020

“There is nothing more deceptive than an obvious fact”*…

Distortions and outright lies by politicians and pundits have become so common that major news outlets like the Associated Press, CNN, BBC, Fox News, and the Washington Post routinely assign journalists and fact-checkers to verify claims made during stump speeches and press briefings. The motivation to uncover falsehoods and misleading statements taken out of context is laudable. But when it comes to real-world complexities, the trouble is that people often see different things when looking at the same event, a phenomenon repeatedly documented by psychologists.

Laboratory studies reveal that, when shown a video of a group of protesters, people see either a peaceful protest or an unruly mob blocking pedestrian access, depending on their sociopolitical beliefs. The world outside the lab shows similar biased perception: For example, 68 percent of Republicans consider the videotaped demonstrations in Portland, Ore., Kenosha, Wisc., and New York City to be riots, versus only 30 percent of Democrats, according to a Fox News poll released in September. Journalists and fact-checkers are human beings subject to the same psychological biases as everyone else—and their analyses of what constitutes a “fact” are affected by their own political and ideological values, resulting in what psychologists term selective perception.

Fact-checkers’ decisions have consequences for debates about fake news that cannot be overstated. Researchers have studied the cascading cognitive effects of misinformation, and their findings are relevant to current concerns about fake news and to the limitations of fact-checking. Misinformation can be insidious; it can seep into the unconscious mind and influence beliefs and behaviors long after we have forgotten its source or the evidence invoked to support it. Under laboratory conditions, a selection of objective facts and complete fabrications can be presented, and researchers can then examine the spread of misinformation about these facts and whether and how this spread results in false beliefs.

Unlike a pristine laboratory setting, however, the world of politics is messy, and there can be deep disagreements about the facts themselves, as the above contradictory claims illustrate. When it comes to partisan fact-checking about complex issues—which describes much of the fact-checking that takes place in the context of political news—the truth as stated is often the subjective opinion of people with shared political views.

One path to a solution is “adversarial fact-checking.” Fact-checking is often done by teams of two or more journalists rather than by a single person. We propose that political claims continue to be aggressively fact-checked, but by teams of individuals with diverse sociopolitical views; for example, by pairing fact-checkers from major liberal and conservative news sources. This would add little, if any, cost. The media should abandon fact-checkers’ pretense of objectivity and political disinterest and instead acknowledge their sociopolitical leanings, in much the way that NPR pits pro and con points of view against each other in political coverage…

Having each side’s fact-checkers checked by the other side’s fact-checkers could lead to an infinite regress toward an uncertain truth. But this is preferable to belief in a truth that may not exist. Adversarial fact-checkers would debate the same “evidence” and ensure a balanced presentation of the facts. This may not guarantee that fact-checkers will agree or even that readers will discern the truth. But it will reveal the sometimes-tenuous nature of fact-checkers’ claims and the psychological context in which human cognition unfolds—and this would be a meaningful barrier to the spread of fake news and the creation of false beliefs among voters.

One notes that the Hegelian suggestion above assumes that fact-checkers from each side would be actively seeking to overcome their personal biases, to determine an “objective” truth… that only unconscious– not conscious, weaponized– biases are the issue.

Still, it’s certainly true that at least some fact-checkers aim to get closer to the truth, even as their biases can shroud the very truth they seek: “The Psychology of Fact-Checking.”

* Sherlock Holmes, in Arthur Conan Doyle’s “The Boscombe Valley Mystery”

###

As we clean our lenses, we might recall that it was on this date in 1517– All Hallows (All Saints) Eve– that Martin Luther, a priest and scholar in Wittenberg, Germany, upset by what he saw as the excesses and corruption of the Roman Catholic Church (especially the papal practice of taking payments– “indulgences”– for the forgiveness of sins), posted his 95 Theses on the door of Castle Church.  Thus began the Protestant Reformation.

Martin Luther (source)

Lest in this pandemic-attenuated moment we forget: today, All Hallows (All Saints) Eve, is celebrated as Halloween, which is (if it is, as many scholars believe, directly descended from the ancient Celtic harvest festival Samhain) the longest-running holiday with a set date… and (usually, anyway) the second-biggest (after Christmas) commercial holiday in the United States.

source

“A Genuine ‘TV’ Brand Dinner”*…

In 1925, the Brooklyn-born entrepreneur Clarence Birdseye invented a machine for freezing packaged fish that would revolutionize the storage and preparation of food. [see here.] Maxson Food Systems of Long Island used Birdseye’s technology, the double-belt freezer, to sell the first complete frozen dinners to airlines in 1945, but plans to offer those meals in supermarkets were canceled after the death of the company’s founder, William L. Maxson. Ultimately, it was the Swanson company that transformed how Americans ate dinner (and lunch)—and it all came about, the story goes, because of Thanksgiving turkey.

According to the most widely accepted account, a Swanson salesman named Gerry Thomas conceived the company’s frozen dinners in late 1953 when he saw that the company had 260 tons of frozen turkey left over after Thanksgiving, sitting in ten refrigerated railroad cars. (The train’s refrigeration worked only when the cars were moving, so Swanson had the trains travel back and forth between its Nebraska headquarters and the East Coast “until panicked executives could figure out what to do,” according to Adweek.) Thomas had the idea to add other holiday staples such as cornbread stuffing and sweet potatoes, and to serve them alongside the bird in frozen, partitioned aluminum trays designed to be heated in the oven. Betty Cronin, Swanson’s bacteriologist, helped the meals succeed with her research into how to heat the meat and vegetables at the same time while killing food-borne germs.

Whereas Maxson had called its frozen airline meals “Strato-Plates,” Swanson introduced America to its “TV dinner” (Thomas claims to have invented the name) at a time when the concept was guaranteed to be lucrative: As millions of white women entered the workforce in the early 1950s, Mom was no longer always at home to cook elaborate meals—but now the question of what to eat for dinner had a prepared answer. Some men wrote angry letters to the Swanson company complaining about the loss of home-cooked meals. For many families, though, TV dinners were just the ticket. Pop them in the oven, and 25 minutes later, you could have a full supper while enjoying the new national pastime: television.

In 1950, only 9 percent of U.S. households had television sets—but by 1955, the number had risen to more than 64 percent, and by 1960, to more than 87 percent. Swanson took full advantage of this trend, with TV advertisements that depicted elegant, modern women serving these novel meals to their families, or enjoying one themselves. “The best fried chicken I know comes with a TV dinner,” Barbra Streisand told the New Yorker in 1962…

Thanks to the pandemic, Thanksgiving’s most unexpected legacy is heating up again: “A Brief History of the TV Dinner.”

* The Swanson “promise” on every package

###

As we peel back the foil, we might send clearly-outlined birthday greetings to Irma S. Rombauer; she was born on this date in 1877. An author of cookbooks, she is best remembered for The Joy of Cooking, one of the most widely-published and used cookbooks in the U.S. Originally unable to find a publisher, she privately printed an edition in 1931. But in 1935 a commercial edition (the first of nine) was released by Bobbs-Merrill.

Written in a plain but witty style, including basic techniques and simple dishes, not just the complex recipes that were the staples of most cookbooks of the time, it found a broad and loyal audience among the American middle class. Indeed, Julia Child credits The Joy of Cooking with teaching her the basic techniques of the kitchen, and (along with Gourmet Magazine) teaching her to cook.

source

Written by (Roughly) Daily

October 30, 2020 at 1:01 am

“He told me that in 1886 he had invented an original system of numbering”*…

A visualization of the 3-adic numbers

The rational numbers are the most familiar numbers: 1, -5, ½, and every other value that can be written as a ratio of positive or negative whole numbers. But they can still be hard to work with.

The problem is they contain holes. If you zoom in on a sequence of rational numbers, you might approach a number that itself is not rational. This short-circuits a lot of basic mathematical tools, like most of calculus.
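To make those “holes” concrete (a worked example added here for clarity; it isn’t in the Quanta excerpt): take the decimal truncations of √2. Each term is a perfectly good rational number, but the value they close in on is not:

```latex
% A sequence of rationals "converging to a hole": every term is a ratio of
% whole numbers, yet the limit, sqrt(2), cannot be written as one.
\[
  1,\ \tfrac{14}{10},\ \tfrac{141}{100},\ \tfrac{1414}{1000},\ \tfrac{14142}{10000},\ \dots
  \;\longrightarrow\; \sqrt{2} \notin \mathbb{Q}
\]
```

Calculus needs limits like this one to land somewhere; in the bare rationals, they don’t.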

Mathematicians usually solve this problem by arranging the rationals in a line and filling the gaps with irrational numbers to create a complete number system that we call the real numbers.

But there are other ways of organizing the rationals and filling the gaps: the p-adic numbers. They are an infinite collection of alternative number systems, each associated with a unique prime number: the 2-adics, 3-adics, 5-adics and so on.

The p-adics can seem deeply alien. In the 3-adics, for instance, 82 is much closer to 1 than to 81. But the strangeness is largely superficial: At a structural level, the p-adics follow all the rules mathematicians want in a well-behaved number system…
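That claim about 82 is easy to check (the arithmetic below is ours, not the article’s). In the p-adic world a number counts as small when it is divisible by a high power of p, so the 3-adic absolute value of a difference measures how many factors of 3 it soaks up:

```latex
% 3-adic distances: |x|_3 = 3^(-v), where v is the number of factors of 3 in x.
\[
  |82 - 1|_3 = |81|_3 = |3^4|_3 = 3^{-4} = \tfrac{1}{81},
  \qquad
  |82 - 81|_3 = |1|_3 = 3^{0} = 1.
\]
```

So in the 3-adic metric, 82 sits at distance 1/81 from 1 but at distance 1 from 81: much closer to 1, just as the article says.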

“We’re all on Earth and we work with the reals, but if you went [anywhere] else, you’d work with the p-adics,” [University of Washington mathematician Bianca] Viray explained. “It’s the reals that are the outliers.”

The p-adics form an infinite collection of number systems based on prime numbers. They’re at the heart of modern number theory… which is itself at the heart of computer science, numerical analysis, and cryptography: “An Infinite Universe of Number Systems.”

* Jorge Luis Borges, Labyrinths

###

As we dwell on digits, we might send carefully-calculated birthday greetings to Klaus Friedrich Roth; he was born on this date in 1925. After escaping with his family from Nazi Germany, he was educated at Cambridge, then taught mathematics first at University College London, then at Imperial College London. He made a number of important contributions to number theory, for which he won the De Morgan Medal and the Sylvester Medal and was elected a Fellow of the Royal Society. In 1958 he was awarded mathematics’ highest honor, the Fields Medal, for proving Roth’s theorem on the Diophantine approximation of algebraic numbers.

source

“The heart and soul of the company is creativity”*…

Creativity doesn’t have a deep history. The Oxford English Dictionary records just a single usage of the word in the 17th century, and it’s religious: ‘In Creation, we have God and his Creativity.’ Then, scarcely anything until the 1920s – quasi-religious invocations by the philosopher A N Whitehead. So creativity, considered as a power belonging to an individual – divine or mortal – doesn’t go back forever. Neither does the adjective ‘creative’ – being inventive, imaginative, having original ideas – though this word appears much more frequently than the noun in the early modern period. God is the Creator and, in the 17th and 18th centuries, the creative power, like the rarely used ‘creativity’, was understood as divine. The notion of a secular creative ability in the imaginative arts scarcely appears until the Romantic Era, as when the poet William Wordsworth addressed the painter and critic Benjamin Haydon: ‘Creative Art … Demands the service of a mind and heart.’

This all changes in the mid-20th century, and especially after the end of the Second World War, when a secularised notion of creativity explodes into prominence. The Google Ngram chart bends sharply upwards from the 1950s and continues its ascent to the present day. But as late as 1970, practically oriented writers, accepting that creativity was valuable and in need of encouragement, nevertheless reflected on the newness of the concept, noting its absence from some standard dictionaries even a few decades before.

Before the Second World War and its immediate aftermath, the history of creativity might seem to lack its object – the word was not much in circulation. The point needn’t be pedantic. You might say that what we came to mean by the capacity of creativity was then robustly picked out by other notions, say genius, or originality, or productivity, or even intelligence or whatever capacity it was believed enabled people to think thoughts considered new and valuable. And in the postwar period, a number of commentators did wonder about the supposed difference between emergent creativity and such other long-recognised mental capacities. The creativity of the mid-20th century was entangled in these pre-existing notions, but the circumstances of its definition and application were new…

Once seen as the work of genius, how did creativity become an engine of economic growth and a corporate imperative? (Hint: the Manhattan Project and the Cold War played important roles.): “The rise and rise of creativity.”

(Image above: source)

* Bob Iger, CEO of The Walt Disney Company

###

As we lionize the latest, we might recall that it was on this date in 1726 that Jonathan Swift’s Travels into Several Remote Nations of the World. In Four Parts. By Lemuel Gulliver, First a Surgeon, and then a Captain of Several Ships— much better known as Gulliver’s Travels— was first published.  A satire both of human nature and of the “travelers’ tales” literary subgenre popular at the time, it was an immediate hit (John Gay wrote in a 1726 letter to Swift that “It is universally read, from the cabinet council to the nursery”).  It has, of course, become a classic.

From the first edition

source

Written by (Roughly) Daily

October 28, 2020 at 1:01 am

“We are what we pretend to be, so we must be careful about what we pretend to be”*…

There is just something obviously reasonable about the following notion: If all life is built from atoms that obey precise equations we know—which seems to be true—then the existence of life might just be some downstream consequence of these laws that we haven’t yet gotten around to calculating. This is essentially a physicist’s way of thinking, and to its credit, it has already done a great deal to help us understand how living things work…

But approaching the subject of life with this attitude will fail us, for at least two reasons. The first reason we might call the fallacy of reductionism. Reductionism is the presumption that any piece of the universe we might choose to study works like some specimen of antique, windup clockwork, so that it is easy (or at least eminently possible) to predict the behavior of the whole once you know the rules governing how each of its parts pushes on and moves with the others…

The second mistake in how people have viewed the boundary between life and non-life is still rampant in the present day and originates in the way we use language. A great many people imagine that if we understand physics well enough, we will eventually comprehend what life is as a physical phenomenon in the same way we now understand how and why water freezes or boils. Indeed, it often seems people expect that a good enough physical theory could become the new gold standard for saying what is alive and what is not.

However, this approach fails to acknowledge that our own role in giving names to the phenomena of the world precedes our ability to say with any clarity what it means to even call something alive. A physicist who wants to devise theories of how living things behave or emerge has to start by making intuitive choices about how to translate the characteristics of the examples of life we know into a physical language. After one has done so, it quickly becomes clear that the boundary between what is alive and what is not is something that already got drawn at the outset, through a different way of talking than physics provides…

Physics is an approach to science that roots itself in the measurement of particular quantities: distance, mass, duration, charge, temperature, and the like. Whether we are talking about making empirical observations or developing theories to make predictions, the language of physics is inherently metrical and mathematical. The phenomena of physics are always expressed in terms of how one set of measurable numbers behaves when other sets of measurable numbers are held fixed or varied. This is why the genius of Newton’s Second Law, F = ma, was not merely that it proposed a successful equation relating force (F), mass (m), and acceleration (a), but rather that it realized that these were all quantities in the world that could be independently measured and compared in order to discover such a general relationship.

This is not how the science of biology works. It is true that doing excellent research in biology involves trafficking in numbers, especially these days: For example, statistical methods help one gain confidence in trends discovered through repeated observations (such as a significant but small increase in the rate of cell death when a drug is introduced). Nonetheless, there is nothing fundamentally quantitative about the scientific study of life. Instead, biology takes the categories of living and nonliving things for granted as a starting point, and then uses the scientific method to investigate what is predictable about the behavior and qualities of life. Biologists did not have to go around convincing humanity that the world actually divides into things that are alive and things that are not; instead, in much the same way that it is quite popular across the length and breadth of human language to coin terms for commonplace things like stars, rivers, and trees, the difference between being alive and not being alive gets denoted with vocabulary.

In short, biology could not have been invented without the preexisting concept of life to inspire it, and all it needed to get going was for someone to realize that there were things to be discovered by reasoning scientifically about things that were alive. This means, though, that biology most certainly is not founded on mathematics in the way that physics is. Discovering that plants need sunlight to grow, or that fish will suffocate when taken out of water, requires no quantification of anything whatsoever. Of course, we could learn more by measuring how much sunlight the plant got, or timing how long it takes for the fish-out-of-water to expire. But the basic empirical law in biological terms only concerns itself with what conditions will enable or prevent thriving, and what it means to thrive comes from our qualitative and holistic judgment of what it looks like to succeed at being alive. If we are honest with ourselves, the ability to make this judgment was not taught to us by scientists, but comes from a more common kind of knowledge: We are alive ourselves, and constantly mete out life and death to bugs and flowers in our surroundings. Science may help us to discover new ways to make things live or die, but only once we tell the scientists how to use those words. We did not know any physics when we invented the word “life,” and it would be strange if physics only now began suddenly to start dictating to us what the word means.

The origin of life can’t be explained by first principles: “Why Physics Can’t Tell Us What Life Is.”

See also this interview with Jeremy England, the author of the article linked above (and of the book from which it is excerpted): “The Physicist’s New Book of Life.”

* Kurt Vonnegut, Mother Night

###

As we live and let live, we might spare a thought for Ernest Everett Just; he died on this date in 1941.  A pioneering biologist, academic, and science writer, he contributed mightily to the understanding of cell division, the fertilization of egg cells, experimental parthenogenesis, hydration and dehydration in living cells, and the effect of ultraviolet rays on egg cells.

An African-American, he had limited academic prospects on his graduation from Dartmouth, but was able to land a teaching spot at Howard University.  Just met Frank R. Lillie, the head of the Department of Zoology at the University of Chicago and director of the Marine Biological Laboratory (MBL) at Woods Hole, Mass.  In 1909 Lillie invited Just to spend first one, then several summers at Woods Hole, where Just pioneered the study of whole cells under normal conditions (rather than simply breaking them apart in a laboratory setting).  In 1915, Just was awarded the first Spingarn Medal, the highest honor given by the NAACP.

But outside MBL, Just experienced discrimination.  Seeking more opportunity, he spent most of the 1930s in various European universities– until the outbreak of WW II hostilities caused him to return to the U.S. in late 1940.  He died of pancreatic cancer on this date the next year.

Ernest Everett Just

source

Written by (Roughly) Daily

October 27, 2020 at 1:01 am