Posts Tagged ‘knowledge’
“The more that you read, the more things you will know. The more that you learn, the more places you’ll go.”*…
The end of the year approaches, and thoughts turn to retrospectives. In what has become a (Roughly) Daily tradition, today’s edition features a year-end recap from the estimable Tom Whitwell, who shares a full deck of fascinating things he learned in 2024. For example…
6. The London Underground has a distinct form of mosquito, Culex pipiens f. molestus, genetically distinct from above-ground mosquitoes, and present since at least the 1940s. [Katharine Byrne & Richard A Nichols]
7. Ozempic is a modified, synthetic version of a protein discovered in the venomous saliva of the Gila monster, a large, sluggish lizard native to the United States. [Scott Alexander]
22. In 2022, 55% of Macy’s income came from credit cards rather than retail sales. That’s fairly normal for US department stores. [Pan Kwan Yuk]
29. You can buy 200 real human molars for $900. [B for Bones, via Lauren]
32. In 1800, 1 in 3 people on earth were Chinese. Today, it’s less than 1 in 5. [Our World in Data, via Boyan Slat]
42. In the 2020s, over 16% of movies have colons in the title (like Spider-Man: Homecoming), up almost 300% since the 1990s. [Daniel Parris]
46. Between the 1920s and 1950s, millions of ‘enemies of the people’ — often educated elites — were sent to prison camps in the Soviet Union. Today, the areas around those camps are more prosperous and productive than similar areas. [Toews & Vézina]
Many more fascinating factoids at: “52 things I learned in 2024,” from @TomWhitwell.
Previous lists: 2014 2015 2016 2017 2018 2019 2020 2021 2022 2023… and sprinkled throughout the December postings in (R)D over the years.
###
As we forage, we might recall that on this date in 1968 Marvin Gaye’s version of “I Heard It Through the Grapevine” was in the middle of its seven-week occupancy of the #1 spot on Billboard’s Hot 100.
A year earlier, Gladys Knight and the Pips had had a hit with the tune (#1 on the R&B chart; #2 on the Hot 100). Gaye’s version overtook its predecessor and became the biggest hit single on the Motown family of labels up to that point. The Gaye recording has since become an acclaimed soul classic. In 1998 the song was inducted into the Grammy Hall of Fame for “historical, artistic and significant” value.
“I was conscious that I knew practically nothing”*…
The estimable Nicholas Carr observes that “you don’t make friends by telling people they’re not as smart as they think they are. And you definitely don’t make friends by telling all of humanity that it’s not as smart as it thinks it is. That’s why the philosophical school of Mysterianism has never caught on with the public.” As an amateur Mysterian himself, he reprises a 2017 essay to spread the good word…
By leaps, steps, and stumbles, science progresses. Its seemingly inexorable advance promotes a sense that everything can be known and will be known. Through observation and experiment, and lots of hard thinking, we will come to explain even the murkiest and most complicated of nature’s secrets: consciousness, dark matter, time, the origin and fate of the universe.
But what if our faith in nature’s knowability is just an illusion, a trick of the overconfident human mind? That’s the working assumption behind a school of thought known as Mysterianism. Situated at the fruitful if sometimes fraught intersection of scientific and philosophic inquiry, the Mysterianist view has been promulgated, in different ways, by many prominent thinkers, from the philosopher Colin McGinn to the linguist Noam Chomsky to the cognitive scientist Steven Pinker. The Mysterians propose that human intellect has boundaries and that many of the mysteries of the cosmos will forever lie beyond our comprehension.
Mysterianism is most closely associated with the so-called hard problem of consciousness: How can the inanimate matter of the brain produce subjective feelings? The Mysterians suggest that the human mind is incapable of understanding itself, that we will never know how consciousness works. But if Mysterianism applies to the workings of the mind, there’s no reason it shouldn’t also apply to the workings of nature in general. As McGinn has suggested, “It may be that nothing in nature is fully intelligible to us.”
The simplest and best argument for Mysterianism is founded on evolutionary evidence. When we examine any other living creature, we understand immediately that its intellect is limited. Even the brightest, most curious dog is not going to master arithmetic. Even the wisest of owls knows nothing of the physiology of the field mouse it devours. If all the minds that evolution has produced have bounded comprehension, then it’s only logical that our own minds, also products of evolution, would have limits as well. As Pinker has put it, “The brain is a product of evolution, and just as animal brains have their limitations, we have ours.” To assume that there are no limits to human understanding is to believe in a level of human exceptionalism that seems miraculous, if not mystical.
Mysterianism, it’s important to emphasize, is not inconsistent with materialism [as distinct from theism or idealism]. The Mysterians don’t suggest that what’s unknowable has to be spiritual or otherwise otherworldly. They posit that matter itself has complexities that lie beyond our ken. Like every other animal on earth, we humans are just not smart enough to understand all of nature’s laws and workings.
What’s truly disconcerting about Mysterianism is that, if our intellect is bounded, we can never know how much of existence lies beyond our grasp. What we know or may in the future know may be trifling compared with the unknowable unknowns. “As to myself,” remarked Isaac Newton in his old age, “I seem to have been only like a boy playing on the sea-shore, and diverting myself in now and then finding a smoother pebble or a prettier shell than ordinary, whilst the great ocean of truth lay all undiscovered before me.” It may be that we are all like that child on the strand, playing with the odd pebble or shell — and fated to remain so.
Mysterianism teaches us humility. Through science, we have come to understand much about nature, but much more may remain outside the scope of our perception and comprehension. If the Mysterians are right, science’s ultimate achievement may be to reveal to us its own limits…
On unknowable unknowns: Question Marks of the Mysterians, from @roughtype in his terrific newsletter, New Cartographies.
Pair with Flatland (here and here) and Gödel’s second incompleteness theorem.
* Socrates (per Plato in Apology 22d)
###
As we wonder, we might recall that it was on this date (though different sources offer different November dates) in 1966 that 96 Tears, the debut studio album by the American garage rock band ? and the Mysterians, was released. The title single, which had been released some months earlier, was at #1 on the Billboard Hot 100; the album joined the single on the charts for fifteen weeks; the follow-up single “I Need Somebody” charted for ten weeks.
“The greatest obstacle to discovery is not ignorance – it is the illusion of knowledge”*…
Learning from the past: as John Thornhill explains in his consideration of Jason Roberts‘ Every Living Thing, the rivalry between Buffon and Linnaeus has lessons about disrupters and exploitation…
The aristocratic French polymath Georges-Louis Leclerc, Comte de Buffon chose a good year to die: 1788. Reflecting his status as a star of the Enlightenment and author of 35 popular volumes on natural history, Buffon’s funeral carriage, drawn by 14 horses, was watched by an estimated 20,000 mourners as it processed through Paris. A grateful Louis XVI had earlier erected a statue of a heroic Buffon in the Jardin du Roi, over which the naturalist had masterfully presided. “All nature bows to his genius,” the inscription read.
The next year the French Revolution erupted. As a symbol of the ancien régime, Buffon was denounced as an enemy of progress, his estates in Burgundy seized, and his son, known as the Buffonet, guillotined. In further insult to his memory, zealous revolutionaries marched through the king’s gardens (nowadays known as the Jardin des Plantes) with a bust of Buffon’s great rival, Carl Linnaeus. They hailed the Swedish scientific revolutionary as a true man of the people.
The intense intellectual rivalry between Buffon and Linnaeus, which still resonates today, is fascinatingly told by the author Jason Roberts in his book Every Living Thing, my holiday reading while staying near Buffon’s birthplace in Burgundy. Natural history, like all history, might be written by the victors, as Roberts argues. And for a long time, Linnaeus’s highly influential, but flawed, views held sway. But the book makes a sympathetic case for the further rehabilitation of the much-maligned Buffon.
The two men were, as Roberts writes, exact contemporaries and polar opposites. While Linnaeus obsessed about classifying all biological species into neat categories with fixed attributes and Latin names (Homo sapiens, for example), Buffon emphasised the vast diversity and constantly changing nature of every living thing.
In Roberts’s telling, Linnaeus emerges as a brilliant but ruthless dogmatist, who ignored inconvenient facts that did not fit his theories and gave birth to racial pseudoscience. But it was Buffon’s painstaking investigations and acceptance of complexity that helped inspire the evolutionary theories of Charles Darwin, who later acknowledged that the Frenchman’s ideas were “laughably like mine”.
In two aspects, at least, this 18th-century scientific clash rhymes with our times. The first is to show how intellectual knowledge can often be a source of financial gain. The discovery of crops and commodities in other parts of the world and the development of new methods of cultivation had a huge impact on the economy in that era. “All that is useful to man originates from these natural objects,” Linnaeus wrote. “In one word, it is the foundation of every industry.”
Great wealth was generated from trade in sugar, potatoes, coffee, tea and cochineal while Linnaeus himself explored ways of cultivating pineapples, strawberries and freshwater pearls.
“In many ways, the discipline of natural history in the 18th century was roughly analogous to technology today: a means of disrupting old markets, creating new ones, and generating fortunes in the process,” Roberts writes. As a former software engineer at Apple and a West Coast resident, Roberts knows the tech industry.
Then as now, the addition of fresh inputs into the economy — whether natural commodities back then or digital data today — can lead to astonishing progress, benefiting millions. But it can also lead to exploitation. As Roberts tells me in a telephone interview, it was the scaling up of the sugar industry in the West Indies that led to the slave trade. “Sometimes we think we are inventing the future when we are retrofitting the past,” he says.
The second resonance with today is the danger of believing we know more than we do. Roberts compares Buffon’s state of “curious unknowing” to the concept of “negative capability” described by the English poet John Keats. In a letter written in 1817, Keats argued that we should resist the temptation to explain away things we do not properly understand and accept “uncertainties, mysteries, doubts, without any irritable reaching after fact and reason.”
Armed today with instant access to information and smart machines, the temptation is to ascribe a rational order to everything, as Linnaeus did. But scientific progress depends on a humble acceptance of relative ignorance and a relentless study of the fabric of reality. The spooky nature of quantum mechanics would have blown Linnaeus’s mind. If Buffon still teaches us anything, it is to study the peculiarity of things as they are, not as we might wish them to be…
“What an epic 18th-century scientific row teaches us today,” @johnthornhillft on @itsJason in @FT (gift link)
Pair with “Frameworks” from Céline Henne (@celinehenne) “Knowledge is often a matter of discovery. But when the nature of an enquiry itself is at question, it is an act of creation.”
* Daniel J. Boorstin
###
As we embrace the exceptions, we might send carefully coded birthday greetings to John McCarthy; he was born on this date in 1927. An eminent computer and cognitive scientist– he was awarded both the Turing Award and the National Medal of Science– McCarthy coined the phrase “artificial intelligence” to describe the field of which he was a founder.
It was McCarthy’s 1979 article, “Ascribing Mental Qualities to Machines” (in which he wrote, “Machines as simple as thermostats can be said to have beliefs, and having beliefs seems to be a characteristic of most machines capable of problem solving performance”) that provoked John Searle‘s 1980 disagreement in the form of his famous Chinese Room Argument… provoking a broad debate that continues to this day.

“I was conscious that I knew practically nothing”*…
As Joshua Rothman reminds us, we have a lot to learn from studying our ignorance…
… The truth, of course, is that we’re ignorant about the future. Who will win the election in November? Will we lose our jobs because of A.I.? Will the planet boil or merely simmer? What will skyscrapers, or smartphones, or schools look like in thirty years? We’re not in the dark about these questions; we can make educated guesses or predictions. But there’s an odd way in which, the more informed our speculations become, the more they serve to highlight what we don’t know. “The knowledge we possess determines the degree of specificity of the ignorance we recognize,” the philosopher Daniel DeNicola writes, in his book “Understanding Ignorance.” The more you know, the more precisely you can say what you don’t.
DeNicola’s book is an entry in a subfield of philosophy called “agnotology”—the study of ignorance. As philosophical subfields go, agnotology sounds abstract and even a little contradictory: what could it even mean to study what’s unknown? And yet, because ignorance is actually an everyday condition from which we all suffer, the study of it is quite down to earth. Have you ever been in a bookstore, leafed through a weighty tome, and then returned it to the shelf? You are practicing “rational ignorance,” DeNicola writes, by making “the more-or-less conscious decision that something is not worth knowing—at least for me, at least not now.” (In an information-rich society, he notes, knowing when to maintain this kind of ignorance is actually an important skill.) Have you ever tuned out a gossipy friend because you don’t want to know who said what about whom? Deciding that you’d rather be above the fray is “strategic ignorance”; you embrace it because it will make life better, deploying it when you decide not to read the reviews before seeing a movie, or conduct a hiring process in which the names of the candidates are obscured. There’s a big difference between strategic ignorance and what DeNicola calls “involuntary” ignorance: “In the iconic image, Justice is blindfolded, not blind,” he writes.
My wife’s parents have a box of letters that were sent between her grandfather and her grandmother while he was serving in the Navy during the Second World War. The box is in the basement; no one has read the letters, and no one plans to. This reflects a valid concern for privacy, but it also involves what DeNicola calls “willful ignorance”—the persistent, long-term maintenance of a gap in one’s knowledge that could easily be filled in. Willful ignorance isn’t necessarily bad; it might be wise to avoid learning the disturbing details of a half-forgotten traumatic event, for instance, lest they keep the trauma fresh. But we should be wary of willful ignorance, DeNicola argues, because it often flows from fear. “Consider a mother who is so upset about her son’s military service that she refuses to discuss it while he remains on active duty,” DeNicola writes. Or a voter who refuses to read about a favored candidate’s ongoing scandal. “The benefits of willful ignorance tend to be overestimated by those who exhibit it”; knowledge can be a path to overcoming fear.
DeNicola argues that, even when we don’t choose ignorance, there are ways in which we must “dwell in ignorance,” no matter what we do. We’re ignorant of most of what happened in the past because, despite our efforts at historical reconstruction, “worlds disappear” in the flow of time. We’re ignorant about the future not just because we don’t know what will happen but because we lack the ideas needed to comprehend future knowledge: “Galileo could not have known that solar flares produce bursts of radiation,” for example, because the very idea of radiation depends on a “framework of theoretical concepts” that wasn’t developed until hundreds of years after he lived. It turns out that there’s a special word, “ignoration,” which describes the condition of people who “do not even know that they do not know.” In a broad, almost existential sense, we all live in ignoration all the time. Recognizing this makes knowing what you don’t know feel like a step forward—even an opportunity to be seized…
… In a recent book called “Sense, Nonsense, and Subjectivity,” a German philosopher named Markus Gabriel argues that our personhood is partly based on ignorance—that “to be someone, to be a subject, is to be wrong about something.” It’s intuitive to hold the opposite view—to say that we are the sum of what we know. But Gabriel points out that, even when you know something to be true, you probably also know that there are aspects of it about which you’re probably wrong. I encountered this phenomenon recently when my son asked me to explain the meaning of “E=mc2”—but, also, when I tried to tell him about how I’d met his mom. “We were riding up in an elevator, and we started talking, and then she got off,” I said. “And then, later, when I was riding down, she got back on.”
This story is true, but also wreathed in inevitable uncertainties. What exactly did we say to one another? What were we wearing, or thinking, or feeling, before and after? There are limits to recollection, and to noticing in the moment; life is short, and you can’t know it all, not even about yourself. But you can know, at least to some extent, what you chose not to know and what you wished you’d found out. You can understand what you looked away from, and toward…
“What Don’t We Know?” from @joshuarothman in @NewYorker.
* Socrates, from Plato, Apology 22d
###
As we noodle on nescience, we might send bodacious birthday greetings to that most fabulous of flappers, Betty Boop; she made her first appearance on this date in 1930. The creation of animator Max Fleischer, she debuted in “Dizzy Dishes” (in which, still unevolved as a character, she is drawn as an anthropomorphic female dog).
“The recognition of oneself as a part of nature, and reliance on natural things, are disappearing for hundreds of millions of people who do not know that anything is being lost.”*…
The estimable Alan Jacobs on the (glorious) novels of Robertson Davies, and (what Jacobs suggests is) a central question running throughout them: What ways of Wisdom have been discarded by modern Knowledge?…
Long ago every village in England had a cunning man, or woman—an untrained but intuitive healer, a person with a good nose for other people’s troubles and a tactical shrewdness about how to handle them. If your problems were simple and obvious, if you needed a broken bone set or a bad tooth pulled, you’d go to the surgeon. Everyone knew that. But what if you weren’t quite sure what was wrong with you? What if your spirit was troubled but also your digestion, and you didn’t know which was causing which, or if they were separate miseries? Then you needed to consult the cunning ones.
The Cunning Man is the last novel by the great Canadian writer Robertson Davies, and its titular figure is a man of the late twentieth century named Jonathan Hullah, who grew up in a remote outpost in northern Ontario and got his first ideas about healing by hanging around with Elsie Smoke, an Ojibwa herbalist and healer, a “wise woman”—a cunning woman. Hullah ultimately becomes a doctor and a practitioner of what some now call “holistic medicine,” though that term is not used in the book by Hullah or anyone else. Hullah thinks of himself as a disciple of the great Renaissance physician Paracelsus— the first person to theorize that physical disease can be the product of what we now would call psychological distress. As Hullah comments, “The problem for a Paracelsian physician like me is that I see diseases as disguises in which people present me with their wretchedness.” It is a problem because people are happy to speak of their diseases but reluctant to acknowledge their wretchedness.
Hullah’s creator almost certainly learned about Paracelsus through reading Carl Jung, who was perhaps the most important guiding figure of Davies’s intellectual and religious life. From my point of view, which is that of a generally orthodox Christian, Davies’s embrace of Jungian ideas is a convenient way to get all the benefits of belief in transcendent order with none of the obligations of obedience to a personal God. Nevertheless, there is much in Davies’s picture of the cunning man—and in closely related ideas that he developed in the latter part of his career as a novelist—from which thinking Christians can and should learn. Above all, I think, we should adopt a kind of historically aware intellectual pluralism, a willingness to learn from and make use of the past, and especially those elements of the past that have been discarded by modernity as refuse and waste. The thoughtful Christian should be a cunning practitioner of filth therapy.
In Davies’s wicked and wonderful novel The Rebel Angels, a scholar named Clement Hollier—whom Davies refers to as a “paleo-psychologist,” a student of ancient and discarded ways of thinking—grows fascinated by what he calls “Filth Therapy.” He suspects that a scientific colleague is pursuing a similar path: “He works with human excrement—what is rejected, what is accounted of no worth to mankind—and in it I suppose he hopes to discover something that is of worth.”…
…
… In his many novels Davies returns over and over again to this theme. He portrays modernity as a world in which we love our crowns even as we despise and try to rip up our roots. The Rebel Angels is the first novel in what has come to be known as the Cornish Trilogy because it deals with the Cornish family, and in the novel that follows it, What’s Bred in the Bone, a young painter named Francis Cornish struggles with his love of Renaissance painting— struggles because he doesn’t just admire the Old Masters but wants to paint as they painted. And yet, he thinks, “surely one must paint in the manner of one’s day?” Anything else is “a kind of fakery, or a deliberate throw-back, like those Pre-Raphaelites.” And he has a very specific reason for believing that one must choose between “the manner of one’s own day” and a historically informed “fakery”: “Even if you are a believer, you cannot believe as the great men of the past believed.”
Cornish’s mentor, a brilliant restorer of art named Saraceni, disputes this, and constantly holds out to young Cornish the challenge of acquiring “the ability to work truly in the technique and also in the spirit of the past.” And Cornish achieves this ability, at least to Saraceni’s satisfaction; but when his masterful painting is discovered to be new rather than old, it is immediately and universally decried as a fake— even though Cornish never pretended that the painting was by anyone else. For artists and connoisseurs of our age, only the crown—the thought-world of the moment—can provide an authentic and valid mode of artistic (or religious) experience. To work from the root is necessarily to be inauthentic…
…
In a city in Paraguay you may find a curious assembly of musicians called La Orquesta de Instrumentos Reciclados de Cateura— the Recycled Instruments Orchestra of Cateura. But these instrumentos are not professionally designed and built objects that have been discovered and repaired: they have been made out of recycled materials. Violins are constructed from cans and bent forks, a discarded oil drum forms the body of a cello, a saxophone somehow emerges from a drainpipe and a few bent spoons. Most of the musicians are teenagers from Cateura, which is a slum, and a slum built on and around a landfill. They too are among the world’s discards, thought to be without value, people in whom society invests no hope. But Favio Chávez, the creator and director of the orchestra, has invested in them. He has said, “People realize that we shouldn’t throw away trash carelessly. Well, we shouldn’t throw away people either.”
In The Rebel Angels Maria’s mother healed the souls of great instruments that had been damaged by time and use. This is a wondrous art and worthy of great praise.
But then what praise is appropriate for those who have taken the filth of the world and given it souls, souls capable of the loveliest utterance? And what wonder is adequate to the imaginative dedication of Favio Chávez, whose name should be known throughout the world? “The world sends us garbage,” he says. “We send back music.”…
Eminently worth reading in full: “Filth Therapy: A Cunning Word.” Also eminently worth reading: every one of Robertson Davies’ novels.
This essay dates from 2017. Jacobs brought it back up in response to his reading of a fascinating new book: Cunning Folk: Life in the Era of Practical Magic, by Tabitha Stanmore. It’s on Google Books, here.
* Robertson Davies, The Rebel Angels
###
As we find treasure in trash, we might recall that it was on this date in 1953 that John Kraft (the younger brother of James Kraft, the founder of Kraft Cheese [later Kraft Foods]) received U.S. patent No. 2,641,545 for the manufacture of soft surface-cured cheese. Just one year earlier, the company had introduced Cheez Whiz.