(Roughly) Daily

Posts Tagged ‘learning’

“Nothing from nothing ever yet was born”*…

Lacy M. Johnson argues that there is no hierarchy in the web of life…

… Humans have been lumbering around the planet for only a half million years, the only species young and arrogant enough to name ourselves sapiens in genus Homo. We share a common ancestor with gorillas and whales and sea squirts, marine invertebrates that swim freely in their larval phase before attaching to rocks or shells and later eating their own brain. The kingdom Animalia, in which we reside, is an offshoot of the domain Eukarya, which includes every life-form on Earth with a nucleus—humans and sea squirts, fungi, plants, and slime molds that are ancient by comparison with us—and all these relations occupy the slenderest tendril of a vast and astonishing web that pulsates all around us and beyond our comprehension.

The most recent taxonomies—those based on genetic evidence that evolution is not a single lineage, but multiple lineages, not a branch that culminates in a species at its distant tip, but a network of convergences—have moved away from their histories as trees and chains and ladders. Instead, they now look more like sprawling, networked webs that trace the many points of relation back to ever more ancient origins, beyond our knowledge or capacity for knowing, in pursuit of the “universal ancestors,” life-forms that came before metabolism, before self-replication—the several-billion-year-old plasmodial blobs from which all life on Earth evolved. We haven’t found evidence for them yet, but we know what we’re looking for: they would be simple, small, and strange.

Slime molds can enter stasis at any stage in their life cycle—as an amoeba, as a plasmodium, as a spore—whenever their environment or the climate does not suit their preferences or needs. The only other species that have this ability are the so-called “living fossils” such as tardigrades and Notostraca (commonly known as water bears and tadpole shrimp, respectively). The ability to become dormant until conditions are more favorable for life might be one of the reasons slime mold has survived as long as it has, through dozens of geologic periods, countless ice ages, and the extinction events that have repeatedly wiped out nearly all life on Earth.

Slime mold might not have evolved much in the past two billion years, but it has learned a few things during that time. In laboratory environments, researchers have cut Physarum polycephalum into pieces and found that it can fuse back together within two minutes. Or, each piece can go off and live separate lives, learn new things, and return later to fuse together, and in the fusing, each individual can teach the other what it knows, and can learn from it in return.

Though, in truth, “individual” is not the right word to use here, because “individuality”—a concept so central to so many humans’ identities—doesn’t apply to the slime mold worldview. A single cell might look to us like a coherent whole, but that cell can divide itself into countless spores, creating countless possible cycles of amoeba to plasmodium to aethalia, which in turn will divide and repeat the cycle again. It can choose to “fruit” or not, to reproduce sexually or asexually or not at all, challenging every traditional concept of “species,” the most basic and fundamental unit of our flawed and imprecise understanding of the biological world. As a consequence, we have no way of knowing whether slime molds, as a broad class of beings, are stable or whether climate change threatens their survival, as it does our own. Without a way to count their population as a species, we can’t measure whether they are endangered or thriving. Should individuals that produce similar fruiting bodies be considered a species? What if two separate slime molds do not mate but share genetic material? The very idea of separateness seems antithetical to slime mold existence. It has so much to teach us…

More at: “What Slime Knows,” from @lacymjohnson in @Orion_Magazine.

See also, “Slime Molds Remember — but Do They Learn?” (from whence the image above) and “Now, in addition to penicillin, we can credit mold with elegant design.”

* Lucretius, On the Nature of Things

###

As we contemplate kinship, we might send insightful birthday greetings to Johann Hedwig; he was born on this date in 1730. A botanist noted for his study of mosses, he is considered “the father of bryology” (the study of mosses… cousins of mold).

source

Written by (Roughly) Daily

December 8, 2021 at 1:00 am

“Curiosity has its own reason for existence”*…

This is, as nearly as I can tell, the 5,000th (Roughly) Daily post (4,505 blog posts, preceded by 495 email-only pieces). On this numerologically significant occasion, my deep thanks to readers past and present. It seems appropriate to devote this post to the impulse that has powered (Roughly) Daily from the start, curiosity– free-range curiosity…

Recently I read a terrific blog post by CJ Eller where he talks about the value of paying attention to offbeat things.

Eller was joining an online conversation about how people get caught up in the “status and celebrity game” when they’re trying to grow their audience. They become overly obsessed with following — and emulating, and envying — the content of people with massive audiences. The conversation started with this poignant essay by the author Ali Montag; she concludes that rabidly chasing followers endows your writing (and thinking!) with “inescapable mediocrity.” (It also tends to make you miserable, she points out)…

Instead of crowding your attention with what’s already going viral on the intertubes, focus on the weird stuff. Hunt down the idiosyncratic posts and videos that people are publishing, oftentimes to tiny and niche audiences. It’s decidedly unviral culture — but it’s more likely to plant in your mind the seed of a rare, new idea.

I love the idea of “rewilding your attention”. It puts a name on something I’ve been trying to do for a while now: to stop clicking on the stuff big-tech algorithms push at me… social behavior can influence our attention: What are the high-follower-count folks talking/posting/arguing about today? This isn’t always a bad thing. We’re social animals, so we’re necessarily (and often productively) intrigued by what others are chewing over. But as these three writers note, it’s also crucial to follow your own signal — to cultivate the stuff you’re obsessed with, even if few others are.

On top of the social pressure from people online, there’s technological pressure too — from recommendation systems trying to juke our attention… Medium’s algorithm has deduced that … I’m a nerd. They are correct! I am. The other major social networks, like Twitter or YouTube, offer me the same geek-heavy recommendations when I log in. And hey, they’re not wrong either; I really do like these subjects.

But … I’m also interested in so many other things that are far outside these narrow lanes. I am, for example, a Canadian who’s deeply into Canadian art, and a musician who spends a lot of time thinking about composition and gear and lyric-writing and production and guitar pedals, and a father who thinks a lot about the culture my kids show me, and I have a super-snobby fanboy love of the 18th century poet Alexander Pope.

You’re the same way; you contain your own Whitmanian multitudes, your pockets of woolly-eyed obsession. We all do.

But our truly quirky dimensions are never really grasped by these recommendation algorithms. They have all the dullness of a Demographics 101 curriculum; they sketch our personalities with the crudity of crime-scene chalk-outlines. They’re not wrong about us; but they’re woefully incomplete. This is why I always get a slightly flattened feeling when I behold my feed, robotically unloading boxes of content from the same monotonous conveyor-belt of recommendations, catered to some imaginary marketing version of my identity. It’s like checking my reflection in the mirror and seeing stock-photo imagery.

The other problem with big-tech recommendation systems is they’re designed by people who are convinced that “popularity” and “recency” equal “valuable”. They figure that if they sample the last 15 milliseconds of the global zeitgeist and identify what’s floated to the top of that quantum foam, I’ll care about it. Hey, a thing happened and people are talking about it, here’s the #hashtag!

And again … they’re sometimes right! I am often intrigued to know the big debates of the day, like Oscar Wilde peering into his daily gazette. But I’d also like to stumble over arguments yet more arcane, and material that will never be the subject of a massive online conversation because only a small group of oddballs care about it.

You’re the same way too, I bet. We’re all weird in different ways, but we’re all weird.

Big-tech recommendation systems have been critiqued lately for their manifold sins — e.g., how their remorseless lust for “engagement” leads them to overpromote hotly emotional posts; how they rile people up; how they feed us clicktastic disinfo; how they facilitate “doomscrolling”. All true.

But they pose a subtler challenge, too, for our imaginative lives: their remarkably dull conception of what’s interesting. It’s like intellectual monocropping. You open your algorithmic feed and see rows and rows of neatly planted corn, and nothing else.

That’s why I so enjoy the concept of “rewilding”… For me, it’s meant slowly — over the last few years — building up a big, rangy collection of RSS feeds that let me check up on hundreds of eclectic blogs and publications and people. (I use Feedly.) I’ve also started using Fraidycat, a niftily quixotic feed-reader that lets you sort sources into buckets by “how often should I check this source”, which is a cool heuristic; some people/sites you want to check every day, and others, twice a year.

Other times I spend an hour or two simply prospecting — I pick a subject almost at random, then check to see if there’s a hobbyist or interest-group discussion-board devoted to it. (There usually is, running on free warez like phpBB). Then I’ll just trawl through the forum, to find out: what does this community care about? It’s like a psychogeographic walk of the mind.

Another awesome technology for rewilding my attention, I’ve found, is the good old-fashioned paper book. I go to a bookstore, pick up something where it’s not immediately obvious why it’d appeal to me, then flip around to see if anything catches my eye. (This works online, too, via the wonderful universe of pre-1923, freely-accessible ebooks and publications at the Internet Archive, Project Gutenberg, or even Google Books. Pre-WWI material is often super odd and thought-provoking.)…

Step away from algorithmic feeds. In praise of free-range curiosity: “Rewilding your attention,” from Clive Thompson (@pomeranian99).

See also “Before Truth: Curiosity, Negative Capability, Humility,” from Will Wilkinson (@willwilkinson)

* “Curiosity has its own reason for existence. One cannot help but be in awe when he contemplates the mysteries of eternity, of life, of the marvelous structure of reality. It is enough if one tries merely to comprehend a little of this mystery each day.” — Albert Einstein, “Old Man’s Advice to Youth: ‘Never Lose a Holy Curiosity.’” LIFE Magazine (2 May 1955) p. 64

###

As we revel in rabbit holes, we might send insightfully humorous birthday greetings to William Penn Adair Rogers; he was born on this date in 1879. A stage and motion picture actor, vaudeville performer, cowboy, humorist, newspaper columnist, and social commentator, he traveled around the world three times, made 71 films (50 silent films and 21 “talkies”), and wrote more than 4,000 nationally syndicated newspaper columns. By the mid-1930s Rogers was hugely popular in the United States, its leading political wit and the highest paid Hollywood film star. He died in 1935 with aviator Wiley Post when their small airplane crashed in northern Alaska.

Known as “Oklahoma’s Favorite Son,” Rogers was a Cherokee citizen, born to a Cherokee family in Indian Territory (now part of Oklahoma).

“I am not a member of an organized political party. I am a Democrat.”- Will Rogers


source

“Nothing so much assists learning as writing down what we wish to remember”*…

Lewis Carroll’s commonplace book shows his musings on ciphers and detailed handwritten charts exploring labyrinths. [source]

Your correspondent is away for the rest of this month; regular service will resume on or around September 1st. For the hiatus, a little something to occupy you…

As readers of this blog will have deduced, (Roughly) Daily is a kind of commonplace book…

Commonplace books (or commonplaces) are a way to compile knowledge, usually by writing information into books. They have been kept from antiquity, and were kept particularly during the Renaissance and in the nineteenth century. Such books are similar to scrapbooks filled with items of many kinds: sententiae, notes, proverbs, adages, aphorisms, maxims, quotes, letters, poems, tables of weights and measures, prayers, legal formulas, and recipes… Commonplaces are used by readers, writers, students, and scholars as an aid for remembering useful concepts or facts. Each one is unique to its creator’s particular interests but they almost always include passages found in other texts, sometimes accompanied by the compiler’s responses.

Commonplace book

As Steven Johnson points out, commonplace books have a storied history…

Scholars, amateur scientists, aspiring men of letters — just about anyone with intellectual ambition in the seventeenth and eighteenth centuries was likely to keep a commonplace book. In its most customary form, “commonplacing,” as it was called, involved transcribing interesting or inspirational passages from one’s reading, assembling a personalized encyclopedia of quotations. It was a kind of solitary version of the original web logs: an archive of interesting tidbits that one encountered during one’s textual browsing. The great minds of the period — Milton, Bacon, Locke — were zealous believers in the memory-enhancing powers of the commonplace book. There is a distinct self-help quality to the early descriptions of commonplacing’s virtues: in the words of one advocate, maintaining the books enabled one to “lay up a fund of knowledge, from which we may at all times select what is useful in the several pursuits of life.”

The philosopher John Locke first began maintaining a commonplace book in 1652, during his first year at Oxford. Over the next decade he developed and refined an elaborate system for indexing the book’s content. Locke thought his method important enough that he appended it to a printing of his canonical work, An Essay Concerning Human Understanding

The Glass Box And The Commonplace Book

Perhaps because in these interconnected days almost anything seems retrievable at a click, not too many bother keeping commonplaces. That’s a shame. Your correspondent can testify that the habit– whether practiced in a book or digitally– is a powerful aid both to learning and to writing.

Happily, there are lots of sources of good advice for getting started, e.g., here, here (source of the image above), and here. There’s even a Masterclass.

* Marcus Tullius Cicero

###

As we live and learn, we might recall that it was on this date in 1858 that (prodigious journaler and commonplace keeper) Charles Darwin and Alfred Russel Wallace published “On the tendency of species to form varieties; and on the perpetuation of varieties and species by natural selection” in the Journal of the Proceedings of the Linnean Society. This was the first printed formal exposition of the theory of evolution by natural selection.

Darwin had developed the essential elements of his theory by 1838 and set them on paper in 1844; however, he chose to keep his work on evolution unpublished for the time, instead concentrating his energies first on the preparation for publication of his geological work on the Beagle voyage, and then on an exhaustive eight-year study of the barnacle genus Cirripedia.

In 1856, at the urging of Charles Lyell, Darwin began writing a vast encyclopedic work on natural selection; however, it is possible that the extremely cautious Darwin might never have published his evolutionary theories during his lifetime had not Alfred Russel Wallace, a naturalist born in Wales, independently discovered the theory of natural selection. Wallace conceived the theory of natural selection during an attack of malarial fever in Ternate in the Moluccas, Indonesia (February 1858) and sent a manuscript summary to Darwin, who feared that his discovery would be pre-empted.

In the interest of justice, Joseph Dalton Hooker and Charles Lyell suggested joint publication of Wallace’s paper prefaced by a section of a manuscript of a work on species written by Darwin in 1844, when it was read by Hooker, plus an abstract of a letter by Darwin to Asa Gray, dated 1857, to show that Darwin’s views on the subject had not changed between 1844 and 1857. The papers by Darwin and Wallace were read by Lyell before the Linnean Society on July 1, 1858 and published on August 20.

Darwin & Wallace Issue the First Printed Exposition of the Theory of Evolution by Natural Selection

source

“We account the whale immortal in his species, however perishable in individuality”*…

A remarkable new study on how whales behaved when attacked by humans in the 19th century has implications for the way they react to changes wreaked by humans in the 21st century.

The paper, published by the Royal Society [in March], is authored by Hal Whitehead and Luke Rendell, pre-eminent scientists working with cetaceans, and Tim D Smith, a data scientist, and their research addresses an age-old question: if whales are so smart, why did they hang around to be killed? The answer? They didn’t.

Using newly digitised logbooks detailing the hunting of sperm whales in the north Pacific, the authors discovered that within just a few years, the strike rate of the whalers’ harpoons fell by 58%. This simple fact leads to an astonishing conclusion: that information about what was happening to them was being collectively shared among the whales, who made vital changes to their behaviour. As their culture made fatal first contact with ours, they learned quickly from their mistakes.

“Sperm whales have a traditional way of reacting to attacks from orca,” notes Hal Whitehead… Before humans, orca were their only predators, against whom sperm whales form defensive circles, their powerful tails held outwards to keep their assailants at bay. But such techniques “just made it easier for the whalers to slaughter them”, says Whitehead.

It was a frighteningly rapid killing, and it accompanied other threats to the ironically named Pacific. From whaling and sealing stations to missionary bases, western culture was imported to an ocean that had remained largely untouched. As Herman Melville, himself a whaler in the Pacific in 1841, would write in Moby-Dick (1851): “The moot point is, whether Leviathan can long endure so wide a chase, and so remorseless a havoc.”

Sperm whales are highly socialised animals, able to communicate over great distances. They associate in clans defined by the dialect pattern of their sonar clicks. Their culture is matrilinear, and information about the new dangers may have been passed on in the same way whale matriarchs share knowledge about feeding grounds. Sperm whales also possess the largest brain on the planet. It is not hard to imagine that they understood what was happening to them.

The hunters themselves realised the whales’ efforts to escape. They saw that the animals appeared to communicate the threat within their attacked groups. Abandoning their usual defensive formations, the whales swam upwind to escape the hunters’ ships, themselves wind-powered. “This was cultural evolution, much too fast for genetic evolution,” says Whitehead.

And in turn, it evokes another irony. Now, just as whales are beginning to recover from the industrial destruction by 20th-century whaling fleets – whose steamships and grenade harpoons no whale could evade – they face new threats created by our technology. “They’re having to learn not to get hit by ships, cope with the depredations of longline fishing, the changing source of their food due to climate change,” says Whitehead. Perhaps the greatest modern peril is noise pollution, one they can do nothing to evade.

Whitehead and Rendell have written persuasively of whale culture, expressed in localised feeding techniques as whales adapt to shifting sources, or in subtle changes in humpback song whose meaning remains mysterious. The same sort of urgent social learning the animals experienced in the whale wars of two centuries ago is reflected in the way they negotiate today’s uncertain world and what we’ve done to it.

As Whitehead observes, whale culture is many millions of years older than ours. Perhaps we need to learn from them as they learned from us…

Learning from the ways that whales learn: “Sperm whales in 19th century shared ship attack information.”

* Herman Melville, Moby-Dick

###

As we admire adaptation, we might recall that it was on this date in 1953 that Ernest Hemingway won the Pulitzer Prize for his short novel The Old Man and the Sea. It was cited by the Nobel Committee as contributing to their awarding of the Nobel Prize in Literature to Hemingway the following year.

The Old Man and the Sea reinvigorated Hemingway’s literary reputation and prompted a reexamination of his entire body of work. The novel was initially received with much enthusiasm by critics and the public alike; many critics favorably compared it with Moby-Dick.

source

Written by (Roughly) Daily

May 4, 2021 at 1:01 am

“So many books, so little time”*…

Dear The Sophist, 

I own a lot of books, and nearly enough shelves to fit them. I haven’t read most of them—has anyone with a lot of books read most of them?—yet I still get impulses to buy more. Can you please tell me why it’s OK for me to buy more books? I should add that I live with a partner who doesn’t own a lot of books, but tolerates mine so far. So far.

—Tome-escent

Dear Volume Purchaser,

Books are ridiculous objects to buy, aren’t they? For the sake of spending a day or two, maybe a week, with some author’s thoughts and words, you take custody of this physical item that sticks around, and around, as more and more others accumulate along with it. You look at them, almost unseeingly, day after day; the walls of your rooms press in; you pay extra money to the movers to drag the extra weight around from one dwelling to the next, all because you read an interesting review once or a cover caught your eye in a bookstore.  

You know what else is ridiculous? The sheer impermanence of thought. The constant yet ephemeral flickering of partial understanding across the synapses in our wet and mortal brains, and the dry circuits of the junky and even more short-lived electronic ersatz brains we rely on for backup. A book is an investment against forgetting and death—a poor investment, but it beats the alternatives. It is a slippery yet real toehold on eternity… If you stop the flow of new books, you stop this flow of possibilities…

Too many books? Tom Scocca (@tomscocca) explains that there’s no such thing as too many books. (via the ever-illuminating Today in Tabs)

And lest one fear that the only option is to buy books, remember the Public Library…

Central Library, Kansas City (source)

* Frank Zappa

###

As we reorganize our shelves, we might spare a thought for someone whose works definitely deserve places of honor thereon, Octavia Estelle Butler; she died on this date in 2006. An African American woman science fiction author, she was a rarity in her field. But her primary distinction was her extraordinary talent, as manifest in novels and stories that stretch the imagination even as they explore the all-too-real truths of the human condition. She was a multiple recipient of both the Hugo and Nebula awards, and became (in 1995) the first science-fiction writer to receive a MacArthur Fellowship.

It’s a measure of her insight that her work– perhaps especially her “Parable” series— is being re-discovered as painfully prescient of our current times.

source

Written by (Roughly) Daily

February 24, 2021 at 1:01 am