(Roughly) Daily

Posts Tagged ‘learning’

“Right now I’m having amnesia and déjà vu at the same time. I think I’ve forgotten this before.”*…

Woodcut illustrations from Anianus’ Compotus cum commento (ca. 1492), an adaptation of Bede’s computus system — Source.

Before humans stored memories as zeroes and ones, we turned to digital devices of another kind — preserving knowledge on the surface of fingers and palms. Kensy Cooperrider leads us through a millennium of “hand mnemonics” and the variety of techniques practiced by Buddhist monks, Latin linguists, and Renaissance musicians for remembering what might otherwise elude the mind…

In the beginning, the hand was just a hand — or so we can imagine. It was a workaday organ, albeit a versatile one: a tool for grasping, holding, throwing, and hefting. Then, at some point, after millions of years, it took on other duties. It became an instrument of mental, not just menial, labor. As a species, our systems of understanding, belief, and myth had grown more elaborate, more cognitively overwhelming. And so we started to put those systems out into the world: to tally, track, and record by carving notches into bone, tying knots in string, spreading pigment on cave walls, and aligning rocks with celestial bodies. Hands abetted these early mental labors, of course, but they would later become more than mere accessories. Beginning roughly twelve hundred years ago, we started using the hand itself as a portable repository of knowledge, a place to store whatever tended to slip our mental grasp. The topography of the palm and fingers became invisibly inscribed with information of all kinds — tenets and dates, names and sounds. The hand proved versatile in a new way, as an all-purpose memory machine.

The arts of memory are well known, but the role of the hand in these arts is often overlooked. In the twentieth century, beginning with the pioneering work of Frances Yates, Western scholars started to piece together a rich tradition of mnemonic practices that originated in antiquity and later took hold in Europe. The most celebrated of these is the “memory palace” [see here]. Using this technique, skilled practitioners can memorize vast collections of facts by nesting them in familiar places (or “loci”): the chambers of a building or along a well-known route. (To make these places more memorable, a bizarre image is often added to each one, the more jarring the better.) It is an odd omission that hand mnemonics are rarely mentioned alongside memory palaces. Both techniques are powerful and broadly attested. Both are adaptable, able to accommodate whatever type of information one wants to remember. And both work by similar principles, pinning to-be-remembered items to familiar locations.

The two traditions do have important differences. Memory palaces exist solely in the imagination; hand mnemonics exist half in the mind and half in the flesh. Another difference lies in their intended use. Memory palaces are idiosyncratic in nature, tailored to the quirks of personal experience and association, and used for private purposes; they are very much the province of an individual. Hand mnemonics, by contrast, are the province of a community, a tool for collective understanding. They offer a way of fixing and transmitting a shared system of knowledge. They serve private purposes, certainly — such as contemplation, in the case of the Mogao mnemonic, or calculation, in the case of Bede’s computus. But they also have powerful social functions in teaching, ritual, and communication…
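Bede's computus, mentioned above as one use of the hand, was at bottom an algorithm: a bundle of calendrical rules for reckoning the date of Easter, portable enough to carry on one's fingers. For a modern taste of the kind of calculation the hand was storing, here is a minimal Python sketch of the anonymous Gregorian ("Meeus/Jones/Butcher") Easter algorithm. It is a later, Gregorian-calendar descendant of the computation, not Bede's own Julian-calendar method, and is offered only as an illustration.

```python
def easter(year: int) -> tuple[int, int]:
    """Date of Gregorian Easter as (month, day), via the Meeus/Jones/Butcher algorithm."""
    a = year % 19                          # position in the 19-year Metonic lunar cycle
    b, c = divmod(year, 100)               # century and year-within-century
    d, e = divmod(b, 4)                    # leap-century corrections
    f = (b + 8) // 25
    g = (b - f + 1) // 3                   # lunar (epact) correction terms
    h = (19 * a + b - d - g + 15) % 30     # roughly: the age of the moon in spring
    i, k = divmod(c, 4)
    l = (32 + 2 * e + 2 * i - h - k) % 7   # days until the following Sunday
    m = (a + 11 * h + 22 * l) // 451       # rare correction keeping Easter in bounds
    month, day = divmod(h + l - 7 * m + 114, 31)
    return month, day + 1

print(easter(2022))  # (4, 17): Easter fell on April 17 in 2022
```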

The five-fingered memory machine: “Handy Mnemonics,” from @kensycoop in @PublicDomainRev.

* Steven Wright

###

As we give it (to) the finger, we might recall an occasion for counting that required no fingers at all: on this date in 2015, a baseball game between the Chicago White Sox and the Baltimore Orioles at Camden Yards set the all-time low attendance mark for Major League Baseball. Zero (0) fans were in attendance, because the stadium was closed to the public due to the 2015 Baltimore protests (over the death of Freddie Gray while in police custody).

source

“Damn everything but the circus!”*…

Acro-balancing in Circus and Philosophy at the University of Kentucky, fall 2017

Meg Wallace, of the University of Kentucky, teaches the philosophy course that I wish I’d taken…

The circus is ridiculous. Or: most people think it’s ridiculous. We even express our disdain for disorganized, poorly run groups by claiming, disparagingly, that such entities are “run like a circus.” (This isn’t true, of course. The amount of organization, discipline, and hard work that it takes to run a circus is mind-blowingly impressive.) But this is one reason why I teach Circus and Philosophy. I want to show students a new way into philosophy – through doing ridiculous things.

It seems strange that philosophers often teach philosophy of art, philosophy of sport, philosophy of the performing arts, and so on, without having the students at least minimally participate in the activities or artforms that are being philosophized about. This lack of first-person engagement is especially unfortunate when the topic at hand crucially involves the perspective of the participant – the painter, the dancer, the actor, the aerialist, the clown. Circus and Philosophy is an attempt to explore this participation/theorizing gap. (Another aim is just to magic-trick undergrads into loving philosophy.)

[The circus is] rich with potential for deep discussions about an array of philosophical topics in aesthetics, ethics, social and political philosophy, personal identity, mind, metaphysics, epistemology, and so on. It is also intrinsically interdisciplinary, so students with interests and majors outside of philosophy can easily find a way in…

Finding the profound in the profane: “Circus and Philosophy: Teaching Aristotle through Juggling.”

* e e cummings

###

As we benefit from the big top, we might recall that it was on this date in 1987 that another instructive family of entertainers, The Simpsons, made their debut on television in “Good Night,” the first of 48 shorts that aired on The Tracey Ullman Show before the characters were given their own eponymous show – now the longest-running scripted primetime series in U.S. television history.

Written by (Roughly) Daily

April 19, 2022 at 1:00 am

“Artificial intelligence is growing up fast”*…

A simple prototype system sidesteps the computing bottleneck in tuning – teaching – artificial intelligence algorithms…

A simple electrical circuit [pictured above] has learned to recognize flowers based on their petal size. That may seem trivial compared with artificial intelligence (AI) systems that recognize faces in a crowd, transcribe spoken words into text, and perform other astounding feats. However, the tiny circuit outshines conventional machine learning systems in one key way: It teaches itself without any help from a computer—akin to a living brain. The result demonstrates one way to avoid the massive amount of computation typically required to tune an AI system, an issue that could become more of a roadblock as such programs grow increasingly complex.

“It’s a proof of principle,” says Samuel Dillavou, a physicist at the University of Pennsylvania who presented the work here this week at the annual March meeting of the American Physical Society. “We are learning something about learning.”…
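For readers who want the gist in software: the circuit tunes itself with purely local signals, comparing a “free” electrical state against a “clamped” state nudged toward the desired output, so no external processor has to compute gradients. For a single linear unit, that style of contrastive, local update reduces to the classic delta rule. The sketch below is a caricature under that assumption, using made-up petal-length data; it is not the Penn team's hardware or code.

```python
# Toy analogue of self-tuning with local updates: a one-input unit learns to
# separate two flower classes by petal length via the delta rule.
# Data are invented; the real system does this in analog electronics.
import random

random.seed(0)
data = [(random.gauss(1.5, 0.3), 0) for _ in range(50)] + \
       [(random.gauss(4.5, 0.5), 1) for _ in range(50)]   # (petal cm, class)

w, b, lr = 0.0, 0.0, 0.05
for _ in range(200):                            # passes over the data
    for x, target in data:
        out = 1 if w * x + b > 0 else 0         # the unit's "free" response
        err = target - out                      # local mismatch signal
        w += lr * err * x                       # each parameter updates using
        b += lr * err                           # only locally available values

acc = sum((1 if w * x + b > 0 else 0) == t for x, t in data) / len(data)
print(f"training accuracy: {acc:.0%}")          # ~100% on this separable toy set
```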

More at “Simple electrical circuit learns on its own—with no help from a computer,” from @ScienceMagazine.

* Diane Ackerman

###

As we brace ourselves (and lest we doubt the big things can grow from humble beginnings like these), we might recall that it was on this date in 1958 that Texas Instruments (TI) demonstrated the first working integrated circuit (IC), which had been invented by Jack Kilby. Kilby created the device to prove that resistors and capacitors could exist on the same piece of semiconductor material. His circuit consisted of a sliver of germanium with five components linked by wires. It was Fairchild’s Robert Noyce, however, who filed for a patent within months of Kilby and who made the IC a commercially viable technology. Both men are credited as co-inventors of the IC. (Kilby won the Nobel Prize for his work in 2000; Noyce, who died in 1990, did not share.)

Kilby and his first IC (source)

“Learning never exhausts the mind”*…

As regular readers know, each year Tom Whitwell shares a list of the more intriguing things he’s learned over the year; happily, 2021 is no exception…

10% of US electricity is generated from old Russian nuclear warheads. [Geoff Brumfiel]

The entire global cosmetic Botox industry is supported by an annual production of just a few milligrams of botulinum toxin. Pure toxin would cost ~$100 trillion per kilogram. [Anthony Warner]

Wearing noise cancelling headphones in an open-plan office helps a little bit — reducing cognitive errors by 14% — but actual silence reduces those errors by one third. [Benjamin Müller & co]

Until 1873, Japanese hours varied by season. There were six hours between sunrise and sunset, so a daylight hour in summer was 1/3rd longer than an hour in winter. [Sara J. Schechner]
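The seasonal-hour arithmetic is easy to sanity-check. Daytime was divided into six equal units, so the length of an “hour” scaled with the length of the day. In the sketch below, the daylight spans are assumed values picked to reproduce the quoted one-third difference; the real spread depends on latitude and time of year.

```python
def temporal_hour_minutes(daylight_hours: float) -> float:
    """Length of one Japanese temporal hour, given the day's span of daylight."""
    return daylight_hours * 60 / 6      # daytime split into six equal 'hours'

# Assumed daylight spans (in modern hours); illustrative, not historical data.
summer = temporal_hour_minutes(14.0)    # -> 140 minutes per daytime hour
winter = temporal_hour_minutes(10.5)    # -> 105 minutes per daytime hour
print(f"summer/winter ratio: {summer / winter:.2f}")   # 1.33, a third longer
```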

48 other fascinating finds at: “52 things I learned in 2021,” from @TomWhitwell.

* Leonardo da Vinci

###

As we live and learn, we might recall that it was on this date in 1545, in response to the Protestant Reformation, that the Council of Trent (Concilium Tridentinum) was convened by the Roman Catholic Church. Its work concluded in 1563, and its results were published in 1564, condemning what the Catholic Church deemed to be the heresies of Protestants. The embodiment of the Counter-Reformation, the Council of Trent drew a firm and permanent line between the Catholic and Protestant practices of faith.

Council of Trent (painting in the Museo del Palazzo del Buonconsiglio, Trento)

source

“Nothing from nothing ever yet was born”*…

Lacy M. Johnson argues that there is no hierarchy in the web of life…

… Humans have been lumbering around the planet for only a half million years, the only species young and arrogant enough to name ourselves sapiens in genus Homo. We share a common ancestor with gorillas and whales and sea squirts, marine invertebrates that swim freely in their larval phase before attaching to rocks or shells and later eating their own brain. The kingdom Animalia, in which we reside, is an offshoot of the domain Eukarya, which includes every life-form on Earth with a nucleus—humans and sea squirts, fungi, plants, and slime molds that are ancient by comparison with us—and all these relations occupy the slenderest tendril of a vast and astonishing web that pulsates all around us and beyond our comprehension.

The most recent taxonomies—those based on genetic evidence that evolution is not a single lineage, but multiple lineages, not a branch that culminates in a species at its distant tip, but a network of convergences—have moved away from their histories as trees and chains and ladders. Instead, they now look more like sprawling, networked webs that trace the many points of relation back to ever more ancient origins, beyond our knowledge or capacity for knowing, in pursuit of the “universal ancestors,” life-forms that came before metabolism, before self-replication—the several-billion-year-old plasmodial blobs from which all life on Earth evolved. We haven’t found evidence for them yet, but we know what we’re looking for: they would be simple, small, and strange.

Slime molds can enter stasis at any stage in their life cycle—as an amoeba, as a plasmodium, as a spore— whenever their environment or the climate does not suit their preferences or needs. The only other species who have this ability are the so-called “living fossils” such as tardigrades and Notostraca (commonly known as water bears and tadpole shrimp, respectively). The ability to become dormant until conditions are more favorable for life might be one of the reasons slime mold has survived as long as it has, through dozens of geologic periods, countless ice ages, and the extinction events that have repeatedly wiped out nearly all life on Earth.

Slime mold might not have evolved much in the past two billion years, but it has learned a few things during that time. In laboratory environments, researchers have cut Physarum polycephalum into pieces and found that it can fuse back together within two minutes. Or, each piece can go off and live separate lives, learn new things, and return later to fuse together, and in the fusing, each individual can teach the other what it knows, and can learn from it in return.

Though, in truth, “individual” is not the right word to use here, because “individuality”—a concept so central to so many humans’ identities—doesn’t apply to the slime mold worldview. A single cell might look to us like a coherent whole, but that cell can divide itself into countless spores, creating countless possible cycles of amoeba to plasmodium to aethalia, which in turn will divide and repeat the cycle again. It can choose to “fruit” or not, to reproduce sexually or asexually or not at all, challenging every traditional concept of “species,” the most basic and fundamental unit of our flawed and imprecise understanding of the biological world. As a consequence, we have no way of knowing whether slime molds, as a broad class of beings, are stable or whether climate change threatens their survival, as it does our own. Without a way to count their population as a species, we can’t measure whether they are endangered or thriving. Should individuals that produce similar fruiting bodies be considered a species? What if two separate slime molds do not mate but share genetic material? The very idea of separateness seems antithetical to slime mold existence. It has so much to teach us…
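The essay's contrast between tree-shaped and web-shaped taxonomies maps neatly onto data structures: a tree grants each lineage exactly one parent, while the networks favored by newer, convergence-aware taxonomies allow a lineage to have several, making them directed graphs. A toy sketch with invented taxon names:

```python
# Trees vs. phylogenetic networks: a node with two parents breaks the tree
# property but is legal in a network (e.g., convergence or gene transfer).
# All taxon names here are made up for illustration.
parents = {
    "lineage_A": ["ancestor"],
    "lineage_B": ["ancestor"],
    "hybrid_lineage": ["lineage_A", "lineage_B"],   # two parents: not a tree
}

def is_tree(parent_map: dict[str, list[str]]) -> bool:
    """A tree allows at most one parent per node; a network may allow more."""
    return all(len(ps) <= 1 for ps in parent_map.values())

print(is_tree(parents))   # False: this lineage web is a network, not a tree
```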

More at: “What Slime Knows,” from @lacymjohnson in @Orion_Magazine.

See also, “Slime Molds Remember — but Do They Learn?” (from whence the image above) and “Now, in addition to penicillin, we can credit mold with elegant design.”

* Lucretius, On the Nature of Things

###

As we contemplate kinship, we might send insightful birthday greetings to Johann Hedwig; he was born on this date in 1730. A botanist noted for his study of mosses, he is considered “the father of bryology” (the study of mosses… cousins of mold).

source

Written by (Roughly) Daily

December 8, 2021 at 1:00 am
