(Roughly) Daily

Posts Tagged ‘memory’

“The clustering of technological innovation in time and space helps explain both the uneven growth among nations and the rise and decline of hegemonic powers”*…

As scholars like Robert Gordon and Tyler Cowen have begun to call out a slowing of progress and growth in the U.S., others are beginning to wonder if “innovation clusters” like Silicon Valley are still advantageous. For example, Brian J. Asquith…

In 2011, the economist Tyler Cowen published The Great Stagnation, a short treatise with a provocative hypothesis. Cowen challenged his audience to look beyond the gleam of the internet and personal computing, arguing that these innovations masked a more troubling reality. Cowen contended that, since the 1970s, there has been a marked stagnation in critical economic indicators: median family income, total factor productivity growth, and average annual GDP growth have all plateaued…

In the years since the publication of the Great Stagnation hypothesis, others have stepped forward to offer support for this theory. Robert Gordon’s 2016 The Rise and Fall of American Growth chronicles in engrossing detail the beginnings of the Second Industrial Revolution in the United States, starting around 1870, the acceleration of growth spanning the 1920–70 period, and then a general slowdown and stagnation since about 1970. Gordon’s key finding is that, while the growth rate of average total factor productivity from 1920 to 1970 was 1.9 percent, it was just 0.6 percent from 1970 to 2014, where 1970 represents a secular trend break for reasons still not entirely understood. Cowen’s and Gordon’s insights have since been further corroborated by numerous research papers. Research productivity across a variety of measures (researchers per paper, R&D spending needed to maintain existing growth rates, etc.) has been on the decline across the developed world. Languishing productivity growth extends beyond research-intensive industries. In sectors such as construction, the value added per worker was 40 percent lower in 2020 than it was in 1970. The trend is mirrored in firm productivity growth, where a small number of superstar firms see exceptionally strong growth and the rest of the distribution increasingly lags behind.
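To get a feel for what that gap compounds to, here is a quick back-of-the-envelope sketch (the two rates are Gordon’s; the arithmetic, and the Python, are ours):

```python
# Compound Gordon's two TFP growth rates over the 1970-2014 window to see
# how large the shortfall becomes. Illustrative arithmetic only.
years = 2014 - 1970

actual = 1.006 ** years          # 0.6% per year, the post-1970 trend
counterfactual = 1.019 ** years  # 1.9% per year, the 1920-70 trend

print(f"Actual TFP multiple since 1970: {actual:.2f}x")
print(f"At the 1920-70 rate instead:    {counterfactual:.2f}x")
print(f"Implied shortfall: {counterfactual / actual - 1:.0%}")
```

By that crude reckoning, had the earlier trend held, total factor productivity would be roughly three-quarters higher than it actually is.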

A 2020 article by Nicholas Bloom and three coauthors in the American Economic Review cut right to the chase by asking, “Are Ideas Getting Harder to Find?,” and answered its own question in the affirmative. Depending on the data source, the authors find that while the number of researchers has grown sharply, output per researcher has declined sharply, leading aggregate research productivity to decline by 5 percent per year.
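The mechanics behind that number are worth a sketch: in the paper’s accounting, growth equals research productivity times the effective number of researchers, so flat growth plus a swelling research workforce forces productivity per researcher down. A stylized illustration (the rates below are our assumptions, not the paper’s data):

```python
# Stylized version of the Bloom et al. accounting identity:
#     TFP growth = research productivity x effective number of researchers
# With flat aggregate growth and a researcher pool compounding at 5%/yr
# (illustrative assumptions, not the paper's data), productivity per
# researcher must fall at roughly that same rate.
growth = 0.02                 # flat aggregate TFP growth
researcher_growth = 0.05      # researchers compounding at 5%/yr

researchers = 1.0
previous = None
for year in range(5):
    productivity = growth / researchers
    if previous is not None:
        print(f"year {year}: productivity change {productivity / previous - 1:+.1%}")
    previous = productivity
    researchers *= 1 + researcher_growth
```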

This stagnation should elicit greater surprise and concern because it persists despite advanced economies adhering to the established economics prescription intended to boost growth and innovation rates: (1) promote mass higher education, (2) identify particularly bright young people via standardized testing and direct them to research-intensive universities, and (3) pipe basic research grants through the university system to foster locally-driven research and development networks that supercharge productivity…

… the tech cluster phenomenon stands out because there is a fundamental discrepancy between how the clusters function in practice versus their theoretical contributions to greater growth rates. The emergence of tech clusters has been celebrated by many leading economists because of a range of findings that innovative people become more productive (by various metrics) when they work in the same location as other talented people in the same field. In this telling, the essence of innovation can be boiled down to three things: co-location, co-location, co-location. No other urban form seems to facilitate innovation like a cluster of interconnected researchers and firms.

This line of reasoning yields a straightforward syllogism: technology clusters enhance individual innovation and productivity. The local nature of innovation notwithstanding, technologies developed within these clusters can be adopted and enjoyed globally. Thus, while not everyone can live in a tech cluster, individuals worldwide benefit from new advances and innovations generated there, and some of the outsized economic gains the clusters produce can then be redistributed to people outside of the clusters to smooth over any lingering inequalities. Therefore, any policy that weakens these tech clusters leads to a diminished rate of innovation and leaves humanity as a whole poorer.

Yet the fact that the emergence of the tech clusters has also coincided with Cowen’s Great Stagnation raises certain questions. Are there shortcomings in the empirical evidence on the effects of the tech clusters? Does technology really diffuse across the rest of the economy as many economists assume? Do the tech clusters inherently prioritize welfare-enhancing technologies? Is there some role for federal or state action to improve the situation? Clusters are not unique to the postwar period: Detroit famously achieved a large agglomeration economy based on automobiles in the early twentieth century, and several authors have drawn parallels between the ascents of Detroit and Silicon Valley. What makes today’s tech clusters distinct from past ones? The fact that the tech clusters have not yielded the same society-enhancing benefits that they once promised should invite further scrutiny…

How could this be? What can we do about it? Eminently worth reading in full: “Superstars or Black Holes: Are Tech Clusters Causing Stagnation?” (possible soft paywall), from @basquith827.

See also: Brad DeLong, on comments from Eric Schmidt: “That an externality market failure is partly counterbalanced and offset by a behavioral-irrationality-herd-mania cognitive failure is a fact about the world. But it does not mean that we should not be thinking and working very hard to build a better system—or that those who profit mightily from herd mania on the part of others should feel good about themselves.”

* Robert Gilpin

###

As we contemplate co-location, we might recall that it was on this date in 1956 that a denizen of one of America’s leading tech/innovation hubs, Jay Forrester at MIT [see here and here], was awarded a patent for his coincident current magnetic core memory (Patent No. 2,736,880). Forrester’s invention, a “multicoordinate digital information storage device,” became the standard memory device for digital computers until supplanted by solid state (semiconductor) RAM in the mid-1970s.
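The “coincident current” in the patent’s title is the clever bit: each core sits at the crossing of an X and a Y drive line, each carrying half the current needed to flip it, so only the single core where the two half-currents coincide switches state. A toy sketch of the selection principle (a simplification of the idea, not Forrester’s circuit):

```python
# Toy model of coincident-current core selection: a core flips only if the
# combined X + Y drive current exceeds its threshold, so pulsing one X line
# and one Y line at half-current each selects exactly one core in the grid.
SIZE = 4
HALF_CURRENT = 0.5
THRESHOLD = 0.8   # between one half-current and two (arbitrary units)

cores = [[0] * SIZE for _ in range(SIZE)]   # all bits start at 0

def write_one(x, y):
    """Pulse X line x and Y line y; only core (x, y) sees full current."""
    for i in range(SIZE):
        for j in range(SIZE):
            current = (HALF_CURRENT if i == x else 0.0) + \
                      (HALF_CURRENT if j == y else 0.0)
            if current > THRESHOLD:
                cores[i][j] = 1   # magnetization flips to "1"

write_one(1, 2)
for row in cores:
    print(row)   # a lone 1 at row 1, column 2; every other core untouched
```

(Reading a core flipped it too, so real core memories had to rewrite each bit after every read – the famous “destructive read.”)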


“The control of large numbers is possible, and like unto that of small numbers, if we subdivide them”*…

It’s always been intuitively obvious that we handle small numbers more easily than large ones. But the discovery that the brain has different systems for representing small and large numbers provokes new questions about memory, attention, and mathematics…

More than 150 years ago, the economist and philosopher William Stanley Jevons discovered something curious about the number 4. While musing about how the mind conceives of numbers, he tossed a handful of black beans into a cardboard box. Then, after a fleeting glance, he guessed how many there were, before counting them to record the true value. After more than 1,000 trials, he saw a clear pattern. When there were four or fewer beans in the box, he always guessed the right number. But for five beans or more, his quick estimations were often incorrect.

Jevons’ description of his self-experiment, published in Nature in 1871, set the “foundation of how we think about numbers,” said Steven Piantadosi, a professor of psychology and neuroscience at the University of California, Berkeley. It sparked a long-lasting and ongoing debate about why there seems to be a limit on the number of items we can accurately judge to be present in a set.

Now, a new study in Nature Human Behaviour has edged closer to an answer by taking an unprecedented look at how human brain cells fire when presented with certain quantities. Its findings suggest that the brain uses a combination of two mechanisms to judge how many objects it sees. One estimates quantities. The second sharpens the accuracy of those estimates — but only for small numbers…

Although the new study does not end the debate, the findings start to untangle the biological basis for how the brain judges quantities, which could inform bigger questions about memory, attention and even mathematics…
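The two-mechanism account is easy to caricature in a few lines of code: a noisy estimator whose error grows with the true count, plus a sharpening mechanism that cancels the noise at four items or fewer. A toy model (our sketch of the idea, not the study’s actual model):

```python
import random

# Toy two-mechanism model: estimation noise scales with the true count
# (Weber-style), but a second mechanism zeroes it out for small sets --
# the subitizing range Jevons stumbled on with his beans. The noise level
# and cutoff below are assumptions for illustration.
WEBER_FRACTION = 0.15
SUBITIZING_LIMIT = 4

def perceived_count(n: int) -> int:
    noise = random.gauss(0, WEBER_FRACTION * n)  # error grows with n
    if n <= SUBITIZING_LIMIT:
        noise = 0                                # sharpened: exact for small n
    return max(0, round(n + noise))

for n in [2, 4, 7, 15]:
    guesses = [perceived_count(n) for _ in range(1000)]
    accuracy = sum(g == n for g in guesses) / len(guesses)
    print(f"true count {n:2d}: exact-guess rate {accuracy:.0%}")
```

Run it and you get Jevons’ pattern: perfect up to four, increasingly error-prone beyond.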

One, two, three, four… and more: “Why the Human Brain Perceives Small Numbers Better,” from @QuantaMagazine.

* Sun Tzu

###

As we stew over scale, we might spare a thought for a man untroubled by larger (and more complicated) numbers, Émile Picard; he died on this date in 1941. A mathematician whose theories did much to advance research into analysis, algebraic geometry, and mechanics, he made his most important contributions in the field of analysis and analytic geometry. He used methods of successive approximation to show the existence of solutions of ordinary differential equations. Picard also applied analysis to the study of elasticity, heat, and electricity. He and Henri Poincaré have been described as the two most distinguished French mathematicians of their time.

Indeed, Picard was elected the fifteenth member to occupy seat 1 of the Académie française in 1924.
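That “method of successive approximation” survives as the Picard iteration taught in differential-equations courses: start from a guess, integrate, and repeat, and the iterates converge to the solution. A minimal numerical sketch for y' = y, y(0) = 1, whose exact solution is e^t:

```python
import math

# Picard iteration for y' = y, y(0) = 1: each pass sets
#     y_next(t) = 1 + integral from 0 to t of y_prev(s) ds,
# approximated here with a left Riemann sum on a grid over [0, 1].
def picard_step(prev, dt):
    y_next, integral = [], 0.0
    for value in prev:
        y_next.append(1.0 + integral)  # y(0) = 1 plus the running integral
        integral += value * dt
    return y_next

N = 1000
dt = 1.0 / N
y = [1.0] * (N + 1)   # initial guess: the constant function y(t) = 1

for _ in range(8):    # each iteration adds another Taylor term of e^t
    y = picard_step(y, dt)

print(f"Picard estimate of e: {y[-1]:.3f} (exact: {math.e:.3f})")
```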


Written by (Roughly) Daily

December 11, 2023 at 1:00 am

“Memory is more indelible than ink”*…

Solomon V. Shereshevsky, 1896-1958. This photo is a frame grab from a 2007 documentary film produced for Russian television, Zagadky pamyati.

At least for some of us it is– for instance, Solomon Shereshevsky, a Soviet journalist and mnemonist, widely regarded as the “man with the greatest memory ever” (and the subject of neuropsychologist Alexander Luria‘s 1968 case study The Mind of a Mnemonist). From Wikipedia…

He met Luria after an anecdotal event in which he was scolded for not taking any notes while attending a work meeting in the mid-1920s. To the astonishment of everyone there (and to his own astonishment in realizing that others could apparently not do so), he could recall the speech word for word. Throughout his life, Shereshevsky was tasked with memorizing complex mathematical formulas, huge matrices, and even poems in foreign languages that he had never spoken before, all of which he would memorize with meticulous accuracy in a matter of minutes.

On the basis of his studies, Luria diagnosed in Shereshevsky an extremely strong version of synaesthesia, fivefold synaesthesia, in which the stimulation of one of his senses produced a reaction in every other. For example, if Shereshevsky heard a musical tone played he would immediately see a colour, touch would trigger a taste sensation, and so on for each of the senses…

His memory was so powerful that he could still recall decades-old events and experiences in perfect minute detail. After he discovered his own abilities, he performed as a mnemonist; but this created confusion in his mind. He went as far as writing things down on paper and burning it, so that he could see the words in cinders, in a desperate attempt to forget them. Some later mnemonists have speculated that this could have been a mentalist’s technique for writing things down to later commit to long-term memory…

Unforgettable: “Solomon Shereshevsky,” from @Wikipedia.

* Anita Loos

###

As we muse on memory, we might recall that it was on this date in 1900 that The Wonderful Wizard of Oz was published. Written by L. Frank Baum and illustrated by W. W. Denslow, it was an immediate hit, spawning a flood of further editions (soon known simply as The Wizard of Oz), stage adaptations, and of course the classic 1939 live-action film. It had sold three million copies by the time it entered the public domain in 1956.


“I will buckle down to work as soon as I finish reading the Internet”*…

From Aldobrandino da Siena’s Le Régime du corps (1265-70 CE)

Worried that technology is “breaking your brain”? As Joe Stadolnik explains, fears about attention spans and focus are as old as writing itself…

If you suspect that 21st-century technology has broken your brain, it will be reassuring to know that attention spans have never been what they used to be. Even the ancient Roman philosopher Seneca the Younger was worried about new technologies degrading his ability to focus. Sometime during the 1st century CE, he complained that ‘The multitude of books is a distraction’. This concern reappeared again and again over the next millennia. By the 12th century, the Chinese philosopher Zhu Xi saw himself living in a new age of distraction thanks to the technology of print: ‘The reason people today read sloppily is that there are a great many printed texts.’ And in 14th-century Italy, the scholar and poet Petrarch made even stronger claims about the effects of accumulating books:

Believe me, this is not nourishing the mind with literature, but killing and burying it with the weight of things or, perhaps, tormenting it until, frenzied by so many matters, this mind can no longer taste anything, but stares longingly at everything, like Tantalus thirsting in the midst of water.

Technological advances would make things only worse. A torrent of printed texts inspired the Renaissance scholar Erasmus to complain of feeling mobbed by ‘swarms of new books’, while the French theologian Jean Calvin wrote of readers wandering into a ‘confused forest’ of print. That easy and constant redirection from one book to another was feared to be fundamentally changing how the mind worked. Apparently, the modern mind – whether metaphorically undernourished, harassed or disoriented – has been in no position to do any serious thinking for a long time.

In the 21st century, digital technologies are inflaming the same old anxieties… and inspiring some new metaphors…

Same as it ever was– a history of the anxieties about attention and memory that new communications technologies have occasioned through history: “We’ve always been distracted,” from @joestadolnik in @aeonmag.

* Stewart Brand @stewartbrand

###

As we learn our way into new media, we might recall that it was on this date in 1946 that the first Washington, D.C. – New York City telecast was accomplished, using AT&T’s coaxial cable; General Dwight Eisenhower was seen to place a wreath at the base of the statue in the Lincoln Memorial, and others made brief speeches. The event was judged a success by engineers, although Time magazine called it “as blurred as an early Chaplin movie.”

1946 television

Written by (Roughly) Daily

February 18, 2023 at 1:00 am

“Right now I’m having amnesia and déjà vu at the same time. I think I’ve forgotten this before.”*…

Woodcut illustrations from Anianus’ Compotus cum commento (ca. 1492), an adaptation of Bede’s computus system.

Before humans stored memories as zeroes and ones, we turned to digital devices of another kind — preserving knowledge on the surface of fingers and palms. Kensy Cooperrider leads us through a millennium of “hand mnemonics” and the variety of techniques practiced by Buddhist monks, Latin linguists, and Renaissance musicians for remembering what might otherwise elude the mind…

In the beginning, the hand was just a hand — or so we can imagine. It was a workaday organ, albeit a versatile one: a tool for grasping, holding, throwing, and hefting. Then, at some point, after millions of years, it took on other duties. It became an instrument of mental, not just menial, labor. As a species, our systems of understanding, belief, and myth had grown more elaborate, more cognitively overwhelming. And so we started to put those systems out into the world: to tally, track, and record by carving notches into bone, tying knots in string, spreading pigment on cave walls, and aligning rocks with celestial bodies. Hands abetted these early mental labors, of course, but they would later become more than mere accessories. Beginning roughly twelve hundred years ago, we started using the hand itself as a portable repository of knowledge, a place to store whatever tended to slip our mental grasp. The topography of the palm and fingers became invisibly inscribed with information of all kinds — tenets and dates, names and sounds. The hand proved versatile in a new way, as an all-purpose memory machine.

The arts of memory are well known, but the role of the hand in these arts is often overlooked. In the twentieth century, beginning with the pioneering work of Frances Yates, Western scholars started to piece together a rich tradition of mnemonic practices that originated in antiquity and later took hold in Europe. The most celebrated of these is the “memory palace” [see here]. Using this technique, skilled practitioners can memorize vast collections of facts by nesting them in familiar places (or “loci”): the chambers of a building or along a well-known route. (To make these places more memorable, a bizarre image is often added to each one, the more jarring the better.) It is an odd omission that hand mnemonics are rarely mentioned alongside memory palaces. Both techniques are powerful and broadly attested. Both are adaptable, able to accommodate whatever type of information one wants to remember. And both work by similar principles, pinning to-be-remembered items to familiar locations.

The two traditions do have important differences. Memory palaces exist solely in the imagination; hand mnemonics exist half in the mind and half in the flesh. Another difference lies in their intended use. Memory palaces are idiosyncratic in nature, tailored to the quirks of personal experience and association, and used for private purposes; they are very much the province of an individual. Hand mnemonics, by contrast, are the province of a community, a tool for collective understanding. They offer a way of fixing and transmitting a shared system of knowledge. They serve private purposes, certainly — such as contemplation, in the case of the Mogao mnemonic, or calculation, in the case of Bede’s computus. But they also have powerful social functions in teaching, ritual, and communication…
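The shared principle – pinning items to a fixed, ordered set of familiar locations – is simple enough to sketch as a data structure. Below, the loci are the twelve finger segments of one hand, a layout used in some finger-counting traditions (the locus names and items are our invention, purely for illustration):

```python
# A minimal loci sketch: a fixed, ordered set of "places" (here, twelve
# finger segments) to which items are pinned, then recalled by walking
# the places in order. The locus names and items are invented examples.
LOCI = [f"{finger} {segment}"
        for finger in ("index", "middle", "ring", "little")
        for segment in ("base", "middle", "tip")]

items_to_remember = ["tenet one", "tenet two", "tenet three"]

palace = dict(zip(LOCI, items_to_remember))   # pin each item to a locus

for locus in LOCI[:len(items_to_remember)]:   # recall by walking the hand
    print(f"{locus}: {palace[locus]}")
```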

The five-fingered memory machine: “Handy Mnemonics,” from @kensycoop in @PublicDomainRev.

* Steven Wright

###

As we give it (to) the finger, we might recall an occasion for counting that required no fingers at all: on this date in 2015, a baseball game between the Chicago White Sox and the Baltimore Orioles at Camden Yards set the all-time low attendance mark for Major League Baseball. Zero (0) fans were in attendance, because the stadium was closed to the public due to the 2015 Baltimore protests (over the death of Freddie Gray while in police custody).
