(Roughly) Daily

Posts Tagged ‘memory’

“Right now I’m having amnesia and déjà vu at the same time. I think I’ve forgotten this before.”*…

The author, far left, as a very young child

Our first three years are usually a blur, and we don’t remember much before age seven. Kristin Ohlson wondered why…

… Freud argued that we repress our earliest memories because of sexual trauma but, until the 1980s, most researchers assumed that we retained no memories of early childhood because we created no memories – that events took place and passed without leaving a lasting imprint on our baby brains. Then in 1987, a study by the Emory University psychologist Robyn Fivush and her colleagues dispelled that misconception for good, showing that children who were just 2.5 years old could describe events from as far as six months into their past.

But what happens to those memories? Most of us assume that we can’t recall them as adults because they’re just too far back in our past to tug into the present, but this is not the case. We lose them when we’re still children…

To form long-term memories, an array of biological and psychological stars must align, and most children lack the machinery for this alignment. The raw material of memory – the sights, sounds, smells, tastes, and tactile sensations of our life experiences – arrives and registers across the cerebral cortex, the seat of cognition. For these to become memory, they must undergo bundling in the hippocampus, a brain structure named for its supposed resemblance to a sea horse, located under the cerebral cortex. The hippocampus not only bundles multiple inputs from our senses together into a single new memory, it also links these sights, sounds, smells, tastes, and tactile sensations to similar ones already stored in the brain. But some parts of the hippocampus aren’t fully developed until we’re adolescents, making it hard for a child’s brain to complete this process.

‘So much has to happen biologically to store a memory,’ the psychologist Patricia Bauer of Emory University told me. There’s ‘a race to get it stabilised and consolidated before you forget it. It’s like making Jell-O: you mix the stuff up, you put it in a mould, and you put it in the refrigerator to set, but your mould has a tiny hole in it. You just hope your Jell-O – your memory – gets set before it leaks out through that tiny hole.’

In addition, young children have a tenuous grip on chronology. They are years from mastering clocks and calendars, and thus have a hard time nailing an event to a specific time and place. They also don’t have the vocabulary to describe an event, and without that vocabulary, they can’t create the kind of causal narrative that is at the root of a solid memory. And they don’t have a greatly elaborated sense of self, which would encourage them to hoard and reconsider chunks of experience as part of a growing life-narrative.

Frail as they are, children’s memories are then susceptible to a process called shredding. In our early years, we create a storm of new neurons in a part of the hippocampus called the dentate gyrus and continue to form them throughout the rest of our lives, although not at nearly the same rate. A recent study by the neuroscientists Paul Frankland and Sheena Josselyn of the Hospital for Sick Children in Toronto suggests that this process, called neurogenesis, can actually create forgetting by disrupting the circuits for existing memories.

Our memories can become distorted by other people’s memories of the same event or by new information, especially when that new information is very similar to information already in storage. For instance, you meet someone and remember their name, but later meet a second person with a similar name and become confused about the name of the first person. We can also lose our memories when the synapses that connect neurons decay from disuse. ‘If you never use that memory, those synapses can be recruited for something different,’ Bauer told me.

Memories are less vulnerable to shredding and disruptions as the child grows up. Most of the solid memories that we carry into the rest of our lives are formed during what’s called ‘the reminiscence bump’, from ages 15 to 30, when we invest a lot of energy in examining everything to try to figure out who we are. The events, culture and people of that time remain with us and can even overshadow the features of our ageing present, according to Bauer. The movies were the best back then, and so was the music, and the fashion, and the political leaders, and the friendships, and the romances. And so on…

Why we remember so little from our youngest years: “The great forgetting,” from @kristinohlson in @aeonmag.

* Steven Wright

###

As we stroll down memory lane, we might spare a thought for Benjamin McLane Spock; he died on this date in 1998. The first pediatrician to study psychoanalysis to try to understand children’s needs and family dynamics, he collected his findings in a 1946 book, The Common Sense Book of Baby and Child Care, which was criticized in some academic circles as being too reliant on anecdotal evidence, and in some conservative circles for promoting (what Norman Vincent Peale and others called) “permissiveness” by parents. Despite that push-back, it became one of the best-selling volumes in history: by the time of Spock’s death it had sold over 50 million copies in 40 languages.


source

“The clustering of technological innovation in time and space helps explain both the uneven growth among nations and the rise and decline of hegemonic powers”*…

As scholars like Robert Gordon and Tyler Cowen have begun to call out a slowing of progress and growth in the U.S., others are beginning to wonder if “innovation clusters” like Silicon Valley are still advantageous. For example, Brian J. Asquith…

In 2011, the economist Tyler Cowen published The Great Stagnation, a short treatise with a provocative hypothesis. Cowen challenged his audience to look beyond the gleam of the internet and personal computing, arguing that these innovations masked a more troubling reality. Cowen contended that, since the 1970s, there has been a marked stagnation in critical economic indicators: median family income, total factor productivity growth, and average annual GDP growth have all plateaued…

In the years since the publication of the Great Stagnation hypothesis, others have stepped forward to offer support for this theory. Robert Gordon’s 2017 The Rise and Fall of American Growth chronicles in engrossing detail the beginnings of the Second Industrial Revolution in the United States, starting around 1870, the acceleration of growth spanning the 1920–70 period, and then a general slowdown and stagnation since about 1970. Gordon’s key finding is that, while the growth rate of average total factor productivity from 1920 to 1970 was 1.9 percent, it was just 0.6 percent from 1970 to 2014, where 1970 represents a secular trend break for reasons still not entirely understood. Cowen’s and Gordon’s insights have since been further corroborated by numerous research papers. Research productivity across a variety of measures (researchers per paper, R&D spending needed to maintain existing growth rates, etc.) has been on the decline across the developed world. Languishing productivity growth extends beyond research-intensive industries. In sectors such as construction, the value added per worker was 40 percent lower in 2020 than it was in 1970. The trend is mirrored in firm productivity growth, where a small number of superstar firms see exceptionally strong growth and the rest of the distribution increasingly lags behind.

A 2020 article by Nicholas Bloom and three coauthors in the American Economic Review cut right to the chase by asking, “Are Ideas Getting Harder to Find?,” and answered its own question in the affirmative. Depending on the data source, the authors find that while the number of researchers has grown sharply, output per researcher has declined sharply, leading aggregate research productivity to decline by 5 percent per year.
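To make that accounting concrete, here is a rough sketch, in Python, of what “research productivity” means in this framing. The growth rates and researcher counts below are illustrative placeholders, not Bloom et al.’s data; the point is only that roughly flat idea output divided by compounding research effort yields a steadily falling ratio.

```python
# Toy illustration of declining research productivity in the spirit of
# Bloom et al. (2020). All numbers are made-up placeholders: productivity is
# measured as idea output (here, a flat TFP growth rate) divided by effective
# research effort (here, an index of researcher counts).

def research_productivity(tfp_growth: float, researchers: float) -> float:
    """Ideas produced per unit of research effort."""
    return tfp_growth / researchers

tfp_growth = 0.01        # assume ~1% idea/TFP growth per year, held constant
researchers = 100.0      # arbitrary index of research effort in 1970

prev = None
for year in range(1970, 2021, 10):
    rp = research_productivity(tfp_growth, researchers)
    note = "" if prev is None else f" ({100 * (rp / prev - 1):+.1f}% vs. a decade earlier)"
    print(f"{year}: research productivity index = {rp:.6f}{note}")
    prev = rp
    researchers *= 1.045 ** 10   # research effort compounds ~4.5% per year

# With flat idea output and compounding effort, productivity per researcher
# falls at roughly the rate the effort grows -- the qualitative point of
# "Are Ideas Getting Harder to Find?"
```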

This stagnation should elicit greater surprise and concern because it persists despite advanced economies adhering to the established economics prescription intended to boost growth and innovation rates: (1) promote mass higher education, (2) identify particularly bright young people via standardized testing and direct them to research-intensive universities, and (3) pipe basic research grants through the university system to foster locally-driven research and development networks that supercharge productivity…

… the tech cluster phenomenon stands out because there is a fundamental discrepancy between how the clusters function in practice versus their theoretical contributions to greater growth rates. The emergence of tech clusters has been celebrated by many leading economists because of a range of findings that innovative people become more productive (by various metrics) when they work in the same location as other talented people in the same field. In this telling, the essence of innovation can be boiled down to three things: co-location, co-location, co-location. No other urban form seems to facilitate innovation like a cluster of interconnected researchers and firms.

This line of reasoning yields a straightforward syllogism: technology clusters enhance individual innovation and productivity. The local nature of innovation notwithstanding, technologies developed within these clusters can be adopted and enjoyed globally. Thus, while not everyone can live in a tech cluster, individuals worldwide benefit from new advances and innovations generated there, and some of the outsized economic gains the clusters produce can then be redistributed to people outside of the clusters to smooth over any lingering inequalities. Therefore, any policy that weakens these tech clusters leads to a diminished rate of innovation and leaves humanity as a whole poorer.

Yet the fact that the emergence of the tech clusters has also coincided with Cowen’s Great Stagnation raises certain questions. Are there shortcomings in the empirical evidence on the effects of the tech clusters? Does technology really diffuse across the rest of the economy as many economists assume? Do the tech clusters inherently prioritize welfare-enhancing technologies? Is there some role for federal or state action to improve the situation? Clusters are not unique to the postwar period: Detroit famously achieved a large agglomeration economy based on automobiles in the early twentieth century, and several authors have drawn parallels between the ascents of Detroit and Silicon Valley. What makes today’s tech clusters distinct from past ones? The fact that the tech clusters have not yielded the same society-enhancing benefits that they once promised should invite further scrutiny…

How could this be? What can we do about it? Eminently worth reading in full: “Superstars or Black Holes: Are Tech Clusters Causing Stagnation?” (possible soft paywall), from @basquith827.

See also: Brad DeLong, on comments from Eric Schmidt: “That an externality market failure is partly counterbalanced and offset by a behavioral-irrationality-herd-mania cognitive failure is a fact about the world. But it does not mean that we should not be thinking and working very hard to build a better system—or that those who profit mightily from herd mania on the part of others should feel good about themselves.”

* Robert Gilpin

###

As we contemplate co-location, we might recall that it was on this date in 1956 that a denizen of one of America’s leading tech/innovation hubs, Jay Forrester at MIT [see here and here], was awarded a patent for his coincident current magnetic core memory (Patent No. 2,736,880). Forrester’s invention, a “multicoordinate digital information storage device,” became the standard memory device for digital computers until supplanted by solid state (semiconductor) RAM in the mid-1970s.
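A toy model can make the “coincident current” idea concrete: each ferrite core sits at the crossing of an X drive line and a Y drive line, and only the core that receives both half-select currents crosses its switching threshold. The Python sketch below is purely illustrative; it ignores the sense and inhibit wiring and the destructive-read/rewrite cycle of a real core plane.

```python
# Toy model of coincident-current addressing, the principle behind Forrester's
# "multicoordinate" core memory: a core flips only when it receives the full
# coincident current, i.e. half-select currents on BOTH its X and Y lines.

SIZE = 4
plane = [[0] * SIZE for _ in range(SIZE)]   # 4x4 plane of cores, all storing 0

def write(x: int, y: int, bit: int) -> None:
    """Drive half-select current on X line x and Y line y; only core (x, y)
    sees the full coincident current and switches state."""
    for i in range(SIZE):
        for j in range(SIZE):
            half_selects = (i == x) + (j == y)   # 0, 1, or 2 half-currents
            if half_selects == 2:                # full select: this core flips
                plane[i][j] = bit
            # cores with 0 or 1 half-select stay below threshold: unchanged

write(1, 2, 1)
write(3, 0, 1)
for row in plane:
    print(row)   # only cores (1, 2) and (3, 0) now hold a 1
```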

source

“The control of large numbers is possible, and like unto that of small numbers, if we subdivide them”*…

It’s always been intuitively obvious that we handle small numbers more easily than large ones. But the discovery that the brain has different systems for representing small and large numbers provokes new questions about memory, attention, and mathematics…

More than 150 years ago, the economist and philosopher William Stanley Jevons discovered something curious about the number 4. While musing about how the mind conceives of numbers, he tossed a handful of black beans into a cardboard box. Then, after a fleeting glance, he guessed how many there were, before counting them to record the true value. After more than 1,000 trials, he saw a clear pattern. When there were four or fewer beans in the box, he always guessed the right number. But for five beans or more, his quick estimations were often incorrect.

Jevons’ description of his self-experiment, published in Nature in 1871, set the “foundation of how we think about numbers,” said Steven Piantadosi, a professor of psychology and neuroscience at the University of California, Berkeley. It sparked a long-lasting and ongoing debate about why there seems to be a limit on the number of items we can accurately judge to be present in a set.

Now, a new study in Nature Human Behaviour has edged closer to an answer by taking an unprecedented look at how human brain cells fire when presented with certain quantities. Its findings suggest that the brain uses a combination of two mechanisms to judge how many objects it sees. One estimates quantities. The second sharpens the accuracy of those estimates — but only for small numbers…
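A toy simulation in the spirit of Jevons’ beans makes the two-mechanism picture concrete: exact counting for small sets, proportionally noisy estimation beyond. The subitizing limit of 4 and the 15 percent noise level below are illustrative assumptions, not estimates from the new study.

```python
# Toy "guesser" that is exact for small sets (a stand-in for the precise,
# subitizing-like mechanism) and noisily approximate for larger ones (the
# Weber-like estimation mechanism). Parameters are illustrative assumptions.
import random

SUBITIZING_LIMIT = 4    # assumed boundary between the two mechanisms
WEBER_FRACTION = 0.15   # assumed proportional noise for larger quantities

def guess(true_count: int) -> int:
    if true_count <= SUBITIZING_LIMIT:
        return true_count                           # exact for small sets
    noisy = random.gauss(true_count, WEBER_FRACTION * true_count)
    return max(1, round(noisy))                     # noisy estimate otherwise

random.seed(0)
trials = 1000
for n in range(1, 11):
    correct = sum(guess(n) == n for _ in range(trials))
    print(f"{n:2d} beans: {100 * correct / trials:5.1f}% of guesses exact")
# Accuracy is perfect through 4 and falls off beyond it, mirroring the pattern
# Jevons reported in 1871.
```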

Although the new study does not end the debate, the findings start to untangle the biological basis for how the brain judges quantities, which could inform bigger questions about memory, attention and even mathematics…

One, two, three, four… and more: “Why the Human Brain Perceives Small Numbers Better,” from @QuantaMagazine.

* Sun Tzu

###

As we stew over scale, we might spare a thought for a man untroubled by larger (and more complicated) numbers, Émile Picard; he died on this date in 1941. A mathematician whose theories did much to advance research into analysis, algebraic geometry, and mechanics, he made his most important contributions in the field of analysis and analytic geometry. He used methods of successive approximation to show the existence of solutions of ordinary differential equations. Picard also applied analysis to the study of elasticity, heat, and electricity. He and Henri Poincaré have been described as the most distinguished French mathematicians of their time.

Indeed, Picard was elected the fifteenth member to occupy seat 1 of the Académie française in 1924.
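That “method of successive approximation” lives on as the Picard iteration taught in differential equations courses: start from a constant guess and repeatedly substitute it into the integral form of the equation. Here is a minimal numerical sketch for the toy problem y′ = y with y(0) = 1, whose exact solution is e^x:

```python
# Minimal numerical sketch of Picard iteration for y' = f(x, y), y(0) = y0,
# with f(x, y) = y and y0 = 1. Each iterate is
#     y_{n+1}(x) = y0 + integral_0^x f(t, y_n(t)) dt,
# evaluated on a grid with the trapezoidal rule. Illustrative only.
import math

def picard_iterates(f, y0, xs, n_iter):
    """Return successive Picard approximations sampled on the grid xs."""
    ys = [y0 for _ in xs]              # y_0(x) = y0: the constant initial guess
    history = [ys]
    for _ in range(n_iter):
        new = [y0]
        for k in range(1, len(xs)):
            h = xs[k] - xs[k - 1]
            # accumulate the integral of f(t, y_n(t)) with the trapezoidal rule
            new.append(new[-1] + 0.5 * h * (f(xs[k - 1], ys[k - 1]) + f(xs[k], ys[k])))
        ys = new
        history.append(ys)
    return history

xs = [i / 100 for i in range(101)]     # grid on [0, 1]
hist = picard_iterates(lambda x, y: y, 1.0, xs, n_iter=6)
for n, ys in enumerate(hist):
    print(f"iterate {n}: y(1) ~ {ys[-1]:.6f}")
print(f"exact     : e    = {math.e:.6f}")   # the iterates close in on e = y(1)
```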

source

Written by (Roughly) Daily

December 11, 2023 at 1:00 am

“Memory is more indelible than ink”*…

Solomon V. Shereshevsky (1896-1958). This photo is a frame grab from a 2007 documentary film produced for Russian television, Zagadky pamyati (source)

At least for some of us it is– for instance, Solomon Shereshevsky, a Soviet journalist and mnemonist, widely regarded as the “man with the greatest memory ever” (and the subject of neuropsychologist Alexander Luria’s 1968 case study The Mind of a Mnemonist). From Wikipedia…

He met Luria after an anecdotal event in which he was scolded for not taking any notes while attending a work meeting in the mid-1920s. To the astonishment of everyone there (and to his own astonishment in realizing that others could apparently not do so), he could recall the speech word for word. Throughout his life, Shereshevsky was tasked with memorizing complex mathematical formulas, huge matrices, and even poems in foreign languages that he had never spoken before, all of which he would memorize with meticulous accuracy in a matter of minutes.

On the basis of his studies, Luria diagnosed in Shereshevsky an extremely strong version of synaesthesia, fivefold synaesthesia, in which the stimulation of one of his senses produced a reaction in every other. For example, if Shereshevsky heard a musical tone played he would immediately see a colour, touch would trigger a taste sensation, and so on for each of the senses…

His memory was so powerful that he could still recall decades-old events and experiences in perfect minute detail. After he discovered his own abilities, he performed as a mnemonist; but this created confusion in his mind. He went as far as writing things down on paper and burning it, so that he could see the words in cinders, in a desperate attempt to forget them. Some later mnemonists have speculated that this could have been a mentalist’s technique for writing things down to later commit to long-term memory…

Unforgettable: “Solomon Shereshevsky,” from @Wikipedia.

* Anita Loos

###

As we muse on memory, we might recall that it was on this date in 1900 that The Wonderful Wizard of Oz was published. Written by L. Frank Baum and illustrated by W. W. Denslow, it was an immediate hit, spawning a flow of further editions (soon known simply as The Wizard of Oz), stage adaptations, and of course the classic 1939 live-action film. It had sold three million copies by the time it entered the public domain in 1956.

source

“I will buckle down to work as soon as I finish reading the Internet”*…

From Aldobrandino da Siena’s Le Régime du corps (1265-70 CE)

Worried that technology is “breaking your brain”? As Joe Stadolnik explains, fears about attention spans and focus are as old as writing itself…

If you suspect that 21st-century technology has broken your brain, it will be reassuring to know that attention spans have never been what they used to be. Even the ancient Roman philosopher Seneca the Younger was worried about new technologies degrading his ability to focus. Sometime during the 1st century CE, he complained that ‘The multitude of books is a distraction’. This concern reappeared again and again over the next millennia. By the 12th century, the Chinese philosopher Zhu Xi saw himself living in a new age of distraction thanks to the technology of print: ‘The reason people today read sloppily is that there are a great many printed texts.’ And in 14th-century Italy, the scholar and poet Petrarch made even stronger claims about the effects of accumulating books:

Believe me, this is not nourishing the mind with literature, but killing and burying it with the weight of things or, perhaps, tormenting it until, frenzied by so many matters, this mind can no longer taste anything, but stares longingly at everything, like Tantalus thirsting in the midst of water.

Technological advances would make things only worse. A torrent of printed texts inspired the Renaissance scholar Erasmus to complain of feeling mobbed by ‘swarms of new books’, while the French theologian Jean Calvin wrote of readers wandering into a ‘confused forest’ of print. That easy and constant redirection from one book to another was feared to be fundamentally changing how the mind worked. Apparently, the modern mind – whether metaphorically undernourished, harassed or disoriented – has been in no position to do any serious thinking for a long time.

In the 21st century, digital technologies are inflaming the same old anxieties… and inspiring some new metaphors…

Same as it ever was– a history of the anxieties about attention and memory that new communications technologies have occasioned: “We’ve always been distracted,” from @joestadolnik in @aeonmag.

* Stewart Brand @stewartbrand

###

As we learn our way into new media, we might recall that it was on this date in 1946 that the first Washington, D.C. – New York City telecast was accomplished, using AT&T corporation’s coaxial cable; General Dwight Eisenhower was seen to place a wreath at the base of the statue in the Lincoln Memorial and others made brief speeches. The event was judged a success by engineers, although Time magazine called it “as blurred as an early Chaplin movie.”

1946 television (source)

Written by (Roughly) Daily

February 18, 2023 at 1:00 am