(Roughly) Daily


“Poetry might be defined as the clear expression of mixed feelings”*…

Can artificial intelligence have those feelings? Scientist and poet Keith Holyoak explores:

… Artificial intelligence (AI) is in the process of changing the world and its societies in ways no one can fully predict. On the hazier side of the present horizon, there may come a tipping point at which AI surpasses the general intelligence of humans. (In various specific domains, notably mathematical calculation, the intersection point was passed decades ago.) Many people anticipate this technological moment, dubbed the Singularity, as a kind of Second Coming — though whether of a savior or of Yeats’s rough beast is less clear. Perhaps by constructing an artificial human, computer scientists will finally realize Mary Shelley’s vision.

Of all the actual and potential consequences of AI, surely the least significant is that AI programs are beginning to write poetry. But that effort happens to be the AI application most relevant to our theme. And in a certain sense, poetry may serve as a kind of canary in the coal mine — an early indicator of the extent to which AI promises (threatens?) to challenge humans as artistic creators. If AI can be a poet, what other previously human-only roles will it slip into?…

A provocative consideration: “Can AI Write Authentic Poetry?” from @mitpress.

Apposite: a fascinating Twitter thread on “why the GPT-3 algorithm’s proficiency at producing fluent, correct-seeming prose is an exciting opportunity for improving how we teach writing, how students learn to write, and how this can also benefit profs who assign writing, but don’t necessarily teach it.”

* W. H. Auden


As we ruminate on rhymes, we might send thoughtful birthday greetings to Michael Gazzaniga; he was born on this date in 1939. A leading researcher in cognitive neuroscience (the study of the neural basis of mind), his work has focused on how the brain enables humans to perform those advanced mental functions that are generally associated with what we call “the mind.” Gazzaniga has made significant contributions to the emerging understanding of how the brain facilitates such higher cognitive functions as remembering, speaking, interpreting, and making judgments.


Written by (Roughly) Daily

December 12, 2022 at 1:00 am

“The brain is a wonderful organ; it starts working the moment you get up in the morning and does not stop until you get into the office”*…

For as long as humans have thought, humans have thought about thinking. George Cave on the power and the limits of the metaphors we’ve used to do that…

For thousands of years, humans have described their understanding of intelligence with engineering metaphors. In the 3rd century BCE, the invention of hydraulics popularized the model of fluid flow (“humours”) in the body. This lasted until the 1500s, supplanted by the invention of automata and the idea of humans as complex machines. From electrical and chemical metaphors in the 1700s to advances in communications a century later, each metaphor reflected the most advanced thinking of that era. Today is no different: we talk of brains that store, process and retrieve memories, mirroring the language of computers.

I’ve always believed metaphors to be helpful and productive in communicating unfamiliar concepts. But this fascinating history of cognitive science metaphors shows that flawed metaphors can take hold and limit the scope for alternative ideas. In the worst case, the EU spent 10 years and $1.3 billion building a model of the brain based on the incorrect belief that the brain functions like a computer…

Thinking about thinking, from @George_Cave in @the_prepared.

Apposite: “Finding Language in the Brain.”

* Robert Frost


As we cogitate on cognition, we might send carefully-computed birthday greetings to Grace Brewster Murray Hopper; she was born on this date in 1906.  A seminal computer scientist and Rear Admiral in the U.S. Navy, “Amazing Grace” (as she was known to many in her field) was one of the first programmers of the Harvard Mark I computer (in 1944), invented the first compiler for a computer programming language, and was one of the leaders in popularizing the concept of machine-independent programming languages– which led to the development of COBOL, one of the first high-level programming languages.

Hopper also (inadvertently) contributed one of the most ubiquitous metaphors in computer science: she found and documented the first computer “bug” (in 1947).

She has both a ship (the guided-missile destroyer USS Hopper) and a supercomputer (the Cray XE6 “Hopper” at NERSC) named in her honor.


Written by (Roughly) Daily

December 9, 2022 at 1:00 am

“Consciousness cannot be accounted for in physical terms. For consciousness is absolutely fundamental. It cannot be accounted for in terms of anything else.”*…

Representation of consciousness from the seventeenth century by Robert Fludd, an English Paracelsian physician (source)

… but that doesn’t mean that we won’t attempt to answer “the hard problem of consciousness.” Indeed, as Elizabeth Fernandez notes, some scientists are using Schrödinger’s own work to try…

Supercomputers can beat us at chess and perform more calculations per second than the human brain. But there are other tasks our brains perform routinely that computers simply cannot match — interpreting events and situations and using imagination, creativity, and problem-solving skills. Our brains are amazingly powerful computers, using not just neurons but the connections between the neurons to process and interpret information.

And then there is consciousness, neuroscience’s giant question mark. What causes it? How does it arise from a jumbled mass of neurons and synapses? After all, these may be enormously complex, but we are still talking about a wet bag of molecules and electrical impulses.

Some scientists suspect that quantum processes, including entanglement, might help us explain the brain’s enormous power, and its ability to generate consciousness. Recently, scientists at Trinity College Dublin, using a technique to test for quantum gravity, suggested that entanglement may be at work within our brains. If their results are confirmed, they could be a big step toward understanding how our brain, including consciousness, works… 

More on why maybe the brain isn’t “classical” after all: “Brain experiment suggests that consciousness relies on quantum entanglement,” from @SparkDialog in @bigthink.

For an orthogonal view: “Why we need to figure out a theory of consciousness.”

* Erwin Schrödinger


As we think about thinking, we might spare a thought for Alexius Meinong; he died on this date in 1920. A philosopher, he is known for his unique ontology and for contributions to the philosophy of mind and axiology– the theory of value.

Meinong’s ontology is notable for its belief in nonexistent objects. He distinguished several levels of reality among objects and facts about them: existent objects participate in actual (true) facts about the world; subsistent (real but non-existent) objects appear in possible (but false) facts; and objects that neither exist nor subsist can only belong to impossible facts. See his Gegenstandstheorie, or the Theory of Abstract Objects.


“Let there be bass”*…

Sometimes, it really is all about that bass…

A recent study in the journal Current Biology found that people danced 12% more when very low frequency bass was played.

The study was done by scientists at the LIVElab at McMaster University in Ontario, Canada, who wanted to see what musical ingredients make us want to dance.

“We look at things like what kinds of rhythms most pull people into that steady beat that we groove along with, and what kinds of interesting, syncopated, complex rhythms make us really drawn in and want to move more,” said Daniel Cameron, a neuroscientist and the lead author of the study.

Now, the lab for this experiment wasn’t the classic fluorescent lights, white coats and goggles setup. Instead, the LIVElab space was converted into an electronic dance music concert, and EDM duo Orphx performed live for volunteers adorned with headbands that had a motion capture sensor.

The lab was equipped with special speakers that can play a very low frequency bass, undetectable to the human ear. The set lasted about an hour; researchers switched that very low bass on every 2.5 minutes and found that the concertgoers moved more when the speakers were on – even though they couldn’t hear it.

“It’s the inner-ear structures that give us a sense of where our head is in space,” Cameron said. “That system is sensitive to low-frequency stimulation, especially if it’s loud.”

“We also know that our tactile system, that’s our sense of touch … is also sensitive to low-frequency stimulation, low-frequency sound.”…

“And that’s feeding into our motor system in the brain, the movement control system in our brain,” Cameron said. “So it’s adding a little bit of gain. It’s giving a little more energy … from that stimulation through those systems.”…

“What makes us dance? It really is all about that bass,” from @NPR.

For more on ultra-low frequency sounds and their effects, see “How low can you go?“; and lest we think this phenomenon restricted to humans, “Watch These Rats ‘Dance’ to the Rhythms of Mozart, Lady Gaga, and Queen.”

* Leo Fender


As we go low, we might recall that it was on this date in 1792, during George Washington’s first term as president, that the first edition of The Farmer’s Almanac was published.  (It became The Old Farmer’s Almanac in 1832 to distinguish itself from similarly-titled competitors.)  Still going strong, it is the oldest continuously-published periodical in the U.S.



Written by (Roughly) Daily

November 25, 2022 at 1:00 am

“I dream. Sometimes I think that’s the only right thing to do”*…

Why do we need art? And what does it have to do with dreaming? Neuroscientist and author Erik Hoel has a very provocative theory…

How will we spend the remaining 700,000 hours of the 21st century? In the metered time of our own discretion, there have never been more options for our personal entertainment, nor have they ever been more freely available. We find ourselves strolling the aisles of a vast sensorium. On the shelves is a trove of experiences: video games, movies, TV shows, virtual reality, books, podcasts, articles, social media posts, all prepackaged for our consumption. What had previously been accomplished for food through the centralized distribution of supermarkets has now been done with experience itself. The recent grand opening of this supersensorium has been mediated through the screen, a panoply of icons, images, links, downloads, and videos auto-playing, which we browse through entirely at our leisure.

Such abundance of choice would have been heralded as miraculous in any other age. What a rousing cry for progress that our lowly living rooms would have stupefied with their luxuries even the God-like pharaohs, even the court of Versailles! Or maybe not—for it all comes with a price. Who hasn’t lost days from binge-watching Netflix, or deep in the dungeons of some video game? Here’s a scary, or maybe heart-wrenching, thing to consider: of our waking leisure hours, what exactly is the amount of time devoted to the consumption of experiences from the supersensorium? In 2018, Nielsen reported that the average American spent eleven hours a day engaged with media. Does anyone believe that this number is going to decrease? For the technology that undergirds the supersensorium will only improve. The algorithms will grow more personalized, the experiences will become more salient, and the platforms will get faster in their delivery of content. And we should all admit that the vast majority of what lines the shelves of the supersensorium is merely entertainment, for otherwise we wouldn’t feel a gnawing guilt so great most of us avoid consciously calculating how our time is actually spent.

The infinite entertainment of the supersensorium is especially problematic if you happen to be someone who likes and maybe even produces art or fictions. E.g., a writer such as myself, who views the tidal wave of middling fictions with a feeling akin to terror. Not that these problems are entirely new. In a letter to a friend, a 31-year-old Tolstoy wrote:

I shall write no more fiction. It is shameful, when you come to think of it. People are weeping, dying, marrying, and I should sit down and write books telling “how she loved him”? It’s shameful!

If that was Tolstoy’s judgment of himself, what might his fiery judgment be of our now endless ways of telling “how she loved him”? The mere scale of the supersensorium pushes to the fore old questions about the purpose of art and fictions. Why do humans desire these petite narratives we gobble up like treats? What’s the origin of this pull toward artifice, a thing so powerful we might even call it an instinct? Is it virtue or vice? And if it can be a vice and technology is making it easier and easier to while away our lives this way, a reasonable person has to ask: why add to the supersensorium? Why take away from the real when the real is already back on its heels, and behind it, a cliff?…

It turns out, Hoel suggests, that the answers have everything to do with dreaming…

To explain the phenomenology of dreams I recently outlined a scientific theory called the Overfitted Brain Hypothesis (OBH). The OBH posits that dreams are an evolved mechanism to avoid a phenomenon called overfitting. Overfitting, a statistical concept, is when a neural network learns overly specifically, and therefore stops being generalizable. It learns too well. For instance, artificial neural networks have a training data set: the data that they learn from. All training sets are finite, and often the data comes from the same source and is highly correlated in some non-obvious way. Because of this, artificial neural networks are in constant danger of becoming overfitted. When a network becomes overfitted, it will be good at dealing with the training data set but will fail at data sets it hasn’t seen before. All learning is basically a tradeoff between specificity and generality in this manner. Real brains, in turn, rely on the training set of lived life. However, that set is limited in many ways, highly correlated in many ways. Life alone is not a sufficient training set for the brain, and relying solely on it likely leads to overfitting…
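The statistical point Hoel leans on here is easy to see in a few lines of code. The sketch below (an illustration of overfitting in general, not of Hoel’s model of the brain) fits a small, noisy training set two ways: an unconstrained learner that can thread every training point, and a constrained one that can only capture the trend. The training set, the underlying rule, and the degree choices are all invented for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Lived life": a small, noisy training set drawn from a simple underlying rule.
x_train = np.linspace(0.0, 1.0, 10)
y_train = 2.0 * x_train + rng.normal(0.0, 0.3, size=x_train.size)

# Points the learner has never seen, drawn from the same rule.
x_test = np.linspace(0.02, 0.98, 50)
y_test = 2.0 * x_test

def mse(coeffs, x, y):
    """Mean squared error of a fitted polynomial on the points (x, y)."""
    return float(np.mean((np.polyval(coeffs, x) - y) ** 2))

# A degree-9 polynomial can pass through every one of the 10 training points:
# it has "learned too well," memorizing the noise along with the trend.
overfit = np.polyfit(x_train, y_train, deg=9)

# A constrained learner (a straight line) can only capture the general trend.
general = np.polyfit(x_train, y_train, deg=1)

train_err = mse(overfit, x_train, y_train)  # essentially zero on the training set
test_err = mse(overfit, x_test, y_test)     # much larger away from the training points
```

The overfitted model’s near-perfect training error and inflated test error are the gap the OBH says dreams exist to fight: techniques like noise injection and data augmentation play, in artificial networks, the role Hoel assigns to dreaming.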

What the OBH suggests is that dreams represent the biological version of a combination of such techniques, a form of augmentation or regularization that occurs after the day’s learning—but the point is not to enforce the day’s memories, but rather combat the detrimental effects of their memorization. Dreams warp and play with always-ossifying cognitive and perceptual categories, stress-testing and refining. The inner fabulist shakes up the categories of the plastic brain. The fight against overfitting every night creates a cyclical process of annealing: during wake the brain fits to its environment via learning, then, during sleep, the brain “heats up” through dreams that prevent it from clinging to suboptimal solutions and models and incorrect associations.

The OBH fits with the evidence from human sleep research: sleep seems to be associated not so much with assisting pure memorization, as other hypotheses about dreams would posit, but with an increase in abstraction and generalization. There’s also the famous connection between dreams and creativity, which also fits with the OBH. Additionally, if you stay awake too long you will begin to hallucinate (perhaps because your perceptual processes are becoming overfitted). Most importantly, the OBH explains why dreams are so, well, dreamlike.

… and everything to do with the role that it plays in our lives– and in shaping the media and entertainment that we consume…

From an evolutionary perspective, it’s rather amazing humans are willing to spend so much time on fictions… Why are we so fascinated by things that never happened?

If the OBH is true, then it is very possible writers and artists, not to mention the entirety of the entertainment industry, are in the business of producing what are essentially consumable, portable, durable dreams. Literally. Novels, movies, TV shows—it is easy for us to suspend our disbelief because we are biologically programmed to surrender it when we sleep. I don’t think it’s a coincidence that a TV episode traditionally lasts about the same ~30 minutes in length as the average REM event, and movies last ~90 minutes, an entire sleep cycle (and remember, we dream sometimes in NREM too). They are dream substitutions.

This hypothesized connection explains why humans find the directed dreams we call “fictions” and “art” so attractive and also reveals their purpose: they are artificial means of accomplishing the same thing naturally occurring dreams do. Just like dreams, fictions and art keep us from overfitting our perception, models, and understanding of the world…

And as you’ll see if you read this piece in full, as I hope you will, the implication is that art– real art, good art– matters…

… as the supersensorium expands over more and more of our waking hours, the idea of an aesthetic spectrum, with art on one end and entertainment on the other, is defunct. In fact, explicitly promoting any difference between entertainment and art is considered a product of a bygone age, even a tool of oppression and elitism. At best, the distinction is an embarrassing form of noblesse oblige. One could give a long historical answer about how exactly we got into this cultural headspace, maybe starting with postmodernism and deconstructionism, then moving on to the problematization of the canon, or the saturation of pop culture in academia to feed the more and more degrees, we could trace the ideas, catalog the opinions of the cultural powerbrokers, we could focus on new media and technologies muscling for attention, or changing demographics and work forces and leisure time, or so many other things—but none of it matters. What matters is, now, as it stands, talking about art as being fundamentally different from entertainment brings charges of classism, snobbishness, elitism—of being proscriptive, boring, and stuffy.

And without a belief in some sort of lowbrow-highbrow spectrum of aesthetics, there is no corresponding justification of a spectrum of media consumption habits. Imagine two alien civilizations, both at roughly our own stage of civilization, both with humanity’s innate drive to consume artificial experiences and narratives. One is a culture that scoffs at the notion of art. The other is aesthetically sensitive and even judgmental. Which weathers the storm of the encroaching supersensorium, with its hyper-addictive superstimuli? When the eleven hours a day becomes thirteen, becomes fifteen? A belief in an aesthetic spectrum may be all that keeps a civilization from disappearing up its own brainstem.

In a world of infinite experience, it is the aesthete who is safest, not the ascetic. Abstinence will not work. The only cure for too much fiction is good fiction. Artful fictions are, by their very nature, rare and difficult to produce. In turn, their rarity justifies their existence and promotion. It’s difficult to overeat on caviar alone. Now, it’s important to note here that I don’t mean that art can’t be entertaining, nor that it’s restricted to a certain medium. But art always refuses to be easily assimilated into the supersensorium.

…only by upholding art can we champion the consumption of art. Which is so desperately needed because only art is the counterforce judo for entertainment’s stranglehold on our stone-age brains. And as the latter force gets stronger, we need the former more and more.

So in your own habits of consumption, hold on to art. It will deliver you through this century…

The neuroscientific case for art in the age of Netflix: “Exit the supersensorium,” from @erikphoel.

* Haruki Murakami


As we dream on, we might send birthday greetings to Konstantin Yuon; he was born on this date in 1875. A painter and theater designer, he was involved with Mir Iskusstva, the Russian magazine, and with the artistic movement it inspired and embodied, which was a major influence on the Russians who helped revolutionize European art during the first decade of the 20th century. Later, he co-founded the Union of Russian Artists and the Association of Artists of Revolutionary Russia.

New Planet, 1921


Self-portrait, 1912


Written by (Roughly) Daily

October 24, 2022 at 1:00 am
