“No problem can be solved from the same level of consciousness that created it”*…
Annaka Harris on the difficulty in understanding consciousness…
The central challenge to a science of consciousness is that we can never acquire direct evidence of consciousness apart from our own experience. When we look at all the organisms (or collections of matter) in the universe and ask ourselves, “Which of these collections of matter contain conscious experiences?” in the broadest sense, the answer has to be “some” or “all”—the only thing we have direct evidence to support is that the answer isn’t “none,” as we know that at least our own conscious experiences exist.
Until we attain a significantly more advanced understanding of the brain, and of many other systems in nature for that matter, we’re forced to begin with one of two assumptions: either consciousness arises at some point in the physical world, or it is a fundamental part of the physical world (some, or all). And the sciences have thus far led with the assumption that the answer is “some” (and so have I, for most of my career) for understandable reasons. But I would argue that the grounds for this starting assumption have become weaker as we learn more about the brain and the role consciousness plays in behavior.
The problem is that what we deem to be conscious processes in nature is based solely on reportability. And at the very least, the work with split-brain and locked-in patients should have radically shifted our reliance on reportability at this point…
The realization that all of our scientific investigations of consciousness are unwittingly rooted in a blind assumption led me to pose two questions that I think are essential for a science of consciousness to keep asking:
- Can we find conclusive evidence of consciousness from outside a system?
- Is consciousness causal? (Is it doing something? Is it driving any behavior?)
The truth is that we have less and less reason to respond “yes” to either question with any confidence. And if the answer to these questions is in fact “no,” which is entirely possible, we’ll be forced to reconsider our jumping-off point. Personally, I’m still agnostic, putting the chances that consciousness is fundamental vs. emergent at more or less 50/50. But after focusing on this topic for more than twenty years, I’m beginning to think that assuming consciousness is fundamental is actually a slightly more coherent starting place…
“The Strong Assumption,” from @annakaharris.
See also: “How Do We Think Beyond Our Own Existence?“, from @annehelen.
* Albert Einstein
###
As we noodle on knowing, we might recall that it was on this date in 1987 that a patent (U.S. Patent No. 4,666,425) was awarded to Chet Fleming for a “Device for Perfusing an Animal Head”– a device for keeping a severed head alive.
That device, described as a “cabinet,” used a series of tubes to accomplish what a body does for most heads that are not “discorped”—that is, removed from their bodies. In the patent application, Fleming describes a series of tubes that would circulate blood and nutrients through the head and take deoxygenated blood away, essentially performing the duties of a living thing’s circulatory system. Fleming also suggested that the device might be used for grimmer purposes.
“If desired, waste products and other metabolites may be removed from the blood, and nutrients, therapeutic or experimental drugs, anti-coagulants and other substances may be added to the blood,” the patent reads.
Although obviously designed for research purposes, the patent does acknowledge that “it is possible that after this invention has been thoroughly tested on research animals, it might also be used on humans suffering from various terminal illnesses.”
Fleming, a trained lawyer with a reputation for eccentricity, wasn’t exactly joking; rather, he was worried that somebody would actually undertake this research. The patent was a “prophetic patent”—that is, a patent for something which has never been built and may never be built. It was likely intended to prevent others from trying to keep severed heads alive using that technology…
Smithsonian Magazine

“History repeats itself, in part because the genome repeats itself. And the genome repeats itself, in part because history does.”*…
The original Human Genome Project map of the human genome was largely based on the DNA of one mixed-race man from Buffalo, with inputs from a few dozen other individuals, mostly of European descent. Now, researchers have released draft results from an ongoing effort to capture the entirety of human genetic variation…
More than 20 years after the first draft genome from the landmark Human Genome Project was released, researchers have published a draft human ‘pangenome’ — a snapshot of what is poised to become a new reference for genetic research that captures more of human diversity than has been previously available. Geneticists have welcomed the milestone, while also highlighting key ethical considerations surrounding the effort to make genome research more inclusive…
The draft genome, published in Nature on 10 May, was produced by the Human Pangenome Reference Consortium. Launched in 2019, the international project aims to map the entirety of human genetic variation, to create a comprehensive reference against which geneticists will be able to compare other sequences. Such a reference would aid studies investigating potential links between genes and disease.
The draft pangenome follows the 2022 publication of the first complete sequence of the human genome, which filled gaps that had been left by the original Human Genome Project. But unlike the original draft human genome and its successor, both of which were derived mostly from the DNA of just one person, the draft pangenome represents a collection of sequences from a diverse selection of 47 people from around the globe, including individuals from Africa, the Americas, Asia and Europe…
More at “First human ‘pangenome’ aims to catalogue genetic diversity,” in @Nature.
See the paper on the Pangenome Project here; and for more background, “This new genome map tries to capture all human genetic variation.”
* Siddhartha Mukherjee, The Gene: An Intimate History
###
As we go wide on genetics, we might send microscopic birthday greetings to Christian Anfinsen; he was born on this date in 1916. A biochemist, he won the 1972 Nobel Prize for Chemistry for his research on the shape and primary structure of ribonuclease (the enzyme that hydrolyses RNA), in which he found that after denaturation its shape– and consequently its enzymatic power– could be restored, leading him to conclude that ribonuclease must retain all of the information about its configuration within its amino acids.
“Smells are the fallen angels of the senses”*…
In an excerpt from his new book, Where We Meet the World: The Story of the Senses, Ashley Ward contemplates the oft-ignored and much-maligned olfactory sense…
Despite the wonderful contributions that smell makes to our lives, it’s undervalued in modern Western societies. Polls conducted in both the US and the UK reported that of our five main senses, smell was the one that people were least concerned about losing, while a study of British teenagers found that half would rather be without their sense of smell than their phone.
It may have to do with olfaction’s checkered past. For much of human history, smells were things to be wary of. The idea that sickness was borne out of noxious smells was the prominent theory in disease propagation for centuries. Clouds of pungency, known as miasmas, released from unclean dwellings, filthy streets, and even the ploughing of soil, were blamed for contaminating the body, leading to any number of maladies. A debilitating fever emerging from marshes and swamps was named after the medieval Italian for bad air: mal’aria. Terrifying epidemics that haunted the world for centuries seemed to be induced by foul, corrupted air.
…
While odors themselves were regarded with distrust, it seems like every famous man in history who ever felt moved to write about our sense of smell had some derogatory point to make (there’s a notable shortage of opinions from the women of history). Most fall into one of two camps: those who regarded smell as relatively unimportant, and those who associated it with depravity. Plato considered that smell was linked to “base urges,” while others described it as degenerate and animalistic. Aristotle wrote that “man smells poorly” and Darwin asserted that “the sense of smell is of extremely slight service.”
…
The migration of primate eyes to the front of the face allows excellent stereoscopic vision, compared to the side-of-the-head arrangement favored by many other mammals, but limits the space available for the olfactory equipment. The loss of the snout in apes especially seems only to further restrict the capacity for smell.
Finally, primates in general and humans in particular seem to be losing genes associated with our sense of smell. We have something like 400 working olfactory genes, but sitting in our genetic code are close to 500 olfactory pseudogenes. These are the genetic equivalents of fossils; genes that used to contribute to our sense of smell but that no longer work. In other words, we’ve lost over half of our smell genes across evolutionary time…
Eminently worth reading in full: “How Smell—the Most Underrated Sense—Was Overpowered By Our Other Senses,” from @ashleyjwward in @lithub.
* Helen Keller
###
As we breathe in, we might recall that it was on this date in 1943 that Swiss chemist Albert Hofmann discovered the sensory-enhancing, even sensory-altering– that’s to say, psychedelic– properties of LSD. Hofmann had synthesized the drug five years earlier, but its hoped-for use in treating respiratory problems didn’t pan out, and it was shelved. On this day, he accidentally absorbed some of the drug through his skin (as he touched its container). He became dizzy with hallucinations. Three days later he took the first intentional dose of acid: 0.25 milligrams (250 micrograms), an amount he predicted to be a threshold dose (an actual threshold dose is 20 micrograms). Less than an hour later, Hofmann experienced sudden and intense changes in perception. He asked his laboratory assistant to escort him home and, as use of motor vehicles was prohibited because of wartime restrictions, they had to make the journey on a bicycle… which is why April 19 has been celebrated (since 1985) as “Bicycle Day.”

“I Got Rhythm”*…
… indeed, as Nina Kraus explains, we all do…
Why do we care about rhythm? It connects us to the world. It plays a role in listening, in language, in understanding speech in noisy places, in walking, and even in our feelings toward one another.
Rhythm is much more than a component of music. Nevertheless, music is probably what first comes to mind when we hear the word rhythm: drumming, jazz, rock and roll, marching bands, street performers with wooden spoons and five-gallon buckets, drum circles, time signatures, stomp-stomp-clap — we will, we will rock you — adventures on the dance floor, beatboxing, incantations, mantras, and prayers. Beyond music, we experience the rhythmic changes of the seasons. Some of us have menstrual cycles. We have circadian rhythms — daily cycles of mental and physical peaks and troughs. Frogs croak rhythmically to attract mates and change their rhythm to signal aggression. Tides, 17-year cicadas, lunar phases, perigees, and apogees are other naturally occurring rhythms. Human-made rhythms include the built world — street grids, traffic lights, crop fields, mowed designs in baseball diamond outfields, the backsplash behind the kitchen counter, spatial patterns in geometric visual art forms.
Maintaining rhythm is almost a biological imperative for some of us…
An exploration of the many ways in which temporal patterns play an important role in how we perceive — and connect with — the world: “The Extraordinary Ways Rhythm Shapes Our Lives,” at @mitpress.
And for an astounding application of rhythm, the story of Niko Tosa: “The Gambler Who Beat Roulette,” (with no electronic or other help). Via @nextdraft.
* George and Ira Gershwin (The piece [performed by Gershwin here] has become a jazz standard– and gave the world a chord progression– the “rhythm changes“– that provides the foundation for many other popular jazz tunes, including Charlie Parker’s and Dizzy Gillespie’s bebop standard “Anthropology (Thrivin’ on a Riff).”)
###
As we contemplate cadence, we might recall that it was on this date in 1965 that the Beatles recorded “Help!” The title song of their 1965 film and its soundtrack album, it was released as a single in July of that year, and was number one for three weeks in the United States and the United Kingdom.
The group recorded “Help!” in 12 takes using four-track equipment. The first nine takes concentrated on the instrumental backing. The descending lead guitar riff that precedes each verse proved to be difficult, so by take 4 it was decided to postpone it for an overdub. To guide that later overdub by George, John thumped the beat on his acoustic guitar body, which can be heard in the final stereo mix.
“I was a peripheral visionary. I could see the future, but only way off to the side.”*…

As Niels Bohr said, “prediction is hard, especially about the future.” Still, we can try…
While the future cannot be predicted with certainty, present understanding in various scientific fields allows for the prediction of some far-future events, if only in the broadest outline. These fields include astrophysics, which studies how planets and stars form, interact, and die; particle physics, which has revealed how matter behaves at the smallest scales; evolutionary biology, which studies how life evolves over time; plate tectonics, which shows how continents shift over millennia; and sociology, which examines how human societies and cultures evolve.
The far future begins after the current millennium comes to an end, starting with the 4th millennium in 3001 CE, and continues until the furthest reaches of future time. These timelines include alternative future events that address unresolved scientific questions, such as whether humans will become extinct, whether the Earth will survive when the Sun expands to become a red giant, and whether proton decay will be the eventual end of all matter in the Universe…
A new pole star, the end of Niagara Falls, the wearing away of the Canadian Rockies– and these are just highlights from the first 50-60 million years. Read on for an extraordinary outline of what current science suggests is in store over the long haul: “Timeline of the far future,” a remarkable Wikipedia page.
Related pages: List of future astronomical events, Far future in fiction, and Far future in religion.
* Steven Wright
###
As we take the long view, we might send grateful birthday greetings to the man who “wrote the book” on perspective (a capacity analogically handy in the endeavor featured above), Leon Battista Alberti; he was born on this date in 1404. The archetypical Renaissance humanist polymath, Alberti was an author, artist, architect, poet, priest, linguist, philosopher, cartographer, and cryptographer. He collaborated with Toscanelli on the maps used by Columbus on his first voyage, and he published the first book on cryptography that contained a frequency table.
But he is surely best remembered as the author of the first general treatise– De Pictura (1434)– on the laws of perspective, which built on and extended Brunelleschi’s work to describe the approach and technique that established the science of projective geometry… and fueled the progress of painting, sculpture, and architecture from the Greek- and Arabic-influenced formalism of the High Middle Ages to the more naturalistic (and Latinate) styles of the Renaissance.

