(Roughly) Daily


“Life is a DNA software system”*…

 

[Image: Stranger Visions portraits at the Wellcome Collection]

 

DNA from discarded cigarette butts and chewed up gum has been used to create a series of life-sized 3D printed portraits for a new exhibition at the Wellcome Collection.

American artist Heather Dewey-Hagborg walked the streets of New York picking up cigarettes and hair for her project called Stranger Visions.

She then analysed the DNA to work out the gender and ethnicity of the people involved, as well as their likely eye colour and other traits, including the size of their nose, before using face-generating software and a 3D printer to create a series of speculative portraits.

More– and more photos– at “3D printed portraits made with DNA from cigarette butts to feature in new Wellcome Collection display.”

See also the analogically-related “Artificial Intelligence Generates Humans’ Faces Based on Their Voices.”

* Craig Venter

###

As we noodle on nature and nurture, we might recall that it was on this date in 1981 that the USDA announced the first genetically-engineered vaccine for any animal or human disease: an immunization against Hoof and Mouth Disease (also known as Foot and Mouth Disease, or FMD), created using gene splicing.

FMD “Keep Out” sign [source]

 

“As my artist’s statement explains, my work is utterly incomprehensible and is therefore full of deep significance”*…

 

[Image: painting by Gustav Klimt]

 

The neuroscientist was in the art gallery and there were many things to learn. So Eric Kandel excitedly guided me through the bright lobby of the Neue Galerie New York, a museum of fin de siècle Austrian and German art, located in a Beaux-Arts mansion, across from Central Park. The Nobel laureate was dressed in a dark blue suit with white pinstripes and red bowtie. I was dressed, well, less elegantly.

Since winning a Nobel Prize in Physiology or Medicine in 2000, for uncovering the electrochemical mechanisms of memory, Kandel had been thinking about art. In 2012 and 2016, respectively, he published The Age of Insight and Reductionism in Art and Brain Science, both of which could be called This Is Your Brain on Art. The Age of Insight detailed the rise of neuroscience out of the medical culture that surrounded Sigmund Freud, and focused on Gustav Klimt and his artistic disciples Oskar Kokoschka and Egon Schiele, whose paintings mirrored the age’s brazen ideas about primal desires smoldering beneath conscious control.

I’d invited Kandel to meet me at the Neue Galerie because it was the premier American home of original works by Klimt, Kokoschka, and Schiele. It was 2014 when we met and I had long been reading about neuroaesthetics, a newish school in neuroscience, and a foundation of The Age of Insight, where brain computation was enlisted to explain why and what in art turned us on. I was anxious to hear Kandel expound on how neuroscience could enrich art, as he had written, though I also came with a handful of doubts…

Kevin Berger learns “what neuroscience is doing to art”: “Gustav Klimt in the Brain Lab.”

* Bill Watterson

###

As we think about thinking about it, we might spare a thought for Jacob Lawrence; he died on this date in 2000.  One of the best-respected 20th-century American painters, and one of the most well-known African-American artists, Lawrence described his style as “Dynamic Cubism.”  His works are in the permanent collections of numerous museums, including the Philadelphia Museum of Art, the Museum of Modern Art, the Whitney Museum, the Phillips Collection, the Metropolitan Museum of Art, the Brooklyn Museum, and the Reynolda House Museum of American Art.

He is perhaps best known for a 60-panel work, Migration Series (depicting the migration of rural southern African-Americans to the urban north), which he painted on cardboard.  The collection is now held by two museums: the odd-numbered paintings are on exhibit in the Phillips Collection in Washington, D.C., and the even-numbered are displayed at MOMA in New York.


The first panel of Migration Series [source]

Portrait of Jacob Lawrence [source]

 

Written by LW

June 9, 2019 at 1:01 am

“To sleep, perchance to dream”*…

 


 

On a typical workday morning, if you’re like most people, you don’t wake up naturally. Instead, the ring of an alarm clock probably jerks you out of sleep. Depending on when you went to bed, what day of the week it is, and how deeply you were sleeping, you may not understand where you are, or why there’s an infernal chiming sound. Then you throw out your arm and hit the snooze button, silencing the noise for at least a few moments. Just another couple of minutes, you think. Then maybe a few minutes more.

It may seem like you’re giving yourself a few extra minutes to collect your thoughts. But what you’re actually doing is making the wake-up process more difficult and drawn out…

Journalist (and professional poker player) Maria Konnikova on why “Snoozers are, in fact, losers.”

* Shakespeare, Hamlet

###

As we ruminate on rest, we might spare a thought for a man who seems barely to have slept at all, François-Marie Arouet, better known as Voltaire; he died on this date in 1778.  The Father of the Age of Reason, he produced works in almost every literary form: plays, poems, novels, essays, and historical and scientific works– more than 2,000 books and pamphlets (and more than 20,000 letters).  He popularized Isaac Newton’s work in France by arranging a translation of Principia Mathematica to which he added his own commentary.

A social reformer, Voltaire used satire to criticize the intolerance, religious dogma, and oligopolistic privilege of his day, perhaps nowhere more sardonically than in Candide.

 source

 

 

Written by LW

May 30, 2019 at 1:01 am

“The average scientist unequipped with the powerful lenses of philosophy, is a nearsighted creature, and cheerfully attacks each difficulty in the hope that it may prove to be the last”*…

 


 

There are decisive grounds for holding that we need to bring about a revolution in philosophy, a revolution in science, and then put the two together again to create a modern version of natural philosophy.

Once upon a time, it was not just that philosophy was a part of science; rather, science was a branch of philosophy. We need to remember that modern science began as natural philosophy – a development of philosophy, an admixture of philosophy and science. Today, we think of Galileo, Johannes Kepler, William Harvey, Robert Boyle, Christiaan Huygens, Robert Hooke, Edmond Halley and, of course, Isaac Newton as trailblazing scientists, while we think of Francis Bacon, René Descartes, Thomas Hobbes, John Locke, Baruch Spinoza and Gottfried Leibniz as philosophers. That division is, however, something we impose on the past. It is profoundly anachronistic…

Science broke away from metaphysics, from philosophy, as a result of natural philosophers adopting a profound misconception about the nature of science. As a result, natural philosophy died, the great divide between science and philosophy was born, and the decline of philosophy began.

It was Newton who inadvertently killed off natural philosophy with his claim, in the third edition of his Principia, to have derived his law of gravitation from the phenomena by induction…

Nicholas Maxwell argues that science and philosophy need to be re-joined, lest humanity seek knowledge at the expense of wisdom; only then, he suggests, can we hope to solve the urgent, fundamental problems that we face: “Natural philosophy redux.”

[Image above: source]

* Gilbert N. Lewis

###

As we seek ever-higher ground, we might recall that it was on this date in 1898 that the heirs of Alfred Nobel signed a “reconciliation agreement,” allowing his lawyers and accountants to execute his will.  The lion’s share of his estate was clearly marked for the establishment of the eponymous Prizes that are awarded each year.  But the residue, which was to be divided among his descendants, was the subject of much contention.


The first page of Nobel’s will [source]

 

“It’s tough to make predictions, especially about the future”*…

 


 

As astrophysicist Mario Livio recounts in Brilliant Blunders, in April 1900, the eminent physicist Lord Kelvin proclaimed that our understanding of the cosmos was complete except for two “clouds”—minor details still to be worked out. Those clouds had to do with radiation emissions and with the speed of light… and they pointed the way to two major revolutions in physics: quantum mechanics and the theory of relativity.  Prediction is hard; ironically, it’s especially hard for experts attempting foresight in their own fields…

The idea for the most important study ever conducted of expert predictions was sparked in 1984, at a meeting of a National Research Council committee on American-Soviet relations. The psychologist and political scientist Philip E. Tetlock was 30 years old, by far the most junior committee member. He listened intently as other members discussed Soviet intentions and American policies. Renowned experts delivered authoritative predictions, and Tetlock was struck by how many perfectly contradicted one another and were impervious to counterarguments.

Tetlock decided to put expert political and economic predictions to the test. With the Cold War in full swing, he collected forecasts from 284 highly educated experts who averaged more than 12 years of experience in their specialties. To ensure that the predictions were concrete, experts had to give specific probabilities of future events. Tetlock had to collect enough predictions that he could separate lucky and unlucky streaks from true skill. The project lasted 20 years, and comprised 82,361 probability estimates about the future.

The result: The experts were, by and large, horrific forecasters. Their areas of specialty, years of experience, and (for some) access to classified information made no difference. They were bad at short-term forecasting and bad at long-term forecasting. They were bad at forecasting in every domain. When experts declared that future events were impossible or nearly impossible, 15 percent of them occurred nonetheless. When they declared events to be a sure thing, more than one-quarter of them failed to transpire. As the Danish proverb warns, “It is difficult to make predictions, especially about the future.”…

One subgroup of scholars, however, did manage to see more of what was coming… they were not vested in a single discipline. They took from each argument and integrated apparently contradictory worldviews…

The integrators outperformed their colleagues in pretty much every way, but especially trounced them on long-term predictions. Eventually, Tetlock bestowed nicknames (borrowed from the philosopher Isaiah Berlin) on the experts he’d observed: The highly specialized hedgehogs knew “one big thing,” while the integrator foxes knew “many little things.”…
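
The excerpt doesn’t spell out how those 82,361 probability estimates were graded; the instrument Tetlock is best known for using is the Brier score, which averages the squared gap between each forecast probability and what actually happened (scored 0 or 1). Here is a minimal sketch in Python, with invented forecasts and outcomes (none of these numbers come from the study), showing how that kind of scoring punishes confident misses:

# Brier score: mean squared difference between forecast probability and outcome (0 or 1).
# Lower is better; always saying 0.5 scores 0.25, a perfect forecaster scores 0.0.
def brier_score(forecasts, outcomes):
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

# Invented example: an overconfident expert and a more cautious one, judged on the same five events.
outcomes  = [1, 0, 0, 1, 0]                # what actually happened
confident = [0.95, 0.90, 0.05, 0.20, 0.10]
cautious  = [0.70, 0.60, 0.30, 0.50, 0.40]

print("overconfident expert:", round(brier_score(confident, outcomes), 3))
print("cautious expert:     ", round(brier_score(cautious, outcomes), 3))

With these made-up numbers the cautious forecaster comes out ahead, the same pattern as the “sure things” above that failed to transpire.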

Credentialed authorities are comically bad at predicting the future. But reliable– at least more reliable– forecasting is possible: “The Peculiar Blindness of Experts.”

See Tetlock discuss his findings at a Long Now Seminar.  Read Berlin’s riff on Archilochus, “The Hedgehog and the Fox,” here.

* Yogi Berra

###

As we ponder prediction, we might send complicating birthday greetings to Edward Norton Lorenz; he was born on this date in 1917.  A mathematician who turned to meteorology during World War II, he established the theoretical basis of weather and climate predictability, as well as the basis for computer-aided atmospheric physics and meteorology.

But he is probably better remembered as the founder of modern chaos theory, a branch of mathematics focusing on the behavior of dynamical systems that are highly sensitive to initial conditions… and thus practically impossible to predict in detail with certainty.

In 1961, Lorenz was using a simple digital computer, a Royal McBee LGP-30, to simulate weather patterns by modeling 12 variables, representing things like temperature and wind speed. He wanted to see a sequence of data again, and to save time he started the simulation in the middle of its course. He did this by entering a printout of the data that corresponded to conditions in the middle of the original simulation. To his surprise, the weather that the machine began to predict was completely different from the previous calculation. The culprit: a rounded decimal number on the computer printout. The computer worked with 6-digit precision, but the printout rounded variables off to a 3-digit number, so a value like 0.506127 printed as 0.506. This difference is tiny, and the consensus at the time would have been that it should have no practical effect. However, Lorenz discovered that small changes in initial conditions produced large changes in long-term outcome. His work on the topic culminated in the publication of his 1963 paper “Deterministic Nonperiodic Flow” in Journal of the Atmospheric Sciences, and with it, the foundation of chaos theory…
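
The divergence Lorenz stumbled onto is easy to reproduce. What follows is a minimal sketch in Python of the idea, not a reconstruction of his 1961 run: it uses his later, famous three-variable 1963 system with its textbook parameters (not the 12-variable 1961 model), and the starting values are illustrative. Two runs that begin a rounding error apart quickly end up describing entirely different weather.

import numpy as np

# Lorenz's 1963 three-variable system, with his textbook parameter values.
def lorenz(s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

# Plain fourth-order Runge-Kutta integration.
def integrate(s, dt=0.01, steps=5000):
    out = [s]
    for _ in range(steps):
        k1 = lorenz(s)
        k2 = lorenz(s + 0.5 * dt * k1)
        k3 = lorenz(s + 0.5 * dt * k2)
        k4 = lorenz(s + dt * k3)
        s = s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        out.append(s)
    return np.array(out)

# Two starting points that differ the way Lorenz's printout did:
# full precision vs. the same value rounded to three decimal places.
a = integrate(np.array([1.0, 1.0, 1.000127]))
b = integrate(np.array([1.0, 1.0, 1.0]))

for t in (0, 1000, 2000, 3000, 4000, 5000):
    print(f"step {t:5d}   separation = {np.linalg.norm(a[t] - b[t]):10.6f}")

Run as written, the two trajectories begin about 10⁻⁴ apart and within a few thousand steps are separated by roughly the width of the attractor; more decimal places only postpone the divergence, they never prevent it.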

His description of the butterfly effect, the idea that small changes can have large consequences, followed in 1969.

Edward Norton Lorenz [source]

 

“If thou thou’st him some thrice, it will not be amiss”*…

 


 

‘You’ is the fourteenth most frequently used word in the English language, following closely behind its fellow pronouns ‘it’ at number eight and ‘I’ at number eleven:

The fact that you follows closely behind I in popularity is probably attributable to its being an eight-way word: both subject and object, both singular and plural, and both formal and familiar. The all-purpose second person is an unusual feature of English, as middle-schoolers realize when they start taking French, Spanish, or, especially, German, which offers a choice of seven different singular versions of you. It’s relatively new in our language. In early modern English, beginning in the late fifteenth century, thou, thee and thy were singular forms for the subjective, objective and possessive, and ye, you and your were plural. In the 1500s and 1600s, ye and then the thou/thee/thy forms faded away, to be replaced by the all-purpose you. But approaches to this second person were interesting in this period of flux. David Crystal writes in The Cambridge Encyclopedia of English that by Shakespeare’s time, you “was used by people of lower rank or status to those above them (such as ordinary people to nobles, children to parents, servants to masters, nobles to the monarch), and was also the standard way for the upper classes to talk to each other. … By contrast, thou/thee were used by people of higher rank to those beneath them, and by the lower classes to each other; also in elevated poetic style, in addressing God, and in talking to witches, ghosts and other supernatural beings.” The OED cites a 1675 quotation: “No Man will You God but will use the pronoun Thou to him.”

“Needless to say, this ambiguity and variability were gold in the hand of a writer like Shakespeare, and he played with it endlessly, sometimes having a character switch modes of address within a speech to indicate a change in attitude.” [see the title of this post, for example]…

More of this excerpt from Ben Yagoda’s When You Catch an Adjective, Kill It: The Parts of Speech, for Better and/or Worse at “You.”

[Via the ever-illuminating Delanceyplace.com]

* Sir Toby Belch to Sir Andrew Aguecheek, in Shakespeare’s Twelfth Night

###

As we muse on modes of address, we might send elegantly phrased and eclectic birthday greetings to Persian polymath Omar Khayyam; the philosopher, mathematician, astronomer, epigrammatist, and poet was born on this date in 1048. While he’s probably best known to English-speakers as a poet, via Edward FitzGerald’s famous translation of (what he called) the Rubaiyat of Omar Khayyam, FitzGerald’s attribution of the book’s poetry to Omar (as opposed to the aphorisms and other quotes in the volume) is now questioned by many scholars, who believe those verses to be by several different Persian authors.

In any case, Omar was unquestionably one of the major mathematicians and astronomers of the medieval period.  He is the author of one of the most important treatises on algebra written before modern times, the Treatise on Demonstration of Problems of Algebra, which includes a geometric method for solving cubic equations by intersecting a hyperbola with a circle.  His astronomical observations contributed to the reform of the Persian calendar.  And he made important contributions to mechanics, geography, mineralogy, music, climatology and Islamic theology.

 source

 

 

Written by LW

May 15, 2019 at 1:01 am

“When the world changes faster than species can adapt, many fall out”*…

 


 

(Roughly) Daily recently considered the newly-unearthed fossil record of the asteroid strike that led to the extinction of the dinosaurs.  But what if that asteroid had missed?

An asteroid slammed down and did away with all the dinosaurs, paving the way for such developments as the human race, capitalism, and posting on the internet: it’s the story we all know and love. Yet if things had shaken out differently—if the asteroid had stayed in its place, and the dinosaurs allowed to proceed with their business—what would things have looked like?

Would the earth be a pristine, unsmogged paradise, or would the dinosaurs have somehow evolved into even more rapacious profiteers/industrialists, wrecking the world with their dinosaur refineries and dinosaur dark money? The latter scenario being totally implausible, what’s a likely answer to the question of what our world would look like if that asteroid never hit it?…

Nine scientists– geologists, paleontologists, and evolutionary biologists– provide some fascinating “alternative history”: “What If the Asteroid Never Killed the Dinosaurs?”

* Elizabeth Kolbert, The Sixth Extinction: An Unnatural History

###

As we explore the road not taken, we might recall that it was on this date in 1869 that the American Museum of Natural History was incorporated.  Its founding had been urged in a letter, dated December 30, 1868, and sent to Andrew H. Green, Comptroller of Central Park, New York, signed by 19 persons, including Theodore Roosevelt, A.G. Phelps Dodge, and J. Pierpont Morgan.  They wrote: “A number of gentlemen having long desired that a great Museum of Natural History should be established in Central Park, and having now the opportunity of securing a rare and very valuable collection as a nucleus of such Museum, the undersigned wish to enquire if you are disposed to provide for its reception and development.”  Their suggestion was accepted by Park officials; the collections were purchased– and thus the great museum began.  It opened April 27, 1871.

 source

 
