(Roughly) Daily

Posts Tagged ‘Science’

“Neither the sun nor death can be looked at steadily”*…

 

All the old rites and superstitions that once warded off mystical evils have been condensed into one single command, so vast and monolithic we’ve forgotten that it’s even possible to disobey: Don’t look directly at the sun.

Not to look directly into the sun is (at a guess) one of the first lessons everyone is taught by their parents. As unquestioned ideological precepts go, it’s enormously effective. You learn it, you internalize it, and never really think of it again until you have kids of your own. And then you say it once more, repeating your parents’ words, and theirs, in an unbroken tradition going back God knows how many millennia. No, honey, never look directly into the sun…  But people do it. And our world is the better for it, because staring directly into the sun is our moral and political duty…

Question authority: “What happens when you stare at the sun.”

* François de La Rochefoucauld

###

As we put down the smoked glass, we might spare a thought for the creator of the object of another set of taboos, Harry Wesley Coover, Jr.; he died on this date in 2011.  A chemist working for Eastman Kodak, he accidentally discovered a substance first marketed as “Eastman 910,” now commonly known as Super Glue. Coover was a prolific inventor– he held 460 patents– but was proudest of the organizational system that he developed and oversaw at Kodak: “programmed innovation,” a management methodology emphasizing research and development, which resulted in the introduction of 320 new products and sales growth from $1.8 billion to $2.5 billion.  In 2004, he was inducted into the National Inventors Hall of Fame; in 2010, he received the National Medal of Technology and Innovation.

 source

 

Written by LW

March 26, 2017 at 1:01 am

“Now I am become Death, the destroyer of worlds”*…

 

Your correspondent is old enough to remember the Cold War and the Civil Defense efforts (booklets, films, duck-and-cover drills) aimed at “preparing” us for atomic conflict.  It’s a sad sign of our times that they’re re-emerging:  “Where to Hide If a Nuclear Bomb Goes Off In Your Area.”

(If there’s a silver lining in this fallout-laced cloud, it’s that it’s re-directing attention to a problem– a threat– that never actually went away; cf. Ploughshares.)

* J. Robert Oppenheimer

###

As we enter the Twilight Zone, we might recall that it was on this date in 1903 that The Times (London) newspaper reported that Marie and Pierre Curie communicated to the Academy of Sciences that the recently discovered Radium…

… possesses the extraordinary property of continuously emitting heat, without combustion, without chemical change of any kind, and without any change to its molecular structure, which remains spectroscopically identical after many months of continuous emission of heat … such that the pure Radium salt would melt more than its own weight of ice every hour … A small tube containing Radium, if kept in contact with the skin for some hours … produces an open sore, by destroying the epidermis and the true skin beneath … and cause the death of living things whose nerve centres do not lie deep enough to be shielded from their influence.
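The melting-ice figure checks out with back-of-the-envelope arithmetic. Here is a minimal Python sketch, assuming the roughly 100 calories per gram per hour that Curie and Laborde reported in 1903 (a historical measurement, not a modern value) and the standard latent heat of fusion of ice:

```python
# Back-of-the-envelope check of the 1903 melting-ice claim.
heat_output = 100.0     # cal released per gram of radium per hour (Curie & Laborde, 1903)
latent_heat_ice = 79.7  # cal required to melt one gram of ice at 0 °C

ice_melted = heat_output / latent_heat_ice  # grams of ice per gram of radium per hour
print(f"{ice_melted:.2f} g of ice melted per g of radium per hour")  # -> 1.25
```

More than its own weight of ice every hour, just as The Times reported.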

That same year the Curies (and Antoine Henri Becquerel) were awarded the Nobel Prize in Physics for their work on radioactivity and radiation.

Marie and Pierre Curie

source

 

Written by LW

March 25, 2017 at 1:01 am

“No problem can be solved from the same level of consciousness that created it”*…

 

Kick at the rock, Sam Johnson, break your bones:

But cloudy, cloudy is the stuff of stones.

– Richard Wilbur

Materialism holds the high ground these days in debates over that most ultimate of scientific questions: the nature of consciousness. When tackling the problem of mind and brain, many prominent researchers advocate for a universe fully reducible to matter. ‘Of course you are nothing but the activity of your neurons,’ they proclaim. That position seems reasonable and sober in light of neuroscience’s advances, with brilliant images of brains lighting up like Christmas trees while test subjects eat apples, watch movies or dream. And aren’t all the underlying physical laws already known?

From this seemingly hard-nosed vantage, the problem of consciousness seems to be just one of wiring, as the American physicist Michio Kaku argued in The Future of the Mind (2014). In the very public version of the debate over consciousness, those who advocate that understanding the mind might require something other than a ‘nothing but matter’ position are often painted as victims of wishful thinking, imprecise reasoning or, worst of all, an adherence to a mystical ‘woo.’

It’s hard not to feel the intuitional weight of today’s metaphysical sobriety. Like Pickett’s Charge up the hill at Gettysburg, who wants to argue with the superior position of those armed with ever more precise fMRIs, EEGs and the other material artefacts of the materialist position? There is, however, a significant weakness hiding in the imposing-looking materialist redoubt. It is as simple as it is undeniable: after more than a century of profound explorations into the subatomic world, our best theory for how matter behaves still tells us very little about what matter is. Materialists appeal to physics to explain the mind, but in modern physics the particles that make up a brain remain, in many ways, as mysterious as consciousness itself…

The closer you look, the more the materialist explanation of consciousness (and physics) appears to rest on shaky metaphysical ground: “Minding matter.”

Pair with the two parts of Tim Parks’ conversation with Riccardo Manzotti: “Am I the Apple?” and “The Mind in the Whirlwind.”

For dessert, “Atom, Archetype, and the Invention of Synchronicity: How Iconic Psychiatrist Carl Jung and Nobel-Winning Physicist Wolfgang Pauli Bridged Mind and Matter.”

* Albert Einstein, riffing on his friend Kurt Gödel

###

As we think about thinking, we might spare a thought for Frederick Winslow Taylor; he died on this date in 1915.  An engineer and inventor (42 patents), he’s best remembered as the father of “Scientific Management,” the discipline rooted in efficiency studies and standardization.  Quoth Peter Drucker:

Frederick W. Taylor was the first man in recorded history who deemed work deserving of systematic observation and study. On Taylor’s ‘scientific management’ rests, above all, the tremendous surge of affluence in the last seventy-five years which has lifted the working masses in the developed countries well above any level recorded before, even for the well-to-do. Taylor, though the Isaac Newton (or perhaps the Archimedes) of the science of work, laid only first foundations, however. Not much has been added to them since – even though he has been dead all of sixty years.

Taylor’s work encouraged many followers (including Frank “Cheaper by the Dozen” Gilbreth) and effectively spawned the field of management consulting.  But Taylor practiced what he preached, and found time to become a champion tennis player as well: he won the first doubles tournament (1881) at the U.S. National Championships, the precursor of the U.S. Open (with partner Clarence Clark).

source

“The rewards for biotechnology are tremendous – to solve disease, eliminate poverty, age gracefully. It sounds so much cooler than Facebook.”*…

 

It’s hard to tell precisely how big a role biotechnology plays in our economy, because it infiltrates so many parts of it. Genetically modified organisms such as microbes and plants now create medicine, food, fuel, and even fabrics. Recently, Robert Carlson, of the biotech firm Biodesic and the investment firm Bioeconomy Capital, decided to run the numbers and ended up with an eye-popping estimate. He concluded that in 2012, the last year for which good data are available, revenues from biotechnology in the United States alone were over $324 billion.

“If we talk about mining or several manufacturing sectors, biotech is bigger than those,” said Carlson. “I don’t think people appreciate that.”…

What makes the scope of biotech so staggering is not just its size, but its youth. Manufacturing first exploded in the Industrial Revolution of the 19th century. But biotech is only about 40 years old. It burst into existence thanks largely to a discovery made in the late 1960s by Hamilton Smith, a microbiologist then at Johns Hopkins University, and his colleagues, that a protein called a restriction enzyme can slice DNA. Once Smith showed the world how restriction enzymes work, other scientists began using them as tools to alter genes…
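As an aside, the specificity that made restriction enzymes so useful as tools is easy to illustrate: each enzyme cuts only at a short, fixed recognition sequence. A minimal sketch (EcoRI’s GAATTC site is used here as the textbook example– Smith’s own enzyme was HindII– and the function name is ours):

```python
# Model a restriction enzyme's specificity as a fixed recognition sequence.
def cut_sites(dna: str, recognition: str = "GAATTC") -> list[int]:
    """Return the 0-based positions where the recognition site occurs."""
    site_len = len(recognition)
    return [i for i in range(len(dna) - site_len + 1)
            if dna[i:i + site_len] == recognition]

print(cut_sites("ATGAATTCGGCTAGAATTCCA"))  # -> [2, 13]
```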

The whole story at “The Man Who Kicked Off the Biotech Revolution.”

* George M. Church

###

As we clean our petri dishes, we might recall that it was on this date in 1822 that Charles M. Graham was issued the first U.S. patent for artificial teeth. The record, and its details, were lost in the Patent Office fire of December, 1836.  Dentures or false teeth had been around for eons. There is evidence that Etruscans in what is today northern Italy made dentures out of human or animal teeth as early as 700 BC; George Washington owned four separate sets of dentures (though none were wooden, despite a myth to that effect).  But Graham was the first to patent his approach.

 source

 

Written by LW

March 9, 2017 at 1:01 am

“When you have mastered numbers, you will in fact no longer be reading numbers, any more than you read words when reading books. You will be reading meanings.”*…

Errors of judgment about large numbers can have a big impact on the way you view policies and government decisions. The rationale goes like this: The National Science Foundation received $7.463 billion for fiscal year 2016 through the Consolidated Appropriations Act. The total United States budget outlay for 2016 was $3.54 trillion. If you’re someone who perceives the difference between a billion and a trillion as relatively small, you’d think the US is spending a lot of money on the National Science Foundation—in fact, depending on your politics, you might applaud the federal government’s investment or even think it wasteful. But, if you understand that a billion is a thousand times less than a trillion, you can calculate that the Foundation got a paltry 0.2 percent of the budget outlay last year. (It may be more straightforward to think of the NSF budget as roughly one-half to one-third of reported costs for the proposed US-Mexico border wall, and let your values guide you from there.)…
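The arithmetic is worth doing for yourself; here is a minimal Python sketch using only the figures quoted above:

```python
# NSF appropriation vs. total federal outlays, fiscal year 2016.
nsf_2016 = 7.463e9       # dollars (Consolidated Appropriations Act)
outlays_2016 = 3.54e12   # dollars (total US budget outlay)

print(f"NSF share of outlays: {nsf_2016 / outlays_2016:.2%}")  # -> 0.21%

# Billion vs. trillion in orders of magnitude: 10^9 vs. 10^12 --
# a factor of 1,000, which is exactly the intuition the article is after.
```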

On the significance of scale: “How to Understand Extreme Numbers.”

[The image above is, of course, from the ever-wonderful xkcd.]

* W.E.B. Du Bois

###

As we nudge ourselves toward numeracy, we might spare a thought for Sewall Wright; he died on this date in 1988.  A geneticist, he was known for his influential work on evolutionary theory and also for his work on path analysis. He was a founder (with Ronald Fisher and J.B.S. Haldane) of population genetics– a major step in the development of the modern evolutionary synthesis combining genetics with evolution.   He is perhaps best remembered for his concept of genetic drift (called the Sewall Wright effect): when small populations of a species are isolated, the few individuals who carry certain relatively rare genes may fail, out of pure chance, to transmit them. The genes may therefore disappear and their loss may lead to the emergence of new species– although natural selection has played no part in the process.
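Drift is easy to see in simulation. Below is a minimal haploid Wright–Fisher sketch– the standard textbook model of the effect Wright described, not his original formulation, and all names are ours: each generation is a random sample, with replacement, from the previous one, and a rare gene in a small population often vanishes by chance alone.

```python
import random

def wright_fisher(pop_size: int, start_freq: float, generations: int) -> float:
    """Track one allele's frequency under pure drift (no selection)."""
    freq = start_freq
    for _ in range(generations):
        # Next generation: pop_size alleles drawn at random from the current pool.
        carriers = sum(random.random() < freq for _ in range(pop_size))
        freq = carriers / pop_size
        if freq in (0.0, 1.0):  # allele lost or fixed; drift is over
            break
    return freq

# A rare allele (5%) in a small, isolated population of 50: across
# replicates it is usually lost (0.0), occasionally fixed (1.0).
random.seed(1)
print([wright_fisher(50, 0.05, 500) for _ in range(10)])
```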

 source

 

Written by LW

March 3, 2017 at 1:01 am

“I am so clever that sometimes I don’t understand a single word of what I am saying”*…

The notion that intelligence could determine one’s station in life… runs like a red thread through Western thought, from the philosophy of Plato to the policies of UK prime minister Theresa May. To say that someone is or is not intelligent has never been merely a comment on their mental faculties. It is always also a judgment on what they are permitted to do. Intelligence, in other words, is political.

Sometimes, this sort of ranking is sensible: we want doctors, engineers and rulers who are not stupid. But it has a dark side. As well as determining what a person can do, their intelligence – or putative lack of it – has been used to decide what others can do to them. Throughout Western history, those deemed less intelligent have, as a consequence of that judgment, been colonised, enslaved, sterilised and murdered (and indeed eaten, if we include non-human animals in our reckoning).

It’s an old, indeed an ancient, story. But the problem has taken an interesting 21st-century twist with the rise of Artificial Intelligence (AI)…

Go mental at “Intelligence: a history.”

Pair with Isaac Asimov’s lighter piece to the same point, “What is intelligence, anyway?”

* Oscar Wilde

###

As we celebrate variety, we might send thoughtful birthday greetings to Michel Eyquem de Montaigne; he was born on this date in 1533.  Best known during his lifetime as a statesman, Montaigne is remembered for popularizing the essay as a literary form.  His effortless merger of serious intellectual exercises with casual anecdotes and autobiography– and his massive volume Essais (translated literally as “Attempts” or “Trials”)– contain what are, to this day, some of the most widely-influential essays ever written.  Montaigne had a powerful impact on writers ever after, from Descartes, Pascal, and Rousseau, through Hazlitt, Emerson, and Nietzsche, to Zweig, Hoffer, and Asimov. Indeed, he’s believed to have been an influence on the later works of Shakespeare.

 source

Written by LW

February 28, 2017 at 1:01 am

“Consciousness cannot be accounted for in physical terms. For consciousness is absolutely fundamental. It cannot be accounted for in terms of anything else.”*…

 

As a neuroscientist, I am frequently asked about consciousness. In academic discourse, the celebrated problem of consciousness is often divided into two parts: the “Easy Problem” involves identifying the processes in the brain that correlate with particular conscious experiences. The “Hard Problem” involves murkier questions: what are conscious experiences, and why do they exist at all? This neat separation into Easy and Hard problems, which comes courtesy of the Australian philosopher David Chalmers, seems to indicate a division of labor. The neuroscientists, neurologists and psychologists can, at least in principle, systematically uncover the neural correlates of consciousness. Most of them agree that calling this the “Easy Problem” somewhat underestimates the theoretical and experimental challenges involved. It may not be the Hard Problem, but at the very least it’s A Rather Hard Problem. And many philosophers and scientists think that the Hard Problem may well be a non-problem, or, as Ludwig Wittgenstein might have said, the kind of problem that philosophers typically devise in order to maximize unsolvability.

One might assume that as a neuroscientist, I should be gung-ho to prove the imperious philosophers wrong, and to defend the belief that science can solve any sort of problem one might throw at it: hard, soft, or half-baked. But I have become increasingly convinced that science is severely limited in what it can say about consciousness. In a very important sense, consciousness is invisible to science…

Yohan John on “Why some neuroscientists call consciousness ‘the C-word’.”  Via the always-illuminating 3 Quarks Daily.

* Erwin Schrödinger

###

As we muse on mind, we might spare a thought for Mary Whiton Calkins; she died on this date in 1930.  A psychologist and philosopher, Calkins studied psychology at Harvard as a “guest” (since women could not officially register there in her day).  Though she completed all requirements for a doctorate, and had the strong support of William James and her other professors, Harvard still refused to grant a degree to a woman. She went on to become the first prominent woman in her fields:  After leaving Harvard, she established the first psychology laboratory at a women’s college (Wellesley), and later became the first female president of both the American Psychological Association and the American Philosophical Association.

 source

 
