(Roughly) Daily

“Two polar groups: at one pole we have the literary intellectuals, at the other scientists… Between the two a gulf of mutual incomprehension.”*…

 

A contempt for science is neither new, lowbrow, nor confined to the political right. In his famous 1959 lecture “The Two Cultures and the Scientific Revolution,” C.P. Snow commented on the disdain for science among educated Britons and called for a greater integration of science into intellectual life. In response to this overture, the literary critic F.R. Leavis wrote a rebuttal in 1962 that was so vituperative The Spectator had to ask Snow to promise not to sue for libel if they published the work.

The highbrow war on science continues to this day, with flak not just from fossil-fuel-funded politicians and religious fundamentalists but also from our most adored intellectuals and in our most august institutions of higher learning. Magazines that are ostensibly dedicated to ideas confine themselves to those arising in politics and the arts, with scant attention to new ideas emerging from science, with the exception of politicized issues like climate change (and regular attacks on a sin called “scientism”). Just as pernicious is the treatment of science in the liberal-arts curricula of many universities. Students can graduate with only a trifling exposure to science, and what they do learn is often designed to poison them against it.

The most frequently assigned book on science in universities (aside from a popular biology textbook) is Thomas Kuhn’s The Structure of Scientific Revolutions. That 1962 classic is commonly interpreted as showing that science does not converge on the truth but merely busies itself with solving puzzles before lurching to some new paradigm that renders its previous theories obsolete; indeed, unintelligible. Though Kuhn himself disavowed that nihilist interpretation, it has become the conventional wisdom among many intellectuals. A critic from a major magazine once explained to me that the art world no longer considers whether works of art are “beautiful” for the same reason that scientists no longer consider whether theories are “true.” He seemed genuinely surprised when I corrected him…

The usually extremely optimistic Steven Pinker (see here, e.g.) waxes concerned– if not, indeed, pessimistic– about the place of science in today’s society: “The Intellectual War on Science.”

* C.P. Snow, The Two Cultures and the Scientific Revolution (1959)

###

As we rein in our relativism, we might send heavenly birthday greetings to the scientist who inspired Thomas Kuhn (see here and here), Nicolaus Copernicus; he was born on this date in 1473.  A Renaissance polyglot and polymath– he was a canon lawyer, a mathematician, a physician,  a classics scholar, a translator, a governor, a diplomat, and an economist– he is best remembered as an astronomer.  Copernicus’ De revolutionibus orbium coelestium (On the Revolutions of the Celestial Spheres; published just before his death in 1543), with its heliocentric account of the solar system, is often regarded as the beginning both of modern astronomy and of the scientific revolution.

Of all discoveries and opinions, none may have exerted a greater effect on the human spirit than the doctrine of Copernicus. The world had scarcely become known as round and complete in itself when it was asked to waive the tremendous privilege of being the center of the universe. Never, perhaps, was a greater demand made on mankind – for by this admission so many things vanished in mist and smoke! What became of our Eden, our world of innocence, piety and poetry; the testimony of the senses; the conviction of a poetic – religious faith? No wonder his contemporaries did not wish to let all this go and offered every possible resistance to a doctrine which in its converts authorized and demanded a freedom of view and greatness of thought so far unknown, indeed not even dreamed of.

– Goethe

 source

Written by LW

February 19, 2018 at 1:01 am

“I’ve been accused of vulgarity. I say that’s bullshit.”*…

 

The author of A Classical Dictionary of the Vulgar Tongue

Thirty years after Dr Johnson published his great Dictionary of the English Language (1755), Francis Grose put out A Classical Dictionary of the Vulgar Tongue (1785), a compendium of slang Johnson had deemed unfit for his learned tome. Grose was not one for library work. He preferred to do his lexicography in the sordid heart of after-hours London. Supported by his trusty assistant Tom Cocking, he cruised the watering holes of Covent Garden and the East End, eating, boozing, and listening. He took pleasure in hearing his name punningly connected to his rotund frame. And he produced a book brimming with Falstaffian life.

In Vulgar Tongues (2016), Max Décharné called Grose’s dictionary, “A declaration in favour of free speech, and a gauntlet thrown down against official censorship, moralists and the easily offended.” While a good deal of the slang has survived into the present day — to screw is to copulate; to kick the bucket is to die — much would likely have been lost had Grose not recorded it. Some of the more obscure metaphors include a butcher’s dog, meaning someone who “lies by the beef without touching it; a simile often applicable to married men”; to box the Jesuit, meaning “to masturbate; a crime, it is said, much practised by the reverend fathers of that society”; and to polish, meaning to be in jail, in the sense of “polishing the king’s iron with one’s eyebrows, by looking through the iron grated windows”. Given this was the era of William Hogarth’s famous painting Gin Lane (1751), it’s not surprising to find the dictionary soaked through with colourful epithets for the juniper-based liquor: blue ruin, cobblers punch, frog’s wine, heart’s ease, moonshine, strip me naked. The Grose dictionary also contains hundreds of great insults, like bottle-headed, meaning void of wit, something you can’t say about its author.

Via Public Domain Review; read the Dictionary at the Internet Archive.

* Mel Brooks

###

As we choose our words carefully, we might recall that it was on this date in 1885 that Adventures of Huckleberry Finn (or, in more recent editions, The Adventures of Huckleberry Finn) was published in the U.S.  Routinely listed among the greatest American novels, it was one of the first to be written in vernacular English.

Upon issue of the American edition in 1885 several libraries banned it from their shelves.  The early criticism focused on what was perceived as the book’s crudeness. One incident was recounted in the newspaper the Boston Transcript:

The Concord (Mass.) Public Library committee has decided to exclude Mark Twain’s latest book from the library. One member of the committee says that, while he does not wish to call it immoral, he thinks it contains but little humor, and that of a very coarse type. He regards it as the veriest trash. The library and the other members of the committee entertain similar views, characterizing it as rough, coarse, and inelegant, dealing with a series of experiences not elevating, the whole book being more suited to the slums than to intelligent, respectable people.

Writer Louisa May Alcott criticized the book’s publication as well, saying that if Twain “[could not] think of something better to tell our pure-minded lads and lasses he had best stop writing for them.”

Twain later remarked to his editor, “Apparently, the Concord library has condemned Huck as ‘trash and only suitable for the slums.’ This will sell us another twenty-five thousand copies for sure!”  [source]

Cover of the first U.S. edition

source

 

“The number of transistors on integrated circuits doubles approximately every two years”*…

 

Moore’s Law has held up almost astoundingly well…

 source (and larger version)

This seemingly inexorable march has enabled an extraordinary range of new products and services– from intercontinental ballistic missiles to global environmental monitoring systems and from smart phones to medical implants…  But researchers at Carnegie Mellon University are sounding an alarm…

The speed of our technology doubles every year, right? Not anymore. We’ve come to take for granted that as the years go on, computing technology gets faster, cheaper and more energy-efficient.

In their recent paper, “Science and research policy at the end of Moore’s law” published in Nature Electronics, however, Carnegie Mellon University researchers Hassan Khan, David Hounshell, and Erica Fuchs argue that future advancement in microprocessors faces new and unprecedented challenges…

In the seven decades following the invention of the transistor at Bell Labs, warnings about impending limits to miniaturization and the corresponding slowdown of Moore’s Law have come regularly from industry observers and academic researchers. Despite these warnings, semiconductor technology continually progressed along the Moore’s Law trajectory. Khan, Hounshell, and Fuchs’ archival work and oral histories, however, make clear that times are changing.

“The current technological and structural challenges facing the industry are unprecedented and undermine the incentives for continued collective action in research and development,” the authors state in the paper, “which has underpinned the last 50 years of transformational worldwide economic growth and social advance.”

As the authors explain in their paper, progress in semiconductor technology is undergoing a seismic shift driven by changes in the underlying technology and product-end markets…

“To continue advancing general purpose computing capabilities at reduced cost with economy-wide benefits will likely require entirely new semiconductor process and device technology,” explains Engineering and Public Policy graduate Hassan Khan. “The underlying science for this technology is as of yet unknown, and will require significant research funds – an order of magnitude more than is being invested today.”

The authors conclude by arguing that the lack of private incentives creates a case for greatly increased public funding and the need for leadership beyond traditional stakeholders. They suggest that funding of $600 million per year is needed, with 90% of those funds coming from public research dollars and the rest most likely from defense agencies…

Read the complete summary at “Moore’s law has ended. What comes next?”; read the complete Nature article here.

* a paraphrase of Gordon Moore’s assertion– known as “Moore’s law”– in the thirty-fifth anniversary issue of Electronics magazine, published on April 19, 1965
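For a concrete sense of what that doubling implies, here is a minimal sketch in Python. The 1971 baseline of roughly 2,300 transistors (the commonly cited figure for the Intel 4004) is used purely for illustration and is not taken from the paper.

```python
# A rough sketch of what "doubling every two years" compounds to.
# The baseline (~2,300 transistors in 1971, the Intel 4004) is illustrative only.
def transistor_count(year, base_year=1971, base_count=2_300):
    """Projected transistor count under a strict two-year doubling."""
    return base_count * 2 ** ((year - base_year) / 2)

for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(year, f"{transistor_count(year):,.0f} transistors")
```

Fifty years of uninterrupted doubling turns a few thousand transistors into tens of billions, which is roughly where leading-edge chips sit today, and why any slowdown is so consequential.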

###

As we pack ’em ever tighter, we might send carefully-computed birthday greetings to Thomas John Watson Sr.; he was born on this date in 1874.  A mentee of John Henry Patterson’s at NCR, where Watson began his career, Watson became the chairman and CEO of the Computing-Tabulating-Recording Company (CTR), which, in 1924, he renamed International Business Machines– IBM.  He began using his famous motto– THINK– while still at NCR, but carried it with him to IBM…  where it became that corporation’s first trademark (in 1935).  That motto was the inspiration for the naming of the ThinkPad– and Watson himself (along with Sherlock Holmes’ trusty companion), for the naming of IBM’s Artificial Intelligence product.

 source

 

“What we call chaos is just patterns we haven’t recognized. What we call random is just patterns we can’t decipher.”*…

 

A “commonplace” book from the 17th century

On any given day, from her home on the Isle of Man, Linda Watson might be reading a handwritten letter from one Confederate soldier to another, or a list of convicts transported to Australia. Or perhaps she is reading a will, a brief from a long-forgotten legal case, an original Jane Austen manuscript. Whatever is in them, these documents made their way to her because they have one thing in common: They’re close to impossible to read.

Watson’s company, Transcription Services, has a rare specialty—transcribing historical documents that stump average readers. Once, while talking to a client, she found the perfect way to sum up her skills. “We are good at reading the unreadable,” she said. That’s now the company’s slogan.

For hundreds of years, history was handwritten. The problem is not only that our ancestors’ handwriting was sometimes very bad, but also that they used abbreviations, old conventions, and styles of lettering that have fallen out of use. Understanding them takes both patience and skill…

A transcriber on the Isle of Man can decipher almost anything: “Where Old, Unreadable Documents Go to Be Understood.”

* Chuck Palahniuk

###

As we puzzle it out, we might recall that it was on this date in 1659 that the first known check of the modern era was written.  Early Indians of the Mauryan period (from 321 to 185 BC) employed a commercial instrument called adesha; Romans used praescriptiones; Muslim traders used the saqq; and Venetian traders used bills of exchange— but these were effectively either a form of currency or letters of credit.  The 1659 draft– made out for £400, signed by Nicholas Vanacker, made payable to a Mr Delboe, and drawn on Messrs Morris and Clayton, scriveners and bankers of the City of London– was the first “check” as we came to know them.  It’s on display at Westminster Abbey.  (The word “check” likely also originated in England later in the 1700s when serial numbers were placed on these pieces of paper as a way to keep track of, or “check” on, them.)

 source

 

Written by LW

February 16, 2018 at 1:01 am

“First we eat, then we do everything else”*…

 

Imagine the ideal food. One that contains all the nutrients necessary to meet, but not exceed, our daily nutrient demands. If such a food existed, consuming it, without eating any other, would provide the optimal nutritional balance for our body.

Such a food does not exist. But we can do the next best thing.

The key is to eat a balance of highly nutritious foods that, when consumed together, do not contain too much of any one nutrient, to avoid exceeding daily recommended amounts.

Scientists studied more than 1,000 foods, assigning each a nutritional score. The higher the score, the more likely each food would meet, but not exceed, your daily nutritional needs when eaten in combination with others…
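As a rough, purely hypothetical illustration of the “meet but don’t exceed” balancing described above, here is a toy sketch in Python. The foods, nutrient figures, and scoring rule are invented for the example; this is not the researchers’ actual method.

```python
# Toy "balance" score: penalize both shortfall and excess against daily targets.
# All figures are invented for illustration; not the study's data or method.
DAILY_NEEDS = {"protein_g": 50, "fiber_g": 30, "vit_c_mg": 90}

FOODS = {
    "almonds": {"protein_g": 21, "fiber_g": 12, "vit_c_mg": 0},
    "salmon":  {"protein_g": 40, "fiber_g": 0,  "vit_c_mg": 0},
    "chard":   {"protein_g": 2,  "fiber_g": 2,  "vit_c_mg": 18},
}

def balance_score(basket):
    """Higher (closer to zero) is better: totals near the targets, neither short nor over."""
    totals = {k: sum(FOODS[food][k] for food in basket) for k in DAILY_NEEDS}
    return -sum(abs(totals[k] - need) / need for k, need in DAILY_NEEDS.items())

print(balance_score(["almonds", "salmon"]))           # protein close; fiber and vitamin C short
print(balance_score(["almonds", "salmon", "chard"]))  # adding greens improves the balance slightly
```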

See the top 100, ranked at “The world’s most nutritious foods.”

* M. F. K. Fisher

###

As we help ourselves, we might send bounteous birthday greetings to Cyrus Hall McCormick; he was born on this date in 1809.  Widely credited as the inventor of the first mechanical reaper, he was in fact just one of several contributing to its development.  His more singular achievement as a creator was his success in the development of a modern company, with manufacturing, marketing, and a sales force to market his products.  His McCormick Harvesting Machine Company became part of the International Harvester Company in 1902.

Interestingly, the grains that McCormick’s reapers helped harvest appear nowhere on the list…

 source

 

Written by LW

February 15, 2018 at 1:01 am

“The most exciting phrase to hear in science, the one that heralds new discoveries, is not ‘Eureka!’ but ‘That’s funny’”*…

 

Alexander Fleming’s discovery of penicillin is commonly used as an example of serendipity in science

Scientific folklore is full of tales of accidental discovery, from the stray Petri dish that led Alexander Fleming to discover penicillin to Wilhelm Röntgen’s chance detection of X-rays while tinkering with a cathode-ray tube.

That knowledge often advances through serendipity is how scientists, sometimes loudly, justify the billions of dollars that taxpayers plough into curiosity-driven research each year. And it is the reason some argue that increasing government efforts to control research — with an eye to driving greater economic or social impact — are at best futile and at worst counterproductive.

But just how important is serendipity to science? Scientists debating with policymakers have long relied on anecdotal evidence. Studies rarely try to quantify how much scientific progress was truly serendipitous, how much that cost or the circumstances in which it emerged.

Serendipity can take on many forms, and its unwieldy web of cause and effect is difficult to constrain. Data are not available to track it in any meaningful way. Instead, academic research has focused on serendipity in science as a philosophical concept.

The European Research Council aims to change that…

On the heels of yesterday’s post on the history of dice, and the way they evolved over the centuries to be “fairer”– to favor chance– another post on luck…  more specifically in this case, on whether it’s all that it’s cracked up to be.  Scientists often herald the role of chance in research; a project in Britain aims to test that popular idea with evidence: “The serendipity test.”

* Isaac Asimov

###

As we contemplate contingency, we might send elaborately-engineered birthday greetings to George Washington Gale Ferris Jr.; he was born on this date in 1859.  An engineer and inventor, he had built a successful career testing and inspecting metals for railroads and bridge builders when…

… in 1891, the directors of the World’s Columbian Exposition [to be held in 1893] issued a challenge to American engineers to conceive of a monument for the fair that would surpass the Eiffel Tower, the great structure of the Paris International Exposition of 1889. The planners wanted something “original, daring and unique.” Ferris responded with a proposed wheel from which visitors would be able to view the entire exhibition, a wheel that would “Out-Eiffel Eiffel.” The planners feared his design for a rotating wheel towering over the grounds could not possibly be safe.

Ferris persisted. He returned in a few weeks with several respectable endorsements from established engineers, and the committee agreed to allow construction to begin. Most convincingly, he had recruited several local investors to cover the $400,000 cost of construction. The planning commission of the Exposition hoped that admissions from the Ferris Wheel would pull the fair out of debt and eventually make it profitable. [source]

It carried 2.5 million passengers before it was finally demolished in 1906.  But while the Fair’s promoters’ hopes were fulfilled– the Ferris Wheel was a windfall– Ferris claimed that the exhibition management had robbed him and his investors of their rightful portion of the nearly $750,000 profit that his wheel brought in.  Ferris spent two years in litigation, trying (unsuccessfully) to recover his investment.  He died despondent and nearly bankrupt (reportedly of typhoid, though some suggest that it was suicide) in 1896.

The original 1893 Chicago Ferris Wheel

source

 source

 

Written by LW

February 14, 2018 at 1:01 am

“The dice of Zeus always fall luckily”*…

 

14th century medieval dice from the Netherlands

Whether at a casino playing craps or engaging with family in a simple board game at home, rolling the dice introduces a bit of chance or “luck” into every game. We expect dice to be fair, where every number has equal probability of being rolled.

But a new study shows this was not always the case. In Roman times, many dice were visibly lopsided, unlike today’s perfect cubes. And in early medieval times, dice were often “unbalanced” in the arrangement of numbers, where 1 appears opposite 2, 3 opposite 4, and 5 opposite 6. It did not matter what the objects were made of (metal, clay, bone, antler and ivory), or whether they were precisely symmetrical or consistent in size or shape, because, like the weather, rolls were predetermined by gods or other supernatural elements.

All that began to change around 1450, when dice makers and players seemingly figured out that form affected function, explained Jelmer Eerkens, University of California, Davis, professor of anthropology and the lead author of a recent study on dice.

“A new worldview was emerging — the Renaissance. People like Galileo and Blaise Pascal were developing ideas about chance and probability, and we know from written records in some cases they were actually consulting with gamblers,” he said. “We think users of dice also adopted new ideas about fairness, and chance or probability in games”…
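As a quick, purely illustrative aside for the statistically minded, the sketch below simulates what “fair” means in this context (every face equally likely) against an invented lopsided die. The bias weights are made up for the example and are not taken from the study.

```python
# Compare empirical face frequencies for a fair die vs. an invented lopsided one.
import random
from collections import Counter

def roll_frequencies(weights, n=60_000):
    """Roll a six-sided die n times with the given per-face weights; return face -> frequency."""
    rolls = random.choices([1, 2, 3, 4, 5, 6], weights=weights, k=n)
    counts = Counter(rolls)
    return {face: counts[face] / n for face in range(1, 7)}

fair = roll_frequencies([1, 1, 1, 1, 1, 1])          # every face comes up ~1/6 of the time
lopsided = roll_frequencies([1.4, 1, 1, 1, 1, 0.6])  # a slightly squashed die favours one face

for face in range(1, 7):
    print(f"{face}: fair {fair[face]:.3f}   lopsided {lopsided[face]:.3f}")
```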

From fate to fairness: how dice changed over 2,000 years to be more fair: “It’s not how you play the game, but how the dice were made.”

[via Tim Carmody‘s always-illuminating newsletter, Noticing]

* Sophocles

###

As we consider the odds, we might send frontier-challenging birthday greetings to a man who tempted chance– Charles Elwood “Chuck” Yeager; he was born on this date in 1923.  A flying ace, test pilot, and ultimately U.S. Air Force General, Yeager became the first human to officially break the sound barrier when, in 1947, he flew the experimental Bell X-1 at Mach 1 at an altitude of 45,000 ft. 

Perhaps as famously, Yeager was a mentor and role model for the first class of NASA astronauts, as memorialized in Tom Wolfe’s The Right Stuff, and Philip Kaufman’s film adaptation.  On finishing high school at the beginning of World War II, Yeager had enlisted in the Air Force as a private; he served as a mechanic before being accepted into the enlisted flight program, from which he graduated as a “Flight Officer” (equivalent to a Chief Warrant Officer).  His extraordinary skill as a pilot fueled his continued rise through the ranks.  But NASA’s requirement that all astronauts have college degrees disqualified Yeager from membership in the space program.  So though he was, by most accounts, far and away the most qualified potential astronaut, he became instead their head teacher, the first commandant of the USAF Aerospace Research Pilot School, which produced astronauts for NASA and the USAF.

Yeager in front of the Bell X-1, which, as with all of the aircraft assigned to him, he named Glamorous Glennis (or some variation thereof), after his wife.

source

 

Written by LW

February 13, 2018 at 1:01 am
