(Roughly) Daily

Posts Tagged ‘C. P. Snow’

“Down with all kings but King Ludd!”*…

Thomas Pynchon as a high school senior, age 16, at Oyster Bay High School. The image is cropped from a group photo of the staff of the school’s yearbook, The Oysterette, of which Pynchon was the editor. (source)

Further, in a fashion, to yesterday’s post

Thomas Pynchon is having a moment. On the heels of the success of Paul Thomas Anderson’s One Battle After Another (loosely based on Pynchon’s novel Vineland), he has released his first novel in 12 years, Shadow Ticket, a sufficiently big deal to merit not just a featured focus in The New York Times Book Review, but also a combo review-profile in The New York Times Magazine (both links to gift articles). Your correspondent is about halfway through Shadow Ticket and having a blast…

But here, I offer a much older piece from Pynchon, and non-fiction at that: an essay he wrote for The New York Times in 1984… one resonant with themes that run through his novels; one that speaks to that moment– the mid-Eighties– even as it speaks to ours…

As if being 1984 weren’t enough, it’s also the 25th anniversary this year of C. P. Snow’s famous Rede Lecture, “The Two Cultures and the Scientific Revolution,” notable for its warning that intellectual life in the West was becoming increasingly polarized into “literary” and “scientific” factions, each doomed not to understand or appreciate the other. [See almanac entry here.] The lecture was originally meant to address such matters as curriculum reform in the age of Sputnik and the role of technology in the development of what would soon be known as the third world. But it was the two-culture formulation that got people’s attention. In fact it kicked up an amazing row in its day. To some already simplified points, further reductions were made, provoking certain remarks, name-calling, even intemperate rejoinders, giving the whole affair, though attenuated by the mists of time, a distinctly cranky look.

Today nobody could get away with making such a distinction. Since 1959, we have come to live among flows of data more vast than anything the world has seen. Demystification is the order of our day, all the cats are jumping out of all the bags and even beginning to mingle. We immediately suspect ego insecurity in people who may still try to hide behind the jargon of a specialty or pretend to some data base forever “beyond” the reach of a layman. Anybody with the time, literacy and access fee these days can get together with just about any piece of specialized knowledge s/he may need. So, to that extent, the two-cultures quarrel can no longer be sustained. As a visit to any local library or magazine rack will easily confirm, there are now so many more than two cultures that the problem has really become how to find the time to read anything outside one’s own specialty.

What has persisted, after a long quarter century, is the element of human character. C. P. Snow, with the reflexes of a novelist after all, sought to identify not only two kinds of education but also two kinds of personality. Fragmentary echoes of old disputes, of unforgotten offense taken in the course of long-ago high-table chitchat, may have helped form the subtext for Snow’s immoderate, and thus celebrated, assertion, “If we forget the scientific culture, then the rest of intellectuals have never tried, wanted, or been able to understand the Industrial Revolution.” Such “intellectuals,” for the most part “literary,” were supposed, by Lord Snow, to be “natural Luddites.”

Except maybe for Brainy Smurf, it’s hard to imagine anybody these days wanting to be called a literary intellectual, though it doesn’t sound so bad if you broaden the labeling to, say, “people who read and think.” Being called a Luddite is another matter. It brings up questions such as, Is there something about reading and thinking that would cause or predispose a person to turn Luddite? Is it O.K. to be a Luddite? And come to think of it, what is a Luddite, anyway?…

[Pynchon explains, and puts the “movement” into both socio-political and literary context…]

… The Gothic attitude in general, because it used images of death and ghostly survival toward no more responsible end than special effects and cheap thrills, was judged not Serious enough and confined to its own part of town. It is not the only neighborhood in the great City of Literature so, let us say, closely defined. In westerns, the good people always win. In romance novels, love conquers all. In whodunits we know better. We say, “But the world isn’t like that.” These genres, by insisting on what is contrary to fact, fail to be Serious enough, and so they get redlined under the label “escapist fare.”

This is especially unfortunate in the case of science fiction, in which the decade after Hiroshima saw one of the most remarkable flowerings of literary talent and, quite often, genius, in our history. It was just as important as the Beat movement going on at the same time, certainly more important than mainstream fiction, which with only a few exceptions had been paralyzed by the political climate of the cold war and McCarthy years. Besides being a nearly ideal synthesis of the Two Cultures, science fiction also happens to have been one of the principal refuges, in our time, for those of Luddite persuasion.

By 1945, the factory system – which, more than any piece of machinery, was the real and major result of the Industrial Revolution – had been extended to include the Manhattan Project, the German long-range rocket program and the death camps, such as Auschwitz. It has taken no major gift of prophecy to see how these three curves of development might plausibly converge, and before too long. Since Hiroshima, we have watched nuclear weapons multiply out of control, and delivery systems acquire, for global purposes, unlimited range and accuracy. An unblinking acceptance of a holocaust running to seven- and eight-figure body counts has become – among those who, particularly since 1980, have been guiding our military policies – conventional wisdom.

To people who were writing science fiction in the 50’s, none of this was much of a surprise, though modern Luddite imaginations have yet to come up with any countercritter Bad and Big enough, even in the most irresponsible of fictions, to begin to compare with what would happen in a nuclear war. So, in the science fiction of the Atomic Age and the cold war, we see the Luddite impulse to deny the machine taking a different direction. The hardware angle got de-emphasized in favor of more humanistic concerns – exotic cultural evolutions and social scenarios, paradoxes and games with space/time, wild philosophical questions – most of it sharing, as the critical literature has amply discussed, a definition of “human” as particularly distinguished from “machine.” Like their earlier counterparts, 20th-century Luddites looked back yearningly to another age – curiously, the same Age of Reason which had forced the first Luddites into nostalgia for the Age of Miracles.

But we now live, we are told, in the Computer Age. What is the outlook for Luddite sensibility? Will mainframes attract the same hostile attention as knitting frames once did? I really doubt it. Writers of all descriptions are stampeding to buy word processors. Machines have already become so user-friendly that even the most unreconstructed of Luddites can be charmed into laying down the old sledgehammer and stroking a few keys instead. Beyond this seems to be a growing consensus that knowledge really is power, that there is a pretty straightforward conversion between money and information, and that somehow, if the logistics can be worked out, miracles may yet be possible. If this is so, Luddites may at last have come to stand on common ground with their Snovian adversaries, the cheerful army of technocrats who were supposed to have the “future in their bones.” It may be only a new form of the perennial Luddite ambivalence about machines, or it may be that the deepest Luddite hope of miracle has now come to reside in the computer’s ability to get the right data to those whom the data will do the most good. With the proper deployment of budget and computer time, we will cure cancer, save ourselves from nuclear extinction, grow food for everybody, detoxify the results of industrial greed gone berserk – realize all the wistful pipe dreams of our days.

The word “Luddite” continues to be applied with contempt to anyone with doubts about technology, especially the nuclear kind. Luddites today are no longer faced with human factory owners and vulnerable machines. As well-known President and unintentional Luddite D. D. Eisenhower prophesied when he left office, there is now a permanent power establishment of admirals, generals and corporate CEO’s, up against whom us average poor bastards are completely outclassed, although Ike didn’t put it quite that way. We are all supposed to keep tranquil and allow it to go on, even though, because of the data revolution, it becomes every day less possible to fool any of the people any of the time. If our world survives, the next great challenge to watch out for will come – you heard it here first – when the curves of research and development in artificial intelligence, molecular biology and robotics all converge. Oboy. It will be amazing and unpredictable, and even the biggest of brass, let us devoutly hope, are going to be caught flat-footed. It is certainly something for all good Luddites to look forward to if, God willing, we should live so long. Meantime, as Americans, we can take comfort, however minimal and cold, from Lord Byron’s mischievously improvised song, in which he, like other observers of the time, saw clear identification between the first Luddites and our own revolutionary origins. It begins:

As the Liberty lads o’er the sea
Bought their freedom, and cheaply, with blood,
So we, boys, we
Will die fighting, or live free,
And down with all kings but King Ludd!

Thomas Pynchon considers: “Is It O.K. To Be A Luddite?” from @nytimes.com.

Pair with: “Is This the New ‘Scariest Chart in the World’?”

* Lord Byron

###

As we hang onto our humanity, we might recall that it was on this date in 2006 that review copies of Against the Day were distributed; it was published later that year. At 1,085 pages, it is the longest of Pynchon’s novels to date. (Note that there is a rumor that Pynchon, who is now 88, completed another book alongside the 304-page Shadow Ticket… so who knows if Against the Day will hold its “title”…)

Pynchon has “teased” the novel with a synopsis:

Pynchon’s synopsis states that the novel’s action takes place “between the 1893 Chicago World’s Fair and the years just after World War I”. “With a worldwide disaster looming just a few years ahead, it is a time of unrestrained corporate greed, false religiosity, moronic fecklessness, and evil intent in high places. No reference to the present day is intended or should be inferred.” Pynchon promises “cameo appearances by Nikola Tesla, Bela Lugosi and Groucho Marx”, as well as “stupid songs” and “strange sexual practices”.

The novel’s setting “moves from the labor troubles in Colorado to turn-of-the-century New York City, to London and Göttingen, Venice and Vienna, the Balkans, Central Asia, Siberia at the time of the mysterious Tunguska Event, Mexico during the Revolution, postwar Paris, silent-era Hollywood, and one or two places not strictly speaking on the map at all.”

Like several of Pynchon’s earlier works, Against the Day includes both mathematicians and drug users. “As an era of certainty comes crashing down around their ears and an unpredictable future commences, these folks are mostly just trying to pursue their lives. Sometimes they manage to catch up; sometimes it’s their lives that pursue them.”

The synopsis concludes: “If it is not the world, it is what the world might be with a minor adjustment or two. According to some, this is one of the main purposes of fiction. Let the reader decide, let the reader beware. Good luck…”

source

It is probably Pynchon’s most debated novel. Some readers and critics find it too scattered; others believe it to be his masterpiece (a title more commonly awarded to Gravity’s Rainbow). FWIW, Against the Day is your correspondent’s favorite, which, given how much I’ve admired and enjoyed and learned from all of Pynchon’s work, is saying something…

First edition cover (source)

Written by (Roughly) Daily

October 24, 2025 at 1:00 am

“Two polar groups: at one pole we have the literary intellectuals, at the other scientists… Between the two a gulf of mutual incomprehension.”*…


A contempt for science is neither new, lowbrow, nor confined to the political right. In his famous 1959 lecture “The Two Cultures and the Scientific Revolution,” C.P. Snow commented on the disdain for science among educated Britons and called for a greater integration of science into intellectual life. In response to this overture, the literary critic F.R. Leavis wrote a rebuttal in 1962 that was so vituperative The Spectator had to ask Snow to promise not to sue for libel if they published the work.

The highbrow war on science continues to this day, with flak not just from fossil-fuel-funded politicians and religious fundamentalists but also from our most adored intellectuals and in our most august institutions of higher learning. Magazines that are ostensibly dedicated to ideas confine themselves to those arising in politics and the arts, with scant attention to new ideas emerging from science, with the exception of politicized issues like climate change (and regular attacks on a sin called “scientism”). Just as pernicious is the treatment of science in the liberal-arts curricula of many universities. Students can graduate with only a trifling exposure to science, and what they do learn is often designed to poison them against it.

The most frequently assigned book on science in universities (aside from a popular biology textbook) is Thomas Kuhn’s The Structure of Scientific Revolutions. That 1962 classic is commonly interpreted as showing that science does not converge on the truth but merely busies itself with solving puzzles before lurching to some new paradigm that renders its previous theories obsolete; indeed, unintelligible. Though Kuhn himself disavowed that nihilist interpretation, it has become the conventional wisdom among many intellectuals. A critic from a major magazine once explained to me that the art world no longer considers whether works of art are “beautiful” for the same reason that scientists no longer consider whether theories are “true.” He seemed genuinely surprised when I corrected him…

The usually extremely optimistic Steven Pinker (see here, e.g.) waxes concerned– if not, indeed, pessimistic– about the place of science in today’s society: “The Intellectual War on Science.”

* C.P. Snow, The Two Cultures and the Scientific Revolution (1959)

###

As we rein in our relativism, we might send heavenly birthday greetings to the scientist who inspired Thomas Kuhn (see here and here), Nicolaus Copernicus; he was born on this date in 1473. A Renaissance polyglot and polymath– a canon lawyer, a mathematician, a physician, a classics scholar, a translator, a governor, a diplomat, and an economist– he is best remembered as an astronomer. Copernicus’ De revolutionibus orbium coelestium (On the Revolutions of the Celestial Spheres; published just before his death in 1543), with its heliocentric account of the solar system, is often regarded as the beginning both of modern astronomy and of the scientific revolution.

Of all discoveries and opinions, none may have exerted a greater effect on the human spirit than the doctrine of Copernicus. The world had scarcely become known as round and complete in itself when it was asked to waive the tremendous privilege of being the center of the universe. Never, perhaps, was a greater demand made on mankind – for by this admission so many things vanished in mist and smoke! What became of our Eden, our world of innocence, piety and poetry; the testimony of the senses; the conviction of a poetic-religious faith? No wonder his contemporaries did not wish to let all this go and offered every possible resistance to a doctrine which in its converts authorized and demanded a freedom of view and greatness of thought so far unknown, indeed not even dreamed of.

– Goethe

source

Written by (Roughly) Daily

February 19, 2018 at 1:01 am

“The exploring of the Solar System… constitutes the beginning, much more than the end, of history”*…

“Solar System Interactive,” from Jeroen Gommers, is a simple– and simply beautiful– tool for understanding the relative orbits of the planets (and, lest we forsake Pluto, the dwarf planets) that circle the Sun…

In a simplified graphical presentation the planets are seen orbiting the sun at a relatively high speed. The user is encouraged to grab any one of these planets, drag it around the sun manually and experience the orbit periods of the other planets as they are driven along their orbit at relative speeds, uncovering the “interplanetary clockwork.”

Click here to give it a whirl.
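Under the hood, that “interplanetary clockwork” is just bookkeeping with orbital periods: dragging one planet through some arc advances every other planet by that arc, scaled by the ratio of the two periods. Here is a minimal sketch of the idea in Python– the period values are approximate, and the names and structure are illustrative assumptions, not Gommers’ actual code:

```python
# Approximate sidereal orbital periods, in Earth years.
ORBITAL_PERIODS = {
    "Mercury": 0.24, "Venus": 0.62, "Earth": 1.0, "Mars": 1.88,
    "Jupiter": 11.86, "Saturn": 29.46, "Uranus": 84.01,
    "Neptune": 164.79, "Pluto": 248.0,  # dwarf planet, duly included
}

# Current angular position of each body around the Sun, in degrees.
positions = {name: 0.0 for name in ORBITAL_PERIODS}

def drag(dragged: str, delta_deg: float) -> None:
    """Advance the dragged planet by delta_deg, driving every other
    body along its own orbit at the correct relative speed."""
    p_dragged = ORBITAL_PERIODS[dragged]
    for name, period in ORBITAL_PERIODS.items():
        # A longer-period body sweeps proportionally less arc in the
        # same stretch of simulated time.
        positions[name] = (positions[name] + delta_deg * p_dragged / period) % 360.0

# Dragging Earth through one full orbit carries Mercury through about
# 4.2 orbits (ending near 60°) and Neptune through only about 2.2°.
drag("Earth", 360.0)
print(f"Mercury: {positions['Mercury']:.1f}°, Neptune: {positions['Neptune']:.1f}°")
```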

* Carl Sagan

As we watch ’em go round, we might send synthetic birthday greetings to Charles Percy Snow, Baron Snow; he was born on this date in 1905.  A chemist and physicist, Snow taught at his alma mater, Cambridge, before joining the British Civil Service, where he had a distinguished career as a technical adviser and administrator.  He is probably better remembered these days for his writing (e.g., a biography of Anthony Trollope; the sequence of novels known as Strangers and Brothers).  But he is surely best remembered for his 1959 Rede Lecture, “Two Cultures” (subsequently published as The Two Cultures and the Scientific Revolution).  Snow argued that the breakdown of communication between the “two cultures” of modern society – the sciences and the humanities – was a major hindrance to solving the world’s problems.

A good many times I have been present at gatherings of people who, by the standards of the traditional culture, are thought highly educated and who have with considerable gusto been expressing their incredulity at the illiteracy of scientists. Once or twice I have been provoked and have asked the company how many of them could describe the Second Law of Thermodynamics. The response was cold: it was also negative. Yet I was asking something which is about the scientific equivalent of: ‘Have you read a work of Shakespeare’s?’

I now believe that if I had asked an even simpler question – such as, What do you mean by mass, or acceleration, which is the scientific equivalent of saying, ‘Can you read?’ – not more than one in ten of the highly educated would have felt that I was speaking the same language. So the great edifice of modern physics goes up, and the majority of the cleverest people in the western world have about as much insight into it as their Neolithic ancestors would have had…


source

Written by (Roughly) Daily

October 15, 2015 at 1:01 am

“Correlation does not imply causation”*…


From stat-enthusiast (and full-time law student) Tyler Vigen, entertaining examples of patterns that map in compelling– but totally-inconsequential– ways…

More (and larger) examples at the sensational Spurious Correlations.
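The underlying trap is easy to reproduce: any two series that merely trend over time will tend to correlate, whatever their subjects. A minimal sketch in generic Python– the stand-in names are made up, and this is not Vigen’s actual data or method:

```python
import random
import statistics

def random_walk(n: int, seed: int) -> list[float]:
    """Generate n steps of an independent Gaussian random walk."""
    rng = random.Random(seed)
    walk, x = [], 0.0
    for _ in range(n):
        x += rng.gauss(0, 1)
        walk.append(x)
    return walk

# Two completely unrelated, independently generated series --
# stand-ins for, say, "cheese consumption" and "engineering doctorates".
a = random_walk(100, seed=1)
b = random_walk(100, seed=2)

# Trending series routinely show sizable Pearson correlations by accident;
# rerun with other seeds and watch |r| swing to "compelling" values.
r = statistics.correlation(a, b)  # requires Python 3.10+
print(f"Pearson r between two unrelated series: {r:+.2f}")
```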

* a maxim widely repeated in science and statistics; also rendered: (P∧Q) ⇏ (P→Q) ∨ (Q→P). It addresses the post hoc, ergo propter hoc (“after this, therefore because of this”) logical fallacy

###

As we think before we leap, we might send energetic (really energetic) birthday greetings to Enrico Fermi; he was born on this date in 1901.  A physicist who is best remembered for (literally) presiding over the birth of the Atomic Age, he was also remarkable as the last “double-threat” in his field:  a genius at creating both important theories and elegant experiments.  As recently observed, the division of labor between theorists and experimentalists has since been pretty complete.

The novelist and historian of science C. P. Snow wrote that “if Fermi had been born a few years earlier, one could well imagine him discovering Rutherford’s atomic nucleus, and then developing Bohr’s theory of the hydrogen atom. If this sounds like hyperbole, anything about Fermi is likely to sound like hyperbole.”

source


Written by (Roughly) Daily

September 29, 2014 at 1:01 am

The Two Cultures*: technology in the service of the Arts…

Last January, The Royal Opera House and Wieden + Kennedy London co-hosted Culture Hack Day, “an event… bringing cultural organisations together with software developers and creative technologists to make interesting new things.”

And make interesting new things they did.  For instance, Roderick Hodgson @roderickhodgson made Altfilm, an elegant interactive directory of venues showing non-mainstream films.  Ben Firshman @bfirsh made BBC Haiku Player (The Guardian got similar treatment from Adam Groves). And your agoraphobic correspondent’s personal fave:  Dan Williams‘ “When Should I Visit?”– which mines Foursquare check-in data to determine “the least busy time to visit the museums, galleries and theatres of London.”
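The premise of Williams’ hack is easy to sketch: bucket a venue’s check-ins by hour of day and pick the emptiest bucket. A toy version in Python– the data shape, opening hours, and names are illustrative assumptions; this is neither Williams’ code nor a call to the real Foursquare API:

```python
from collections import Counter
from datetime import datetime

# Hypothetical check-in timestamps for one museum.
checkins = [
    datetime(2011, 1, 29, 11, 5), datetime(2011, 1, 29, 11, 40),
    datetime(2011, 1, 29, 14, 10), datetime(2011, 1, 30, 11, 55),
    datetime(2011, 1, 30, 16, 20),
]

OPEN_HOURS = range(10, 18)  # assume the venue is open 10:00-18:00

# Count check-ins per hour of day; hours with no check-ins count as zero.
counts = Counter(ts.hour for ts in checkins)
quietest = min(OPEN_HOURS, key=lambda h: counts[h])
print(f"Least busy time to visit: {quietest}:00")
```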

More wonderful examples of creative cross-pollination (and links to descriptions and photos of the proceedings) at Culture Hack Day.   C.P. Snow would be proud.

* “The Two Cultures,” the 1959 Rede Lecture by British scientist and novelist C. P. Snow, who argued that the breakdown of communication between the “two cultures” of modern society– the sciences and the humanities– was a major hurdle to solving the world’s problems.

As we think integrative thoughts, we might recall that The Great Exhibition of the Works of Industry of all Nations– or the Great Exhibition, as it was more familiarly known– opened on this date in 1851 at the Crystal Palace in London’s Hyde Park.  Conceived and organized by Queen Victoria’s consort, Prince Albert, the Exhibition was nominally a collection of technological wonders from around the globe.  But the eight miles of tables manned by 6,000 exhibitors within the Crystal Palace were largely British…  in keeping with Albert’s real intent– the mounting of an overwhelming display of Britain’s role as industrial leader of the world.  Six million people (equivalent to roughly a third of Britain’s population at the time) attended during its six-month run.

The Exhibition: architect Sir Joseph Paxton enclosed whole trees in his design. (source)