“Based on his liberal use of the semicolon, I just assumed this date would go well”*…
Mary Norris (“The Comma Queen”) appreciates Cecelia Watson’s appreciation of a much-maligned mark, Semicolon…
… Watson, a historian and philosopher of science and a teacher of writing and the humanities—in other words, a Renaissance woman—gives us a deceptively playful-looking book that turns out to be a scholarly treatise on a sophisticated device that has contributed eloquence and mystery to Western civilization.
The semicolon itself was a Renaissance invention. It first appeared in 1494, in a book published in Venice by Aldus Manutius. “De Aetna,” Watson explains, was “an essay, written in dialogue form,” about climbing Mt. Etna. Its author, Pietro Bembo, is best known today not for his book but for the typeface, designed by Francesco Griffo, in which the first semicolon was displayed: Bembo. The mark was a hybrid between a comma and a colon, and its purpose was to prolong a pause or create a more distinct separation between parts of a sentence. In her delightful history, Watson brings the Bembo semicolon alive, describing “its comma-half tensely coiled, tail thorn-sharp beneath the perfect orb thrown high above it.” Designers, she explains, have since given the mark a “relaxed and fuzzy” look (Poliphilus), rendered it “aggressive” (Garamond), and otherwise adapted it for the modern age: “Palatino’s is a thin flapper in a big hat slouched against the wall at a party.”
The problem with the semicolon is not how it looks but what it does and how that has changed over time. In the old days, punctuation simply indicated a pause. Comma, colon: semicolon; period. Eventually, grammarians and copy editors came along and made themselves indispensable by punctuating (“pointing”) a writer’s prose “to delineate clauses properly, such that punctuation served syntax.” That is, commas, semicolons, and colons were plugged into a sentence in order to highlight, subordinate, or otherwise conduct its elements, connecting them syntactically. One of the rules is that, unless you are composing a list, a semicolon is supposed to be followed by a complete clause, capable of standing on its own. The semicolon can take the place of a conjunction, like “and” or “but,” but it should not be used in addition to it. This is what got circled in red in my attempts at scholarly criticism in graduate school. Sentence length has something to do with it—a long, complex sentence may benefit from a clarifying semicolon—but if a sentence scans without a semicolon it’s best to leave it alone.
Watson has been keeping an eye out for effective semicolons for years. She calculates that there are four-thousand-odd semicolons in “Moby-Dick,” or “one for every 52 words.” Clumsy as nineteenth-century punctuation may seem to a modern reader, Melville’s semicolons, she writes, act like “sturdy little nails,” holding his wide-ranging narrative together….
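(Watson’s ratio is easy to check against a public-domain text. Here’s a minimal sketch in Python, assuming a plain-text copy of the Project Gutenberg edition saved locally under the hypothetical filename moby_dick.txt; the exact counts will vary with the edition and with how one tokenizes words.)

```python
import re

# Assumes a local plain-text copy of Moby-Dick (e.g., from Project
# Gutenberg) saved as "moby_dick.txt" -- a hypothetical filename.
with open("moby_dick.txt", encoding="utf-8") as f:
    text = f.read()

semicolons = text.count(";")
# Crude word tokenizer: runs of letters (with internal apostrophes).
words = len(re.findall(r"[A-Za-z']+", text))

print(f"{semicolons:,} semicolons in {words:,} words")
print(f"about one semicolon for every {words / semicolons:.0f} words")
```

(As a cross-check on the arithmetic: 4,000 semicolons at one per 52 words implies a text of roughly 208,000 words – consistent with the novel’s commonly cited length.)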
Eminently worth reading in full: “Sympathy for the Semicolon,” on @ceceliawatson from @MaryNorrisTNY.
Sort of apposite (and completely entertaining/enlightening): “Naming the Unnamed: On the Many Uses of the Letter X.”
* Raven Leilani, Luster
###
As we punctuate punctiliously, we might recall that it was on this date in 1990 that CBS aired the final episode of Bob Newhart’s second successful sitcom series, Newhart, in which he co-starred with Mary Frann through a 184-episode run that began in 1982. Newhart, of course, had already had a huge hit with his first series, The Bob Newhart Show, in which he co-starred with Suzanne Pleshette.
Newhart’s ending – its final scene – is often cited as the best finale in sitcom history.
“In the sphere of thought, absurdity and perversity remain the masters of this world, and their dominion is suspended only for brief periods”*…
From a (somewhat sarcastic) essay (“The Art of Controversy,” published posthumously and first rendered in English in 1896) by that gloomiest of philosophers, Arthur Schopenhauer, advice that (sadly) feels as appropriate today as it surely was then…
1. Carry your opponent’s proposition beyond its natural limits; exaggerate it. The more general your opponent’s statement becomes, the more objections you can find against it. The more restricted and narrow his or her propositions remain, the easier they are to defend.
2. Use different meanings of your opponent’s words to refute his or her argument.
3. Ignore your opponent’s proposition, which was intended to refer to a particular thing. Rather, understand it in some quite different sense, and then refute it. Attack something different from what was asserted.
…
The first three of “Schopenhauer’s 38 Stratagems, or 38 Ways to Win an Argument.” Via @TheBrowser.
* Arthur Schopenhauer, “The Art of Controversy”
###
As we celebrate sophistry, we might recall that it was on this date (or near; scholars disagree) in 325 that Roman Emperor Constantine I convened a gathering in which all of Schopenhauer’s tricks were surely employed: the First Council of Nicaea. An ecumenical council, it was the first effort to attain consensus in the church through an assembly representing all Christendom. Its main accomplishments were the settlement of the Christological issue of the divine nature of God the Son and his relationship to God the Father, the construction of the first part of the Nicene Creed, the mandating of uniform observance of the date of Easter, and the promulgation of early canon law.
“A nothing will serve just as well as a something about which nothing could be said”*…
Metaphysical debates in quantum physics don’t get at “truth,” physicist and mathematician Timothy Andersen argues; they’re nothing but a form of ritual activity and culture. After a thoughtful intellectual history of both quantum mechanics and Wittgenstein’s thought, he concludes…
If Wittgenstein were alive today, he might have couched his arguments in the vocabulary of cultural anthropology. For this shared grammar and these language games, in his view, form part of much larger ritualistic mechanisms that connect human activity with human knowledge, as deeply as DNA connects to human biology. It is also a perfect example of how evolution works by using pre-existing mechanisms to generate new behaviors.
The conclusion from all of this is that interpretation and representation in language and mathematics are little different than the supernatural explanations of ancient religions. Trying to resolve the debate between Bohr and Einstein is like trying to answer the Zen kōan about whether the tree falling in the forest makes a sound if no one can hear it. One cannot say definitely yes or no, because all human language must connect to human activity. And all human language and activity are ritual, signifying meaning by their interconnectedness. To ask what the wavefunction means without specifying an activity – an experiment – to extract that meaning is, therefore, as sensible as asking about the sound of the falling tree. It is nonsense.
Wittgenstein has challenged my own tendency as a scientist and mathematician to seek out interpretations of phenomena that have no scientific value – and to see such explanations as nothing more than narratives. He taught that all that philosophy can do is remind us of what is evidently true. It’s evidently true that the wavefunction has a multiverse interpretation, but one must assume the multiverse first, since it cannot be measured. So the interpretation is a tautology, not a discovery.
I have humbled myself to the fact that we can’t justify clinging to one interpretation of reality over another. In place of my early enthusiastic Platonism, I have come to think of the world not as one filled with sharply defined truths, but rather as a place containing myriad possibilities – each of which, like the possibilities within the wavefunction itself, can be simultaneously true. Likewise, mathematics and its surrounding language don’t represent reality so much as serve as a trusty tool for helping people to navigate the world. They are of human origin and for human purposes.
To shut up and calculate, then, recognizes that there are limits to our pathways for understanding. Our only option as scientists is to look, predict and test. This might not be as glamorous an offering as the interpretations we can construct in our minds, but it is the royal road to real knowledge…
A provocative proposition: “Quantum Wittgenstein,” from @timcopia in @aeonmag.
* Ludwig Wittgenstein, Philosophical Investigations
###
As we muse on meaning, we might recall that it was on this date in 1954 that the official ground-breaking for CERN (Conseil européen pour la recherche nucléaire) was held. Located near Geneva, on the Franco-Swiss border, it is the largest particle-physics laboratory in the world… that’s to say, a prime spot to do the observation and calculation that Andersen suggests. Indeed, it’s been the site of many breakthrough discoveries over the years, maybe most notably the 2012 observation of the Higgs boson.
Because researchers need remote access to these facilities, the lab has historically been a major wide-area-network hub. Indeed, it was at CERN that Tim Berners-Lee developed the first “browser” – and effectively fomented the emergence of the web.