(Roughly) Daily


“Based on his liberal use of the semicolon, I just assumed this date would go well”*…

Mary Norris (“The Comma Queen”) appreciates Cecelia Watson’s appreciation of a much-maligned mark, Semicolon…

… Watson, a historian and philosopher of science and a teacher of writing and the humanities—in other words, a Renaissance woman—gives us a deceptively playful-looking book that turns out to be a scholarly treatise on a sophisticated device that has contributed eloquence and mystery to Western civilization.

The semicolon itself was a Renaissance invention. It first appeared in 1494, in a book published in Venice by Aldus Manutius. “De Aetna,” Watson explains, was “an essay, written in dialogue form,” about climbing Mt. Etna. Its author, Pietro Bembo, is best known today not for his book but for the typeface, designed by Francesco Griffo, in which the first semicolon was displayed: Bembo. The mark was a hybrid between a comma and a colon, and its purpose was to prolong a pause or create a more distinct separation between parts of a sentence. In her delightful history, Watson brings the Bembo semicolon alive, describing “its comma-half tensely coiled, tail thorn-sharp beneath the perfect orb thrown high above it.” Designers, she explains, have since given the mark a “relaxed and fuzzy” look (Poliphilus), rendered it “aggressive” (Garamond), and otherwise adapted it for the modern age: “Palatino’s is a thin flapper in a big hat slouched against the wall at a party.”

The problem with the semicolon is not how it looks but what it does and how that has changed over time. In the old days, punctuation simply indicated a pause. Comma, colon: semicolon; period. Eventually, grammarians and copy editors came along and made themselves indispensable by punctuating (“pointing”) a writer’s prose “to delineate clauses properly, such that punctuation served syntax.” That is, commas, semicolons, and colons were plugged into a sentence in order to highlight, subordinate, or otherwise conduct its elements, connecting them syntactically. One of the rules is that, unless you are composing a list, a semicolon is supposed to be followed by a complete clause, capable of standing on its own. The semicolon can take the place of a conjunction, like “and” or “but,” but it should not be used in addition to it. This is what got circled in red in my attempts at scholarly criticism in graduate school. Sentence length has something to do with it—a long, complex sentence may benefit from a clarifying semicolon—but if a sentence scans without a semicolon it’s best to leave it alone.

Watson has been keeping an eye out for effective semicolons for years. She calculates that there are four-thousand-odd semicolons in “Moby-Dick,” or “one for every 52 words.” Clumsy as nineteenth-century punctuation may seem to a modern reader, Melville’s semicolons, she writes, act like “sturdy little nails,” holding his wide-ranging narrative together….
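Watson’s ratio is easy to check against any plain-text edition. A minimal sketch (the exact count will vary with the edition used — a Project Gutenberg text, say — so the figures below are illustrative, not a verification of her numbers):

```python
def semicolon_stats(text: str) -> tuple[int, float]:
    """Count semicolons and compute the words-per-semicolon ratio."""
    semicolons = text.count(";")
    words = len(text.split())
    return semicolons, (words / semicolons if semicolons else float("inf"))

# Toy demonstration on a single sentence; run over a full plain-text
# Moby-Dick, the same function yields Watson-style figures.
sample = "Call me Ishmael; some years ago, never mind how long precisely, I went to sea."
count, ratio = semicolon_stats(sample)
```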

Eminently worth reading in full: “Sympathy for the Semicolon,” on @ceceliawatson from @MaryNorrisTNY.

Sort of apposite (and completely entertaining/enlightening): “Naming the Unnamed: On the Many Uses of the Letter X.”

(Image above: source)

* Raven Leilani, Luster


As we punctuate punctiliously, we might recall that it was on this date in 1990 that CBS aired the final episode of Bob Newhart’s second successful sitcom series, Newhart, in which he co-starred with Mary Frann through a 184-episode run that had begun in 1982. Newhart had, of course, had a huge hit with his first series, The Bob Newhart Show, in which he co-starred with Suzanne Pleshette.

Newhart’s ending, its final scene, is often cited as the best finale in sitcom history.

“A nothing will serve just as well as a something about which nothing could be said”*…

Metaphysical debates in quantum physics don’t get at “truth,” physicist and mathematician Timothy Andersen argues; they’re nothing but a form of ritual activity and culture. After a thoughtful intellectual history of both quantum mechanics and Wittgenstein’s thought, he concludes…

If Wittgenstein were alive today, he might have couched his arguments in the vocabulary of cultural anthropology. For this shared grammar and these language games, in his view, form part of much larger ritualistic mechanisms that connect human activity with human knowledge, as deeply as DNA connects to human biology. It is also a perfect example of how evolution works by using pre-existing mechanisms to generate new behaviors.

The conclusion from all of this is that interpretation and representation in language and mathematics are little different than the supernatural explanations of ancient religions. Trying to resolve the debate between Bohr and Einstein is like trying to answer the Zen kōan about whether the tree falling in the forest makes a sound if no one can hear it. One cannot say definitively yes or no, because all human language must connect to human activity. And all human language and activity are ritual, signifying meaning by their interconnectedness. To ask what the wavefunction means without specifying an activity – an experiment – to extract that meaning is, therefore, as sensible as asking about the sound of the falling tree. It is nonsense.

As a scientist and mathematician, Wittgenstein has challenged my own tendency to seek out interpretations of phenomena that have no scientific value – and to see such explanations as nothing more than narratives. He taught that all that philosophy can do is remind us of what is evidently true. It’s evidently true that the wavefunction has a multiverse interpretation, but one must assume the multiverse first, since it cannot be measured. So the interpretation is a tautology, not a discovery.

I have humbled myself to the fact that we can’t justify clinging to one interpretation of reality over another. In place of my early enthusiastic Platonism, I have come to think of the world not as one filled with sharply defined truths, but rather as a place containing myriad possibilities – each of which, like the possibilities within the wavefunction itself, can be simultaneously true. Likewise, mathematics and its surrounding language don’t represent reality so much as serve as a trusty tool for helping people to navigate the world. They are of human origin and for human purposes.

To shut up and calculate, then, recognizes that there are limits to our pathways for understanding. Our only option as scientists is to look, predict and test. This might not be as glamorous an offering as the interpretations we can construct in our minds, but it is the royal road to real knowledge…

A provocative proposition: “Quantum Wittgenstein,” from @timcopia in @aeonmag.

* Ludwig Wittgenstein, Philosophical Investigations


As we muse on meaning, we might recall that it was on this date in 1954 that the official ground-breaking for CERN (Conseil européen pour la recherche nucléaire) was held. Located in Switzerland, it is the largest particle physics laboratory in the world… that’s to say, a prime spot to do the observation and calculation that Andersen suggests. Indeed, it’s been the site of many breakthrough discoveries over the years, maybe most notably the 2012 observation of the Higgs boson.

Because researchers need remote access to these facilities, the lab has historically been a major wide area network hub. Indeed, it was at CERN that Tim Berners-Lee developed the first “browser”– and effectively fomented the emergence of the web.

CERN’s main site, from Switzerland looking towards France

“Ultimately, it is the desire, not the desired, that we love”*…

Or is it? The web– and the world– are awash in talk of the Mimetic Theory of Desire (or Rivalry, as its creator, René Girard, would also have it). Stanford professor (and Philosophy Talk co-host) Joshua Landy weighs in with a heavy word of caution…

Here are two readings of Shakespeare’s Hamlet. Which do you think we should be teaching in our schools and universities?

Reading 1. Hamlet is unhappy because he, like all of us, has no desires of his own, and therefore has no being, properly speaking. The best he can do is to find another person to emulate, since that’s the only way anyone ever develops the motivation to do anything. Shakespeare’s genius is to show us this life-changing truth.

Reading 2. Hamlet is unhappy because he, like all of us, is full of body thetans, harmful residue of the aliens brought to Earth by Xenu seventy-five million years ago and disintegrated using nuclear bombs inside volcanoes. Since it is still some time until the practice of auditing comes into being, Hamlet has no chance of becoming “clear”; it is no wonder that he displays such melancholy and aimlessness. Shakespeare’s genius is to show us this life-changing truth.

Whatever you make of the first, I’m rather hoping that you feel at least a bit uncomfortable with the second. If so, I have a follow-up question for you: what exactly is wrong with it? Why not rewrite the textbooks so as to make it our standard understanding of Shakespeare’s play? Surely you can’t fault the logic behind it: if humans have indeed been full of body thetans since they came into existence, and Hamlet is a representation of a human being, Hamlet must be full of body thetans. What is more, if everyone is still full of body thetans, then Shakespeare is doing his contemporaries a huge favor by telling them, and the new textbooks will be doing us a huge favor by telling the world. Your worry, presumably, is that this whole body thetan business is just not true. It’s an outlandish hypothesis, with nothing whatsoever to support it. And since, as Carl Sagan once said, “extraordinary claims require extraordinary evidence,” we would do better to leave it alone.

I think you see where I’m going with this. The fact is, of course, that the first reading is just as outlandish as the second. As I’m about to show (not that it should really need showing), human beings do have desires of their own. That doesn’t mean that all our desires are genuine; it’s always possible to be suckered into buying a new pair of boots, regardless of the fact that they are uglier and shoddier than our old ones, just because they are fashionable. What it means is that some of our desires are genuine. And having some genuine desires, and being able to act on them, is sufficient for the achievement of authenticity. For all we care, Hamlet’s inky cloak could be made by Calvin Klein, his feathered hat by Diane von Furstenberg; the point is that he also has motivations (to know things, to be autonomous, to expose guilt, to have his story told accurately) that come from within, and that those are the ones that count.

To my knowledge, no one in the academy actually reads Hamlet (or anything else) the second way. But plenty read works of literature the first way. René Girard, the founder of the approach, was rewarded for doing so with membership in the Académie française, France’s elite intellectual association. People loved his system so much that they established a Colloquium on Violence and Religion, hosted by the University of Innsbruck, complete with a journal under the ironically apt name Contagion. More recently, Peter Thiel, the co-founder of PayPal, loved it so much that he sank millions of dollars into Imitatio, an institute for the dissemination of Girardian thought. And to this day, you’ll find casual references to the idea everywhere, from people who seem to think it’s a truth, one established by René Girard. (Here’s a recent instance from the New York Times opinion pages: “as we have learned from René Girard, this is precisely how desires are born: I desire something by way of imitation, because someone else already has it.”) All of which leads to an inevitable question: what’s the difference between Girardianism and Scientology? Why has the former been more successful in the academy? Why is the madness of theory so, well, contagious?…

Are we really dependent on others for our desires? Does that mechanism inevitably lead to rivalry, scapegoating, and division? @profjoshlandy suggests not: “Deceit, Desire, and the Literature Professor: Why Girardians Exist,” in @StanfordArcade. Via @UriBram in @TheBrowser. Eminently worth reading in full.

* Friedrich Nietzsche (an inspiration to Girard)


As we tease apart theorizing, we might spare a thought for William Whewell; he died on this date in 1866. A scientist, Anglican priest, philosopher, theologian, and historian of science, he was Master of Trinity College, Cambridge.

At a time when specialization was increasing, Whewell was renowned for the breadth of his work: he published in the disciplines of mechanics, physics, geology, astronomy, and economics, while also finding the time to compose poetry, author a Bridgewater Treatise, translate the works of Goethe, and write sermons and theological tracts. In mathematics, Whewell introduced what is now called the Whewell equation, defining the shape of a curve without reference to an arbitrarily chosen coordinate system. He founded mathematical crystallography and developed a revision of Friedrich Mohs’s classification of minerals. And he organized thousands of volunteers internationally to study ocean tides, in what is now considered one of the first citizen science projects.

But some argue that Whewell’s greatest gift to science was his wordsmithing: He created the words scientist and physicist by analogy with the word artist; they soon replaced the older term natural philosopher. He also named linguistics, consilience, catastrophism, uniformitarianism, and astigmatism.

Other useful words were coined to help his friends: biometry for John Lubbock; Eocene, Miocene, and Pliocene for Charles Lyell; and for Michael Faraday, electrode, anode, cathode, diamagnetic, paramagnetic, and ion (whence the sundry other particle names ending in -ion).


“I resemble that remark”*…

Swiss linguist and philosopher Ferdinand de Saussure articulated the modern version of a belief that dates from Plato, and extended through Locke to modern linguistic scholarship…

… that the letters and words in many writing and language systems have no relationship to what they refer to. The word “cat” doesn’t have anything particularly cat-like about it. The reason that “cat” means cat is because English speakers have decided so—it’s a social convention, not anything ingrained in the letters c-a-t. (According to Saussure, a language like Chinese, where each written character stands for a whole word, was a separate writing system, and his ideas were directed towards writing systems made up of letters or syllables.) 

But new research calls this foundational assumption into question. “The Color Game,” created by researchers at the Max Planck Institute for the Science of Human History to study the evolution of language, suggests that there may be a representational relationship after all…

…the idea that words, or other signs, do actually relate to what they’re describing has been gaining ground. This is called iconicity: when a spoken or written word, or a gestured sign, is iconic in some way to what it’s referring to…

… research now suggests that our languages are riddled with iconicity, and that it may play a role in language evolution, and how we learn and process language. Along with this evidence from The Color Game, in the last decade and a half, an increase in cross-cultural studies has re-upped the attention on iconicity, and pushed back against the doctrine of arbitrariness. 

“It is now generally accepted that natural languages feature plenty of non-arbitrary ways to link form and meaning, and that some forms of iconicity are pretty pervasive,” said Mark Dingemanse, a linguist at Radboud University, who said that he too learned in Linguistics 101 that “the sign is arbitrary.” “Iconicity has become impossible to ignore.”…

Iconicity has always been around. One familiar example is onomatopoeias, like “ding-dong,” “chirp,” or “swish”—words that sound like what they’re referring to. Those words aren’t random, they have a direct relationship to what they represent. Yet, onomatopoeias were thought to be the exception to a wholly arbitrary set of signifiers, said Marcus Perlman, a lecturer in English language and linguistics at the University of Birmingham. This belief persisted despite hints that other words might have some connection to what they signified…

There could also be iconicity in what the letters themselves look like, and not just the sounds or gestures of words. In 2017, linguists Nora Turoman and Suzy Styles showed people who spoke unfamiliar languages different letters and asked them to guess which made the /i/ sound (“ee” in feet), and which was /u/ (“oo” sound in shoe). The participants were able to do so better than chance just by looking at the shape of the letters…
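“Better than chance” is a statistical claim. Purely as an illustration (the excerpt doesn’t give Turoman and Styles’s actual test or numbers), here is how one might compute the probability of getting at least k of n two-alternative letter-sound guesses right by luck alone:

```python
from math import comb

def p_at_least(k: int, n: int, p: float = 0.5) -> float:
    """P(X >= k) for X ~ Binomial(n, p): the chance of scoring this
    well or better by blind guessing on n two-alternative trials."""
    return sum(comb(n, i) * p**i * (1 - p) ** (n - i) for i in range(k, n + 1))

# e.g. getting 8 or more of 10 pairings right by pure guessing
# happens only about 5.5% of the time.
tail = p_at_least(8, 10)
```

The smaller that tail probability, the stronger the evidence that participants were reading something real off the letter shapes.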

Language is most likely a mix of arbitrariness and iconicity, Perlman said, along with something called systematicity, when relationships form between words and meaning that aren’t necessarily iconic. (An example is words that start with gl- in English often are related to light, like glisten, glitter, gleam, and glow. There’s nothing necessarily light-like about the sound gl-, but the relationship is still there.)
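Systematicity like the gl- pattern is a statistical tendency, not a rule, and it can be made concrete with a toy, hand-labeled mini-lexicon (the words and labels below are illustrative choices of mine, not real corpus data):

```python
# Hand-labeled toy lexicon: word -> is it light-related?
lexicon = {
    "glisten": True, "glitter": True, "gleam": True, "glow": True,
    "glove": False, "glue": False, "table": False, "lamp": True,
}

# Systematicity shows up as an association between a form (the gl- onset)
# and a meaning (light-relatedness), without the form being iconic.
gl_words = [w for w in lexicon if w.startswith("gl")]
gl_light_rate = sum(lexicon[w] for w in gl_words) / len(gl_words)
other_words = [w for w in lexicon if not w.startswith("gl")]
other_light_rate = sum(lexicon[w] for w in other_words) / len(other_words)
```

On a real lexicon one would test whether the gl- rate exceeds the baseline rate by more than chance; here the toy numbers merely show the shape of the comparison.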

Morin thinks of iconicity as the “icing on the cake” of language. It makes words more intuitive, more easy to guess. Iconicity might make languages easier to learn; Kim said there’s a saying about Hangul, that: “A wise man can learn it in a morning, and a fool can learn it in the space of ten days.”…  

Rethinking our most fundamental tool, as new research reveals a connection between what words look and sound like, and what they mean: “Why Are Letters Shaped the Way They Are?,” from @shayla__love in @motherboard.

* Curly, in The Three Stooges’ “Idle Roomers”


As we reflect on resemblance, we might spare a thought for a champion of a different sort of mimesis: James Morrison Steele (“Steele”) MacKaye; he died on this date in 1894.  A well-known theatrical actor, dramatist, producer, and scenic innovator in his time, he is best remembered for his revolutionary contributions to theatrical design.  MacKaye opened the Madison Square Theatre in 1879, where he created a huge elevator with two stages stacked one on top of the other so that elaborate furnishings could be changed quickly between scenes.  MacKaye was the first to light a New York theatre– the Lyceum, which he founded in 1884– entirely by electricity.  And he invented and installed overhead and indirect stage lighting, movable stage wagons, artificial ventilation, the disappearing orchestra pit, and folding seats.  In all, MacKaye patented over a hundred inventions, mostly for the improvement of theatrical production and its experience.


“My work consists of two parts; that presented here plus all I have not written. It is this second part that is important.”*…

Ludwig Wittgenstein’s wooden cabin in Skjolden, Norway

On the occasion of its centenary, Peter Salmon considers the history, context, and lasting significance of Wittgenstein’s revolutionary first work…

One hundred years ago, a slim volume of philosophy was published by the then unknown Austrian philosopher Ludwig Wittgenstein. The book was as curious as its title, Tractatus Logico-Philosophicus. Running to only 75 pages, it was in the form of a series of propositions, the gnomic quality of the first as baffling to the newcomer today as it was then.

1. The world is all that is the case.
1.1 The world is the totality of facts, not of things.
1.11 The world is determined by the facts, and by their being all the facts.
1.12 For the totality of facts determines what is the case, and also whatever is not the case.
1.13 The facts in logical space are the world.

And so on, through six propositions, 526 numbered statements, equally emphatic and enigmatic, until the seventh and final proposition, which stands alone at the end of the text: “Whereof we cannot speak, thereof we must remain silent.”

The book’s influence was to be dramatic and far-reaching. Wittgenstein believed he had found a “solution” to how language and the world relate, that they shared a logical form. This also set a limit as to what questions could be meaningfully asked. Any question which could not be verified was, in philosophical terms, nonsense.

Written in the First World War trenches, Tractatus is, in many ways, a work of mysticism…

Ludwig Wittgenstein’s Tractatus is as brilliant and baffling today as it was on its publication a century ago: “The logical mystic,” from @petesalmon in @NewHumanist.

* Ludwig Wittgenstein


As we wrestle with reason and reality, we might recall that it was on this date in 1930 that Dashiell Hammett’s The Maltese Falcon— likely a favorite of Wittgenstein’s— was published. In 1990 the novel ranked 10th in the Top 100 Crime Novels of All Time list by the Crime Writers’ Association. Five years later, in a similar list by the Mystery Writers of America, the novel was ranked third.

