(Roughly) Daily

Posts Tagged ‘language’

“If names be not correct, language is not in accordance with the truth of things”*…

What’s in a name?…

The goal of this article is to promote clear thinking and clear writing among students and teachers of psychological science by curbing terminological misinformation and confusion. To this end, we present a provisional list of 50 commonly used terms in psychology, psychiatry, and allied fields that should be avoided, or at most used sparingly and with explicit caveats. We provide corrective information for students, instructors, and researchers regarding these terms, which we organize for expository purposes into five categories: inaccurate or misleading terms, frequently misused terms, ambiguous terms, oxymorons, and pleonasms. For each term, we (a) explain why it is problematic, (b) delineate one or more examples of its misuse, and (c) when pertinent, offer recommendations for preferable terms. By being more judicious in their use of terminology, psychologists and psychiatrists can foster clearer thinking in their students and the field at large regarding mental phenomena…

From “a gene for” through “multiple personality disorder” and “scientific proof” to “underlying biological dysfunction”: “Fifty psychological and psychiatric terms to avoid: a list of inaccurate, misleading, misused, ambiguous, and logically confused words and phrases.”

[TotH to @BoingBoing, whence the photo above]

* Confucius, The Analects

###

As we speak clearly, we might send carefully-worded birthday greetings to François-Marie Arouet, better known as Voltaire; he was born on this date in 1694. The Father of the Age of Reason, he produced work in almost every literary form – plays, poems, novels, essays, and historical and scientific treatises – more than 2,000 books and pamphlets (and more than 20,000 letters). He popularized Isaac Newton’s work in France, championing the French translation of the Principia Mathematica (the work of his companion Émilie du Châtelet) and adding his own commentary.

A social reformer, Voltaire used satire to criticize the intolerance, religious dogma, and oligarchic privilege of his day, perhaps nowhere more sardonically than in Candide.

source

Written by (Roughly) Daily

November 21, 2022 at 1:00 am

“Simplicity is the ultimate sophistication”*…

Sometimes less is more…

Scientists have identified evolutionary modifications in the voice box distinguishing people from other primates that may underpin a capability indispensable to humankind – speaking.

Researchers said… an examination of the voice box, known as the larynx, in 43 species of primates showed that humans differ from apes and monkeys in lacking an anatomical structure called a vocal membrane – small, ribbon-like extensions of the vocal cords.

Humans also lack balloon-like laryngeal structures called air sacs that may help some apes and monkeys produce loud and resonant calls, and avoid hyperventilating, they found.

The loss of these tissues, according to the researchers, resulted in a stable vocal source in humans that was critical to the evolution of speech – the ability to express thoughts and feelings using articulate sounds. This simplification of the larynx enabled humans to have excellent pitch control with long and stable speech sounds, they said.

Sound production mechanisms in people and nonhuman primates are similar, with air from the lungs driving oscillations of the vocal cords. Acoustical energy generated this way then passes through the pharyngeal, oral and nasal cavities and emerges in a form governed by the filtering of specific frequencies dictated by the vocal tract.
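
That paragraph is a plain-language description of the classic “source-filter” model of speech. Here is a minimal sketch (mine, not the researchers’) of that idea in Python: a pulse train at an assumed 120 Hz pitch stands in for the oscillating vocal cords, and a cascade of two-pole resonators at illustrative /a/-like formant frequencies plays the role of the vocal tract’s frequency filtering. All of the numbers are assumptions chosen for demonstration.

```python
import numpy as np
from scipy.signal import lfilter

RATE = 16_000                  # sample rate, Hz
PITCH = 120                    # assumed fundamental frequency of the source, Hz
FORMANTS = [700, 1200, 2600]   # assumed /a/-like formant frequencies, Hz
BANDWIDTH = 100                # assumed resonance bandwidth, Hz

# Source: the "vocal cords" -- a crude glottal pulse train, one pulse
# every RATE // PITCH samples, over one second of audio.
n = np.arange(RATE)
source = (n % (RATE // PITCH) == 0).astype(float)

# Filter: the "vocal tract" -- a cascade of two-pole resonators, one per
# formant, each boosting acoustical energy near its formant frequency.
signal = source
for f in FORMANTS:
    r = np.exp(-np.pi * BANDWIDTH / RATE)        # pole radius (sets bandwidth)
    theta = 2 * np.pi * f / RATE                 # pole angle (sets frequency)
    a = [1.0, -2 * r * np.cos(theta), r ** 2]    # resonator denominator coeffs
    signal = lfilter([1.0], a, signal)

signal /= np.abs(signal).max()                   # normalize to [-1, 1]
```

Running it produces one second of a buzzy, vowel-like tone; scaling `signal` to 16-bit integers and writing it out with `scipy.io.wavfile.write` makes it audible.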

“Speech and language are critically related, but not synonymous,” said primatologist and psychologist Harold Gouzoules of Emory University in Atlanta, who wrote a commentary in Science accompanying the study. “Speech is the audible sound-based manner of language expression – and humans, alone among the primates, can produce it.”

Paradoxically, the increased complexity of human spoken language followed an evolutionary simplification.

“I think it’s pretty interesting that sometimes in evolution ‘less is more’ – that by losing a trait you might open the door to some new adaptations,” said study co-author W. Tecumseh Fitch, an evolutionary biologist at the University of Vienna…

“Pivotal evolutionary change helped pave the way for human speech,” from Will Dunham @Reuters.

[Image above: source]

* Leonardo da Vinci

###

As we simplify, we might send thoughtfully-analyzed birthday greetings to Karl Gegenbaur; he was born on this date in 1826. An anatomist and professor, he was the first to demonstrate that the field of comparative anatomy offers important evidence supporting the theory of evolution – of which he became one of Europe’s leading proponents.

Gegenbaur’s book Grundzüge der vergleichenden Anatomie (1859; English translation: Elements of Comparative Anatomy) became the standard textbook of evolutionary morphology in its day, emphasizing that structural similarities among various animals provide clues to their evolutionary history. In a way that prefigured the research featured above, Gegenbaur noted that the most reliable clue to evolutionary history is homology, the comparison of anatomical parts that share a common evolutionary origin.

source

Written by (Roughly) Daily

August 21, 2022 at 1:00 am

“The sort of twee person who thinks swearing is in any way a sign of a lack of education or a lack of verbal interest is just a f–king lunatic”*…

Scene from The New Art and Mystery of Gossiping, Being a Genuine Account of All the Women’s Clubs in and about the City and Suburbs of London, c.1760. [British Library]

Rude words are a constant; but, as Suzannah Lipscomb explains, their ability to cause offense is in flux…

I stumbled upon this question as a historical consultant for a new drama set in the 16th century, when I needed to assess whether certain curse words in the script would have been familiar to the Tudors. The revelation – given away in the title of Melissa Mohr’s wonderful book Holy Sh*t – is that all swear words concern what is sacred or what is scatological. In the Middle Ages, the worst words had been about what was holy; by the 18th century they were about bodily functions. The 16th century was a period when what was considered obscene was in flux.

The most offensive words still used God’s name: God’s blood, God’s wounds, God’s bones, death, flesh, foot, heart, arms, nails, body, sides, guts, tongue, eyes. A statute of 1606 forbade the use of words that ‘iestingly or prophanely’ spoke the name of God in plays. Damn and hell were early modern variations of such blasphemous oaths (bloody came later), as were the euphemistic asseverations, gad, gog and egad.

Many words we consider, at best, crude were medieval common-or-garden words of description – arse, shit, fart, bollocks, prick, piss, turd – and were not considered obscene. To say ‘I’m going to piss’ was the equivalent of saying ‘I’m going to wee’ today and was politer than the new 16th-century vulgarity, ‘I’m going to take a leak’. Putting body parts or products where they shouldn’t normally be created delightfully defiant phrases such as ‘turd in your teeth’, which appears in the 1509 compendium of the Oxford don John Stanbridge. Non-literal uses of these words – which is what tends to be required for swearing – like ‘take the piss’, ‘on the piss’, ‘piss off’ – all seem to be 20th-century flourishes. For the latter, the Tudors would have substituted something diabolical – ‘the devil rot thee’ – or epidemiological – ‘a pox on you’.

But the scatological was starting to become obscene. Sard, swive, and fuck were all slightly rude words for sexual intercourse. An early recorded use of the f-word was a piece of marginalia by an anonymous monk writing in 1528 in a manuscript copy of Cicero’s De officiis (a treatise on moral philosophy). The inscription reads: ‘O d fuckin Abbot’. Given that the use of the f-word as an intensifier didn’t catch on for another three centuries, this is likely a punchy comment on the abbot’s immoral behaviour…

The chronicles of cursing: “Explicit Content,” from @sixteenthCgirl.

See also: “I’ve been accused of vulgarity. I say that’s bullshit” and “All slang is metaphor, and all metaphor is poetry.”

* Stephen Fry

###

As we choose our words, we might recall that it was on this date in 1960 that “Itsy Bitsy Teenie Weenie Yellow Polkadot Bikini” hit #1 on Billboard’s Hot 100 chart. A novelty song written by Paul Vance and Lee Pockriss and first released in June 1960 by Brian Hyland, it tells the tale of a shy young lady wearing her new swimsuit for the first time. Hyland’s recording also sailed up the charts in the rest of the Anglophone world, and subsequent versions topped the charts in France and Germany.

The tune is believed to have had a broader impact: the bikini had been introduced over a decade before but hadn’t found wide acceptance; after Hyland’s hit, two-pieces began to fly off the racks… and teen “surf movies” became the rage.

source

“Based on his liberal use of the semicolon, I just assumed this date would go well”*…

Mary Norris (“The Comma Queen”) appreciates Cecelia Watson’s appreciation of a much-maligned mark, Semicolon…

… Watson, a historian and philosopher of science and a teacher of writing and the humanities—in other words, a Renaissance woman—gives us a deceptively playful-looking book that turns out to be a scholarly treatise on a sophisticated device that has contributed eloquence and mystery to Western civilization.

The semicolon itself was a Renaissance invention. It first appeared in 1494, in a book published in Venice by Aldus Manutius. “De Aetna,” Watson explains, was “an essay, written in dialogue form,” about climbing Mt. Etna. Its author, Pietro Bembo, is best known today not for his book but for the typeface, designed by Francesco Griffo, in which the first semicolon was displayed: Bembo. The mark was a hybrid between a comma and a colon, and its purpose was to prolong a pause or create a more distinct separation between parts of a sentence. In her delightful history, Watson brings the Bembo semicolon alive, describing “its comma-half tensely coiled, tail thorn-sharp beneath the perfect orb thrown high above it.” Designers, she explains, have since given the mark a “relaxed and fuzzy” look (Poliphilus), rendered it “aggressive” (Garamond), and otherwise adapted it for the modern age: “Palatino’s is a thin flapper in a big hat slouched against the wall at a party.”

The problem with the semicolon is not how it looks but what it does and how that has changed over time. In the old days, punctuation simply indicated a pause. Comma, colon: semicolon; period. Eventually, grammarians and copy editors came along and made themselves indispensable by punctuating (“pointing”) a writer’s prose “to delineate clauses properly, such that punctuation served syntax.” That is, commas, semicolons, and colons were plugged into a sentence in order to highlight, subordinate, or otherwise conduct its elements, connecting them syntactically. One of the rules is that, unless you are composing a list, a semicolon is supposed to be followed by a complete clause, capable of standing on its own. The semicolon can take the place of a conjunction, like “and” or “but,” but it should not be used in addition to it. This is what got circled in red in my attempts at scholarly criticism in graduate school. Sentence length has something to do with it—a long, complex sentence may benefit from a clarifying semicolon—but if a sentence scans without a semicolon it’s best to leave it alone.

Watson has been keeping an eye out for effective semicolons for years. She calculates that there are four-thousand-odd semicolons in “Moby-Dick,” or “one for every 52 words.” Clumsy as nineteenth-century punctuation may seem to a modern reader, Melville’s semicolons, she writes, act like “sturdy little nails,” holding his wide-ranging narrative together….
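
Curious readers can check Watson’s ratio against a plain-text copy of the novel (e.g., the Project Gutenberg edition); here is a minimal Python sketch of my own, with the local filename as an assumption:

```python
# Count semicolons in a plain-text copy of Moby-Dick and compute the
# words-per-semicolon ratio Watson cites. "moby_dick.txt" is an assumed
# local filename (e.g., saved from Project Gutenberg).

def semicolon_ratio(path: str) -> tuple[int, int, float]:
    with open(path, encoding="utf-8") as f:
        text = f.read()
    semicolons = text.count(";")
    words = len(text.split())           # crude whitespace tokenization
    return semicolons, words, words / semicolons

if __name__ == "__main__":
    n_semis, n_words, ratio = semicolon_ratio("moby_dick.txt")
    print(f"{n_semis} semicolons, {n_words} words: one per {ratio:.0f} words")
```

Watson’s own figures imply a text of roughly 4,000 × 52 ≈ 208,000 words – about the length of the novel – so the arithmetic checks out.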

Eminently worth reading in full: “Sympathy for the Semicolon,” on @ceceliawatson from @MaryNorrisTNY.

Sort of apposite (and completely entertaining/enlightening): “Naming the Unnamed: On the Many Uses of the Letter X.”

(Image above: source)

* Raven Leilani, Luster

###

As we punctuate punctiliously, we might recall that it was on this date in 1990 that CBS aired the final episode of Bob Newhart’s second successful sitcom series, Newhart, in which he co-starred with Mary Frann through a 184-episode run that had started in 1982. Newhart had, of course, had a huge hit with his first series, The Bob Newhart Show, in which he co-starred with Suzanne Pleshette.

Newhart’s ending, its final scene, is often cited as the best finale in sitcom history.

“A nothing will serve just as well as a something about which nothing could be said”*…

Metaphysical debates in quantum physics don’t get at “truth,” physicist and mathematician Timothy Andersen argues; they’re nothing but a form of ritual activity and culture. After a thoughtful intellectual history of both quantum mechanics and Wittgenstein’s thought, he concludes…

If Wittgenstein were alive today, he might have couched his arguments in the vocabulary of cultural anthropology. For this shared grammar and these language games, in his view, form part of much larger ritualistic mechanisms that connect human activity with human knowledge, as deeply as DNA connects to human biology. It is also a perfect example of how evolution works by using pre-existing mechanisms to generate new behaviors.

The conclusion from all of this is that interpretation and representation in language and mathematics are little different than the supernatural explanations of ancient religions. Trying to resolve the debate between Bohr and Einstein is like trying to answer the Zen kōan about whether the tree falling in the forest makes a sound if no one can hear it. One cannot say definitely yes or no, because all human language must connect to human activity. And all human language and activity are ritual, signifying meaning by their interconnectedness. To ask what the wavefunction means without specifying an activity – and experiment – to extract that meaning is, therefore, as sensible as asking about the sound of the falling tree. It is nonsense.

As a scientist and mathematician, Wittgenstein has challenged my own tendency to seek out interpretations of phenomena that have no scientific value – and to see such explanations as nothing more than narratives. He taught that all that philosophy can do is remind us of what is evidently true. It’s evidently true that the wavefunction has a multiverse interpretation, but one must assume the multiverse first, since it cannot be measured. So the interpretation is a tautology, not a discovery.

I have humbled myself to the fact that we can’t justify clinging to one interpretation of reality over another. In place of my early enthusiastic Platonism, I have come to think of the world not as one filled with sharply defined truths, but rather as a place containing myriad possibilities – each of which, like the possibilities within the wavefunction itself, can be simultaneously true. Likewise, mathematics and its surrounding language don’t represent reality so much as serve as a trusty tool for helping people to navigate the world. They are of human origin and for human purposes.

To shut up and calculate, then, recognizes that there are limits to our pathways for understanding. Our only option as scientists is to look, predict and test. This might not be as glamorous an offering as the interpretations we can construct in our minds, but it is the royal road to real knowledge…

A provocative proposition: “Quantum Wittgenstein,” from @timcopia in @aeonmag.

* Ludwig Wittgenstein, Philosophical Investigations

###

As we muse on meaning, we might recall that it was on this date in 1954 that the official ground-breaking for CERN (Conseil européen pour la recherche nucléaire) was held. Straddling the Franco-Swiss border near Geneva, it is the largest particle physics laboratory in the world… that’s to say, a prime spot to do the observation and calculation that Andersen suggests. Indeed, it’s been the site of many breakthrough discoveries over the years, maybe most notably the 2012 observation of the Higgs boson.

Because researchers need remote access to these facilities, the lab has historically been a major wide-area-network hub. Indeed, it was at CERN that Tim Berners-Lee developed the first web browser – and effectively catalyzed the emergence of the web.

CERN’s main site, from Switzerland looking towards France