(Roughly) Daily

Posts Tagged ‘satire’

“Life is more fun if you play games”*…

A scanned image of the title page of a scientific paper titled 'The Unsuccessful Self-Treatment of a Case of Writer's Block' by Dennis Upper, published in the Journal of Applied Behavior Analysis in 1974.

David Friedman ponders scientific satire…

I first encountered the scientific paper simply titled “Strapless Evening Gowns” four years ago, when I was flipping through a collection of magazines that once belonged to the cybernetics pioneer Norbert Wiener.

Among his magazines was the May 1960 issue of Voo Doo magazine, which was MIT’s “only intentionally humorous campus publication” going all the way back to 1919.

I got a chuckle out of this article, which attempted to semi-seriously analyze what exactly keeps a strapless dress from falling down:

A page from the May 1960 issue of Voo Doo magazine featuring an article titled 'Strapless Evening Gowns,' discussing the structural analysis of strapless dresses with diagrams and equations.

Scientists have a long history of amusing themselves with humor. In addition to Voo Doo, other science humor magazines include the Annals of Improbable Research, the Journal of Irreproducible Results, and the Worm Runner’s Digest, which included both satirical and serious scientific papers, much to the confusion of its readers – a problem eventually solved by printing the satirical articles upside down.

And then there are the quasi-serious scientific studies meant to be amusing, such as this study on the effectiveness of tin foil hats in protecting you from government surveillance (spoiler: tin foil hats can actually amplify certain radio frequencies, so the authors speculate that the government has been behind the promotion of tin foil hats all along).

And back in 1974, the Journal of Applied Behavior Analysis published a paper by clinical psychologist Dennis Upper called “The Unsuccessful Self-Treatment Of A Case Of Writer’s Block” [pictured at the top]. And lest you question the veracity of the author’s finding, I should note that the author’s failure to treat his writer’s block has been successfully replicated.

Read on for more on the engineering of the formal dress, both the social (largely sexist) context and the (interestingly meaningful) scientific content– and the art it has inspired: “Science And The Strapless Evening Gown” from @ironicsans.com.

More seriously: “a Nature analysis signals the beginnings of a US science brain drain“: “Researchers in the United States are seeking career opportunities abroad as President Donald Trump’s administration slashes science funding and workforce numbers, finds an analysis of Nature’s jobs-board data…”

* Roald Dahl

###

As we play, we might recall that it was on this date in 1961 that Robert Noyce was issued patent number 2981877 for his “semiconductor device-and-lead structure,” the first patent for what would come to be known as the integrated circuit. In fact another engineer, Jack Kilby, had separately and essentially simultaneously developed the same technology. Ineligible (as a new Texas Instruments employee) for a vacation over the summer of 1958, he gave himself the “assignment” of creating “a body of semiconductor material … wherein all the components of the electronic circuit are completely integrated.”

Kilby’s design was rooted in germanium, Noyce’s in silicon; Kilby had filed his patent application a few months earlier than Noyce. But Kilby’s invention was not a true monolithic integrated circuit chip, since it had external gold-wire connections that would have made it difficult to mass-produce– an obstacle Noyce overcame. Still, Kilby’s contribution was recognized in 2000, when he was awarded the Nobel Prize– an honor in which Noyce, who had died in 1990, could not share.

A historical black-and-white photo of Robert Noyce holding a semiconductor device design while posed in front of a wooden backdrop.
Noyce with his “motherboard” (source)

Written by (Roughly) Daily

April 25, 2025 at 1:00 am

“Just remember: Abraham Lincoln didn’t die in vain, he died in Washington D.C.”*…

Ken Goffman (better known as R. U. Sirius, the co-founder and first editor of the seminal Mondo 2000) on an equally seminal comedy group…

If you were a college student or Western counterculture person in the late 1960s-70s, the albums of Firesign Theatre occupied more space on your shelf and in your hippocampus than even The Beatles or Pink Floyd. If you were an incipient techno-geek or hacker, this was even more the case. Firesign was the premier comedy recording act of the very first media-saturated technofreak tribe.

In his tremendously informative and enjoyable history of Firesign Theatre titled Firesign: The Electromagnetic History of Everything as told in Nine Comedy Albums, author Jeremy Braddock starts by giving us the roots of a band of satirists that started out (to varying degrees) as social activists with a sense of humor. He shows them slowly coming together in Los Angeles while infiltrating, first, the alternative Pacifica radio stations like KPFK in Los Angeles, and eventually, briefly, hosting programs in the newly thriving hip commercial rock radio stations of the times, before they lost that audience share to corporatization. 

Braddock takes us through the entire Firesign career and doesn’t stint on media theory and the sociopolitics of America in the 20th century that were a part of the Firesign oeuvre. 

For those of us out in the wilds of the youth counterculture of the time, without access to their radio programs, it was Columbia Records albums that captured our ears and minds, starting with Waiting For the Electrician or Someone Like Him in early 1968. Their third album, Don’t Crush That Dwarf, Hand Me the Pliers, sold 300,000 copies right out of the gate and, in the words of a 2005 article written for the National Recording Registry, kept “breaking into the charts, continually stamped, pressed and available by Columbia Records in the US and Canada, hammering its way through all of the multiple commercial formats over the years: LPs, EPs, 8-Track and Cassette tapes, and numerous reissues on CD, licensed to various companies here and abroad, continuing up to this day.” As covered toward the end of the book, they have been frequently sampled in recent years by hip-hop artists.

My introduction to Firesign came as the result of seeing the cover of their second album How Can You Be In Two Places At Once When You’re Not Anywhere At All in the record section of a department store in upstate New York. It was the cover, with pictures of Groucho Marx and John Lennon and the words “All Hail Marx and Lennon” that caught my eye. 

It was the mind-breaking trip of Babe, as he enters a new car purchased from a then-stereotypical, obnoxiously friendly car salesman, and finds himself transitioning from one mediated space to another, eventually landing in a Turkish prison and witnessing the spread of plague, as an element of a TV quiz show. 

The album ends with a chanteuse named Lurlene singing “We’re Bringing the War Back Home.” This was all during the militant opposition to the US war in Vietnam. Probably few listeners would have recognized “Bring The War Home” as the slogan of The Weatherman faction of Students for a Democratic Society (SDS), but Braddock gets it, like he gets the seriousness of Firesign’s satire. Indeed, Braddock notes that several reviewers, writing with appreciation about one of their albums, averred that its dystopia was “not funny.”

Most fans would agree with me that the peak of the Firesign run on Columbia Records was the exceedingly multivalent Don’t Crush That Dwarf, Hand Me The Pliers and the futuristic, AI-saturated I Think We’re All Bozos on this Bus, which I note in this interview predicted the future better than any of the self-described futurists of the 1970s. But to apprehend the richness of those two psychedelic assaults on the senses and on the idiocracy of its times, you will need to read the book and listen to the recordings or at least read this interview… 

Read on for his conversation with Jeremy Braddock: “Firesign Theatre: The Greatest Satirists of 20th Century Media Culture and its Techno-romanticism were… Not Insane!” from @rusirius.bsky.social and @jbraddock.bsky.social

* Firesign Theatre, How Can You Be In Two Places At Once When You’re Not Anywhere At All

###

As we cherish canny comedy, we might recall that it was on this date in 1932 that Walk a Little Faster opened on Broadway at the St. James Theatre. A “musical revue with sketches,” it featured “April in Paris” (by E. Y. “Yip” Harburg, who seven years later provided all of the songs– including “Over the Rainbow”– for the film The Wizard of Oz) and writing by S. J. Perelman (who had just scripted the Marx Brothers films Monkey Business and Horse Feathers).

source

“No one knows toward what center human things are going to gravitate in the near future, and hence the life of the world has become scandalously provisional”*…

The estimable Ted Gioia has pulled a 2022 essay from his newsletter up from behind the paywall. It was very relevant then; if anything, more relevant now…

Back in 2014, I sketched out a widely-read outline of an alternative interpretation of cultural conflict. Curiously enough, the conceptual tools I used came from a 1929 book from philosopher José Ortega y Gasset entitled The Revolt of the Masses—a work that offers surprisingly timely insights into our current situation.

That article stirred up a lot of debate at the time, but the whole situation has intensified further since 2014. Everything I’ve seen in those eight years has made painfully clear how insightful Ortega had been. The time has come to revisit that framework, summarizing its key insights and offering predictions for what might happen in the future.

Here’s part of what I wrote back in 2014:

First, let me tell you what you won’t find in this book. Despite a title that promises political analysis, The Revolt of the Masses has almost nothing to say about conventional party ideologies and alignments. Ortega shows little interest in fascism or capitalism or Marxism, and this troubled me when I first read the book. (Although, in retrospect, the philosopher’s passing comments on these matters proved remarkably prescient—for example his smug dismissal of Russian communism as destined to failure in the West, and his prediction of the rise of a European union.) Above all, he hardly acknowledges the existence of ‘left’ and ‘right’ in political debates.

Ortega’s brilliant insight came in understanding that the battle between ‘up’ and ‘down’ could be as important in spurring social and cultural change as the conflict between ‘left’ and ‘right’. This is not an economic distinction in Ortega’s mind. The new conflict, he insists, is not between “hierarchically superior and inferior classes…. upper classes or lower classes.” A millionaire could be a member of the masses, according to Ortega’s surprising schema. And a pauper might represent the elite.

The key driver of change, as Ortega sees it, comes from a shocking attitude characteristic of the modern age—or, at least, Ortega was shocked. Put simply, the masses hate experts. If forced to choose between the advice of the learned and the vague impressions of other people just like themselves, the masses invariably turn to the latter. The upper elites still try to pronounce judgments and lead, but fewer and fewer of those down below pay attention.

This dynamic is now far more significant than it was eight years ago. So I want to share 15 observations on the emerging vertical dimension of cultural conflict—these both define the rupture and try to predict how it will play out…

Read on for: “15 Observations on the New Phase in Cultural Conflict” from @tedgioia.bsky.social.

(Image above: source)

* José Ortega y Gasset in The Revolt of the Masses… where he also observed: “Liberalism – it is well to recall this today – is the supreme form of generosity; it is the right which the majority concedes to minorities and hence it is the noblest cry that has ever resounded in this planet. It announces the determination to share existence with the enemy; more than that, with an enemy that is weak. It was incredible that the human species should have arrived at so noble an attitude, so paradoxical, so refined, so acrobatic, so antinatural. Hence, it is not to be wondered at that this same humanity should soon appear anxious to get rid of it. It is a discipline too difficult and complex to take firm root on earth.”

###

As we contend with contention, we might send rational birthday greetings to an avatar of the Enlightenment (which did so much to spawn liberalism), François-Marie Arouet, better known as Voltaire; he was born on this date in 1694. The Father of the Age of Reason, he produced works in almost every literary form: plays, poems, novels, essays, and historical and scientific works– more than 2,000 books and pamphlets (and more than 20,000 letters). He popularized Isaac Newton’s work in France by arranging a translation of Principia Mathematica, to which he added his own commentary.

A social reformer, Voltaire used satire to criticize the intolerance, religious dogma, and oligopolistic privilege of his day, perhaps nowhere more sardonically than in Candide.

source

“The great men turn out to be all alike. They never stop working. They never lose a minute. It is very depressing.”*…

Data storyteller RJ Andrews demonstrates…

How do creatives – composers, painters, writers, scientists, philosophers – find the time to produce their opus?

Each routine day is represented as a continuous 24 hour cycle. Midnight is placed at 12 o’clock position and noon at 6 o’clock. Colors mark major categories of activity including work, sleep, and exercise…

The daily rituals of great creators: “Creative Routines,” from @infowetrust.
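As a side note, the clock-face layout Andrews describes is easy to reproduce: each hour maps to a fixed fraction of the circle, with midnight at the top and the day running clockwise. Here is a minimal sketch in Python; the function names and the sample routine are illustrative, not drawn from Andrews’ own code:

```python
import math

def hour_to_angle(hour: float) -> float:
    """Map an hour of the day (0-24) to an angle in radians on the
    clock-face layout: midnight (hour 0) at the 12 o'clock position,
    noon (hour 12) at 6 o'clock, proceeding clockwise.

    Hour h covers a fraction h/24 of the full circle, measured
    clockwise from the top."""
    return 2 * math.pi * (hour % 24) / 24

def segment_arc(start: float, end: float) -> tuple[float, float]:
    """Return (start_angle, sweep) in radians for one activity segment,
    handling segments that wrap past midnight (e.g. 23:00-7:00)."""
    span = (end - start) % 24  # hours covered, wrapping at 24
    return hour_to_angle(start), 2 * math.pi * span / 24

# A routine is a list of (start_hour, end_hour, activity) segments;
# each becomes one colored arc in the radial chart.
sample_routine = [
    (23, 7, "sleep"),     # wraps past midnight
    (7, 8, "exercise"),
    (8, 17, "work"),
]

for start, end, activity in sample_routine:
    angle, sweep = segment_arc(start, end)
    print(f"{activity}: starts at {math.degrees(angle):.0f}°, "
          f"sweeps {math.degrees(sweep):.0f}°")
```

Feeding these arcs to any polar plotting routine (for instance matplotlib’s polar axes with the theta origin at “N” and a clockwise direction) reproduces the radial layout of the charts.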

* Mason Currey, Daily Rituals: How Artists Work (the source of much of the data that informed the graphics above)

###

As we contemplate cultivating customs, we might send learned birthday greetings to Desiderius Erasmus Roterodamus, better known simply as Erasmus; he was born on this date in 1466.  A Catholic priest, social critic, teacher, translator, and theologian, probably best remembered for his book In Praise of Folly, he was the greatest scholar of the northern Renaissance, the first editor of the New Testament (“Do unto others…”), and an important figure in patristics and classical literature. 

Erasmus had contrasting experiences of routine: on being orphaned, he was sent to a series of monastic or semi-monastic schools, which he despised both for their discipline and for their disdain of inquiry. Graduating with few prospects, he joined an Augustinian monastery, where he considered his superiors “barbarians” for discouraging his classical studies. On ordination, he escaped– and began a career rooted in struggle (as he balanced the demands of study with those of serving as a clerk, a priest, and a tutor… all while trying to distinguish himself as a poet).

His luck changed in 1499, when he connected with a reformist English circle (notably John Colet and Thomas More), then with radical French Franciscan Jean Vitrier, and later with the Aldine New Academy in Venice… which led, in the Reformation, to his emergence as a prime influencer of European thought. Among fellow scholars and philosophers of that era he was– and is still– known as the “Prince of the Humanists.”

Erasmus’ views were contentious in his time and elicited a good bit of criticism. While we don’t know too much about his daily routine, we do know that for the last half of his life it included enough time on a regular basis to read these attacks and to prepare and publish apologetic works in his own defense, in many cases leading to a long series of back-and-forth polemical books… kind of like a Twitter feud, but at the speed of Gutenberg.

Portrait of Erasmus of Rotterdam (1523) by Hans Holbein the Younger

source

“We live, in fact, in a world starved for solitude, silence, and privacy: and therefore starved for meditation and true friendship”*…

… if then, even more so now. Ben Tarnoff takes off from Lowry Pressly‘s new book to ponder why privacy matters and why we have such trouble even thinking about how to protect it…

… Today, it is harder to keep one’s mind in place. Our thoughts leak through the sieve of our smartphones, where they join the great river of everyone else’s. The consequences, for both our personal and collective lives, are much discussed: How can we safeguard our privacy against state and corporate surveillance? Is Instagram making teen-agers depressed? Is our attention span shrinking?

There is no doubt that an omnipresent Internet connection, and the attendant computerization of everything, is inducing profound changes. Yet the conversation that has sprung up around these changes can sometimes feel a little predictable. The same themes and phrases tend to reappear. As the Internet and the companies that control it have become an object of permanent public concern, the concerns themselves have calcified into clichés. There is an algorithmic quality to our grievances with algorithmic life.

Lowry Pressly’s new book, “The Right to Oblivion: Privacy and the Good Life,” defies this pattern. It is a radiantly original contribution to a conversation gravely in need of new thinking. Pressly, who teaches political science at Stanford, takes up familiar fixations of tech discourse—privacy, mental health, civic strife—but puts them into such a new and surprising arrangement that they are nearly unrecognizable. The effect is like walking through your home town after a tornado: you recognize the buildings, but after some vigorous jumbling they have acquired a very different shape.

Pressly trained as a philosopher, and he has a philosopher’s fondness for sniffing out unspoken assumptions. He finds one that he considers fundamental to our networked era: “the idea that information has a natural existence in human affairs, and that there are no aspects of human life which cannot be translated somehow into data.” This belief, which he calls the “ideology of information,” has an obvious instrumental value to companies whose business models depend on the mass production of data, and to government agencies whose machinery of monitoring and repression rely on the same.

But Pressly also sees the ideology of information lurking in a less likely place—among privacy advocates trying to defend us from digital intrusions. This is because the standard view of privacy assumes there is “some information that already exists,” and what matters is keeping it out of the wrong hands. Such an assumption, for Pressly, is fatal. It “misses privacy’s true value and unwittingly aids the forces it takes itself to be resisting,” he writes. To be clear, Pressly is not opposed to reforms that would give us more power over our data—but it is a mistake “to think that this is what privacy is for.” “Privacy is valuable not because it empowers us to exercise control over our information,” he argues, “but because it protects against the creation of such information in the first place.”

If this idea sounds intriguing but exotic, you may be surprised to learn how common it once was. “A sense that privacy is fundamentally opposed to information has animated public moral discourse on the subject since the very beginning,” Pressly writes…

[Tarnoff recaps Pressly’s brief history of the technologies that changed our relationship to information, from Kodak through CCTV, to AI…]

… The reason that Pressly feels so strongly about imposing limits on datafication is not only because of the many ways that data can be used to damage us. It is also because, in his view, we lose something precious when we become information, regardless of how it is used. In the very moment when data are made, Pressly believes, a line is crossed. “Oblivion” is his word for what lies on the other side.

Oblivion is a realm of ambiguity and potential. It is fluid, formless, and opaque. A secret is an unknown that can become known. Oblivion, by contrast, is unknowable: it holds those varieties of human experience which are “essentially resistant to articulation and discovery.” It is also a place beyond “deliberate, rational control,” where we lose ourselves or, as Pressly puts it, “come apart.” Sex and sleep are two of the examples he provides. Both bring us into the “unaccountable regions of the self,” those depths at which our ego dissolves and about which it is difficult to speak in definite terms. Physical intimacy is hard to render in words—“The experience is deflated by description,” Pressly observes—and the same is notoriously true of the dreams we have while sleeping, which we struggle to narrate, or even to remember, on waking.

Oblivion is fragile, however. When it comes into contact with information, it disappears. This is why we need privacy: it is the protective barrier that keeps oblivion safe from information. Such protection insures that “one can actually enter into oblivion from time to time, and that it will form a reliably available part of the structure of one’s society.”

But why do we need to enter into oblivion from time to time, and what good does it do us? Pressly gives a long list of answers, drawn not only from the Victorians but also from the work of Michel Foucault, Roland Barthes, Gay Talese, Jorge Luis Borges, and Hannah Arendt. One is that oblivion is restorative: we come apart in order to come back together. (Sleep is a case in point; without a nightly suspension of our rational faculties, we go nuts.) Another is the notion that oblivion is integral to the possibility of personal evolution. “The main interest in life and work is to become someone else that you were not in the beginning,” Foucault writes. To do so, however, you must believe that the future can be different from the past—a belief that becomes harder to sustain when one is besieged by information, as the obsessive documentation of life makes it “more fixed, more factual, with less ambiguity and life-giving potentiality.” Oblivion, by setting aside a space for forgetting, offers a refuge from this “excess of memory,” and thus a standpoint from which to imagine alternative futures.

Oblivion is also essential for human dignity. Because we cannot be fully known, we cannot be fully instrumentalized. Immanuel Kant urged us to treat others as ends in themselves, not merely as means. For Pressly, our obscurities are precisely what endow us with a sense of value that exceeds our usefulness. This, in turn, helps assure us that life is worth living, and that our fellow human beings are worthy of our trust. “There can be no trust of any sort without some limits to knowledge,” Pressly writes…

… Psychoanalysis first emerged in the late nineteenth century, in parallel with the idea of privacy. This was a period when the boundary between public and private was being redrawn, not only with the onslaught of handheld cameras but also, more broadly, because of the dislocating forces of what historians call the Second Industrial Revolution. Urbanization pulled workers from the countryside and packed them into cities, while mass production meant they could buy (rather than make) most of what they needed. These developments weakened the institution of the family, which lost its primacy as people fled rural kin networks and the production of life’s necessities moved from the household to the factory.

In response, a new freedom appeared. For the first time, the historian Eli Zaretsky observes, “personal identity became a problem and a project for individuals.” If you didn’t have your family to tell you who you were, you had to figure it out yourself. Psychoanalysis helped the moderns to make sense of this question, and to try to arrive at an answer.

More than a century later, the situation looks different. If an earlier stage of capitalism laid the material foundations for a new experience of individuality, the present stage seems to be producing the opposite. In their taverns, theatres, and dance halls, the city dwellers of the Second Industrial Revolution created a culture of social and sexual experimentation. Today’s young people are lonely and sexless. At least part of the reason is the permanent connectivity that, as Pressly argues, conveys the feeling that “one’s time and attention—that is to say, one’s life—are not entirely one’s own.”

The modernist city promised anonymity, reinvention. The Internet is devoid of such pleasures. It is more like a village: a place where your identity is fixed. Online, we are the sum of what we have searched, clicked, liked, and bought. But there are futures beyond those predicted through statistical extrapolations from the present. In fact, the past is filled with the arrival of such futures: those blind corners when no amount of information could tell you what was coming. History has a habit of humbling its participants. Somewhere in its strange rhythms sits the lifelong work of making a life of one’s own…

We often want to keep some information to ourselves. But information itself may be the problem: “What Is Privacy For?” from @bentarnoff in @NewYorker. (Possible paywall; archived link here.)

Pair with the two (marvelous, provocative) documentary series from Adam Curtis and the BBC: The Century of the Self and HyperNormalisation, both of which are available on YouTube.

* C. S. Lewis

###

As we make room, we might send painfully-observant birthday greetings to Lenny Bruce; he was born on this date in 1925. A comedian, social critic, and satirist, he was ranked (in a 2017 Rolling Stone poll) the third best stand-up comic of all time– behind Richard Pryor and George Carlin, both of whom credited Bruce as an influence.

source

Written by (Roughly) Daily

October 13, 2024 at 1:00 am