(Roughly) Daily


“What Orwell feared were those who would ban books. What Huxley feared was that there would be no reason to ban a book because there would be no one who wanted to read one.”*…

A person lying on a couch, reading a book, in a black and white photograph.

In the 4th century BCE, Plato recounts (in the Phaedrus) Socrates’ thoughts on a “technology” that was then moving from specialized (administrative, commercial, religious) to broader (more literary and philosophical) use– writing. Socrates was not a fan. He worried that writing weakened the necessity (and thus, the power) of memory, and that it created the pretense of understanding, rather than real comprehension and mastery.

Still, of course, writing– and the reading that it enabled– became the dominant form of communication.

Today, reading (for anything other than business or formal study) is down. Way down. But not to worry, today’s champions of big tech argue: their streaming and AI will usher in a new golden age of learning and connectivity. Their critics, of course– in an echo of Socrates– suggest that they will do the exact opposite.

James Marriott (and here) puts the skeptic’s case…

… in the middle of the eighteenth century huge numbers of ordinary people began to read.

For the first couple of centuries after the invention of the printing press, reading remained largely an elite pursuit. But by the beginning of the 1700s, the expansion of education and an explosion of cheap books began to diffuse reading rapidly down through the middle classes and even into the lower ranks of society. People alive at the time understood that something momentous was going on. Suddenly it seemed that everyone was reading everywhere: men, women, children, the rich, the poor. Reading began to be described as a “fever”, an “epidemic”, a “craze”, a “madness”. As the historian Tim Blanning writes, “conservatives were appalled and progressives were delighted, that it was a habit that knew no social boundaries.”

This transformation is sometimes known as the “reading revolution”. It was an unprecedented democratisation of information; the greatest transfer of knowledge into the hands of ordinary men and women in history.

In Britain only 6,000 books were published in the first decade of the eighteenth century; in the last decade of the same century the number of new titles was in excess of 56,000. More than half a million new publications appeared in German over the course of the 1700s. The historian Simon Schama has gone so far as to write that “literacy rates in eighteenth century France were much higher than in the late twentieth century United States”.

Where readers had once read “intensively”, spending their lives reading and re-reading two or three books, the reading revolution popularised a new kind of “extensive” reading. People read everything they could get their hands on: newspapers, journals, history, philosophy, science, theology and literature. Books, pamphlets and periodicals poured off the presses.

It was an age of monumental works of thought and knowledge: the Encyclopédie, Samuel Johnson’s Dictionary of the English Language, Edward Gibbon’s Decline and Fall of the Roman Empire, Immanuel Kant’s Critique of Pure Reason. Radical new ideas about God, about history, about society, about politics, and even the whole purpose and meaning of life flooded through Europe.

Even more importantly, print changed how people thought.

The world of print is orderly, logical and rational. In books, knowledge is classified, comprehended, connected and put in its place. Books make arguments, propose theses, develop ideas. “To engage with the written word”, the media theorist Neil Postman wrote, “means to follow a line of thought, which requires considerable powers of classifying, inference-making and reasoning.”

As Postman pointed out, it is no accident that the growth of print culture in the eighteenth century was associated with the growing prestige of reason, hostility to superstition, the birth of capitalism, and the rapid development of science. Other historians have linked the eighteenth century explosion of literacy to the Enlightenment, the birth of human rights, the arrival of democracy and even the beginnings of the industrial revolution.

The world as we know it was forged in the reading revolution.

Now, we are living through the counter-revolution.

More than three hundred years after the reading revolution ushered in a new era of human knowledge, books are dying.

Numerous studies show that reading is in free-fall. Even the most pessimistic twentieth-century critics of the screen-age would have struggled to predict the scale of the present crisis.

In America, reading for pleasure has fallen by forty per cent in the last twenty years. In the UK, more than a third of adults say they have given up reading. The National Literacy Trust reports “shocking and dispiriting” falls in children’s reading, which is now at its lowest level on record. The publishing industry is in crisis: as the author Alexander Larman writes, “books that once would have sold in the tens, even hundreds, of thousands are now lucky to sell in the mid-four figures.”

Most remarkably, in late 2024 the OECD published a report which found that literacy levels were “declining or stagnating” in most developed countries. Once upon a time a social scientist confronted with statistics like these might have guessed the cause was a societal crisis like a war or the collapse of the education system.

What happened was the smartphone, which was widely adopted in developed countries in the mid-2010s. Those years will be remembered as a watershed in human history…

[Marriott explores the impact and some of its implications…]

… This draining away of culture, critical thinking and intelligence represents a tragic loss of human potential and human flourishing. It is also one of the major challenges facing modern societies. Our vast, interconnected, tolerant and technologically advanced civilisation is founded on the complex, rational kinds of thinking fostered by literacy.

As Walter Ong writes in his book Orality and Literacy, certain kinds of complex and logical thinking simply cannot be achieved without reading and writing. It is virtually impossible to develop a detailed and logical argument in spontaneous speech — you would get lost, lose your thread, contradict yourself, and confuse your audience trying to re-phrase ineptly expressed points…

The classicist Eric Havelock argued that the arrival of literacy in ancient Greece was the catalyst for the birth of philosophy. Once people had a means of pinning ideas down on the page to interrogate them, refine them and build on them, a whole new revolutionary way of analytic and abstract thinking was born — one that would go on to shape our entire civilisation. With the birth of writing received ways of thinking could be challenged and improved. This was our species’ cognitive liberation…

Not only philosophy but the entire intellectual infrastructure of modern civilisation depends on the kinds of complex thinking inseparable from reading and writing: serious historical writing, scientific theorems, detailed policy proposals and the kinds of rigorous and dispassionate political debate conducted in books and magazines.

These forms of advanced thought provide the intellectual underpinnings of modernity. If our world feels unstable at the moment — like the ground is shifting beneath us — it is because those underpinnings are falling to pieces underneath our feet…

[Marriott explores what a return to an “oral” society might mean, then contemplates what he fears will be “the end of creativity”– If the literate world was characterised by complexity and innovation, the post-literate world is characterised by simplicity, ignorance and stagnation. He turns then to its impact on civil society…]

… Amusingly, from the perspective of the present, the reading revolution of the eighteenth century was accompanied not only by excitement but by a moral panic.

“No lover of tobacco or coffee, no wine drinker or lover of games, can be as addicted to their pipe, bottle, games or coffee-table as those many hungry readers are to their reading habit”, thundered one German clergyman.

Richard Steele feared that “novels raise expectations which the ordinary course of life can never realise”. Others fretted that reading “excites the imagination too much, and fatigues the heart”.

It is easy to laugh at these anxieties. We have spent our whole lives hearing how virtuous and sensible it is to read books. How could reading be dangerous?

But in hindsight, these conservative moralists were right to worry. The rapid expansion of literacy helped to destroy the orderly, hierarchical, and profoundly socially unequal world they cherished.

The reading revolution was a catastrophe for the ultra-privileged and exploitative aristocrats of the European aristocratic ancien regime — the old autocratic system of government with almighty kings at the top, lords and clergy underneath and peasants squirming at the very bottom.

Ignorance was a foundation stone of feudal Europe. The vast inequalities of the aristocratic order were partly able to be sustained because the population had no way to find out about the scale of the corruption, abuses and inefficiencies of their governments…

… you do not have to believe print is a perfect and incorruptible system of communication to accept it is also almost certainly a necessary pre-condition of democracy.

In Amusing Ourselves to Death Neil Postman argues that democracy and print are virtually inseparable. An effective democracy pre-supposes a reasonably informed and somewhat critical citizenry capable of understanding and debating the issues of the day in detail and at length.

Democracy draws immeasurable strength from print — the old dying world of books, newspapers and magazines — with its tendency to foster deep knowledge, logical argument, critical thought, objectivity and dispassionate engagement. In this environment, ordinary people have the tools to understand their rulers, to criticise them and, perhaps, to change them…

… Politics in the age of short form video favours heightened emotion, ignorance and unevidenced assertions. Such circumstances are highly propitious for charismatic charlatans. Inevitably, parties and politicians hostile to democracy are flourishing in the post-literate world. TikTok usage correlates with increased vote share for populist parties and the far right…

… The big tech companies like to see themselves as invested in spreading knowledge and curiosity. In fact in order to survive they must promote stupidity. The tech oligarchs have just as much of a stake in the ignorance of the population as the most reactionary feudal autocrat. Dumb rage and partisan thinking keep us glued to our phones.

And where the old European monarchies had to (often ineptly) try to censor dangerously critical material, the big tech companies ensure our ignorance much more effectively by flooding our culture with rage, distraction and irrelevance.

These companies are actively working to destroy human enlightenment and usher in a new dark age.

The screen revolution will shape our politics as profoundly as the reading revolution of the eighteenth century.

Without the knowledge and without the critical thinking skills instilled by print, many of the citizens of modern democracies find themselves as helpless and as credulous as medieval peasants — moved by irrational appeals and prone to mob thinking. The world after print increasingly resembles the world before print.

Superstitions and anti-democratic thinking flourish. Scholarship in our universities is shaped by rigid partisanship not by tolerance and curiosity. Our art and literature is cruder and more simplistic…

… As power, wealth and knowledge concentrate at the top of society, an angry, divided and uninformed public lacks a way to understand or analyse or criticise or change what is going on. Instead more and more people are impressed by the kinds of highly emotional charismatic and mystical appeals that were the foundation of power in the age before widespread literacy.

Just as the advent of print dealt the final death blow to the decaying world of feudalism, so the screen is destroying the world of liberal democracy.

As tech companies wipe out literacy and middle class jobs, we may find ourselves in a second feudal age. Or it may be that we are entering a political era beyond our imagining.

Whatever happens, we are already seeing the world we once knew melt away. Nothing will ever be the same again.

Welcome to the post-literate society…

The end of civilization? A sobering assessment of “The dawn of the post-literate society” from @j-amesmarriott.bsky.social. Eminently worth reading in full.

FWIW, your correspondent would note that while Socrates was surely right that writing diminished the power of memory and at least partially right that text allowed its readers to appear more knowledgeable about things than perhaps they were, it was the development of writing that provided the foundation on which the print revolution Marriott celebrates was able to emerge.

I’d also note that the earliest days of printing (before the 18th century “revolution in reading”) were pretty fraught: from the publication of Luther’s 95 Theses (and the religious and civil turmoil– both ideological and “bloody”– they occasioned) on through more than a century of conflict that included the Thirty Years War, the English Civil War, and ultimately, the American and French Revolutions– indeed, also the American Civil War. As Ada Palmer notes, “Whenever a new information technology comes along, and this includes the printing press, among the very first groups to be ‘loud’ in it are the people who were silenced in the earlier system, which means radical voices”… very like our current situation, as Marriott describes it.

Again FWIW, I find Marriott’s take all-too-resonant with my own (geezer’s) sense of loss (as the epistemological and civic superstructure in which I came of age dissolves). I find his pessimism-unto-despair much more plausible than I’d like. But I hold onto the hope that in this transition– as in the transitions from oral to writing, and then to printing/publishing– we will, as societies, find ways to manage the chaos and establish new foundations for reason, creativity, and coherent, constructive civic life.

It starts with us wanting– and working hard– to find that new, more solid ground.

* Neil Postman, Amusing Ourselves to Death

###

As we buckle up, we might spare a thought for George Grenville; he died on this date in 1770. An English politician who served as Prime Minister in the early years of the reign of George III, Grenville faced the primary challenge of solving the problem of the massive debt resulting from the Seven Years’ War. A centerpiece of his effort was a policy of taxing the American colonies more heavily, starting with his Sugar Act of 1764 and the Stamp Act of 1765– which began the train of events (much discussed in printed material of the time) that led to the American Revolution.

Portrait of George Grenville, an English politician and Prime Minister, seated and dressed in 18th-century attire, holding a document, with books in the background.

source

Written by (Roughly) Daily

November 13, 2025 at 1:00 am

“Whatever happens to musicians happens to everybody”*…

Further, in a fashion, to yesterday’s post (and for that matter, to “Nature doesn’t feel compelled to stick to a mathematically precise algorithm; in fact, nature probably can’t stick to an algorithm.”), a provocative proposal from Justin Patrick Moore…

We don’t have enough Dada in this world of too much data. Something is needed to break-through the over-curated simulacrum that is the online world in order to let in a bit of non-artificial light. One way to make a break is through the deliberate cultivation of the glitch.

The exact etymology of the word glitch is not known, though it may derive from the Yiddish “glitsh,” which means a “slippery place.” In the mid-twentieth century the word first started showing up in technical texts, referring to sudden surges of voltage within an electrical circuit that caused it to overload. Today a glitch is any kind of malfunction in hardware or error in software.

In the 1990s glitch music became a kind of sub-genre of electronic music found at the meeting points of the avant-garde, noise, and more popular forms. This type of music, and the methods surrounding it, including circuit-bending, can provide a window, cracked as it is, for looking out at adjacent electronic worlds, including the internet…

[Moore explains circuit-bending and its history…]

… Digital natives need chance like a body needs water. Algorithms have taken the fun out of what was once unplanned and unstructured; internet surfing has been made accident proof, as if it were run by insurance agents and safety specialists. Spots of possible slippage are mopped up in favor of putting forth pre-chewed opinions and junk food clickbait. A similar environment prevails for electronic musicians. The hardware and software being made more often than not makes it difficult to fail. Sound libraries, instrument and effect presets, samplers pre-loaded with perfect pulsing patterns, make it hard to even play in the wrong pitch. These fully loaded tools make it possible to become a producer of music in a matter of minutes.

Preconfigured musical gear may make it easier to get grooving right off the bat, but the gift of instant gratification steals the sense of accomplishment and intimacy that comes from knowing every inch and crevice of an instrument. And while on first meeting, a run-in with a run-of-the-mill modular setup might cause sparks to fly, the slow burn of excitable electrons grows even further from long association. The nuance and subtlety available to those who explore in depth comes across in the very sounds. Circuit-bending is one way to go into those depths, down to the wire.

Prefab music is low risk music. Something might be made from it that could be used as a backdrop to a car commercial or fit into a DJ set at a dance club, as filler, but without investigating the underlying assumptions of a piece of gear, or software, the things that come out of it will tend to not have the rewards associated with riskier behavior. Disfigured musical gear gives the gift of decomposition and recomposition to electronic composers. With their materials mangled and mutilated, the gear becomes a mutt, with all the natural advantages over thoroughbred, store bought, off-the-shelf kit. The system may be less predictable, but that is the point…
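Circuit-bending works on physical gear, but a loose software analogue (sometimes called databending) makes the same point: deliberately corrupt the material and keep whatever accidents result. The sketch below is my own illustration, not anything from Moore’s essay; it assumes audio held as a plain Python list of signed 16-bit samples and flips a random bit in a small fraction of them.

```python
import random

def glitch(samples, flip_rate=0.01, seed=None):
    """Deliberately corrupt a buffer of signed 16-bit audio samples.

    For a small fraction of samples, flip one randomly chosen bit --
    an intentional, unpredictable error in the spirit of the glitch.
    """
    rng = random.Random(seed)
    out = []
    for s in samples:
        if rng.random() < flip_rate:
            s ^= 1 << rng.randrange(16)          # flip one of the 16 bits
            s = ((s + 32768) % 65536) - 32768    # wrap back into signed 16-bit range
        out.append(s)
    return out

# A tame, predictable ramp goes in; something less predictable comes out.
clean = list(range(-1000, 1000))
bent = glitch(clean, flip_rate=0.05, seed=1)
```

Nothing about this is prescriptive; the point is only that, with preset-laden tools, this kind of slippage has to be invited in on purpose.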

[Moore unpacks examples, and explains how the solution was itself absorbed into the problem…]

Kim Cascone pointed out in his inspired essay The Aesthetics of Failure [here] that glitch is just the latest way of investigating the creative misuse of technology. Yet as the internet grew, the process by which those techniques spread happened much faster than in previous decades. In sharing the techniques of glitch, some of the imaginative grain within the music was lost as it became just another commodity. With the widespread availability of digital music software, “the medium is no longer the message in glitch music: the tool has become the message.”

Failure had reached a point of failure.

If our own thinking can be glitched, then perhaps it is still possible to create systems that embrace the slippage. If we don’t want the “tool to become the message,” then a third element beyond the digital must be added into the mix.

The technopoly runs on data. Is there a way to make it more Dada? The artists of the Dada movement rejected many things, but logic and reason were chief among them. Where was the logic in the atrocities of World War I? The founders of the movement had lived through the war and in reaction against it, sought to elevate nonsense and the irrational above cruel, cold logic.

In our own time reason and logic have failed to deliver the utopia of technology as promised and promoted by Big Tech’s advertisers and PR specialists. It can seem that humanity’s dystopian nightmares are what are actually manifesting. Perhaps part of technology’s failure is due to the fact that the digital world is built on binaries.

Logic circuits or gates are the brick and mortar of digital systems. They are electronic circuits that have one or more inputs, but only one output. Logic gates are the switches that turn ON or OFF depending on what the user does. A logic gate turns ON when a certain condition is true, and OFF when the condition is false. A logic gate is able to check whether or not the information it gets follows a certain rule, and the output is thus determined.

There are several types of logic gates, but the three most common are the NOT gate, the AND gate, and the OR gate. The NOT gate is the simplest. Its sole function is to take an input that is either ON or OFF and give it back as the opposite, what the original signal is NOT. The AND circuit requires two inputs. It can only turn ON when both inputs are ON. If only one input is ON it turns OFF, and when both inputs are OFF, it turns OFF.

The OR circuit also requires two inputs. It needs at least one input to be ON for it to be ON, and is also still ON when both inputs are ON; it is only OFF when both inputs are OFF.
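To make the binary picture concrete, here is a minimal Python sketch of the three gates just described; the function names and the encoding of ON/OFF as True/False are illustrative choices of mine, not anything from the essay.

```python
def not_gate(a: bool) -> bool:
    # Output is simply the opposite of the single input.
    return not a

def and_gate(a: bool, b: bool) -> bool:
    # ON only when both inputs are ON.
    return a and b

def or_gate(a: bool, b: bool) -> bool:
    # ON when at least one input is ON; OFF only when both are OFF.
    return a or b

# Truth tables for the two-input gates:
for a in (False, True):
    for b in (False, True):
        print(f"a={a!s:5} b={b!s:5}  AND={and_gate(a, b)!s:5}  OR={or_gate(a, b)!s:5}")
```

Every output is forced to be one of exactly two values, which is the constraint the next paragraph takes issue with.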

While variations from these basic circuits have been used to build complex systems, they still have at their core the binary which undergirds the entire technosphere. It is rather difficult for the unknown to break through when only two outcomes are possible. A third position between ON and OFF is never arrived at. This would require ternary logic, and as far as I know, a ternary computer has yet to be built.
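Whatever the state of ternary hardware, a third truth value is easy to model in software. Here is a small sketch of Kleene-style three-valued logic, where MAYBE stands in for the “third position between ON and OFF”; the value names and the choice of Kleene logic are my own illustrative assumptions, not something from the essay.

```python
from enum import Enum

class T(Enum):
    # Three truth values instead of two.
    OFF = 0
    MAYBE = 1
    ON = 2

def t_not(a: T) -> T:
    # ON and OFF swap; the middle value stays in the middle.
    return {T.OFF: T.ON, T.MAYBE: T.MAYBE, T.ON: T.OFF}[a]

def t_and(a: T, b: T) -> T:
    # Kleene AND: take the "weaker" of the two values.
    return T(min(a.value, b.value))

def t_or(a: T, b: T) -> T:
    # Kleene OR: take the "stronger" of the two values.
    return T(max(a.value, b.value))

print(t_and(T.ON, T.MAYBE))  # T.MAYBE -- an outcome the two-valued gates cannot express
print(t_or(T.OFF, T.MAYBE))  # T.MAYBE
```

Combining ON with MAYBE yields MAYBE rather than being forced to one of two poles, which is the kind of opening for the unknown the essay is after.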

In lieu of a ternary computer, a third element needs to be added to digital systems: that is the human component. This is also where I think modes of artistic creation in the spirit of Dada can help. By moving away from pure logic and reason, by letting a bit of nonsense or irrationality slip through, the human tendency to also think in binaries can be glitched.

So much of the creative process is automated when working with digital tools, but it has little in common with the methods of automatism that came out of the Surrealist milieu. The various methods of automatism developed by the Surrealists put a person in touch with the unknown, whether it be the unconscious or from beyond the fragile borders of this world. Bringing these techniques back into play could give back a sense of humanity to the sounds of dead electric emitted from programmed machines.

Automatism came in part from the method of automatic writing or spirit writing, when mediums and others of their psychic ilk were said to be in touch with disembodied spirits. The writing came through them from the other side. For the Surrealists tapping into these forces became a source of creativity. The results were often startling as they bypassed logic and reason.

To the point of this essay, in artistic creation, logic is rarely the principle that needs to be abided by. Automation needs to be bypassed in favor of automatism. In electronic music, strategies and interventions need to be used to work around and supplant the built-in binary biases of the tools, otherwise the music being made on them ends up just sounding like a commercial for the tool…

[Moore offers examples from Ben Chasney and Max Ernst…]

Whatever the source may be, if we are to glitch the circuit, we need to open ourselves up to the slippage that comes in from the unknown. Otherwise people might as well just let AIs design the music for them. And while generative music systems can be built that produce startling beauty, such as Wotja and Brian Eno’s Bloom, they leave too little for unintended influences from outside the confines of the system. For that a human really does have to put themselves into line with the flow of the circuit path. 

To create something new, we need to become conduits, connect and plug into an outside source…

Putting the Dada into data: “Glitching the Circuit,” from @igloomag.bsky.social.

* Bruce Sterling (@bruces.mastodon.social.ap.brid.gy)

###

As we explore, we might recall that it was on this date in 1977 that Iggy Pop, former frontman of The Stooges, released his debut solo album, The Idiot. It was produced by Pop’s friend David Bowie, who also wrote much of the album’s music (to which Pop added most of the lyrics). Described by Pop as “a cross between James Brown and Kraftwerk”, The Idiot marked a departure from the proto-punk of the Stooges to a more subdued, mechanical sound with electronic overtones.

source

Written by (Roughly) Daily

March 18, 2025 at 1:00 am

“Every great advance in science has issued from a new audacity of the imagination”*…

Itai Yanai and Martin Lercher on the importance of interdisciplinarity and creativity in science…

The hypothesis-testing mode of science, which François Jacob called “day science,” operates within the confines of a particular scientific field. As highly specialized experts, we confidently and safely follow the protocols of our paradigms and research programs. But there is another side of science, which Jacob called “night science”: the much less structured process by which new ideas arise and questions and hypotheses are generated. While day science is compartmentalized, night science is truly interdisciplinary. You may bring an answer from your home field to another discipline, or conversely, venturing into another field may let you discover a route towards answering a research question in your main discipline. To be most creative, we may be best off cultivating interests in many areas, much like Renaissance thinkers such as Leonardo da Vinci or Galileo Galilei. But this creativity-enhancing interdisciplinarity comes at a price we may call the “expert’s dilemma”: with your loss of status as a highly focused expert comes a loss of credibility, making it harder to get your work accepted by your peers. To resolve the dilemma, we must find our own balance between disciplinary day science expertise and interdisciplinary night science creativity…

Eminently worth reading in full: “Renaissance minds in 21st century science,” from @ItaiYanai and @MartinJLercher.

See also: “Night Science”

And for more: see their project’s home page and listen to their podcast.

Apposite: “8 lessons on lifelong learning from an astrophysicist,” from Ethan Siegel.

* John Dewey

###

As we find a balance, we might send easily-reproducible birthday greetings to a man who was moved by necessity to cross disciplinary boundaries, Alois Senefelder; he was born on this date in 1771. A playwright and actor, he was having trouble getting his plays printed; he needed a less expensive and more efficient printing alternative to relief-printed, hand-set type or etched plates. So he invented the technique we call lithography– the biggest revolution in the printing industry since Gutenberg’s movable type.

The principle is simple: oil-based printing ink and water repel each other. The image is drawn on a stone (Bavarian limestone for Senefelder) with greasy crayon, after which the stone is soaked in water, which is absorbed into the parts of the stone not covered by the greasy crayon. The ink is rolled onto the stone. The image areas of the stone accept ink and undrawn areas reject it. Finally, a piece of paper is pressed onto the stone, and the ink transfers onto the paper from the stone.

Senefelder called the technique “stone printing” or “chemical printing,” but the French name “lithography” became more widely adopted. Today photo lithography is used to print magazines and books, but the original process of drawing by hand on litho stones still exists in the fine art world.

Lithograph of Senefelder (source)

“We live, in fact, in a world starved for solitude, silence, and privacy: and therefore starved for meditation and true friendship”*…

… if then, even more so now. Ben Tarnoff takes off from Lowry Pressly‘s new book to ponder why privacy matters and why we have such trouble even thinking about how to protect it…

… Today, it is harder to keep one’s mind in place. Our thoughts leak through the sieve of our smartphones, where they join the great river of everyone else’s. The consequences, for both our personal and collective lives, are much discussed: How can we safeguard our privacy against state and corporate surveillance? Is Instagram making teen-agers depressed? Is our attention span shrinking?

There is no doubt that an omnipresent Internet connection, and the attendant computerization of everything, is inducing profound changes. Yet the conversation that has sprung up around these changes can sometimes feel a little predictable. The same themes and phrases tend to reappear. As the Internet and the companies that control it have become an object of permanent public concern, the concerns themselves have calcified into clichés. There is an algorithmic quality to our grievances with algorithmic life.

Lowry Pressly’s new book, “The Right to Oblivion: Privacy and the Good Life,” defies this pattern. It is a radiantly original contribution to a conversation gravely in need of new thinking. Pressly, who teaches political science at Stanford, takes up familiar fixations of tech discourse—privacy, mental health, civic strife—but puts them into such a new and surprising arrangement that they are nearly unrecognizable. The effect is like walking through your home town after a tornado: you recognize the buildings, but after some vigorous jumbling they have acquired a very different shape.

Pressly trained as a philosopher, and he has a philosopher’s fondness for sniffing out unspoken assumptions. He finds one that he considers fundamental to our networked era: “the idea that information has a natural existence in human affairs, and that there are no aspects of human life which cannot be translated somehow into data.” This belief, which he calls the “ideology of information,” has an obvious instrumental value to companies whose business models depend on the mass production of data, and to government agencies whose machinery of monitoring and repression relies on the same.

But Pressly also sees the ideology of information lurking in a less likely place—among privacy advocates trying to defend us from digital intrusions. This is because the standard view of privacy assumes there is “some information that already exists,” and what matters is keeping it out of the wrong hands. Such an assumption, for Pressly, is fatal. It “misses privacy’s true value and unwittingly aids the forces it takes itself to be resisting,” he writes. To be clear, Pressly is not opposed to reforms that would give us more power over our data—but it is a mistake “to think that this is what privacy is for.” “Privacy is valuable not because it empowers us to exercise control over our information,” he argues, “but because it protects against the creation of such information in the first place.”

If this idea sounds intriguing but exotic, you may be surprised to learn how common it once was. “A sense that privacy is fundamentally opposed to information has animated public moral discourse on the subject since the very beginning,” Pressly writes…

[Tarnoff recaps Pressly’s brief history of the technologies that changed our relationship to information, from Kodak through CCTV, to AI…]

… The reason that Pressly feels so strongly about imposing limits on datafication is not only because of the many ways that data can be used to damage us. It is also because, in his view, we lose something precious when we become information, regardless of how it is used. In the very moment when data are made, Pressly believes, a line is crossed. “Oblivion” is his word for what lies on the other side.

Oblivion is a realm of ambiguity and potential. It is fluid, formless, and opaque. A secret is an unknown that can become known. Oblivion, by contrast, is unknowable: it holds those varieties of human experience which are “essentially resistant to articulation and discovery.” It is also a place beyond “deliberate, rational control,” where we lose ourselves or, as Pressly puts it, “come apart.” Sex and sleep are two of the examples he provides. Both bring us into the “unaccountable regions of the self,” those depths at which our ego dissolves and about which it is difficult to speak in definite terms. Physical intimacy is hard to render in words—“The experience is deflated by description,” Pressly observes—and the same is notoriously true of the dreams we have while sleeping, which we struggle to narrate, or even to remember, on waking.

Oblivion is fragile, however. When it comes into contact with information, it disappears. This is why we need privacy: it is the protective barrier that keeps oblivion safe from information. Such protection insures that “one can actually enter into oblivion from time to time, and that it will form a reliably available part of the structure of one’s society.”

But why do we need to enter into oblivion from time to time, and what good does it do us? Pressly gives a long list of answers, drawn not only from the Victorians but also from the work of Michel Foucault, Roland Barthes, Gay Talese, Jorge Luis Borges, and Hannah Arendt. One is that oblivion is restorative: we come apart in order to come back together. (Sleep is a case in point; without a nightly suspension of our rational faculties, we go nuts.) Another is the notion that oblivion is integral to the possibility of personal evolution. “The main interest in life and work is to become someone else that you were not in the beginning,” Foucault writes. To do so, however, you must believe that the future can be different from the past—a belief that becomes harder to sustain when one is besieged by information, as the obsessive documentation of life makes it “more fixed, more factual, with less ambiguity and life-giving potentiality.” Oblivion, by setting aside a space for forgetting, offers a refuge from this “excess of memory,” and thus a standpoint from which to imagine alternative futures.

Oblivion is also essential for human dignity. Because we cannot be fully known, we cannot be fully instrumentalized. Immanuel Kant urged us to treat others as ends in themselves, not merely as means. For Pressly, our obscurities are precisely what endow us with a sense of value that exceeds our usefulness. This, in turn, helps assure us that life is worth living, and that our fellow human beings are worthy of our trust. “There can be no trust of any sort without some limits to knowledge,” Pressly writes…

… Psychoanalysis first emerged in the late nineteenth century, in parallel with the idea of privacy. This was a period when the boundary between public and private was being redrawn, not only with the onslaught of handheld cameras but also, more broadly, because of the dislocating forces of what historians call the Second Industrial Revolution. Urbanization pulled workers from the countryside and packed them into cities, while mass production meant they could buy (rather than make) most of what they needed. These developments weakened the institution of the family, which lost its primacy as people fled rural kin networks and the production of life’s necessities moved from the household to the factory.

In response, a new freedom appeared. For the first time, the historian Eli Zaretsky observes, “personal identity became a problem and a project for individuals.” If you didn’t have your family to tell you who you were, you had to figure it out yourself. Psychoanalysis helped the moderns to make sense of this question, and to try to arrive at an answer.

More than a century later, the situation looks different. If an earlier stage of capitalism laid the material foundations for a new experience of individuality, the present stage seems to be producing the opposite. In their taverns, theatres, and dance halls, the city dwellers of the Second Industrial Revolution created a culture of social and sexual experimentation. Today’s young people are lonely and sexless. At least part of the reason is the permanent connectivity that, as Pressly argues, conveys the feeling that “one’s time and attention—that is to say, one’s life—are not entirely one’s own.”

The modernist city promised anonymity, reinvention. The Internet is devoid of such pleasures. It is more like a village: a place where your identity is fixed. Online, we are the sum of what we have searched, clicked, liked, and bought. But there are futures beyond those predicted through statistical extrapolations from the present. In fact, the past is filled with the arrival of such futures: those blind corners when no amount of information could tell you what was coming. History has a habit of humbling its participants. Somewhere in its strange rhythms sits the lifelong work of making a life of one’s own…

We often want to keep some information to ourselves. But information itself may be the problem: “What Is Privacy For?” from @bentarnoff in @NewYorker. (Possible paywall; archived link here.)

Pair with the two (marvelous, provocative) documentary series from Adam Curtis and the BBC: The Century of the Self and HyperNormalisation, both of which are available on YouTube.

* C. S. Lewis

###

As we make room, we might send painfully-observant birthday greetings to Lenny Bruce; he was born on this date in 1925. A comedian, social critic, and satirist, he was ranked (in a 2017 Rolling Stone poll) the third best stand-up comic of all time– behind Richard Pryor and George Carlin, both of whom credited Bruce as an influence.

source

Written by (Roughly) Daily

October 13, 2024 at 1:00 am

“I failed in some subjects in exams, but my friend passed in all. Now he is an engineer in Microsoft and I am the owner of Microsoft.”*…

Excerpt from the scroll Viewing the Pass Lists, traditionally attributed to Qiu Ying (1494-1552)

And that, Yasheng Huang argues, is not something likely to happen in China, for a reason that dates back to the 6th century…

On 7 and 8 June 2023, close to 13 million high-school students in China sat for the world’s most gruelling college entrance exam. ‘Imagine,’ wrote a Singapore journalist, ‘the SAT, ACT, and all of your AP tests rolled into two days. That’s Gao Kao, or “higher education exam”.’ In 2023, almost 2.6 million applied to sit China’s civil service exam to compete for only 37,100 slots.

Gao Kao and China’s civil service exam trace their origin to, and are modelled on, an ancient Chinese institution, Keju, the imperial civil service exam established by the Sui Dynasty (581-618). It can be translated as ‘subject recommendation.’ Toward the end of its reign, the Qing dynasty (1644-1911) abolished it in 1905 as part of its effort to reform and modernize the Chinese system. Until then, Keju had been the principal recruitment route for imperial bureaucracy. Keju reached its apex during the Ming dynasty (1368-1644). All the prime ministers but one came through the Keju route and many of them were ranked at the very top in their exam cohort…

Much of the academic literature focuses on the meritocracy of Keju. The path-breaking book in this genre is Ping-ti Ho’s The Ladder of Success in Imperial China (1962). One of his observations is eye catching: more than half of those who obtained the Juren degree were first generation: ie, none of their ancestors had ever attained a Juren status. (Juren was, at the time, the first degree granted in the three-tiered hierarchy of Keju.) More recent literature demonstrates the political effects of Keju. In 1905, the Qing dynasty abolished Keju, dashing the aspirations of millions and sparking regional rebellions that eventually toppled China’s last imperial regime in 1911.

The political dimension of Keju goes far beyond its meritocracy and its connection to the 1911 republican revolution. For an institution that had such deep penetration, both cross-sectionally in society and across time in history, Keju was all encompassing, laying claims to the time, effort and cognitive investment of a significant swathe of the male Chinese population. It was a state institution designed to augment the state’s own power and capabilities. Directly, the state monopolised the very best human capital; indirectly, the state deprived society of access to talent and pre-empted organised religion, commerce and the intelligentsia. Keju anchored Chinese autocracy.

The impact of Keju is still felt today, not only in the form and practice of Gao Kao and the civil service exam but also because Keju incubated values and work ethics. Today, Chinese minds still bear its imprint. For one, Keju elevated the value of education and we see this effect today. A 2020 study shows that, for every doubling of successful Keju candidates per 10,000 of the population in the Ming-Qing period, there was a 6.9 per cent increase in years of schooling in 2010. The Keju exams loom as part of China’s human capital formation today, but they also cultivated and imposed the values of deference to authority and collectivism that the Chinese Communist Party has reaped richly for its rule and legitimacy…

An ultimate autocracy is one that reigns without society. Society shackles the state in many ways. One is ex ante: it checks and balances the actions of the state. The other is ex post. A strong society provides an outside option to those inside the state. Sometimes, this is derisively described as ‘a revolving door’, but it may also have the positive function of checking the power of the state. State functionaries can object to state actions by voting with their feet, as many US civil servants did during the Donald Trump administration, and thereby drain the state of the valuable human capital it needs to function and operate. A strong society raises the opportunity costs for the state to recruit human capital but such a receptor function of society has never existed at scale in imperial China nor today, thanks – in large part, I would argue – to Keju.

Keju was so precocious that it pre-empted and displaced an emergent society. Meritocracy empowered the Chinese state at a time when society was still at an embryonic stage. Massive resources and administrative manpower were poured into Keju such that it completely eclipsed all other channels of upward mobility that could have emerged. In that sense, the celebration by many of Keju’s meritocracy misses the bigger picture of Chinese history. It is a view of a tree rather than of a forest…

…Its impressive bureaucratic mobility demolished all other mobility channels and possibilities. Keju was an anti-mobility mobility channel. It packed all the upward mobility within one channel – that of the state. Society was crowded out, and over time, due to its deficient access to quality human capital, it atrophied. This is the root of the power of Chinese autocracy and, I would argue, it is a historical development that is unique to China and explains the awesome power of Chinese autocracy…

There was, however, a massive operational advantage to the Neo-Confucianist curriculum: it standardised everything. Standardisation abhors nuance and the evaluations became more straightforward as the baseline comparison was more clearly delineated. There was objectivity, even if the objectivity was a manufactured artefact. The Chinese invented the modern state and meritocracy, but above all the Chinese invented specialised standardised testing – the memorisation, cognitive inclination and frame of references of an exceedingly narrow ideology.

Ming standardised Keju further: it enforced a highly scripted essay format, known as the ‘eight-legged essay’, or baguwen in Chinese (八股文), to which every Keju candidate had to adhere. A ‘leg’ here refers to each section of an essay, with a Keju essay requiring eight sections: 1) breaking open the topic; 2) receiving the topic; 3) beginning the discussion; 4) the initial leg; 5) the transition leg; 6) the middle leg; 7) the later leg; and 8) conclusion. The eight-legged essay fixed more than the aggregate structure of exposition. The specifications were granular and detailed. For example, the number of phrases was specified in each of the sections and the entire essay required expressions in paired sentences – a minimum of six paired sentences, up to a maximum of 12. The key contribution of the eight-legged essay is that it packed information into a pre-set presentational format.

Standardisation was designed to scale the Keju system and it succeeded brilliantly in that regard, but it had a devastating effect on expositional freedom and human creativity. All elements of subjectivity and judgment were taken out. In his book Traditional Government in Imperial China (1982), the historian Ch’ien Mu describes the ‘eight-legged essay’ as ‘the greatest destroyer of human talent.’…

In his book The WEIRDest People in the World (2020), Joseph Henrich posited that the West prospered because of its early lead in literacy. Yet the substantial Keju literacy produced none of the liberalising effects on Chinese ideas, economy or society. The literacy that Henrich had in mind was a particular kind of literacy – Protestant literacy – and the contrast with Keju literacy could not have been sharper. Keju literacy was drilled and practised in classical and highly stratified Chinese, the language of the imperial court rather than the language of the masses, in sharp contrast to Protestant literacy. Protestant literacy empowered personal agency by embracing and spreading vernaculars of the masses. Henrich’s liberalising ‘WEIRD’ effect – Western, educated, industrialised, rich and democratic – was a byproduct of Protestant literacy. It is no accident that Keju literacy produced an opposite effect…

Not everyone sees the Western/WEIRD definition of creativity and innovation as the only important one (c.f., here and here), nor agrees that China is as lacking in what Westerners call creativity and innovation (c.f., here— possible soft paywall, and here). Still, Huang’s essay on Keju, China’s incredibly difficult civil service test, and how it strengthened the state at the cost of freedom and creativity, is eminently worthy of reading in full: “The exam that broke society,” from @YashengHuang in @aeonmag.

And for the amazing (and amusing) story of how the Keju was instrumental in the introduction of Catholicism into China, see Jonathan Spence’s wonderful The Memory Palace of Matteo Ricci.

* Bill Gates

###

As we study, we might recall that it was on this date in 4004 BCE that the Universe was created… as per calculations by Archbishop James Ussher in the mid-17th century.

When Clarence Darrow prepared his famous examination of William Jennings Bryan in the Scopes trial [see here], he chose to focus primarily on a chronology of Biblical events prepared by a seventeenth-century Irish bishop, James Ussher. American fundamentalists in 1925 found—and generally accepted as accurate—Ussher’s careful calculation of dates, going all the way back to Creation, in the margins of their family Bibles.  (In fact, until the 1970s, the Bibles placed in nearly every hotel room by the Gideon Society carried his chronology.)  The King James Version of the Bible introduced into evidence by the prosecution in Dayton contained Ussher’s famous chronology, and Bryan more than once would be forced to resort to the bishop’s dates as he tried to respond to Darrow’s questions.

“Bishop James Ussher Sets the Date for Creation”
Ussher

source