Posts Tagged ‘creativity’
“I failed in some subjects in exams, but my friend passed in all. Now he is an engineer in Microsoft and I am the owner of Microsoft.”*…
And that, Yasheng Huang argues, is not something likely to happen in China, for a reason that dates back to the 6th century…
On 7 and 8 June 2023, close to 13 million high-school students in China sat for the world’s most gruelling college entrance exam. ‘Imagine,’ wrote a Singapore journalist, ‘the SAT, ACT, and all of your AP tests rolled into two days. That’s Gao Kao, or “higher education exam”.’ In 2023, almost 2.6 million applied to sit China’s civil service exam to compete for only 37,100 slots.
Gao Kao and China’s civil service exam trace their origin to, and are modelled on, an ancient Chinese institution, Keju, the imperial civil service exam established by the Sui Dynasty (581-618). It can be translated as ‘subject recommendation.’ Toward the end of its reign, the Qing dynasty (1644-1911) abolished it in 1905 as part of its effort to reform and modernize the Chinese system. Until then, Keju had been the principal recruitment route for imperial bureaucracy. Keju reached its apex during the Ming dynasty (1368-1644). All the prime ministers but one came through the Keju route and many of them were ranked at the very top in their exam cohort…
Much of the academic literature focuses on the meritocracy of Keju. The path-breaking book in this genre is Ping-ti Ho’s The Ladder of Success in Imperial China (1962). One of his observations is eye-catching: more than half of those who obtained the Juren degree were first generation – ie, none of their ancestors had ever attained Juren status. (Juren was, at the time, the first degree granted in the three-tiered hierarchy of Keju.) More recent literature demonstrates the political effects of Keju. In 1905, the Qing dynasty abolished Keju, dashing the aspirations of millions and sparking regional rebellions that eventually toppled China’s last imperial regime in 1911.
The political dimension of Keju goes far beyond its meritocracy and its connection to the 1911 republican revolution. An institution with deep penetration, both cross-sectionally in society and across time in history, Keju was all-encompassing, laying claim to the time, effort and cognitive investment of a significant swathe of the male Chinese population. It was a state institution designed to augment the state’s own power and capabilities. Directly, the state monopolised the very best human capital; indirectly, the state deprived society of access to talent and pre-empted organised religion, commerce and the intelligentsia. Keju anchored Chinese autocracy.
The impact of Keju is still felt today, not only in the form and practice of Gao Kao and the civil service exam but also because Keju incubated values and work ethics. Chinese minds still bear its imprint. For one, Keju elevated the value of education, and we see this effect today: a 2020 study shows that, for every doubling of successful Keju candidates per 10,000 of the population in the Ming-Qing period, there was a 6.9 per cent increase in years of schooling in 2010. The Keju exams loom as part of China’s human capital formation today, but they also cultivated and imposed values of deference to authority and collectivism from which the Chinese Communist Party has reaped rich dividends for its rule and legitimacy…
An ultimate autocracy is one that reigns without society. Society shackles the state in many ways. One is ex ante: it checks and balances the actions of the state. The other is ex post: a strong society provides an outside option to those inside the state. Sometimes this is derisively described as ‘a revolving door’, but it may also have the positive function of checking the power of the state. State functionaries can object to state actions by voting with their feet, as many US civil servants did during the Donald Trump administration, and thereby drain the state of the valuable human capital it needs to function. A strong society raises the opportunity costs for the state to recruit human capital, but such a receptor function of society has never existed at scale, either in imperial China or today – thanks, in large part, I would argue, to Keju.
Keju was so precocious that it pre-empted and displaced an emergent society. Meritocracy empowered the Chinese state at a time when society was still at an embryonic stage. Massive resources and administrative manpower were poured into Keju such that it completely eclipsed all other channels of upward mobility that could have emerged. In that sense, the celebration by many of Keju’s meritocracy misses the bigger picture of Chinese history. It is a view of a tree rather than of a forest…
…Its impressive bureaucratic mobility demolished all other mobility channels and possibilities. Keju was an anti-mobility mobility channel: it packed all upward mobility within a single channel – that of the state. Society was crowded out and, over time, due to its deficient access to quality human capital, it atrophied. This, I would argue, is the root of the awesome power of Chinese autocracy – a historical development unique to China…
…
There was, however, a massive operational advantage to the Neo-Confucianist curriculum: it standardised everything. Standardisation abhors nuance, and evaluations became more straightforward as the baseline for comparison was more clearly delineated. There was objectivity, even if that objectivity was a manufactured artefact. The Chinese invented the modern state and meritocracy, but above all the Chinese invented specialised standardised testing – the memorisation, cognitive inclination and frames of reference of an exceedingly narrow ideology.
The Ming standardised Keju further: it enforced a highly scripted essay format, known as the ‘eight-legged essay’, or baguwen in Chinese (八股文), to which every Keju candidate had to adhere. A ‘leg’ here refers to a section of the essay, with a Keju essay requiring eight sections: 1) breaking open the topic; 2) receiving the topic; 3) beginning the discussion; 4) the initial leg; 5) the transition leg; 6) the middle leg; 7) the later leg; and 8) the conclusion. The eight-legged essay fixed more than the aggregate structure of exposition. The specifications were granular and detailed. For example, the number of phrases in each section was specified, and the entire essay required expression in paired sentences – a minimum of six paired sentences, up to a maximum of 12. The key contribution of the eight-legged essay is that it packed information into a pre-set presentational format.
Standardisation was designed to scale the Keju system and it succeeded brilliantly in that regard, but it had a devastating effect on expositional freedom and human creativity. All elements of subjectivity and judgment were taken out. In his book Traditional Government in Imperial China (1982), the historian Ch’ien Mu describes the ‘eight-legged essay’ as ‘the greatest destroyer of human talent.’…
In his book The WEIRDest People in the World (2020), Joseph Henrich posited that the West prospered because of its early lead in literacy. Yet the substantial Keju literacy produced none of the liberalising effects on Chinese ideas, economy or society. The literacy that Henrich had in mind was a particular kind of literacy – Protestant literacy – and the contrast with Keju literacy could not have been sharper. Keju literacy was drilled and practised in classical and highly stratified Chinese, the language of the imperial court rather than the language of the masses, in sharp contrast to Protestant literacy. Protestant literacy empowered personal agency by embracing and spreading vernaculars of the masses. Henrich’s liberalising ‘WEIRD’ effect – Western, educated, industrialised, rich and democratic – was a byproduct of Protestant literacy. It is no accident that Keju literacy produced an opposite effect…
Not everyone sees the Western/WEIRD definition of creativity and innovation as the only important one (cf. here and here), nor agrees that China is as lacking in what Westerners call creativity and innovation as is sometimes claimed (cf. here – possible soft paywall – and here). Still, Huang’s essay on Keju, China’s incredibly difficult civil service exam, and how it strengthened the state at the cost of freedom and creativity, is eminently worth reading in full: “The exam that broke society,” from @YashengHuang in @aeonmag.
And for the amazing (and amusing) story of how the Keju was instrumental in the introduction of Catholicism into China, see Jonathan Spence’s wonderful The Memory Palace of Matteo Ricci.
* Bill Gates
###
As we study, we might recall that it was on this date in 4004 BCE that the Universe was created… as per calculations by Archbishop James Ussher in the mid-17th century.
When Clarence Darrow prepared his famous examination of William Jennings Bryan in the Scopes trial [see here], he chose to focus primarily on a chronology of Biblical events prepared by a seventeenth-century Irish bishop, James Ussher. American fundamentalists in 1925 found—and generally accepted as accurate—Ussher’s careful calculation of dates, going all the way back to Creation, in the margins of their family Bibles. (In fact, until the 1970s, the Bibles placed in nearly every hotel room by the Gideon Society carried his chronology.) The King James Version of the Bible introduced into evidence by the prosecution in Dayton contained Ussher’s famous chronology, and Bryan more than once would be forced to resort to the bishop’s dates as he tried to respond to Darrow’s questions.
“Bishop James Ussher Sets the Date for Creation”

“Sheer dumb sentience”*…
As the power of AI grows, we find ourselves searching for a way to tell whether it might – or already has – become sentient. Kristin Andrews and Jonathan Birch suggest that we should look to the minds of animals…
… Last year, [Google engineer Blake] Lemoine leaked the transcript [of an exchange he’d had with LaMDA, a Google AI system] because he genuinely came to believe that LaMDA was sentient – capable of feeling – and in urgent need of protection.
Should he have been more sceptical? Google thought so: they fired him for violation of data security policies, calling his claims ‘wholly unfounded’. If nothing else, though, the case should make us take seriously the possibility that AI systems, in the very near future, will persuade large numbers of users of their sentience. What will happen next? Will we be able to use scientific evidence to allay those fears? If so, what sort of evidence could actually show that an AI is – or is not – sentient?
The question is vast and daunting, and it’s hard to know where to start. But it may be comforting to learn that a group of scientists has been wrestling with a very similar question for a long time. They are ‘comparative psychologists’: scientists of animal minds.
We have lots of evidence that many other animals are sentient beings. It’s not that we have a single, decisive test that conclusively settles the issue, but rather that animals display many different markers of sentience. Markers are behavioural and physiological properties we can observe in scientific settings, and often in our everyday life as well. Their presence in animals can justify our seeing them as having sentient minds. Just as we often diagnose a disease by looking for lots of symptoms, all of which raise the probability of having that disease, so we can look for sentience by investigating many different markers…
On learning from our experience of animals to assess AI sentience: “What has feelings?” from @KristinAndrewz and @birchlse in @aeonmag.
Apposite: “The Future of Human Agency” (a Pew round-up of expert opinion on the future impact of AI)
Provocative in a resonant way: “The Philosopher Who Believes in Living Things.”
* Kim Stanley Robinson, 2312
###
As we talk to the animals, we might send thoughtful birthday greetings to J. P. Guilford; he was born on this date in 1897. A psychologist, he’s best remembered as a developer and practitioner of psychometrics, the quantitative measurement of subjective psychological phenomena (such as sensation, personality, and attention).
Guilford’s Structure of Intellect (SI) theory rejected the view that intelligence could be characterized by a single numerical parameter. He proposed that three dimensions were necessary for accurate description: operations, content, and products.
Guilford also developed the concepts of “convergent” and “divergent” thinking, as part of work he did emphasizing the importance of creativity in industry, science, the arts, and education, and urging more research into its nature.
A Review of General Psychology survey, published in 2002, ranked Guilford as the 27th most cited psychologist of the 20th century.
“Poetry might be defined as the clear expression of mixed feelings”*…
Can artificial intelligence have those feelings? Scientist and poet Keith Holyoak explores:
… Artificial intelligence (AI) is in the process of changing the world and its societies in ways no one can fully predict. On the hazier side of the present horizon, there may come a tipping point at which AI surpasses the general intelligence of humans. (In various specific domains, notably mathematical calculation, the intersection point was passed decades ago.) Many people anticipate this technological moment, dubbed the Singularity, as a kind of Second Coming — though whether of a savior or of Yeats’s rough beast is less clear. Perhaps by constructing an artificial human, computer scientists will finally realize Mary Shelley’s vision.
Of all the actual and potential consequences of AI, surely the least significant is that AI programs are beginning to write poetry. But that effort happens to be the AI application most relevant to our theme. And in a certain sense, poetry may serve as a kind of canary in the coal mine — an early indicator of the extent to which AI promises (threatens?) to challenge humans as artistic creators. If AI can be a poet, what other previously human-only roles will it slip into?…
A provocative consideration: “Can AI Write Authentic Poetry?” from @mitpress.
Apposite: a fascinating Twitter thread on “why GPT3 algorithm proficiency at producing fluent, correct-seeming prose is an exciting opportunity for improving how we teach writing, how students learn to write, and how this can also benefit profs who assign writing, but don’t necessarily teach it.”
* W. H. Auden
###
As we ruminate on rhymes, we might send thoughtful birthday greetings to Michael Gazzaniga; he was born on this date in 1939. A leading researcher in cognitive neuroscience (the study of the neural basis of mind), his work has focused on how the brain enables humans to perform those advanced mental functions that are generally associated with what we call “the mind.” Gazzaniga has made significant contributions to the emerging understanding of how the brain facilitates such higher cognitive functions as remembering, speaking, interpreting, and making judgments.
“Your memory and your senses will be nourishment for your creativity”*…
On which senses do great creators rely? Randall Collins investigates…
Beethoven started going deaf in his late 20s. Already famous by age 25 for his piano sonatas, at 31 he was traumatized by losing his hearing. But he kept on composing: the Moonlight Sonata during the onset of deafness; the dramatic Waldstein Sonata at 32; piano sonatas kept on coming until he was 50. In his deaf period came the revolutionary sounds of his 3rd through 8th symphonies, and his piano and violin concertos (ages 32-40). After 44 he became less productive, with intermittent flashes (the Missa Solemnis, the Diabelli Variations, the 9th symphony) composed at 47-53, dying at 56. His last string quartets were composed entirely in his head; some were left unperformed in his lifetime.
Handel went blind in one eye at age 66; laboriously finished the oratorio he was working on; went completely blind at 68. He never produced another significant work. But he kept on playing organ concertos, “performing from memory, or extemporizing while the players waited for their cue” almost to the day he died, aged 74.
Johann Sebastian Bach fell ill in his 64th year; next year his vision was nearly gone; he died at 65 “after two unsuccessful operations for a cataract.” At 62 he was still producing great works; at 64 he finished assembling the pieces of his B Minor Mass (recycling his older works being his modus operandi). At death he left unfinished his monument of musical puzzles, The Art of the Fugue, on which he had been working since 55.
Can we conclude that it is more important for a composer to see than to hear?…
And, given examples like Milton, that it’s more critical for poets and writers to hear than to see? More at “Deaf or Blind: Beethoven, Handel,” from @sociologicaleye.
* Arthur Rimbaud
###
As we contemplate creativity, we might recall that it was on this date in 2013 that Google – Google Search, YouTube, Google Mail, and Google Drive, et al. – went down for about 5 minutes. During that brief window, internet traffic around the world dropped by 40 percent.
“Man, sometimes it takes a long time to sound like yourself”*…
Ian Leslie on why we need to take control of our influences and what we can learn from artists about how to do so…
We live in an age of social influence, and while there is no shortage of advice on how to take advantage of that – how to influence others, how to build a following, how to change minds – there is a dearth of thinking on how to be influenced. Which is odd, because that seems, to me, to be one of the key questions of the age…
Being influenced by others is inevitable and essential. But it’s also true that when we over-conform to influences, we surrender individuality. We get infected by harmful behaviours: smoking, anorexia, even suicide are all subject to social influence. We swallow conspiracy theories and false beliefs. We become mindless creatures of habit unable to imagine new possibilities. Conforming to influence can generate anxiety: we become worried that we’re not conforming well enough. There are externalities to be considered, too. Over-conformity is a kind of free-riding. The over-conformer takes from the shared pot of memes but fails to contribute to it. A society with too much imitation is liable to decay and degenerate, because it stops creating, thinking and innovating.
Each of us, then, has to try and strike a balance. Be impervious to social influence and you get closed off from the best that your fellow humans have to offer. Be defenceless against it and you become easily manipulable, boring, and unhappy.
But it’s harder than ever to strike this balance, because we live in societies where influence is everywhere, pressing upon us from all sides. We can instantly find out what strangers think, or at least what they say they’re thinking, on any given topic. We can consult with our friends every second of the day. It’s easier to outsource your opinions than ever; it feels good, it feels safe, to side with a crowd. There are higher costs to non-conformity, too: online communities assiduously police the boundaries of acceptable thought and behaviour…
… on the one hand, we have access to a broader range of information and insight than any generation in history, which ought to make us all more interesting. On the other, it’s very difficult, amidst the crossfire hurricane of influence, to think and act for yourself – to be you.
I could leave it there, with the conclusion that we’re all being influenced all the time and we’re not remotely prepared for how to manage these influences, and that maybe we should think about that a little more. But I want to add this: that there is a group of people who have a lot to teach us about how to live in the age of influence, because they have confronted this question with a special intensity for hundreds of years.
Artists (in the broad sense – painters, novelists, composers, etc) are pretty much defined by the struggle to be themselves; to absorb influences without surrendering to them; to be open to others and stubbornly individual. Consequently, artists have a different relationship to influence than the rest of us do. The core difference is this: artists do not absorb their influences passively. They choose their influences, and they choose how to be influenced by them…
Read on for sound advice: “How To Be Influenced,” from @mrianleslie via @TheBrowser.
Apposite: “The Age of Algorithmic Anxiety,” from @chaykak
* Miles Davis
###
As we steal like an artist, we might recall that it was on this date in 1790 that the first U.S. patent was issued to Samuel Hopkins for an improvement “in the making of Potash and Pearlash by a new Apparatus and Process.” It was signed by then-President George Washington.
A number of inventors had been clamoring for patents and copyrights (which were, of course, anticipated in Article I, Section 8, Clause 8 of the Constitution), but the first session of the First Congress in 1789 acted on none of the petitions. On January 8, 1790, President Washington recommended in his State of the Union address that Congress give attention to the encouragement of new and useful inventions; and within the month, the House appointed a committee to draft a patent statute. Even then the process worked slowly: Hopkins’ patent was issued over six months later.
