Posts Tagged ‘creation’
“The question of whether a computer can think is no more interesting than the question of whether a submarine can swim”*…
Anil Dash, with a grounded view of artificial intelligence…
Even though AI has been the most-talked-about topic in tech for a few years now, we’re in an unusual situation where the most common opinion about AI within the tech industry is barely ever mentioned.
Most people who actually have technical roles within the tech industry, like engineers, product managers, and others who actually make the technologies we all use, are fluent in the latest technologies like LLMs. They aren’t the big, loud billionaires that usually get treated as the spokespeople for all of tech.
And what they all share is an extraordinary degree of consistency in their feelings about AI, which can be pretty succinctly summed up:
Technologies like LLMs have utility, but the absurd way they’ve been over-hyped, the fact they’re being forced on everyone, and the insistence on ignoring the many valid critiques about them make it very difficult to focus on legitimate uses where they might add value.
What’s amazing is the reality that virtually 100% of tech experts I talk to in the industry feel this way, yet nobody outside of that cohort will mention this reality. What we all want is for people to just treat AI as a “normal technology,” as Arvind Narayanan and Sayash Kapoor so perfectly put it. I might be a little more angry and a little less eloquent: stop being so goddamn creepy and weird about the technology! It’s just tech, everything doesn’t have to become some weird religion that you beat people over the head with, or gamble the entire stock market on…
Eminently worth reading in full: “The Majority AI View,” from @anildash.com.
Pair with: “Artificial Intelligences, So Far,” from @kevinkelly.bsky.social.
For an explanation of (some of) the dangers of over-hyping, see: “America’s future could hinge on whether AI slightly disappoints,” from @noahpinion.blog.web.brid.gy.
And for a peek at what lies behind each GenAI query: “Cartography of generative AI,” from @tallerestampa.bsky.social via @flowingdata.com.
While the arguments above are practical, note that a plethora of tech experts have weighed in with a note of existential caution: “Statement on Superintelligence.”
Further to which (and finally), a piece from the Federal Reserve Bank of Dallas, projecting the economic impact of AI. It suggests that AI could provide a modest but meaningful boost to GDP over the next 25 years… if The Fed’s “Goldilocks Scenario” (in which, per Dash’s and Kelly’s comments, AI makes consistent incremental contributions to “keep living standards improving at their historical rate”) plays out. You’ll note that they also considered two other scenarios: a “benign singularity” scenario in which “AI eventually surpasses human intelligence, leading to rapid and unpredictable changes to the economy and society” and an “extinction singularity” in which “machine intelligence overtakes human intelligence at some finite point in the near future, the machines become malevolent, and this eventually leads to human extinction.”
Interesting times in which we live…
* Edsger W. Dijkstra
###
As we parse pumped prognostication, we might recall that it was on this date in 4004 BCE that the Universe was created… as per calculations by Archbishop James Ussher in the mid-17th century. Ussher, the head of the Anglican Church of Ireland at the time, attempted to calculate the dates of many important events described in the Old Testament. His calculations, which he published in 1650, were not that far off from many other estimates made at the time. Isaac Newton, for example, believed that the world was created in 4000 BCE.
When Clarence Darrow prepared his famous examination of William Jennings Bryan in the Scopes trial [see here], he chose to focus primarily on a chronology of Biblical events prepared by a seventeenth-century Irish bishop, James Ussher. American fundamentalists in 1925 found—and generally accepted as accurate—Ussher’s careful calculation of dates, going all the way back to Creation, in the margins of their family Bibles. (In fact, until the 1970s, the Bibles placed in nearly every hotel room by the Gideon Society carried his chronology.) The King James Version of the Bible introduced into evidence by the prosecution in Dayton contained Ussher’s famous chronology, and Bryan more than once would be forced to resort to the bishop’s dates as he tried to respond to Darrow’s questions.
“I fear the day when the technology overlaps with our humanity. The world will only have a generation of idiots.”*…
Alva Noë on the importance of humans hanging on to their humanity– for all the promise and dangers of AI, computers plainly can’t think. To think is to resist – something no machine does:
Computers don’t actually do anything. They don’t write, or play; they don’t even compute. Which doesn’t mean we can’t play with computers, or use them to invent, or make, or problem-solve. The new AI is unexpectedly reshaping ways of working and making, in the arts and sciences, in industry, and in warfare. We need to come to terms with the transformative promise and dangers of this new tech. But it ought to be possible to do so without succumbing to bogus claims about machine minds.
What could ever lead us to take seriously the thought that these devices of our own invention might actually understand, and think, and feel, or that, if not now, then later, they might one day come to open their artificial eyes thus finally to behold a shiny world of their very own? One source might simply be the sense that, now unleashed, AI is beyond our control. Fast, microscopic, distributed and astronomically complex, it is hard to understand this tech, and it is tempting to imagine that it has power over us.
But this is nothing new. The story of technology – from prehistory to now – has always been that of the ways we are entrained by the tools and systems that we ourselves have made. Think of the pathways we make by walking. To every tool there is a corresponding habit, that is, an automatised way of acting and being. From the humble pencil to the printing press to the internet, our human agency is enacted in part by the creation of social and technological landscapes that in turn transform what we can do, and so seem, or threaten, to govern and control us.
Yet it is one thing to appreciate the ways we make and remake ourselves through the cultural transformation of our worlds via tool use and technology, and another to mystify dumb matter put to work by us. If there is intelligence in the vicinity of pencils, shoes, cigarette lighters, maps or calculators, it is the intelligence of their users and inventors. The digital is no different.
But there is another origin of our impulse to concede mind to devices of our own invention, and this is what I focus on here: the tendency of some scientists to take for granted what can only be described as a wildly simplistic picture of human and animal cognitive life. They rely unchecked on one-sided, indeed, milquetoast conceptions of human activity, skill and cognitive accomplishment. The surreptitious substitution (to use a phrase of Edmund Husserl’s) of this thin gruel version of the mind at work – a substitution that I hope to convince you traces back to Alan Turing and the very origins of AI – is the decisive move in the conjuring trick.
What scientists seem to have forgotten is that the human animal is a creature of disturbance. Or as the mid-20th-century philosopher of biology Hans Jonas wrote: ‘Irritability is the germ, and as it were the atom, of having a world…’ With us there is always, so to speak, a pebble in the shoe. And this is what moves us, turns us, orients us to reorient ourselves, to do things differently, so that we might carry on. It is irritation and disorientation that is the source of our concern. In the absence of disturbance, there is nothing: no language, no games, no goals, no tasks, no world, no care, and so, yes, no consciousness…
[Starting with Turing, Noë considers the relative roles of humans and technology across a number of spheres, including music…]
… The piano was invented, to be sure, but not by you or me. We encounter it. It pre-exists us and solicits our submission. To learn to play is to be altered, made to adapt one’s posture, hands, fingers, legs and feet to the piano’s mechanical requirements. Under the regime of the piano keyboard, it is demanded that we ourselves become player pianos, that is to say, extensions of the machine itself.
But we can’t. And we won’t. To learn to play, to take on the machine, for us, is to struggle. It is hard to master the instrument’s demands.
And this fact – the difficulty we encounter in the face of the keyboard’s insistence – is productive. We make art out of it. It stops us being player pianos, but it is exactly what is required if we are to become piano players.
For it is the player’s fraught relation to the machine, and to the history and tradition that the machine imposes, that supplies the raw material of musical invention. Music and play happen in that entanglement. To master the piano, as only a person can, is not just to conform to the machine’s demands. It is, rather, to push back, to say no, to rage against the machine. And so, for example, we slap and bang and shout out. In this way, the piano becomes not merely a vehicle of habit and control – a mechanism – but rather an opportunity for action and expression.
And, as with the piano, so with the whole of human cultural life. We live in the entanglement between government and resistance. We fight back…
… The telling fact: computers are used to play our games; they are engineered to make moves in the spaces opened up by our concerns. They don’t have concerns of their own, and they make no new games. They invent no new language.
The British philosopher R G Collingwood noticed that the painter doesn’t invent painting, and the musician doesn’t invent the musical culture in which they find themselves. And for Collingwood this served to show that no person is fully autonomous, a God-like fount of creativity; we are always to some degree recyclers and samplers and, at our best, participants in something larger than ourselves.
But this should not be taken to show that we become what we are (painters, musicians, speakers) by doing what, for example, LLMs do – i.e., merely by getting trained up on large data sets. Humans aren’t trained up. We have experience. We learn. And for us, learning a language, for example, isn’t learning to generate ‘the next token’. It’s learning to work, play, eat, love, flirt, dance, fight, pray, manipulate, negotiate, pretend, invent and think. And crucially, we don’t merely incorporate what we learn and carry on; we always resist. Our values are always problematic. We are not merely word-generators. We are makers of meaning.
We can’t help doing this; no computer can do this…
Eminently worth reading in full: “Rage against the machine,” from @alvanoe in @aeonmag.
For more, see Noë’s The Entanglement: How Art and Philosophy Make Us What We Are.
* Albert Einstein
###
As we resolve to wrestle, we might recall that it was on this date in 1969 that UCLA professor Leonard Kleinrock (aided by his student assistant Charley Kline) established the first networked computer-to-computer connection, with SRI programmer Bill Duvall in Menlo Park, over which they sent the first host-to-host message… or at least part of it. Duvall’s machine crashed partway through the transmission, so the only letters received from the attempted “login” were “lo.” By year’s end two more nodes had been added (UCSB and the University of Utah), and the network was dubbed ARPANET.
Still, “lo”– perhaps an appropriate way to announce what would grow up to be the internet.

“The greatest obstacle to discovery is not ignorance – it is the illusion of knowledge”*…
Learning from the past: as John Thornhill explains in his consideration of Jason Roberts‘ Every Living Thing, the rivalry between Buffon and Linnaeus has lessons about disrupters and exploitation…
The aristocratic French polymath Georges-Louis Leclerc, Comte de Buffon chose a good year to die: 1788. Reflecting his status as a star of the Enlightenment and author of 35 popular volumes on natural history, Buffon’s funeral carriage drawn by 14 horses was watched by an estimated 20,000 mourners as it processed through Paris. A grateful Louis XVI had earlier erected a statue of a heroic Buffon in the Jardin du Roi, over which the naturalist had masterfully presided. “All nature bows to his genius,” the inscription read.
The next year the French Revolution erupted. As a symbol of the ancien regime, Buffon was denounced as an enemy of progress, his estates in Burgundy seized, and his son, known as the Buffonet, guillotined. In further insult to his memory, zealous revolutionaries marched through the king’s gardens (nowadays known as the Jardin des Plantes) with a bust of Buffon’s great rival, Carl Linnaeus. They hailed the Swedish scientific revolutionary as a true man of the people.
The intense intellectual rivalry between Buffon and Linnaeus, which still resonates today, is fascinatingly told by the author Jason Roberts in his book Every Living Thing, my holiday reading while staying near Buffon’s birthplace in Burgundy. Natural history, like all history, might be written by the victors, as Roberts argues. And for a long time, Linnaeus’s highly influential, but flawed, views held sway. But the book makes a sympathetic case for the further rehabilitation of the much-maligned Buffon.
The two men were, as Roberts writes, exact contemporaries and polar opposites. While Linnaeus obsessed about classifying all biological species into neat categories with fixed attributes and Latin names (Homo sapiens, for example), Buffon emphasised the vast diversity and constantly changing nature of every living thing.
In Roberts’s telling, Linnaeus emerges as a brilliant but ruthless dogmatist, who ignored inconvenient facts that did not fit his theories and gave birth to racial pseudoscience. But it was Buffon’s painstaking investigations and acceptance of complexity that helped inspire the evolutionary theories of Charles Darwin, who later acknowledged that the Frenchman’s ideas were “laughably like mine”.
In two aspects, at least, this 18th-century scientific clash rhymes with our times. The first is to show how intellectual knowledge can often be a source of financial gain. The discovery of crops and commodities in other parts of the world and the development of new methods of cultivation had a huge impact on the economy in that era. “All that is useful to man originates from these natural objects,” Linnaeus wrote. “In one word, it is the foundation of every industry.”
Great wealth was generated from trade in sugar, potatoes, coffee, tea and cochineal while Linnaeus himself explored ways of cultivating pineapples, strawberries and freshwater pearls.
“In many ways, the discipline of natural history in the 18th century was roughly analogous to technology today: a means of disrupting old markets, creating new ones, and generating fortunes in the process,” Roberts writes. As a former software engineer at Apple and a West Coast resident, Roberts knows the tech industry.
Then as now, the addition of fresh inputs into the economy — whether natural commodities back then or digital data today — can lead to astonishing progress, benefiting millions. But it can also lead to exploitation. As Roberts tells me in a telephone interview, it was the scaling up of the sugar industry in the West Indies that led to the slave trade. “Sometimes we think we are inventing the future when we are retrofitting the past,” he says.
The second resonance with today is the danger of believing we know more than we do. Roberts compares Buffon’s state of “curious unknowing” to the concept of “negative capability” described by the English poet John Keats. In a letter written in 1817, Keats argued that we should resist the temptation to explain away things we do not properly understand and accept “uncertainties, mysteries, doubts, without any irritable reaching after fact and reason.”
Armed today with instant access to information and smart machines, the temptation is to ascribe a rational order to everything, as Linnaeus did. But scientific progress depends on a humble acceptance of relative ignorance and a relentless study of the fabric of reality. The spooky nature of quantum mechanics would have blown Linnaeus’s mind. If Buffon still teaches us anything, it is to study the peculiarity of things as they are, not as we might wish them to be…
“What an epic 18th-century scientific row teaches us today,” @johnthornhillft on @itsJason in @FT (gift link)
Pair with “Frameworks,” from Céline Henne (@celinehenne): “Knowledge is often a matter of discovery. But when the nature of an enquiry itself is at question, it is an act of creation.”
* Daniel J. Boorstin
###
As we embrace the exceptions, we might send carefully-coded birthday greetings to John McCarthy; he was born on this date in 1927. An eminent computer and cognitive scientist– he was awarded both the Turing Award and the National Medal of Science– McCarthy coined the phrase “artificial intelligence” to describe the field of which he was a founder.
It was McCarthy’s 1979 article, “Ascribing Mental Qualities to Machines” (in which he wrote, “Machines as simple as thermostats can be said to have beliefs, and having beliefs seems to be a characteristic of most machines capable of problem solving performance”), that provoked John Searle‘s famous 1980 rejoinder, the Chinese Room Argument… sparking a broad debate that continues to this day.

“This is not your average, everyday darkness. This is… ADVANCED darkness.”*…
As Rob Beschizza explains, Pere Rosselló, an astrophysics student at Universidad de La Laguna in Tenerife, Spain, has created an animation depicting the gravitational collapse of SpongeBob…
Beschizza muses…
Just imagine being part of a civilization on the cusp of attaining a decent model of the universe’s origins—somewhere between Halley and Lemaître, and you start plotting backwards from where we are and where the Big Bang should be you find Spongebob instead. Running the numbers again and again. Such a universe has no need of Lovecraft, cosmic horror would be right there in the maths.
Rosselló [also] solved a three-body problem: the one of animating three bodies to look really cool…
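Rosselló’s actual code isn’t shown in the post, but for the curious, here is a minimal sketch (in Python/NumPy) of the kind of direct-summation N-body integrator such an animation typically rests on: treat each pixel of a silhouette as a particle, compute pairwise gravitational accelerations with a softening term, and advance positions with a leapfrog scheme. Everything below — names, constants, the random “blob” standing in for a cartoon outline — is illustrative, not taken from Rosselló’s work.

```python
# Illustrative sketch only -- not Rosselló's code. A minimal direct-summation
# N-body integrator: particles attract one another under Newtonian gravity
# (with a softening length to avoid singular close encounters) and are
# advanced with a leapfrog (kick-drift-kick) scheme.
import numpy as np

G = 1.0           # gravitational constant in simulation units (assumed)
SOFTENING = 0.05  # softening length (assumed)

def accelerations(pos, mass):
    """Pairwise gravitational accelerations for all particles."""
    diff = pos[None, :, :] - pos[:, None, :]           # (N, N, 2): r_j - r_i
    dist2 = np.sum(diff**2, axis=-1) + SOFTENING**2    # softened squared distances
    inv_d3 = dist2**-1.5
    np.fill_diagonal(inv_d3, 0.0)                       # no self-force
    return G * np.sum(diff * (mass[None, :, None] * inv_d3[:, :, None]), axis=1)

def leapfrog(pos, vel, mass, dt, steps):
    """Advance the system, returning the positions at every step (for animation)."""
    acc = accelerations(pos, mass)
    history = [pos.copy()]
    for _ in range(steps):
        vel += 0.5 * dt * acc            # kick
        pos += dt * vel                  # drift
        acc = accelerations(pos, mass)
        vel += 0.5 * dt * acc            # kick
        history.append(pos.copy())
    return np.array(history)

if __name__ == "__main__":
    # Stand-in for "particles sampled from a cartoon silhouette": a random blob.
    rng = np.random.default_rng(0)
    n = 500
    pos = rng.uniform(-1.0, 1.0, size=(n, 2))
    vel = np.zeros((n, 2))
    mass = np.full(n, 1.0 / n)
    frames = leapfrog(pos, vel, mass, dt=0.01, steps=200)
    print(frames.shape)  # (steps + 1, n, 2) -- ready to render frame by frame
```

With zero initial velocities, the blob simply falls in on itself — the “gravitational collapse” of the title; rendering each entry of `frames` as a scatter plot yields the animation.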
“N-body simulation of the gravitational collapse of Spongebob Squarepants,” by @PeRossello via @Beschizza in @BoingBoing.
* SpongeBob, “Rock Bottom“
###
As we deconstruct deconstruction, we might recall that it was on this date (in an unspecified year) that SpongeBob met the green seahorse Mystery.
“I failed in some subjects in exams, but my friend passed in all. Now he is an engineer in Microsoft and I am the owner of Microsoft.”*…
And that, Yasheng Huang argues, is not something likely to happen in China, for a reason that dates back to the 6th century…
On 7 and 8 June 2023, close to 13 million high-school students in China sat for the world’s most gruelling college entrance exam. ‘Imagine,’ wrote a Singapore journalist, ‘the SAT, ACT, and all of your AP tests rolled into two days. That’s Gao Kao, or “higher education exam”.’ In 2023, almost 2.6 million applied to sit China’s civil service exam to compete for only 37,100 slots.
Gao Kao and China’s civil service exam trace their origin to, and are modelled on, an ancient Chinese institution, Keju, the imperial civil service exam established by the Sui Dynasty (581-618). It can be translated as ‘subject recommendation.’ Toward the end of its reign, the Qing dynasty (1644-1911) abolished it in 1905 as part of its effort to reform and modernize the Chinese system. Until then, Keju had been the principal recruitment route for imperial bureaucracy. Keju reached its apex during the Ming dynasty (1368-1644). All the prime ministers but one came through the Keju route and many of them were ranked at the very top in their exam cohort…
Much of the academic literature focuses on the meritocracy of Keju. The path-breaking book in this genre is Ping-ti Ho’s The Ladder of Success in Imperial China (1962). One of his observations is eye catching: more than half of those who obtained the Juren degree were first generation: ie, none of their ancestors had ever attained a Juren status. (Juren was, at the time, the first degree granted in the three-tiered hierarchy of Keju.) More recent literature demonstrates the political effects of Keju. In 1905, the Qing dynasty abolished Keju, dashing the aspirations of millions and sparking regional rebellions that eventually toppled China’s last imperial regime in 1911.
The political dimension of Keju goes far beyond its meritocracy and its connection to the 1911 republican revolution. For an institution that had such deep penetration, both cross-sectionally in society and across time in history, Keju was all encompassing, laying claims to the time, effort and cognitive investment of a significant swathe of the male Chinese population. It was a state institution designed to augment the state’s own power and capabilities. Directly, the state monopolised the very best human capital; indirectly, the state deprived society of access to talent and pre-empted organised religion, commerce and the intelligentsia. Keju anchored Chinese autocracy.
The impact of Keju is still felt today, not only in the form and practice of Gao Kao and the civil service exam but also because Keju incubated values and work ethics. Today, Chinese minds still bear its imprint. For one, Keju elevated the value of education and we see this effect today. A 2020 study shows that, for every doubling of successful Keju candidates per 10,000 of the population in the Ming-Qing period, there was a 6.9 per cent increase in years of schooling in 2010. The Keju exams loom as part of China’s human capital formation today, but they also cultivated and imposed the values of deference to authority and collectivism that the Chinese Communist Party has reaped richly for its rule and legitimacy…
An ultimate autocracy is one that reigns without society. Society shackles the state in many ways. One is ex ante: it checks and balances the actions of the state. The other is ex post. A strong society provides an outside option to those inside the state. Sometimes, this is derisively described as ‘a revolving door’, but it may also have the positive function of checking the power of the state. State functionaries can object to state actions by voting with their feet, as many US civil servants did during the Donald Trump administration, and thereby drain the state of the valuable human capital it needs to function and operate. A strong society raises the opportunity costs for the state to recruit human capital but such a receptor function of society has never existed at scale in imperial China nor today, thanks – in large part, I would argue – to Keju.
Keju was so precocious that it pre-empted and displaced an emergent society. Meritocracy empowered the Chinese state at a time when society was still at an embryonic stage. Massive resources and administrative manpower were poured into Keju such that it completely eclipsed all other channels of upward mobility that could have emerged. In that sense, the celebration by many of Keju’s meritocracy misses the bigger picture of Chinese history. It is a view of a tree rather than of a forest…
…Its impressive bureaucratic mobility demolished all other mobility channels and possibilities. Keju was an anti-mobility mobility channel. It packed all the upward mobility within one channel – that of the state. Society was crowded out, and over time, due to its deficient access to quality human capital, it atrophied. This is the root of the power of Chinese autocracy and, I would argue, it is a historical development that is unique to China and explains the awesome power of Chinese autocracy…
…
There was, however, a massive operational advantage to the Neo-Confucianist curriculum: it standardised everything. Standardisation abhors nuance and the evaluations became more straightforward as the baseline comparison was more clearly delineated. There was objectivity, even if the objectivity was a manufactured artefact. The Chinese invented the modern state and meritocracy, but above all the Chinese invented specialised standardised testing – the memorisation, cognitive inclination and frame of references of an exceedingly narrow ideology.
Ming standardised Keju further: it enforced a highly scripted essay format, known as the ‘eight-legged essay’, or baguwen in Chinese (八股文), to which every Keju candidate had to adhere. A ‘leg’ here refers to each section of an essay, with a Keju essay requiring eight sections: 1) breaking open the topic; 2) receiving the topic; 3) beginning the discussion; 4) the initial leg; 5) the transition leg; 6) the middle leg; 7) the later leg; and 8) conclusion. The eight-legged essay fixed more than the aggregate structure of exposition. The specifications were granular and detailed. For example, the number of phrases was specified in each of the sections and the entire essay required expressions in paired sentences – a minimum of six paired sentences, up to a maximum of 12. The key contribution of the eight-legged essay is that it packed information into a pre-set presentational format.
Standardisation was designed to scale the Keju system and it succeeded brilliantly in that regard, but it had a devastating effect on expositional freedom and human creativity. All elements of subjectivity and judgment were taken out. In his book Traditional Government in Imperial China (1982), the historian Ch’ien Mu describes the ‘eight-legged essay’ as ‘the greatest destroyer of human talent.’…
In his book The WEIRDest People in the World (2020), Joseph Henrich posited that the West prospered because of its early lead in literacy. Yet the substantial Keju literacy produced none of the liberalising effects on Chinese ideas, economy or society. The literacy that Henrich had in mind was a particular kind of literacy – Protestant literacy – and the contrast with Keju literacy could not have been sharper. Keju literacy was drilled and practised in classical and highly stratified Chinese, the language of the imperial court rather than the language of the masses, in sharp contrast to Protestant literacy. Protestant literacy empowered personal agency by embracing and spreading vernaculars of the masses. Henrich’s liberalising ‘WEIRD’ effect – Western, educated, industrialised, rich and democratic – was a byproduct of Protestant literacy. It is no accident that Keju literacy produced an opposite effect…
Not everyone sees the Western/WEIRD definition of creativity and innovation as the only important one (cf. here and here), nor agrees that China is as lacking in what Westerners call creativity and innovation as is sometimes suggested (cf. here— possible soft paywall, and here). Still, Huang’s essay on Keju, China’s incredibly difficult civil service exam, and how it strengthened the state at the cost of freedom and creativity, is eminently worth reading in full: “The exam that broke society,” from @YashengHuang in @aeonmag.
And for the amazing (and amusing) story of how the Keju was instrumental in the introduction of Catholicism into China, see Jonathan Spence’s wonderful The Memory Palace of Matteo Ricci.
* Bill Gates
###
As we study, we might recall that it was on this date in 4004 BCE that the Universe was created… as per calculations by Archbishop James Ussher in the mid-17th century.
When Clarence Darrow prepared his famous examination of William Jennings Bryan in the Scopes trial [see here], he chose to focus primarily on a chronology of Biblical events prepared by a seventeenth-century Irish bishop, James Ussher. American fundamentalists in 1925 found—and generally accepted as accurate—Ussher’s careful calculation of dates, going all the way back to Creation, in the margins of their family Bibles. (In fact, until the 1970s, the Bibles placed in nearly every hotel room by the Gideon Society carried his chronology.) The King James Version of the Bible introduced into evidence by the prosecution in Dayton contained Ussher’s famous chronology, and Bryan more than once would be forced to resort to the bishop’s dates as he tried to respond to Darrow’s questions.
“Bishop James Ussher Sets the Date for Creation”







