Posts Tagged ‘ideas’
“It is what you read when you don’t have to that determines what you will be when you can’t help it”*…
… What we read– and, librarian Carlo Iacono argues, how we read.
Our inability to focus isn’t a failing. It’s a design problem, and the answer isn’t getting rid of our screen time…
Everyone is panicking about the death of reading. The statistics look damning: the share of Americans who read for pleasure on an average day has fallen by more than 40 per cent over the past 20 years, according to research published in iScience this year. The OECD calls the 2022 decline in educational outcomes ‘unprecedented’ across developed nations. In the OECD’s latest adult-skills survey, Denmark and Finland were the only participating countries where average literacy proficiency improved over the past decade. Your nephew speaks in TikTok references. Democracy itself apparently hangs by the thread of our collective attention span.
This narrative has a seductive simplicity. Screens are destroying civilisation. Children can no longer think. We are witnessing the twilight of the literate mind. A recent Substack essay by James Marriott proclaimed the arrival of a ‘post-literate society’ and invited us to accept this as a fait accompli. (Marriott does also write for The Times.) The diagnosis is familiar: technology has fundamentally degraded our capacity for sustained thought, and there’s nothing to be done except write elegiac essays from a comfortable distance.
I spend my working life in a university library, watching how people actually engage with information. What I observe doesn’t match this narrative. Not because the problems aren’t real, but because the diagnosis is wrong.
The declinist position rests on a category error: treating ‘screen culture’ as a unified phenomenon with inherent cognitive properties. As if the same device that delivers both algorithmically curated rage-bait and the complete works of Shakespeare were itself the problem, rather than how we decide to use it…
[… observing that “people who ‘can’t focus’ on traditional texts can maintain extraordinary concentration when working across modes,” he argues that “we haven’t become post-literate. We’ve become post-monomodal. Text hasn’t disappeared; it’s been joined by a symphony of other channels.”…]
… What troubles me most about the declinist position is not its diagnosis but its conclusion. The commentators who lament the post-literate society often identify the same villains I do. They recognise that technology companies are, in Marriott’s words, ‘actively working to destroy human enlightenment’, that tech oligarchs ‘have just as much of a stake in the ignorance of the population as the most reactionary feudal autocrat.’
And then they surrender. As Marriott says: ‘Nothing will ever be the same again. Welcome to the post-literate society.’
This is the move I cannot follow. To name the actors responsible and then treat the outcome as inevitable is to provide them cover. If the crisis is a force of nature, ‘screens’ destroying civilisation like some technological weather system, then there’s nothing to be done but write elegiac essays from a comfortable distance. But if the crisis is the product of specific design choices made by specific companies for specific economic reasons, then those choices can be challenged, regulated, reversed.
The fatalism, however beautifully expressed, serves the very interests it condemns. The technology companies would very much like us to believe that what they’re doing to human attention is simply the inevitable result of technological progress rather than something they’re doing to us, something that could, with sufficient political will, be stopped.
Your inability to focus isn’t a moral failing. It’s a design problem. You’re trying to think in environments built to prevent thinking. You’re trying to sustain attention in spaces engineered to shatter it. You’re fighting algorithms explicitly optimised to keep you scrolling, not learning.
The solution isn’t discipline. It’s architecture. Build different defaults. Create different spaces. Establish different rhythms. Make depth as easy as distraction currently is. Make thinking feel as natural as scrolling currently does.
What if, instead of mourning some imaginary golden age of pure text, we got serious about designing for depth across all modes? Every video could come with a searchable transcript. Every article could offer multiple entry points for different levels of attention. Our devices could recognise when we’re trying to think and protect that thinking. Schools could teach students to translate between modes the way they once taught translation between languages.
Books aren’t going anywhere. They remain unmatched for certain kinds of sustained, complex thinking. But they’re no longer the only game in town for serious ideas. A well-crafted video essay can carry philosophical weight. A podcast can enable the kind of long-form thinking we associate with written essays. An interactive visualisation can reveal patterns that pages of description struggle to convey.
The future belongs to people who can dance between all modes without losing their balance. Someone who can read deeply when depth is needed, skim efficiently when efficiency matters, listen actively during a commute, and watch critically when images carry the argument. This isn’t about consuming more. It’s about choosing consciously.
We stand at an inflection point. We can drift into a world where sustained thought becomes a luxury good, where only the privileged have access to the conditions that enable deep thinking. Or we can build something unprecedented: a culture that preserves the best of print’s cognitive gifts while embracing the possibilities of a world where ideas travel through light, sound and interaction.
The choice isn’t between books and screens. The choice is between intentional design and profitable chaos. Between habitats that cultivate human potential and platforms that extract human attention.
The civilisations that thrive won’t be the ones that retreat into text or surrender to the feed. They’ll be the ones that understand a simple truth: every idea has a natural form, and wisdom lies in matching the mode to the meaning. Some ideas want to be written. Others need to be seen. Still others must be heard, felt or experienced. The mistake is forcing all ideas through a single channel, whether that channel is a book or a screen.
Your great-grandchildren won’t read less than you do. They’ll read differently, as part of a richer symphony of sense-making. Whether that symphony sounds like music or noise depends entirely on the choices we make right now about the shape of our tools, the structure of our schools, and the design of our days.
The elegant lamenters offer a eulogy. I’m more interested in a fight…
Reunderstanding reading: “Books and screens,” from @carloiacono.bsky.social in @aeon.co.
* Oscar Wilde
###
As we turn the page, we might note that we’ve been here before, and celebrate the emergence of a design, an innovation, a technology that took on a life of its own and changed reading and… well, everything: this day in 1455 is the traditionally-given date of the publication of the Gutenberg Bible, the first Western book printed from movable type.
(Lest we think that there’s actually anything new under the sun, we might recall that The Jikji– the world’s oldest known extant book printed with movable metal type– was published in Korea in 1377; and that Bi Sheng created the first known movable type– out of baked clay– in China around 1040.)

“To understand anything, you just need to understand the little bits”*…
Oscar Schwartz begs to differ. Here, excerpts from his provocative critique of TED Talks…
Bill Gates wheels a hefty metal barrel out onto a stage. He carefully places it down and then faces the audience, which sits silent in a darkened theater. “When I was a kid, the disaster we worried about most was a nuclear war,” he begins. Gates is speaking at TED’s flagship conference, held in Vancouver in 2015. He wears a salmon pink sweater, and his hair is combed down over his forehead, Caesar-style. “That’s why we had a barrel like this down in our basement, filled with cans of food and water,” he says. “When the nuclear attack came, we were supposed to go downstairs, hunker down, and eat out of that barrel.”
Now that he is an adult, Gates continues, it is no longer nuclear apocalypse that scares him, but pestilence. A year ago, Ebola killed over ten thousand people in West Africa. If the virus had been airborne or spread to a large city center, things would have been far worse. It might’ve snowballed into a pandemic and killed tens of millions of people. Gates tells the TED attendees that humanity is not ready for this scenario — that a pandemic would trigger a global catastrophe at an unimaginable scale. We have no basement to retreat to and no metal barrel filled with supplies to rely on.
But, Gates adds, the future might turn out okay. He has an idea. Back when he was a kid, the U.S. military had sufficient funding to mobilize for war at any minute. Gates says that we must prepare for a pandemic with the same fearful intensity. We need to build a medical reserve corps. We need to play germ games like generals play war games. We need to make alliances with other virus-fighting nations. We need to build an arsenal of biomedical weapons to attack any non-human entity that might attack our bodies. “If we start now, we can be ready for the next epidemic,” Gates concludes, to a round of applause.
Of course, Gates’s popular and well-shared TED talk — viewed millions of times — didn’t alter the course of history. Neither did any of the other “ideas worth spreading” (the organization’s tagline) presented at the TED conference that year — including Monica Lewinsky’s massively viral speech about how to stop online bullying through compassion and empathy, or a Google engineer’s talk about how driverless cars would make roads smarter and safer in the near future. In fact, seven years after TED 2015, it feels like we are living in a reality that is the exact opposite of the future envisioned that year. A president took office in part because of his talent for online bullying. Driverless cars are nowhere near as widespread as predicted, and those that do share our roads keep crashing. Covid has killed five million people and counting.
At the start of the pandemic, I noticed people sharing Gates’s 2015 talk. The general sentiment was one of remorse and lamentation: the tech-prophet had predicted the future for us! If only we had heeded his warning! I wasn’t so sure. It seems to me that Gates’s prediction and proposed solution are at least part of what landed us here. I don’t mean to suggest that Gates’s TED talk is somehow directly responsible for the lack of global preparedness for Covid. But it embodies a certain story about “the future” that TED talks have been telling for the past two decades — one that has contributed to our unending present crisis.
The story goes like this: there are problems in the world that make the future a scary prospect. Fortunately, though, there are solutions to each of these problems, and the solutions have been formulated by extremely smart, tech-adjacent people. For their ideas to become realities, they merely need to be articulated and spread as widely as possible. And the best way to spread ideas is through stories — hence Gates’s opening anecdote about the barrel. In other words, in the TED episteme, the function of a story isn’t to transform via metaphor or indirection, but to actually manifest a new world. Stories about the future create the future. Or as Chris Anderson, TED’s longtime curator, puts it, “We live in an era where the best way to make a dent on the world… may be simply to stand up and say something.” And yet, TED’s archive is a graveyard of ideas. It is a seemingly endless index of stories about the future — the future of science, the future of the environment, the future of work, the future of love and sex, the future of what it means to be human — that never materialized. By this measure alone, TED, and its attendant ways of thinking, should have been abandoned…
…
… TED talks began to take on a distinct rhetorical style, later laid out in Anderson’s book TED Talks: The Official TED Guide to Public Speaking. In it, Anderson insists anyone is capable of giving a TED-esque talk. You just need an interesting topic and then you need to attach that topic to an inspirational story. Robots are interesting. Using them to eat trash in Nairobi is inspiring. Put the two together, and you have a TED talk.
I like to call this fusion “the inspiresting.” Stylistically, the inspiresting is earnest and contrived. It is smart but not quite intellectual, personal but not sincere, jokey but not funny. It is an aesthetic of populist elitism. Politically, the inspiresting performs a certain kind of progressivism, as it is concerned with making the world a better place, however vaguely…
…
Perhaps the most incisive critique came, ironically, at a 2013 TEDx conference. In “What’s Wrong with TED Talks?” media theorist Benjamin Bratton told a story about a friend of his, an astrophysicist, who gave a complex presentation on his research before a donor, hoping to secure funding. When he was finished, the donor decided to pass on the project. “I’m just not inspired,” he told the astrophysicist. “You should be more like Malcolm Gladwell.” Bratton was outraged. He felt that the rhetorical style TED helped popularize was “middlebrow megachurch infotainment,” and had begun to directly influence the type of intellectual work that could be undertaken. If the research wasn’t entertaining or moving, it was seen as somehow less valuable. TED’s influence on intellectual culture was “taking something with value and substance and coring it out so that it can be swallowed without chewing,” Bratton said. “This is not the solution to our most frightening problems — rather, this is one of our most frightening problems.” (Online, his talk proved to be one of many ideas worth spreading. “This is by far the most interesting and challenging thing I’ve heard on TED,” one commenter posted. “Very glad to come across it!”)…
Some thoughts on the “inspiresting”: “What Was the TED Talk?” from @scarschwartz in @thedrift_mag.
* Chris Anderson, proprietor and curator of TED
###
As we unchain our curiosity, we might send ruthlessly curious (and immensely entertaining) birthday greetings to Martin Gardner; he was born on this date in 1914. Though not an academic, nor ever a formal student of math or science, he wrote widely and prolifically on both subjects in such popular books as The Ambidextrous Universe and The Relativity Explosion and as the “Mathematical Games” columnist for Scientific American. Indeed, his elegant– and understandable– puzzles delighted professional and amateur readers alike, and helped inspire a generation of young mathematicians.
Gardner’s interests were wide; in addition to the math and science that were his power alley, he studied and wrote on topics that included magic, philosophy, religion, and literature (cf., especially, his work on Lewis Carroll– including the delightful Annotated Alice– and on G.K. Chesterton). And he was a fierce debunker of pseudoscience: a founding member of CSICOP and contributor of a regular column (“Notes of a Fringe-Watcher,” 1983 to 2002) in Skeptical Inquirer, that organization’s magazine.
Gardner died in 2010, having never given a TED Talk.
“Trees and people used to be good friends”*…
What’s old is new again… yet again…
If there’s a style that defines 2020, it has to be “cottagecore.” In March 2020, the New York Times defined it as a “budding aesthetic movement… where tropes of rural self-sufficiency converge with dainty décor to create an exceptionally twee distillation of pastoral existence.” In August, consumer-culture publication The Goods by Vox heralded cottagecore as “the aesthetic where quarantine is romantic instead of terrifying.”
Baking, one of the activities the quarantined population favored at the height of the pandemic, is a staple of cottagecore, whose Instagram hashtag features detailed depictions of home-baked goods. Moreover, the designer Lirika Matoshi’s Strawberry Dress, dubbed The Dress of 2020, fits squarely into the cottagecore aesthetic. For many, cottagecore– a movement rooted in self-soothing through exposure to nature and land– proved to be an antidote to the stress of the 2020 pandemic.
Despite its invocations of rural and pastoral landscapes, the cottagecore aesthetic is, ultimately, aspirational. While publications covering trends do point out that cottagecore is not new—some locate its origins in 2019, others in 2017—in truth, people have sought to create an escapist and aspirational paradise in the woods or fields for 2,300 years.
Ancient Greece had an enduring fascination with the region of Arcadia, located in the Peloponnesus, which many ancient Greeks first dismissed as a primitive place. After all, Arcadia was far from the refined civilization of Athens. Arcadians were portrayed as hunters, gatherers, and sensualists living in an inclement landscape. In the Hellenistic age, however, Arcadia became an idea in the popular consciousness more than a geographical place…
And the pastoral ideal resurfaced regularly thereafter. Theocritus, Virgil, Longus, Petrarch, Shakespeare, Thomas Hardy, even Marie-Antoinette– keeping cozy in a countryside escape, through the ages: “Cottagecore Debuted 2,300 Years Ago,” from Angelica Frey (@angelica_frey) in @JSTOR_Daily.
* Hayao Miyazaki, My Neighbor Totoro
###
As we pursue the pastoral, we might recall that it was on this date in 1865, after four years of Civil War, approximately 630,000 deaths, and over 1 million casualties, that General Robert E. Lee surrendered the Confederate Army of Northern Virginia to the commander of the Union Army, Lieutenant General Ulysses S. Grant, at the home of Wilmer and Virginia McLean in the town of Appomattox Court House, Virginia… a one-time pastoral setting.

“A mind that is stretched by a new idea can never go back to its original dimensions”*…
Alex Berezow observes (in an appreciation of Peter Atkins‘ Galileo’s Finger: The Ten Great Ideas of Science) that, while scientific theories are always being tested, scrutinized for flaws, and revised, there are ten concepts so durable that it is difficult to imagine them ever being replaced with something better…
In his book The Structure of Scientific Revolutions, Thomas Kuhn argued that science, instead of progressing gradually in small steps as is commonly believed, actually moves forward in awkward leaps and bounds. The reason for this is that established theories are difficult to overturn, and contradictory data is often dismissed as merely anomalous. However, at some point, the evidence against the theory becomes so overwhelming that it is forcefully displaced by a better one in a process that Kuhn refers to as a “paradigm shift.” And in science, even the most widely accepted ideas could, someday, be considered yesterday’s dogma.
Yet, there are some concepts which are considered so rock solid, that it is difficult to imagine them ever being replaced with something better. What’s more, these concepts have fundamentally altered their fields, unifying and illuminating them in a way that no previous theory had done before…
The bedrock of modern biology, chemistry, and physics: “The ten greatest ideas in the history of science,” from @AlexBerezow in @bigthink.
* Oliver Wendell Holmes
###
As we forage for first principles, we might send carefully-calculated birthday greetings to Georgiy Antonovich Gamov; he was born on this date in 1904. Better known by the name he adopted on immigrating to the U.S., George Gamow, he was a physicist and cosmologist whose early work was instrumental in developing the Big Bang theory of the universe; he also developed the first mathematical model of the atomic nucleus. In 1954, he expanded his interests into biochemistry and his work on deoxyribonucleic acid (DNA) made a basic contribution to modern genetic theory.
But mid-career Gamow began to shift his energy to teaching and to writing popular books on science… one of which, One Two Three… Infinity, inspired legions of young scientists-to-be and kindled a life-long interest in science in an even larger number of other youngsters (including your correspondent).
“If economists could manage to get themselves thought of as humble, competent people on a level with dentists, that would be splendid”*…

Economists have a name for this
There are plenty of economics terms regular people would find not only very interesting, but useful for thinking about policy. Sadly, the most commonly used econ words tend to be the ones with the vaguest meanings — “rational,” “equilibrium” and “efficient.” Instead, here are some of my suggestions:
• Endogeneity
Everyone knows that correlation doesn’t equal causation, but somehow people seem to forget. Endogeneity is a word that can help you remember. Something is endogenous when you don’t know whether it’s a cause or an effect (or both). For example, lots of people note that people who go to college tend to make more money. But how much of this is because college boosts earning power, and how much is because smarter, harder-working, better-connected people tend to go to college in the first place? It’s endogenous. The media is full of stories about how which kind of people stay married, or what diet is associated with better health. Whenever you see these stories, you should ask “What about endogeneity?”…
Noah Smith suggests four other useful concepts in “5 Economics Terms We All Should Use.”
* John Maynard Keynes
###
As we get dismal, we might send fancy birthday greetings to Sir Frederick Henry Royce; he was born on this date in 1863. An engineer and car designer, he founded (with Charles Rolls and Claude Johnson) the Rolls-Royce company, which introduced the first successful luxury cars in the emerging automotive industry.





