(Roughly) Daily

Posts Tagged ‘ideas’

“Behind it all is surely an idea so simple, so beautiful, that when we grasp it—in a decade, a century, or a millennium—we will all say to each other, how could it have been otherwise? How could we have been so stupid?”*…

The album cover of “When the Dust Settles” by STS9

From Plato on (if not, indeed, from even earlier), we’ve struggled to resolve the “shadows on the cave wall” into ever-sharper understandings of the reality “behind” those shadows. The quantum of that effort is the “idea.”

But what is an idea? “Roger’s Bacon” offers a provocative answer…

1. Ideas are alien life forms with an agency and intelligence independent of any mind or substrate which they inhabit. When we say that an idea (a story, a joke, a theory, a work of art) has “taken on a life of its own”, our language betrays an intuitive understanding that science has not yet grasped.

They are as you and I—eating, loving, mating, evolving, dying.

2. We do not create or “have” ideas—if anything is doing the creating or having, it is the ideas themselves.

There are times when we recognize this truth (when an idea “magically” pops into your head from “out of nowhere”), but too often it is obscured by the post-hoc just-so stories we tell ourselves about how I, the Great Thinker, Precious Me, was able to “come up with” the brilliant idea (e.g. I combined two other ideas, I was inspired by a memory, an event, another idea, etc.). Whatever explanation you give, the experience is always the same—the idea simply arrives. All else is confabulation.

Why then does an idea enter one mind and not another? Ideas act as all organisms do—they seek habitats (i.e. minds) that can provide them with the space and resources (i.e. mental runtime, ideas eat the energy that enables action potentials) needed to survive and reproduce (i.e. create new idea-children). Just as some ecosystems are more diverse, abundant, and resilient, some minds are as well. What we call creativity is the quality of possessing a healthy mental ecosystem, one that offers fertile ground for a plenitude of ideas. Ideas may also be attracted to particular minds for more specific reasons—for example, an idea may see that other related ideas (members of the same genus or family) have found the mind to be especially suitable, or perhaps the mind is in dire need of a certain idea and therefore will offer it ample resources upon arrival. Some minds (e.g. those that are dominated by one idea or set of ideas, perhaps a religious or political ideology) provide poor habitat and are avoided by all but the most desperate ideas (e.g. irrational and harmful ideas that can’t find a home elsewhere—this is why conspiracy theories and hateful ideologies tend to congregate in the same minds).

3. Dear reader, I ask you to conduct an experiment.

Create something, anything—write a line of poetry, doodle an image, hum a melody, take some objects near you and arrange them into a sculpture. Now destroy what you created—physically if you can, but also mentally. Forget it completely.

The world is changed. You are changed. The idea will return in one form or another, in your mind or another.

4. Highly creative people, those we might call “geniuses”, sometimes have the intuition that ideas are autonomous living entities. The standard scientific explanation would be that creativity is positively associated with certain mental characteristics (such as theory of mind and schizotypy) that make someone prone to the intuition that ideas possess a degree of autonomous agency, that they are independently alive in some sense. However, another interpretation is possible: ideas do not like to be treated as if they were lifeless, inanimate objects (would you?) and therefore they gravitate towards minds that treat them with the respect and dignity they deserve…

[“RB” shares the fascinating insights of Philip K. Dick, David Lynch, Terence McKenna, and David Abram…]

… 5. Our relation to ideas is an inextricable symbiosis, like that between plant and pollinator, a mutualism in which neither can survive without the other. At the dawn of civilization, a covenant was made between humans and these alien entities which inhabit our minds—honor and respect each other and all will flourish beyond their wildest dreams.

Ideas will help us if we help them. This is why the growth of knowledge depends on certain moral values—freedom, openness, honesty, courage, tolerance, and humility, amongst others. Those cultures that respect these values provide ideal habitat for ideas, and where ideas thrive and multiply, so do humans.

The converse is true as well. When ideas are kept secret or willfully distorted, we suffer. When ideas are regarded as slaves, as mere tools that can be wielded for their owner’s benefit, the end is near.

Our treatment of ideas is at the root of all that ails us. The remedy: worship ideas like Wisdom, Justice, Equality, Peace, and Love as if they were Gods (because in fact they are, something the ancients recognized that we have long since forgotten), and follow one simple rule.

Do unto ideas as you would have them do unto you.

Teach the children, and in one generation—a new world.

6. Perhaps you have wondered if I am being serious, if I truly believe that ideas are alive in a literal sense—“surely he is just playing with metaphor, an interesting thought experiment and some poetic license, but nothing more.” I assure you nothing could be further from the truth. I am under no illusions; as it stands, there is absolutely no shred of evidence for my hypothesis. I have it on nothing but faith and intuition that one day there will be a paradigm shift of Copernican proportions, a revolution that utterly transforms our understanding of Mind and Matter.

Ask yourself: does history not teach us that there are new forms of life still waiting to be discovered which will seem utterly unimaginable to us until some new technology brings them to light? Is it not hubris of the highest order to suppose that we, Modern Man, have finally reached the end of nature’s catalogue? Democritus proposed that the universe consists of tiny indivisible “atoms”; over 2,000 years later he was proven correct. However, we still don’t understand the true nature of these atoms—might they too have a spark of consciousness? Is the idea that ideas are interdimensional endosymbiotic entities made of consciousness really so far-fetched? Yeah, maybe.

7. And this you shall know:

Ideas are Alive and You are Dead…

What is it like to be an idea? “Ideas are Alive and You are Dead,” @theseedsofscience.skystack.xyz via @mastroianni.bsky.social

* John Archibald Wheeler (and, apposite to the piece above, here)

###

As we ponder panpsychism, we might send sentient birthday greetings to a man whose passing we noted last month, and whose work wrestled in a way with these same issues: Pierre Teilhard de Chardin; he was born on this date in 1881.  A Jesuit theologian, philosopher, geologist, and paleontologist, he conceived the idea of the Omega Point (a maximum level of complexity and consciousness towards which he believed the universe was evolving) and developed Vladimir Vernadsky’s concept of the noosphere (a planetary “sphere of reason,” the highest stage of biospheric development and of humankind’s rational activities).  

Teilhard took part in the discovery of Peking Man, and wrote on the reconciliation of faith and evolutionary theory.  His thinking on both these fronts was censored during his lifetime by the Catholic Church (in particular for its implications for “original sin”); but in 2009, the Church lifted its ban.

source

“It is what you read when you don’t have to that determines what you will be when you can’t help it”*…

… What we read– and, librarian Carlo Iacono argues, how we read.

Our inability to focus isn’t a failing. It’s a design problem, and the answer isn’t getting rid of our screen time…

Everyone is panicking about the death of reading. The statistics look damning: the share of Americans who read for pleasure on an average day has fallen by more than 40 per cent over the past 20 years, according to research published in iScience this year. The OECD calls the 2022 decline in educational outcomes ‘unprecedented’ across developed nations. In the OECD’s latest adult-skills survey, Denmark and Finland were the only participating countries where average literacy proficiency improved over the past decade. Your nephew speaks in TikTok references. Democracy itself apparently hangs by the thread of our collective attention span.

This narrative has a seductive simplicity. Screens are destroying civilisation. Children can no longer think. We are witnessing the twilight of the literate mind. A recent Substack essay by James Marriott proclaimed the arrival of a ‘post-literate society’ and invited us to accept this as a fait accompli. (Marriott does also write for The Times.) The diagnosis is familiar: technology has fundamentally degraded our capacity for sustained thought, and there’s nothing to be done except write elegiac essays from a comfortable distance.

I spend my working life in a university library, watching how people actually engage with information. What I observe doesn’t match this narrative. Not because the problems aren’t real, but because the diagnosis is wrong.

The declinist position rests on a category error: treating ‘screen culture’ as a unified phenomenon with inherent cognitive properties. As if the same device that delivers both algorithmically curated rage-bait and the complete works of Shakespeare is itself the problem, rather than how we decide to use it…

[… observing that “people who ‘can’t focus’ on traditional texts can maintain extraordinary concentration when working across modes,” he argues that “we haven’t become post-literate. We’ve become post-monomodal. Text hasn’t disappeared; it’s been joined by a symphony of other channels.”…]

… What troubles me most about the declinist position is not its diagnosis but its conclusion. The commentators who lament the post-literate society often identify the same villains I do. They recognise that technology companies are, in Marriott’s words, ‘actively working to destroy human enlightenment’, that tech oligarchs ‘have just as much of a stake in the ignorance of the population as the most reactionary feudal autocrat.’

And then they surrender. As Marriott says: ‘Nothing will ever be the same again. Welcome to the post-literate society.’

This is the move I cannot follow. To name the actors responsible and then treat the outcome as inevitable is to provide them cover. If the crisis is a force of nature, ‘screens’ destroying civilisation like some technological weather system, then there’s nothing to be done but write elegiac essays from a comfortable distance. But if the crisis is the product of specific design choices made by specific companies for specific economic reasons, then those choices can be challenged, regulated, reversed.

The fatalism, however beautifully expressed, serves the very interests it condemns. The technology companies would very much like us to believe that what they’re doing to human attention is simply the inevitable result of technological progress rather than something they’re doing to us, something that could, with sufficient political will, be stopped.

Your inability to focus isn’t a moral failing. It’s a design problem. You’re trying to think in environments built to prevent thinking. You’re trying to sustain attention in spaces engineered to shatter it. You’re fighting algorithms explicitly optimised to keep you scrolling, not learning.

The solution isn’t discipline. It’s architecture. Build different defaults. Create different spaces. Establish different rhythms. Make depth as easy as distraction currently is. Make thinking feel as natural as scrolling currently does.

What if, instead of mourning some imaginary golden age of pure text, we got serious about designing for depth across all modes? Every video could come with a searchable transcript. Every article could offer multiple entry points for different levels of attention. Our devices could recognise when we’re trying to think and protect that thinking. Schools could teach students to translate between modes the way they once taught translation between languages.

Books aren’t going anywhere. They remain unmatched for certain kinds of sustained, complex thinking. But they’re no longer the only game in town for serious ideas. A well-crafted video essay can carry philosophical weight. A podcast can enable the kind of long-form thinking we associate with written essays. An interactive visualisation can reveal patterns that pages of description struggle to achieve.

The future belongs to people who can dance between all modes without losing their balance. Someone who can read deeply when depth is needed, skim efficiently when efficiency matters, listen actively during a commute, and watch critically when images carry the argument. This isn’t about consuming more. It’s about choosing consciously.

We stand at an inflection point. We can drift into a world where sustained thought becomes a luxury good, where only the privileged have access to the conditions that enable deep thinking. Or we can build something unprecedented: a culture that preserves the best of print’s cognitive gifts while embracing the possibilities of a world where ideas travel through light, sound and interaction.

The choice isn’t between books and screens. The choice is between intentional design and profitable chaos. Between habitats that cultivate human potential and platforms that extract human attention.

The civilisations that thrive won’t be the ones that retreat into text or surrender to the feed. They’ll be the ones that understand a simple truth: every idea has a natural form, and wisdom lies in matching the mode to the meaning. Some ideas want to be written. Others need to be seen. Still others must be heard, felt or experienced. The mistake is forcing all ideas through a single channel, whether that channel is a book or a screen.

Your great-grandchildren won’t read less than you do. They’ll read differently, as part of a richer symphony of sense-making. Whether that symphony sounds like music or noise depends entirely on the choices we make right now about the shape of our tools, the structure of our schools, and the design of our days.

The elegant lamenters offer a eulogy. I’m more interested in a fight…

Reunderstanding reading: “Books and screens,” from @carloiacono.bsky.social in @aeon.co.

* Oscar Wilde

###

As we turn the page, we might note that we’ve been here before, and celebrate the emergence of a design, an innovation, a technology that took on a life of its own and changed reading and… well, everything:  this day in 1455 is the traditionally-given date of the publication of the Gutenberg Bible, the first Western book printed from movable type.

(Lest we think that there’s actually anything new under the sun, we might recall that The Jikji— the world’s oldest known extant movable metal type printed book– was published in Korea in 1377; and that Bi Sheng created the first known movable type– out of wood– in China in 1040.)

Gutenberg Bible on display at the U.S. Library of Congress (source)

Written by (Roughly) Daily

February 23, 2026 at 1:00 am

“To understand anything, you just need to understand the little bits”*…

Oscar Schwartz begs to differ. Here, excerpts from his provocative critique of TED Talks…

Bill Gates wheels a hefty metal barrel out onto a stage. He carefully places it down and then faces the audience, which sits silent in a darkened theater. “When I was a kid, the disaster we worried about most was a nuclear war,” he begins. Gates is speaking at TED’s flagship conference, held in Vancouver in 2015. He wears a salmon pink sweater, and his hair is combed down over his forehead, Caesar-style. “That’s why we had a barrel like this down in our basement, filled with cans of food and water,” he says. “When the nuclear attack came, we were supposed to go downstairs, hunker down, and eat out of that barrel.”

Now that he is an adult, Gates continues, it is no longer nuclear apocalypse that scares him, but pestilence. A year ago, Ebola killed over ten thousand people in West Africa. If the virus had been airborne or spread to a large city center, things would have been far worse. It might’ve snowballed into a pandemic and killed tens of millions of people. Gates tells the TED attendees that humanity is not ready for this scenario — that a pandemic would trigger a global catastrophe at an unimaginable scale. We have no basement to retreat to and no metal barrel filled with supplies to rely on. 

But, Gates adds, the future might turn out okay. He has an idea. Back when he was a kid, the U.S. military had sufficient funding to mobilize for war at any minute. Gates says that we must prepare for a pandemic with the same fearful intensity. We need to build a medical reserve corps. We need to play germ games like generals play war games. We need to make alliances with other virus-fighting nations. We need to build an arsenal of biomedical weapons to attack any non-human entity that might attack our bodies. “If we start now, we can be ready for the next epidemic,” Gates concludes, to a round of applause. 

Of course, Gates’s popular and well-shared TED talk — viewed millions of times — didn’t alter the course of history. Neither did any of the other “ideas worth spreading” (the organization’s tagline) presented at the TED conference that year — including Monica Lewinsky’s massively viral speech about how to stop online bullying through compassion and empathy, or a Google engineer’s talk about how driverless cars would make roads smarter and safer in the near future. In fact, seven years after TED 2015, it feels like we are living in a reality that is the exact opposite of the future envisioned that year. A president took office in part because of his talent for online bullying. Driverless cars are nowhere near as widespread as predicted, and those that do share our roads keep crashing. Covid has killed five million people and counting. 

At the start of the pandemic, I noticed people sharing Gates’s 2015 talk. The general sentiment was one of remorse and lamentation: the tech-prophet had predicted the future for us! If only we had heeded his warning! I wasn’t so sure. It seems to me that Gates’s prediction and proposed solution are at least part of what landed us here. I don’t mean to suggest that Gates’s TED talk is somehow directly responsible for the lack of global preparedness for Covid. But it embodies a certain story about “the future” that TED talks have been telling for the past two decades — one that has contributed to our unending present crisis.

The story goes like this: there are problems in the world that make the future a scary prospect. Fortunately, though, there are solutions to each of these problems, and the solutions have been formulated by extremely smart, tech-adjacent people. For their ideas to become realities, they merely need to be articulated and spread as widely as possible. And the best way to spread ideas is through stories — hence Gates’s opening anecdote about the barrel. In other words, in the TED episteme, the function of a story isn’t to transform via metaphor or indirection, but to actually manifest a new world. Stories about the future create the future. Or as Chris Anderson, TED’s longtime curator, puts it, “We live in an era where the best way to make a dent on the world… may be simply to stand up and say something.” And yet, TED’s archive is a graveyard of ideas. It is a seemingly endless index of stories about the future — the future of science, the future of the environment, the future of work, the future of love and sex, the future of what it means to be human — that never materialized. By this measure alone, TED, and its attendant ways of thinking, should have been abandoned…

… TED talks began to take on a distinct rhetorical style, later laid out in Anderson’s book TED Talks: The Official TED Guide to Public Speaking. In it, Anderson insists anyone is capable of giving a TED-esque talk. You just need an interesting topic and then you need to attach that topic to an inspirational story. Robots are interesting. Using them to eat trash in Nairobi is inspiring. Put the two together, and you have a TED talk.

I like to call this fusion “the inspiresting.” Stylistically, the inspiresting is earnest and contrived. It is smart but not quite intellectual, personal but not sincere, jokey but not funny. It is an aesthetic of populist elitism. Politically, the inspiresting performs a certain kind of progressivism, as it is concerned with making the world a better place, however vaguely…

Perhaps the most incisive critique came, ironically, at a 2013 TEDx conference. In “What’s Wrong with TED Talks?” media theorist Benjamin Bratton told a story about a friend of his, an astrophysicist, who gave a complex presentation on his research before a donor, hoping to secure funding. When he was finished, the donor decided to pass on the project. “I’m just not inspired,” he told the astrophysicist. “You should be more like Malcolm Gladwell.” Bratton was outraged. He felt that the rhetorical style TED helped popularize was “middlebrow megachurch infotainment,” and had begun to directly influence the type of intellectual work that could be undertaken. If the research wasn’t entertaining or moving, it was seen as somehow less valuable. TED’s influence on intellectual culture was “taking something with value and substance and coring it out so that it can be swallowed without chewing,” Bratton said. “This is not the solution to our most frightening problems — rather, this is one of our most frightening problems.” (Online, his talk proved to be one of many ideas worth spreading. “This is by far the most interesting and challenging thing I’ve heard on TED,” one commenter posted. “Very glad to come across it!”)…

Some thoughts on the “inspiresting”: “What Was the TED Talk?​” from @scarschwartz in @thedrift_mag.

* Chris Anderson, proprietor and curator of TED

###

As we unchain our curiosity, we might send ruthlessly curious (and immensely entertaining) birthday greetings to Martin Gardner; he was born on this date in 1914. Though not an academic, nor ever a formal student of math or science, he wrote widely and prolifically on both subjects in such popular books as The Ambidextrous Universe and The Relativity Explosion and as the “Mathematical Games” columnist for Scientific American. Indeed, his elegant– and understandable– puzzles delighted professional and amateur readers alike, and helped inspire a generation of young mathematicians.

Gardner’s interests were wide; in addition to the math and science that were his power alley, he studied and wrote on topics that included magic, philosophy, religion, and literature (cf., especially, his work on Lewis Carroll– including the delightful Annotated Alice— and on G.K. Chesterton).  And he was a fierce debunker of pseudoscience: a founding member of CSICOP, and contributor of a monthly column (“Notes of a Fringe Watcher,” from 1983 to 2002) in Skeptical Inquirer, that organization’s magazine.

Gardner died in 2010, having never given a TED Talk.

source

“Trees and people used to be good friends”*…

Lirika Matoshi’s Strawberry Dress, defined as The Dress of 2020, in Arcadia

What’s old is new again… yet again…

If there’s a style that defines 2020, it has to be “cottagecore.” In March 2020, the New York Times defined it as a “budding aesthetic movement… where tropes of rural self-sufficiency converge with dainty décor to create an exceptionally twee distillation of pastoral existence.” In August, consumer-culture publication The Goods by Vox heralded cottagecore as “the aesthetic where quarantine is romantic instead of terrifying.”

Baking, one of the activities the quarantined population favored at the height of the pandemic, is a staple of cottagecore, whose Instagram hashtag features detailed depictions of home-baked goods. Moreover, the designer Lirika Matoshi’s Strawberry Dress, defined as The Dress of 2020, fully fits into the cottagecore aesthetic. A movement rooted in self-soothing through exposure to nature and land, it proved to be the antidote to the stress of the 2020 pandemic for many.

Despite its invocations of rural and pastoral landscapes, the cottagecore aesthetic is, ultimately, aspirational. While publications covering trends do point out that cottagecore is not new—some locate its origins in 2019, others in 2017—in truth, people have sought to create an escapist and aspirational paradise in the woods or fields for 2,300 years.

Ancient Greece had an enduring fascination with the region of Arcadia, located in the Peloponnesus, which many ancient Greeks first dismissed as a primitive place. After all, Arcadia was far from the refined civilization of Athens. Arcadians were portrayed as hunters, gatherers, and sensualists living in an inclement landscape. In the Hellenistic age, however, Arcadia became an idea in the popular consciousness more than a geographical place…

And the pastoral ideal resurfaced regularly thereafter. Theocritus, Virgil, Longus, Petrarch, Shakespeare, Thomas Hardy, even Marie-Antoinette– keeping cozy in a countryside escape, through the ages: “Cottagecore Debuted 2,300 Years Ago,” from Angelica Frey (@angelica_frey) in @JSTOR_Daily.

Hayao Miyazaki, My Neighbor Totoro

###

As we pursue the pastoral, we might recall that it was on this date in 1865, after four years of Civil War, approximately 630,000 deaths, and over 1 million casualties, that General Robert E. Lee surrendered the Confederate Army of Northern Virginia to the commander of the Union Army, Lieutenant General Ulysses S. Grant, at the home of Wilmer and Virginia McLean in the town of Appomattox Court House, Virginia… a one-time pastoral setting.

Union soldiers at the Appomattox courthouse in April 1865 [source]

“A mind that is stretched by a new idea can never go back to its original dimensions”*…

Alex Berezow observes (in an appreciation of Peter Atkins’ Galileo’s Finger: The Ten Great Ideas of Science) that, while scientific theories are always being tested, scrutinized for flaws, and revised, there are ten concepts so durable that it is difficult to imagine them ever being replaced with something better…

In his book The Structure of Scientific Revolutions, Thomas Kuhn argued that science, instead of progressing gradually in small steps as is commonly believed, actually moves forward in awkward leaps and bounds. The reason for this is that established theories are difficult to overturn, and contradictory data is often dismissed as merely anomalous. However, at some point, the evidence against the theory becomes so overwhelming that it is forcefully displaced by a better one in a process that Kuhn refers to as a “paradigm shift.” And in science, even the most widely accepted ideas could, someday, be considered yesterday’s dogma.

Yet, there are some concepts which are considered so rock solid, that it is difficult to imagine them ever being replaced with something better. What’s more, these concepts have fundamentally altered their fields, unifying and illuminating them in a way that no previous theory had done before…

The bedrock of modern biology, chemistry, and physics: “The ten greatest ideas in the history of science,” from @AlexBerezow in @bigthink.

* Oliver Wendell Holmes

###

As we forage for first principles, we might send carefully-calculated birthday greetings to Georgiy Antonovich Gamov; he was born on this date in 1904. Better known by the name he adopted on immigrating to the U.S., George Gamow, he was a physicist and cosmologist whose early work was instrumental in developing the Big Bang theory of the universe; he also developed the first mathematical model of the atomic nucleus. In 1954, he expanded his interests into biochemistry, and his work on deoxyribonucleic acid (DNA) made a basic contribution to modern genetic theory.

But mid-career Gamow began to shift his energy to teaching and to writing popular books on science… one of which, One Two Three… Infinity, inspired legions of young scientists-to-be and kindled a life-long interest in science in an even larger number of other youngsters (including your correspondent).

source