(Roughly) Daily

Posts Tagged ‘printing press’

“Attention is the rarest and purest form of generosity”*…

An illustration depicting a large black fish with an open mouth, consuming smaller red fish, accompanied by the text 'what price media consolidation?'

… But that most valuable of gifts is being hijacked, subverted/converted into a commodity, and used to mold not just consumer behavior, but society-as-a-whole. We live in an attention economy, and its media/tech ownership landscape is becoming ever more consolidated.

Kyla Scanlon unpacks the way in which concentrated ownership of media and tech and their automated manipulation reshape democracy…

It’s nearly impossible not to get lost in the news right now. I was at a wedding last week, and every conversation eventually drifted back to the same subject: the World We Are in and All That is Happening. The ground feels like it’s moving faster than anyone can feasibly keep up with.

Some people think the shift is progress. Others see collapse. Either way, the line between digital and physical life is increasingly blurry. What happens online is real life. What we consume is what we become.

Plenty of thinkers have circled this before – Postman, Debord, Huxley, Orwell on media; Machiavelli, Tocqueville, Thucydides, Gibbon on human corruptibility during times of uncertainty. The convergence of endless information and a ragebait economy creates the perfect environment for splintering how we understand the world and how we understand each other.

The deeper problem is this: we no longer trust institutions to provide truth, fairness, or mobility. Once, they were scaffolding that helped us climb from raw data to wisdom. And when that scaffolding gives out, people adapt: some over-perform in the status race (because you have to) and others defect from obligations altogether (why would I work for institutions if they don’t work for me?).

There are a few ways to picture our distorted information ecosystem.

  • The DIKW Pyramid (Data → Information → Knowledge → Wisdom): raw posts and clicks at the bottom, trending content in the middle, shared truths above that, and finally wisdom, the rare ability to see causes instead of just symptoms.
  • Or the Ladder of Inference: we start with data, add meaning, make assumptions – and our beliefs tend to affect what data we select. Bots and algorithms hijack that ladder, nudging us toward polarized beliefs before we realize what’s happening.

Taken together, we can combine them into what we might call a hierarchy of information:

  • Raw data: the endless stream of posts, likes, bot spam
  • Information: headlines, hashtags, trending things
  • Knowledge: the narratives we share and fight over
  • Understanding: recognizing what might not be real (or is hyperreal)
  • Wisdom: systemic analysis, the ability to see causes instead of just symptoms

Right now, we’re stuck sloshing around in the middle layers of the hierarchy: drowning in outrage, fighting over partisan hot takes, rarely reaching understanding, almost never wisdom.

Chaos always has an architect. And if we want to make sense of American democracy today, we need to understand who those architects are, and how they profit from confusion.

This polarization rests on media concentration. The Telecommunications Act of 1996 was sold as a way to increase competition in media and telecommunications, but in reality, it did quite the opposite. Within five years, four firms controlled ~85% of US telephone infrastructure. That deregulated spine carried today’s consolidation of the entire media environment – not just telephones. Newspapers. Social media. TV stations.

We have the increasing concentration of media ownership, the financialization of attention, and the transformation of information from a public good into a private commodity to be bought, sold, and manipulated…

[Scanlon characterizes and explains the concentration, examines its impacts, and unpacks the roles of bots…]

When manufacturing consensus is both cheap to produce and valuable to those who benefit from confusion, you get industrial-scale manipulation.

Truth becomes whatever can capture the most attention in the shortest amount of time. Traditional journalism, with its slow fact-checking and institutional processes, can’t compete with bot-amplified outrage. Democratic deliberation, which requires shared facts and good faith dialogue, becomes nearly impossible when the information environment is designed to maximize conflict.

We’re living in a speculation economy where perception drives value more than fundamentals. Look at the stock market: Nvidia gained $150 billion in value on the back of a $100 billion OpenAI investment (which OpenAI will use to buy more Nvidia chips). Ten companies pass hundreds of billions back and forth, and the S&P jumps like it’s measuring something real.

It’s all memes wearing suits. Meme stocks and Dogecoin at least looked like jokes; now the same speculative energy runs through the corporate core. Attention, perception, and narrative drive valuation more than production or profit.

We’ve built a world where the hierarchy of information has flipped upside down.

At the bottom, bots flood us with raw noise. In the middle, outrage and team narratives harden into “knowledge.” At the top, the ladders to wisdom – journalism, schools, civic discourse, shared institutions – are weakened. The scaffolding that once helped us climb no longer holds.

The traditional solutions – fact-checking, media literacy, content moderation – assume we’re dealing with a content problem when we’re actually facing an infrastructure problem. You can’t fact-check your way out of a system designed to reward misinformation. You can’t educate your way around algorithms optimized for polarization. You can’t moderate your way past economic incentives that make confusion profitable.

Recognizing this as a market structure problem rather than an information problem changes everything. Instead of focusing on individual bad actors or specific false claims, you start thinking about the underlying systems that make manipulation both profitable and scalable.

The information wars are economic policy, determining how we allocate attention, structure incentives, and organize the flow of information that shapes every other market and political decision we make. I don’t think it’s useful to get on a Substack soapbox about this – but we need to take (1) the power of media seriously and (2) those trying to influence it extremely seriously. There is a way to get to the top of the information hierarchy! We don’t have to be stuck in these middle layers…

Follow the money: “Who’s Getting Rich Off Your Attention?” from @kyla.bsky.social

For more on how the Telecommunications Act of 1996 helped set all of this in motion, see: “On Jimmy Kimmel: It’s Time to Destroy the Censorship Machine and Repeal the Telecommunications Act of 1996” from @matthewstoller.bsky.social.

For more on why companies are behaving in the ways they are: “Why Corporate America Is Caving to Trump” and “Media consolidation is shaping who folds under political pressure — and who could be next.”

And lest we think that this came out of nowhere: “David Foster Wallace Tried to Warn Us About these Eight Things.”

[Image above: source]

Simone Weil

###

As we reclaim recognition, we might recall that on this date in 1452 an earlier information revolution began: Johannes Gutenberg started work on his Bible (which was completed and published in 1455). An inventor and craftsman, Gutenberg created the movable-type printing press, enabling a much faster (and cheaper) printing process. (Movable type was already in use in East Asia, but was slower and used for smaller jobs.) His Bible was his first major work, and his most impactful.

The printing press later spread across the world, leading to an information revolution– the unprecedented mass-spread of literature throughout Europe. It had a profound impact on the development of the Renaissance, Reformation, and Humanist movements.

A close-up view of an open Gutenberg Bible displayed in a museum, showcasing text on aged paper and illustrating the early printing technique.
Gutenberg Bible in the New York Public Library (source)

“The historian of science may be tempted to exclaim that when paradigms change, the world itself changes with them”*…

What we now call AI has gone through a series of paradigm shifts, and there appears to be no end in sight. Ashlee Vance shares an anecdote that suggests that AI might itself be an agent (perhaps the agent) of a broader paradigm shift (or shifts)…

AI madness is upon many of us, and it can take different forms. In August 2024, for example, I stumbled upon a post from a 20-year-old who had built a nuclear fusor [see here] in his home with a bunch of mail-ordered parts. More to the point, he’d done this while under the tutelage of Anthropic’s Claude AI service…

… The guy who built the fusor in question, Hudhayfa Nazoordeen, better known as HudZah on the internet, was a math student on his summer break from the University of Waterloo. I reached out and asked to see his experiment in person partly because it seemed weird and interesting and partly because it seemed to say something about AI technology and how some people are going to be in for a very uncomfortable time in short order.

A couple days after the fusor posts hit X, I showed up at Nazoordeen’s front door, a typical Victorian in San Francisco’s Lower Haight neighborhood. Nazoordeen, a tall, skinny dude with lots of energy and the gesticulations to match, had been crashing there for the summer with a bunch of his university friends as they tried to soak in the start-up and AI lifestyle. Decades ago, these same kids might have yearned to catch Jerry Garcia and The Dead playing their first gigs or to happen upon an Acid Test. This Waterloo set, though, had a different agenda. They were turned on and LLMed up.

Like many of the Victorian-style homes in the city, this one had a long hallway that stretched from the front door to the kitchen with bedrooms jutting off on both sides. The wooden flooring had been blackened in the center from years of foot traffic, but that was not the first thing anyone would notice. Instead, they’d see the mass of electrical cables that were 10-, 25- and sometimes 50-feet long and coming out of each room and leading to somewhere else in the house.

One of the cables powered a series of mind-reading experiments. Someone in the house, Nazoordeen said, had built his own electroencephalogram (EEG) device for measuring brain activity and had been testing it out on houseguests for weeks. Most of the cables, though, were there to feed GPU clusters, the computing systems filled with graphics chips (often designed by Nvidia) that have powered the recent AI boom. You’d follow a cable from one room to another and end up in front of a black box on the floor. All across San Francisco, I imagined, twenty-somethings were gathered around similar GPU altars to try out their ideas…

Vance tells HudZah’s story, recounts the building of his fusor, explains Claude’s (sometimes reluctant) role, and raises the all-too-legitimate safety questions the experiment raises… though in fairness, one might note that the web is rife with instructions for building a fusor, e.g., here, here, and here, some of which encouraged HudZah.

But in the end, the takeaway for Vance was not the product, but the process…

I must admit, though, that the thing that scared me most about HudZah was that he seemed to be living in a different technological universe than I was. If the previous generation were digital natives, HudZah was an AI native.

HudZah enjoys reading the old-fashioned way, but he now finds that he gets more out of the experience by reading alongside an AI. He puts PDFs of books into Claude or ChatGPT and then queries the books as he moves through the text. He uses Granola to listen in on meetings so that he can query an AI after the chats as well. His friend built Globe Explorer, which can instantly break down, say, the history of rockets, as if you had a professional researcher at your disposal. And, of course, HudZah has all manner of AI tools for coding and interacting with his computer via voice.

It’s not that I don’t use these things. I do. It’s more that I was watching HudZah navigate his laptop with an AI fluency that felt alarming to me. He was using his computer in a much, much different way than I’d seen someone use their computer before, and it made me feel old and alarmed by the number of new tools at our disposal and how HudZah intuitively knew how to tame them.

It also excited me. Just spending a couple of hours with HudZah left me convinced that we’re on the verge of someone, somewhere creating a new type of computer with AI built into its core. I believe that laptops and PCs will give way to a more novel device rather soon.

I’m not sure that people know what’s coming for them. You’re either with the AIs now and really learning how to use them or you’re getting left behind in a profound way. Obviously, these situations follow every major technology transition, but I’m a very tech-forward person, and there were things HudZah could accomplish on his machine that gave off alien vibes to me. So, er, like, good luck if you’re not paying attention to this stuff.

After doing his AI and fusor show for me, HudZah gave me a tour of the house. Most of his roommates had already bailed out and returned to Canada. He was left to clean up the mess, which included piles of beer cans and bottles of booze in the backyard from a last hurrah.

The AI housemates had also left some gold panning equipment in a bathtub. At some point during the summer, they had decided to grab “a shit ton of sand from a nearby creek” and work it over in their communal bathroom for fun.

I’m honestly not sure what the takeaway there was exactly other than that something profound happened to the Bay Area brain in 1849, and it’s still doing its thing…

Goodbye, Digital Natives; hello, AI Natives: “A Young Man Used AI to Build A Nuclear Fusor and Now I Must Weep,” from @ashleevance. Eminently worth reading in full.

And for a look at one attempt to understand what may be the emerging new paradigm(s) of which AI may be a motive part, see Benjamin Bratton‘s explanation of the work he and his colleagues are doing at a new institute at UCSD: “Antikythera.” See his recent Long Now Foundation talk on this same subject here.

On the other hand: “The Future Is Too Easy” (gift article) by David Roth in the always-illuminating Defector.

(Image above: source)

Thomas Kuhn

###

As we ponder progress, we might spare a thought for Johannes Gutenberg; he died on this date in 1468. A craftsman and inventor, he invented the movable-type printing press. (Though movable type was already in use in East Asia, Gutenberg’s invention of the printing press enabled a much faster rate of printing.)

The printing press spread across the world and led to an information revolution and the unprecedented mass-spread of literature throughout Europe. It was a profound enabler of the arts and the sciences of the Renaissance, of the Reformation (and Counter-Reformation), and of humanist movements… which is to say that it contributed to a series of paradigm shifts.

source

“You say you’re a pessimist, but I happen to know that you’re in the habit of practicing your flute for two hours every evening”*…

The Harrowing of Hell, Hieronymus Bosch

A couple of weeks ago, (R)D featured a piece by Jonathan Haidt, “Why the Past 10 Years of American Life Have Been Uniquely Stupid,” in which Haidt critiqued, among others, Robert Wright and his influential book, Non-Zero. In the spirit of George Bernard Shaw (who observed: “Both optimists and pessimists contribute to society. The optimist invents the aeroplane, the pessimist the parachute.“) Wright responds…

… There are three main culprits in Haidt’s story, three things that have torn our world asunder: the like button, the share button (or, on Twitter, the retweet button), and the algorithms that feed on those buttons. “Babel is a metaphor for what some forms of social media have done to nearly all of the groups and institutions most important to the country’s future—and to us as a people.”

I would seem uniquely positioned to cheer us up by taking issue with Haidt’s depressing diagnosis. Near the beginning of his piece, he depicts my turn-of-the-millennium book Nonzero: The Logic of Human Destiny as in some ways the antithesis of his thesis—as sketching a future in which information technology unites rather than divides…

Well, two things I’m always happy to do are (1) cheer people up; and (2) defend a book I’ve written. I’d like to thank Haidt (who is actually a friend—but whom I’ll keep calling “Haidt” to lend gravitas to this essay) for providing me the opportunity to do both at once.

But don’t let your expectations get too high about the cheering people up part—because, for starters, the book I’m defending wasn’t that optimistic. I wrote in Nonzero, “While I’m basically optimistic, an extremely bleak outcome is obviously possible.” And even if we avoid a truly apocalyptic fate, I added, “several moderately bleak outcomes are possible.”

Still, looking around today, I don’t see quite as much bleakness as Haidt seems to see. And one reason, I think, is that I don’t see the causes of our current troubles as being quite as novel as he does. We’ve been here before, and humankind survived…

Read on for a brief history of humankind’s wrestling with new information technologies (e.g., writing and the printing press). Wright concludes…

In underscoring the importance of working to erode the psychology of tribalism (a challenge approachable from various angles, including one I wrote a book about), I don’t mean to detract from the value of piecemeal reforms. Haidt offers worthwhile ideas about how to make social media less virulent and how to reduce the paralyzing influence of information technology on democracy. (He spends a lot of time on the info tech and democracy issue—and, once again, I’d say he’s identified a big problem but also a longstanding problem; I wrote about it in 1995, in a Time magazine piece whose archival version is mis-dated as 2001.) The challenge we face is too big to let any good ideas go to waste, and Haidt’s piece includes some good ones.

Still, I do think that stepping back and looking at the trajectory of history lets us assess the current turmoil with less of a sense of disorientation than Haidt seems to feel. At least, that’s one takeaway from my argument in Nonzero, which chronicled how the evolution of technology, especially information technology, had propelled human social organization from the hunter-gatherer village to the brink of global community—a threshold that, I argued, we will fail to cross at our peril.

This isn’t the place to try to recapitulate that argument in compelling form. (There’s a reason I devoted a whole book to it.) So there’s no reason the argument should make sense to you right now. All I can say is that if you do ever have occasion to assess the argument, and it does make sense to you, the turbulence we’re going through will also make more sense to you.

“Is Everything Falling Apart?” @JonHaidt thinks so; @robertwrighter is not so sure.

Apposite: “An optimist’s guide to the future: the economist who believes that human ingenuity will save the world,” and “The Future Will Be Shaped by Optimists,” from @kevin2kelly at @TedConferences.

* Friedrich Nietzsche (criticizing Schopenhauer)

###

As we look on the bright side of life, we might send darkly-tinted birthday greetings to Oswald Spengler; he was born on this date in 1880. Best known for his two-volume work, The Decline of the West (Der Untergang des Abendlandes), published in 1918 and 1922, he was a historian and philosopher of history who developed an “organic theory” of history that suggested that human cultures and civilizations are akin to biological entities, each with a limited, predictable, and deterministic lifespan– and that around the year 2000, Western civilization would enter the period of pre‑death emergency whose countering would lead to 200 years of Caesarism (extra-constitutional omnipotence of the executive branch of government) before Western civilization’s final collapse. He was a major influence on many historians (including Arnold Toynbee and Samuel “Clash of Civilizations” Huntington).

source

“English is flexible: you can jam it into a Cuisinart for an hour, remove it, and meaning will still emerge”*…

Plus ça change. The opening pages of The Lytille Childrenes Lytil Boke, an instructional book of table manners dating from around 1480 and written in Middle English. Amongst other directives, children are told ‘Bulle not as a bene were in thi throote’ (Don’t burp as if you had a bean in your throat) and ‘Pyke notte thyne errys nothyr thy nostrellys’ (Don’t pick your ears or nose).

To be honest, it is a mess…

English spelling is ridiculous. Sew and new don’t rhyme. Kernel and colonel do. When you see an ough, you might need to read it out as ‘aw’ (thought), ‘ow’ (drought), ‘uff’ (tough), ‘off’ (cough), ‘oo’ (through), or ‘oh’ (though). The ea vowel is usually pronounced ‘ee’ (weak, please, seal, beam) but can also be ‘eh’ (bread, head, wealth, feather). Those two options cover most of it – except for a handful of cases, where it’s ‘ay’ (break, steak, great). Oh wait, one more… there’s earth. No wait, there’s also heart.

The English spelling system, if you can even call it a system, is full of this kind of thing. Yet not only do most people raised with English learn to read and write it; millions of people who weren’t raised with English learn to use it too, to a very high level of accuracy.

Admittedly, for a non-native speaker, such mastery usually involves a great deal of confusion and frustration. Part of the problem is that English spelling looks deceptively similar to other languages that use the same alphabet but in a much more consistent way. You can spend an afternoon familiarising yourself with the pronunciation rules of Italian, Spanish, German, Swedish, Hungarian, Lithuanian, Polish and many others, and credibly read out a text in that language, even if you don’t understand it. Your pronunciation might be terrible, and the pace, stress and rhythm would be completely off, and no one would mistake you for a native speaker – but you could do it. Even French, notorious for the spelling challenges it presents learners, is consistent enough to meet the bar. There are lots of silent letters, but they’re in predictable places. French has plenty of rules, and exceptions to those rules, but they can all be listed on a reasonable number of pages.

English is in a different league of complexity. The most comprehensive description of its spelling – the Dictionary of the British English Spelling System by Greg Brooks (2015) – runs to more than 450 pages as it enumerates all the ways particular sounds can be represented by letters or combinations of letters, and all the ways particular letters or letter combinations can be read out as sounds.

From the early Middle Ages, various European languages adopted and adapted the Latin alphabet. So why did English end up with a far more inconsistent orthography than any other? The basic outline of the messy history of English is widely known: the Anglo-Saxon tribes bringing Old English in the 5th century, the Viking invasions beginning in the 8th century adding Old Norse to the mix, followed by the Norman Conquest of the 11th century and the French linguistic takeover. The moving and mixing of populations, the growth of London and the merchant class in the 13th and 14th centuries. The contact with the Continent and the balance among Germanic, Romance and Celtic cultural forces. No language Academy was established, no authority for oversight or intervention in the direction of the written form. English travelled and wandered and haphazardly tied pieces together. As the blogger James Nicoll put it in 1990, English ‘pursued other languages down alleyways to beat them unconscious and rifle their pockets for new vocabulary’.

But just how does spelling factor into all this? It wasn’t as if the rest of Europe didn’t also contend with a mix of tribes and languages. The remnants of the Roman Empire comprised Germanic, Celtic and Slavic communities spread over a huge area. Various conquests installed a ruling-class language in control of a population that spoke a different language: there was the Nordic conquest of Normandy in the 10th century (where they now write French with a pretty regular system); the Ottoman Turkish rule over Hungary in the 16th and 17th centuries (which now has very consistent spelling rules for Hungarian); Moorish rule in Spain in the 8th to 15th centuries (which also has very consistent spelling). True, other languages did have official academies and other government attempts at standardisation – but those interventions have largely only ever succeeded at implementing minor changes to existing systems in very specific areas. English wasn’t the only language to pick the pockets of others for useful words.

The answer to the weirdness of English has to do with the timing of technology. The rise of printing caught English at a moment when the norms linking spoken and written language were up for grabs, and so could be hijacked by diverse forces and imperatives that didn’t coordinate with each other, or cohere, or even have any distinct goals at all. If the printing press had arrived earlier in the life of English, or later, after some of the upheaval had settled, things might have ended up differently…

Why is English spelling so weird and unpredictable? Don’t blame the mix of languages; look to quirks of timing and technology: “Typos, tricks, and misprints,” from Arika Okrent (@arikaokrent).

* Douglas Coupland

###

As we muse on the mother tongue, we might spare a thought for a man who used it to wonderful effect: Seymour Wilson “Budd” Schulberg. The son of B. P. Schulberg (head of production at Paramount Pictures in its 1920s-30s heyday) and Adeline Jaffe Schulberg (who founded one of Hollywood’s most successful talent/literary agencies), Budd went into the family business, finding success as a screenwriter, television producer, novelist, and sports writer. He is probably best remembered for his novels What Makes Sammy Run? and The Harder They Fall, his Academy Award-winning screenplay for On the Waterfront, and his (painfully prescient) screenplay for A Face in the Crowd.

source

“The gods keep livelihood hidden from men. Otherwise a day’s labor could bring man enough to last a whole year with no more work.”*…

 

Upheaval more than a century into the Industrial Revolution, and more than 100 years ago:
an Industrial Workers of the World union demonstration
in New York City in 1914. Credit: Library of Congress

 

As automation and artificial intelligence technologies improve, many people worry about the future of work. If millions of human workers no longer have jobs, the worriers ask, what will people do, how will they provide for themselves and their families, and what changes might occur (or be needed) in order for society to adjust?

Many economists say there is no need to worry. They point to how past major transformations in work tasks and labor markets – specifically the Industrial Revolution during the 18th and 19th centuries – did not lead to major social upheaval or widespread suffering. These economists say that when technology destroys jobs, people find other jobs…

They are definitely right about the long period of painful adjustment! The aftermath of the Industrial Revolution involved two major Communist revolutions, whose death toll approaches 100 million. The stabilizing influence of the modern social welfare state emerged only after World War II, nearly 200 years on from the 18th-century beginnings of the Industrial Revolution.

Today, as globalization and automation dramatically boost corporate productivity, many workers have seen their wages stagnate. The increasing power of automation and artificial intelligence technology means more pain may follow. Are these economists minimizing the historical record when projecting the future, essentially telling us not to worry because in a century or two things will get better?…

We should listen not only to economists when it comes to predicting the future of work; we should listen also to historians, who often bring a deeper historical perspective to their predictions. Automation will significantly change many people’s lives in ways that may be painful and enduring.

Get a start on understanding that history at “What the Industrial Revolution Really Tells Us About the Future of Automation and Work.”

* Hesiod, Work and Days

###

As we hum “Hi Ho, Hi Ho,” we might send ink-stained birthday greetings to Richard March Hoe; he was born on this date in 1812.  In 1847, he patented the rotary printing press.  Hoe had invented the press a couple of years earlier and improved it before submission. His creation greatly increased the speed of printing, as it involved rolling a cylinder over stationary plates of inked type, using the cylinder to make an impression on paper– thus eliminating the need to make impressions from pressing type plates, which were heavy and difficult to maneuver.  In 1871, Hoe added the ability to print to continuous rolls of paper, creating the “web press” that revolutionized newspaper and magazine printing.  His first customer was Horace Greeley’s New York Tribune.

Hoe’s “web perfecting press,” with continuous feed

source

 

Written by (Roughly) Daily

September 12, 2017 at 1:01 am