“Nothing is so painful to the human mind as a great and sudden change”*…
If an AI-infused web is the future, what can we learn from the past? Jeff Jarvis has some provocative thoughts…
The Gutenberg Parenthesis—the theory that inspired my book of the same name—holds that the era of print was a grand exception in the course of history. I ask what lessons we may learn from society’s development of print culture as we leave it for what follows: the connected age of networks, data, and intelligent machines—and as we negotiate the fate of such institutions as copyright, the author, and mass media as they are challenged by developments such as generative AI.
Let’s start from the beginning…
In examining the half-millennium of print’s history, three moments in time struck me:
- After Johannes Gutenberg’s development of movable type in the 1450s in Europe (separate from its prior invention in China and Korea), it took a half-century for the book as we now know it to evolve out of its scribal roots—with titles, title pages, and page numbers. It took another century, until the years on either side of 1600, before there arose tremendous innovation with print: the invention of the modern novel with Cervantes, the essay with Montaigne, a market for printed plays with Shakespeare, and the newspaper.
- It took another century before a business model for print at last emerged with copyright, which was enacted in Britain in 1710, not to protect authors but instead to transform literary works into tradable assets, primarily for the benefit of the still-developing industry of publishing.
- And it was one more century—after 1800—before major changes came to the technology of print: the steel press, stereotyping (to mold complete pages rather than resetting type with every edition), steam-powered presses, paper made from abundant wood pulp instead of scarce rags, and eventually the marvelous Linotype, eliminating the job of the typesetter. Before the mechanization and industrialization of print, the average circulation of a daily newspaper in America was 4,000 (the size of a healthy Substack newsletter these days). Afterwards, mass media, the mass market, and the idea of the mass were born alongside the advertising to support them.
One lesson in this timeline is that the change we experience today, which we think is moving fast, is likely only the beginning. We are only a quarter century past the introduction of the commercial web browser, which puts us at about 1480 in Gutenberg years. There could be much disruption and invention still ahead. Another lesson is that many of the institutions we assume are immutable—copyright, the concept of creativity as property, mass media and its scale, advertising and the attention economy—are not forever. That is to say that we can reconsider, reinvent, reject, or replace them as need and opportunity present…
Read on for his suggestion for a reinvention of copyright: “Gutenberg’s lessons in the era of AI,” from @jeffjarvis via @azeem in his valuable newsletter @ExponentialView.
* Mary Wollstonecraft Shelley, Frankenstein
###
As we contemplate change, we might spare a thought for Jan Hus. A Czech theologian and philosopher who became a Church reformer, he was burned at the stake as a heretic (for condemning indulgences and the Crusades) on this date in 1415. His teachings (which largely echoed those of Wycliffe) had a strong influence, over a century later, on Martin Luther, helping inspire the Reformation… which was fueled by Gutenberg’s technology, which had been developed and begun to spread in the meantime.

“No problem can be solved from the same level of consciousness that created it”*…
… perhaps especially not the problem of consciousness itself. At least for now…
A 25-year science wager has come to an end. In 1998, neuroscientist Christof Koch bet philosopher David Chalmers that the mechanism by which the brain’s neurons produce consciousness would be discovered by 2023. Both scientists agreed publicly on 23 June, at the annual meeting of the Association for the Scientific Study of Consciousness (ASSC) in New York City, that it is still an ongoing quest — and declared Chalmers the winner.
What ultimately helped to settle the bet was a key study testing two leading hypotheses about the neural basis of consciousness, whose findings were unveiled at the conference.
“It was always a relatively good bet for me and a bold bet for Christof,” says Chalmers, who is now co-director of the Center for Mind, Brain and Consciousness at New York University. But he also says this isn’t the end of the story, and that an answer will come eventually: “There’s been a lot of progress in the field.”
Consciousness is everything a person experiences — what they taste, hear, feel and more. It is what gives meaning and value to our lives, Chalmers says.
Despite a vast effort — and a 25-year bet — researchers still don’t understand how our brains produce it. “It started off as a very big philosophical mystery,” Chalmers adds. “But over the years, it’s gradually been transmuting into, if not a ‘scientific’ mystery, at least one that we can get a partial grip on scientifically.”…
Neuroscientist Christof Koch wagered philosopher David Chalmers 25 years ago that researchers would learn how the brain achieves consciousness by now. But the quest continues: “Decades-long bet on consciousness ends — and it’s philosopher 1, neuroscientist 0,” from @Nature. Eminently worth reading in full for background and state-of-play.
* Albert Einstein
###
As we ponder pondering, we might spare a thought for Vannevar Bush; he died on this date in 1974. An engineer, inventor, and science administrator, he headed the World War II U.S. Office of Scientific Research and Development (OSRD), through which almost all wartime military R&D was carried out, including important developments in radar and the initiation and early administration of the Manhattan Project. He emphasized the importance of scientific research to national security and economic well-being, and was chiefly responsible for the movement that led to the creation of the National Science Foundation.
Bush also did his own work. Before the war, in 1925, at age 35, he developed the differential analyzer, a pioneering large-scale analog computer capable of solving differential equations. It put into productive form the mechanical concepts left incomplete by Charles Babbage a half-century earlier, along with theoretical work by Lord Kelvin. The machine filled a 20-by-30-foot room. And in his 1945 essay “As We May Think,” Bush seeded ideas later realized in hypertext and the links of the web.
“Ice contains no future, just the past, sealed away”*…
From her new book, Ice: From Mixed Drinks to Skating Rinks—A Cool History of a Hot Commodity, Amy Brady…
Despite more than 150 years’ worth of study and experimentation, no one really knows why ice is slippery…
Nineteenth-century Americans used ice to store perishable foods in amounts that astounded visitors from Europe, where an ice trade had yet to be developed. Apples, for example, became so commonplace in the young republic that visitors coined the phrase “as American as apple pie.”…
By WWII, the burgeoning industry of electric refrigeration was catching up to the ice industry, and companies like the Southland Ice Company were forced to rethink their business plans. Southland began selling kitchen staples like milk and bread alongside their ice. The combination became so popular, the company extended its hours to keep up with demand, and within a few years renamed itself after its new hours of operation. The 7-Eleven was born, and convenience stores today still sell ice…
Between WWII and 1975, the amount of electricity refrigerators consumed grew by more than 350 percent. Today, a look at energy use around the globe reveals that the cooling industry (refrigerators, freezers, and air conditioners) accounts for almost 10 percent of all CO2 emissions…
Six more cool facts at “10 Things You Probably Didn’t Know about Ice,” from @ingredient_x in @Orion_Magazine.
* Haruki Murakami
###
As we chill, we might recall that it was on this date in 1982 that a record low temperature of -117°F was recorded in Antarctica. That record was broken the following year, also in Antarctica, at -128.6°F, a mark that stands to this day, as Antarctica has been warming… leading Dr. Brady to ask, “in an age of accelerating global warming… can ice in the freezer and ice on our planetary poles continue to coexist?”

“The relationship between commitment and doubt is by no means an antagonistic one. Commitment is healthiest when it is not without doubt, but in spite of doubt.”*…
Jamie Catherwood with the story of an inspired Catholic priest and hard-earned belief…
On October 28, 1893, two days before the wildly popular World’s Fair in his city came to an end, a disturbed office seeker shot and killed the Mayor of Chicago in his own home… This devastated Chicago and America more broadly. The high-profile assassination detracted from a triumphant World’s Fair, which had showcased American strength and global prominence.
For whatever reason, it really devastated… Catholic priest Casimir Zeglen. A local newspaper reported:
“the sensitive priest was shocked more than most people, because it occurred to him that there must be some way to create bullet-proof clothing that would protect people who, by their position, are most vulnerable to fanatics.”
And so… Zeglen began designing a bullet-proof vest… Around this time, there were a few publicized cases of men being shot in the chest and surviving because a silk handkerchief in their breast pockets, folded a few times, had stopped the bullet. This inspired Zeglen to explore the application of silk on a larger scale: bulletproof vests. He spent two years experimenting and tinkering, and in 1897 Zeglen received two patents from the USPTO for his invention: “armor protecting against bullets from a handgun.”
Although he was not always the target, Zeglen frequently placed himself in the line of fire during public demonstrations… How much conviction must you have in your own research to willingly place yourself in harm’s way? And that’s just it. True conviction can only come from deep research and long hours. Doing the work.
I’d never let someone shoot me while wearing a silk vest unless I’d spent thousands of hours researching, building, iterating, and studying everything there is to know about bulletproof materials.
Yet Zeglen had done just that. Backed by meticulous research and years of trial and error, Casimir Zeglen demonstrated his invention before the new Chicago Mayor – four years after the assassination of his predecessor – and the Chicago Police Department…
“A Story of Conviction & Bulletproof Priests,” from @InvestorAmnesia.
For more on Zeglen, see here; for more on other early bulletproof vests (and the dramatic photos they spawned), here.
* Rollo May, The Courage to Create
###
As we do the work, we might recall that it was on this date in 1946 that Annie Get Your Gun opened at the Imperial Theater on Broadway. With a score by Irving Berlin and a book by Dorothy Fields and her brother Herbert Fields, it told the fictionalized story of Annie Oakley (1860–1926), a sharpshooter who starred in Buffalo Bill’s Wild West, and her romance with sharpshooter Frank E. Butler (1847–1926).
It was a huge hit, running for 1,147 performances, and spawned revivals, a 1950 film version, and television adaptations. Songs that became hits include “There’s No Business Like Show Business,” “Doin’ What Comes Natur’lly,” “You Can’t Get a Man with a Gun,” “They Say It’s Wonderful,” and “Anything You Can Do (I Can Do Better).”

“‘Big’ government? Who wants that? I just want effective government. That means America’s government needs to be big in some places, small in others and non-existent in others.”*…
The Red would like to see less; the Blue, more. Still, for all of their contention, the right and the left in the U.S. agree that government isn’t doing the job that it could and should.
Indeed, the quantitative question obscures a qualitative issue: as our lives (and our businesses) have become more digital, governments have fallen behind in taking effective advantage of technology. Jen Pahlka, founder of Code for America, has devoted much of her career to precisely that problem. Now she’s sharing the lessons she’s learned in a new book. Your correspondent has read and deeply appreciated it; but don’t take his word for it…
Beginning with “I’m Just a Bill,” an animated musical introduction to the American legislative system from Schoolhouse Rock!, Pahlka, the deputy chief technology officer during the Obama administration, delivers an eye-opening and accessible examination of why online interactions with government in America work—or, often, do not. The author provides numerous examples of failures, including a form for Veterans Affairs health insurance that only really worked on certain computers with certain versions of software; the development of healthcare.gov, where “the full set of rules governing the program they were supposed to administer wasn’t finalized until the site was due to launch”; and an “application for food stamps that requires answering 212 separate questions.” Through these and many other illustrative cases, Pahlka effectively shows that “when systems or organizations don’t work the way you think they should, it is generally not because the people in them are stupid or evil. It is because they are operating according to structures and incentives that aren’t obvious from the outside.” Indeed, by tracing the requirements of any technology developed by or for the government, it becomes increasingly apparent that simply adding new laws or throwing money at the problems fails to alleviate the confusion or waste. Throughout this empowering book, the author makes compelling, clear arguments, revealing inefficiency, bureaucracy, and incompetence, whether it stems from legislators, administrators, or IT professionals. “The good news is that software and the US government have something very important in common: they are made by and for people,” writes Pahlka. “In the end, we get to decide how they work.” Anyone dealing with the implementation of technology in government should pay attention to the author’s suggestions…
Starred review, Kirkus Reviews
An important– and eminently readable– exploration of the fraught intersection of technological innovation and government bureaucracy, and a guide to navigating it: Recoding America: why government is failing in the digital age and how we can do better, from @pahlkadot.
* Van Jones
###
As we get smart, we might spare a thought for Auguste Lumière; he died on this date in 1954. The son of a French portrait painter who added photography to his repertoire, Auguste joined with his brother Louis to pioneer a pre-digital technology that changed the world: cinema.
Their father returned in 1894 from a trip to the U.S., where he’d been enchanted by Edison’s kinetoscope. The brothers (who’d already pioneered new darkroom techniques for still photography) were excited… until they understood that Edison’s display could be seen by only a single viewer at a time. They envisioned something different: a projected image that could be shared by an audience, in the same way that audiences share a play. With his brother’s help, Lumière designed the Cinematograph, a self-contained camera and projector that used a claw gear to advance sprocketed film. It was the first apparatus for making and showing films to audiences in a way that would be recognizable today as “going to the movies”; thus the Lumière brothers are often credited as inventors of the motion picture. In any case, the principle at work in the Cinematograph was the one used in movie cameras and projectors for more than a century afterwards.