(Roughly) Daily


“Nothing is so painful to the human mind as a great and sudden change”*…

If an AI-infused web is the future, what can we learn from the past? Jeff Jarvis has some provocative thoughts…

The Gutenberg Parenthesis—the theory that inspired my book of the same name—holds that the era of print was a grand exception in the course of history. I ask what lessons we may learn from society’s development of print culture as we leave it for what follows in the connected age of networks, data, and intelligent machines—and as we negotiate the fate of such institutions as copyright, the author, and mass media as they are challenged by developments such as generative AI.

Let’s start from the beginning…

In examining the half-millennium of print’s history, three moments in time struck me: 

  • After Johannes Gutenberg’s development of movable type in the 1450s in Europe (separate from its prior invention in China and Korea), it took a half-century for the book as we now know it to evolve out of its scribal roots—with titles, title pages, and page numbers. It took another century, until this side and that of 1600, before there arose tremendous innovation with print: the invention of the modern novel with Cervantes, the essay with Montaigne, a market for printed plays with Shakespeare, and the newspaper.
  • It took another century before a business model for print at last emerged with copyright, which was enacted in Britain in 1710, not to protect authors but instead to transform literary works into tradable assets, primarily for the benefit of the still-developing industry of publishing. 
  • And it was one more century—after 1800—before major changes came to the technology of print: the steel press, stereotyping (to mold complete pages rather than resetting type with every edition), steam-powered presses, paper made from abundant wood pulp instead of scarce rags, and eventually the marvelous Linotype, eliminating the job of the typesetter. Before the mechanization and industrialization of print, the average circulation of a daily newspaper in America was 4,000 (the size of a healthy Substack newsletter these days). Afterwards, mass media, the mass market, and the idea of the mass were born alongside the advertising to support them. 

One lesson in this timeline is that the change we experience today, which we think is moving fast, is likely only the beginning. We are only a quarter century past the introduction of the commercial web browser, which puts us at about 1480 in Gutenberg years. There could be much disruption and invention still ahead. Another lesson is that many of the institutions we assume are immutable—copyright, the concept of creativity as property, mass media and its scale, advertising and the attention economy—are not forever. That is to say that we can reconsider, reinvent, reject, or replace them as need and opportunity present…

Read on for his suggestion for a reinvention of copyright: “Gutenberg’s lessons in the era of AI,” from @jeffjarvis via @azeem in his valuable newsletter @ExponentialView.

* Mary Wollstonecraft Shelley, Frankenstein

###

As we contemplate change, we might spare a thought for Jan Hus. A Czech theologian and philosopher who became a Church reformer, he was burned at the stake as a heretic (for condemning indulgences and the Crusades) on this date in 1415. His teachings (which largely echoed those of Wycliffe) had a strong influence, over a century later, on Martin Luther, helping inspire the Reformation… which was fueled by Gutenberg’s technology, which had been developed and begun to spread in the meantime.

Jan Hus at the stake, Jena codex (c. 1500) source

“I get slightly obsessive about working in archives because you don’t know what you’re going to find. In fact, you don’t know what you’re looking for until you find it.”*…

An update on that remarkable treasure, The Internet Archive…

Within the walls of a beautiful former church in San Francisco’s Richmond district [the facade of which is pictured above], racks of computer servers hum and blink with activity. They contain the internet. Well, a very large amount of it.

The Internet Archive, a non-profit, has been collecting web pages since 1996 for its famed and beloved Wayback Machine. In 1997, the collection amounted to 2 terabytes of data. Colossal back then, it would fit on a $50 thumb drive now.

Today, the archive’s founder Brewster Kahle tells me, the project is on the brink of surpassing 100 petabytes – approximately 50,000 times larger than in 1997. It contains more than 700bn web pages.

The work isn’t getting any easier. Websites today are highly dynamic, changing with every refresh. Walled gardens like Facebook are a source of great frustration to Kahle, who worries that much of the political activity that has taken place on the platform could be lost to history if not properly captured. In the name of privacy and security, Facebook (and others) make scraping difficult. News organisations’ paywalls (such as the FT’s) are also “problematic”, Kahle says. News archiving used to be taken extremely seriously, but changes in ownership or even just a site redesign can mean disappearing content. The technology journalist Kara Swisher recently lamented that some of her early work at The Wall Street Journal has “gone poof”, after the paper declined to sell the material to her several years ago…
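The scale of that growth is easy to sanity-check; here’s a quick back-of-the-envelope in Python (a sketch assuming decimal units, 1 TB = 10^12 bytes and 1 PB = 10^15 bytes; the per-page average is our derived estimate, not a figure from the article):

# Rough scale check on the Internet Archive figures quoted above
TB, PB = 10**12, 10**15

size_1997 = 2 * TB         # the 1997 collection
size_today = 100 * PB      # the ~100 PB the archive is approaching
pages = 700 * 10**9        # "more than 700bn web pages"

print(f"growth factor: {size_today / size_1997:,.0f}x")         # 50,000x
print(f"average per page: {size_today / pages / 1000:.0f} kB")  # ~143 kB

(That ~143 kB average lumps page text together with captured images and other assets, so treat it as loose intuition rather than a real measurement.)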

A quarter of a century after it began collecting web pages, the Internet Archive is adapting to new challenges: “The ever-expanding job of preserving the internet’s backpages” (gift article) from @DaveLeeFT in the @FinancialTimes.

* Antony Beevor

###

As we celebrate collection, we might recall that it was on this date in 2001 that the Polaroid Corporation, best known for its instant film and cameras, filed for bankruptcy. Its employment had peaked in 1978 at 21,000; its revenues, in 1991, at $3 billion.

Polaroid 80B Highlander instant camera made in the USA, circa 1959

source

Written by (Roughly) Daily

October 11, 2022 at 1:00 am

“History is who we are and why we are the way we are”*…

What a long, strange trip it’s been…

March 12, 1989: “Information Management: A Proposal”

While working at CERN, Tim Berners-Lee first comes up with the idea for the World Wide Web. To pitch it, he submits a proposal for organizing scientific documents to his employers titled “Information Management: A Proposal.” In this proposal, Berners-Lee sketches out what the web will become, including early versions of the HTTP protocol and HTML.

The first entry in a timeline that serves as a table of contents for a series of informative blog posts: “The History of the Web,” from @jay_hoffmann.

* David McCullough

###

As we jack in, we might recall that it was on this date in 1969 that the world first learned of what would become the internet, which would, in turn, become the backbone of the web: UCLA announced it would “become the first station in a nationwide computer network which, for the first time, will link together computers of different makes and using different machine languages into one time-sharing system.” It went on to say that “Creation of the network represents a major forward step in computer technology and may serve as the forerunner of large computer networks of the future.”

UCLA will become the first station in a nationwide computer network which, for the first time, will link together computers of different makes and using different machine languages into one time-sharing system.

Creation of the network represents a major forward step in computer technology and may serve as the forerunner of large computer networks of the future.

The ambitious project is supported by the Defense Department’s Advanced Research Projects Agency (ARPA), which has pioneered many advances in computer research, technology and applications during the past decade. The network project was proposed and is headed by ARPA’s Dr. Lawrence G. Roberts.

The system will, in effect, pool the computer power, programs and specialized know-how of about 15 computer research centers, stretching from UCLA to M.I.T. Other California network stations (or nodes) will be located at the Rand Corp. and System Development Corp., both of Santa Monica; the Santa Barbara and Berkeley campuses of the University of California; Stanford University and the Stanford Research Institute.

The first stage of the network will go into operation this fall as a subnet joining UCLA, Stanford Research Institute, UC Santa Barbara, and the University of Utah. The entire network is expected to be operational in late 1970.

Engineering professor Leonard Kleinrock [see here], who heads the UCLA project, describes how the network might handle a sample problem:

Programmers at Computer A have a blurred photo which they want to bring into focus. Their program transmits the photo to Computer B, which specializes in computer graphics, and instructs B’s program to remove the blur and enhance the contrast. If B requires specialized computational assistance, it may call on Computer C for help.

The processed work is shuttled back and forth until B is satisfied with the photo, and then sends it back to Computer A. The messages, ranging across the country, can flash between computers in a matter of seconds, Dr. Kleinrock says.

UCLA’s part of the project will involve about 20 people, including some 15 graduate students. The group will play a key role as the official network measurement center, analyzing computer interaction and network behavior, comparing performance against anticipated results, and keeping a continuous check on the network’s effectiveness. For this job, UCLA will use a highly specialized computer, the Sigma 7, developed by Scientific Data Systems of Los Angeles.

Each computer in the network will be equipped with its own interface message processor (IMP) which will double as a sort of translator among the Babel of computer languages and as a message handler and router.

Computer networks are not an entirely new concept, notes Dr. Kleinrock. The SAGE radar defense system of the Fifties was one of the first, followed by the airlines’ SABRE reservation system. At the present time, the nation’s electronically switched telephone system is the world’s largest computer network.

However, all three are highly specialized and single-purpose systems, in contrast to the planned ARPA system which will link a wide assortment of different computers for a wide range of unclassified research functions.

“As of now, computer networks are still in their infancy,” says Dr. Kleinrock. “But as they grow up and become more sophisticated, we will probably see the spread of ‘computer utilities’, which, like present electronic and telephone utilities, will service individual homes and offices across the country.”

source
Boelter Hall, UCLA

source

Written by (Roughly) Daily

July 3, 2022 at 1:00 am

“The test of a first-rate intelligence is the ability to hold two opposed ideas in the mind at the same time, and still retain the ability to function”*…

The Long Tail

On the one hand: Ted Gioia suggests that, while ‘The Long Tail’ was supposed to boost alternative voices in music, movies, and books, the exact opposite has happened…

When I first heard people predict the rise of the Long Tail, I was amused. Not only did it seem wrong-headed, but it ran counter to everything I saw happening around me.

It pains me to say this—because the Long Tail was sold to us as an economic law that not only predicted a more inclusive era of prosperity, but would especially help creative people. According to its proponents, the Long Tail would revitalize our culture by expanding the scope of the arts and giving a boost to visionaries on the fringes of society.

Alternative voices would be nurtured and flourish. Music would get cooler and more surprising. Books would become more diverse and interesting. Indie films would reach larger audiences. Etc. etc. etc.

Hey, what’s not to like?

But it never happened. More to the point, it was never going to happen because the story was a fairy tale. I knew it back then because I had been hired on a number of occasions to analyze the Long Tail myself. But the flaws in the reasoning are far more obvious today, even to me.

Nonetheless many believed it—and many still do. So it’s worth digging into the story of the Long Tail, and examining exactly why it never delivered its promise.

And maybe we can find some alternative pathway to that lost cultural renaissance by seeing how this one went off the rails.
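Before turning to the other hand: Gioia’s complaint is ultimately about the shape of a power-law distribution, and a minimal, purely illustrative sketch in Python makes the geometry plain (the catalogue size and the Zipf-style 1/rank sales curve are assumptions for illustration, not his data):

# Toy long-tail model: 100,000 titles whose sales follow a Zipf-style
# curve, i.e. the title at rank r sells in proportion to 1/r.
N = 100_000
sales = [1 / r for r in range(1, N + 1)]
total = sum(sales)

head_share = sum(sales[: N // 100]) / total   # top 1% of titles
tail_share = sum(sales[N // 10 :]) / total    # bottom 90% of titles

print(f"top 1% of titles: {head_share:.0%} of all sales")      # ~62%
print(f"bottom 90% of titles: {tail_share:.0%} of all sales")  # ~19%

With these toy numbers the head keeps most of the money while the bottom 90% of titles split less than a fifth of it; steepen the exponent even slightly and the tail’s share collapses further. That is the “quiet doldrums of minuscule sales” Kelly describes below.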

On the other hand: Cal Newport suggests that Kevin Kelly’s fourteen-year-old prediction that an artist could make a living online with a thousand true fans is (finally) coming true…

In his “1,000 True Fans” essay, Kelly explains that he wasn’t as excited about this new economic model as others seemed to be. “The long tail is famously good news for two classes of people: a few lucky aggregators, such as Amazon and Netflix, and 6 billion consumers,” he writes. “But the long tail is a decidedly mixed blessing for creators.” If your work lives in the long tail, the introduction of Internet-based markets might mean that you go from selling zero units of your creations to selling a handful of units a month, but this makes little difference to your livelihood. “The long tail offers no path out of the quiet doldrums of minuscule sales,” Kelly writes. “Other than aim for a blockbuster hit, what can an artist do to escape the long tail?”

This question might seem fatalistic, but Kelly had a solution. If your creative work exists in the long tail, generating a small but consistent number of sales, then it’s probably sufficiently good to support a small but serious fan base, assuming you’re willing to put in the work required to cultivate this community. In an earlier age, a creative professional might be limited to fans who lived nearby. But by using the tools of the Internet, Kelly argued, it was now possible for creative types to both find and interact with supporters all around the world…

A shining example of the 1,000 True Fans model is the podcasting boom. There are more than eight hundred and fifty thousand active podcasts available right now. Although most of these shows are small and don’t generate much money, the number of people making a full-time living off original audio content is substantial. The key to a financially viable podcast is to cultivate a group of True Fans eager to listen to every episode. The value of each such fan, willing to stream hours and hours of a creator’s content, is surprisingly large; if sufficiently committed, even a modest-sized audience can generate significant income for a creator. According to an advertising agency I consulted, for example, a weekly podcast that generates thirty thousand downloads per episode should be able to reach Kelly’s target of generating a hundred thousand dollars a year in income. Earning a middle-class salary by talking through a digital microphone to a fiercely loyal band of supporters around the world, who are connected by the magic of the Internet, is about as pure a distillation of Kelly’s vision as you’re likely to find…

The real breakthroughs that enabled the revival of the 1,000 True Fans model are better understood as cultural. The rise in both online news paywalls and subscription video-streaming services trained users to be more comfortable paying à la carte for content. When you already shell out regular subscription fees for newyorker.com, Netflix, Peacock, and Disney+, why not also pay for “Breaking Points,” or throw a monthly donation toward Maria Popova? In 2008, when Kelly published the original “1,000 True Fans” essay, it was widely assumed that it would be hard to ever persuade people to pay money for most digital content. (This likely explains why so many of Kelly’s examples focus on selling tangible goods, such as DVDs or custom prints.) This is no longer true. Opening up these marketplaces to purely digital artifacts—text, audio, video, online classes—significantly lowered the barriers to entry for creative professionals looking to make a living online…

But can this last? Is it destined to fall prey to the forces that Gioia catalogues?

The recent history of the Internet, however, warns that we shouldn’t necessarily expect the endearingly homegrown nature of these 1,000 True Fans communities to persist. When viable new economic niches emerge online, venture-backed businesses, looking to extract their cut, are typically not far behind. Services such as Patreon and Kickstarter are jostling for a dominant position in this direct-to-consumer creative marketplace. A prominent recent example of such attempts to centralize the True Fan economy is Substack, which eliminates friction for writers who want to launch paid e-mail newsletters. Substack now has more than a million subscribers who pay for access to newsletters, and is currently valued at around six hundred and fifty million dollars. With this type of money at stake, it’s easy to imagine a future in which a small number of similarly optimized platforms dominate most of the mechanisms by which creative professionals interact with their 1,000 True Fans. In the optimistic scenario, this competition will lead to continued streamlining of the process of serving supporters, increasing the number of people who are able to make a good living off of their creative work: an apotheosis of sorts of Kelly’s original vision. A more pessimistic prediction is that the current True Fan revolution will eventually go the way of the original Web 2.0 revolution, with creators increasingly ground in the gears of monetization. The Substack of today makes it easy for a writer to charge fans for a newsletter. The Substack of tomorrow might move toward a flat-fee subscription model, driving users toward an algorithmically optimized collection of newsletter content, concentrating rewards within a small number of hyper-popular producers, and in turn eliminating the ability for any number of niche writers to make a living…
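As an aside, the podcast arithmetic Newport cites is easy to back out; a hedged sketch in Python (the implied per-download rate is our derivation, not a figure quoted in the piece):

# Back out the effective ad rate implied by Newport's example:
# a weekly show with 30,000 downloads per episode earning $100,000/year.
downloads_per_episode = 30_000
episodes_per_year = 52            # assumes a weekly show, no weeks off
income_target = 100_000           # dollars per year

yearly_downloads = downloads_per_episode * episodes_per_year
per_thousand = income_target / (yearly_downloads / 1000)

print(f"{yearly_downloads:,} downloads per year")      # 1,560,000
print(f"${per_thousand:.0f} per 1,000 downloads")      # ~$64

Roughly $64 per thousand downloads is plausible for a show carrying a few ad reads per episode at commonly cited podcast CPMs, which is why even a “modest-sized” but loyal audience can clear Kelly’s bar.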

The future of the creative economy: “Where Did the Long Tail Go?,” from @tedgioia and “The Rise of the Internet’s Creative Middle Class,” from Cal Newport on @kevin2kelly in @NewYorker.

* F. Scott Fitzgerald (“The Crack-Up,” Esquire, February, 1936)

###

As we contemplate culture and commerce, we might recall that it was on this date in 1894 (after 30 states had already enshrined the occasion) that Labor Day became a federal holiday in the United States.

The country’s first Labor Day parade in New York City on Sept. 5, 1882. This sketch appeared in Frank Leslie’s Illustrated Newspaper.

source (and source of more on the history of Labor Day)

“A nothing will serve just as well as a something about which nothing could be said”*…

Metaphysical debates in quantum physics don’t get at “truth,” physicist and mathematician Timothy Andersen argues; they’re nothing but a form of ritual activity and culture. After a thoughtful intellectual history of both quantum mechanics and Wittgenstein’s thought, he concludes…

If Wittgenstein were alive today, he might have couched his arguments in the vocabulary of cultural anthropology. For this shared grammar and these language games, in his view, form part of much larger ritualistic mechanisms that connect human activity with human knowledge, as deeply as DNA connects to human biology. It is also a perfect example of how evolution works by using pre-existing mechanisms to generate new behaviors.

The conclusion from all of this is that interpretation and representation in language and mathematics are little different than the supernatural explanations of ancient religions. Trying to resolve the debate between Bohr and Einstein is like trying to answer the Zen kōan about whether the tree falling in the forest makes a sound if no one can hear it. One cannot say definitely yes or no, because all human language must connect to human activity. And all human language and activity are ritual, signifying meaning by their interconnectedness. To ask what the wavefunction means without specifying an activity – and experiment – to extract that meaning is, therefore, as sensible as asking about the sound of the falling tree. It is nonsense.

As a scientist and mathematician, Wittgenstein has challenged my own tendency to seek out interpretations of phenomena that have no scientific value – and to see such explanations as nothing more than narratives. He taught that all that philosophy can do is remind us of what is evidently true. It’s evidently true that the wavefunction has a multiverse interpretation, but one must assume the multiverse first, since it cannot be measured. So the interpretation is a tautology, not a discovery.

I have humbled myself to the fact that we can’t justify clinging to one interpretation of reality over another. In place of my early enthusiastic Platonism, I have come to think of the world not as one filled with sharply defined truths, but rather as a place containing myriad possibilities – each of which, like the possibilities within the wavefunction itself, can be simultaneously true. Likewise, mathematics and its surrounding language don’t represent reality so much as serve as a trusty tool for helping people to navigate the world. They are of human origin and for human purposes.

To shut up and calculate, then, recognizes that there are limits to our pathways for understanding. Our only option as scientists is to look, predict and test. This might not be as glamorous an offering as the interpretations we can construct in our minds, but it is the royal road to real knowledge…

A provocative proposition: “Quantum Wittgenstein,” from @timcopia in @aeonmag.

* Ludwig Wittgenstein, Philosophical Investigations

###

As we muse on meaning, we might recall that it was on this date in 1954 that the official ground-breaking for CERN (Conseil européen pour la recherche nucléaire) was held. Located in Switzerland, it is the largest particle physics laboratory in the world… that’s to say, a prime spot to do the observation and calculation that Andersen suggests. Indeed, it’s been the site of many breakthrough discoveries over the years, maybe most notably the 2012 observation of the Higgs boson.

Because researchers need remote access to these facilities, the lab has historically been a major wide area network hub. Indeed, it was at CERN that Tim Berners-Lee developed the first “browser,” and effectively fomented the emergence of the web.

CERN’s main site, from Switzerland looking towards France