(Roughly) Daily

Posts Tagged ‘web’

“I get slightly obsessive about working in archives because you don’t know what you’re going to find. In fact, you don’t know what you’re looking for until you find it.”*…

An update on that remarkable treasure, The Internet Archive

Within the walls of a beautiful former church in San Francisco’s Richmond district [the facade of which is pictured above], racks of computer servers hum and blink with activity. They contain the internet. Well, a very large amount of it.

The Internet Archive, a non-profit, has been collecting web pages since 1996 for its famed and beloved Wayback Machine. In 1997, the collection amounted to 2 terabytes of data. Colossal back then, it would fit on a $50 thumb drive now.

Today, the archive’s founder Brewster Kahle tells me, the project is on the brink of surpassing 100 petabytes – approximately 50,000 times larger than in 1997. It contains more than 700bn web pages.

The work isn’t getting any easier. Websites today are highly dynamic, changing with every refresh. Walled gardens like Facebook are a source of great frustration to Kahle, who worries that much of the political activity that has taken place on the platform could be lost to history if not properly captured. In the name of privacy and security, Facebook (and others) make scraping difficult. News organisations’ paywalls (such as the FT’s) are also “problematic”, Kahle says. News archiving used to be taken extremely seriously, but changes in ownership or even just a site redesign can mean disappearing content. The technology journalist Kara Swisher recently lamented that some of her early work at The Wall Street Journal has “gone poof”, after the paper declined to sell the material to her several years ago…

A quarter of a century after it began collecting web pages, the Internet Archive is adapting to new challenges: “The ever-expanding job of preserving the internet’s backpages” (gift article) from @DaveLeeFT in the @FinancialTimes.

* Antony Beevor

###

As we celebrate collection, we might recall that it was on this date in 2001 that the Polaroid Corporation– best known for its instant film and cameras– filed for bankruptcy. Its employment had peaked in 1978 at 21,000; its revenues, in 1991, at $3 billion.

Polaroid 80B Highlander instant camera made in the USA, circa 1959

source

Written by (Roughly) Daily

October 11, 2022 at 1:00 am

“History is who we are and why we are the way we are”*…

What a long, strange trip it’s been…

March 12, 1989 – Information Management: A Proposal

While working at CERN, Tim Berners-Lee first comes up with the idea for the World Wide Web. To pitch it, he submits a proposal for organizing scientific documents to his employers titled “Information Management: A Proposal.” In this proposal, Berners-Lee sketches out what the web will become, including early versions of the HTTP protocol and HTML.

The first entry in a timeline that serves as a table of contents for a series of informative blog posts: “The History of the Web,” from @jay_hoffmann.

* David McCullough

###

As we jack in, we might recall that it was on this date in 1969 that the world first learned of what would become the internet, which would, in turn, become the backbone of the web, when UCLA issued a press release announcing the project:

UCLA will become the first station in a nationwide computer network which, for the first time, will link together computers of different makes and using different machine languages into one time-sharing system.

Creation of the network represents a major forward step in computer technology and may serve as the forerunner of large computer networks of the future.

The ambitious project is supported by the Defense Department’s Advanced Research Project Agency (ARPA), which has pioneered many advances in computer research, technology and applications during the past decade. The network project was proposed and is headed by ARPA’s Dr. Lawrence G. Roberts.

The system will, in effect, pool the computer power, programs and specialized know-how of about 15 computer research centers, stretching from UCLA to M.I.T. Other California network stations (or nodes) will be located at the Rand Corp. and System Development Corp., both of Santa Monica; the Santa Barbara and Berkeley campuses of the University of California; Stanford University and the Stanford Research Institute.

The first stage of the network will go into operation this fall as a subnet joining UCLA, Stanford Research Institute, UC Santa Barbara, and the University of Utah. The entire network is expected to be operational in late 1970.

Engineering professor Leonard Kleinrock [see here], who heads the UCLA project, describes how the network might handle a sample problem:

Programmers at Computer A have a blurred photo which they want to bring into focus. Their program transmits the photo to Computer B, which specializes in computer graphics, and instructs B’s program to remove the blur and enhance the contrast. If B requires specialized computational assistance, it may call on Computer C for help.

The processed work is shuttled back and forth until B is satisfied with the photo, and then sends it back to Computer A. The messages, ranging across the country, can flash between computers in a matter of seconds, Dr. Kleinrock says.

UCLA’s part of the project will involve about 20 people, including some 15 graduate students. The group will play a key role as the official network measurement center, analyzing computer interaction and network behavior, comparing performance against anticipated results, and keeping a continuous check on the network’s effectiveness. For this job, UCLA will use a highly specialized computer, the Sigma 7, developed by Scientific Data Systems of Los Angeles.

Each computer in the network will be equipped with its own interface message processor (IMP) which will double as a sort of translator among the Babel of computer languages and as a message handler and router.

Computer networks are not an entirely new concept, notes Dr. Kleinrock. The SAGE radar defense system of the Fifties was one of the first, followed by the airlines’ SABRE reservation system. At the present time, the nation’s electronically switched telephone system is the world’s largest computer network.

However, all three are highly specialized and single-purpose systems, in contrast to the planned ARPA system which will link a wide assortment of different computers for a wide range of unclassified research functions.

“As of now, computer networks are still in their infancy,” says Dr. Kleinrock. “But as they grow up and become more sophisticated, we will probably see the spread of ‘computer utilities’, which, like present electronic and telephone utilities, will service individual homes and offices across the country.”

source
Boelter Hall, UCLA

source

Written by (Roughly) Daily

July 3, 2022 at 1:00 am

“The test of a first-rate intelligence is the ability to hold two opposed ideas in the mind at the same time, and still retain the ability to function”*…

The Long Tail

On the one hand: Ted Gioia suggests that, while ‘The Long Tail’ was supposed to boost alternative voices in music, movies, and books, the exact opposite has happened…

When I first heard people predict the rise of the Long Tail, I was amused. Not only did it seem wrong-headed, but it ran counter to everything I saw happening around me.

It pains me to say this—because the Long Tail was sold to us as an economic law that not only predicted a more inclusive era of prosperity, but would especially help creative people. According to its proponents, the Long Tail would revitalize our culture by expanding the scope of the arts and giving a boost to visionaries on the fringes of society.

Alternative voices would be nurtured and flourish. Music would get cooler and more surprising. Books would become more diverse and interesting. Indie films would reach larger audiences. Etc. etc. etc.

Hey, what’s not to like?

But it never happened. More to the point, it was never going to happen because the story was a fairy tale. I knew it back then because I had been hired on a number of occasions to analyze the Long Tail myself. But the flaws in the reasoning are far more obvious today, even to me.

Nonetheless many believed it—and many still do. So it’s worth digging into the story of the Long Tail, and examining exactly why it never delivered its promise.

And maybe we can find some alternative pathway to that lost cultural renaissance by seeing how this one went off the rails.

On the other hand: Cal Newport suggests that Kevin Kelly’s fourteen-year-old prediction that an artist could make a living online with a thousand true fans is (finally) coming true…

In his “1,000 True Fans” essay, Kelly explains that he wasn’t as excited about this new economic model as others seemed to be. “The long tail is famously good news for two classes of people: a few lucky aggregators, such as Amazon and Netflix, and 6 billion consumers,” he writes. “But the long tail is a decidedly mixed blessing for creators.” If your work lives in the long tail, the introduction of Internet-based markets might mean that you go from selling zero units of your creations to selling a handful of units a month, but this makes little difference to your livelihood. “The long tail offers no path out of the quiet doldrums of minuscule sales,” Kelly writes. “Other than aim for a blockbuster hit, what can an artist do to escape the long tail?”

This question might seem fatalistic, but Kelly had a solution. If your creative work exists in the long tail, generating a small but consistent number of sales, then it’s probably sufficiently good to support a small but serious fan base, assuming you’re willing to put in the work required to cultivate this community. In an earlier age, a creative professional might be limited to fans who lived nearby. But by using the tools of the Internet, Kelly argued, it was now possible for creative types to both find and interact with supporters all around the world…

A shining example of the 1,000 True Fans model is the podcasting boom. There are more than eight hundred and fifty thousand active podcasts available right now. Although most of these shows are small and don’t generate much money, the number of people making a full-time living off original audio content is substantial. The key to a financially viable podcast is to cultivate a group of True Fans eager to listen to every episode. The value of each such fan, willing to stream hours and hours of a creator’s content, is surprisingly large; if sufficiently committed, even a modest-sized audience can generate significant income for a creator. According to an advertising agency I consulted, for example, a weekly podcast that generates thirty thousand downloads per episode should be able to reach Kelly’s target of generating a hundred thousand dollars a year in income. Earning a middle-class salary by talking through a digital microphone to a fiercely loyal band of supporters around the world, who are connected by the magic of the Internet, is about as pure a distillation of Kelly’s vision as you’re likely to find…
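(A quick back-of-envelope check of that figure, for the arithmetically inclined: podcast advertising is typically sold on a CPM basis, i.e. dollars per thousand downloads of an ad spot. The sketch below is purely illustrative; the ad-slot count and CPM rate are assumptions of mine, not numbers from Newport’s piece.)

```python
# Back-of-envelope check of the podcast figure quoted above.
# Every input here is an illustrative assumption, not a number from the article.

def annual_podcast_ad_revenue(downloads_per_episode: int,
                              episodes_per_year: int = 52,
                              ads_per_episode: int = 2,
                              cpm_dollars: float = 30.0) -> float:
    """Estimate yearly ad revenue for a show sold on a CPM basis.

    CPM = dollars earned per 1,000 downloads of a single ad spot.
    """
    ad_impressions = downloads_per_episode * ads_per_episode * episodes_per_year
    return ad_impressions / 1000 * cpm_dollars

if __name__ == "__main__":
    # A weekly show with ~30,000 downloads per episode, as in the passage above.
    estimate = annual_podcast_ad_revenue(30_000)
    print(f"Estimated annual ad revenue: ${estimate:,.0f}")  # roughly $94,000
```

With those assumed rates, a weekly show drawing thirty thousand downloads per episode lands in the neighborhood of the hundred-thousand-dollar figure Newport cites; real CPMs and ad loads vary widely.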

The real breakthroughs that enabled the revival of the 1,000 True Fans model are better understood as cultural. The rise in both online news paywalls and subscription video-streaming services trained users to be more comfortable paying à la carte for content. When you already shell out regular subscription fees for newyorker.com, Netflix, Peacock, and Disney+, why not also pay for “Breaking Points,” or throw a monthly donation toward Maria Popova? In 2008, when Kelly published the original “1,000 True Fans” essay, it was widely assumed that it would be hard to ever persuade people to pay money for most digital content. (This likely explains why so many of Kelly’s examples focus on selling tangible goods, such as DVDs or custom prints.) This is no longer true. Opening up these marketplaces to purely digital artifacts—text, audio, video, online classes—significantly lowered the barriers to entry for creative professionals looking to make a living online…

But can this last? Is it destined to fall prey to the forces that Gioia catalogues?

The recent history of the Internet, however, warns that we shouldn’t necessarily expect the endearingly homegrown nature of these 1,000 True Fans communities to persist. When viable new economic niches emerge online, venture-backed businesses, looking to extract their cut, are typically not far behind. Services such as Patreon and Kickstarter are jostling for a dominant position in this direct-to-consumer creative marketplace. A prominent recent example of such attempts to centralize the True Fan economy is Substack, which eliminates friction for writers who want to launch paid e-mail newsletters. Substack now has more than a million subscribers who pay for access to newsletters, and is currently valued at around six hundred and fifty million dollars. With this type of money at stake, it’s easy to imagine a future in which a small number of similarly optimized platforms dominate most of the mechanisms by which creative professionals interact with their 1,000 True Fans. In the optimistic scenario, this competition will lead to continued streamlining of the process of serving supporters, increasing the number of people who are able to make a good living off of their creative work: an apotheosis of sorts of Kelly’s original vision. A more pessimistic prediction is that the current True Fan revolution will eventually go the way of the original Web 2.0 revolution, with creators increasingly ground in the gears of monetization. The Substack of today makes it easy for a writer to charge fans for a newsletter. The Substack of tomorrow might move toward a flat-fee subscription model, driving users toward an algorithmically optimized collection of newsletter content, concentrating rewards within a small number of hyper-popular producers, and in turn eliminating the ability for any number of niche writers to make a living…

The future of the creative economy: “Where Did the Long Tail Go?,” from @tedgioia and “The Rise of the Internet’s Creative Middle Class,” from Cal Newport on @kevin2kelly in @NewYorker.

* F. Scott Fitzgerald (“The Crack-Up,” Esquire, February, 1936)

###

As we contemplate culture and commerce, we might recall that it was on this date in 1894 (after 30 states had already enshrined the occasion) that Labor Day became a federal holiday in the United States.

The country’s first Labor Day parade in New York City on Sept. 5, 1882. This sketch appeared in Frank Leslie’s Illustrated Newspaper.

source (and source of more on the history of Labor Day)

“A nothing will serve just as well as a something about which nothing could be said”*…

Metaphysical debates in quantum physics don’t get at “truth,” physicist and mathematician Timothy Andersen argues; they’re nothing but a form of ritual activity and culture. After a thoughtful intellectual history of both quantum mechanics and Wittgenstein’s thought, he concludes…

If Wittgenstein were alive today, he might have couched his arguments in the vocabulary of cultural anthropology. For this shared grammar and these language games, in his view, form part of much larger ritualistic mechanisms that connect human activity with human knowledge, as deeply as DNA connects to human biology. It is also a perfect example of how evolution works by using pre-existing mechanisms to generate new behaviors.

The conclusion from all of this is that interpretation and representation in language and mathematics are little different than the supernatural explanations of ancient religions. Trying to resolve the debate between Bohr and Einstein is like trying to answer the Zen kōan about whether the tree falling in the forest makes a sound if no one can hear it. One cannot say definitely yes or no, because all human language must connect to human activity. And all human language and activity are ritual, signifying meaning by their interconnectedness. To ask what the wavefunction means without specifying an activity – and experiment – to extract that meaning is, therefore, as sensible as asking about the sound of the falling tree. It is nonsense.

As a scientist and mathematician, Wittgenstein has challenged my own tendency to seek out interpretations of phenomena that have no scientific value – and to see such explanations as nothing more than narratives. He taught that all that philosophy can do is remind us of what is evidently true. It’s evidently true that the wavefunction has a multiverse interpretation, but one must assume the multiverse first, since it cannot be measured. So the interpretation is a tautology, not a discovery.

I have humbled myself to the fact that we can’t justify clinging to one interpretation of reality over another. In place of my early enthusiastic Platonism, I have come to think of the world not as one filled with sharply defined truths, but rather as a place containing myriad possibilities – each of which, like the possibilities within the wavefunction itself, can be simultaneously true. Likewise, mathematics and its surrounding language don’t represent reality so much as serve as a trusty tool for helping people to navigate the world. They are of human origin and for human purposes.

To shut up and calculate, then, recognizes that there are limits to our pathways for understanding. Our only option as scientists is to look, predict and test. This might not be as glamorous an offering as the interpretations we can construct in our minds, but it is the royal road to real knowledge…

A provocative proposition: “Quantum Wittgenstein,” from @timcopia in @aeonmag.

* Ludwig Wittgenstein, Philosophical Investigations

###

As we muse on meaning, we might recall that it was on this date in 1954 that the official ground-breaking for CERN (Conseil européen pour la recherche nucléaire) was held. Located in Switzerland, it is the largest particle physics laboratory in the world… that’s to say, a prime spot to do the observation and calculation that Andersen suggests. Indeed, it’s been the site of many breakthrough discoveries over the years, maybe most notably the 2012 observation of the Higgs Boson.

Because researchers need remote access to these facilities, the lab has historically been a major wide area network hub. Indeed, it was at CERN that Tim Berners-Lee developed the first “browser”– and effectively fomented the emergence of the web.

CERN’s main site, from Switzerland looking towards France

“In our world of big names, curiously, our true heroes tend to be anonymous”*…

Pattie Maes was inventing the core principles behind the social media age when Mark Zuckerberg was still in kindergarten, but her contributions have been largely unrecognized. Steven Johnson explains…

Anyone who was around for the early days of the World Wide Web, before the Netscape IPO and the dotcom boom, knows that there was a strange quality to the medium back then – in many ways the exact opposite of the way the Web works today. It was oddly devoid of people. Tim Berners-Lee had conjured up a radically new way of organizing information through the core innovations of hypertext and URLs, which created a standardized way of pointing to the location of documents. But almost every Web page you found yourself on back in those frontier days was frozen in the form that its author had originally intended. The words on the screen couldn’t adapt to your presence and your interests as you browsed. Interacting with other humans and having conversations – all that was still what you did with email or USENET or dial-up bulletin boards like The Well. The original Web was more like a magic library, filled with pages that could connect to other pages through miraculous wormholes of links. But the pages themselves were fixed, and everyone browsed alone.

One of the first signs that the Web might eventually escape those confines arrived in the last months of 1994, with the release of an intriguing (albeit bare-bones) prototype called HOMR, short for the Helpful Online Music Recommendation service.

HOMR was one of a number of related projects that emerged in the early-to-mid-90s out of the MIT lab of the Belgian-born computer scientist Pattie Maes, projects that eventually culminated in a company that Maes co-founded, called Firefly. HOMR pulled off a trick that was genuinely unprecedented at the time: it could make surprisingly sophisticated recommendations of music that you might like. It seemed to be capable of learning something about you as an individual. Unlike just about everything else on the Web back then, HOMR’s pages were not one-size-fits-all. They suggested, perhaps for the first time, that this medium was capable of conveying personalized information. Firefly would then take that advance to the next level: not just recommending music, but actually connecting you to other people who shared your tastes.

Maes called the underlying approach “collaborative filtering”, but looking back on it with more than two decades’ worth of hindsight, it’s clear that what we were experiencing with HOMR and Firefly was the very beginnings of a new kind of software platform that would change the world in the coming decades, for better and for worse: social networks…
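(For the curious: “collaborative filtering,” the approach Maes named, scores an item for one user by leaning on the ratings of users with similar tastes. The sketch below is a minimal, modern illustration of that general idea, with invented ratings; it is not HOMR’s or Firefly’s actual algorithm.)

```python
# Minimal user-based collaborative filtering sketch (illustrative only;
# not HOMR's or Firefly's actual implementation). The ratings are invented.
from math import sqrt

ratings = {
    "alice": {"Blue Train": 5, "Kind of Blue": 4, "OK Computer": 1},
    "bob":   {"Blue Train": 4, "Kind of Blue": 5, "In Rainbows": 2},
    "carol": {"OK Computer": 5, "In Rainbows": 4, "Kind of Blue": 1},
}

def similarity(a: dict, b: dict) -> float:
    """Cosine similarity between two users, computed over the items both rated."""
    shared = set(a) & set(b)
    if not shared:
        return 0.0
    dot = sum(a[i] * b[i] for i in shared)
    norm_a = sqrt(sum(a[i] ** 2 for i in shared))
    norm_b = sqrt(sum(b[i] ** 2 for i in shared))
    return dot / (norm_a * norm_b)

def predict(user: str, item: str) -> float:
    """Predict a rating as a similarity-weighted average of other users' ratings."""
    weighted_sum = total_weight = 0.0
    for other, their_ratings in ratings.items():
        if other == user or item not in their_ratings:
            continue
        w = similarity(ratings[user], their_ratings)
        weighted_sum += w * their_ratings[item]
        total_weight += w
    return weighted_sum / total_weight if total_weight else 0.0

print(round(predict("alice", "In Rainbows"), 2))  # leans on bob's and carol's tastes
```

Run against the toy data, it estimates a rating for an album the first listener hasn’t heard by weighting the other listeners’ ratings by how closely their tastes track hers.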

Read on at “Intelligent Agent: How Pattie Maes almost invented social media,” from @stevenbjohnson, the first in a new series, “Hidden Heroes.”

* Daniel J. Boorstin

###

As we give credit where credit is due, we might recall that it was on this date in 1960 that the FDA approved the first birth control pill– an oral medication for use by women as a contraceptive. In 1953, birth control crusader Margaret Sanger and her supporter and financier, philanthropist Katharine Dexter McCormick, had given Dr. Gregory Pincus $150,000 to continue his prior research and develop a safe and effective oral contraceptive for women.

In just five years, almost half of married women on birth control were using it.

But the real revolution would come when unmarried women got access to oral contraceptives. That took time. But in around 1970 – 10 years after the pill was first approved – US state after US state started to make it easier for single women to get the pill…

And that was when the economic revolution really began.

Women in America started studying particular kinds of degrees – law, medicine, dentistry and MBAs – which had previously been very masculine.

In 1970, medical degrees were over 90% male. Law degrees and MBAs were over 95% male. Dentistry degrees were 99% male. But at the beginning of the 1970s – equipped with the pill – women surged into all these courses. At first, women made up a fifth of the class, then a quarter. By 1980 they often made up a third…

“The tiny pill which gave birth to an economic revolution,” by Tim Harford, in the BBC’s series 50 Things That Made the Modern Economy

source
