Posts Tagged ‘web’
“You must have chaos within you to give birth to a dancing star”*…
Last May, Pulitzer Prize-winning journalist and author John Markoff was asked to write an op-ed for The Wall Street Journal on the heels of the murder in San Francisco of tech exec Bob Lee. The paper rejected his piece, leaving Markoff to “suspect that they were looking for more of a ‘drugs, sex and rock & roll’ analysis, which isn’t what they got. My 2005 book What the Dormouse Said is occasionally cited by people making the argument that there is some kind of causal relationship between psychedelic drugs and creativity. I have never believed that to be the case and I’ve always been more interested in sociological than psychological assessments of psychedelics.”
Happily for us, he has shared it on Medium…
The head-spinning speed with which the murder of software creator Bob Lee went from being a story about rampant crime in San Francisco to a sex and drugs tale of Silicon Valley excess says a great deal about the way the world now perceives the nation’s technology heartland.
Lee, who had gone from being a Google software engineer to become the creator of the mobile finance program Cash App, and who had more recently become the chief product officer of a cryptocurrency company, is now alleged to have been stabbed to death by the brother of a wealthy socialite with whom Lee is thought to have had an affair.
On the surface it would seem to evoke something more out of a Hollywood soap opera than the world’s technology center. But the Valley is more complex than cases like Bob Lee’s, or dark takes on the evils of technology, suggest.
Silicon Valley has always been built around a paradox: the tension between the open-source spirit of a hacker counterculture and the naked capitalist ambitions of Sand Hill Road, where the offices of its venture capitalists are concentrated.
Stewart Brand, who authored the Whole Earth Catalog in Menlo Park, Ca., at the same moment the high-tech region was forming in the 1960s, expressed the paradox most clearly at the first meeting of the Hackers Conference in 1984. Responding to Steve Wozniak, Apple’s cofounder, who was describing the danger of technology companies hoarding information, what the audience heard Brand say was “information wants to be free.” Indeed, a decade later that became the rallying cry of the dot-com era, a period in which technology start-ups thrived on disrupting traditional commerce and railing against regulation.
But that is not what Brand said. He actually stated: “Information sort of wants to be expensive because it’s so valuable; the right information at the right point changes your life. On the other hand information almost wants to be free, because the cost of getting it out is getting lower and lower all the time.”
Brand had been influenced by the social scientist Gregory Bateson, who proposed the idea of “the double bind” to describe situations in which even when you win, you lose. Understanding that paradox, which was lost in translation, might have saved the Valley from some of the excess that has recently taken it into dark territory.
From its inception, the very nature of Silicon Valley lay in its ability to allow diverse cultures to thrive simultaneously. During the 1960s and 1970s, while Silicon Valley was being formed, you could easily drive from Walker’s Wagon Wheel in Mountain View, where crewcut, hard-drinking computer chip designers gathered, to a very different long-haired scene just up the road in Palo Alto and Menlo Park, which surrounded the Stanford Research Institute, the Stanford Artificial Intelligence Laboratory, and Xerox’s Palo Alto Research Center, the three labs that pioneered the technologies that would become the modern personal computer and the Internet.
The paradox is perhaps best expressed in the formation of Apple Computer — a company that grew out of the separate interests of its two founders. One, Steve Wozniak, was simply interested in building a computer to share with his friends at the Homebrew Computer Club, a hobbyist group founded by a convicted draft resister and a software engineer that would ultimately birth several dozen start-up PC hardware and software companies, including Apple. Wozniak would combine his hacker’s instincts for sharing with Steve Jobs, who had the insight to realize that there would be a market for these machines…
…
… Silicon Valley engineers believed they were just one good idea away from becoming the next Jobs or Wozniak.
That deeply entrenched culture of risk-taking — and frequent failure — originally exemplified by the Gold Rush, today remains an integral part of the California, and by extension Silicon Valley, Dream.
In recent weeks, much has been made of Lee’s partying lifestyle, which included claims of recreational drug use and attendance at the Burning Man Festival in the Nevada desert, which began on a San Francisco beach and is based on various anti-capitalist principles such as gifting, decommodification, and radical inclusion. The festival, which grew out of the counterculture, has come to embrace a very different technology culture, one whose attendees, including Google founders Sergey Brin and Larry Page, former Google CEO Eric Schmidt, Elon Musk, Peter Thiel, and Mark Zuckerberg, often arrive by corporate jet. It has gained a reputation for surpassing the confines of a traditional California scene by blending technology, art, drugs, and rock & roll.
Experimentation with psychedelic drugs has been a continuous theme for a subculture in Silicon Valley, going back to the 1960s, when a group that included engineers from Ampex and Stanford created a research project to explore the relationship between LSD and creativity.
Yet despite this fascination, originally with psychedelics and more recently with the idea of “microdosing” small amounts of LSD, the science has never been clear…
An alternative proposed by a group of social scientists at the Santa Fe Institute likely offers a more cogent explanation. Creativity, they argued, takes place at the edge of chaos. And that certainly describes early Silicon Valley, which emerged in the midst of a tumultuous time on the San Francisco mid-peninsula during the Sixties…
Eminently worth reading in full.
* Friedrich Nietzsche
###
As we cultivate creative contradictions, we might recall that it was on this date in 1978 that Ward Christensen and Randy Suess launched the first public dialup computer bulletin board system, or BBS– the foundation of what would eventually become the world wide web, countless online messaging systems, and, arguably, Twitter.
It was several decades before the hardware or the network caught up to Christensen and Suess’ imaginations, but all the basic seeds of today’s online communities were in place when the two launched the first bulletin board…
“Bulletin Board Goes Electronic”
“Cyberspace undeniably reflects some form of geography”*…
Your correspondent is stepping again into the rapids, so (Roughly) Daily is going on a short hiatus. Regular service should resume on or around Nov 4. Here, something to enjoy in the meantime…
Our old friend Neal Agarwal has created an interactive museum of sorts, a stroll through the history of the internet, as manifest in the artifacts of important “firsts”– the first smiley, the first MP3, the first “LOL,” the first live-streamed concert, and so, so much more…
Browse through Internet Artifacts, from @nealagarwal.
* Sandra Day O’Connor
###
As we touch the exhibits, we might send imperial birthday greetings to William Henry Gates III; he was born on this date in 1955. Gates is, of course, best known for co-founding the technology giant Microsoft, along with his childhood friend Paul Allen. He led the company from its packaged software beginnings onto the internet. After leaving the company in 2008, he founded several other companies, including BEN, Cascade Investment, TerraPower, bgC3, and Breakthrough Energy; but he has increasingly turned his attention to philanthropy.
“Nothing is so painful to the human mind as a great and sudden change”*…
If an AI-infused web is the future, what can we learn from the past? Jeff Jarvis has some provocative thoughts…
The Gutenberg Parenthesis—the theory that inspired my book of the same name—holds that the era of print was a grand exception in the course of history. I ask what lessons we may learn from society’s development of print culture as we leave it for what follows the connected age of networks, data, and intelligent machines—and as we negotiate the fate of such institutions as copyright, the author, and mass media as they are challenged by developments such as generative AI.
Let’s start from the beginning…
In examining the half-millennium of print’s history, three moments in time struck me:
- After Johannes Gutenberg’s development of movable type in the 1450s in Europe (separate from its prior invention in China and Korea), it took a half-century for the book as we now know it to evolve out of its scribal roots—with titles, title pages, and page numbers. It took another century, until this side and that of 1600, before there arose tremendous innovation with print: the invention of the modern novel with Cervantes, the essay with Montaigne, a market for printed plays with Shakespeare, and the newspaper.
- It took another century before a business model for print at last emerged with copyright, which was enacted in Britain in 1710, not to protect authors but instead to transform literary works into tradable assets, primarily for the benefit of the still-developing industry of publishing.
- And it was one more century—after 1800—before major changes came to the technology of print: the steel press, stereotyping (to mold complete pages rather than resetting type with every edition), steam-powered presses, paper made from abundant wood pulp instead of scarce rags, and eventually the marvelous Linotype, eliminating the job of the typesetter. Before the mechanization and industrialization of print, the average circulation of a daily newspaper in America was 4,000 (the size of a healthy Substack newsletter these days). Afterwards, mass media, the mass market, and the idea of the mass were born alongside the advertising to support them.
One lesson in this timeline is that the change we experience today, which we think is moving fast, is likely only the beginning. We are only a quarter century past the introduction of the commercial web browser, which puts us at about 1480 in Gutenberg years. There could be much disruption and invention still ahead. Another lesson is that many of the institutions we assume are immutable—copyright, the concept of creativity as property, mass media and its scale, advertising and the attention economy—are not forever. That is to say that we can reconsider, reinvent, reject, or replace them as need and opportunity present…
Read on for his suggestion for a reinvention of copyright: “Gutenberg’s lessons in the era of AI,” from @jeffjarvis via @azeem in his valuable newsletter @ExponentialView.
* Mary Wollstonecraft Shelley, Frankenstein
###
As we contemplate change, we might spare a thought for Jan Hus. A Czech theologian and philosopher who became a Church reformer, he was burned at the stake as a heretic (for condemning indulgences and the Crusades) on this date in 1415. His teachings (which largely echoed those of Wycliffe) had a strong influence, over a century later, on Martin Luther, helping inspire the Reformation… which was fueled by Gutenberg’s technology, which had been developed and begun to spread in the meantime.

“I get slightly obsessive about working in archives because you don’t know what you’re going to find. In fact, you don’t know what you’re looking for until you find it.”*…
An update on that remarkable treasure, The Internet Archive…
Within the walls of a beautiful former church in San Francisco’s Richmond district, racks of computer servers hum and blink with activity. They contain the internet. Well, a very large amount of it.
The Internet Archive, a non-profit, has been collecting web pages since 1996 for its famed and beloved Wayback Machine. In 1997, the collection amounted to 2 terabytes of data. Colossal back then, it would now fit on a $50 thumb drive.
Today, the archive’s founder Brewster Kahle tells me, the project is on the brink of surpassing 100 petabytes – approximately 50,000 times larger than in 1997. It contains more than 700bn web pages.
The work isn’t getting any easier. Websites today are highly dynamic, changing with every refresh. Walled gardens like Facebook are a source of great frustration to Kahle, who worries that much of the political activity that has taken place on the platform could be lost to history if not properly captured. In the name of privacy and security, Facebook (and others) make scraping difficult. News organisations’ paywalls (such as the FT’s) are also “problematic”, Kahle says. News archiving used to be taken extremely seriously, but changes in ownership or even just a site redesign can mean disappearing content. The technology journalist Kara Swisher recently lamented that some of her early work at The Wall Street Journal has “gone poof”, after the paper declined to sell the material to her several years ago…
A quarter of a century after it began collecting web pages, the Internet Archive is adapting to new challenges: “The ever-expanding job of preserving the internet’s backpages” (gift article) from @DaveLeeFT in the @FinancialTimes.
###
As we celebrate collection, we might recall that it was on this date in 2001 that the Polaroid Corporation– best known for its instant film and cameras– filed for bankruptcy. Its employment had peaked in 1978 at 21,000; its revenues, in 1991, at $3 billion.
“History is who we are and why we are the way we are”*…
What a long, strange trip it’s been…
March 12, 1989: “Information Management: A Proposal”
While working at CERN, Tim Berners-Lee first comes up with the idea for the World Wide Web. To pitch it, he submits to his employers a proposal for organizing scientific documents, titled “Information Management: A Proposal.” In this proposal, Berners-Lee sketches out what the web will become, including early versions of the HTTP protocol and HTML.
…
The first entry in a timeline that serves as a table of contents for a series of informative blog posts: “The History of the Web,” from @jay_hoffmann.
* David McCullough
###
As we jack in, we might recall that it was on this date in 1969 that the world first learned of what would become the internet, which would, in turn, become the backbone of the web: UCLA announced that it would become the first station in a nationwide computer network, linking computers of different makes and different machine languages into one time-sharing system. The press release, in full:
UCLA will become the first station in a nationwide computer network which, for the first time, will link together computers of different makes and using different machine languages into one time-sharing system.
Creation of the network represents a major forward step in computer technology and may serve as the forerunner of large computer networks of the future.
The ambitious project is supported by the Defense Department’s Advanced Research Projects Agency (ARPA), which has pioneered many advances in computer research, technology and applications during the past decade. The network project was proposed and is headed by ARPA’s Dr. Lawrence G. Roberts.
The system will, in effect, pool the computer power, programs and specialized know-how of about 15 computer research centers, stretching from UCLA to M.I.T. Other California network stations (or nodes) will be located at the Rand Corp. and System Development Corp., both of Santa Monica; the Santa Barbara and Berkeley campuses of the University of California; Stanford University and the Stanford Research Institute.
The first stage of the network will go into operation this fall as a subnet joining UCLA, Stanford Research Institute, UC Santa Barbara, and the University of Utah. The entire network is expected to be operational in late 1970.
Engineering professor Leonard Kleinrock [see here], who heads the UCLA project, describes how the network might handle a sample problem:
Programmers at Computer A have a blurred photo which they want to bring into focus. Their program transmits the photo to Computer B, which specializes in computer graphics, and instructs B’s program to remove the blur and enhance the contrast. If B requires specialized computational assistance, it may call on Computer C for help.
The processed work is shuttled back and forth until B is satisfied with the photo, and then sends it back to Computer A. The messages, ranging across the country, can flash between computers in a matter of seconds, Dr. Kleinrock says.
UCLA’s part of the project will involve about 20 people, including some 15 graduate students. The group will play a key role as the official network measurement center, analyzing computer interaction and network behavior, comparing performance against anticipated results, and keeping a continuous check on the network’s effectiveness. For this job, UCLA will use a highly specialized computer, the Sigma 7, developed by Scientific Data Systems of Los Angeles.
Each computer in the network will be equipped with its own interface message processor (IMP) which will double as a sort of translator among the Babel of computer languages and as a message handler and router.
Computer networks are not an entirely new concept, notes Dr. Kleinrock. The SAGE radar defense system of the Fifties was one of the first, followed by the airlines’ SABRE reservation system. At the present time, the nation’s electronically switched telephone system is the world’s largest computer network.
However, all three are highly specialized and single-purpose systems, in contrast to the planned ARPA system which will link a wide assortment of different computers for a wide range of unclassified research functions.
“As of now, computer networks are still in their infancy,” says Dr. Kleinrock. “But as they grow up and become more sophisticated, we will probably see the spread of ‘computer utilities’, which, like present electric and telephone utilities, will service individual homes and offices across the country.”
source
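(An aside from your correspondent: Kleinrock’s sample problem reads almost like pseudocode. Here is a minimal sketch, in modern Python, of the message flow he describes; the node functions and the string “photo” payload are hypothetical stand-ins, not a reconstruction of any actual ARPANET program.)

```python
# A toy model of Kleinrock's sample problem: node A originates a job,
# specialist node B processes it (enlisting node C when it needs help),
# and the finished result is shuttled back to A. Everything here is
# hypothetical and illustrative; no real networking is involved.

def node_c(photo: str) -> str:
    """Computer C: specialized computational assistance (deblurring)."""
    return photo.replace("blurred", "sharpened")

def node_b(photo: str) -> str:
    """Computer B: graphics specialist; calls on C if the photo is blurred."""
    if "blurred" in photo:
        photo = node_c(photo)  # shuttle the work to C and back
    return photo + " (contrast enhanced)"

def node_a() -> None:
    """Computer A: transmits the photo to B and receives the result."""
    result = node_b("blurred photo of the lab")
    print(result)  # sharpened photo of the lab (contrast enhanced)

node_a()
```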