(Roughly) Daily

Posts Tagged ‘Internet’

“Reality is not a function of the event as event, but of the relationship of that event to past, and future, events”*…

 


Dr. Leonard Kleinrock poses beside the processor in the UCLA lab where the first ARPANET message was sent

 

The first message transmitted over ARPANET, the pioneering Pentagon-funded data-sharing network, late in the evening on October 29, 1969, was incomplete due to a technical error. UCLA graduate student Charley Kline was testing a “host to host” connection across the nascent network to a machine at SRI in Menlo Park, California, and things seemed to be going well–until SRI’s machine, operated by Bill Duvall, crashed partway through the transmission, meaning the only letters received from the attempted “login” were “lo.”

Kline thought little of the event at the time, but it’s since become the stuff of legend and poetic reinterpretation. “As in, lo and behold!” ARPANET developer and early internet icon Leonard Kleinrock says, grinning as he recounts the story in the 2016 Werner Herzog documentary Lo and Behold: Reveries of the Connected World. Others have interpreted the truncated transmission as “a stuttered hello”; one camp argues it was a prescient “LOL.”

It’s a staple of tech hagiography to inject history’s banal realities with monumental foresight and noble intentions; Mark Zuckerberg demonstrated as much recently, when he claimed Facebook was founded in response to the Iraq War, rather than to rate the attractiveness of Harvard women. It’s understandable to wish that ARPANET’s inaugural message, too, had offered a bit more gravity, given all that the network and its eventual successor, the internet, hath wrought upon the world. But perhaps the most enduring truth of the internet is that so many of its foundational moments and decisive turning points—from Kline’s “lo” to Zuckerberg’s late-night coding sessions producing a service for “dumb fucks” at Harvard—emerged from ad hoc actions and experiments undertaken with little sense of foresight or posterity. In this respect, the inaugural “lo” was entirely apt…

Fifty years after the first successful (or, successful enough) transmission across the ARPANET, we’ve effectively terraformed the planet into a giant computer founded on the ARPANET’s architecture. The messages transmitted across it have certainly become more complex, but the illusion that its ad-hoc infrastructure developed in a political vacuum has become harder and harder to maintain. That illusion has been pierced since 2016, but the myth that seems poised to replace it—that technology can in fact automate away bias and politics itself—is no less insidious.

The vapidity of the first ARPANET message is a reminder of the fallacy of this kind of apolitical, monumental storytelling about technology’s harms and benefits. Few isolated events in the development of the internet were as heroic as we may imagine, or as nefarious as we may fear. But even the most ad hoc of these events occurred in a particular ideological context. What is the result of ignoring or blithely denying that context? Lo and behold: It looks a lot like 2019.

Half a century after the first ARPANET message, pop culture still views connectivity as disconnected from the political worldview that produced it.  The always-illuminating Ingrid Burrington argues that that’s a problem: “How We Misremember the Internet’s Origins.”

“Is everyone who lives in Ignorance like you?” asked Milo.
“Much worse,” he said longingly. “But I don’t live here. I’m from a place very far away called Context.”
Norton Juster, The Phantom Tollbooth

* Robert Penn Warren, All the King’s Men

###

As we ruminate on roots, we might send carefully-coded birthday greetings to Gordon Eubanks; he was born on this date in 1946.  A microcomputer pioneer, he earned his PhD studying under Gary Kildall, who founded Digital Research; his dissertation project was BASIC-E, a compiler designed for Kildall’s CP/M operating system.  In 1981, after DR lost the IBM operating-system contract to Microsoft (per yesterday’s almanac entry), Eubanks joined DR to create new programming languages.  He soon came to doubt DR’s viability, and left to join Symantec, where he helped develop Q&A, an integrated database and word processor with natural-language query.  He rose through Symantec’s ranks to become its President and CEO.  Later he became president and CEO of Oblix, a Silicon Valley startup that created software for web security (acquired by Oracle in 2005).

source

 

Written by LW

November 7, 2019 at 1:01 am

“PLATO was my Alexandria. It was my library, it was the place where I could attach myself to anything.”*…

 


 

Once upon a time [in the 60s and 70s], there was a computer network with thousands of users across the world. It featured chat rooms, message boards, multiplayer games, a blog-like newspaper, and accredited distance learning, all piped to flat-panel plasma screens that were also touchscreens. And it wasn’t the internet.

It was PLATO (Programmed Logic for Automatic Teaching Operations), and its original purpose was to harness the power of the still-obscure world of computing as a teaching tool. Developing PLATO required simultaneous quantum leaps in technological sophistication, and it worked—college and high-school students quickly learned how to use it, and also pushed it to do new things.

Despite decades of use at major universities, it all but vanished in the 1980s and from popular memory in the years that followed, a victim of the microcomputer revolution. At its peak, PLATO was surprisingly similar to the modern internet, and it left its DNA in technology we still use today…

The story of the ur-internet: “PLATO.”

* novelist Richard Powers (who was a coder before he turned to literary fiction)

###

As we log on, we might send super birthday greetings to Seymour Roger Cray; he was born on this date in 1925.  An electrical engineer and computer architect, he designed a series of computers that were the fastest in the world for decades, and founded Cray Research which built many of these machines– effectively creating the “supercomputer” industry and earning the honorific “father of supercomputing.”


With a Cray-1

source

 

Written by LW

September 28, 2019 at 1:01 am

“There’s a compounding and unraveling chaos that is perpetually in motion in the Dark Web’s toxic underbelly”*…

 

Dark Web

 

CIRCL, Luxembourg’s computer security incident response team, has published a dataset of 37,500 .onion website screenshots, a subset of which have been categorized by topic (e.g., “drugs-narcotics”, “extremism”, “finance”) and/or purpose (e.g., “forum”, “file-sharing”, “scam”).
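For readers curious how such a labeled collection might be queried, here is a minimal sketch of filtering and tallying screenshots by topic; note that the record fields and filenames below are hypothetical illustrations, not CIRCL’s actual schema:

```python
from collections import Counter

# Hypothetical records -- the real CIRCL metadata layout may differ.
records = [
    {"file": "abc123.onion.png", "topic": "finance", "purpose": "forum"},
    {"file": "def456.onion.png", "topic": "drugs-narcotics", "purpose": "scam"},
    {"file": "ghi789.onion.png", "topic": None, "purpose": None},  # unlabeled
]

def by_topic(records, topic):
    """Return filenames of screenshots carrying the given topic label."""
    return [r["file"] for r in records if r["topic"] == topic]

# Tally how many screenshots carry each (non-empty) topic label.
topic_counts = Counter(r["topic"] for r in records if r["topic"])

print(by_topic(records, "finance"))  # ['abc123.onion.png']
```

Only a subset of the published screenshots are labeled, which is why the sketch tolerates records with no topic at all.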

Via Jeremy Singer-Vine’s fascinating Data is Plural.

[For more background see “WTF is Dark Web?,” whence the image above]

* James Scott, Senior Fellow, Institute for Critical Infrastructure Technology

###

As we grab our flashlights, we might recall that it was on this date in 1979 that CompuServe launched the first consumer-oriented online information service, which they called MicroNET (and marketed via Radio Shack)– the first time a consumer had access to services such as e-mail.

The service was not initially favored internally at the business-oriented CompuServe, but as it became a hit, the company renamed it CompuServe Information Service, or CIS.  By the mid-1980s CompuServe was the largest consumer information service in the world, and half its revenue came from CIS.

In 1989 CompuServe connected its proprietary e-mail system to the Internet e-mail system, making it one of the first commercial Internet services.  But CompuServe did not compete well with America Online or independent Internet Service Providers in the 1990s and rapidly lost its dominant market position.

source

 

Written by LW

September 24, 2019 at 1:01 am

“One of the things I did not understand, was that these systems can be used to manipulate public opinion in ways that are quite inconsistent with what we think of as democracy”*…

 


Nineteen years ago, in his third annual call for answers to an Annual Question, John Brockman asked members of the Edge community what they believed to be “today’s [2000’s] most important unreported story.” The remarkable Howard Rheingold (@hrheingold) answered in a way that has turned out to be painfully prophetic…

The way we learn to use the Internet in the next few years (or fail to learn) will influence the way our grandchildren govern themselves. Yet only a tiny fraction of the news stories about the impact of the Net focus attention on the ways many to-many communication technology might be changing democracy — and those few stories that are published center on how traditional political parties are using the Web, not on how grassroots movements might be finding a voice…

Every communication technology alters governance and political processes. Candidates and issues are packaged and sold on television by the very same professionals who package and sell other commodities. In the age of mass media, the amount of money a candidate can spend on television advertising is the single most important influence on the electoral success. Now that the Internet has transformed every desktop into a printing press, broadcasting station, and place of assembly, will enough people learn to make use of this potential? Or will our lack of news, information, and understanding of the Net as a political tool prove insufficient against the centralization of capital, power, and knowledge that modern media also make possible?…

The political power afforded to citizens by the Web is not a technology issue. Technology makes a great democratization of publishing, journalism, public discourse possible, but does not determine whether or not that potential will be realized. Every computer connected to the Net can publish a manifesto, broadcast audio and video eyewitness reports of events in real time, host a virtual community where people argue about those manifestos and broadcasts. Will only the cranks, the enthusiasts, the fringe groups take advantage of this communication platform? Or will many-to-many communication skills become a broader literacy, the way knowing and arguing about the issues of the day in print was the literacy necessary for the American revolution?…

The Scylla and Charybdis of which Howard warned– centralization-by-capital/political power and atomization-into-cacophony (whether via the pollution of manipulation/”fake news” or simple tribalism)– is now all too apparent… even if it’s not at all clear how we sail safely between them.  It’s almost 20 years later– but not too late to heed Howard’s call, which you can read in full at “How Will The Internet Influence Democracy?”

* Eric Schmidt, Executive Chairman of Google [as Howard’s 2000 insight dawns on him in 2017, source]

###

As we try harder, we might recall that it was on this date in 1911 that financier and “Father of Trusts” Charles R. Flint incorporated The Computing-Tabulating-Recording Company as a holding company into which he rolled up manufacturers of record-keeping and measuring systems: Bundy Manufacturing Company, International Time Recording Company, The Tabulating Machine Company, and the Computing Scale Company of America.

Four years later Flint hired Thomas J. Watson, Sr. to run the company; nine years after that, in 1924, Watson organized the formerly disparate units into a single operating company, which he named “International Business Machines,” or as we now know it, IBM.

source

 

 

“Outward show is a wonderful perverter of the reason”*…

 


Humans have long hungered for shorthands to help in understanding and managing other humans.  From phrenology to the Myers-Briggs Test, we’ve tried dozens of shortcuts… and tended to find that at best they weren’t actually very helpful; at worst, they reinforced inaccurate stereotypes, leading to results that were unfair and ineffective.  Still, the quest continues– these days powered by artificial intelligence.  What could go wrong?…

Could a program detect potential terrorists by reading their facial expressions and behavior? This was the hypothesis put to the test by the US Transportation Security Administration (TSA) in 2003, as it began testing a new surveillance program called the Screening of Passengers by Observation Techniques program, or Spot for short.

While developing the program, they consulted Paul Ekman, emeritus professor of psychology at the University of California, San Francisco. Decades earlier, Ekman had developed a method to identify minute facial expressions and map them on to corresponding emotions. This method was used to train “behavior detection officers” to scan faces for signs of deception.

But when the program was rolled out in 2007, it was beset with problems. Officers were referring passengers for interrogation more or less at random, and the small number of arrests that came about were on charges unrelated to terrorism. Even more concerning was the fact that the program was allegedly used to justify racial profiling.

Ekman tried to distance himself from Spot, claiming his method was being misapplied. But others suggested that the program’s failure was due to an outdated scientific theory that underpinned Ekman’s method; namely, that emotions can be deduced objectively through analysis of the face.

In recent years, technology companies have started using Ekman’s method to train algorithms to detect emotion from facial expressions. Some developers claim that automatic emotion detection systems will not only be better than humans at discovering true emotions by analyzing the face, but that these algorithms will become attuned to our innermost feelings, vastly improving interaction with our devices.

But many experts studying the science of emotion are concerned that these algorithms will fail once again, making high-stakes decisions about our lives based on faulty science…

“Emotion detection” has grown from a research project to a $20bn industry; learn more about why that’s a cause for concern: “Don’t look now: why you should be worried about machines reading your emotions.”

* Marcus Aurelius, Meditations

###

As we insist on the individual, we might recall that it was on this date in 1989 that Tim Berners-Lee submitted a proposal to CERN for developing a new way of linking and sharing information over the Internet.

It was the first time Berners-Lee proposed a system that would ultimately become the World Wide Web, but his proposal was a relatively vague request to research the details and feasibility of such a system.  He later submitted a proposal, on November 12, 1990, that much more directly detailed the actual implementation of the World Wide Web.

source

 

“Printing…is the preservative of all arts”*…

 


Frontispiece of the Dunhuang Diamond Sūtra

 

In 366, the itinerant monk Yuezun was wandering through the arid landscape [around the Western Chinese city of Dunhuang] when a fantastical sight appeared before him: a thousand buddhas, bathed in golden light. (Whether heat, exhaustion or the strange voice of the sands worked themselves on his imagination is anyone’s guess.) Awed by his vision, Yuezun took up hammer and chisel and carved a devotional space into a nearby cliff-face. It soon became a centre for religion and art: Dunhuang was situated at the confluence of two major Silk Road routes, and both departing and returning merchants made offerings. By the time the site fell into disuse in the 14th century, almost 500 temples had been carved from the cliff.

Among the hundreds of caves was a chamber that served as a storeroom for books. The Library Cave held more than 50,000 texts: religious tracts, business reports, calendars, dictionaries, government documents, shopping lists, and the oldest dated printed book in the world. A colophon at the end of the Dunhuang Diamond Sūtra scroll dates it to 868, nearly six centuries before the first Gutenberg Bible…

Learn more at: “The Oldest Printed Book in the World.”  Then page through the British Library’s digitization of its restoration.

* Isaiah Thomas (the 19th century publisher and author, not the basketball player)

###

As we treasure tomes, we might recall that it was on this date in 1990 that Tim Berners-Lee published a formal proposal for a “Hypertext project” that he called the World Wide Web (though at the time he rendered it as one word: “WorldWideWeb”)… laying the foundation for a network that has become central to the information age– a network that, with its connected technologies, is believed by many to have sparked a revolution as fundamental and impactful as the one ignited by Gutenberg and moveable type.

source

 

Written by LW

November 12, 2018 at 1:01 am

“Chance favors the connected mind”*…

 

The Wall Street Journal‘s review of the web in late 1996– completely intact, with links still live…

Stroll down memory lane here.

[TotH to Benedict Evans]

See also “We haven’t learned anything about what the web is for since 1996.”

* Steven Johnson, Where Good Ideas Come From: The Natural History of Innovation

###

As we try to remember, we might send well-connected birthday greetings to Bob Wallace; he was born on this date in 1949.  A software developer, programmer, and the ninth employee of Microsoft, he was the first popular user of the term “shareware,” creator of the word-processing program PC-Write, founder of the software company Quicksoft, and an “online drug guru” who devoted much time and money to research on psychedelic drugs.

Bob ended his Usenet posts with the phrase, “Bob Wallace (just my opinion).”

 source

 

Written by LW

May 29, 2018 at 1:01 am
