(Roughly) Daily


“Some folks look for answers, others look for fights”*…

Grateful Dead plays Red Rocks for the final time, August 13, 1987 [source]

Max Abelson takes a break from his (essential) coverage of money and power at Bloomberg News and Businessweek to appreciate the community that’s grown up around the Internet Archive’s Grateful Dead Archive (where one can find– among the over 15,000 concert recordings– not one, but two full takes of the show pictured above)…

On the Archive, the writing about the Dead’s live music often transcends the personal mode and approaches something closer to the galactic. Nothing brings out that cosmic style like “Dark Star,” a song that the band stretched from a three-minute studio single into its own solar system. Ginosega left a flight log for the same forty-three-minute 1973 version that played in the friend’s basement: “About 12 minutes in, Phil fires the engines and turns the ship out of orbit, until at 17 minutes we have arrived in the deepest, darkest part of the galaxy.” The trip isn’t half over. “Only at 21 minutes into the song do they actually start playing the song.” The post, which has a kind of sci-fi internal logic, describes interstellar wind and multicolored ooze, before, “at about 36 minutes, we start the return trip, passing through more familiar systems on our way back home.”

One of the magical things about how high the Dead flew is that they managed to do it without, say, Sly Stone’s rhythm, Joni Mitchell’s poetry, or Brian Wilson’s voice. The allure of this band—whatever it is that keeps sparking so much cosmic wonder and nostalgia—is foggy and mysterious. Paumgarten, in his New Yorker piece, identified a sprawling combination of factors, including Garcia’s soulful charisma and Appalachian gloom, the band’s 26,000-watt sound system, an ethos of group improvisation, and the “particular note of decay” in each cassette swapped from hand to hand. You can think about the Archive as not just the best tape rack of all, but as a collection of thousands of swings at saying the inexplicable. A user named Scottie78 was so moved by a half-hour version of “Dark Star” at the Spectrum in Philadelphia in 1972 that he not only came close to leaving a bullet point for each minute, but more or less created an identification system to differentiate the micro-micro-genres he heard, from “Space Jazz” and “Acid Jazz” to “Acid Jazzgrass.” It’s embarrassing and magnetic at the same time.

Others tip over from starry-eyed to freaked out. “So cacophonous, atonal and scary that it could potentially traumatize animals when played loud,” Phleshy said in 2004 about a version from Rotterdam in 1972. “If this explanation sounds stupid in words, then listen to the last half-hour of ‘Dark Star’ in a darkened room and see if you feel remotely secure.”

The line between the personal and astronomical is thin. Boboboy’s recollection of the 1989 show at JFK Stadium is what Didion might have described if she had witnessed more people sway: “I clearly remember seeing the swirling masses of thousands on the floor from my perch all the way back.” The dancers below looked like birds up above, “a flock of starlings cruising the sky, but in slow motion.”

Some of the writing aims even higher. “When you want to know what it is like being in heaven, cue up the second set,” Seedanrun wrote about the band’s beloved 1977 show at Cornell. “When you want to feel what it is like to be face to face with God, dim the lights and really focus on the ‘Morning Dew.’”

The glory of that show, performed inside the university’s Barton Hall on a snowy night in May, is perhaps the nearest the Dead Archives come to consensus. The thought of sullying it with a rating scale offended a user named GruUbic: “If this is five stars, is heaven a 4.5?” In 2004, BillDP went further, calling the show “the single best live performance I have ever heard from any group at any time.” His authoritativeness is only outdone by the dumbstruck. “Mere words cannot do justice,” Grateful Hillbilly posted in 2015. “Words like amazing and unbelievable and incomparable don’t capture the immensity of awe.”

[Brewster] Kahle, the Internet Archive’s founder, tells me that he wishes more of the web was shaped like the Dead Archive. “What you’re looking at,” he said, “is from an era of the Internet that I think is best typified by what Tim Berners-Lee called ‘pages.’” Today, he said, instead, what dominates is the “feed.” (“Horrible word,” he added.) Facebook and Twitter scroll by endlessly, unaccountably, and unpleasantly, but “it wasn’t always that way, and it was a choice.” Each Dead show, he said, is “something you can anchor to, it’s something you can revolve around.” He went on: “By making things endure, we can have people cherish them, use them, and invest in them. So the writing is fundamentally different. I think we should go back to it—or forward to it.”…

The way the internet was… and should be? “In the Dead Archives,” from @maxabelson.

* The Grateful Dead, “Playing In The Band” (written by Bob Weir, Robert Hunter, and Mickey Hart)

###

As we go hear Uncle John’s Band, we might send bluesy birthday greetings to Ronald Charles McKernan; he was born on this date in 1945. Better known by his stage name, “Pigpen,” he was a founding member of The Warlocks… which became the Grateful Dead. He was the band’s original frontman, playing harmonica and electric organ; but Jerry Garcia’s and Phil Lesh’s influence on the band grew stronger as they embraced psychedelic rock, and Pigpen’s contributions receded to vocals, harmonica, and percussion (though he continued to be a frontman in concert for some numbers, including his interpretations of Bobby Bland’s “Turn On Your Love Light” and the Rascals’ “Good Lovin’”).

Pigpen was unique among his bandmates in preferring alcohol to psychedelics, and sadly succumbed to alcoholism– from complications of which he died in 1973.

source

“Foresight begins when we accept that we are now creating a civilization of risk”*…

There have been a handful of folks– Vernor Vinge, Don Michael, Sherry Turkle, to name a few– who were, decades ago, exceptionally foresightful about the technologically-mediated present in which we live. Philip Agre belongs in their number…

In 1994 — before most Americans had an email address or Internet access or even a personal computer — Philip Agre foresaw that computers would one day facilitate the mass collection of data on everything in society.

That process would change and simplify human behavior, wrote the then-UCLA humanities professor. And because that data would be collected not by a single, powerful “big brother” government but by lots of entities for lots of different purposes, he predicted that people would willingly part with massive amounts of information about their most personal fears and desires.

“Genuinely worrisome developments can seem ‘not so bad’ simply for lacking the overt horrors of Orwell’s dystopia,” wrote Agre, who has a doctorate in computer science from the Massachusetts Institute of Technology, in an academic paper.

Nearly 30 years later, Agre’s paper seems eerily prescient, a startling vision of a future that has come to pass in the form of a data industrial complex that knows no borders and few laws. Data collected by disparate ad networks and mobile apps for myriad purposes is being used to sway elections or, in at least one case, to out a gay priest. But Agre didn’t stop there. He foresaw the authoritarian misuse of facial recognition technology, he predicted our inability to resist well-crafted disinformation and he foretold that artificial intelligence would be put to dark uses if not subjected to moral and philosophical inquiry.

Then, no one listened. Now, many of Agre’s former colleagues and friends say they’ve been thinking about him more in recent years, and rereading his work, as pitfalls of the Internet’s explosive and unchecked growth have come into relief, eroding democracy and helping to facilitate a violent uprising on the steps of the U.S. Capitol in January.

“We’re living in the aftermath of ignoring people like Phil,” said Marc Rotenberg, who edited a book with Agre in 1998 on technology and privacy, and is now founder and executive director for the Center for AI and Digital Policy…

As Reed Albergotti (@ReedAlbergotti) explains, better late than never: “He predicted the dark side of the Internet 30 years ago. Why did no one listen?”

Agre’s papers are here.

* Jacques Ellul

###

As we consider consequences, we might recall that it was on this date in 1858 that Queen Victoria sent the first official telegraph message across the Atlantic Ocean from London to U.S. President James Buchanan in Washington, D.C.– initiating a new era in global communications.

Transmission of the message began at 10:50am and wasn’t completed until 4:30am the next day, taking nearly eighteen hours to reach Newfoundland. Ninety-nine words, containing five hundred nine letters, were transmitted at a rate of about two minutes per letter.

After White House staff had satisfied themselves that it wasn’t a hoax, the President sent a reply of 143 words in a relatively rapid ten hours. Without the cable, a dispatch in one direction alone would have taken roughly twelve days by the speediest combination of inland telegraph and fast steamer.
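(For the curious, those figures check out– here’s a quick back-of-the-envelope verification in Python, using only the numbers quoted above; the reply’s pace is expressed per word, since its letter count isn’t given.)

```python
# A quick sanity check of the 1858 transatlantic telegraph figures quoted
# above. All numbers come from the text itself; the reply's letter count
# isn't given, so its pace is expressed per word instead of per letter.

queen_letters = 509   # Queen Victoria's 99-word, 509-letter message
queen_hours = 17.67   # 10:50am to 4:30am the next day ("nearly eighteen hours")

reply_words = 143     # President Buchanan's reply
reply_hours = 10

print(f"Outbound: ~{queen_hours * 60 / queen_letters:.1f} minutes per letter")  # ~2.1
print(f"Reply:    ~{reply_hours * 60 / reply_words:.1f} minutes per word")      # ~4.2

# Versus the ~12-day ship-and-telegraph alternative mentioned above:
print(f"Cable speedup: ~{12 * 24 / queen_hours:.0f}x faster")                   # ~16x
```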

source

“The most incomprehensible thing about the world is that it is at all comprehensible”*…

There is an order to the ordered search for ordered understanding…

Science (from Latin scientia, meaning “knowledge”) is a systematic enterprise that builds and organizes knowledge in the form of testable explanations and predictions about the universe.

Modern science is typically divided into three major branches that consist of the natural sciences (biology, chemistry, physics, astronomy and Earth science), which study nature in the broadest sense; the social sciences (e.g. psychology, sociology, economics, history), which study people and societies; and the formal sciences (e.g. mathematics, logic, theoretical computer science), which study abstract concepts. There is disagreement, however, on the formal sciences being a science [as they use an a priori, as opposed to empirical, methodology]. Disciplines that use science, such as engineering and medicine, are described as applied sciences…

And there is a dazzling array of “branches” of the scientific endeavor:

Acanthochronology – study of cactus spines grown in time ordered sequence

Acarology – study of mites and ticks

Aceology – science of remedies, or of therapeutics; iamatology

Acology – study of medical remedies

Acoustics – science of sound

Actinobiology – synonymous with radiobiology

Adenology – study of glands…

Browse dozens and dozens at “Index of branches of science,” from Wikipedia… whose contributors may be erring on the generous side, as the list includes such entries as “Hamartiology” (the study of sin) and “Taxidermy” (the art of curing and stuffing animals).

* Albert Einstein

###

As we tackle taxonomy, we might recall that it was on this date in 2013 that Google experienced a five-minute outage affecting all of its services, including Google Search, YouTube, and Google Drive. During that brief period, global internet traffic dropped 40%.

“Doing research on the Web is like using a library assembled piecemeal by pack rats and vandalized nightly”*…

But surely, argues Jonathan Zittrain, it shouldn’t be that way…

Sixty years ago the futurist Arthur C. Clarke observed that any sufficiently advanced technology is indistinguishable from magic. The internet—how we both communicate with one another and together preserve the intellectual products of human civilization—fits Clarke’s observation well. In Steve Jobs’s words, “it just works,” as readily as clicking, tapping, or speaking. And every bit as much aligned with the vicissitudes of magic, when the internet doesn’t work, the reasons are typically so arcane that explanations for it are about as useful as trying to pick apart a failed spell.

Underpinning our vast and simple-seeming digital networks are technologies that, if they hadn’t already been invented, probably wouldn’t unfold the same way again. They are artifacts of a very particular circumstance, and it’s unlikely that in an alternate timeline they would have been designed the same way.

The internet’s distinct architecture arose from a distinct constraint and a distinct freedom: First, its academically minded designers didn’t have or expect to raise massive amounts of capital to build the network; and second, they didn’t want or expect to make money from their invention.

The internet’s framers thus had no money to simply roll out a uniform centralized network the way that, for example, FedEx metabolized a capital outlay of tens of millions of dollars to deploy liveried planes, trucks, people, and drop-off boxes, creating a single point-to-point delivery system. Instead, they settled on the equivalent of rules for how to bolt existing networks together.

Rather than a single centralized network modeled after the legacy telephone system, operated by a government or a few massive utilities, the internet was designed to allow any device anywhere to interoperate with any other device, allowing any provider able to bring whatever networking capacity it had to the growing party. And because the network’s creators did not mean to monetize, much less monopolize, any of it, the key was for desirable content to be provided naturally by the network’s users, some of whom would act as content producers or hosts, setting up watering holes for others to frequent.

Unlike the briefly ascendant proprietary networks such as CompuServe, AOL, and Prodigy, content and network would be separated. Indeed, the internet had and has no main menu, no CEO, no public stock offering, no formal organization at all. There are only engineers who meet every so often to refine its suggested communications protocols that hardware and software makers, and network builders, are then free to take up as they please.

So the internet was a recipe for mortar, with an invitation for anyone, and everyone, to bring their own bricks. Tim Berners-Lee took up the invite and invented the protocols for the World Wide Web, an application to run on the internet. If your computer spoke “web” by running a browser, then it could speak with servers that also spoke web, naturally enough known as websites. Pages on sites could contain links to all sorts of things that would, by definition, be but a click away, and might in practice be found at servers anywhere else in the world, hosted by people or organizations not only not affiliated with the linking webpage, but entirely unaware of its existence. And webpages themselves might be assembled from multiple sources before they displayed as a single unit, facilitating the rise of ad networks that could be called on by websites to insert surveillance beacons and ads on the fly, as pages were pulled together at the moment someone sought to view them.

And like the internet’s own designers, Berners-Lee gave away his protocols to the world for free—enabling a design that omitted any form of centralized management or control, since there was no usage to track by a World Wide Web, Inc., for the purposes of billing. The web, like the internet, is a collective hallucination, a set of independent efforts united by common technological protocols to appear as a seamless, magical whole.

This absence of central control, or even easy central monitoring, has long been celebrated as an instrument of grassroots democracy and freedom. It’s not trivial to censor a network as organic and decentralized as the internet. But more recently, these features have been understood to facilitate vectors for individual harassment and societal destabilization, with no easy gating points through which to remove or label malicious work not under the umbrellas of the major social-media platforms, or to quickly identify their sources. While both assessments have power to them, they each gloss over a key feature of the distributed web and internet: Their designs naturally create gaps of responsibility for maintaining valuable content that others rely on. Links work seamlessly until they don’t. And as tangible counterparts to online work fade, these gaps represent actual holes in humanity’s knowledge…
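(To make the rot tangible: here’s a minimal sketch of a link check in Python– the URLs are placeholders, and a real survey would need to handle redirects, soft 404s, and rate limits– but it shows how quickly “a click away” can become a dead end.)

```python
# A minimal link-rot check: ask each server whether a page still answers.
# The URLs below are placeholders; substitute links you actually rely on.
import urllib.error
import urllib.request

LINKS = [
    "https://example.com/a-page-that-still-exists",
    "https://example.com/a-page-that-has-rotted",
]

def check(url: str) -> str:
    req = urllib.request.Request(url, method="HEAD")  # ask for headers only
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return f"OK (HTTP {resp.status})"
    except urllib.error.HTTPError as e:
        return f"broken (HTTP {e.code})"       # e.g. 404: the page is gone
    except urllib.error.URLError as e:
        return f"unreachable ({e.reason})"     # the whole host may be gone

for url in LINKS:
    print(f"{url}: {check(url)}")
```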

The glue that holds humanity’s knowledge together is coming undone: “The Internet Is Rotting.” @zittrain explains what we can do to heal it.

(Your correspondent seconds his call to support the critically-important work of The Internet Archive and the Harvard Library Innovation Lab, along with the other initiatives he outlines.)

* Roger Ebert

###

As we protect our past for the future, we might recall that it was on this date in 1937 that Hormel introduced Spam. It was the company’s attempt to increase sales of pork shoulder, not at the time a very popular cut. While there are numerous speculations as to the “meaning of the name” (from a contraction of “spiced ham” to “Scientifically Processed Animal Matter”), its true genesis is known to only a small circle of former Hormel Foods executives.

As a result of the difficulty of delivering fresh meat to the front during World War II, Spam became a ubiquitous part of the U.S. soldier’s diet. It became variously referred to as “ham that didn’t pass its physical,” “meatloaf without basic training,” and “Special Army Meat.” Over 150 million pounds of Spam were purchased by the military before the war’s end. During the war and the occupations that followed, Spam was introduced into Guam, Hawaii, Okinawa, the Philippines, and other islands in the Pacific. Immediately absorbed into native diets, it has become a unique part of the history and effects of U.S. influence in the Pacific islands.

source

“The tribalizing power of the new electronic media, the way in which they return us to the unified fields of the old oral cultures, to tribal cohesion and pre-individualist patterns of thought, is little understood”*…

Nokia was dominant in mobile phone sales from 1998 to around 2010. Nokia’s slogan: Connecting people.

It was amazing to connect with people in the late 90s/early 2000s. I don’t think we were lonely exactly. But maybe meeting people was somewhere between an opportunity, something novel, and, yes, a need – suddenly it was possible to find the right person, or the right community.

So, the zeitgeist of the early 2000s.

I ran across a previous zeitgeist in an article about Choose Your Own Adventure books. They appeared and became massively popular at the same time as text adventure computer games, but neither inspired the invention of the other. How? The real answer may lie far deeper in the cultural subconscious … in the zeitgeist of the 1980s.

1980s: you.

2000s: connection.

2020s: ?

Zeitgeists don’t lead and zeitgeists don’t follow.

I think when we spot some kind of macro trend in establishment consumer ads, it’s never going to be about presenting people with something entirely new. To resonate, it has to be familiar – the trajectory that the consumer is already on – but it also has to scratch an itch. The brand wants to be a helpful fellow traveller, if you like.

I wonder what the zeitgeist of the 2020s will be, or is already maybe. What deep human need will be simultaneously a comfort and an aspiration? There should be hints of it in popular culture already. (If I knew how to put my finger on it, I’d be an ad planner.)

If I had to guess then it would be something about belonging.

There was a hint of this in Reddit’s 5 second Super Bowl commercial which went hard on one of their communities, r/WallStreetBets, ganging up to bring down hedge funds. Then we’ve got a couple of generations now who grew up with the idea of fandoms, and of course conspiracy theories like QAnon too. If you squint, you can kind of see this in the way Tesla operates: it’s a consumer brand but it’s also a passionate, combative cause.

Belonging to a tribe is about identity and strength, it’s solace and empowerment all at once. And also knowledge, certainty, and trust in an era of complexity, disinfo, and hidden agendas.

Given that backdrop, it’s maybe unsurprising that the trend in software is towards Discord servers and other virtual private neighbourhoods. But how else will this appear? And is it just the beginnings of something else, something bigger?

“1980s (you), 2000s (connection). What’s the 2020s zeitgeist?” from Matt Webb (@genmon)

* Marshall McLuhan

###

As we double down on diversity, we might send well-connected birthday greetings to Joseph Carl Robnett Licklider; he was born on this date in 1915. Better known as “J.C.R.” or “Lick,” he was a prominent figure in the development of computing and computer science. Considered the “Johnny Appleseed” of computing, he planted many of the seeds of computing in the digital age– especially via his idea of a universal computer network to easily transfer and retrieve information, which his successors developed into the internet.

Robert Taylor, founder of Xerox PARC‘s Computer Science Laboratory and Digital Equipment Corporation‘s Systems Research Center, noted that “most of the significant advances in computer technology—including the work that my group did at Xerox PARC—were simply extrapolations of Lick’s vision. They were not really new visions of their own. So he was really the father of it all.”

source

Written by (Roughly) Daily

March 11, 2021 at 1:01 am
