(Roughly) Daily


“History is who we are and why we are the way we are”*…

What a long, strange trip it’s been…

March 12, 1989: “Information Management: A Proposal”

While working at CERN, Tim Berners-Lee first comes up with the idea for the World Wide Web. To pitch it, he submits to his employers a proposal for organizing scientific documents titled “Information Management: A Proposal.” In this proposal, Berners-Lee sketches out what the web will become, including early versions of the HTTP protocol and HTML.

The first entry in a timeline that serves as a table of contents for a series of informative blog posts: “The History of the Web,” from @jay_hoffmann.

* David McCullough

###

As we jack in, we might recall that it was on this date in 1969 that the world first learned of what would become the internet, which would, in turn, become the backbone of the web. UCLA’s announcement read:

UCLA will become the first station in a nationwide computer network which, for the first time, will link together computers of different makes and using different machine languages into one time-sharing system.

Creation of the network represents a major forward step in computer technology and may serve as the forerunner of large computer networks of the future.

The ambitious project is supported by the Defense Department’s Advanced Research Projects Agency (ARPA), which has pioneered many advances in computer research, technology and applications during the past decade. The network project was proposed and is headed by ARPA’s Dr. Lawrence G. Roberts.

The system will, in effect, pool the computer power, programs and specialized know-how of about 15 computer research centers, stretching from UCLA to M.I.T. Other California network stations (or nodes) will be located at the Rand Corp. and System Development Corp., both of Santa Monica; the Santa Barbara and Berkeley campuses of the University of California; Stanford University and the Stanford Research Institute.

The first stage of the network will go into operation this fall as a subnet joining UCLA, Stanford Research Institute, UC Santa Barbara, and the University of Utah. The entire network is expected to be operational in late 1970.

Engineering professor Leonard Kleinrock [see here], who heads the UCLA project, describes how the network might handle a sample problem:

Programmers at Computer A have a blurred photo which they want to bring into focus. Their program transmits the photo to Computer B, which specializes in computer graphics, and instructs B’s program to remove the blur and enhance the contrast. If B requires specialized computational assistance, it may call on Computer C for help.

The processed work is shuttled back and forth until B is satisfied with the photo, and then sends it back to Computer A. The messages, ranging across the country, can flash between computers in a matter of seconds, Dr. Kleinrock says.

UCLA’s part of the project will involve about 20 people, including some 15 graduate students. The group will play a key role as the official network measurement center, analyzing computer interaction and network behavior, comparing performance against anticipated results, and keeping a continuous check on the network’s effectiveness. For this job, UCLA will use a highly specialized computer, the Sigma 7, developed by Scientific Data Systems of Los Angeles.

Each computer in the network will be equipped with its own interface message processor (IMP) which will double as a sort of translator among the Babel of computer languages and as a message handler and router.

Computer networks are not an entirely new concept, notes Dr. Kleinrock. The SAGE radar defense system of the Fifties was one of the first, followed by the airlines’ SABRE reservation system. At the present time, the nation’s electronically switched telephone system is the world’s largest computer network.

However, all three are highly specialized and single-purpose systems, in contrast to the planned ARPA system which will link a wide assortment of different computers for a wide range of unclassified research functions.

“As of now, computer networks are still in their infancy,” says Dr. Kleinrock. “But as they grow up and become more sophisticated, we will probably see the spread of ‘computer utilities’, which, like present electronic and telephone utilities, will service individual homes and offices across the country.”

source
Boelter Hall, UCLA

source

Written by (Roughly) Daily

July 3, 2022 at 1:00 am

“Speed and acceleration are merely the dream of making time reversible”*…

In the early 20th century, there was Futurism…

The Italian Futurists, from the first half of the twentieth century… wanted to drive modernisation in turn-of-the-century Italy at a much faster pace. They saw the potential in machines, and technology, to transform the country, to demand progress. It was not, however, merely an incrementalist approach they were after: words like annihilation, destruction and apocalypse appear in the writings of the futurists, including the author of The Futurist Manifesto, Filippo Tommaso Marinetti. ‘We want to glorify war – the only cure for the world…’ Marinetti proclaimed – this was not for the faint-hearted! That same Marinetti was the founder of the Partito Politico Futurista in 1918, which became part of Mussolini’s Fascist party in 1919. Things did not go well after that.

Beautiful Ideas Which Kill: Accelerationism, Futurism and Bewilderment

And now, in the early 21st century, there is Accelerationism…

These [politically-motivated mass] killings were often linked to the alt-right, described as an outgrowth of the movement’s rise in the Trump era. But many of these suspected killers, from Atomwaffen thugs to the New Zealand mosque shooter to the Poway synagogue attacker, are more tightly connected to a newer and more radical white supremacist ideology, one that dismisses the alt-right as cowards unwilling to take matters into their own hands.

It’s called “accelerationism,” and it rests on the idea that Western governments are irreparably corrupt. As a result, the best thing white supremacists can do is accelerate their demise by sowing chaos and creating political tension. Accelerationist ideas have been cited in mass shooters’ manifestos — explicitly, in the case of the New Zealand killer — and are frequently referenced in white supremacist web forums and chat rooms.

Accelerationists reject any effort to seize political power through the ballot box, dismissing the alt-right’s attempts to engage in mass politics as pointless. If one votes, one should vote for the most extreme candidate, left or right, to intensify points of political and social conflict within Western societies. Their preferred tactic for heightening these contradictions, however, is not voting, but violence — attacking racial minorities and Jews as a way of bringing us closer to a race war, and using firearms to spark divisive fights over gun control. The ultimate goal is to collapse the government itself; they hope for a white-dominated future after that…

“Accelerationism: the obscure idea inspiring white supremacist killers around the world” (and source of the image above)

See also: “A Year After January 6, Is Accelerationism the New Terrorist Threat?”

For a look at the “intellectual” roots of accelerationism, see “Accelerationism: how a fringe philosophy predicted the future we live in.”

For a powerful articulation of the dangers of Futurism (and even more, Accelerationism), see “The Perils of Smashing the Past.”

And for a reminder of the not-so-obvious ways that movements like these live on, see “The Intentionally Scandalous 1932 Cookbook That Stands the Test of Time,” on The Futurist Cookbook, by Futurist Manifesto author Filippo Tommaso Marinetti… which foreshadowed the “food as fuel” culinary movements that we see today.

* Jean Baudrillard

###

As we slow down, we might send an “Alles Gute zum Geburtstag” to the polymathic Gottfried Wilhelm Leibniz, the philosopher, mathematician, and political adviser, who was important both as a metaphysician and as a logician, but who is probably best remembered for his independent invention of the calculus; he was born on this date in 1646. Leibniz discovered and developed differential and integral calculus on his own, which he published in 1684; but he became involved in a bitter priority dispute with Isaac Newton, whose ideas on the calculus were developed earlier (1665), but published later (1687).

As it happens, Leibniz was a wry and incisive political and cultural observer. Consider, e.g…

If geometry conflicted with our passions and our present concerns as much as morality does, we would dispute it and transgress it almost as much–in spite of all Euclid’s and Archimedes’ demonstrations, which would be treated as fantasies and deemed to be full of fallacies. [Leibniz, New Essays, p. 95]


 source

“O brave new world, that has such people in ‘t!”*…

The estimable Steven Johnson suggests that the creation of Disney’s masterpiece, Snow White, gives us a preview of what may be coming with AI algorithms sophisticated enough to pass for sentient beings…

… You can make the argument that the single most dramatic acceleration point in the history of illusion occurred between the years of 1928 and 1937, the years between the release of Steamboat Willie [here], Disney’s breakthrough sound cartoon introducing Mickey Mouse, and the completion of his masterpiece, Snow White, the first long-form animated film in history [here— actually the first full-length animated feature produced in the U.S.; the first produced anywhere in color]. It is hard to think of another stretch where the formal possibilities of an artistic medium expanded in such a dramatic fashion, in such a short amount of time.

[There follows a fascinating history of the Disney Studios’ technical innovations that made Snow White possible, and an account of the film’s remarkable premiere…]

In just nine years, Disney and his team had transformed a quaint illusion—the dancing mouse is whistling!—into an expressive form so vivid and realistic that it could bring people to tears. Disney and his team had created the ultimate illusion: fictional characters created by hand, etched onto celluloid, and projected at twenty-four frames per second, that were somehow so believably human that it was almost impossible not to feel empathy for them.

Those weeping spectators at the Snow White premiere signaled a fundamental change in the relationship between human beings and the illusions concocted to amuse them. Complexity theorists have a term for this kind of change in physical systems: phase transitions. Alter one property of a system—lowering the temperature of a cloud of steam, for instance—and for a while the changes are linear: the steam gets steadily cooler. But then, at a certain threshold point, a fundamental shift happens: below 212 degrees Fahrenheit, the gas becomes liquid water. That moment marks the phase transition: not just cooler steam, but something altogether different.

It is possible—maybe even likely—that a further twist awaits us. When Charles Babbage encountered an automaton of a ballerina as a child in the early 1800s, the “irresistible eyes” of the mechanism convinced him that there was something lifelike in the machine.  Those robotic facial expressions would seem laughable to a modern viewer, but animatronics has made a great deal of progress since then. There may well be a comparable threshold in simulated emotion—via robotics or digital animation, or even the text chat of an AI like LaMDA—that makes it near impossible for humans not to form emotional bonds with a simulated being. We knew the dwarfs in Snow White were not real, but we couldn’t keep ourselves from weeping for their lost princess in sympathy with them. Imagine a world populated by machines or digital simulations that fill our lives with comparable illusion, only this time the virtual beings are not following a storyboard sketched out in Disney’s studios, but instead responding to the twists and turns and unmet emotional needs of our own lives. (The brilliant Spike Jonze film Her imagined this scenario using only a voice.) There is likely to be the equivalent of a Turing Test for artificial emotional intelligence: a machine real enough to elicit an emotional attachment. It may well be that the first simulated intelligence to trigger that connection will be some kind of voice-only assistant, a descendant of software like Alexa or Siri—only these assistants will have such fluid conversational skills and growing knowledge of our own individual needs and habits that we will find ourselves compelled to think of them as more than machines, just as we were compelled to think of those first movie stars as more than just flickering lights on a fabric screen. Once we pass that threshold, a bizarre new world may open up, a world where our lives are accompanied by simulated friends…

Are we in for a phase-shift in our understanding of companionship? “Natural Magic,” from @stevenbjohnson, adapted from his book Wonderland: How Play Made the Modern World.

And for a different, but apposite perspective, from the ever-illuminating L. M. Sacasas (@LMSacasas), see “LaMDA, Lemoine, and the Allures of Digital Re-enchantment.”

* Shakespeare, The Tempest

###

As we rethink relationships, we might recall that it was on this date in 2007 that the original iPhone went on sale. Generally downplayed by traditional technology pundits after its announcement six months earlier, the iPhone was greeted by long lines of buyers around the country on that first day. It quickly became a phenomenon: one million iPhones were sold in only 74 days. Since those early days, the ensuing iPhone models have continued to set sales records and have radically changed not only the smartphone and technology industries, but the world in which they operate as well.

The original iPhone

source

“The test of a first-rate intelligence is the ability to hold two opposed ideas in the mind at the same time, and still retain the ability to function”*…

The Long Tail

On the one hand: Ted Gioia suggests that, while ‘The Long Tail’ was supposed to boost alternative voices in music, movies, and books, the exact opposite has happened…

When I first heard people predict the rise of the Long Tail, I was amused. Not only did it seem wrong-headed, but it ran counter to everything I saw happening around me.

It pains me to say this—because the Long Tail was sold to us as an economic law that not only predicted a more inclusive era of prosperity, but would especially help creative people. According to its proponents, the Long Tail would revitalize our culture by expanding the scope of the arts and giving a boost to visionaries on the fringes of society.

Alternative voices would be nurtured and flourish. Music would get cooler and more surprising. Books would become more diverse and interesting. Indie films would reach larger audiences. Etc. etc. etc.

Hey, what’s not to like?

But it never happened. More to the point, it was never going to happen because the story was a fairy tale. I knew it back then because I had been hired on a number of occasions to analyze the Long Tail myself. But the flaws in the reasoning are far more obvious today, even to me.

Nonetheless many believed it—and many still do. So it’s worth digging into the story of the Long Tail, and examining exactly why it never delivered its promise.

And maybe we can find some alternative pathway to that lost cultural renaissance by seeing how this one went off the rails.
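Gioia’s skepticism has a simple statistical core: sales of cultural goods tend to follow a power-law (“Zipf-like”) distribution, in which the head of the catalog captures most of the revenue no matter how long the tail grows. Here is a minimal sketch in Python; the catalog size and exponent are made-up, illustrative assumptions, not figures from Gioia’s analysis:

```python
# Illustrative sketch: revenue concentration under a Zipf-like power law.
# All numbers are assumptions for illustration, not data from the article.

def zipf_sales(n_items: int, exponent: float = 1.0) -> list[float]:
    """Sales of the item at rank r are proportional to 1 / r**exponent."""
    return [1.0 / (rank ** exponent) for rank in range(1, n_items + 1)]

sales = zipf_sales(n_items=100_000)   # a 100,000-title catalog
total = sum(sales)

head = sum(sales[:1_000])   # the top 1,000 "hit" titles
tail = total - head         # the other 99,000 titles in the long tail

print(f"Head share of revenue: {head / total:.1%}")   # ~61.9%
print(f"Tail share of revenue: {tail / total:.1%}")   # ~38.1%
print(f"An average tail title earns {(tail / 99_000) / (head / 1_000):.2%} "
      "of an average head title")                     # ~0.62%
```

Under these assumptions, the top one percent of titles takes roughly three-fifths of all revenue, and an average tail title earns well under one percent of an average hit: exactly the “quiet doldrums” Kelly describes below.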

On the other hand: Cal Newport suggests that Kevin Kelly‘s fourteen-year-old prediction that an artist could make a living online with a thousand true fans is (finally) coming true…

In his “1,000 True Fans” essay, Kelly explains that he wasn’t as excited about this new economic model as others seemed to be. “The long tail is famously good news for two classes of people: a few lucky aggregators, such as Amazon and Netflix, and 6 billion consumers,” he writes. “But the long tail is a decidedly mixed blessing for creators.” If your work lives in the long tail, the introduction of Internet-based markets might mean that you go from selling zero units of your creations to selling a handful of units a month, but this makes little difference to your livelihood. “The long tail offers no path out of the quiet doldrums of minuscule sales,” Kelly writes. “Other than aim for a blockbuster hit, what can an artist do to escape the long tail?”

This question might seem fatalistic, but Kelly had a solution. If your creative work exists in the long tail, generating a small but consistent number of sales, then it’s probably sufficiently good to support a small but serious fan base, assuming you’re willing to put in the work required to cultivate this community. In an earlier age, a creative professional might be limited to fans who lived nearby. But by using the tools of the Internet, Kelly argued, it was now possible for creative types to both find and interact with supporters all around the world…

A shining example of the 1,000 True Fans model is the podcasting boom. There are more than eight hundred and fifty thousand active podcasts available right now. Although most of these shows are small and don’t generate much money, the number of people making a full-time living off original audio content is substantial. The key to a financially viable podcast is to cultivate a group of True Fans eager to listen to every episode. The value of each such fan, willing to stream hours and hours of a creator’s content, is surprisingly large; if sufficiently committed, even a modest-sized audience can generate significant income for a creator. According to an advertising agency I consulted, for example, a weekly podcast that generates thirty thousand downloads per episode should be able to reach Kelly’s target of generating a hundred thousand dollars a year in income. Earning a middle-class salary by talking through a digital microphone to a fiercely loyal band of supporters around the world, who are connected by the magic of the Internet, is about as pure a distillation of Kelly’s vision as you’re likely to find…
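The arithmetic behind that hundred-thousand-dollar figure is easy to check. A back-of-the-envelope sketch, using an assumed CPM (revenue per thousand downloads) and an assumed ad-slot count that are not from Newport’s piece:

```python
# Sanity check of the podcast income figure Newport cites. The CPM and
# ad-slot values below are illustrative assumptions, not from the article.

downloads_per_episode = 30_000
episodes_per_year = 52
cpm_dollars = 25.0          # assumed revenue per 1,000 downloads, per ad slot
ad_slots_per_episode = 2.5  # assumed mix of pre-, mid-, and post-roll ads

annual_downloads = downloads_per_episode * episodes_per_year
ad_income = annual_downloads / 1_000 * cpm_dollars * ad_slots_per_episode

print(f"Annual downloads: {annual_downloads:,}")   # 1,560,000
print(f"Estimated income: ${ad_income:,.0f}")      # $97,500

# For comparison, Kelly's original 1,000 True Fans arithmetic:
print(f"Kelly's target:   ${1_000 * 100:,}")       # $100,000
```

With those assumed rates, the show lands within a few percent of Kelly’s target, the same ballpark as 1,000 fans paying $100 a year.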

The real breakthroughs that enabled the revival of the 1,000 True Fans model are better understood as cultural. The rise in both online news paywalls and subscription video-streaming services trained users to be more comfortable paying à la carte for content. When you already shell out regular subscription fees for newyorker.com, Netflix, Peacock, and Disney+, why not also pay for “Breaking Points,” or throw a monthly donation toward Maria Popova? In 2008, when Kelly published the original “1,000 True Fans” essay, it was widely assumed that it would be hard to ever persuade people to pay money for most digital content. (This likely explains why so many of Kelly’s examples focus on selling tangible goods, such as DVDs or custom prints.) This is no longer true. Opening up these marketplaces to purely digital artifacts—text, audio, video, online classes—significantly lowered the barriers to entry for creative professionals looking to make a living online…

But can this last? Is it destined to fall prey to the forces that Gioia catalogues?

The recent history of the Internet, however, warns that we shouldn’t necessarily expect the endearingly homegrown nature of these 1,000 True Fans communities to persist. When viable new economic niches emerge online, venture-backed businesses, looking to extract their cut, are typically not far behind. Services such as Patreon and Kickstarter are jostling for a dominant position in this direct-to-consumer creative marketplace. A prominent recent example of such attempts to centralize the True Fan economy is Substack, which eliminates friction for writers who want to launch paid e-mail newsletters. Substack now has more than a million subscribers who pay for access to newsletters, and is currently valued at around six hundred and fifty million dollars. With this type of money at stake, it’s easy to imagine a future in which a small number of similarly optimized platforms dominate most of the mechanisms by which creative professionals interact with their 1,000 True Fans. In the optimistic scenario, this competition will lead to continued streamlining of the process of serving supporters, increasing the number of people who are able to make a good living off of their creative work: an apotheosis of sorts of Kelly’s original vision. A more pessimistic prediction is that the current True Fan revolution will eventually go the way of the original Web 2.0 revolution, with creators increasingly ground in the gears of monetization. The Substack of today makes it easy for a writer to charge fans for a newsletter. The Substack of tomorrow might move toward a flat-fee subscription model, driving users toward an algorithmically optimized collection of newsletter content, concentrating rewards within a small number of hyper-popular producers, and in turn eliminating the ability for any number of niche writers to make a living…

The future of the creative economy: “Where Did the Long Tail Go?,” from @tedgioia and “The Rise of the Internet’s Creative Middle Class,” from Cal Newport on @kevin2kelly in @NewYorker.

* F. Scott Fitzgerald (“The Crack-Up,” Esquire, February, 1936)

###

As we contemplate culture and commerce, we might recall that it was on this date in 1894 (after 30 states had already enshrined the occasion) that Labor Day became a federal holiday in the United States.

The country’s first Labor Day parade in New York City on Sept. 5, 1882. This sketch appeared in Frank Leslie’s Illustrated Newspaper.

source (and source of more on the history of Labor Day)

“Plants can’t move, yet the insects come to them and spread their pollen”*…

A canola plant damaged by heat and drought in Saskatchewan, Canada, last July

The impact of climate change on agriculture is much discussed– but mostly at the level of yields. Carolyn Beans looks into what a warming planet means for fertilization and reproduction…

… heat is a pollen killer. Even with adequate water, heat can damage pollen and prevent fertilization in canola and many other crops, including corn, peanuts, and rice.

For this reason, many growers aim for crops to bloom before the temperature rises. But as climate change increases the number of days over 90 degrees in regions across the globe, and multi-day stretches of extreme heat become more common, getting that timing right could become challenging, if not impossible.

Faced with a warmer future, researchers are searching for ways to help pollen beat the heat. They’re uncovering genes that could lead to more heat-tolerant varieties and breeding cultivars that can survive winter and flower before heat strikes. They’re probing pollen’s precise limits and even harvesting pollen at large scales to spray directly onto crops when weather improves.

At stake is much of our diet. Every seed, grain, and fruit that we eat is a direct product of pollination…

Farmers and scientists are increasingly observing that unusually high springtime temperatures can kill pollen and interfere with the fertilization of crops. Researchers are now searching for ways to help pollen beat the heat, including developing more heat-tolerant varieties: “Pollen and Heat: A Looming Challenge for Global Agriculture,” from @carolynmbeans in @YaleE360.

* Nahoko Uehashi

###

As we try to stay cool, we might recall that it was on this date in 1960 that chlorophyll– the green pigment responsible for photosynthesis in plants– was first synthesized. The feat was accomplished by Robert Burns Woodward, the preeminent synthetic organic chemist of the twentieth century, who was awarded the Nobel Prize in 1965 for this and other syntheses of complex natural compounds (including vitamin B12).

Robert Burns Woodward