(Roughly) Daily

Posts Tagged ‘apple’

“Patents need inventors more than inventors need patents”*…

 

[image: toilet-paper patent]

 

Patents for invention — temporary monopolies on the use of new technologies — are frequently cited as a key contributor to the British Industrial Revolution. But where did they come from? We typically talk about them as formal institutions, imposed from above by supposedly wise rulers. But their origins, or at least their introduction to England, tell a very different story…

How the 15th-century city guilds of Italy paved the way for the creation of patents and intellectual property as we know it: “Age of Invention: The Origin of Patents.”

(Image above: source)

* Kalyan C. Kankanala, Fun IP, Fundamentals of Intellectual Property

###

As we ruminate on rights, we might recall that it was on this date in 1981 that IBM introduced the IBM Personal Computer, commonly known as the IBM PC, the original version of the IBM PC compatible computer design… a relevant descriptor, as the IBM PC was based on an open architecture, and an ecosystem of third-party suppliers soon emerged to provide peripheral devices, expansion cards, software, and ultimately, IBM-compatible computers.  While IBM has since left the PC business, it had a substantial influence on the market in standardizing a design for personal computers; “IBM compatible” became an important criterion for sales growth.  Only Apple has been able to develop a significant share of the microcomputer market without compatibility with the IBM architecture (and what it has become).

[image: Jugend-Computerschule mit IBM-PC (Bundesarchiv)] source

 

“The future is there… looking back at us. Trying to make sense of the fiction we will have become”*…

 

Octavia Butler

 

Tim Maughan, an accomplished science fiction writer himself, considers sci-fi works from the 1980s and ’90s and their predictive power.  Covering Bruce Sterling, William Gibson, Rudy Rucker, Stephen King, P.D. James, an episode of Star Trek: Deep Space Nine, and Blade Runner, he reserves special attention for a most deserving subject…

When you imagine the future, what’s the first date that comes into your mind? 2050? 2070? The year that pops into your head is almost certainly related to how old you are — some point within our lifetimes yet distant enough to be mysterious, still just outside our grasp. For those of us growing up in the 1980s and ’90s — and for a large number of science fiction writers working in those decades — the 2020s felt like that future. A decade we would presumably live to see but also seemed sufficiently far away that it could be a world full of new technologies, social movements, or political changes. A dystopia or a utopia; a world both alien and familiar.

That future is, of course, now…

Two science fiction books set in the 2020s tower over everything else from that era in their terrifying prescience: Octavia Butler’s Parable of the Sower (1993) and Parable of the Talents (1998). These books by the late master kick off in 2024 Los Angeles and are set against a backdrop of a California that’s been ravaged by floods, storms, and droughts brought on by climate change. Middle- and working-class families huddle together in gated communities, attempting to escape the outside world through addictive pharmaceuticals and virtual reality headsets. New religions and conspiracy theory–chasing cults begin to emerge. A caravan of refugees head north to escape the ecological and social collapse, while a far-right extremist president backed by evangelical Christians comes to power using the chillingly familiar election slogan Make America Great Again.

Although it now feels like much of Butler’s Parable books might have been pulled straight from this afternoon’s Twitter or tonight’s evening news, some elements are more far-fetched. The second book ends with followers of the new religion founded by the central character leaving Earth in a spaceship to colonize Alpha Centauri. Butler originally planned to write a third book following the fates of these interstellar explorers but, sadly, passed away in 2006 before she had a chance. She left us with a duology that remains more grounded and scarily familiar to those of us struggling to come to terms with the everyday dystopias that the real 2020s seem to be already presenting us.

Not that this remarkable accuracy was ever her objective.

“This was not a book about prophecy; this was an if-this-goes-on story,” Butler said about the books during a talk at MIT in 1998. “This was a cautionary tale, although people have told me it was prophecy. All I have to say to that is I certainly hope not.”

In the same talk, Butler describes in detail the fears that drove her to write this warning: the debate over climate change, the eroding of workers’ rights, the rise of the private prison industry, and the media’s increasing refusal to talk about all of these in favor of focusing on soundbite propaganda and celebrity news. Again, these are fears that feel instantly familiar today…

What Blade Runner, cyberpunk– and Octavia Butler– had to say about the age we’re entering now: “How Science Fiction Imagined the 2020s.”

* William Gibson, Pattern Recognition

###

As we honor prophets, we might recall that it was on this date in 1984 that Apple aired an epoch-making commercial, “1984” (directed by Blade Runner director Ridley Scott), during Super Bowl XVIII– for the first and only time.  Two days later, the first Apple Macintosh went on sale.

 

Written by LW

January 22, 2020 at 1:01 am

“When you have seen one ant, one bird, one tree, you have not seen them all”*…

 

[image from Hugo Livet’s “Fireworks” series]

 

In fact, you may not have seen them– really seen them– at all…

Photographs of trees whose trunk and all visible branches have been removed by computer. Only the explosive power of the leaves remains, like fireworks in broad daylight. Through the process of retouching images, I sought to extract by subtraction this explosiveness, this will to live which participates in the majesty of the plant world but which is sometimes veiled by our habits of perception...

[two more images from the “Fireworks” series]

 

Visual artist and photographer Hugo Livet on his series “Fireworks” (“Feu d’artifice”).  More at his site.

* E. O. Wilson

###

As we commune with Kilmer, we might recall that it was on this date in 1307 that Wilhelm Tell (or, as we Anglos tend to know him, William Tell) shot an apple off his son’s head.

Tell, originally from Bürglen, was a resident of the Canton of Uri (in what is now Switzerland), well known as an expert marksman with the crossbow. At the time, the Habsburg emperors of Austria were seeking to dominate Uri.  Hermann Gessler, the newly appointed Austrian Vogt (the Holy Roman Empire’s title for “overlord”) of Altdorf, raised a pole in the village’s central square, hung his hat on top of it, and demanded that all the local townsfolk bow before the hat.  When Tell passed by the hat without bowing, he was arrested; his punishment was being forced to shoot an apple off the head of his son, Walter– or else both would be executed. Tell was promised freedom if he succeeded.

As the legend has it, Tell split the fruit with a single bolt from his crossbow.  When Gessler queried him about the purpose of a second bolt in his quiver, Tell answered that if he had killed his son, he would have turned the crossbow on Gessler himself.  Gessler became enraged at that comment, and had Tell bound and brought to his ship to be taken to his castle at Küssnacht.  But when a storm broke on Lake Lucerne, Tell managed to escape.  On land, he went to Küssnacht, and when Gessler arrived, Tell shot him with his crossbow.

Tell’s defiance of Gessler sparked a rebellion, in which Tell himself played a major part, leading to the formation of the Swiss Confederation.

Tell and his son

 

 

Written by LW

November 18, 2019 at 1:01 am

“By far, the greatest danger of Artificial Intelligence is that people conclude too early that they understand it”*…

 

[image: robot writer]

 

Recently, OpenAI announced its latest breakthrough, GPT-2, a language model that can write essays to a prompt, answer questions, and summarize longer works… successfully enough that OpenAI has said it’s too dangerous to release the full model (lest it result in “deepfake news” or other misleading mischief).
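
(For the technically curious: OpenAI did publish a smaller version of GPT-2 at the time, and the full weights followed later that year; both are now available through the open-source Hugging Face transformers library. What follows is a minimal sketch in Python– not OpenAI’s original code– of how one might prompt such a model; the prompt text is just an illustration.)

# A minimal sketch (not OpenAI's code) of prompting a GPT-2-style model,
# assuming the Hugging Face "transformers" package is installed.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")  # the publicly released GPT-2 weights
prompt = "The greatest danger of artificial intelligence is"
# Sample two continuations of the prompt and print them.
for sample in generator(prompt, max_length=60, num_return_sequences=2, do_sample=True):
    print(sample["generated_text"])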

Scott Alexander contemplates the results.  His conclusion:

a brain running at 5% capacity is about as good as the best AI that the brightest geniuses working in the best-equipped laboratories in the greatest country in the world are able to produce in 2019. But:

We believe this project is the first step in the direction of developing large NLP systems without task-specific training data. That is, we are developing a machine language system in the generative style with no explicit rules for producing text. We hope for future collaborations between computer scientists, linguists, and machine learning researchers.

A boring sentiment from an interesting source: the AI wrote that when asked to describe itself. We live in interesting times.

His complete post, eminently worthy of reading in full: “Do Neural Nets Dream of Electric Hobbits?”

[image above, and another account of OpenAI’s creation: “OpenAI says its new robo-writer is too dangerous for public release“]

* Eliezer Yudkowsky

###

As we take the Turing Test, we might send elegantly designed birthday greetings to Steve Jobs; he was born on this date in 1955.  While he is surely well-known to every reader here, let us note for the record that he was instrumental in developing the Macintosh, the computer that took Apple to unprecedented levels of success.  After leaving the company he started with Steve Wozniak, Jobs continued his personal computer development at NeXT Inc.  In 1997, Jobs returned to Apple to lead the company into a new era based on NeXT technologies and consumer electronics.  Some of Jobs’ achievements in this new era include the iMac, the iPhone, the iTunes music store, the iPod, and the iPad.  Under Jobs’ leadership, Apple was at one time the world’s most valuable company. (And, of course, he bought Pixar from George Lucas, and oversaw both its rise to animation dominance and its sale to Disney– as a product of which Jobs became Disney’s largest single shareholder.)

[image: Steve Jobs] source

 

Written by LW

February 24, 2019 at 1:01 am

“Less, but better”*…

 

[image: Dieter Rams]

Dieter Rams is a giant among modern industrial designers; his clean, “functionalist” work inspired legions of designers– perhaps most visibly via his work for Braun, which clearly shaped the thinking of Steve Jobs and Jony Ive (and, before them, the folks at Frog Design)– most noticeably in the design of the iPod…


Rams’ 1958 portable radio for Braun, alongside the iPod

source

But have we, in fact, learned the lessons that Rams worked so hard to teach?

Is it perfect timing or merely perverse to release a documentary promoting the design philosophy “Less, but better” during the holiday season? The opening moments of Gary Hustwit’s “Rams,” about Dieter Rams, are more likely to have you revising your gift list than tossing it out. As the camera pauses on the details of the eighty-six-year-old design legend’s single-story home, built in 1971 in Kronberg, Germany, you may find yourself wondering if you, too, need to buy a wall-mounted stereo (the Audio 2/3, designed by Rams for Braun, in 1962-1963) or a boxy leather swivel chair (the 620 armchair, designed by Rams for Vitsoe, in 1962), or to take up the art of bonsai, which Rams practices in his compact, Japanese-inspired garden.

After this montage, the sound of birds chirping is replaced by the sound of typing, and we see Rams seated in front of the rare object in his home that’s not of his own design: the red Valentine typewriter, designed by Ettore Sottsass and Perry King for Olivetti, in 1968. (Rams doesn’t own a computer.)

If you listen to Rams… rather than just look at the elements of his edited world, you will appreciate how his aesthetic and his ethic align. “Less, but better,” the title of his 1995 book, is, Rams says, “not a constraint, it is an advantage which allows us more space for our real life.”…

Ive has always acknowledged his debt to Rams (he contributed a foreword to Sophie Lovell’s book “Dieter Rams: As Little Design as Possible”) but, as the Philadelphia exhibition text suggests, Ive, embedded within Apple’s upgrade cycle, may have missed the point: “the rapid obsolescence and environmental impact of these devices sits uneasily against Rams’s advocacy of long-lasting, durable design.”

The minute design twitches of each year’s Apple launch are a far cry from the revolutionary change that the click wheel ushered in. Newfangled ports, rose-gold backs, the elimination of the home button—these don’t change our relationships to our phones, except to annoy…

The provocative story in full: “What we learned from Dieter Rams, and what we’ve ignored.”

Rams’ “Ten Principles of Design.”

* Dieter Rams

###

As we savor simplicity, we might send well-designed birthday greetings to Walter Dorwin Teague; he was born on this date in 1883.  An  industrial designer, architect, illustrator, graphic designer, writer, and entrepreneur, he is often called the “Dean of Industrial Design,” a field that he pioneered as a profession in the US, along with Norman Bel Geddes, Raymond Loewy, and Henry Dreyfuss.  He is widely known for his exhibition designs during the 1939-40 New York World’s Fair (including the Ford Building), and for his iconic product and package designs, from Eastman Kodak’s Bantam Special to the steel-legged Steinway piano.

[image: Walter Dorwin Teague] source

 

 

Written by LW

December 18, 2018 at 1:01 am
