(Roughly) Daily

Posts Tagged ‘apple’

“By far, the greatest danger of Artificial Intelligence is that people conclude too early that they understand it”*…

 


 

Recently, OpenAI announced its latest breakthrough, GPT-2, a language model that can write essays to a prompt, answer questions, and summarize longer works– so successfully that OpenAI has said that it’s too dangerous to release the code (lest it result in “deepfake news” or other misleading mischief).

Scott Alexander contemplates the results.  His conclusion:

a brain running at 5% capacity is about as good as the best AI that the brightest geniuses working in the best-equipped laboratories in the greatest country in the world are able to produce in 2019. But:

We believe this project is the first step in the direction of developing large NLP systems without task-specific training data. That is, we are developing a machine language system in the generative style with no explicit rules for producing text. We hope for future collaborations between computer scientists, linguists, and machine learning researchers.

A boring sentiment from an interesting source: the AI wrote that when asked to describe itself. We live in interesting times.

His complete post, eminently worthy of reading in full: “Do Neural Nets Dream of Electric Hobbits?”

[image above, and another account of OpenAI’s creation: “OpenAI says its new robo-writer is too dangerous for public release“]

* Eliezer Yudkowsky

###

As we take the Turing Test, we might send elegantly-designed birthday greetings to Steve Jobs; he was born on this date in 1955.  While he is surely well-known to every reader here, let us note for the record that he was instrumental in developing the Macintosh, the computer that took Apple to unprecedented levels of success.  After leaving the company he started with Steve Wozniak, Jobs continued his personal computer development at his NeXT Inc.  In 1997, Jobs returned to Apple to lead the company into a new era based on NeXT technologies and consumer electronics.  Some of Jobs’ achievements in this new era include the iMac, the iPhone, the iTunes music store, the iPod, and the iPad.  Under Jobs’ leadership Apple was at one time the world’s most valuable company. (And, of course, he bought Pixar from George Lucas, and oversaw both its rise to animation dominance and its sale to Disney– as a product of which Jobs became Disney’s largest single shareholder.)

source

 

Written by LW

February 24, 2019 at 1:01 am

“Less, but better”*…

 


Dieter Rams is a giant among modern industrial designers; his clean, “functionalist” work inspired legions of successors– most visibly through his work for Braun, which clearly shaped the thinking of Steve Jobs and Jony Ive (and before him, the folks at Frog Design)– perhaps most noticeably in the design of the iPod…


Rams’ 1958 portable radio for Braun, alongside the iPod

source

But have we, in fact, learned the lessons that Rams worked so hard to teach?

Is it perfect timing or merely perverse to release a documentary promoting the design philosophy “Less, but better” during the holiday season? The opening moments of Gary Hustwit’s “Rams,” about Dieter Rams, is more likely to have you revising your gift list than tossing it out. As the camera pauses on the details of the eighty-six-year-old design legend’s single-story home, built in 1971 in Kronberg, Germany, you may find yourself wondering if you, too, need to buy a wall-mounted stereo (the Audio 2/3, designed by Rams for Braun, in 1962-1963) or a boxy leather swivel chair (the 620 armchair, designed by Rams for Vitsoe, in 1962), or to take up the art of bonsai, which Rams practices in his compact, Japanese-inspired garden.

After this montage, the sound of birds chirping is replaced by the sound of typing, and we see Rams seated in front of the rare object in his home that’s not of his own design: the red Valentine typewriter, designed by Ettore Sottsass and Perry King for Olivetti, in 1968. (Rams doesn’t own a computer.)

If you listen to Rams… rather than just look at the elements of his edited world, you will appreciate how his aesthetic and his ethic align. “Less, but better,” the title of his 1995 book, is, Rams says, “not a constraint, it is an advantage which allows us more space for our real life.”…

Ive has always acknowledged his debt to Rams (he contributed a foreword to Sophie Lovell’s book “Dieter Rams: As Little Design as Possible”) but, as the Philadelphia exhibition text suggests, Ive, embedded within Apple’s upgrade cycle, may have missed the point: “the rapid obsolescence and environmental impact of these devices sits uneasily against Rams’s advocacy of long-lasting, durable design.”

The minute design twitches of each year’s Apple launch are a far cry from the revolutionary change that the click wheel ushered in. Newfangled ports, rose-gold backs, the elimination of the home button—these don’t change our relationships to our phones, except to annoy…

The provocative story in full: “What we learned from Dieter Rams, and what we’ve ignored.”

Rams’ “Ten Principles of Design.”

* Dieter Rams

###

As we savor simplicity, we might send well-designed birthday greetings to Walter Dorwin Teague; he was born on this date in 1883.  An  industrial designer, architect, illustrator, graphic designer, writer, and entrepreneur, he is often called the “Dean of Industrial Design,” a field that he pioneered as a profession in the US, along with Norman Bel Geddes, Raymond Loewy, and Henry Dreyfuss.  He is widely known for his exhibition designs during the 1939-40 New York World’s Fair (including the Ford Building), and for his iconic product and package designs, from Eastman Kodak’s Bantam Special to the steel-legged Steinway piano.

source

 

 

Written by LW

December 18, 2018 at 1:01 am

“In any field, it is easy to see who the pioneers are — they are the ones lying face down with arrows in their backs”*…

 

The story of Vector Graphic, a personal computer company that outran Apple in their early days: “How Two Bored 1970s Housewives Helped Create the PC Industry.”

* Anonymous

###

As we try to remember what “CP/M” stood for, we might recall that it was on this date in 1991 (on the anniversary of the issuing of IBM’s first patent in 1911) that Microsoft Corp. for the first time reported revenues of more than $1 billion for its fiscal year (1990), the first software company ever to achieve that scale.  While in this age of “unicorns,” a billion dollars in revenue seems a quaint marker, that was real money at the time.

As readers who followed the link above will know, Microsoft, founded in 1975, was an early purveyor of software for the CP/M operating system on which the Vector ran; but (unlike Vector) Gates and Allen embraced IBM’s new architecture, creating DOS (for younger readers: the forerunner of Windows)… and laying the foundation for Microsoft’s extraordinary growth.

Bill Gates in 1990

 source

 

Written by LW

July 25, 2015 at 1:01 am

“I _am_ big. It’s the _pictures_ that got small”*…

 

Apple paid $10 billion to developers in calendar 2014– thus, iOS app developers earned more than Hollywood did from box office in the U.S.  Of course, Hollywood studios make money in foreign theaters, in cable, in home video, and in digital.  But, as Horace Dediu observes…

Apple’s App Store billings is not the complete App revenue picture either. The Apps economy includes Android and ads and service businesses and custom development.  Including all revenues, apps are still likely to be bigger than Hollywood.

But there’s more to the story. It’s also likely that the App industry is healthier. On an individual level, some App developers earn more than Hollywood stars, and I would guess that the median income of app developers is higher than the median income of actors [a large majority of whom earn less than $1,000 a year from acting jobs]. The app economy sustains more jobs (627,000 iOS jobs in the US vs. 374,000 in Hollywood) and is easier to enter and has wider reach. As the graph [above] shows, it’s also growing far more rapidly…

Grab some popcorn and read the rest at “Bigger Than Hollywood.”

* “Norma Desmond” (Gloria Swanson) in Billy Wilder’s Sunset Boulevard

###

As we disable in-app purchases, we might recall that it was on this date in 1931 that City Lights premiered.  Written and directed by its star, Charlie Chaplin, the film follows Chaplin’s “Tramp” character as he falls in love with a blind woman (Virginia Cherrill).  Though sound films– “talkies”– were the rage at the time, Chaplin produced City Lights as a scored silent– for which he composed the music himself.  It was a huge success on its release, grossing over $5 million (roughly $75 million in 2015 dollars).  And it has grown in critical stature ever since:  In 1992, the Library of Congress selected City Lights for preservation in the United States National Film Registry; then in 2007, the American Film Institute‘s 100 Years… 100 Movies ranked City Lights as the 11th greatest American film of all time.  The critic James Agee referred to the final scene in the film as the “greatest single piece of acting ever committed to celluloid.”

 source

 

Written by LW

January 30, 2015 at 1:01 am

Finding a higher use for those left-over Easter eggs…

From the always-inspirational Instructables, and user bbstudio (among whose passions is carving that natural geometric marvel, the egg shell, as above):

This was done simply to discover if I could do it. I went through a stage where my goal was to remove as much material from an egg shell as possible while still retaining the shape and image of the egg.

More views of this minimalist marvel here; links to more views of the scrimshaw egg shell, and to other contra-seasonal sensations here.

As we gratefully put away the Rit dye, we might recall that it was on this date in 1961 that Robert Noyce was awarded the patent for the integrated circuit that changed electronics.  Readers may recall that Jack Kilby had (separately and independently) patented the integrated circuit earlier than Noyce— and won a Nobel Prize for it.  But Noyce’s design (rooted in silicon, as opposed to the germanium that Kilby used) was more practical… and paved the way for an altogether new kind of “Easter egg.”

Noyce made his breakthrough at Fairchild Semiconductor, of which he was a founding member.  He went on to co-found Intel, then to serve as the unofficial “Mayor of Silicon Valley,” a mentor to scores of tech entrepreneurs– including Steve Jobs.

Noyce with a print of his integrated circuit (source: BBC)

Infinitely Flat(land)…

source

Readers may recall your correspondent’s respect and affection for the extraordinary novella Flatland: A Romance of Many Dimensions— so they won’t be surprised that he’s excited to discover the work of Vi Hart.

Hart is an artist and composer with a gift for using mundane materials (like balloons) to illustrate abstruse concepts.  Her most recent creation is a wonderful animation of Flatland…  on a Möbius strip.

[TotH to BrainPickings]

As we give up our search for a beginning or an end, we might recall that it was on this date in 1984– two days after it was introduced in an epoch-making commercial during Super Bowl XVIII– that the first Apple Macintosh went on sale.

source

 

Written by LW

January 24, 2011 at 1:01 am

No more pencils, no more books, no more teacher’s dirty looks…

Exam season draws to a close; the glorious expanse of Summer beckons.   As one turns one’s papers face down for the last time this term, a look back at some of the more creative exam answers of the year:

More at Creative Test Answers

As we trade White-Out for sun screen, we might recall that it was on this date in 1977 that the Apple II– the first practical personal computer– went on sale.  The II had been unveiled six weeks earlier at the first West Coast Computer Faire, generally considered the “birthplace of the personal computer industry,” as both the Apple II and the Commodore PET were introduced there.

Apple II, with floppy drives and a monitor
