(Roughly) Daily

Posts Tagged ‘history of technology’

“As long as art lives never shall I accept that men are truly dead”*…

From a self-portrait by Giorgio Vasari [source]

An appreciation of Giorgio Vasari’s seminal The Lives of the Most Excellent Painters, Sculptors, and Architects, the beginning of art history as we know it

I found Giorgio Vasari through Burckhardt and Barzun. The latter writes: “Vasari, impelled by the unexampled artistic outburst of his time, divided his energies between his profession of painter and builder in Florence and biographer of the modern masters in the three great arts of design. His huge collection of Lives, which is a delight to read as well as a unique source of cultural history, was an amazing performance in an age that lacked organized means of research. […] Throughout, Vasari makes sure that his reader will appreciate the enhanced human powers shown in the works that he calls ‘good painting’ in parallel with ‘good letters’.”

Vasari was mainly a painter, but also worked as an architect. He was not the greatest artist in the world, but he had a knack for ingratiating himself with the rich and powerful, so his career was quite successful. Besides painting, he also cared a lot about conservation: both the physical preservation of works and the conceptual preservation of the fame and biographies of artists. He gave a kind of immortality to many lost paintings and sculptures by describing them to us in his book.

His Lives are a collection of more than 180 biographies of Italian artists, starting with Cimabue (1240-1302) and reaching a climax with Michelangelo Buonarroti (1475-1564). They’re an invaluable resource, as there is very little information available about these people other than his book; his biography of Botticelli is 8 pages long, yet on Botticelli’s Wikipedia page, Vasari is mentioned 36 times…

Giorgio Vasari was one of the earliest philosophers of progress. Petrarch (1304-1374) invented the idea of the dark ages in order to explain the deficiencies of his own time relative to the ancients, and dreamt of a better future:

My fate is to live among varied and confusing storms. But for you perhaps, if as I hope and wish you will live long after me, there will follow a better age. This sleep of forgetfulness will not last forever. When the darkness has been dispersed, our descendants can come again in the former pure radiance.

To this scheme of ancient glory and medieval darkness, Vasari added a third—modern—age and gave it a name: rinascita. And within his rinascita, Vasari described an upward trajectory starting with Cimabue, and ending in a golden age beginning with eccentric Leonardo and crazed sex maniac Raphael, only to give way to the perfect Michelangelo in the end. It is a trajectory driven by the modern conception of the artist as an individual auteur, rather than a faceless craftsman.

The most benign Ruler of Heaven in His clemency turned His eyes to the earth, and, having perceived the infinite vanity of all those labours, the ardent studies without any fruit, and the presumptuous self-sufficiency of men, which is even further removed from truth than is darkness from light, and desiring to deliver us from such great errors, became minded to send down to earth a spirit with universal ability in every art and every profession.

This golden age was certainly no utopia, as 16th century Italy was ravaged by political turbulence, frequent plague, and incessant war. Many of the artists mentioned were at some point taken hostage by invading armies; Vasari himself had to rescue a part of Michelangelo’s David when it was broken off in the battle to expel the Medici from Florence.

And yet Vasari saw greatness in his time, and the entire book is structured around a narrative of artistic progress. He documented the spread of new technologies and techniques (such as the spread of oil painting, imported from the Low Countries), which—as an artist—he had an intimate understanding of.

This story of progress is paralleled with the rediscovery (and, ultimately, surpassing) of the ancients. It would take until the 17th century for the querelle des Anciens et des Modernes to really take off in France, but in Florence Vasari had already seen enough to decide the question in favor of his contemporaries—the essence of the Enlightenment is already present in 1550…

Art history, cultural history, tech history, the history of ideas– a review of Vasari’s The Lives of the Most Excellent Painters, Sculptors, and Architects by @AlvaroDeMenard.

* Giorgio Vasari


As we frame it up, we might recall that it was on this date in 1946 that the first Cannes Film Festival opened.  It had originally been scheduled for September, 1939 as an “answer” to the Venice Film Fest, which had become a propaganda vehicle for Mussolini and Hitler; but the outbreak of World War II occasioned a delay.


“English is flexible: you can jam it into a Cuisinart for an hour, remove it, and meaning will still emerge”*…

Plus ça change. The opening pages of The Lytille Childrenes Lytil Boke, an instructional book of table manners dating from around 1480 and written in Middle English. Amongst other directives, children are told Bulle not as a bene were in thi throote (Don’t burp as if you had a bean in your throat) and Pyke notte thyne errys nothyr thy nostrellys (Don’t pick your ears or nose).

To be honest, it is a mess…

English spelling is ridiculous. Sew and new don’t rhyme. Kernel and colonel do. When you see an ough, you might need to read it out as ‘aw’ (thought), ‘ow’ (drought), ‘uff’ (tough), ‘off’ (cough), ‘oo’ (through), or ‘oh’ (though). The ea vowel is usually pronounced ‘ee’ (weak, please, seal, beam) but can also be ‘eh’ (bread, head, wealth, feather). Those two options cover most of it – except for a handful of cases, where it’s ‘ay’ (break, steak, great). Oh wait, one more… there’s earth. No wait, there’s also heart.

The English spelling system, if you can even call it a system, is full of this kind of thing. Yet not only do most people raised with English learn to read and write it; millions of people who weren’t raised with English learn to use it too, to a very high level of accuracy.

Admittedly, for a non-native speaker, such mastery usually involves a great deal of confusion and frustration. Part of the problem is that English spelling looks deceptively similar to other languages that use the same alphabet but in a much more consistent way. You can spend an afternoon familiarising yourself with the pronunciation rules of Italian, Spanish, German, Swedish, Hungarian, Lithuanian, Polish and many others, and credibly read out a text in that language, even if you don’t understand it. Your pronunciation might be terrible, and the pace, stress and rhythm would be completely off, and no one would mistake you for a native speaker – but you could do it. Even French, notorious for the spelling challenges it presents learners, is consistent enough to meet the bar. There are lots of silent letters, but they’re in predictable places. French has plenty of rules, and exceptions to those rules, but they can all be listed on a reasonable number of pages.

English is in a different league of complexity. The most comprehensive description of its spelling – the Dictionary of the British English Spelling System by Greg Brooks (2015) – runs to more than 450 pages as it enumerates all the ways particular sounds can be represented by letters or combinations of letters, and all the ways particular letters or letter combinations can be read out as sounds.

From the early Middle Ages, various European languages adopted and adapted the Latin alphabet. So why did English end up with a far more inconsistent orthography than any other? The basic outline of the messy history of English is widely known: the Anglo-Saxon tribes bringing Old English in the 5th century, the Viking invasions beginning in the 8th century adding Old Norse to the mix, followed by the Norman Conquest of the 11th century and the French linguistic takeover. The moving and mixing of populations, the growth of London and the merchant class in the 13th and 14th centuries. The contact with the Continent and the balance among Germanic, Romance and Celtic cultural forces. No language Academy was established, no authority for oversight or intervention in the direction of the written form. English travelled and wandered and haphazardly tied pieces together. As the blogger James Nicoll put it in 1990, English ‘pursued other languages down alleyways to beat them unconscious and rifle their pockets for new vocabulary’.

But just how does spelling factor into all this? It wasn’t as if the rest of Europe didn’t also contend with a mix of tribes and languages. The remnants of the Roman Empire comprised Germanic, Celtic and Slavic communities spread over a huge area. Various conquests installed a ruling-class language in control of a population that spoke a different language: there was the Nordic conquest of Normandy in the 10th century (where they now write French with a pretty regular system); the Ottoman Turkish rule over Hungary in the 16th and 17th centuries (which now has very consistent spelling rules for Hungarian); Moorish rule in Spain in the 8th to 15th centuries (which also has very consistent spelling). True, other languages did have official academies and other government attempts at standardisation – but those interventions have largely only ever succeeded at implementing minor changes to existing systems in very specific areas. English wasn’t the only language to pick the pockets of others for useful words.

The answer to the weirdness of English has to do with the timing of technology. The rise of printing caught English at a moment when the norms linking spoken and written language were up for grabs, and so could be hijacked by diverse forces and imperatives that didn’t coordinate with each other, or cohere, or even have any distinct goals at all. If the printing press had arrived earlier in the life of English, or later, after some of the upheaval had settled, things might have ended up differently…

Why is English spelling so weird and unpredictable? Don’t blame the mix of languages; look to quirks of timing and technology: “Typos, tricks, and misprints,” from Arika Okrent (@arikaokrent).

* Douglas Coupland


As we muse on the mother tongue, we might spare a thought for a man who used it to wonderful effect: Seymour Wilson “Budd” Schulberg. The son of B. P. Schulberg (head of production at Paramount Pictures in its 1920s-30s heyday) and Adeline Jaffe Schulberg (who founded one of Hollywood’s most successful talent/literary agencies), Budd went into the family business, finding success as a screenwriter, television producer, novelist, and sports writer. He is probably best remembered for his novels What Makes Sammy Run? and The Harder They Fall, his Academy Award-winning screenplay for On the Waterfront, and his (painfully prescient) screenplay for A Face in the Crowd.


“You can never be overdressed or overeducated”*…

So many choices…

Take online courses from the world’s top universities for free. Below, you will find 1,700 free online courses from universities like Yale, MIT, Harvard, Oxford and more. Our site also features collections of Online Certificate Programs and Online Degree & Mini-Degree Programs

From Open Culture (@openculture), “1,700 Free Online Courses from Top Universities.”

A personal fave: MIT’s “Gödel, Escher, Bach: A Mental Space Odyssey.”

[image above: source]

* Oscar Wilde


As we hit the e-books, we might recall that it was on this date in 1922 that the United States paid tribute to the man instrumental in the technology that enables on-line education, Alexander Graham Bell…

There were more than 14 million telephones in the United States by the time Alexander Graham Bell died. For one minute on August 4, 1922, they were all silent.

The reason: Bell’s funeral. The American inventor was the first to patent telephone technology in the United States and founded the Bell Telephone System in 1877. Though Bell wasn’t the only person to invent “the transmission of speech by electrical wires,” writes Randy Alfred for Wired, achieving patent primacy in the United States allowed him to spend his life inventing. Even though the telephone changed the world, Bell didn’t stop there.

Bell died on August 2, 1922, just a few days after his 75th birthday. “As a mark of respect every telephone exchange in the United States and Canada closed for a minute when his funeral began around 6:30 p.m. Eastern Standard Time,” Alfred writes.

On the day of the funeral, The New York Times reported that Bell was also honored by advocates for deaf people. “Entirely apart from the monumental achievement of Professor Bell as the inventor of the telephone, his conspicuous work in [sic] behalf of the deaf of this country would alone entitle him to everlasting fame,” said Felix H. Levey, president of the Institution for the Improved Instruction of Deaf Mutes.

In fact, Bell spent much of his income from the telephone on helping deaf people. In 1880, three years after founding the Bell Telephone System, Bell founded the Volta Laboratory. The laboratory, originally called Volta Associates, capitalized on Bell’s work and the work of other sound pioneers. It made money by patenting new innovations for the gramophone and other recorded sound technologies. In 1887, Bell took his share of the money from the sale of gramophone patents and founded the Volta Bureau “as an instrument for the increase and diffusion of knowledge relating to the Deaf,” writes the National Park Service. Bell and the Volta Bureau continued to work for deaf rights throughout his life.

Volta Laboratory eventually became Bell Laboratories, which was home to many of the twentieth century’s communication innovations.



“If people had understood how patents would be granted when most of today’s ideas were invented, and had taken out patents, the industry would be at a complete standstill today.”*…

From the Wright Brothers’ patent filings (source)

… It’s illuminating to point out that all three transformative technologies of the twentieth century – aviation, the automobile, and the digital computer – started off in patent battles and required a voluntary suspension of hostilities (a collective decision to ignore patents) before the technology could truly take hold.

The Wright brothers won every patent case they fought, and it did them absolutely no good. The prospect of a fortune wasn’t what motivated them to build an airplane, but ironically enough they could have made a fortune had they just passed on the litigation. In 1905, the Wrights were five years ahead of any potential competitor, and possessed a priceless body of practical knowledge. Their trade secrets and accumulated experience alone would have made them the leaders in the field, especially if they had teamed up with Curtiss. Instead, they got to watch heavily government-subsidized programs in Europe take the technical lead in airplane design as American aviation stagnated.

If you are someone who believes that the Internet and computer software are a transformative technology on a par with aviation, you may find it interesting to note that there is now a patent cease-fire in effect in the world of software, the occasional high-profile infringement case notwithstanding. The reason for the cease-fire is simple: if companies like IBM, Xerox, and Sun were to begin fully enforcing their patent portfolios, it would mean an apocalypse of litigation for all software developers. Everyone understands that the health and growth of the Internet are contingent on ignoring the patent system as much as possible.

At the same time, more patents are being granted than ever before, for broader claims, and with an almost complete disregard for prior art. Entire companies – and not just legal firms – are basing business models on extracting money from the patent system without actually creating any products. And the boundaries of patent law are expanding. For the first time in history, it’s possible to patent pure mathematical ideas (in the form of software patents), or even biological entities. The SARS virus was patented shortly after being isolated for the first time.

But if the patent system doesn’t even work for the archetypal example – two inventors, working alone, who singlehandedly invent a major new technology – why do we keep it at all? Who really benefits, and who pays?…

Learning from (the unhappy experiences of) the Wright Brothers– Maciej Cegłowski explains why the U.S. patent system is counter-productive: “100 Years of Turbulence.” Eminently worthy of reading in full.

See also, Bruce Perens: “Software Patents vs. Free Software.”

* Bill Gates, Challenges and Strategy Memo, Microsoft, May 16, 1991


As we apply our intellects to intellectual property, we might recall that it was on this date in 1976 that Steve Jobs, Steve Wozniak, and Ronald Wayne signed the partnership agreement that established the company that would become– upon its incorporation on January 3, 1977– Apple Computer, Inc., a company all about the IP.

Wayne left the partnership eleven days later, relinquishing his ten percent share for $2,300.

Apple began in Steve Jobs’s parents’ home on Crist Drive in Los Altos, California. Although it is widely believed that the company was founded in the house’s garage, Apple co-founder Steve Wozniak called it “a bit of a myth”. Jobs and Wozniak did, however, move some operations to the garage when the bedroom became too crowded.


“The tribalizing power of the new electronic media, the way in which they return to us to the unified fields of the old oral cultures, to tribal cohesion and pre-individualist patterns of thought, is little understood”*…

Nokia was dominant in mobile phone sales from 1998 to around 2010. Nokia’s slogan: Connecting people.

It was amazing to connect with people in the late 90s/early 2000s. I don’t think we were lonely exactly. But maybe meeting people was somewhere between an opportunity, something novel, and, yes, a need – suddenly it was possible to find the right person, or the right community.

So, the zeitgeist of the early 2000s.

I ran across a previous zeitgeist in an article about Choose Your Own Adventure books. They appeared and became massively popular at the same time as text adventure computer games, but neither inspired the invention of the other. How? The real answer may lie far deeper in the cultural subconscious … in the zeitgeist of the 1980s.

1980s: you.

2000s: connection.

2020s: ?

Zeitgeists don’t lead and zeitgeists don’t follow.

I think when we spot some kind of macro trend in establishment consumer ads, it’s never going to be about presenting people with something entirely new. To resonate, it has to be familiar – the trajectory that the consumer is already on – but it also has to scratch an itch. The brand wants to be a helpful fellow traveller, if you like.

I wonder what the zeitgeist of the 2020s will be, or is already maybe. What deep human need will be simultaneously a comfort and an aspiration? There should be hints of it in popular culture already. (If I knew how to put my finger on it, I’d be an ad planner.)

If I had to guess then it would be something about belonging.

There was a hint of this in Reddit’s 5 second Super Bowl commercial, which went hard on one of their communities, r/WallStreetBets, ganging up to bring down hedge funds. Then we’ve got a couple of generations now who grew up with the idea of fandoms, and of course conspiracy theories like QAnon too. If you squint, you can kind of see this in the way Tesla operates: it’s a consumer brand but it’s also a passionate, combative cause.

Belonging to a tribe is about identity and strength, it’s solace and empowerment all at once. And also knowledge, certainty, and trust in an era of complexity, disinfo, and hidden agendas.

Given that backdrop, it’s maybe unsurprising that the trend in software is towards Discord servers and other virtual private neighbourhoods. But how else will this appear? And is it just the beginnings of something else, something bigger?

“1980s (you), 2000s (connection). What’s the 2020s zeitgeist?” From Matt Webb (@genmon)

* Marshall McLuhan


As we double down on diversity, we might send well-connected birthday greetings to Joseph Carl Robnett Licklider; he was born on this date in 1915. Better known as “J.C.R.” or “Lick,” he was a prominent figure in the development of computing and computer science. Considered the “Johnny Appleseed” of computing, he planted many of the seeds of computing in the digital age– especially via his idea of a universal computer network to easily transfer and retrieve information, which his successors developed into the internet.

Robert Taylor, founder of Xerox PARC‘s Computer Science Laboratory and Digital Equipment Corporation‘s Systems Research Center, noted that “most of the significant advances in computer technology—including the work that my group did at Xerox PARC—were simply extrapolations of Lick’s vision. They were not really new visions of their own. So he was really the father of it all.”


Written by (Roughly) Daily

March 11, 2021 at 1:01 am
