(Roughly) Daily

Posts Tagged ‘history’

“People in any organization are always attached to the obsolete – the things that should have worked but did not, the things that once were productive and no longer are”*…

Ed Zitron argues that America has too many managers, and misbehaving managers at that…

In a 2016 Harvard Business Review analysis, two writers calculated the annual cost of excess corporate bureaucracy as about $3 trillion, with an average of one manager per every 4.7 workers. Their story mentioned several case studies—a successful GE plant with 300 technicians and a single supervisor, a Swedish bank with 12,000 workers and three levels of hierarchy—that showed that reducing the number of managers usually led to more productivity and profit. And yet, at the time of the story, 17.6 percent of the U.S. workforce (and 30 percent of the workforce’s compensation) was made up of managers and administrators—an alarming statistic that shows how bloated America’s management ranks had become.

The United States, more than anywhere else in the world, is addicted to the concept of management. As I’ve written before, management has become a title rather than a discipline. We have a glut of people in management who were never evaluated on their ability to manage before being promoted to their role. We have built corporate America around the idea that if you work hard enough, one day you might become a manager, someone who makes rather than takes orders. While this is not the only form of management, based on the response to my previous article and my newsletters on the subject, this appears to be how many white-collar employees feel. Across disparate industries, an overwhelming portion of management personnel is focused more on taking credit and placing blame rather than actually managing people, with dire consequences.

This type of “hall monitor” management, as a practice, is extremely difficult to execute remotely, and thus the coming shift toward permanent all- or part-remote work will lead to a dramatic rethinking of corporate structure. Many office workers—particularly those in industries that rely on the skill or creativity of day-to-day employees—are entering a new world where bureaucracy will be reduced not because executives have magically become empathetic during the pandemic, but because slowing down progress is bad business. In my eyes, that looks like a world in which the power dynamics of the office are inverted. With large swaths of people working from home some or all of the time, managers will be assessed not on their ability to intimidate other people into doing things, but on their ability to provide their workers with the tools they need to measurably succeed at their job.

In order to survive, managers, in other words, will need to start proving that they actually do something. What makes this shift all the more complicated is that many 21st-century, white-collar employees don’t necessarily need a hands-on manager to make sure they get their work done…

The pandemic has laid bare that corporate America disrespects entry-level workers. At many large companies, the early years of your career are a proving ground with little mentorship and training. Too many companies hand out enormous sums to poach people trained elsewhere, while ignoring the way that the best sports teams tend to develop stars—by taking young, energetic people and investing in their future (“trust the process,” etc.). This goes beyond investing in education and courses; it involves taking rising stars in your profession and working to make them as good as your top performer.

In a mostly remote world, a strong manager is someone who gets the best out of the people they’re managing, and sees the forest from the trees—directing workers in a way that’s informed by both experience and respect. Unfortunately, the traditional worker-to-manager pipeline often sets people up for inefficiency and failure. It’s the equivalent of taking a pitcher in their prime and making them a coach—being good at one thing doesn’t mean you can make other people good at the same thing. This is known as the Peter principle, a management concept developed by Laurence J. Peter in the late ’60s that posits that a person who’s good at their job in a hierarchical organization will invariably be promoted to a position that requires different skills, until they’re eventually promoted to something they can’t do, at which point they’ve reached their “maximum incompetence.” Consistent evidence shows that the principle is real: A study of sales workers at 214 firms by the National Bureau of Economic Research found that firms prioritize current job performance in promotion decisions over whether the person can actually do the job for which they’re being considered. In doing so, they’re placing higher value on offering the incentive of promotion to get more out of their workers, at the cost of potentially injecting bad management into their organization.

What I’m talking about here is a fundamental shift in how we view talent in the workplace. Usually, when someone is good at their job, they are given a soft remit to mentor people, but rarely is that formalized into something that is mutually beneficial. A lack of focus on fostering talent is counterintuitive, and likely based on a level of fear that one could train one’s own replacement, or that a business could foster its own competition. This is a problem that could be solved by paying people more money for being better at their job. Growing talent is also a more sustainable form of business—one that harkens back to the days of apprenticeships—where you’re fostering and locking up talent so that it doesn’t go elsewhere, and doesn’t cost you time and money to have to recruit it (or onboard it, which costs, on average, more than $4,000 a person). Philosophically, it changes organizations from a defensive position (having to recruit to keep up) to an offensive position (building an organization from within), and also greatly expands an organization’s ability to scale affordably…

The problem is that modern American capitalism has equated “getting the most out of someone” with “getting the most hours out of them,” rather than getting the most value out of them. “Success,” as I’ve discussed before, is worryingly disconnected from actually succeeding in business.

Reducing bureaucracy is also a net positive for the labor market, especially for young people. Entry-level corporate work is extremely competitive and painful, a years-long process in which you’re finding your footing in an industry and an organization. If we can change the lens through which we view those new to the workforce—as the potential hotshots of the future, rather than people who have to prove themselves—we’ll have stronger organizations that waste less money. We should be trying to distill and export the talents of our best performers, and give them what they need to keep doing great things for our companies while also making their colleagues better too.

All of this seems inevitable, to me, because a remote future naturally reconfigures the scaffolding of how work is done and how workers are organized. The internet makes the world a much smaller place, which means that simple things such as keeping people on task don’t justify an entire position—but mentorship and coaching that can get the best out of each worker do.

Hopefully we can move beyond management as a means of control, and toward a culture that appreciates a manager who fosters and grows the greatness in others.

The pandemic has exposed a fundamental weakness in the system: “Say Goodbye to Your Manager,” from @edzitron.

* Peter Drucker

###

As we reorganize, we might recall that it was on this date in 1852 that Henri Giffard made the first powered and controlled flight of an airship, traveling 27 km from Paris to Élancourt in his “Giffard dirigible.”

Airships were the first aircraft capable of controlled powered flight, and were most commonly used before the 1940s, largely floated with (highly-flammable) hydrogen gas. Their use decreased as their capabilities were surpassed by those of airplanes, and then plummeted after a series of high-profile accidents: the 1930 crash and burning of the British R101 in France; the 1933 and 1935 storm-related crashes of the U.S. Navy’s twin helium-filled rigid airborne aircraft carriers, the USS Akron and the USS Macon; and– most famously– the 1937 burning of the German hydrogen-filled Hindenburg.

The Giffard dirigible [source]

Written by (Roughly) Daily

September 24, 2021 at 1:00 am

“Facts are facts and will not disappear on account of your likes”*…

This artist’s rendering provided by the European Southern Observatory shows some of the 32 new planets astronomers found outside our solar system

… That said, some facts may morph out from under us. In consideration of “in-between” facts:

When people think of knowledge, they generally think of two sorts of facts: facts that don’t change, like the height of Mount Everest or the capital of the United States, and facts that fluctuate constantly, like the temperature or the stock market close.

But in between there is a third kind: facts that change slowly. These are facts which we tend to view as fixed, but which shift over the course of a lifetime. For example: What is Earth’s population? I remember learning 6 billion, and some of you might even have learned 5 billion. Well, it turns out it’s about 6.8 billion.

Or, imagine you are considering relocating to another city. Not recognizing the slow change in the economic fortunes of various metropolitan areas, you immediately dismiss certain cities. For example, Pittsburgh, a city in the core of the historic Rust Belt of the United States, was for a long time considered to be something of a city to avoid. But recently, its economic fortunes have changed, swapping steel mills for technology, with its job growth ranked sixth in the entire United States.

These slow-changing facts are what I term “mesofacts.” Mesofacts are the facts that change neither too quickly nor too slowly, that lie in this difficult-to-comprehend middle, or meso-, scale. Often, we learn these in school when young and hold onto them, even after they change. For example, if, as a baby boomer, you learned high school chemistry in 1970, and then, as we all are apt to do, did not take care to brush up on your chemistry periodically, you would not realize that there are 12 new elements in the Periodic Table. Over a tenth of the elements have been discovered since you graduated high school! While this might not affect your daily life, it is astonishing and a bit humbling.

For these kinds of facts, the analogy of how to boil a frog is apt: Change the temperature quickly, and the frog jumps out of the pot. But slowly increase the temperature, and the frog doesn’t realize that things are getting warmer, until it’s been boiled. So, too, is it with humans and how we process information. We recognize rapid change, whether it’s as simple as a fast-moving object or living with the knowledge that humans have walked on the moon. But anything short of large-scale rapid change is often ignored. This is the reason we continue to write the wrong year during the first days of January.

Our schools are biased against mesofacts. The arc of our educational system is to be treated as little generalists when children, absorbing bits of knowledge about everything from biology to social studies to geology. But then, as we grow older, we are encouraged to specialize. This might have been useful in decades past, but in our increasingly fast-paced and interdisciplinary world, lacking an even approximate knowledge of our surroundings is unwise.

Updating your mesofacts can change how you think about the world. Do you know the percentage of people in the world who use mobile phones? In 1997, the answer was 4 percent. By 2007, it was nearly 50 percent. The fraction of people who are mobile phone users is the kind of fact you might read in a magazine and quote at a cocktail party. But years later the number you would be quoting would not just be inaccurate, it would be seriously wrong. The difference between a tiny fraction of the world and half the globe is startling, and completely changes our view on global interconnectivity.

Mesofacts can also be fun. Let’s focus for a moment on some mesofacts that can be of vital importance if you’re a child, or parent of a child: those about dinosaurs. Just a few decades ago, dinosaurs were thought to be cold-blooded, slow-witted lizards that walked with their legs splayed out beside them. Now, scientists think that many dinosaurs were warm-blooded and fast-moving creatures. And they even had feathers! Just a few weeks ago we learned about the color patterns of dinosaurs (stripes! with orange tufts!). These facts might not affect how you live your life, but then again, you’re probably not 6 years old. There is another mesofact that is unlikely to affect your daily routine, but might win you a bar bet: the number of planets known outside the solar system. After the first extrasolar planet around an ordinary star made headlines back in 1995, most people stopped paying attention. Well, the number of extrasolar planets is currently over 400. Know this, and the next round won’t be on you.

The fact that the world changes rapidly is exciting, but everyone knows about that. There is much change that is neither fast nor momentous, but no less breathtaking.

Introducing the mesofact: “Warning- Your reality is out of date,” from Samuel Arbesman (@arbesman) who went on to develop this notion in a wonderful book, The Half-Life of Facts. Via @inevernu who notes that the above article, which ran in 2010, contains examples of mesofacts that have already changed again– illustrating Arbesman’s point…

* Jawaharlal Nehru

###

As we noodle on knowledge, we might recall that it was on this date in 1642 that the first American college commencement ceremony was held at Harvard College. It was North America’s first taste of non-religious ritual– and was designed to send a clear message to England that its American colonies were a going concern. Still, of the nine seniors who graduated, three soon crossed the Atlantic the other way, one to serve as a diplomat for the rebellious Oliver Cromwell and another to study medicine in Italy.

Apropos the piece above, the curriculum followed by those graduates was rather different– filled with different facts– from that of classes in later centuries.

source

Written by (Roughly) Daily

September 23, 2021 at 1:00 am

“I’m a little tea pot / Short and stout”*…

The original Utah teapot, currently on display at the Computer History Museum in Mountain View, California.

The fascinating story of the “Utah teapot,” the ur-object in the development of computer graphics…

This unassuming object—the “Utah teapot,” as it’s affectionately known—has had an enormous influence on the history of computing, dating back to 1974, when computer scientist Martin Newell was a Ph.D. student at the University of Utah.

The U of U was a powerhouse of computer graphics research then, and Newell had some novel ideas for algorithms that could realistically display 3D shapes—rendering complex effects like shadows, reflective textures, or rotations that reveal obscured surfaces. But, to his chagrin, he struggled to find a digitized object worthy of his methods. Objects that were typically used for simulating reflections, like a chess pawn, a donut, and an urn, were too simple.

One day over tea, Newell told his wife Sandra that he needed more interesting models. Sandra suggested that he digitize the shapes of the tea service they were using, a simple Melitta set from a local department store. It was an auspicious choice: The curves, handle, lid, and spout of the teapot all conspired to make it an ideal object for graphical experiment. Unlike other objects, the teapot could, for instance, cast a shadow on itself in several places. Newell grabbed some graph paper and a pencil, and sketched it.

Back in his lab, he entered the sketched coordinates—called Bézier control points, first used in the design of automobile bodies—on a Tektronix storage tube, an early text and graphics computer terminal. The result was a lovely virtual teapot, more versatile (and probably cuter) than any 3D model to date.
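[A technical aside: the teapot’s surface is stored as 4-by-4 grids of those Bézier control points, i.e., bicubic patches. The sketch below, in Python, is not Newell’s code, just a minimal illustration of how one such patch turns sixteen control points into points on a smooth surface; the control grid it uses is a made-up flat placeholder, not the actual teapot data.]

```python
from math import comb


def bernstein3(i, t):
    """Cubic Bernstein basis polynomial B_{i,3}(t)."""
    return comb(3, i) * (t ** i) * ((1 - t) ** (3 - i))


def bezier_patch_point(control_points, u, v):
    """Evaluate a bicubic Bezier patch at (u, v), each in [0, 1].

    control_points is a 4x4 grid of (x, y, z) tuples -- the same kind of
    data Newell read off his graph-paper sketch.
    """
    x = y = z = 0.0
    for i in range(4):
        for j in range(4):
            w = bernstein3(i, u) * bernstein3(j, v)  # tensor-product weight
            px, py, pz = control_points[i][j]
            x, y, z = x + w * px, y + w * py, z + w * pz
    return (x, y, z)


# Placeholder control grid (a flat sheet) just to show the call;
# real teapot patches curve in all three coordinates.
grid = [[(float(i), float(j), 0.0) for j in range(4)] for i in range(4)]
print(bezier_patch_point(grid, 0.5, 0.5))  # -> (1.5, 1.5, 0.0), the patch centre
```

[Sampling u and v on a fine grid over each patch is one common way to turn control points like these into a mesh a renderer can shade; the full teapot is stitched together from a few dozen such patches.]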

The new model was particularly appealing to Newell’s colleague, Jim Blinn [of whom Ivan Sutherland, the head of the program at Utah and a computer graphics pioneer said, “There are about a dozen great computer graphics people and Jim Blinn is six of them”]. One day, demonstrating how his software could adjust an object’s height, Blinn flattened the teapot a bit, and decided he liked the look of that version better. The distinctive Utah teapot was born.

The computer model proved useful for Newell’s own research, featuring prominently in his next few publications. But he and Blinn also took the important step of sharing their model publicly. As it turned out, other researchers were also starved for interesting 3D models, and the digital teapot was exactly the experimental test bed they needed. At the same time, the shape was simple enough for Newell to input and for computers to process. (Rumor has it some researchers even had the data points memorized!) And unlike many household items, like furniture or fruit-in-a-bowl, the teapot’s simulated surface looked realistic without superimposing an artificial, textured pattern.

The teapot quickly became a beloved staple of the graphics community. Teapot after teapot graced the pages and covers of computer graphics journals.  “Anyone with a new idea about rendering and lighting would announce it by first trying it out on a teapot,” writes animator Tom Sito in Moving Innovation...

These days, the Utah teapot has achieved legendary status. It’s a built-in shape in many 3D graphics software packages used for testing, benchmarking, and demonstration. Graphics geeks like to sneak it into scenes and games as an in-joke, an homage to their countless hours of rendering teapots; hence its appearances in Windows, Toy Story, and The Simpsons

Over the past few years, the teapot has been 3D printed back into the physical world, both as a trinket and as actual china. Pixar even made its own music video in honor of the teapot, titled “This Teapot’s Made for Walking,” and a teapot wind-up toy as a promotion for its Renderman software.

Newell has jokingly lamented that, despite all his algorithmic innovations, he’ll be remembered primarily for “that damned teapot.” But as much as computer scientists try to prove their chops by inventing clever algorithms, test beds for experimentation often leave a bigger mark. Newell essentially designed the model organism of computer graphics: to graphics researchers as lab mice are to biologists.

For the rest of us the humble teapot serves as a reminder that, in the right hands, something simple can become an icon of creativity and hidden potential…

How a humble serving piece shaped a technological domain: “The Most Important Object In Computer Graphics History Is This Teapot,” from Jesse Dunietz (@jdunietz)

* from “I’m a Little Tea Pot,” a 1939 novelty song by George Harold Sanders and Clarence Z. Kelley

###

As we muse on models, we might send foundational birthday greetings to Michael Faraday; he was born on this date in 1791. One of the great experimental scientists of all time, Faraday made huge contributions to the study of electromagnetism and electrochemistry.

Although Faraday received little formal education, he was one of the most influential scientists in history. It was by his research on the magnetic field around a conductor carrying a direct current that Faraday established the basis for the concept of the electromagnetic field in physics. Faraday also established that magnetism could affect rays of light and that there was an underlying relationship between the two phenomena. He similarly discovered the principles of electromagnetic induction and diamagnetism, and the laws of electrolysis. His inventions of electromagnetic rotary devices formed the foundation of electric motor technology, and it was largely due to his efforts that electricity became practical for use in technology [including, of course, computing and computer graphics].

As a chemist, Faraday discovered benzene, investigated the clathrate hydrate of chlorine, invented an early form of the Bunsen burner and the system of oxidation numbers, and popularised terminology such as “anode”, “cathode”, “electrode” and “ion”. Faraday ultimately became the first and foremost Fullerian Professor of Chemistry at the Royal Institution, a lifetime position.

Faraday was an excellent experimentalist who conveyed his ideas in clear and simple language; his mathematical abilities, however, did not extend as far as trigonometry and were limited to the simplest algebra. James Clerk Maxwell took the work of Faraday and others and summarized it in a set of equations which is accepted as the basis of all modern theories of electromagnetic phenomena. On Faraday’s uses of lines of force, Maxwell wrote that they show Faraday “to have been in reality a mathematician of a very high order – one from whom the mathematicians of the future may derive valuable and fertile methods.”…

Albert Einstein kept a picture of Faraday on his study wall, alongside pictures of Arthur Schopenhauer and James Clerk Maxwell. Physicist Ernest Rutherford stated, “When we consider the magnitude and extent of his discoveries and their influence on the progress of science and of industry, there is no honour too great to pay to the memory of Faraday, one of the greatest scientific discoverers of all time.”

Wikipedia

source

“This potential possibility need only play a role as a counterfactual, according to quantum theory, for it to have an actual effect!”*…

Contemplate counterfactuals: things that have not happened — but could happen — a neglected area of scientific theory…

If you could soar high in the sky, as red kites often do in search of prey, and look down at the domain of all things known and yet to be known, you would see something very curious: a vast class of things that science has so far almost entirely neglected. These things are central to our understanding of physical reality, both at the everyday level and at the level of the most fundamental phenomena in physics — yet they have traditionally been regarded as impossible to incorporate into fundamental scientific explanations. They are facts not about what is — the ‘actual’ — but about what could or could not be. In order to distinguish them from the actual, they are called counterfactuals.

Suppose that some future space mission visited a remote planet in another solar system, and that they left a stainless-steel box there, containing among other things the critical edition of, say, William Blake’s poems. That the poetry book is subsequently sitting somewhere on that planet is a factual property of it. That the words in it could be read is a counterfactual property, which is true regardless of whether those words will ever be read by anyone. The box may never be found; and yet that those words could be read would still be true — and laden with significance. It would signify, for instance, that a civilization visited the planet, and much about its degree of sophistication.

To further grasp the importance of counterfactual properties, and their difference from actual properties, imagine a computer programmed to produce on its display a string of zeroes. That is a factual property of the computer, to do with its actual state — with what is. The fact that it could be reprogrammed to output other strings is a counterfactual property of the computer. The computer may never be so programmed; but the fact that it could is an essential fact about it, without which it would not qualify as a computer.

The counterfactuals that matter to science and physics, and that have so far been neglected, are facts about what could or could not be made to happen to physical systems; about what is possible or impossible. They are fundamental because they express essential features of the laws of physics — the rules that govern every system in the universe. For instance, a counterfactual property imposed by the laws of physics is that it is impossible to build a perpetual motion machine. A perpetual motion machine is not simply an object that moves forever once set into motion: it must also generate some useful sort of motion. If this device could exist, it would produce energy out of no energy. It could be harnessed to make your car run forever without using fuel of any sort. Any sequence of transformations turning something without energy into something with energy, without depleting any energy supply, is impossible in our universe: it could not be made to happen, because of a fundamental law that physicists call the principle of conservation of energy.

Another significant counterfactual property of physical systems, central to thermodynamics, is that a steam engine is possible. A steam engine is a device that transforms energy of one sort into energy of a different sort, and it can perform useful tasks, such as moving a piston, without ever violating that principle of conservation of energy. Actual steam engines (those that have been built so far) are factual properties of our universe. The possibility of building a steam engine, which existed long before the first one was actually built, is a counterfactual.

So the fundamental types of counterfactuals that occur in physics are of two kinds: one is the impossibility of performing a transformation (e.g., building a perpetual motion machine); the other is the possibility of performing a transformation (e.g., building a steam engine). Both are cardinal properties of the laws of physics; and, among other things, they have crucial implications for our endeavours: no matter how hard we try, or how ingeniously we think, we cannot bring about transformations that the laws of physics declare to be impossible — for example, creating a perpetual motion machine. However, by thinking hard enough, we can come up with more and better ways of performing a possible transformation — for instance, that of constructing a steam engine — which can then improve over time.

In the prevailing scientific worldview, counterfactual properties of physical systems are unfairly regarded as second-class citizens, or even excluded altogether. Why? It is because of a deep misconception, which, paradoxically, originated within my own field, theoretical physics. The misconception is that once you have specified everything that exists in the physical world and what happens to it — all the actual stuff — then you have explained everything that can be explained. Does that sound indisputable? It may well. For it is easy to get drawn into this way of thinking without ever realising that one has swallowed a number of substantive assumptions that are unwarranted. For you can’t explain what a computer is solely by specifying the computation it is actually performing at a given time; you need to explain what the possible computations it could perform are, if it were programmed in possible ways. More generally, you can’t explain the presence of a lifeboat aboard a pirate ship only in terms of an actual shipwreck. Everyone knows that the lifeboat is there because of a shipwreck that could happen (a counterfactual explanation). And that would still be the reason even if the ship never did sink!

Despite regarding counterfactuals as not fundamental, science has been making rapid, relentless progress, for example, by developing new powerful theories of fundamental physics, such as quantum theory and Einstein’s general relativity; and novel explanations in biology — with genetics and molecular biology — and in neuroscience. But in certain areas, it is no longer the case. The assumption that all fundamental explanations in science must be expressed only in terms of what happens, with little or no reference to counterfactuals, is now getting in the way of progress. For counterfactuals are essential to a number of things that are currently explained only vaguely in science, or not explained at all. Counterfactuals are central to an exact, unified theory of heat, work, and information (both classical and quantum); to explain matters such as the appearance of design in living things; and to a scientific explanation of knowledge…

An excerpt from Chiara Marletto‘s The Science of Can and Can’t: A Physicist’s Journey Through the Land of Counterfactuals, via the invaluable @delanceyplace.

[Image above: source]

* Roger Penrose, Shadows of the Mind: A Search for the Missing Science of Consciousness

###

As we ponder the plausible, we might send superlatively speculative birthday greetings to an accomplished counterfactualist, H.G. Wells; he was born on this date in 1866. A prolific writer of novels, history, political and social commentary, textbooks, and rules for war games, Wells is best remembered (with Jules Verne and Hugo Gernsback) as “the father of science fiction” for his “scientific romances”– The War of the Worlds, The Time Machine, The Invisible Man, The Island of Doctor Moreau, et al.

 source

“As long as art lives never shall I accept that men are truly dead”*…

From a self-portrait by Giorgio Vasari [source]

An appreciation of Giorgio Vasari’s seminal The Lives of the Most Excellent Painters, Sculptors, and Architects, the beginning of art history as we know it…

I found Giorgio Vasari through Burckhardt and Barzun. The latter writes: “Vasari, impelled by the unexampled artistic outburst of his time, divided his energies between his profession of painter and builder in Florence and biographer of the modern masters in the three great arts of design. His huge collection of Lives, which is a delight to read as well as a unique source of cultural history, was an amazing performance in an age that lacked organized means of research. […] Throughout, Vasari makes sure that his reader will appreciate the enhanced human powers shown in the works that he calls ‘good painting’ in parallel with ‘good letters’.”

Vasari was mainly a painter, but also worked as an architect. He was not the greatest artist in the world, but he had a knack for ingratiating himself with the rich and powerful, so his career was quite successful. Besides painting, he also cared a lot about conservation: both the physical preservation of works and the conceptual preservation of the fame and biographies of artists. He gave a kind of immortality to many lost paintings and sculptures by describing them to us in his book.

His Lives are a collection of more than 180 biographies of Italian artists, starting with Cimabue (1240-1302) and reaching a climax with Michelangelo Buonarroti (1475-1564). They’re an invaluable resource, as there is very little information available about these people other than his book; his biography of Botticelli is 8 pages long, yet on Botticelli’s wikipedia page, Vasari is mentioned 36 times…

Giorgio Vasari was one of the earliest philosophers of progress. Petrarch (1304-1374) invented the idea of the dark ages in order to explain the deficiencies of his own time relative to the ancients, and dreamt of a better future:

My fate is to live among varied and confusing storms. But for you perhaps, if as I hope and wish you will live long after me, there will follow a better age. This sleep of forgetfulness will not last forever. When the darkness has been dispersed, our descendants can come again in the former pure radiance.

To this scheme of ancient glory and medieval darkness, Vasari added a third—modern—age and gave it a name: rinascita. And within his rinascita, Vasari described an upward trajectory starting with Cimabue, and ending in a golden age beginning with eccentric Leonardo and crazed sex maniac Raphael, only to give way to the perfect Michelangelo in the end. It is a trajectory driven by the modern conception of the artist as an individual auteur, rather than a faceless craftsman.

The most benign Ruler of Heaven in His clemency turned His eyes to the earth, and, having perceived the infinite vanity of all those labours, the ardent studies without any fruit, and the presumptuous self-sufficiency of men, which is even further removed from truth than is darkness from light, and desiring to deliver us from such great errors, became minded to send down to earth a spirit with universal ability in every art and every profession.

This golden age was certainly no utopia, as 16th century Italy was ravaged by political turbulence, frequent plague, and incessant war. Many of the artists mentioned were at some point taken hostage by invading armies; Vasari himself had to rescue a part of Michelangelo’s David when it was broken off in the battle to expel the Medici from Florence.

And yet Vasari saw greatness in his time, and the entire book is structured around a narrative of artistic progress. He documented the spread of new technologies and techniques (such as the spread of oil painting, imported from the Low Countries), which—as an artist—he had an intimate understanding of.

This story of progress is paralleled with the rediscovery (and, ultimately, surpassing) of the ancients. It would take until the 17th century for the querelle des Anciens et des Modernes to really take off in France, but in Florence Vasari had already seen enough to decide the question in favor of his contemporaries—the essence of the Enlightenment is already present in 1550…

Art history, cultural history, tech history, the history of ideas– a review of Vasari’s The Lives of the Most Excellent Painters, Sculptors, and Architects by @AlvaroDeMenard.

* Giorgio Vasari

###

As we frame it up, we might recall that it was on this date in 1946 that the first Cannes Film Festival opened.  It had originally been scheduled for September, 1939 as an “answer” to the Venice Film Fest, which had become a propaganda vehicle for Mussolini and Hitler; but the outbreak of World War II occasioned a delay.

source
