(Roughly) Daily

Posts Tagged ‘Goethe’

“What matters to you defines your mattering”*…

Further, in a fashion, to yesterday’s post, and via the always illuminating Delanceyplace.com, an explication of one of the most fundamental of all human needs: an excerpt from Rebecca Goldstein’s The Mattering Instinct, in which she draws on one of the fathers of both pragmatism and psychology, William James…

“We speak both of what matters and of who matters. In fact, we speak a great deal about both.

Consider what matters. In recent decades, the phrase why X matters has become a template for dozens of book titles, including Why Beauty Matters, Why Emotions Matter, Why Family Matters, Why Genealogy Matters, Why Good Sex Matters, Why Jesus Matters, Why Knowledge Matters, Why Liberalism Matters, Why Money Matters, and Why Stories Matter. The profusion of titles, many of them mutually exclusive–after all, if Jesus matters, then how, too, can money?–testifies to our preoccupation with what matters.

And it’s not only the question of what matters but also of who matters that’s urgent. Consider: In 2012, seventeen-year-old Trayvon Martin, a Black American, was visiting, together with his father, his father’s fiancée at her townhouse in a gated community in Florida. While the grownups were out, Trayvon went to a nearby convenience store to get himself some snacks and, on his way back, was shot by a Neighborhood Watch volunteer, George Zimmerman, himself a member of a minority as a Hispanic American. Zimmerman found Trayvon suspicious-looking–the boy’s hoodie was prominently mentioned in news stories–and called the police, while he continued to trail the teenager, a course of action ultimately ending in the boy’s death. Trayvon hadn’t been armed. All that was found on him was a bag of Skittles and an iced tea.

After the acquittal of the shooter, the hashtag #BlackLivesMatter exploded onto social media. The three-word slogan soon went beyond mere hashtags and placards, following the deaths of two more unarmed Black Americans, Michael Brown and Eric Garner, to become a political movement. Those who opposed Black Lives Matter sometimes offered as rejoinders their own three-word slogans: ‘All Lives Matter,’ or ‘Blue Lives Matter,’ this last referring to police officers. Of course, ‘Black Lives Matter’ isn’t inconsistent with either ‘All Lives Matter’ or ‘Blue Lives Matter,’ since ‘Black Lives Matter’ isn’t synonymous with ‘Only Black Lives Matter.’ The power and the poignancy of the original slogan lay in its minimalism. But what the battle of the slogans made clear is the potency of the verb to matter, in this instance applied not to the question of what matters but rather who matters.

So what exactly does the verb to matter mean? Here is a quick working definition: To matter is to be deserving of attention. It’s the same whether we are speaking of what matters or who matters. The thing or the person that matters makes a claim on us; at the very least, a claim is made on our attention.

The claim of being deserving of attention may be based on consequences that would ensue from paying attention or not paying attention–as when we ask, say, does voting really matter? We’re asking whether voting makes a difference; and so whether it’s worth our while to pay the attention called for in voting. It’s still the question of being deserving of attention, but what decides the issue is the consequences. In other circumstances, claims of mattering–of being deserving of attention–are independent of considerations of consequences, as when we assert that Black lives matter or that all lives matter. Here it’s intrinsic mattering, having nothing to do with consequences. And what intrinsic mattering comes down to is being deserving of attention. To claim that Black lives matter, as all lives matter, is to make claims regarding the deservingness of attention.

This leaves us with two more terms to explicate: attention and deservingness.

Attention is a mental phenomenon studied by contemporary psychologists, cognitive scientists, and neuroscientists–in other words, it is a subject for the empirical sciences.

The best definition I know of the phenomenon was given by the philosopher and psychologist William James. Attention, he wrote, is ‘the taking possession by the mind, in clear and vivid form, of one out of what may seem several simultaneously possible objects or trains of thought. Focalization, concentration, of consciousness are of its essence. It implies withdrawal from some things in order to deal effectively with others and is a condition which has a real opposite in the confused, dazed, scatterbrained state which in French is called distraction, and Zerstreutheit in German.’

James implies that attention is something we do. ‘It is the taking possession by the mind.’ The world’s languages agree. In English we pay attention, while in other languages we give, lend, gift, dedicate, sacrifice, prepare, turn, attach, apply, infuse, and arouse our attention. The linguistic formations all imply that there is activity and agency in attention. His definition also makes clear how attention, as an activity, is to be distinguished from the broader notion of consciousness. After all, that confused, dazed, scatterbrained state is a state of consciousness, though the ‘real opposite’ of paying attention.

His definition also entails that attention is limited and selective: withdrawal from some things. Every act of attention is an act of exclusion. In paying attention to something, we are forced to ignore a multitude of other things. And he ties this limitedness and selectivity to attention’s usefulness: in order to deal effectively. Contemporary psychology agrees. Attention’s limitedness and selectivity are crucial to its usefulness and linked to the reason why organisms evolved attention in the first place: to pay attention to changeable things in the organism’s immediate environment that can help or hinder it, nourish or annihilate it. That unpleasant smell, for example, may very well signal toxicity. Note the presence of the word changeable. The function of attention is tied to what is variable, not just to what is relevant to fitness. Oxygen, our heartbeat, gravity, and many other things are vital to our survival, and our unconscious mental processes must take them into account. But they tend to be constant, so there is no need to allocate our limited window of attention to them, unless circumstances alarmingly change.

The agency entailed in the act of paying attention means that we have some control over what we do and don’t pay attention to. You may be unable to remain oblivious to the bad music blasting in your gym or the rank smell seeping into your kitchen–stimuli that are intense or that pop out of your surroundings. But you can decide to pay no attention to, say, gossip or popular culture, social media or your weight. You can decide that they simply don’t matter, which is to say that they’re not deserving of your attention. And this brings us to the second component of the English verb to matter–namely deservingness.

Deservingness introduces an entirely different level of consideration into our preoccupations with mattering. It’s a level that goes beyond the psychological, beyond the empirical altogether. Deservingness draws us into the nonempirical sphere of values and justifications, of oughts and ought-nots. This is the sphere that philosophers call normative, because it invokes norms of justification. The mattering instinct means that we are normative creatures down to our core. We think and act and shape our lives within the sphere of justifications. Instead of calling ourselves Homo sapiens, we might better have christened ourselves Homo justificans.

It’s the presence of deservingness in the concept of mattering that raises us up into an entirely different order of both complexity and perplexedness. The mattering instinct has us straining beyond the empirical for the normative knowledge that eludes us. We are carried over into the sphere of values and justifications without being equipped to see our way through. Here is the epistemic elusiveness that injects the unsubdued doubt–and hence unease–into the heart of what it is to pursue a human life.

We speak both of what matters and of who matters. And behind our preoccupations with both is the most urgent of all our mattering questions, which is voiced in the first person: Do I matter? This is the mother of our mattering questions. Ultimately, we want to know what matters because we desperately want our own lives to be driven by what matters. We want to know who matters because we desperately want to be numbered among the ones who matter.

Self-mattering–feeling ourselves overwhelmingly deserving of our own attention–is baked into our identity. The usefulness of attention, to which William James alluded, is its usefulness to ourselves. So it’s no wonder that the greater part of our attention is given over to ourselves, whether overtly or tacitly. Throughout the enormous complexity of how the mind works, our self-mattering is presumed. And yet, astonishing creatures that we are, we are able, by way of the capacity for self-reflection with which our brains come equipped, to step outside of our self-mattering, which is to step outside ourselves, to pose the mother of all mattering questions…

It’s the deservingness component that separates the mattering for which we long from such empirical psychological states as having confidence or self-esteem. You can go online right now, or schedule a visit to a psychologist, and take a test that measures your confidence or self-esteem. There will be a series of statements to which you respond with the degree of your agreement, such as: I feel that I am a person of worth, at least on an equal plane with others. I feel that I have a number of good qualities. All in all, I am inclined to feel that I’m a failure. The test may even provide a numerical score, similar to an IQ test. The Rosenberg Self-Esteem Scale, for example, which is one of the most widely used measures of self-esteem and from which I’ve taken the above statements, provides a numerical value from 0 to 30, with any score under 15 indicating low self-esteem. It was none other than William James who first formulated the concept of self-esteem, offering an equation as its definition.
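For reference, the equation James offered in The Principles of Psychology (1890) defines self-esteem as a ratio:

\[ \text{Self-esteem} = \frac{\text{Success}}{\text{Pretensions}} \]

By James’s arithmetic, self-esteem rises either when our successes grow or when our pretensions shrink. And since the Rosenberg scale is likewise scored by simple arithmetic, here is a minimal sketch of that scoring, assuming the commonly published 10-item version in which the negatively worded items (positions 2, 5, 6, 8, and 9) are reverse-coded; treat it as an illustration, not a clinical instrument:

```python
# A minimal sketch of Rosenberg Self-Esteem Scale scoring (illustrative only).
# Assumes the common 10-item version: each item is rated 0-3, and the
# negatively worded items at positions 2, 5, 6, 8, and 9 are reverse-coded.

REVERSE_CODED = {2, 5, 6, 8, 9}  # 1-indexed positions of negatively worded items

def rosenberg_score(responses):
    """responses: ten ratings, each 0-3 (strongly disagree .. strongly agree)."""
    if len(responses) != 10 or any(r not in (0, 1, 2, 3) for r in responses):
        raise ValueError("expected ten ratings in the range 0-3")
    return sum(
        (3 - r) if i in REVERSE_CODED else r  # flip the negatively worded items
        for i, r in enumerate(responses, start=1)
    )  # totals run 0-30; under 15 is conventionally read as low self-esteem
```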

But these assessments of how good you feel about yourself, often in relation to others, aren’t tests of whether you truly, objectively, existentially matter. To figure out that question, the mother of all mattering questions, you can’t take an empirical test. Your self-esteem score, whether high or low, may be grounded in self-delusion, and the mother question is a demand for the answer that lies on the other side of self-delusion. Do I truly and objectively matter? I know that I can’t help feeling that I do, but do I really?

When it comes to our own mattering, we are staunch realists. We don’t want feelings. We want the facts.”…

Mattering

See also “Why We Need to Feel Like We Matter” (source of the image above)

* John Green

###

As we wonder about worth, we might spare a thought for a man who unquestionably mattered, Johann Wolfgang von Goethe; he died on this date in 1832. A poet, playwright, artist, biologist, color theorist, and philosopher, he is probably best remembered these days for Faust. But by virtue of the breadth and depth of his work, he is considered “the master spirit of the German people,” and, after Napoleon, the leading figure of his age.
 

Portrait by Joseph Karl Stieler, 1828 (source)

“If one could only catch that true color of nature. The very thought of it drives me mad.”*…

An ancient cave painting depicting a bison with a solid orange square marking the animal.

From ochre to lapis lazuli, Stephanie Krzywonos opens a door into the entangled histories of our most iconic pigments, revealing how colors hold stories of both lightness and darkness. She begins…

OCHRE

Darkness filled Font-de-Gaume cave. But dark is not the same as the color black. Dark means little or no light. Black isn’t an absence, but a presence.

Font-de-Gaume contains the only polychromatic prehistoric cave art still open to the public. Lascaux, fourteen miles away, closed to visitors in 1963: the carbon dioxide and humidity from human sweat and breath were damaging the paintings. Outside the cave in daylight, large beige-grey marbled cliffs overhung the habitable cottages of Les Eyzies-de-Tayac and the ruins of human shelters around 40,000 years old. Here, in 1868, a geologist uncovered the bones of five Cro-Magnon skeletons, which, at the time, were the earliest known examples of Homo sapiens sapiens—us. Homo sapiens, in Latin, means the wise human. Inside the cave, 250 earth-colored images of reindeer, bison, woolly mammoths, ibexes, horses, and a wolf haunted the walls. Most of the animals were outlined in thick black lines and filled in with rich, earthy ochre paint.

Pigment, a form of nature, is an insoluble substance that gives color to paint and other materials. The melanin in hair, skin, and eyes is a pigment; so is the chlorophyll in plants. Ochre—found decorating 350,000-year-old paleolithic bones—is the oldest colored pigment used by humans. In 2008, archaeologists discovered a 100,000-year-old painting kit in a cave in South Africa which included ochre pigments, an abalone shell used as a palette, a stone for grinding, and bone spatulas for mixing and dabbing the paste. Ochre is life in the form of a paste. To make it, gather earth, specifically clay or rock from land rich in mineral oxides. If not already a powder, crush it, then mix with a liquid like fish oil, animal fat, blood, or saliva. Ochre’s iron oxides vary in color, from pale yellow to brown and red, but the oldest documented ochre pigment is red. In the languages of the many ancient Aboriginal cultures who used ochre, there is no distinction between ochre’s color and its substance, ancestral land. We still use red ochre in lipstick.

In the cave, our guide, Blaise, told us that experts thought the artist, or artists, intended these paintings for worship, reverence, and ceremony and painted on scaffolding by the flickering light of reindeer fat. Blaise also told us the artists were tall and dark-skinned—we would call them Black. As a so-called “person of color,” this intrigued me.

Our battery-powered light was steady and still as it illuminated a bison placed so precisely that the natural lumpiness of the rock wall made its bulk look real. Blaise asked us to imagine how the movement of the flames would have made the animals look alive, their muscles rippling and quivering on a canvas made of rock. In front of me, a female reindeer kneeled, and a male tenderly licked her forehead. The image was around 16,000 years old. Between 10,000 and 11,000 years ago, as the planet warmed and our last ice age ended, all of the reindeer in these paintings retreated northward with glaciers or were hunted to extinction.

Blaise asked us to stand near a different wall, then turned on a light. An outline of an artist’s hand was inches from my face. The hand seemed to pulse. Maybe it was just my heart pumping blood behind my eyes. The artist had filled his mouth with the same ochre used to color the animals, held up his large left hand, then sprayed ochre paint against the wall. Whose blood, whose spit, whose fat was this? The artist could have been hunting, been doing anything else in sunlight, but chose to enter the dark to paint, to love what is, to record a reindeer kiss forever, or for as long as rock lasts.

These artists didn’t just try to capture what they loved—they left a warning. One motif in prehistoric cave art, Blaise said, was to paint the most dangerous, violent animals in the deepest part of the cave. We were not allowed to travel deeper inside of Earth to see them: Font-de-Gaume, this museum of prehistoric art, was re-discovered in 1901, and already people had destroyed too many paintings. The animals in the very back were a woolly rhinoceros, a lion, and the profile of a human face, on which a tear appears to fall.

Maybe, as reindeer herds dwindled, the artists were expressing their sorrow and regret. Perhaps the face is saying: if we are not wise, our loves can lead down hideous paths…

Continue around the wheel: “Museum of Color,” from @stephkrzywonos.bsky.social in @emergencemagazine.bsky.social.

* Andrew Wyeth

###

As we ponder pigment, we might recall that it was on this date in 1810 that a work in sympathy with Krzywonos’s essay first appeared: Johann Wolfgang von Goethe’s Theory of Colors (Zur Farbenlehre, literally, ‘On color theory’).

Though he is, of course, remembered– and revered– for Faust and so much more, Goethe considered Theory of Colors his most important work.  In it, he contentiously (and incorrectly) characterized color as arising from “the dynamic interplay of light and darkness through the mediation of a turbid medium.”  Scientists have coalesced around (and built upon) Isaac Newton’s theory.

Still, Goethe was the first systematically to study the psychological effects of color; his observations of the effect of opposed colors led him to a symmetric arrangement of his color wheel, “for the colors diametrically opposed to each other… are those which reciprocally evoke each other in the eye.”  Indeed, after being translated into English by Charles Eastlake in 1840, his theory became widely adopted by the art world, most notably by J. M. W. Turner and later, the Pre-Raphaelites and Wassily Kandinsky.
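Goethe’s “diametrically opposed” pairs are, at bottom, a geometric idea: on any circular arrangement of hues, the opposed color sits 180 degrees away. Here is a toy sketch of that geometry, using the modern 0–360 degree hue circle rather than Goethe’s six-color wheel (the two arrange colors differently, so the pairings do not coincide):

```python
# Toy illustration of "diametrically opposed" colors on a circular arrangement
# of hues. This uses the modern 0-360 degree hue circle, NOT Goethe's wheel:
# on Goethe's arrangement yellow sits across from violet, while on the modern
# RGB-based circle yellow (60) sits across from blue (240).

def opposed_hue(hue_degrees):
    """Return the hue directly across the wheel."""
    return (hue_degrees + 180) % 360

print(opposed_hue(60))   # 240: blue, opposite yellow on the modern circle
print(opposed_hue(0))    # 180: cyan, opposite red on the modern circle
```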

As Ernst Lehrs wrote, “In point of fact, the essential difference between Goethe’s theory of colour and the theory which has prevailed in science (despite all modifications) since Newton’s day, lies in this: While the theory of Newton and his successors was based on excluding the colour-seeing faculty of the eye, Goethe founded his theory on the eye’s experience of colour.” Or, as Wittgenstein put it (in Culture and Value, MS 112 255:26.11.1931): “I believe that what Goethe was really seeking was not a physiological but a psychological theory of colours.”

Goethe’s color wheel (source)

Written by (Roughly) Daily

September 6, 2025 at 1:00 am

“Not in utter loneliness to live / Myself at last did to the Devil give!”*…

Faust And Mephistopheles Playing Chess, Moritz Retzsch 1831

With an excerpt from his new book, Devil’s Contract: The History of the Faustian Bargain, Ed Simon on “the most important story ever told”– the story of Humanity’s transactional relationship with evil…

The legend of the Devil’s contract is the most alluring, the most provocative, the most insightful, the most important story ever told. It concerns a humanity strung between Heaven and Hell, the saintly and the satanic; how a man could trade his soul for powers omnipotent, signing a covenant with the Devil so that he could briefly live as a god before being pulled down to Hell. Frequently associated with Christopher Marlowe’s Doctor Faustus, that Elizabethan play wasn’t the origin of that myth, but his is certainly a sterling example of that eternal script. Yet long before that Renaissance play and long afterwards, we can find the inky traces of Faust’s damned signature in a multitude of works both high and low, canonical and popular. More disturbing than that is the way that the Devil’s hoof-prints can be found across the wide swath of history, in our willingness to embrace power and engage in exploitation, to summon self-interestedness and to conjure cruelty…

Tamburlaine the Great’s iconoclasm and The Jew of Malta’s irreverence aside, no work of sacred heresy in Marlowe’s oeuvre is as profound as Doctor Faustus. His quisling scholar selling his birthright for the pottage of trickery and illusion may be modernity’s operative metaphor, but Marlowe was hardly the originator of the myth. As you’ll read in the chapters ahead, Marlowe adapted the historical Johann Faust from German folkloric tradition, though the myth of a contract with Satan existed centuries before that unfortunate alchemist first crossed potassium nitrate with sulfur. Nor of course was Marlowe’s rendition the final word, as thousands of permutations of the basic story have been produced over the half-millennium, from Goethe to the musical Damn Yankees, Thomas Mann to the Dixie-fried pablum of the execrable Charlie Daniels Band number “The Devil Went Down to Georgia.” High culture like Franz Liszt’s Faust Symphony and Gustav Mahler’s Symphony No. 8; pop culture from the comic book Ghost Rider to the Jack Black flick Tenacious D in the Pick of Destiny.

“The figure of Faust is—after Christ, Mary, and the Devil—the single most popular character in the history of Western Christian culture,” writes Jeffrey Burton Russell in his classic Mephistopheles: The Devil in the Modern World. And of those characters, Faust is the most fully human to us, in his arrogance and his failure, his negotiations and his capitulations, in the whole litany of abuse which the cankered soul is capable of inflicting upon itself. Russell’s contention is far from hyperbole, and amending the word “character” to “narrative,” I’d say that there are few archetypal scripts in our culture as essential as the legend of a man selling his soul to the Devil. Thousands of works of literature and film, music and art, grapple with the bargain whereby somebody trades what’s most human for power or wealth, influence or knowledge. Only the myth of Adam and Eve being cast out of Eden competes with Faust in terms of influence, and that story is arguably an early variation on the Devil’s contract…

… though it is ostensibly a history, and this narrative moves onward rather chronologically, I prefer to think of the story it tells as being about a character who is outside of time, who lives parallel to past, present, and future. An eternal story. Because what this book is concerned with are the implications— culturally, politically, theologically—of these highly symbolically charged narratives concerning the abjuration of a soul, of the ceding of what’s intrinsic to us, of the capitulations and negotiations which make up any failed life, which is to say every life. More than a history, then, Devil’s Contract is an account of what it means to be human in all of our failings.

Increasingly an account of humanity right now. For all the legend’s archaism, the muttered Latin and the alchemical conjuration, Faust’s story has always been estimably modern, perhaps the first modern story. Unlike Adam and Eve, with their inscrutable Bronze Age story composed in an idiom so ancient and foreign that centuries of theologians have disagreed on what the implications of each facet of the tale might mean, the details in the Faust legend are inescapably of our time. This is, after all, the story of a contract. The dénouement of most versions of the Faust story involves the signing of a legally binding document, an experience foreign to the authors of Genesis but replete in our own lives, whether interacting with human resources or clicking on an agreement with our phone company. Faust’s tale may deal in the numinous and the transcendent, but it’s also about bureaucracy and paperwork, our contemporary hell and its sacrament, respectively. We recognize Faust in a manner that no character in the Bible can ever be our contemporary…

… Marlowe staged his play at the very beginning of what is increasingly being called the Anthropocene, the geological epoch in which humanity was finally able to impose its will (in an almost occult manner) upon the earth. There are costs to any such contract, as the wisdom of the legend has it, so that it’s worth considering after five centuries of human domination of the planet that we might now be facing our own collective appointment at Deptford. We seem to finally be facing the final act, the apocalyptic tenor of our times, from climate change to nuclear brinkmanship making the continued survival of humanity an open question, our sad predicament the result of hubris, and greed, and vainglory. It may be appropriate to rechristen this age the Faustocene. Because whether or not the Devil is real, his effects in the world are. When it comes to “truth” and “facts,” the two words are not synonymous, and I wouldn’t at all be surprised if I could make out the smoke of some devilish chimera beyond the neon-line of the Rose Theater, deep within a darkness so all-encompassing that not a squib of light is capable of escaping…

“A Deal With the Devil: What the Age-Old Faustian Bargain Reveals About the Modern World” in @lithub.

See also: “You Are Equal To The Spirit You Understand,” Nathan Gardels’ consideration of the lessons in Goethe’s Faust, in @NoemaMag.

* Johann Wolfgang von Goethe, Faust

###

As we reconsider our contracts, we might recall that it was on this date in 1834 that slavery was abolished in the British Empire, as the Slavery Abolition Act 1833 came into force (though it remained legal in the possessions of the East India Company until the passage of the Indian Slavery Act, 1843).

“Am I Not a Man and a Brother?”, 1787 medallion designed by Josiah Wedgwood for the British anti-slavery campaign (source)

“Demography is destiny”*…

But what destiny? Strong population growth has fueled economic development in some countries (where the phenomenon of economic growth tends to moderate population growth), and it has exacerbated problems in others. Conversely, a shrinking population can vex the prospects for development and growth. The Economist weighs in with thoughts on the world’s two most populous nations– one shrinking, the other growing…

China has been the world’s most populous country for hundreds of years. In 1750 it had an estimated 225m people, more than a quarter of the world’s total. India, not then a politically unified country, had roughly 200m, which ranked it second. In 2023 it will seize the crown. The UN guesses that India’s population will surpass that of China on April 14th. India’s population on the following day is projected to be 1,425,775,850.

The crown itself has little value, but it is a signal of things that matter. That India does not have a permanent seat on the UN Security Council while China does will come to seem more anomalous. Although China’s economy is nearly six times larger, India’s growing population will help it catch up. India is expected to provide more than a sixth of the increase of the world’s population of working age (15-64) between now and 2050.

China’s population, by contrast, is poised for a steep decline. The number of Chinese of working age peaked a decade ago. By 2050 the country’s median age will be 51, 12 years higher than now. An older China will have to work harder to maintain its political and economic clout. [See also: “For the first time since the 1960s, China’s population is shrinking” and “Here’s why China’s population dropped for the first time in decades.”]

Both countries took draconian measures in the 20th century to limit the growth of their populations. A famine in 1959-61 caused by China’s “great leap forward” was a big factor in persuading the Communist Party of the need to rein in population growth. A decade later China launched a “later, longer, fewer” campaign—later marriages, longer gaps between children and fewer of them. That had a bigger effect than the more famous one-child policy, introduced in 1980, says Tim Dyson, a British demographer. The decline in fertility, from more than six babies per woman in the late 1960s to fewer than three by the late 1970s, was the swiftest in history for any big population, he says.

It paid dividends. China’s economic miracle was in part the result of the rising ratio of working-age adults to children and oldsters from the 1970s to the early 2000s…
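That “rising ratio of working-age adults to children and oldsters” is what demographers call a support ratio, and it is plain arithmetic. A minimal sketch follows; the figures in it are placeholders for illustration, not UN data:

```python
# Back-of-the-envelope support ratio: working-age people (15-64) per dependent
# (children under 15 plus adults over 64). Illustrative placeholder numbers,
# not real demographic data.

def support_ratio(under_15, working_age, over_64):
    """Workers available to support each dependent."""
    return working_age / (under_15 + over_64)

# A population with many workers per dependent enjoys the kind of
# "demographic dividend" the article credits for China's boom.
print(support_ratio(under_15=300e6, working_age=900e6, over_64=150e6))  # 2.0
```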

India’s attempt to reduce fertility was less successful. It was the first country to introduce family planning on a national scale in the 1950s. Mass-sterilisation campaigns, encouraged by Western donors, grew and were implemented more forcefully during the state of emergency declared by Indira Gandhi, the prime minister, in 1975-77… Though brutal, the campaign was not thorough enough to cause a dramatic drop in India’s birth rate. India’s fertility has dropped, but by less, and more slowly than China’s. With a median age of 28 and a growing working-age population, India now has a chance to reap its own demographic dividend. Its economy recently displaced Britain’s as the world’s fifth-biggest and will rank third by 2029, predicts State Bank of India. But India’s prosperity depends on the productivity of its youthful people, which is not as high as in China. Fewer than half of adult Indians are in the workforce, compared with two-thirds in China. Chinese aged 25 and older have on average 1.5 years more schooling than Indians of the same age.

That will not spare China from suffering the consequences of the demographic slump it engineered. The government ended the one-child policy in 2016 and removed all restrictions on family size in 2021. But birth rates have kept falling. China’s zero-covid policy has made young adults even more reluctant to bear children. The government faces resistance to its plans to raise the average retirement age, which at 54 is among the lowest in the world. The main pension fund may run out of money by 2035. Yet perhaps most painful for China will be the emergence of India as a superpower on its doorstep…

The contrasting demographic dynamics of China and India, and what they might mean: “India will become the world’s most populous country in 2023,” from @TheEconomist.

* attributed to Auguste Comte

###

As we ponder population, we might recall that it was on this date in 1829 that the first part of the tragic play Faust by Johann Wolfgang von Goethe premiered. Originally published in 1808, the tale of Dr. Faust’s deal with Mephistopheles is considered by many to be the greatest work of German literature.

source

Written by (Roughly) Daily

January 19, 2023 at 1:00 am

“The method preferred by most balding men for making themselves look silly is called the comb over”*…

Balding has been the constant scourge of man since the beginning of time, and for millennia, our best solution was the comb-over. Brian VanHooker tells the story of how its once-ubiquitous popularity thinned, receded, and then got pushed to the side…

For decades now, having a comb-over to cover one’s baldness has been generally seen as unacceptable. There may be exceptions, but men with prominent, noticeable comb-overs are often regarded as desperate — instead of aging gracefully, they’re seen as hopelessly clinging to a time when they had a full head of hair. Worst of all, for people with advanced hair loss, the comb-over is entirely ineffective. Instead of disguising a man’s baldness, it only accentuates it, thus laying bare his lack of hair and, even worse, his insecurity.

This wasn’t always the case. For at least a couple thousand years, comb-overs were perfectly acceptable and worn by the most powerful men in the world. It was only during the latter half of the 20th century that it all came crashing (flopping?) down…

From Julius Caesar to Donald Trump, a tonsorial trip through time: “The Rise, Flop, and Fall of the Comb-Over,” from @TrivialHistory in @WeAreMel.

* Dave Barry

###

As we resist the urge, we might send scandalous birthday greetings to Giacomo Casanova; he was born on this date in 1725. A Venetian adventurer and author, he is best remembered– as a product both of his memoir and of other contemporary accounts– as a libertine, a womanizer who carried on complicated and elaborate affairs with numerous women.

At the same time, he associated with European royalty, popes, and cardinals, along with intellectual and artistic figures like Voltaire, Goethe, and Mozart. His memoir (written toward the end of his life, while he served as librarian to Count Waldstein) is regarded as one of the most authentic sources of the customs and norms of European social life during the 18th century.

Casanova preferred a wig to a comb-over.

Portrait by Casanova’s brother Francesco

source