(Roughly) Daily


“Unhappy is the land that needs a hero”*…

To the extent that evolutionary biologist and sociobiologist Robert Trivers has been in the news over the last decade, it has been for his entanglement with and highly questionable defense of Jeffrey Epstein. But as Lionel Page reminds us, two decades before that– well before he could have known the execrable “financier”– Trivers made hugely important contributions to his field…

Steve Stewart-Williams announced… that Robert Trivers passed away.

Trivers was one of the most—perhaps the most—influential evolutionary biologists of the 20th century. His work should be much more widely known in social and behavioural sciences, in particular in economics, as Trivers’ intellectual approach is very much in line with a game theoretic understanding of social interactions.

It is hard to overstate the importance of his work. Einstein famously published four groundbreaking papers in 1905, a year often referred to as his “Annus mirabilis”, during which he revolutionised physics. Trivers might be said to have had a “Quinquennium Mirabile” for the five years between 1971 and 1976, during which he produced a series of ideas that revolutionised evolutionary biology…

[Page unpacks four of those contributions: Reciprocal Altruism, Parental Investment, Parent-Offspring Conflict, and Self-Deception, each fascinating…]

… Trivers has been one of the most influential evolutionary biologists, and his papers are still worth reading today. His insights, published more than 50 years ago, are fascinating. They often align very well with economic theories of behaviour, and it is therefore regrettable that his ideas are not more well-known in economics, and in particular in behavioural economics.

A key feature of Trivers’ take across these contributions was to see that beneath the world of social interactions we observe, there are deep structures in terms of incentives that shape the game we play. Understanding these games and their structures helps us make sense of the seemingly endless complexity of human psychology and social dynamics. In several key contributions, Trivers helped lift the veil on the underlying logic of human behaviour…
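The game-theoretic structure Page describes– reciprocal altruism as repeated exchange shaped by incentives– is often illustrated with the iterated prisoner’s dilemma. Below is a minimal, illustrative sketch; the payoff values and strategy names are my own assumptions for demonstration, not drawn from Trivers or Page:

```python
# Illustrative sketch: reciprocal altruism as an iterated prisoner's dilemma.
# Payoffs and strategies are assumptions for demonstration only.

PAYOFFS = {  # (my move, their move) -> my payoff; "C" cooperate, "D" defect
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def play(strategy_a, strategy_b, rounds=100):
    """Return total payoffs for two strategies over repeated interactions."""
    score_a = score_b = 0
    last_a = last_b = "C"  # assume both are seen as cooperators initially
    for _ in range(rounds):
        move_a = strategy_a(last_b)   # each strategy sees the other's last move
        move_b = strategy_b(last_a)
        score_a += PAYOFFS[(move_a, move_b)]
        score_b += PAYOFFS[(move_b, move_a)]
        last_a, last_b = move_a, move_b
    return score_a, score_b

tit_for_tat = lambda their_last: their_last   # reciprocate whatever was done to you
always_defect = lambda their_last: "D"        # never cooperate

# Two reciprocators sustain cooperation over 100 rounds...
print(play(tit_for_tat, tit_for_tat))    # (300, 300)
# ...while a defector gains once, then is locked into mutual defection.
print(play(tit_for_tat, always_defect))  # (99, 104)
```

The point of the toy model matches Trivers’ insight: when interactions repeat, reciprocators do better against each other than defectors do against anyone, so cooperation can evolve among self-interested agents.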

From cooperation to conflict: the evolutionary grammar of social interactions: “The fascinating insights of Robert Trivers” from @lionelpage.bsky.social.

For more on Trivers and the controversies in his life (Epstein, but also the Black Panthers and a Rutgers set-to), all of which followed the burst of productivity described above, see here.

And for some thoughts on how one might reconcile appreciation for a scientist’s work with abhorrence of his later sins, see “Ghosts of Science Past Still Haunt Us. We Can Put Them to Rest.”

* Bertolt Brecht (through the mouth of Galileo, in The Life of Galileo)

###

As we linger over legacies, we might send material birthday greetings to a man who helped lay the groundwork for the field to which Trivers contributed, Ludwig Büchner; he was born on this date in 1824. A philosopher, physiologist, and physician, he became one of the leading exponents of 19th-century scientific materialism. Büchner was an early champion of Darwin’s theory of evolution, endorsing it within a decade of its publication, and then did much to spread it by citing and building on it in his own books.

As far as we know, Büchner’s life was free of the scandal and conflict that plagued Trivers. He taught at the University of Tübingen and published dozens of books and papers. Later in his life he founded the “German Freethinkers League” (“Deutsche Freidenkerbund”) and served as a member of the second chamber of the Landstände of the Grand Duchy of Hesse as a representative of the German Free-minded Party from 1884 to 1890. He was the younger brother of Georg Büchner, a famous revolutionary playwright, and Luise Büchner, a women’s rights advocate; and he was the uncle of Ernst Büchner, inventor of the Büchner flask.

source

“What matters to you defines your mattering”*…

Further, in a fashion, to yesterday’s post, and via the always illuminating Delanceyplace.com, an explication of one of the most fundamental of all human needs: an excerpt from Rebecca Goldstein‘s The Mattering Instinct, in which she draws on one of the fathers of both pragmatism and psychology, William James…

We speak both of what matters and of who matters. In fact, we speak a great deal about both.

Consider what matters. In recent decades, the phrase why X matters has become a template for dozens of book titles, including Why Beauty Matters, Why Emotions Matter, Why Family Matters, Why Genealogy Matters, Why Good Sex Matters, Why Jesus Matters, Why Knowledge Matters, Why Liberalism Matters, Why Money Matters, and Why Stories Matter. The profusion of titles, many of them mutually exclusive–after all, if Jesus matters, then how, too, can money?–testifies to our preoccupation with what matters.

And it’s not only the question of what matters but also of who matters that’s urgent. Consider: In 2012, seventeen-year-old Trayvon Martin, a Black American, was visiting, together with his father, his father’s fiancée at her townhouse in a gated community in Florida. While the grownups were out, Trayvon went to a nearby convenience store to get himself some snacks and, on his way back, was shot by a Neighborhood Watch volunteer, George Zimmerman, himself a member of a minority as a Hispanic American. Zimmerman found Trayvon suspicious looking–the boy’s hoodie was prominently mentioned in news stories–and called the police, while he continued to trail the teenager, a course of action ultimately ending in the boy’s death. Trayvon hadn’t been armed. All that was found on him was a bag of Skittles and an iced tea.

After the acquittal of the shooter, the hashtag #BlackLivesMatter exploded onto social media. The three-word slogan soon went beyond mere hashtags and placards, following the deaths of two more unarmed Black Americans, Michael Brown and Eric Garner, to become a political movement. Those who opposed Black Lives Matter sometimes offered as rejoinders their own three-word slogans: ‘All Lives Matter,’ or ‘Blue Lives Matter,’ this last referring to police officers. Of course, ‘Black Lives Matter’ isn’t inconsistent with either ‘All Lives Matter’ or ‘Blue Lives Matter,’ since ‘Black Lives Matter’ isn’t synonymous with ‘Only Black Lives Matter.’ The power and the poignancy of the original slogan lay in its minimalism. But what the battle of the slogans made clear is the potency of the verb to matter, in this instance applied not to the question of what matters but rather who matters.

So what exactly does the verb to matter mean? Here is a quick working definition: To matter is to be deserving of attention. It’s the same whether we are speaking of what matters or who matters. The thing or the person that matters makes a claim on us; at the very least, a claim is made on our attention.

The claim of being deserving of attention may be based on consequences that would ensue from paying attention or not paying attention–as when we ask, say, does voting really matter? We’re asking whether voting makes a difference; and so whether it’s worth our while to pay the attention called for in voting. It’s still the question of being deserving of attention, but what decides the issue is the consequences. In other circumstances, claims of mattering–of being deserving of attention–are independent of considerations of consequences, as when we assert that Black lives matter or that all lives matter. Here it’s intrinsic mattering, having nothing to do with consequences. And what intrinsic mattering comes down to is being deserving of attention. To claim that Black lives matter, as all lives matter, is to make claims regarding the deservingness of attention.

This leaves us with two more terms to explicate: attention and deservingness.

Attention is a mental phenomenon studied by contemporary psychologists, cognitive scientists, and neuroscientists–in other words, it is a subject for the empirical sciences.

The best definition I know of the phenomenon was given by the philosopher and psychologist William James. Attention, he wrote, is ‘the taking possession by the mind, in clear and vivid form, of one out of what may seem several simultaneously possible objects or trains of thoughts.’ Focalization, concentration of consciousness, are of its essence. It implies withdrawal from some things in order to deal effectively with others and is a condition which has a real opposite in the confused, dazed, scatterbrained state which in French is called distraction, and Zerstreutheit in German.

James implies that attention is something we do. ‘It is the taking possession by the mind.’ The world’s languages agree. In English we pay attention, while in other languages we give, lend, gift, dedicate, sacrifice, prepare, turn, attach, apply, infuse, and arouse our attention. The linguistic formations all imply that there is activity and agency in attention. His definition also makes clear how attention, as an activity, is to be distinguished from the broader notion of consciousness. After all, that confused, dazed, scatterbrained state is a state of consciousness, though the ‘real opposite’ of paying attention.

His definition also entails that attention is limited and selective: withdrawal from some things. Every act of attention is an act of exclusion. In paying attention to something, we are forced to ignore a multitude of other things. And he ties this limitedness and selectivity with attention’s usefulness: in order to deal effectively. Contemporary psychology agrees. Attention’s limitedness and selectivity is crucial to its usefulness and linked to the reason why organisms evolved attention in the first place: to pay attention to changeable things in the organism’s immediate environment that can help or hinder it, nourish or annihilate it. That unpleasant smell, for example, may very well signal toxicity. Note the presence of the word changeable. The function of attention is tied to what is variable, not just to what is relevant to fitness. Oxygen, our heartbeat, gravity, and many other things are vital to our survival, and our unconscious mental processes must take them into account. But they tend to be constant, so there is no need to allocate our limited window of attention to them, unless circumstances alarmingly change.

The agency entailed in the act of paying attention means that we have some control over what we do and don’t pay attention to. You may be unable to remain oblivious to the bad music blasting in your gym or the rank smell seeping into your kitchen–stimuli that are intense or that pop out of your surroundings. But you can decide to pay no attention to, say, gossip or popular culture, social media or your weight. You can decide that they simply don’t matter, which is to say that they’re not deserving of your attention. And this brings us to the second component of the English verb to matter–namely deservingness.

Deservingness introduces an entirely different level of consideration into our preoccupations with mattering. It’s a level that goes beyond the psychological, beyond the empirical altogether. Deservingness draws us into the nonempirical sphere of values and justifications, of oughts and ought-nots. This is the sphere that philosophers call normative, because it invokes norms of justification. The mattering instinct means that we are normative creatures down to our core. We think and act and shape our lives within the sphere of justifications. Instead of calling ourselves Homo sapiens, we might better have christened ourselves Homo justificans.

It’s the presence of deservingness in the concept of mattering that raises us up into an entirely different order of both complexity and perplexedness. The mattering instinct has us straining beyond the empirical for the normative knowledge that eludes us. We are carried over into the sphere of values and justifications without being equipped to see our way through. Here is the epistemic elusiveness that injects the unsubdued doubt–and hence unease–into the heart of what it is to pursue a human life.

We speak both of what matters and of who matters. And behind our preoccupations with both is the most urgent of all our mattering questions, which is voiced in the first person: Do I matter? This is the mother of our mattering questions. Ultimately, we want to know what matters because we desperately want our own lives to be driven by what matters. We want to know who matters because we desperately want to be numbered among the ones who matter.

Self-mattering–feeling ourselves overwhelmingly deserving of our own attention–is baked into our identity. The usefulness of attention, to which William James alluded, is its usefulness to ourselves. So it’s no wonder that the greater part of our attention is given over to ourselves, whether overtly or tacitly. Throughout the enormous complexity of how the mind works, our self-mattering is presumed. And yet, astonishing creatures that we are, we are able, by way of the capacity for self-reflection with which our brains come equipped, to step outside of our self-mattering, which is to step outside ourselves, to pose the mother of all mattering questions…

It’s the deservingness component that separates the mattering for which we long from such empirical psychological states as having confidence or self-esteem. You can go online right now, or schedule a visit to a psychologist, and take a test that measures your confidence or self-esteem. There will be a series of statements to which you respond with the degree of your agreement, such as: I feel that I am a person of worth, at least on an equal plane with others. I feel that I have a number of good qualities. All in all, I am inclined to feel that I’m a failure. The test may even provide a numerical score, similar to an IQ test. The Rosenberg Self-Esteem Scale, for example, which is one of the most widely used measures of self-esteem and from which I’ve taken the above statements, provides a numerical value from 1 to 30, with any score under 15 indicating low self-esteem. It was none other than William James who first formulated the concept of self-esteem, offering an equation as its definition.
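As a rough illustration of how Likert-scale instruments like the one quoted above are scored, here is a simplified sketch. The scoring details (a 0–3 rating per item, reverse-scoring of negatively worded items, a simple sum) are my own simplifying assumptions for demonstration, not the official Rosenberg protocol:

```python
# Simplified sketch of Likert-style self-esteem scoring.
# Assumptions, not the official Rosenberg Self-Esteem Scale protocol:
# each statement is rated 0-3 ("strongly disagree" to "strongly agree"),
# negatively worded items are reverse-scored, and item scores are summed.

ITEMS = [
    ("I feel that I am a person of worth, at least on an equal plane with others.", False),
    ("I feel that I have a number of good qualities.", False),
    ("All in all, I am inclined to feel that I'm a failure.", True),  # reverse-scored
]

def score(responses):
    """Sum 0-3 ratings, flipping reverse-scored items (3 -> 0, 0 -> 3)."""
    total = 0
    for (_, reverse), rating in zip(ITEMS, responses):
        total += (3 - rating) if reverse else rating
    return total

# Strong agreement with the positive items, rejection of the negative one:
print(score([3, 3, 0]))  # 9, the maximum for these three items
```

The sketch only makes Goldstein’s point concrete: such an instrument yields a number describing how you feel about yourself, which is exactly what she argues cannot settle the question of whether you matter.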

But these assessments of how good you feel about yourself, often in relation to others, aren’t tests of whether you truly, objectively, existentially matter. To figure out that question, the mother of all mattering questions, you can’t take an empirical test. Your self-esteem score, whether high or low, may be grounded in self-delusion, and the mother question is a demand for the answer that lies on the other side of self-delusion. Do I truly and objectively matter? I know that I can’t help feeling that I do, but do I really?

When it comes to our own mattering, we are staunch realists. We don’t want feelings. We want the facts.”…

Mattering

See also “Why We Need to Feel Like We Matter” (source of the image above)

* John Green

###

As we wonder about worth, we might spare a thought for a man who unquestionably mattered, Johann Wolfgang von Goethe; he died on this date in 1832. A poet, playwright, artist, natural scientist, and philosopher, he is probably best remembered these days for Faust. But by virtue of the breadth and depth of his work, he is considered “the master spirit of the German people,” and, after Napoleon, the leading figure of his age.
 

Portrait by Joseph Karl Stieler, 1828 (source)

“Technology challenges us to assert our human values, which means that first of all, we have to figure out what they are”*…

A hand is holding a glowing projection of interconnected dots and lines against a dark background, representing technology and innovation.

As we head into the weekend, some food for thought…

A decade ago, the world was, at once, both the seed of today and a very different place: In what was considered one of the biggest upsets in American political history (and the fifth and most recent presidential election in which the winning candidate lost the popular vote), Donald Trump was elected to his first term. The U.K. chose Brexit. The stock market finished strong, with the Dow Jones, S&P 500, and Nasdaq reaching new highs. (In the 10 years that have followed, the Dow has risen about 150%; the S&P 500, roughly 400%; and the Nasdaq has roughly sextupled.)

It was a big year for pop culture, marked by Beyoncé’s Lemonade, the massive Pokémon Go craze, the rise of Netflix with Stranger Things, the Rio Olympics, and the loss of icons like David Bowie and Prince.

It was also a big year in tech: Russian hacking and disinfo (especially on Facebook) was a huge story– as was Apple’s elimination of the headphone jack in the iPhone 7. Theranos collapsed; and Wells Fargo opened millions of accounts for customers without those customers’ permission (for which it was subsequently fined $3 billion). And Virtual Reality was everywhere (in the promises/offers from tech companies), but nowhere in the market. TikTok was launched in 2016, but hadn’t yet become the phenomenon (and avatar of algorithmically driven feeds) that it is today. And in the course of 2016, artificial intelligence made the leap from “science fiction concept” to “almost meaningless buzzword” (though in fairness, 2016 was the year that Google DeepMind’s AlphaGo program triumphed against South Korean Go grandmaster Lee Sedol).

Back in 2016, the estimable Alan Jacobs was pondering the road ahead. In a piece for The New Atlantis, he coined and discussed a series of aphorisms relevant to the future as he then saw it. He begins…

Aphorisms are essentially an aristocratic genre of writing. The aphorist does not argue or explain, he asserts; and implicit in his assertion is a conviction that he is wiser or more intelligent than his readers.
– W. H. Auden and Louis Kronenberger, The Viking Book of Aphorisms

Author’s Note: I hope that the statement above is wrong, believing that certain adjustments can be made to the aphoristic procedure that will rescue the following collection from arrogance. The trick is to do this in a way that does not sacrifice the provocative character that makes the aphorism, at its best, such a powerful form of utterance.

Here I employ two strategies to enable me to walk this tightrope. The first is to characterize the aphorisms as “theses for disputation,” à la Martin Luther — that is, I invite response, especially response in the form of disagreement or correction. The second is to create a kind of textual conversation, both on the page and beyond it, by adding commentary (often in the form of quotation) that elucidates each thesis, perhaps even increases its provocativeness, but never descends into coarsely explanatory pedantry…

[There follows a series of provocations and discussions that feel as relevant– and important– today as they were a decade ago. He concludes…]

Precisely because of this mystery, we need to evaluate our technologies according to the criteria established by our need for “conviviality.”

I use the term with the particular meaning that Ivan Illich gives it in Tools for Conviviality [here]:

I intend it to mean autonomous and creative intercourse among persons, and the intercourse of persons with their environment; and this in contrast with the conditioned response of persons to the demands made upon them by others, and by a man-made environment. I consider conviviality to be individual freedom realized in personal interdependence and, as such, an intrinsic ethical value. I believe that, in any society, as conviviality is reduced below a certain level, no amount of industrial productivity can effectively satisfy the needs it creates among society’s members.

In my judgment, nothing is more needful in our present technological moment than the rehabilitation and exploration of Illich’s notion of conviviality, and the use of it, first, to apprehend the tools we habitually employ and, second, to alter or replace them. For the point of any truly valuable critique of technology is not merely to understand our tools but to change them — and us…

Eminently worth reading in full, as it’s still all too relevant: “Attending to Technology: Theses for Disputation,” from @ayjay.bsky.social.

Pair with a provocative piece from another fan of Illich, L. M. Sacasas (@lmsacasas.bsky.social): “Surviving the Show: Illich And The Case For An Askesis of Perception.”

[Image above: source]

* Sherry Turkle

###

As we think about tech, we might recall that it was on this date in 1946 that an ancestor of today’s social networks, streaming services, and AIs, the ENIAC (Electronic Numerical Integrator And Computer), was first demonstrated in operation.  (It was announced to the public the following day.) The first general-purpose computer (Turing-complete, digital, and capable of being programmed and re-programmed to solve different problems), ENIAC was begun in 1943 as part of the U.S.’s war effort (as a classified military project known as “Project PX“); it was conceived and designed by John Mauchly and J. Presper Eckert of the University of Pennsylvania, where it was built.  The finished machine, composed of 17,468 electronic vacuum tubes, 7,200 crystal diodes, 1,500 relays, 70,000 resistors, 10,000 capacitors, and around 5 million hand-soldered joints, weighed more than 27 tons and occupied a 30 x 50 foot room– in its time the largest single electronic apparatus in the world.  ENIAC’s basic clock speed was 100,000 cycles per second (or hertz). Today’s home computers have clock speeds of 3,500,000,000 cycles per second or more.
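The clock-speed comparison in the note above reduces to simple arithmetic; using the figures quoted, a modern home machine runs its clock tens of thousands of times faster than ENIAC did:

```python
# Ratio of a modern ~3.5 GHz clock to ENIAC's 100 kHz clock,
# using the figures quoted above.
eniac_hz = 100_000            # ENIAC: 100,000 cycles per second
modern_hz = 3_500_000_000     # a modern home computer: 3.5 GHz or more

print(f"{modern_hz // eniac_hz:,}x faster")  # 35,000x faster
```

(Clock speed is of course only a crude proxy for throughput: architectural changes since 1946 make the real performance gap far larger.)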

Historic black and white image of an early computer room featuring large machines with intricate wiring, a male technician working at one of the machines, and a female operator reading from a data sheet.

source

Written by (Roughly) Daily

February 13, 2026 at 1:00 am

“Tell me to what you pay attention and I will tell you who you are”*…

A man wearing a gas mask operates a device at a wooden table, with letters L, A, M, F, and E visible on the table. Equipment and hoses are connected to the device.
A test subject has his oxygen consumption measured while using Walter R. Miles’ Pursuitmeter, as pictured in the inventor’s 1921 article for the Journal of Experimental Psychology. Source.

Before the attention economy consumed our lives, “pursuit tests” devised by the US military coupled man to machine with the aim of assessing focus under pressure. D. Graham Burnett explores these devices for evaluating aviators, finding a pre-history of the laboratory research that has relentlessly worked to slice and dice the attentional powers of human beings…

We worry about our attention these days — nearly all of us. There is something. . . wrong. We cannot manage to do what we want to do with our eyes and minds — not for long, anyway. We keep coming back to the machines, to the screens, to the notifications, to the blinking cursor and the frictionless swipe that renews the feed.

An ethnographer from Mars, moving among us (would we even notice?), might have trouble understanding our complaint: “Trouble with their attention? They stare at small slabs of versicolor glass all day! Their attentive powers are. . . sublime!”

And that misunderstanding rather sharpens the point: we don’t have any problem at all with the forms of attention that involve remaining engaged with, and responsive to, machines. We are amazing at the click and tap of durational vigilance to this or that stimulus, presented at the business end of a complex device. Our uncanny and immersive cybernetic attention is a defining characteristic of the age. Our human attention — our ability to be with ourselves and with others, our ability to receive the world with our minds and senses, our ability to daydream, read a book uninterrupted, or watch a sunset — well, many of us are finding it increasingly difficult to remember what that might even mean.

This isn’t really an accident. Over the last century or so, a series of elaborate programs of laboratory research have worked to slice and dice the attentional powers of human beings. Their aim? To understand the operational capacities of those who would be asked to shoot down airplanes, monitor radar screens, and otherwise sit at the controls of large and expensive machines. Seated in front of countless instruments, experimental subjects were asked to listen and look, to track and trigger. Psychologists stood by with stopwatches, quantifying our cybernetic capacities, and seeking ways to extend them. For those of us who have come of age in the fluorescence of the “attention economy”, it is interesting to look back and try to catch glimpses of the way that the movement of human eyeballs came under precise scrutiny, the way that machine vigilance became a field of study. We know now that the mechanomorphic attention dissected in those laboratories is the machine attention that is relentlessly priced in our online lives — to deleterious effects.

You could say that this process began with the fascinating and now mostly forgotten tool known as the “pursuit test”. Part steampunk videogame, part laboratory snuff-flick, the pursuit test staged and restaged the integration of man and machine across the first decades of the twentieth century…

Fascinating– and timely: “Cybernetic Attention– All Watched over by Machines We Learned to Watch,” from @publicdomainrev.bsky.social. Eminently worth reading in full.

* José Ortega y Gasset

###

As we untangle engagement, we might send thoughtful birthday greetings to a man whose work influenced the endeavors described in the piece featured above, Hermann Ebbinghaus; he was born on this date in 1850. A psychologist, he pioneered the experimental study of memory and discovered the learning curve, the forgetting curve, and the spacing effect.

Black and white portrait of a man with a large beard, wearing round glasses and a formal suit, looking directly at the camera.

source

“Time flies like an arrow; fruit flies like a banana”*…

A historical timeline chart depicting significant events, figures, and locations from various ancient civilizations, including biblical references and ancient empires like Babylon, Egypt, and Greece.
Detail from Adams Synchronological Chart of Universal History, created by Sebastian C. Adams in 1881, a visual representation of world history spanning from 4004 BCE to 1881 CE (the David Rumsey Map Collection)

A companion of a sort to last Friday’s post: In the 19th century, the linear idea of time became dominant. As Emily Thomas explains, that has had profound implications for how we experience the world…

‘It’s natural,’ says the Stanford Encyclopedia of Philosophy, ‘to think that time can be represented by a line.’ We imagine the past stretching in a line behind us, the future stretching in an unseen line ahead. We ride an ever-moving arrow – the present. However, this picture of time is not natural. Its roots stretch only to the 18th century, yet this notion has now entrenched itself so deeply in Western thought that it’s difficult to imagine time as anything else. And this new representation of time has affected all kinds of things, from our understanding of history to time travel.

Let’s journey back to ancient Greece. Amid rolls of papyrus and purplish figs, philosophers like Plato looked up into the night. His creation myth, Timaeus, connected time with the movements of celestial bodies. The god ‘brought into being’ the sun, moon and other stars, for the ‘begetting of time’. They trace circles in the sky, creating days, months, years. The ‘wanderings’ of other, ‘bewilderingly numerous’ celestial bodies also make time. When all their wanderings are ‘completed together’, they achieve ‘consummation’ in a ‘perfect year’. At the end of this ‘Great Year’, all the heavenly bodies will have completed their cycles, returning to where they started. Taking millennia, this will complete one cycle of the universe. As ancient Greek philosophy spread through Europe, these ideas of time spread too. For instance, Greek and Roman Stoics connected time with their doctrine of ‘Eternal Recurrence’: the universe undergoes infinite cycles, ending and restarting in fire.

Such views of time are cyclical: time comprises a repeating cycle, as events occur, pass, and occur again. They echo processes in nature. Day and night. Summer to winter. As the historian Stephen Jay Gould explains in Time’s Arrow, Time’s Cycle (1987), within the West, cyclical conceptions dominated ancient thought. It’s even hinted at in the Bible. For example, Ecclesiastes proclaims: ‘What has been will be again … there is nothing new under the sun.’ Yet, Gould writes, the Bible also contains a linear conception of time: time comprises a one-way sequence of unrepeatable events. Take Biblical history: ‘God creates the earth once, instructs Noah to ride out a unique flood in a singular ark.’ Gould describes this linear understanding of history as an ‘important and distinctive’ contribution of Jewish thought. Biblical history helped power linear ideas of time.

Cyclical and linear conceptions of time thrived side by side for centuries, sometimes blurring into one another. After all, we live through natural, cyclical seasons and unrepeatable events – birth, first marriage, death. Importantly, medievals and early moderns didn’t literally see cyclical time as a circle, or linear time as a line. Yet in the 19th-century world of frock coats, petticoats and suet puddings, change was afoot. Gradually, the linear model of time gained ground, and thinkers literally began drawing time as a line…

[Thomas explores four key developments that fueled the shift: chronography (the development of timelines), Darwin and the emergence of the concept of evolution, chronophotography, and theories in math and physics of a “fourth dimension” (then explored by Einstein and Bergson, Mary Calkins and Victoria Welby, Bertrand Russell, H. G. Wells, and so many others)…]

… Today, conceiving of time as a line remains widespread. Timelines are everywhere: in the history of evolution, the history of video games, and the history of chocolate. There’s even a timeline of timelines. And the effects of this line of thought (pun intended) are still with us. Philosophers continue to debate the reality of past and future: just check out this bumper encyclopaedia article on ‘Presentism’, ‘the view that only present things exist’. Time-travel stories run rife. Back to the Future. Groundhog Day. The Time Traveler’s Wife. Historians have largely dropped Victorian faith in the progress of humanity, yet progress stories about particular areas remain. For example, take this timeline: it straightforwardly depicts technological progress over time. All these ideas are powered by the notion that time is a line. Were we to reshape our idea of time, perhaps these other ideas would also find themselves bent into new forms…

“The Shape of Time,” from @aeon.co.

* Anthony Oettinger and, separately, Susumu Kuno (though often misattributed to Groucho Marx)

###

As we wonder at Yeats’s widening gyre, we might send echoing birthday greetings to Charles Louis de Secondat, baron de La Brède et de Montesquieu; he was born on this date in 1689. Better known simply as Montesquieu, he was a French judge, historian, and political philosopher.

Montesquieu is the principal source of the theory of separation of powers, which is implemented (if not always observed) in many constitutions throughout the world. He is also known for doing more than any other author to secure the place of the word “despotism” in the political lexicon.  His anonymously published The Spirit of Law (De l’esprit des lois, 1748; first translated into English in 1750) was received well in both Great Britain and the American colonies, and influenced the Founding Fathers of the United States in drafting the U.S. Constitution.

Profile portrait of a man with wavy hair and a thoughtful expression, wearing a draped garment.

source