(Roughly) Daily

Posts Tagged ‘Psychology’

“The web of our life is of a mingled yarn”*…

In what does our personhood consist? From what/where does it come? João de Pina Cabral unpacks the seminal thinking of Lucien Lévy-Bruhl and the advances in cognitive science and developmental psychology that suggest that a person is not self-contained, but the outcome of a lifelong process of living with others…

It matters to understand what constitutes a person. After all, if there is one feature that distinguishes human society from other forms of sociality, it is that, at around one year of age, most human beings attain personhood: they learn to speak a language, develop object permanence – the understanding that things do not disappear when out of sight – and relate to others in consciously moral ways. Should all persons be accorded the same rights and duties by virtue of this condition? These are weighty questions that have occupied social scientists and philosophers since antiquity – particularly at moments such as the present, when war and imperial oppression once again raise their ugly heads.

Nevertheless, this question cannot be approached as a purely moral matter, for in order to determine what rights and duties may be attributed to persons, it is necessary to establish what persons are. This longstanding perplexity can now be addressed in increasingly sophisticated ways, following a century of sustained anthropological enquiry.

In September 1926, two of the most eminent anthropologists of the day met in person for the first time in New York. Both were Jewish and born in Europe, but one – Franz Boas – had become an American citizen and was a leading figure at Columbia University in New York, while the other – Lucien Lévy-Bruhl – was a professor in Paris. Both were highly learned, humanistically inclined and politically liberal; they respected one another, yet they did not seem to agree about the matter of the person.

Lévy-Bruhl had begun his career as a philosopher of ethics. His doctoral thesis focused on the legal concept of responsibility. He was struck by the fact that responsibility first arose between persons not as a law, but as an emotion – a deep-seated feeling. He argued that co-responsibility implies a bond between persons grounded less in reason than in the conditions of their emergence as persons. As children, individuals do not emerge out of nothing, but through deep engagement with prior persons – their caregivers. Thus, moral responsibility could not have arisen from adherence to norms or rules; rather, norms and rules emerged from the sense of responsibility that humans acquire as they become persons.

This led him to question how we become thinking beings. Do all humans, after all, think in the same way? He began reading the increasingly sophisticated ethnographic accounts emerging from Australia, Africa, Asia and South America, and was deeply influenced by an extended trip to China. He was an empirical realist, but also a personalist – that is, he accorded primacy to the person as such, refusing to subsume the individual into the group. In this respect, he was not persuaded by the arguments of the great sociologist Émile Durkheim concerning the exceptional status of the ‘sacred’ or the special powers of ‘collective consciousness’. Lévy-Bruhl soon arrived at a striking conclusion: in their everyday practices and especially in their ritual actions, the so-called ‘primitive’ peoples studied by ethnographers did not appear to conform to the norms of logic that had been regarded as universally valid since the time of Aristotle.

As a friend of his put it, Lévy-Bruhl discovered that such peoples are characterised by ‘a mystical mentality – full of the “supernatural in nature” and prelogic, of a different kind than ours’. Indeed, the basic principles of Aristotelian logic that continue to guide scientific thinking – underpinning modern technological development – seemed to be ignored by premodern peoples. Aristotle’s law of the excluded middle (p or not-p) did not appear to apply to their ‘mystical’ modes of thought, both because they tended to think in terms of concrete objects rather than abstractions, and because they exhibited what Lévy-Bruhl termed ‘participation’…

[de Pina Cabral traces the development of Lévy-Bruhl’s thought, starting with Plato’s concept of methexis; elaborates on Lévy-Bruhl’s ideas; and traces the advances in cognitive science and developmental psychology that support them…]

… the very experience of personhood – that is, the sense that I am myself – is not ‘individual’, since its emergence presupposes a prior condition of being-with others. The self arises from a sharing of being with others, from having been part of those who are close to us. One does not emerge as an addition to society, but rather as a partial separation from the participations that initially constituted one’s being.

As I become a person, I learn to relate to myself as an other; I transcend my immediate position in the world. Without this, I would not be able to speak a language, since the use of pronouns presupposes reflexive thought. Thus, as Lévy-Bruhl had already insisted in his notebooks, participation precedes the person. Intersubjectivity is not the meeting of already constituted subjects, but the ground from which subjectivity emerges. Participation, therefore, may be understood as the constitutive tension between the singular and the plural in the formation of the person in the world. In 1935, the great phenomenologist Edmund Husserl expressed this insight clearly in a letter to Lévy-Bruhl where he thanked him for his ideas on participation:

Saying ‘I’ and ‘we’, [persons] find themselves as members of families, associations, [socialities], as living ‘together’, exerting an influence on and suffering from their world – the world that has sense and reality for them, through their intentional life, their experiencing, thinking, [and] valuing.

In acting and being acted upon together in human company during the first year of life, children become ‘we’ at the same time as they become ‘I’, which means that persons are always, ambivalently, both ‘I’ and ‘we’. Participation and transcendence will remain sources of theoretical perplexity for as long as the ‘we’ is approached as a categorical matter – a question of ‘identity’ – rather than as the presence and activity of living persons in dynamic interaction with the world and with one another.

By contrast, once we accept that personhood is the outcome of a process – the encounter between the embodied capacities of human beings and the historically constituted world that surrounds them – participation loses its mystery. As Lévy-Bruhl put it in one of his final notes: ‘The impossibility for the individual to separate within himself what would be properly him and what he participates in in order to exist …’ Participation, therefore, is the ground upon which everyday social interaction is constituted. The ‘mystical’ (or transcendental) potential within each of us – that which animates the symbolic life of groups – is part of the very process through which each of us becomes ourselves…

How does one become a person? “We” before “I”: “To be is to participate,” from @aeon.co.

A (if not the) next question: how does personhood emerge when the formative interactions are increasingly mediated/attenuated by technology?

* Shakespeare, All’s Well That Ends Well, Act 4, Scene 3

###

As we get together, we might send behaviorist birthday greetings to a man whose work focused on how one might train the “persons” who emerge: Kenneth Spence; he was born on this date in 1907. A psychologist, he worked to construct a comprehensive theory of behavior to encompass conditioning and other simple forms of learning and behavior modification.

Spence attempted to establish a precise, mathematical formulation to describe the acquisition of learned behavior, trying to measure simple learned behaviors (e.g., salivating in anticipation of eating). Much of his research focused on classically conditioned, easily measured, eye-blinking behavior in relation to anxiety and other factors.
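(A hedged illustration of what such a formulation looked like, drawn from the Hull–Spence tradition – the notation varies across sources: Spence held that excitatory potential E, the tendency to emit a learned response, grows multiplicatively with habit strength H, but that drive D and incentive motivation K combine additively –

E = H × (D + K)

– in contrast with his collaborator Clark Hull, for whom E = H × D × K. On Spence’s version, a well-practiced response can still be energized by incentive alone even when drive is low; on Hull’s, zero drive means zero response.)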

One of the leading theorists of his time, Spence was the most cited psychologist in the 14 most influential psychology journals in the last six years of his life (1962–1967). A Review of General Psychology survey, published in 2002, ranked Spence as the 62nd most cited psychologist of the 20th century.

source

Written by (Roughly) Daily

May 6, 2026 at 1:00 am

“Unhappy is the land that needs a hero”*…

To the extent that evolutionary biologist and sociobiologist Robert Trivers has been in the news over the last decade, it has been for his entanglement with, and highly questionable defense of, Jeffrey Epstein. But as Lionel Page reminds us, two decades before that– well before he could have known the execrable “financier”– Trivers made hugely important contributions to his field…

Steve Stewart-Williams announced… that Robert Trivers passed away.

Trivers was one of the most—perhaps the most—influential evolutionary biologists of the 20th century. His work should be much more widely known in social and behavioural sciences, in particular in economics, as Trivers’ intellectual approach is very much in line with a game theoretic understanding of social interactions.

It is hard to overstate the importance of his work. Einstein famously published four groundbreaking papers in 1905, a year often referred to as his “Annus mirabilis”, during which he revolutionised physics. Trivers might be said to have had a “Quinquennium Mirabile” for the five years between 1971 and 1976, during which he produced a series of ideas that revolutionised evolutionary biology…

[Page unpacks four of those contributions: Reciprocal Altruism, Parental Investment, Parent-Offspring Conflict, and Self-Deception, each fascinating…]

… Trivers has been one of the most influential evolutionary biologists, and his papers are still worth reading today. His insights, published more than 50 years ago, are fascinating. They often align very well with economic theories of behaviour, and it is therefore regrettable that his ideas are not more well-known in economics, and in particular in behavioural economics.

A key feature of Trivers’ take across these contributions was to see that beneath the world of social interactions we observe, there are deep structures in terms of incentives that shape the game we play. Understanding these games and their structures helps us make sense of the seemingly endless complexity of human psychology and social dynamics. In several key contributions, Trivers helped lift the veil on the underlying logic of human behaviour…

From cooperation to conflict: the evolutionary grammar of social interactions: “The fascinating insights of Robert Trivers” from @lionelpage.bsky.social.

For more on Trivers and the controversies in his life (Epstein, but also the Black Panthers and a Rutgers set-to), all of which followed the burst of productivity described above, see here.

And for some thoughts on how one might reconcile appreciation for a scientist’s work with abhorrence of his later sins, see “Ghosts of Science Past Still Haunt Us. We Can Put Them to Rest.”

* Bertolt Brecht (through the mouth of Galileo, in The Life of Galileo)

###

As we linger over legacies, we might send material birthday greetings to a man who helped lay the groundwork for the field to which Trivers contributed, Ludwig Büchner; he was born on this date in 1824. A philosopher, physiologist, and physician, he became one of the leading exponents of 19th-century scientific materialism. Büchner was an early champion of Darwin’s theory of evolution, endorsing it within a decade of its publication, then did much to spread it by citing and building on it in his own books.

As far as we know, Büchner’s life was free of the scandal and conflict that plagued Trivers. He taught at the University of Tübingen and published dozens of books and papers. Later in his life he founded the “German Freethinkers League” (“Deutscher Freidenkerbund”) and served as a member of the second chamber of the Landstände of the Grand Duchy of Hesse as a representative of the German Free-minded Party from 1884 to 1890. He was the younger brother of Georg Büchner, a famous revolutionary playwright, and Luise Büchner, a women’s rights advocate; and he was the uncle of Ernst Büchner, inventor of the Büchner flask.

source

“What matters to you defines your mattering”*…

Further, in a fashion, to yesterday’s post, and via the always illuminating Delanceyplace.com, an explication of one of the most fundamental of all human needs: an excerpt from Rebecca Goldstein’s The Mattering Instinct, in which she draws on one of the fathers of both pragmatism and psychology, William James…

We speak both of what matters and of who matters. In fact, we speak a great deal about both.

Consider what matters. In recent decades, the phrase why X matters has become a template for dozens of book titles, including Why Beauty Matters, Why Emotions Matter, Why Family Matters, Why Genealogy Matters, Why Good Sex Matters, Why Jesus Matters, Why Knowledge Matters, Why Liberalism Matters, Why Money Matters, and Why Stories Matter. The profusion of titles, many of them mutually exclusive–after all, if Jesus matters, then how, too, can money?–testifies to our preoccupation with what matters.

And it’s not only the question of what matters but also of who matters that’s urgent. Consider: In 2012, seventeen-year-old Trayvon Martin, a Black American, was visiting, together with his father, his father’s fiancée at her townhouse in a gated community in Florida. While the grownups were out, Trayvon went to a nearby convenience store to get himself some snacks and, on his way back, was shot by a Neighborhood Watch volunteer, George Zimmerman, himself a member of a minority as a Hispanic American. Zimmerman found Trayvon suspicious-looking–the boy’s hoodie was prominently mentioned in news stories–and called the police, while he continued to trail the teenager, a course of action ultimately ending in the boy’s death. Trayvon hadn’t been armed. All that was found on him was a bag of Skittles and an iced tea.

After the acquittal of the shooter, the hashtag #BlackLivesMatter exploded onto social media. The three-word slogan soon went beyond mere hashtags and placards, following the deaths of two more unarmed Black Americans, Michael Brown and Eric Garner, to become a political movement. Those who opposed Black Lives Matter sometimes offered as rejoinders their own three-word slogans: ‘All Lives Matter,’ or ‘Blue Lives Matter,’ this last referring to police officers. Of course, ‘Black Lives Matter’ isn’t inconsistent with either ‘All Lives Matter’ or ‘Blue Lives Matter,’ since ‘Black Lives Matter’ isn’t synonymous with ‘Only Black Lives Matter.’ The power and the poignancy of the original slogan lay in its minimalism. But what the battle of the slogans made clear is the potency of the verb to matter, in this instance applied not to the question of what matters but rather who matters.

So what exactly does the verb to matter mean? Here is a quick working definition: To matter is to be deserving of attention. It’s the same whether we are speaking of what matters or who matters. The thing or the person that matters makes a claim on us; at the very least, a claim is made on our attention.

The claim of being deserving of attention may be based on consequences that would ensue from paying attention or not paying attention–as when we ask, say, does voting really matter? We’re asking whether voting makes a difference; and so whether it’s worth our while to pay the attention called for in voting. It’s still the question of being deserving of attention, but what decides the issue is the consequences. In other circumstances, claims of mattering–of being deserving of attention–are independent of considerations of consequences, as when we assert that Black lives matter or that all lives matter. Here it’s intrinsic mattering, having nothing to do with consequences. And what intrinsic mattering comes down to is being deserving of attention. To claim that Black lives matter, as all lives matter, is to make claims regarding the deservingness of attention.

This leaves us with two more terms to explicate: attention and deservingness.

Attention is a mental phenomenon studied by contemporary psychologists, cognitive scientists, and neuroscientists–in other words, it is a subject for the empirical sciences.

The best definition I know of the phenomenon was given by the philosopher and psychologist William James. Attention, he wrote, is ‘the taking possession by the mind, in clear and vivid form, of one out of what may seem several simultaneously possible objects or trains of thought. Focalization, concentration of consciousness, are of its essence. It implies withdrawal from some things in order to deal effectively with others and is a condition which has a real opposite in the confused, dazed, scatterbrained state which in French is called distraction, and Zerstreutheit in German.’

James implies that attention is something we do. ‘It is the taking possession by the mind.’ The world’s languages agree. In English we pay attention, while in other languages we give, lend, gift, dedicate, sacrifice, prepare, turn, attach, apply, infuse, and arouse our attention. The linguistic formations all imply that there is activity and agency in attention. His definition also makes clear how attention, as an activity, is to be distinguished from the broader notion of consciousness. After all, that confused, dazed, scatterbrained state is a state of consciousness, though the ‘real opposite’ of paying attention.

His definition also entails that attention is limited and selective: withdrawal from some things. Every act of attention is an act of exclusion. In paying attention to something, we are forced to ignore a multitude of other things. And he ties this limitedness and selectivity with attention’s usefulness: in order to deal effectively. Contemporary psychology agrees. Attention’s limitedness and selectivity is crucial to its usefulness and linked to the reason why organisms evolved attention in the first place: to pay attention to changeable things in the organism’s immediate environment that can help or hinder it, nourish or annihilate it. That unpleasant smell, for example, may very well signal toxicity. Note the presence of the word changeable. The function of attention is tied to what is variable, not just to what is relevant to fitness. Oxygen, our heartbeat, gravity, and many other things are vital to our survival, and our unconscious mental processes must take them into account. But they tend to be constant, so there is no need to allocate our limited window of attention to them, unless circumstances alarmingly change.

The agency entailed in the act of paying attention means that we have some control over what we do and don’t pay attention to. You may be unable to remain oblivious to the bad music blasting in your gym or the rank smell seeping into your kitchen–stimuli that are intense or that pop out of your surroundings. But you can decide to pay no attention to, say, gossip or popular culture, social media or your weight. You can decide that they simply don’t matter, which is to say that they’re not deserving of your attention. And this brings us to the second component of the English verb to matter–namely deservingness.

Deservingness introduces an entirely different level of consideration into our preoccupations with mattering. It’s a level that goes beyond the psychological, beyond the empirical altogether. Deservingness draws us into the nonempirical sphere of values and justifications, of oughts and ought-nots. This is the sphere that philosophers call normative, because it invokes norms of justification. The mattering instinct means that we are normative creatures down to our core. We think and act and shape our lives within the sphere of justifications. Instead of calling ourselves Homo sapiens, we might better have christened ourselves Homo justificans.

It’s the presence of deservingness in the concept of mattering that raises us up into an entirely different order of both complexity and perplexedness. The mattering instinct has us straining beyond the empirical for the normative knowledge that eludes us. We are carried over into the sphere of values and justifications without being equipped to see our way through. Here is the epistemic elusiveness that injects the unsubdued doubt–and hence unease–into the heart of what it is to pursue a human life.

We speak both of what matters and of who matters. And behind our preoccupations with both is the most urgent of all our mattering questions, which is voiced in the first person: Do I matter? This is the mother of our mattering questions. Ultimately, we want to know what matters because we desperately want our own lives to be driven by what matters. We want to know who matters because we desperately want to be numbered among the ones who matter.

Self-mattering–feeling ourselves overwhelmingly deserving of our own attention–is baked into our identity. The usefulness of attention, to which William James alluded, is its usefulness to ourselves. So it’s no wonder that the greater part of our attention is given over to ourselves, whether overtly or tacitly. Throughout the enormous complexity of how the mind works, our self-mattering is presumed. And yet, astonishing creatures that we are, we are able, by way of the capacity for self-reflection with which our brains come equipped, to step outside of our self-mattering, which is to step outside ourselves, to pose the mother of all mattering questions…

It’s the deservingness component that separates the mattering for which we long from such empirical psychological states as having confidence or self-esteem. You can go online right now, or schedule a visit to a psychologist, and take a test that measures your confidence or self-esteem. There will be a series of statements to which you respond with the degree of your agreement, such as: I feel that I am a person of worth, at least on an equal plane with others. I feel that I have a number of good qualities. All in all, I am inclined to feel that I’m a failure. The test may even provide a numerical score, similar to an IQ test. The Rosenberg Self-Esteem Scale, for example, which is one of the most widely used measures of self-esteem and from which I’ve taken the above statements, provides a numerical value from 0 to 30, with any score under 15 indicating low self-esteem. It was none other than William James who first formulated the concept of self-esteem, offering an equation as its definition.

But these assessments of how good you feel about yourself, often in relation to others, aren’t tests of whether you truly, objectively, existentially matter. To figure out that question, the mother of all mattering questions, you can’t take an empirical test. Your self-esteem score, whether high or low, may be grounded in self-delusion, and the mother question is a demand for the answer that lies on the other side of self-delusion. Do I truly and objectively matter? I know that I can’t help feeling that I do, but do I really?

When it comes to our own mattering, we are staunch realists. We don’t want feelings. We want the facts…
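An aside for the quantitatively minded: the equation James offered (in The Principles of Psychology, 1890) defines self-esteem as a ratio –

Self-esteem = Success / Pretensions

– so that, on James’s account, one can raise one’s self-esteem either by succeeding more or by pretending to (aspiring to) less. And here, for the curious, is a toy sketch in Python of how a Rosenberg-style score might be tallied, using the three statements Goldstein quotes as stand-in items. Assumptions to note: the real scale has ten items, the 0–3 rating per item is the common scoring convention, and the under-15 threshold follows the excerpt above.

# Toy sketch of Rosenberg-style scoring -- illustrative, not the official instrument.
# Each item is rated 0-3 (strongly disagree .. strongly agree); negatively
# worded items are reverse-scored; per the excerpt, a total under 15 on the
# full ten-item, 0-30 scale suggests low self-esteem.

ITEMS = [
    ("I feel that I am a person of worth, at least on an equal plane with others.", False),
    ("I feel that I have a number of good qualities.", False),
    ("All in all, I am inclined to feel that I'm a failure.", True),  # reverse-scored
]

def rosenberg_score(ratings):
    """Sum the ratings, flipping the negatively worded items."""
    total = 0
    for (_, reverse), rating in zip(ITEMS, ratings):
        if not 0 <= rating <= 3:
            raise ValueError("ratings run 0 (strongly disagree) to 3 (strongly agree)")
        total += (3 - rating) if reverse else rating
    return total

print(rosenberg_score([3, 2, 0]))  # -> 8 of a possible 9 on these three items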

Mattering

See also “Why We Need to Feel Like We Matter” (source of the image above)

* John Green

###

As we wonder about worth, we might spare a thought for a man who unquestionably mattered, Johann Wolfgang von Goethe; he died on this date in 1832. A poet, playwright, artist, biologist, theoretical physicist, and philosopher, he is probably best remembered these days for Faust. But by virtue of the breadth and depth of his work, he is considered “the master spirit of the German people,” and, after Napoleon, the leading figure of his age.
 

Portrait by Joseph Karl Stieler, 1828 (source)

“Technology challenges us to assert our human values, which means that first of all, we have to figure out what they are”*…

A hand is holding a glowing projection of interconnected dots and lines against a dark background, representing technology and innovation.

As we head into the weekend, some food for thought…

A decade ago, the world was, at once, both the seed of today and a very different place: In what was considered one of the biggest political upsets in American political history (and the fifth and most recent presidential election in which the winning candidate lost the popular vote), Donald Trump was elected to his first term. The U.K. chose Brexit. The stock market finished strong, with the Dow Jones, S&P 500, and Nasdaq reaching new highs. (In the 10 years that have followed, the Dow has risen about 150%; the S&P 500, roughly 400%; and the NASDAQ has roughly sextupled.)

It was a big year for pop culture, marked by Beyoncé’s Lemonade, the massive Pokémon Go craze, and the rise of Netflix with Stranger Things, the Rio Olympics, and the loss of icons like David Bowie and Prince.

It was also a big year in tech: Russian hacking and disinfo (especially on Facebook) was a huge story– as was Apple’s elimination of the headphone jack in the iPhone 7. Theranos collapsed; and Wells Fargo opened millions of accounts for customers without those customers’ permission (for which it was subsequently fined $3 billion). And Virtual Reality was everywhere (in the promises/offers from tech companies), but nowhere in the market. TikTok was launched in 2016, but hadn’t yet become the phenomenon (and avatar of algorithmically driven feeds) that it has since become. And in the course of 2016, artificial intelligence made the leap from “science fiction concept” to “almost meaningless buzzword” (though in fairness, 2016 was the year that Google DeepMind’s AlphaGo program triumphed against South Korean Go grandmaster Lee Sedol).

Back in 2016, the estimable Alan Jacobs was pondering the road ahead. In a piece for The New Atlantis, he coined and discussed a series of aphorisms relevant to the future as then he saw it. He begins…

Aphorisms are essentially an aristocratic genre of writing. The aphorist does not argue or explain, he asserts; and implicit in his assertion is a conviction that he is wiser or more intelligent than his readers.
– W. H. Auden and Louis Kronenberger, The Viking Book of Aphorisms

Author’s Note: I hope that the statement above is wrong, believing that certain adjustments can be made to the aphoristic procedure that will rescue the following collection from arrogance. The trick is to do this in a way that does not sacrifice the provocative character that makes the aphorism, at its best, such a powerful form of utterance.

Here I employ two strategies to enable me to walk this tightrope. The first is to characterize the aphorisms as “theses for disputation,” à la Martin Luther — that is, I invite response, especially response in the form of disagreement or correction. The second is to create a kind of textual conversation, both on the page and beyond it, by adding commentary (often in the form of quotation) that elucidates each thesis, perhaps even increases its provocativeness, but never descends into coarsely explanatory pedantry…

[There follows a series of provocations and discussions that feel as relevant– and important– today as they were a decade ago. He concludes…]

Precisely because of this mystery, we need to evaluate our technologies according to the criteria established by our need for “conviviality.”

I use the term with the particular meaning that Ivan Illich gives it in Tools for Conviviality [here]:

I intend it to mean autonomous and creative intercourse among persons, and the intercourse of persons with their environment; and this in contrast with the conditioned response of persons to the demands made upon them by others, and by a man-made environment. I consider conviviality to be individual freedom realized in personal interdependence and, as such, an intrinsic ethical value. I believe that, in any society, as conviviality is reduced below a certain level, no amount of industrial productivity can effectively satisfy the needs it creates among society’s members.

In my judgment, nothing is more needful in our present technological moment than the rehabilitation and exploration of Illich’s notion of conviviality, and the use of it, first, to apprehend the tools we habitually employ and, second, to alter or replace them. For the point of any truly valuable critique of technology is not merely to understand our tools but to change them — and us…

Eminently worth reading in full, as it’s still all too relevant: “Attending to Technology: Theses for Disputation,” from @ayjay.bsky.social.

Pair with a provocative piece from another fan of Illich, L. M. Sacasas (@lmsacasas.bsky.social): “Surviving the Show: Illich And The Case For An Askesis of Perception.”

[Image above: source]

* Sherry Turkle

###

As we think about tech, we might recall that it was on this date in 1946 that an ancestor of today’s social networks, streaming services, and AIs, the ENIAC (Electronic Numerical Integrator And Computer), was first demonstrated in operation. (It was announced to the public the following day.) The first general-purpose computer (Turing-complete, digital, and capable of being programmed and re-programmed to solve different problems), ENIAC was begun in 1943 as part of the U.S.’s war effort (as a classified military project known as “Project PX”); it was conceived and designed by John Mauchly and Presper Eckert of the University of Pennsylvania, where it was built. The finished machine, composed of 17,468 electronic vacuum tubes, 7,200 crystal diodes, 1,500 relays, 70,000 resistors, 10,000 capacitors, and around 5 million hand-soldered joints, weighed more than 27 tons and occupied a 30 x 50 foot room– in its time the largest single electronic apparatus in the world. ENIAC’s basic clock speed was 100,000 cycles per second (or hertz); today’s home computers have clock speeds of 3,500,000,000 cycles per second (3.5 GHz) or more– some 35,000 times faster.

Historic black and white image of an early computer room featuring large machines with intricate wiring, a male technician working at one of the machines, and a female operator reading from a data sheet.

source

Written by (Roughly) Daily

February 13, 2026 at 1:00 am

“Tell me to what you pay attention and I will tell you who you are”*…

A man wearing a gas mask operates a device at a wooden table, with letters L, A, M, F, and E visible on the table. Equipment and hoses are connected to the device.
A test subject has his oxygen consumption measured while using Walter R. Miles’ Pursuitmeter, as pictured in the inventor’s 1921 article for the Journal of Experimental Psychology. Source.

Before the attention economy consumed our lives, “pursuit tests” devised by the US military coupled man to machine with the aim of assessing focus under pressure. D. Graham Burnett explores these devices for evaluating aviators, finding a pre-history of the laboratory research that has relentlessly worked to slice and dice the attentional powers of human beings…

We worry about our attention these days — nearly all of us. There is something. . . wrong. We cannot manage to do what we want to do with our eyes and minds — not for long, anyway. We keep coming back to the machines, to the screens, to the notifications, to the blinking cursor and the frictionless swipe that renews the feed.

An ethnographer from Mars, moving among us (would we even notice?), might have trouble understanding our complaint: “Trouble with their attention? They stare at small slabs of versicolor glass all day! Their attentive powers are. . . sublime!”

And that misunderstanding rather sharpens the point: we don’t have any problem at all with the forms of attention that involve remaining engaged with, and responsive to, machines. We are amazing at the click and tap of durational vigilance to this or that stimulus, presented at the business end of a complex device. Our uncanny and immersive cybernetic attention is a defining characteristic of the age. Our human attention — our ability to be with ourselves and with others, our ability to receive the world with our minds and senses, our ability to daydream, read a book uninterrupted, or watch a sunset — well, many of us are finding it increasingly difficult to remember what that might even mean.

This isn’t really an accident. Over the last century or so, a series of elaborate programs of laboratory research have worked to slice and dice the attentional powers of human beings. Their aim? To understand the operational capacities of those who would be asked to shoot down airplanes, monitor radar screens, and otherwise sit at the controls of large and expensive machines. Seated in front of countless instruments, experimental subjects were asked to listen and look, to track and trigger. Psychologists stood by with stopwatches, quantifying our cybernetic capacities, and seeking ways to extend them. For those of us who have come of age in the fluorescence of the “attention economy”, it is interesting to look back and try to catch glimpses of the way that the movement of human eyeballs came under precise scrutiny, the way that machine vigilance became a field of study. We know now that the mechanomorphic attention dissected in those laboratories is the machine attention that is relentlessly priced in our online lives — to deleterious effects.

You could say that this process began with the fascinating and now mostly forgotten tool known as the “pursuit test”. Part steampunk videogame, part laboratory snuff-flick, the pursuit test staged and restaged the integration of man and machine across the first decades of the twentieth century…

Fascinating– and timely: “Cybernetic Attention– All Watched over by Machines We Learned to Watch,” from @publicdomainrev.bsky.social. Eminently worth reading in full.

* José Ortega y Gasset

###

As we untangle engagement, we might send thoughtful birthday greetings to a man whose work influenced the endeavors described in the piece featured above, Hermann Ebbinghaus; he was born on this date in 1850. A psychologist, he pioneered the experimental study of memory and discovered the learning curve, the forgetting curve, and the spacing effect.

Black and white portrait of a man with a large beard, wearing round glasses and a formal suit, looking directly at the camera.

source