“Tell me to what you pay attention and I will tell you who you are”*…

Before the attention economy consumed our lives, “pursuit tests” devised by the US military coupled man to machine with the aim of assessing focus under pressure. D. Graham Burnett explores these devices for evaluating aviators, finding a pre-history of the laboratory research that has relentlessly worked to slice and dice the attentional powers of human beings…
We worry about our attention these days — nearly all of us. There is something. . . wrong. We cannot manage to do what we want to do with our eyes and minds — not for long, anyway. We keep coming back to the machines, to the screens, to the notifications, to the blinking cursor and the frictionless swipe that renews the feed.
An ethnographer from Mars, moving among us (would we even notice?), might have trouble understanding our complaint: “Trouble with their attention? They stare at small slabs of versicolor glass all day! Their attentive powers are. . . sublime!”
And that misunderstanding rather sharpens the point: we don’t have any problem at all with the forms of attention that involve remaining engaged with, and responsive to, machines. We are amazing at the click and tap of durational vigilance to this or that stimulus, presented at the business end of a complex device. Our uncanny and immersive cybernetic attention is a defining characteristic of the age. Our human attention — our ability to be with ourselves and with others, our ability to receive the world with our minds and senses, our ability to daydream, read a book uninterrupted, or watch a sunset — well, many of us are finding it increasingly difficult to remember what that might even mean.
This isn’t really an accident. Over the last century or so, a series of elaborate programs of laboratory research have worked to slice and dice the attentional powers of human beings. Their aim? To understand the operational capacities of those who would be asked to shoot down airplanes, monitor radar screens, and otherwise sit at the controls of large and expensive machines. Seated in front of countless instruments, experimental subjects were asked to listen and look, to track and trigger. Psychologists stood by with stopwatches, quantifying our cybernetic capacities, and seeking ways to extend them. For those of us who have come of age in the fluorescence of the “attention economy”, it is interesting to look back and try to catch glimpses of the way that the movement of human eyeballs came under precise scrutiny, the way that machine vigilance became a field of study. We know now that the mechanomorphic attention dissected in those laboratories is the machine attention that is relentlessly priced in our online lives — to deleterious effects.
You could say that this process began with the fascinating and now mostly forgotten tool known as the “pursuit test”. Part steampunk videogame, part laboratory snuff-flick, the pursuit test staged and restaged the integration of man and machine across the first decades of the twentieth century…
Fascinating– and timely: “Cybernetic Attention– All Watched over by Machines We Learned to Watch,” from @publicdomainrev.bsky.social. Eminently worth reading in full.
* José Ortega y Gasset
###
As we untangle engagement, we might send thoughtful birthday greetings to a man whose work influenced the endeavors described in the piece featured above, Hermann Ebbinghaus; he was born on this date in 1850. A psychologist, he pioneered the experimental study of memory and discovered the learning curve, the forgetting curve, and the spacing effect.
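Ebbinghaus's forgetting curve is often summarized as an exponential decay of retention over time. A toy sketch of that idea (the stability parameter here is illustrative, not one of Ebbinghaus's fitted values):

```python
import math

# Toy exponential forgetting curve in the spirit of Ebbinghaus:
# retention R falls off with time t since learning, governed by a
# memory "stability" S (an illustrative value, not a fitted constant).
def retention(t_hours, stability=24.0):
    """Fraction of learned material retained t_hours after study."""
    return math.exp(-t_hours / stability)

for t in (0, 1, 24, 72):
    print(t, round(retention(t), 2))
```

Spaced repetition exploits exactly this shape: reviewing before retention decays too far resets the curve with a larger effective stability.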
“Nothing is poorer than a truth expressed as it was thought”*…
Many bemoan the experiential art that has taken over our galleries and screens, but is it, Róisín Lanigan asks, a populist fad or a way to make art more accessible?…
You don’t have to be a historian or a creative to notice it: art just isn’t what it used to be. Or at least, the act of experiencing art in public isn’t what it used to be. Whereas once we paid to go to galleries to silently view paint behind plexiglass, a new wave of curators and creators have decided that for art to be truly appreciated, we must be completely immersed in the audio, visual and experiential world it inhabits. From London’s Outernet to Yayoi Kusama’s Infinity Mirrors and Vegas’s controversial The Sphere, it’s never been clearer that we’re living in the midst of the immersive art boom.
Even if you’ve never been to one of these spaces – all immersive art exhibitions exist as ‘spaces’; ‘gallery’ is increasingly an archaic term – you’ll be cognisant of their existence. Because they don’t just live in the real world, they live on your screen too. Social media is awash with immersive exhibition selfies, with videos recommending the top ten best immersive art events to see for free in most big cities. The hashtag ‘immersive art’ clocks in at over 99 million views on TikTok and nearly half a million on Instagram (where in all fairness millennials are less amenable to the transformative power of the hashtag than the algorithmically attuned TikTok zoomers).
Outernet, based in Soho, sees around 1.5 million visitors to their ‘district’ each month, and say they’re on track to hit 6.8 million visitors this year alone, which would make them one of the most visited destinations in the UK, just one year after opening. The Smithsonian says that Kusama’s roving Infinity Mirrors exhibition has reached 330 million people across Twitter and other platforms. It’s not a leap to say we’re reaching, if we haven’t already, peak immersive art. But is that a bad thing? And if we’re already at the apex, where do we go from here?
It’s easy to take up the mantle that immersive-mania is, of course, wholly bad. The arguments for this always follow common throughlines: it’s common, it’s populist, it’s diluting the experience of what true art really is. In a recent scathing example, a Vulture review of Refik Anadol’s Unsupervised at MoMA called the immersive exhibition “a glorified lava lamp” and accused it of being nothing more than “crowd-pleasing, like-generating mediocrity”.
But who decides what that experience is, what it looks like? The art world, much like the fashion and film industries, has undergone much-needed transformations in recent years to get rid of those antiquated ideals and cultural gatekeepers. Slowly but surely, it’s become more diverse, younger, more experimental and in the process, more accessible. For every charge that immersive events are diluting our experience of artistry, there’s a counterpoint to be made that it’s opening that experience out to people who might not normally gravitate towards it…
[But, she reports, there’s a snake in the garden…]
There is, though, for all the accessibility that immersive exhibitions offer, something antithetical to the experience of being moved by a piece of art when in the back of our minds we’re thinking about how many likes we might get for it on social media. Immersive events which actively encourage selfies and photo opportunities risk detracting from the art itself; a depressing natural end-point to queues to take photos in front of the Mona Lisa and cameras being banned from Basquiat’s recent exhibition in London. Although cameras could never conceivably be banned from the grid-ready world of immersive art, it’s a fine line to tread: post too much and you leave your exhibition open to an ‘Instagram vs Reality’ takedown, or reveal spoilers. In the case of Sphere, organisers briefly considered asking guests to put stickers over their phone cameras before realising that their footage is as much promotion as it is a leak…
… There are so many immersive pop-ups that even the gallerists and producers themselves are getting sick of it. Lizzie Pocock tells me almost every brief she’s received for the past five years has used the word “immersive”, a term she now calls “overused”. “I don’t want to sort of be disrespectful, but so many things that get called immersive, you still sort of just go and watch,” she says. “You don’t feel like you’re in them, or you’re affected by them. It’s almost a bit of a lazy word, a buzzword, isn’t it? It’s like, let’s do something that’s immersive. It’s perhaps an excuse to not really delve into sort of the deeper experience and the deeper reason for why you’re putting it live.”
If the people behind the immersive shows are getting sick of them, then perhaps we finally have reached peak immersive. Now we just have to wait for audiences to catch up, for the algorithm to get bored, and for the art world to determine what their next lucrative buzzword will be. Personally, my money’s on AI…
“Have we reached peak immersive art?,” from @rosielanners in @itsnicethat.
See also: “The Rise of ‘Immersive’ Art” and “Ready to plunge in? The rise and rise of immersive art.”
* “Nothing is poorer than a truth expressed as it was thought. Committed to writing in such cases, it is not even a bad photograph. Truth wants to be startled abruptly, at one stroke, from her self-immersion, whether by uproar, music or cries for help.” – Walter Benjamin
###
As we dive in, we might spare a thought for Michelangelo di Lodovico Buonarroti Simoni; he died on this date in 1564. A sculptor, painter, architect, poet, and engineer in the High Renaissance, Michelangelo was considered one of the greatest artists of his time. And given his profound influence on the development of Western art, he has subsequently been considered one of the greatest artists of all time. Indeed, he is widely held to be (with Leonardo da Vinci) the archetypal Renaissance man.
Further to the item above, we might also note that, via his painting on the ceiling of the Sistine Chapel in Rome, he was a pioneer of immersive art.

“I am sorry I have not learned to play at cards. It is very useful in life: it generates kindness and consolidates society.”*…
Day after tomorrow– this Wednesday, the 27th– Sotheby’s will be auctioning the late, great Ricky Jay’s remarkable collection of magic publications and artifacts…
… You have the rare opportunity to get your hands on a complete guide to the base practices of highwaymen, sharpers, swindlers, money-droppers, duffers, setters, mock-auctions, quacks, bawds, jilts, etc. in eighteenth-century London.
That text is part of The Ricky Jay Collection, perhaps the world’s greatest assemblage of books on magic, deception, and trickery. As detailed in this enjoyable New York Times report, the Sotheby’s sale is a cornucopia of oddities from the late conjurer.
What’s really for sale — beyond the Houdini posters, guides to card tricks, and beautiful landscapes painted by armless entertainers — is the source material for Ricky Jay’s storytelling.
Jay (1946-2018) was, by all accounts, one of the world’s greatest practitioners of legerdemain, a word that literally translates as light of hand [see here for its amusing etymology]. In other words, he did card tricks. But not just any card tricks: His 1977 book Cards As Weapons (available for free here!) begins with a letter to the Secretary of Defense explaining just how valuable his skills might be:
“Drawing on techniques developed hundreds of years ago by ‘ninja’ assassins, I have developed my own system of self-defence based solely on a pack of cards,” he wrote. “I believe I have discovered a viable method of reducing the defence budget while keeping a few steps ahead of the Russkies.”
And Jay could indeed pierce the skin of a watermelon with a playing card from across a room. But when you came right down to what he did with his 52 assistants, the man was famous for moving pieces of waxed paper around on a table. The gulf between collating stationery and “the theatrical representation of the defiance of natural law” was filled by his deep knowledge and ready wit.
One of his signature tricks was The Four Queens, in which the waxed rectangles with the Qs in their corners are blended into the pack and need to be reunited. Or as Jay framed it, “I have taken advantage of these tenderly nurtured and unsophisticated young ladies by placing them in positions extremely galling to their aristocratic sensibilities.”
You can really see the storytelling taking shape here in his Sword of Vengeance trick. What do shogun assassins have to do with cards? That’s exactly what you forget to ask until it’s too late:
Jay’s ability to unspool a story was clearly infectious, as his profilers couldn’t resist taking flights of erudition.
“He’s like someone carving scrimshaw while surrounded by Macy’s Thanksgiving Day dirigibles,” wrote Tom Carson in Grantland.
“His patter was voluble, embroidered with orotund, baroque locutions; he would describe the watermelon rind, for instance, as the ‘thick pachydermatous outer melon layer’,” wrote Mark Singer in The New Yorker.
To include him in the pantheon of Great Wits is to recognize why he amassed The Ricky Jay Collection and what he learned from it. The shaggy dog story, as previously detailed in GWQ #101, is a psychological non-sequitur: You follow it at great length to eventually learn it goes nowhere. But in Ricky Jay’s dexterous hands, the story was an ideal way to distract you from his dexterous hands. His words were how he could really transport the audience into a world of wonders. It’s as if he harnessed the shaggy dogs and mushed them through the Iditarod…
The wit that powered the tricks: “Ricky Jay’s sleight of tongue,” from Benjamin Errett (@benjaminerrett).
* Samuel Johnson
###
As we riffle and cut, we might send accomplished birthday greetings to Marion Eileen Ross; she was born on this date in 1928. An actress with a long history in film (e.g., The Glenn Miller Story, Sabrina, Lust for Life, Teacher’s Pet, Some Came Running, and Operation Petticoat), she is best remembered for her role as Marion Cunningham on the sitcom Happy Days, on which she starred from 1974 to 1984 and for which she received two Primetime Emmy Award nominations. (That said, your correspondent will always remember her for her remarkable performances as Grandma SquarePants in SpongeBob SquarePants.)
“Every time you learn you can do something, you can go a little bit faster next time”*…

Joey Chestnut set a new world record by eating 75 hotdogs in 10 minutes at Nathan’s Famous Hot Dog Eating Contest on July 4
… at least, up to a point. Readers will know that (Roughly) Daily has checked in on the competitive eating circuit before (e.g., here), with special attention to the event hosted by the iconic Nathan’s. So imagine your correspondent’s surprise to learn that the era of dramatic new records year after year might be coming to a close…
The four-minute mile and the two-hour marathon were once believed impossible: now a new gauntlet has been thrown down for the world of elite competition. A scientific analysis suggests competitive eaters have come within nine hotdogs of the limits of human performance.
The theoretical ceiling has been set at 84 hotdogs in 10 minutes. The current world record, set by Joey “Jaws” Chestnut earlier this month, stands at 75.
James Smoliga, a sports medicine specialist at High Point University in North Carolina who authored the research, described 84 hotdogs as “the maximum possible limit for a Usain Bolt-type performance”.
The analysis is based on 39 years of historical data from Nathan’s Famous Hot Dog Eating Contest, an annual spectacle of gluttony held on Coney Island, New York, combined with the latest sports science theory, which uses mathematical modelling to project trends in performance.
Hotdog composition and size have, reportedly, remained unchanged at Nathan’s Famous in the fast food company’s 104-year history, allowing for valid comparison between competitors across years.
Improvement curves in elite sports ranging from sprinting to pole vaulting tend to follow a so-called sigmoidal curve, featuring an initial slow and steady rise, followed by an era of rapid improvement and finally a levelling off. “Hotdog eating has definitely reached that second plateau,” said Smoliga…
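The sigmoidal improvement curve Smoliga describes can be sketched with a toy logistic model. The parameters below are illustrative choices, not Smoliga's fitted values: the ceiling is set to the paper's reported 84-hotdog limit, with a floor, midpoint year, and steepness picked by eye to echo the Nathan's record trajectory.

```python
import math

# Toy logistic ("sigmoidal") improvement curve of the kind used to model
# plateauing athletic records: slow early gains, a rapid-improvement era,
# then a levelling-off toward a theoretical ceiling.
# Illustrative parameters (not fitted to the actual contest data):
#   floor = 10 hotdogs (typical early winning totals)
#   ceiling = 84 hotdogs (the paper's reported human limit)
#   x0 = 2010 (midpoint year), k = 0.2 (steepness)
def record_curve(year, floor=10.0, ceiling=84.0, x0=2010.0, k=0.2):
    """Projected winning total (hotdogs in 10 minutes) for a given year."""
    return floor + (ceiling - floor) / (1.0 + math.exp(-k * (year - x0)))

for year in (1985, 2000, 2010, 2020, 2050):
    print(year, round(record_curve(year), 1))
```

On this toy curve the projected totals climb steeply through the 2000s and then flatten just under the 84-hotdog ceiling, which is the "second plateau" Smoliga refers to.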
The limit of progress? The end of history? “Competitive hotdog eaters nearing limit of human performance.” See also “Scientists Have Finally Calculated How Many Hot Dogs a Person Can Eat at Once.”
* Joey Chestnut, competitive eating champion [pictured above]
###
As we chow down, we might recall that it was on this date in 1762 that Catherine II– better known as Catherine the Great– became the Empress of Russia after the murder of her husband (in a coup that she’d helped arrange). While her personal habits (largely her love life) tend to dominate popular memories of her, scholars note that her reign (through 1796) was a “Golden Age,” during which she revitalized Russia, which grew larger and stronger, and became one of the great powers of Europe and Asia.
Catherine enthusiastically supported the ideals of the Enlightenment; and as a patron of the arts, presided over the age of the Russian Enlightenment, including the establishment of the Smolny Institute for Noble Maidens, the first state-financed higher education institution for women in Europe.





