“That’s the artist’s job, really: continually setting yourself free, and giving yourself new options and new ways of thinking about things”*…
Further, in a fashion, to last week’s post on literacy (and post-literacy), Nathan Gardels alerts us to a conversation between Ken Liu and Nils Gilman, in which Liu suggests that, in a way analogous to the camera’s ability to capture motion (and thus transform storytelling), AI is emerging as a new artistic medium for capturing subjective experience…
For the celebrated novelist Ken Liu, whose works include “The Paper Menagerie” and Chinese-to-English translation of “The Three-Body Problem,” science fiction is a way to plumb the anxieties, hopes and abiding myths of the collective unconscious.
In this pursuit, he argues in a Futurology podcast, AI should not be regarded as a threat to the distinctive human capacity to organize our reality or imagine alternative worlds through storytelling. On the contrary, the technology should be seen as an entirely new way to access that elusive realm beneath the surface and deepen our self-knowledge.
As a window into the interiority of others, and indeed, of ourselves, Liu believes the communal mirror of Large Language Models opens the horizons of how we experience and situate our presence in the world.
“It’s fascinating to me to think about AI as a potential new artistic medium in the same way that the camera was a new artistic medium,” he muses. What the roving aperture enabled was the cinematic art form of capturing motion, “so you can splice movement around … and can break all kinds of rules about narrative art that used to be true.
“In the dramatic arts, it was just assumed that because you had to perform in front of an audience on the stage, that you had to follow certain unities to make your story comprehensible. The unity of action, of place, of time. You can’t just randomly jump around, or the audience wouldn’t be able to follow you.
“But with this motion-capturing machine, you can in fact do that. That’s why an actual movie is very different from a play.
“You can do the reaction shots, you can do the montages, you can do the cuts, you can do the swipes, you can do all sorts of things in the language of cinema.
“You can put audiences in perspectives that they normally can never be in. So it’s such a transformation of the understanding of presence, of how a subject can be present in a dramatic narrative story.”
He continues: “Rather than thinking about AI as a cheap way to replace filmmakers, to replace writers, to replace artists, think of [it] as a new kind of machine that captures something and plays back something. What is the thing that it captures and plays back? The content of thought, or subjectivity.”
The ancient Greeks called the content, or object of a person’s thought, “noema,” which is why this publication bears that name.
Liu thus invents the term “Noematograph” as analogous to “the cinematograph not for motion, but for thought … AI is really a subjectivity capturing machine, because by being trained on the products of human thinking, it has captured the subjectivities, the consciousnesses, that were involved in the creation of those things.”
Liu sees value in what some regard as the worst qualities of generative AI.
“This is a machine that allows people to play with subjectivities and to craft their own fictions, to engage in their own narrative self-construction in the process of working with an AI,” he observes. “The fact that AI is sycophantic and shapeable by you is the point. It’s not another human being. It’s a simulation. It’s a construction. It’s a fictional thing.
You can ask the AI to explain, to interpret. You can role-play with AI. You can explore a world that you construct together.
You can also share these things with other humans. One of the great, fun trends on the internet involving using AI, in fact, is about people crafting their own versions of prompts with models and then sharing the results with other humans.
And then a large group, a large community, comes together to collaboratively play with AI. So I think it’s the playfulness, it’s that interactivity, that I think is going to be really, really determinative of the future of AI as an art form.”
So, what will the product of this new art form look like?
“As a medium for art, what will come out of it won’t look anything like movies or novels … They’re going to be much more like conversations with friends. They’re going to be more like a meal you share with people. They are much more ephemeral in the moment. They’re about the participation. They’re about the consumer being also the creator.
They’re much more personalized. They’re about you looking into the strange mirror and sort of examining your own subjectivity.”
Much of what Liu posits echoes the views of the philosopher of technology, Tobias Rees, in a previous conversation with Noema.
As Rees describes it, “AI has much more information available than we do, and it can access and work through this information faster than we can. It also can discover logical structures in data — patterns — where we see nothing.
AI can literally give us access to spaces that we, on our own, qua human, cannot discover and cannot access.”
He goes on: “Imagine an AI model … that has access to all your data. Your emails, your messages, your documents, your voice memos, your photos, your songs, etc.
Such an AI system can make me visible to myself … it literally can lift me above me. It can show me myself from outside of myself, show me the patterns of thoughts and behaviors that have come to define me. It can help me understand these patterns, and it can discuss with me whether they are constraining me, and if so, then how. What is more, it can help me work on those patterns and, where appropriate, enable me to break from them and be set free.”
Philosophically put, says Rees, invoking the meaning of “noema” as Liu does, “AI can help me transform myself into an ‘object of thought’ to which I can relate and on which I can work.
“The work of the self on the self has formed the core of what Greek philosophers called meletē and Roman philosophers meditatio. And the kind of AI system I evoke here would be a philosopher’s dream. It could make us humans visible to ourselves from outside of us.”
Liu’s insight as a writer of science fiction realism is to see what Rees describes in the social context of interactive connectivity.
The arrival of new technologies is always disruptive to familiar ways of seeing that were cultivated from within established capacities. Letting go of those comforting narratives that guide our inner world is existentially disorienting. It is here that art’s vocation comes into play as the medium that helps move the human condition along. To see technology as an art form, as Liu does, is to capture the epochal moment of transformation that we are presently living through…
Is AI birthing a new art form? “From Cinema To The Noematograph,” @kyliu99.bsky.social and @nilsgilman.bsky.social in @futurologypod.bsky.social.
See/hear the full conversation:
See also: “O brave new world, that has such people in ’t!”
* Miranda July
###
As we observe, with William Gibson, that the street finds its own uses for things, we might recall that it was on this date in 1959 that perhaps the pinnacle of cinema’s ability to capture motion was released: the most famous of the six films of Ben-Hur, “the Charlton Heston version.”
At the time, Ben-Hur had the largest budget ($15.175 million), the largest sets, a wardrobe staff of 100, over 200 artists, about 200 camels, 2,500 horses, and about 10,000 extras.
Filming began on May 18, 1958, and didn’t wrap until January 7, 1959. The film crew worked 12 to 14 hours a day, six days a week.
The chariot race lasts for nine minutes in the finished film, and Miklós Rózsa’s score was, at the time, the longest ever composed for a film.
– source
“Don’t think, but look!”*…
The scene is London; the year, 1941. Ludwig Wittgenstein, likely the greatest philosopher of the twentieth century, has taken a hiatus from his Cambridge professorship to do “war work” in a menial position at Guy’s Hospital. By the time he arrives there, in September, the worst of the Blitz is over, but there’s no way of knowing that—the bombing could begin again any night. Wittgenstein serves as a dispensary porter, meaning he pushes a big cart from ward to ward, delivering medicine to patients. He’s 52 years old, small and thin, not to say frail. He writes in a letter that sometimes after work he can “hardly move.”
To John Ryle, brother of Oxford philosopher Gilbert Ryle, Wittgenstein explains his reason for volunteering in London: “I feel I will die slowly if I stay there [in Cambridge]. I would rather take the chance of dying quickly.”
Wittgenstein’s time at Guy’s Hospital is an especially lonely period in a lonely life. Socially awkward in the extreme, he does not endear himself to his coworkers. Although it soon gets out, he initially hopes to conceal that he’s a professor in regular life, hating the prospect of being treated differently. But he is different. His attempts to hide in plain sight must strike everyone as yet another eccentricity.
Nevertheless, he makes at least one friend at the hospital, a fellow staffer named Roy Fouracre. After some time, Fouracre is permitted to visit Wittgenstein in his room, a rare privilege with the reclusive philosopher. Crossing the threshold into Wittgenstein’s private quarters, Fouracre must expect to find books everywhere, hefty, awe-inspiring tomes by Aristotle and Kant and the like. Nothing of the sort. The only reading material in evidence is “neat piles of detective magazines.”
Those magazines would have been American detective pulps, the kind that chronicled the adventures of Philip Marlowe, Mike Hammer, Sam Spade and other hardboiled heroes. During the last two decades of his life, Wittgenstein read such fiction compulsively. But what drew him to detective stories, and to American hardboiled ones in particular? How did a man engaged in a fundamental reform of philosophy—no less than an overhaul of how we think and talk about the world—develop such a passion for pulps?…
How pulp magazines inspired Wittgenstein’s investigations of the mysteries of language: “The Philosopher of the Detectives: Ludwig Wittgenstein’s enduring passion for hardboiled fiction.”
For more on Wittgenstein’s thought, see this Stanford Encyclopedia of Philosophy article; for more on his life, this engaging biography.
* Ludwig Wittgenstein
###
As we “only describe, don’t explain,” we might spare a thought for Henri-Louis Bergson; he died on this date in 1941. A philosopher especially influential in the first half of the 20th century, Bergson convinced many of the primacy of immediate experience and intuition over rationalism and science for the understanding of reality… many, but not Wittgenstein (nor Russell, Moore, nor Santayana), who thought that he willfully misunderstood the scientific method in order to justify his “projection of subjectivity onto the physical world.” Still, in 1927 Bergson won the Nobel Prize in Literature; and in 1930, he received France’s highest honor, the Grand-Croix de la Légion d’honneur.
Bergson’s influence waned mightily later in the century. To the extent that there’s been a bit of a resurgence of interest, it’s largely the result, in philosophical circles, of Gilles Deleuze’s appropriation of Bergson’s concept of “multiplicity” and his treatment of duration, which Deleuze used in his critique of Hegel’s dialectic, and in the religious and spiritualist studies communities, of Bergson’s seeming embrace of the concept of an overriding/underlying consciousness in which humans participate.

“I don’t think necessity is the mother of invention. Invention, in my opinion, arises directly from idleness, possibly also from laziness – to save oneself trouble”*…

Like all other animals, our species evolved by gradual processes of natural selection that equipped us to survive and reproduce within a certain environmental niche. Unlike other animals, however, our species managed to escape its inherited biological role and take control of its own destiny. It began to innovate, actively reshaping its way of life, its environment and, eventually, the planet itself. How did we do it? What set our species, Homo sapiens, apart from the rest?
Searching for just one event, a decisive change in culture or brain structure, would probably be a mistake. For more than 1.5 million years, archaic humans (earlier Homo species, such as Homo erectus) had been slowly diverging from the other great apes, developing a way of life marked by increased collaboration. They made simple stone tools, hunted together, might have cooked their food, and probably engaged in communal parenting.
Still, their lifestyle remained largely static over vast periods of time, with few, if any, signs of artistic activity or technical innovation. Things started to change only in the past 300,000 years, with the emergence of our own species and our cousins the Neanderthals, and even then the pace of change didn’t quicken much until 40,000–60,000 years ago.
What caused our species to break out of the pattern set by archaic humans? Again, there were probably many factors. But from my perspective as someone who studies the human mind, one development stands out as of special importance. There is a mental ability we possess today that must have emerged at some point in our history, and whose emergence would have vastly enhanced our ancestors’ creative powers.
The ability I mean is that of hypothetical thinking – the ability to detach one’s mind from the here and now, and consciously think about other possibilities. This is the key to sustained innovation and creativity, and to the development of art, science and technology. Archaic humans, in all probability, didn’t possess it. The static nature of their lifestyle suggests that they lived in the present, their attention locked on to the world, and their behaviour driven by habit and environmental stimuli. In the course of their daily activities, they might accidentally hit on a better way of doing something, and so gradually acquire new habits and skills, but they didn’t actively think up innovations for themselves…
The story at “Our greatest invention was the invention of invention itself.”
* Agatha Christie (who would surely have agreed that invention is also, sometimes, aimed at explaining ourselves to our selves… and sometimes simply at delivering delight)
###
As we contemplate creativity, we might recall that it was on this date in 1925 that the trial of John T. Scopes in The State of Tennessee v. John Thomas Scopes (aka “the Scopes Monkey Trial”) began.
The State of Tennessee had responded to the urging of William Bell Riley, head of the World’s Christian Fundamentals Association, by passing a law prohibiting the teaching of evolution– the Butler Act. In response, the American Civil Liberties Union offered to defend anyone accused of violating the Act. George Rappleyea, who managed several local mines, convinced a group of businessmen in Dayton, Tennessee, a town of 1,756, that the controversy of such a trial would give Dayton some much-needed publicity. With their agreement, he called in his friend, the 24-year-old Scopes, who taught high school biology in the local school– and who agreed to be the test case.
The rest is celebrity-filled history, and star-studded drama.


