(Roughly) Daily


“In comics at their best, words and pictures are like partners in a dance, and each one takes turns leading”*…

Frank Bellew, “The ‘Fourth’ and the Pig.” Nick Nax for All Creation, July 1856 (source)

In his new book, Lost Literacies: Experiments in the Nineteenth-Century US Comic Strip, literary historian Alex Beringer demonstrates that the birth of the printed comic genre long preceded the Sunday Funny Pages. He elaborates in conversation with Tim Brinkhof, who introduces the colloquy…

Most people date the birth of the “modern” American comic strip to the introduction of the Funny Pages in the late nineteenth century. Alex Beringer is not most people.

A literary historian and professor of English at the University of Montevallo, Beringer dates the history of comics earlier, to roughly the mid-1800s, a period of prolific and uninhibited experimentation. He came to this understanding by piecing together the medium’s fractured archaeological record, digging through myriad online resources and archives. In the middle of the nineteenth century, New York-based artists followed the lead of their French and Swiss colleagues, particularly Rodolphe Töpffer, the “Father of the Comic Strip,” exchanging single-image political cartoons and caricatures for multi-panel sequences that, many believe, enabled them for the first time to play around with characterization, worldbuilding, and—well—storytelling.

Coming decades before the standardization of speech bubbles and panel borders, these early American comics seem to have little in common with their modern, more streamlined counterparts: they featured sudden, purposefully jarring jump cuts reminiscent of the yet-to-be-invented film montage, or musical notes in place of text. One comic artist tells a story through shadows behind the curtains of a window; another, with hieroglyphs the reader must decipher with the help of a legend.

“The audience for this first wave of US comic strips was strikingly sophisticated in its reception of this material,” Beringer writes in Lost Literacies: Experiments in the Nineteenth-Century US Comic Strip, which chronicles this oft-forgotten renaissance. Out from the Ohio State University Press, the book is one of hundreds of titles included in JSTOR’s Path to Open program, making scholarly books accessible online to wide audiences (read chapter four here, free of charge).

“The sense of flux—the idea that the visual language could turn on a dime—was often precisely the appeal,” Beringer observes.

Foretelling the philosopher Martin Heidegger’s assertion that drawing is in itself a “form of knowing,” early comic strip artists and their consumers treated the medium as a philosophical exercise; Beringer quotes the observation by media scholars Hilary Chute and Patrick Jagoda that comics “enable an intense focus on how complexly woven stories unfold across time and space and, particularly, how these involve the reader…to generate meaning through interacting with, or themselves shaping, spatiotemporal form.”

While some early American artists blatantly plagiarized illustrations and formats that originated in France and Switzerland, others used them as a springboard, giving European drawings a decidedly American twist. For example, where Töpffer’s character Monsieur Vieux Bois (“Mr. Oldbuck”) satirized the European bourgeoisie, comics featuring his Yankee doppelganger, Jeremiah Oldpot (artist unknown), a New York tin merchant who leaves his family to prospect gold in California, often hinge on what Beringer defines as the contradiction between his “romantic view of himself as a rugged frontiersman and his attachment to consumer goods.”

Beringer discusses this and other critical facets of this period in comics history…

Read on for their fascinating exchange: “Lost Literacies Strips Down the Dawn of Comics,” from @jstordaily.bsky.social.

* Scott McCloud, in his wonderful Understanding Comics: The Invisible Art

###

As we tell and show, we might ponder where all of this has led, recalling that it was on this date in 2007 that the then-latest entry in a comic-born franchise dropped: TMNT, the first fully computer-animated entry in the Teenage Mutant Ninja Turtles film series, was released. The film is set after the final defeat of the Turtles’ arch-enemy, the Shredder; the four Turtles — Leonardo, Raphael, Donatello, and Michelangelo (voiced respectively by James Arnold Taylor, Nolan North, Mitchell Whitfield, and Mikey Kelley) — having grown apart, reunite and overcome their faults to save the world from evil ancient creatures. It also features the voices of Chris Evans, Sarah Michelle Gellar, Mako, Kevin Smith, Patrick Stewart, and Ziyi Zhang, with narration by Laurence Fishburne.

TMNT ranked number one at the box office on its opening weekend, beating 300 (the top film of the previous two weeks), The Last Mimzy, Shooter, Pride, The Hills Have Eyes 2, and Reign Over Me, grossing $25.45 million over the weekend of March 23–25, 2007. That said, the film grossed (only) $95.8 million worldwide, including $54 million domestically, during its 91-day run in 3,120 North American theaters… as the Rotten Tomatoes consensus read: “TMNT’s art direction is splendid, but the plot is non-existent and the dialogue lacks the irony and goofy wit of the earlier Ninja Turtles movies.”

source

Written by (Roughly) Daily

March 23, 2026 at 1:00 am

“The worst part about having a mental illness is people expect you to behave as if you don’t”*…

Trends across all causes of disease and disability show substantial declines in infectious diseases, malnutrition, cardiovascular diseases, and several cancers. But even as we make strides in physical health, mental health challenges are on the rise: disability-adjusted life years (DALYs) lost to mental health disorders and alcohol have increased sharply over the last few decades, especially among people aged 25 to 74.

The WHO found that the two most common mental disorders, anxiety and depression, cost the global economy $1 trillion in 2010. Lost output over the same period attributed to mental, neurological, and substance abuse disorders – which often intersect – was estimated at between $2.5 and $8.5 trillion, a figure expected to double by 2030.

A report from the Aspen Institute and Dalberg explores the global rise of mental illness through economics, lived experiences, and expert insights…

According to the World Health Organization (WHO), 450 million people suffer from some form of mental illness over the course of their lives. So it’s no surprise that many of us have experienced, or know someone who has experienced, severe struggles with mental health. This is a full-blown crisis, exacerbated by a lack of infrastructure, a lack of funding, and a lack of health equity – despite the fact that mental health issues are the leading cause of disability globally. Also according to the WHO, mental health conditions are the primary cause of suicide, and suicide is the second leading cause of death for people ages 15 to 29. This is a crisis of our time.

In this report, we offer a snapshot of both the magnitude and the scope of the mental health crisis facing humanity. In addition to briefly framing the issues, we share summaries of dozens of interviews we held with “expert practitioners” working in both the public and private sectors and with individuals with a “lived experience” of mental health struggles.

In the course of our work, we looked for recurring themes that could promote a dialogue about seeking sustainable, scalable solutions to the crisis. Among those themes are the challenges of building an infrastructure for access to quality mental healthcare, the continued lack of parity between the provision of services for mental health versus physical health, and the pervasiveness of stigma associated with diseases of the mind.

Further, although most of us do not think of mental health as related to investing, and if we do, we might find the notion distasteful, there are indeed a growing number of developing technologies and treatment modalities that hold promise for expanding access to mental health services and offering innovative practices. We highlight a handful of examples. The individuals who generously shared their personal struggles also shared the resources and practices that they found most helpful.

We acknowledge the global nature of the crisis and the role that both the pandemic and other contextual factors have played in substantial increases in anxiety disorders and other mental health issues. Further, we are seeing disproportionate increases in specific demographics: women, for example, report poorer mental health, with one in five experiencing a common mental disorder (such as anxiety or depression), compared with one in eight men. No demographic is immune.

Given the crisis at hand, it is our hope that offering greater transparency to the world of mental health will stimulate a search for solutions…

Bracing– but important– reading: “A Crisis of Our Time.”

(Image above from a series of photos illustrating mental illness, from Christian Sampson.)

* from the notebook of Arthur Fleck (AKA The Joker), via Todd Phillips’ 2019 film Joker

###

As we care about care, we might recall that it was on this date in 2019 that the first presentation print of Todd Phillips’ film Joker was shipped to Italy, where it premiered at the Venice International Film Festival and won the Golden Lion, the festival’s top prize. The film went on to box office success, setting records for an October release and grossing over $1 billion – the first R-rated film to do so. It received numerous accolades, including 11 nominations at the 92nd Academy Awards (among them Best Picture, a first for a DC film) and two wins: Best Actor (Joaquin Phoenix) and Best Original Score (Hildur Guðnadóttir).

source

“The crisis consists precisely in the fact that the old is dying and the new cannot be born; in this interregnum a great variety of morbid symptoms appear.”*…

… for many, this phrase from Italian Marxist Antonio Gramsci sums up the current crisis of world politics and world power. Adam Tooze unpacks the oddity– and potential danger– of these lines becoming one of the soundbites of the early 21st-century…

… I want to put this conceptualization of our current crisis in question. Gramsci’s notion of interregnum may have served him to illuminate his immediate context. But, it transmits to our era a philosophy of history that actually obscures how we got from his moment of writing to our present day. It thus stands in the way of thinking hard about the challenges and opportunities of our current moment.

The currency of Gramsci’s lines today should give us pause. After all, if we take Antonio Gramsci seriously as a historical thinker, as we absolutely must, we should also acknowledge the huge gulf that separates him from the present. He was a Communist who paid with his life for his commitment to the cause of world revolution. His lines on interregnum, now the stock in trade of after dinner speeches and think tank meetings, were composed in November 1930 in a fascist jail. Gramsci was thirty-nine. He would die at 46, his fragile health irrevocably broken by harsh imprisonment.

With “morbid symptoms,” Gramsci may have been referring to fascism. Alternatively, he may have been criticizing the ultraleft turn of the Italian Communist Party under pressure from Moscow. His medical language evokes Lenin’s famous denunciation of left communism as an “infantile disorder.”

Wondering about the popularity of Gramsci’s lines today, I’ve come to think that it may have something to do with the way in which they combine drama – crisis, birth, death, interregnum – with an undertone of reassurance. If this is true, it is a deep historic irony. Gramsci derived his fortitude and belief from his Marxist understanding of world history. Today his words serve very different purposes.

First of all, Gramsci’s quote implies a definite direction of historical travel. We know what is old. We know what is new. We may currently be in crisis, but it is only a matter of time before “the new” is delivered.

A transition from old to new might imply significant change, which could open us up to thinking about radically different futures. That might be good news. But it might also be disturbing. Once again reassurance is provided by Gramsci’s definition of the crisis as interregnum. The present is an interregnum because it is a period between two orders. It may be messy now, but a new era is on the way.

This kind of historical thinking is not confined to Gramsci. It is nicely illustrated, for example, in this conventional chronology of modern economic history.

Once cast in terms of the sequence of regnum-interregnum-regnum, our current disorder becomes merely a passing moment. Given this sequence, who could doubt that a new white bar lies ahead of us? In this graphic, the grey phase of interregnum that began in 2008 already has a right-hand demarcation, even if no date is, as yet, attached to the endpoint.

A further point of certainty amidst Gramsci’s interregnum is that we can confidently distinguish what is morbid from what is healthy. This again implies a superior vantage point, something that one might think would be in jeopardy in a true moment of crisis.

The obvious question is: what is the basis for Gramsci’s judgement? This troubling question is especially pressing if Gramsci was, in fact, applying the label morbid not to fascism but to those he disagreed with in the ranks of the international communist movement. Was this a medico-technical diagnosis? Or, was his judgement, like Lenin’s, a political act, an act of polemic, stigmatizing disagreement? In which case the naturalized conception of crisis is, in fact, disguising a political clash.

Finally, and most fundamentally, Gramsci’s diagnosis locates the current crisis within history imagined as a natural cycle of life, of birth and death…

… Models of this kind gesture to the confusion and terror of interregnum, all the while reducing that phase of disorder to something temporary, recurring and predictable. The model gestures to historical development – the upward step of phase after phase – but actually reduces radical change to repetition. After one hegemony, what we look forward to is simply another.

This line of thinking is not just simplistic. In the current moment it is dangerously so. To the drama of America’s evidently waning hegemony it adds the intensity of the follow-on question: who comes next? This question – far from necessary – is framed by the assumption of historical repetition – hegemon-interregnum-hegemon. In the current moment there can only be one possible answer: CCP-led China. That in turn eggs the flailing American elite on to a more intense rearguard action. But why assume that in the 21st century there will be a successor to America’s 20th century power?

Any serious examination of the foundations of modern power actually suggests that this kind of cyclical or sequential view of history is misplaced.

Take GDP as a proxy for power resources and think about the kind of intellectual gymnastics which are necessary to turn history as depicted by GDP below into [a cyclical] sequence…

Source: Maddison Project data

As for the long-run continuity, global GDP prior to the 19th century is so low that it cannot be sensibly depicted on the same graph extending into the 20th century. This is also true for economic heft and destructive power. Of course, there were highly destructive wars, in the 17th century for instance, but their violence unfolded according to a very different logic from that in the 20th century.

And as for the pattern of change, what we see is not a neat sequence of substitutions in which one hegemon displaces another, but rather something more akin to “piling on”. This is history not as repetition, but in Mark Blyth’s wonderful phrase as a “one-way trip into the unknown”…

… I see the construction of global hegemony in the 20th century not as a repetition of something familiar, but as itself a venture into the unknown. To put it simply, I see global hegemony as a 20th-century problem…

Of course, the distinctive 20th-century visions of global power had precursors. They had preconditions. Something had first to constitute our modern conception of globality. This happened through the global system of power, communication, transport and commerce created by the British Empire in the 19th century. This constituted for the first time what Michael Geyer and Charles Bright called the “global condition”. All too often this encourages thinking in terms of an Anglo-American sequence. But again this underestimates the power of accumulation and overlay. Compared to US power in its mid 20th-century pomp, the British empire was a thin mesh of networks. The British empire maintained its grip with extremely limited resources in large part as a result of the weakness of its rivals…

… But by 1916 it was clear that only the United States had the power to manage the new configuration of global forces. The weird architecture of global mobilization in the first phase of WWI could not be sustained without at least the approval of the government of the USA. This is the story of my book, Deluge.

The economy, as measured by novel statistics of national income, would be America’s trump card. But that in itself is not an obvious fact. It was an effect of particular circumstances. 1916 is a pivotal moment because with the inconclusive battles of materiel at Verdun and the Somme, following the “hunger winter” of 1915/1916, it became clear that purely military operations were at an impasse, and this meant that war production and home front stability would take on a new and central role in determining the course of the war. War became a new kind of totalizing war. It was also an election year in the USA, arguably the first formal democratic event (as opposed to a revolution) to matter on a global scale. Certainly, it was the first US election that was watched with bated breath by political classes around the world.

It was out of the historically specific, financial and economic urgencies of World War I, that a new American-centered network of power emerged. This was something new, something architected and built to meet the urgency of the moment. There was no womb of hegemonic logic from which the US power was born to replace a dying British global order. It was not the inevitable sequence of monetary logic that ground through its inevitable development to make the dollar replace the pound sterling. It was war and war finance. The dollar in the 20th century would play a role quite different from sterling under the 19th century gold standard. Furthermore, America’s global power did not substitute for British power. It overlayed Britain’s own efforts, first in World War I, then in the interwar period and finally during World War II, to master the new hegemonic problem. In crucial areas, notably the oil fields of the Middle East, it was not until the late 1960s that the US finally took over…

… America’s age of hegemony was not an answer to an interregnum. It truly was new. It was thus not the latest iteration of some familiar form of power. It did not replace the British empire. The British empire was reinventing itself too in response to the new challenges of the early 20th century. It was overtaken by the US and nested itself under the wings of that power. It was not something born. It was built.

And if that is true for the early 20th century, the question of how 21st century global power will be organized should be considered no less open. Certainly, our problem right now is not that the old is simply dying. Things are far from that simple. In certain crucial respects “the old” is hanging around and indeed seeking to mobilize new strength. At the same time the principal challenger may be “new” in the sense of unfamiliar. But the CCP regime draws inspiration from a first successful century of ascent and evokes ancient Chinese history. And what its underlying drivers are is a matter of contentious debate.

What is old and what is new, what morbid and what vigorous, what the underlying generative logic of history actually is – these are all questions that are at this moment up for debate. We are therefore experiencing a crisis of confidence and a period of uncertainty that is far deeper than talk of an interregnum à la Gramsci implies. To be clear, this does not necessarily mean more lethal or more tragic than the epoch that cut short Gramsci’s life. Our normality, however catastrophic, may be manageable. The environmental clock is ticking, but the majority of us are no longer poor. We live longer. Today, Gramsci’s life could probably have been saved. There are gigantic technological resources that democratic and progressive crisis-management could draw on. What we must let go of is the false mantle of confidence and historical clarity that evoking the concepts of an earlier epoch entails. Abandoning talk of an interregnum may rob us of certainty. But rather than a counsel of despair, this is simply a demand of realism. What that promises is the chance to trade historic phantoms for new projects and the exploration of the actual possibilities of the present…

Eminently worth reading in full: “Built not Born – against “interregnum”-talk,” from @adam_tooze.

And what might we build? For a reminder that our options are perhaps broader than we might think: “The Gap at the End of the World,” from Cynthia Cruz in @LAReviewofBooks.

* Gramsci, Prison Notebooks

###

As we keep our eyes up, we might recall that it was on this date last year that a new kind of cultural phenomenon burst onto the scene, surrounding the simultaneous theatrical release of two films, Warner Bros. Pictures’ Barbie and Universal Pictures’ Oppenheimer— Barbenheimer, as it came to be called.

The simultaneous release was an instance of counterprogramming. As the release date approached, discussion shifted from rivalry to the prospect of watching the films as a double feature. Cast members of both responded by encouraging audiences to watch the films on the same day. Celebrity participants included actor Tom Cruise, who purchased tickets to watch both while his latest film, Mission: Impossible – Dead Reckoning Part One, was still playing in theaters.

In the event, both Barbie and Oppenheimer received critical acclaim and exceeded box-office expectations.  Their joint opening weekend was the fourth-largest at the American box office, and both rank among the highest-grossing films of 2023. The phenomenon also extended to the year’s awards season, in which both films emerged as leading contenders, earning a combined 21 nominations at the 96th Academy Awards. Both films were nominated for the Academy Award for Best Picture, which Oppenheimer won.

source

Written by (Roughly) Daily

July 21, 2024 at 1:00 am

“Hollywood will rot on the windmills of Eternity”*…

… or possibly, Daniel Bessner argues, sooner…

… Thanks to decades of deregulation and a gush of speculative cash that first hit the industry in the late Aughts, while prestige TV was climbing the rungs of the culture, massive entertainment and media corporations had been swallowing what few smaller companies remained, and financial firms had been infiltrating the business, moving to reduce risk and maximize efficiency at all costs, exhausting writers in evermore unstable conditions.

“The industry is in a deep and existential crisis,” the head of a midsize studio told me in early August. We were in the lounge of the Soho House in West Hollywood. “It is probably the deepest and most existential crisis it’s ever been in. The writers are losing out. The middle layer of craftsmen are losing out. The top end of the talent are making more money than they ever have, but the nuts-and-bolts people who make the industry go round are losing out dramatically.”

Hollywood had become a winner-takes-all economy. As of 2021, CEOs at the majority of the largest companies and conglomerates in the industry drew salaries between two hundred and three thousand times greater than those of median employees. And while writer-producer royalty such as Shonda Rhimes and Ryan Murphy had in recent years signed deals reportedly worth hundreds of millions of dollars, and a slightly larger group of A-list writers, such as Smith, had carved out comfortable or middle-class lives, many more were working in bare-bones, short-term writers’ rooms, often between stints in the service industry, without much hope for more steady work. As of early 2023, among those lucky enough to be employed, the median TV writer-producer was making 23 percent less a week, in real dollars, than their peers a decade before. Total earnings for feature-film writers had dropped nearly 20 percent between 2019 and 2021.

Writers had been squeezed by the studios many times in the past, but never this far. And when the WGA went on strike last spring, they were historically unified: more guild members than ever before turned out for the vote to authorize, and 97.9 percent voted in favor. After five months, the writers were said to have won: they gained a new residuals model for streaming, new minimum lengths of employment for TV, and more guaranteed paid work on feature-film screenplays, among other protections.

But the business of Hollywood had undergone a foundational change. The new effective bosses of the industry—colossal conglomerates, asset-management companies, and private-equity firms—had not been simply pushing workers too hard and grabbing more than their fair share of the profits. They had been stripping value from the production system like copper pipes from a house—threatening the sustainability of the studios themselves. Today’s business side does not have a necessary vested interest in “the business”—in the health of what we think of as Hollywood, a place and system in which creativity is exchanged for capital. The union wins did not begin to address this fundamental problem.

Currently, the machine is sputtering, running on fumes. According to research by Bloomberg, in 2013 the largest companies in film and television were more than $20 billion in the black; by 2022, that number had fallen by roughly half. From 2021 to 2022, revenue growth for the industry dropped by almost 50 percent. At U.S. box offices, by the end of last year, revenue was down 22 percent from 2019. Experts estimate that cable-television revenue has fallen 40 percent since 2015. Streaming has rarely been profitable at all. Until very recently, Netflix was the sole platform to make money; among the other companies with streaming services, only Warner Bros. Discovery’s platforms may have eked out a profit last year. And now the streaming gold rush—the era that made Dickinson—is over. In the spring of 2022, the Federal Reserve began raising interest rates after years of nearly free credit, and at roughly the same time, Wall Street began calling in the streamers’ bets. The stock prices of nearly all the major companies with streaming platforms took precipitous falls, and none have rebounded to their prior valuation.

The industry as a whole is now facing a broad contraction. Between August 2022 and the end of last year, employment fell by 26 percent—more than one job gone in every four. Layoffs hit Warner Bros. Discovery, Netflix, Paramount Global, Roku, and others in 2022. In 2023, firings swept through the representation giants United Talent Agency and Creative Artists Agency; Netflix, Paramount Global, and Roku again; plus Hulu, NBCUniversal, and Lionsgate. In early 2024, it was announced that Amazon was cutting hundreds of jobs from its Prime Video and Amazon MGM Studios divisions. In February, Paramount Global laid off roughly eight hundred people. It’s unclear which streamers will survive. As James Dolan, the interim executive chair of AMC Networks, told employees in late 2022 as he delivered news of massive layoffs—roughly 1,700 people (20 percent of U.S. staff) would lose their jobs—“the mechanisms for the monetization of content are in disarray.”

Profit will of course find a way; there will always be shit to watch. But without radical intervention, whether by the government or the workers, the industry will become unrecognizable. And the writing trade—the kind where one actually earns a living—will be obliterated…

Film and television writers face an existential threat; viewers, a drab future: “The Life and Death of Hollywood,” from @dbessner in @Harpers. A bracing piece, eminently worth reading in full.

* Allen Ginsberg

###

As we study streaming, we might recall that it was on this date in 1964 that AT&T connected the first Picturephone call (between Disneyland in California and the World’s Fair in New York). The device consisted of a telephone handset and a small, matching TV, which allowed telephone users to see each other in fuzzy video images as they carried on a conversation. It was commercially released shortly thereafter (prices ranged from $16 to $27 for a three-minute call between special booths AT&T set up in New York, Washington, and Chicago), but didn’t catch on… though, of course, it augured the “future” in which now we live.

source

Written by (Roughly) Daily

April 20, 2024 at 1:00 am

“Visualizations act as a campfire around which we gather to tell stories”*…

From home ownership to digital media consumption, climate change to job growth– more, with commentary, at: “10 Charts That Capture How the World Is Changing,” from @rex_woodbury.

* Al Shalloway

###

As we ponder patterns, we might recall that it was on this date in 1940 that RKO released Walt Disney’s animated musical anthology Fantasia— eight animated segments set to pieces of classical music conducted by Leopold Stokowski. First released as a theatrical roadshow held in 13 cities across the U.S. between 1940 and 1941, it was acclaimed by critics. But it initially failed to turn a profit owing to World War II’s cutting off distribution to the European market, the film’s high production costs, and the expense of building Fantasound equipment and leasing theaters for the roadshow presentations. That said, since 1942, the film has been reissued multiple times by RKO and Buena Vista Distribution (with its original footage and audio being variously deleted, modified, or restored in each version). To date, when adjusted for inflation, Fantasia is the 23rd highest-grossing film of all time in the U.S.

source

Written by (Roughly) Daily

November 13, 2022 at 1:00 am