(Roughly) Daily


“We ceased to be the lunatic fringe. We’re now the lunatic core.”*…

Further, in a fashion, to yesterday’s post on analog computing, an essay from Benjamin Labatut (the author of two remarkable works of “scientific-historical fiction,” When We Cease to Understand the World and The MANIAC), continuing the animating theme of those books…

We will never know how many died during the Butlerian Jihad. Was it millions? Billions? Trillions, perhaps? It was a fantastic rage, a great revolt that spread like wildfire, consuming everything in its path, a chaos that engulfed generations in an orgy of destruction lasting almost a hundred years. A war with a death toll so high that it left a permanent scar on humanity’s soul. But we will never know the names of those who fought and died in it, or the immense suffering and destruction it caused, because the Butlerian Jihad, abominable and devastating as it was, never happened.

The Jihad was an imagined event, conjured up by Frank Herbert as part of the lore that animates his science-fiction saga Dune. It was humanity’s last stand against sentient technology, a crusade to overthrow the god of machine-logic and eradicate the conscious computers and robots that in the future had almost entirely enslaved us. Herbert described it as “a thalamic pause for all humankind,” an era of such violence run amok that it completely transformed the way society developed from then onward. But we know very little of what actually happened during the struggle itself, because in the original Dune series, Herbert gives us only the faintest outlines—hints, murmurs, and whispers, which carry the ghostly weight of prophecy. The Jihad reshaped civilization by outlawing artificial intelligence or any machine that simulated our minds, placing a damper on the worst excesses of technology. However, it was fought so many eons before the events portrayed in the novels that by the time they occur it has faded into legend and crystallized in apocrypha. The hard-won lessons of the catastrophe are preserved in popular wisdom and sayings: “Man may not be replaced.” “Once men turned their thinking over to machines in the hope that this would set them free. But that only permitted other men with machines to enslave them.” “We do not trust the unknown which can arise from imaginative technology.” “We must negate the machines-that-think.” The most enduring legacy of the Jihad was a profound change in humankind’s relationship to technology. Because the target of that great hunt, where we stalked and preyed upon the very artifacts we had created to lift ourselves above the seat that nature had intended for us, was not just mechanical intelligence but the machinelike attitude that had taken hold of our species: “Humans had set those machines to usurp our sense of beauty, our necessary selfdom out of which we make living judgments,” Herbert wrote.

Humans must set their own guidelines. This is not something machines can do. Reasoning depends upon programming, not on hardware, and we are the ultimate program!

The Butlerian Jihad removed a crutch—the part of ourselves that we had given over to technology—and forced human minds to develop above and beyond the limits of mechanistic reasoning, so that we would no longer depend on computers to do our thinking for us.

Herbert’s fantasy, his far-flung vision of a devastating war between humanity and the god of machine-logic, seemed quaint when he began writing it in the Sixties. Back then, computers were primitive by modern standards, massive mainframe contraptions that could process only hundreds of thousands of cycles per second (instead of billions, like today), had very little memory, operated via punch cards, and were not connected to one another. And we have easily ignored Herbert’s warnings ever since, but now the Butlerian Jihad has suddenly returned to plague us. The artificial-intelligence apocalypse is a new fear that keeps many up at night, a terror born of great advances that seem to suggest that, if we are not very careful, we may—with our own hands—bring forth a future where humanity has no place. This strange nightmare is a credible danger only because so many of our dreams are threatening to come true. It is the culmination of a long process that hearkens back to the origins of civilization itself, to the time when the world was filled with magic and dread, and the only way to guarantee our survival was to call down the power of the gods.

Apotheosis has always haunted the soul of humankind. Since ancient times we have suffered the longing to become gods and exceed the limits nature has placed on us. To achieve this, we built altars and performed rituals to ask for wisdom, blessings, and the means to reach beyond our capabilities. While we tend to believe that it is only now, in the modern world, that power and knowledge carry great risks, primitive knowledge was also dangerous, because in antiquity a part of our understanding of the world and ourselves did not come from us, but from the Other. From the gods, from spirits, from raging voices that spoke in silence.

[Labatut invokes the mysteries of the Vedas and their Altar of Fire, which was meant to develop “a mind [that], when properly developed, could fly like a bird with outstretched wings and conquer the skies.”…]

Seen from afar by people who were not aware of what was being made, these men and women must surely have looked like bricklayers gone mad. And that same frantic folly seems to possess those who, in recent decades, have dedicated their hearts and minds to the building of a new mathematical construct, a soulless copy of certain aspects of our thinking that we have chosen to name “artificial intelligence,” a tool so formidable that, if we are to believe the most zealous among its devotees, will help us reach the heavens and become immortal…

[Labatut recounts the stories– and works– of some of the creators of AI’s DNA: George Boole (and his logic), Claude Shannon (who put that logic to work), and Geoffrey Hinton (Boole’s great-great-grandson, and “the Godfather of AI,” who created some of the first neural networks, but has more recently undergone a change of opinion)…]

… Hinton has been transformed. He has mutated from an evangelist of a new form of reason into a prophet of doom. He says that what changed his mind was the realization that we had, in fact, not replicated our intelligence, but created a superior one.

Or was it something else, perhaps? Did some unconscious part of him whisper that it was he, rather than his great-great-grandfather, who was intended by God to find the mechanisms of thought? Hinton does not believe in God, and he would surely deny his ancestor’s claim that pain is an instrument of the Lord’s will, since he was forced to have every one of his meals on his knees, resting on a pillow like a monk praying at the altar, because of a back injury that caused him excruciating pain. For more than seventeen years, he could not sit down, and only since 2022 has he managed to do so long enough to eat.

Hinton is adamant that the dangers of thinking machines are real. And not just short-term effects like job replacement, disinformation, or autonomous lethal weapons, but an existential risk that some discount as fantasy: that our place in the world might be supplanted by AI. Part of his fear is that he believes AI could actually achieve a sort of immortality, as the Vedic gods did. “The good news,” he has said, “is we figured out how to build things that are immortal. When a piece of hardware dies, they don’t die. If you’ve got the weights stored in some medium and you can find another piece of hardware that can run the same instructions, then you can bring it to life again. So, we’ve got immortality. But it’s not for us.”

Hinton seems to be afraid of what we might see when the embers of the Altar of Fire die down at the end of the sacrifice and the sharp coldness of the beings we have conjured up starts to seep into our bones. Are we really headed for obsolescence? Will humanity perish, not because of the way we treat all that surrounds us, nor due to some massive unthinking rock hurled at us by gravity, but as a consequence of our own irrational need to know all that can be known? The supposed AI apocalypse is different from the mushroom-cloud horror of nuclear war, and unlike the ravages of the wildfires, droughts, and inundations that are becoming commonplace, because it arises from things that we have, since the beginning of civilization, always considered positive and central to what makes us human: reason, intelligence, logic, and the capacity to solve the problems, puzzles, and evils that taint even the most fortunate person’s existence with everyday suffering. But in clawing our way to apotheosis, in daring to follow the footsteps of the Vedic gods who managed to escape from Death, we may shine a light on things that should remain in darkness. Because even if artificial intelligence never lives up to the grand and terrifying nightmare visions that presage a nonhuman world where algorithms hum along without us, we will still have to contend with the myriad effects this technology will have on human society, culture, and economics.

In the meantime, the larger specter of superintelligent AI looms over us. And while it is less likely and perhaps even impossible (nothing but a fairy tale, some say, a horror story intended to attract more money and investment by presenting a series of powerful systems not as the next step in our technological development but as a death-god that ends the world), it cannot be easily dispelled, for it reaches down and touches the fibers of our mythmaking apparatus, that part of our being that is atavistic and fearful, because it reminds us of a time when we shivered in caves and huddled together, while outside in the dark, with eyes that could see in the night, the many savage beasts and monsters of the past sniffed around for traces of our scent.

As every new AI model becomes stronger, as the voices of warning form a chorus, and even the most optimistic among us begin to fear this new technology, it is harder and harder to think without panic or to reason with logic. Thankfully, we have many other talents that don’t answer to reason. And we can always rise and take a step back from the void toward which we have so hurriedly thrown ourselves, by lending an ear to the strange voices that arise from our imagination, that feral territory that will always remain a necessary refuge and counterpoint to rationality.

Faced, as we are, with wild speculation, confronted with dangers that no one, however smart or well informed, is truly capable of managing or understanding, and taunted by the promises of unlimited potential, we may have to sound out the future not merely with science, politics, and reason, but with that devil-eye we use to see in the dark: fiction. Because we can find keys to doors we have yet to encounter in the worlds that authors have imagined in the past. As we grope forward in a daze, battered and bewildered by the capabilities of AI, we could do worse than to think about the desert planet where the protagonists of Herbert’s Dune novels sought to peer into the streaming sands of future time, under the heady spell of a drug called spice, to find the Golden Path, a way for human beings to break from tyranny and avoid extinction or stagnation by being more diverse, resilient, and free, evolving past purely logical reasoning and developing our minds and faculties to the point where our thoughts and actions are unpredictable and not bound by statistics. Herbert’s books, with their strange mixture of past and present, remind us that there are many ways in which we can continue forward while preserving our humanity. AI is here already, but what we choose to do with it and what limits we agree to place on its development remain decisions to be made. No matter how many billions of dollars are invested in the AI companies that promise to eliminate work, solve climate change, cure cancer, and rain down miracles unlike anything we have seen before, we can never fully give ourselves over to these mathematical creatures, these beings with no soul or sympathy, because they are neither alive nor conscious—at least not yet, and certainly not like us—so they do not share the contradictory nature of our minds.

In the coming years, as people armed with AI continue making the world faster, stranger, and more chaotic, we should do all we can to prevent these systems from giving more and more power to the few who can build them. But we should also consider a warning from Herbert, the central commandment he chose to enshrine at the heart of future humanity’s key religious text, a rule meant to keep us from becoming subservient to the products of our reason, and from bowing down before the God of Logic and his many fearsome offspring:

Thou shalt not make a machine in the likeness of a human mind

Before and after artificial intelligence: “The Gods of Logic” in @Harpers. Eminently worth reading in full.

For a less pessimistic view, see: “A Journey Through the Uncanny Valley: Our Relational Futures with AI,” from @dylanhendricks at @iftf.

* Geoffrey Hinton

###

As we deliberate on Daedalus’ caution, we might send fantastically far-sighted birthday greetings to a techno-optimist who would likely have brushed aside Labatut’s concerns: Hugo Gernsback, a Luxembourgish-American inventor, broadcast pioneer, writer, and publisher; he was born on this date in 1884.

Gernsback held 80 patents at the time of his death; he founded radio station WRNY, was involved in the first television broadcasts, and is considered a pioneer in amateur radio. But it was as a writer and publisher that he probably left his most lasting mark: In 1911, as owner/publisher of the magazine Modern Electrics, he filled a blank spot in his publication by dashing off the first chapter of a series called “Ralph 124C 41+.” The twelve installments of “Ralph” were filled with inventions unknown in 1911, including “television” (Gernsback is credited with introducing the word), fluorescent lighting, jukeboxes, solar energy, microfilm, vending machines, and the device we now call radar.

The “Ralph” series was an astounding success with readers; and in 1926 Gernsback founded the first magazine devoted to science fiction, Amazing Stories.  Believing that the perfect sci-fi story is “75 percent literature interwoven with 25 percent science,” he coined the term “science fiction.”

Gernsback was a “careful” businessman, who was tight with the fees that he paid his writers– so tight that H. P. Lovecraft and Clark Ashton Smith referred to him as “Hugo the Rat.”

Still, his contributions to the genre as publisher were so significant that, along with H.G. Wells and Jules Verne, he is sometimes called “The Father of Science Fiction”; in his honor, the annual Science Fiction Achievement awards are called the “Hugos.”

(Coincidentally, today is also the birthday– in 1906– of Philo T. Farnsworth, the man who actually did invent television.)

Gernsback, wearing one of his inventions, TV Glasses

source

“If you are not paying for it, you’re not the customer; you’re the product being sold.”*…

Julia Barton on a question that haunts us still…

After yet another day reading about audio industry layoffs and show cancellations, or listening to podcasts about layoffs and show cancellations, I sometimes wonder, “With all this great audio being given away for free, who did we think was supposed to pay for it all?”

I find some consolation in the fact that that question is more than a century old. In the spring of 1924, Radio Broadcast posed it in a contest called “Who is to Pay for Broadcasting and How?” The monthly trade magazine offered a prize of $500 (more than $9,000 in today’s dollars) for “a workable plan which shall take into account the problems in present radio broadcasting and propose a practical solution.”

The need for such a contest more than 100 years ago is revealing enough, but the reaction of the judges to the prize-winning plan turned out to be even more so — and it says a lot about why business models for audio production and broadcast remain a struggle.

Back in the mid-1920s, radio was just starting to catch on in America. For a couple of decades, the medium had been used mostly for logistics, to help ships communicate with each other and the shore. But after World War I, new technology allowed Americans to send and receive the sounds of music, lectures, and live events over “the ether.”

By all accounts, Americans — whiplashed by war, a flu pandemic, and massive social changes like Prohibition —  went crazy to hear what the ether could deliver to the privacy of their homes. They started buying or building their own radio receivers at a pace that shocked observers. In his book This Fascinating Radio Business, Robert Landry recalls curious customers lining up behind velvet ropes to see and place orders for the latest receivers. “The size, cost, gloss and make of one’s radio was, with the family car and the family icebox, an index of social swank.”

Many stations at the time were run by department stores that wanted to demonstrate the miracle of the expensive radio sets they sold. One of the first broadcast radio stations in the country, WOR sat in the furniture department at Bamberger’s in Newark, and its first announcers were also the employees selling furniture. But as the consumer market started to be saturated, those early stations were either bleeding money or shutting down entirely. The equipment needed constant updating, the workers expected salaries, and the performers who’d once been persuaded to fill airtime “for exposure” now demanded payment.

To make things more complicated, the government required so-called “clear channel” stations (high-powered, with signals that reached far and wide) to be on the air live for 18 hours a day, forbidding the use of “mechanically reproduced” music (as in, phonograph records) to fill the time. All this made broadcasting a very expensive proposition by 1924.

I first read about the “Who Is To Pay” contest in the 1994 book Selling Radio by Susan Smulyan, who starts off noting that from the beginning, “no one knew how to make money from broadcasting.” What about advertising, the solution that seems most obvious in hindsight? The man in charge of regulating radio, then-Secretary of Commerce Herbert Hoover, hated the idea.

“I don’t think there is anything the people would take more offense at than the attempt to sell goods over radio advertising,” Hoover declared, as part of a full-page spread in The New York Times on May 18, 1924, the same month that Radio Broadcast first announced its contest.

The Secretary had been speaking out against advertising for a few years by this point. Indirect advertising (or sponsorship, as it would soon be called) was acceptable in his mind — and via some math that’s hard to figure out, he guessed sponsorship could support about 150 stations nationwide.

Consumers in the 1920s were used to paying for telephone calls and telegrams, and there were other experiments to get listeners to pay for radio. One, dubbed “wired wireless,” licensed special devices to subscribers on Staten Island, who then got programs delivered via their power lines — a proto-version of cable TV that didn’t last long…

Radio Broadcast received close to a thousand entries to its contest. They proposed everything from a 30-day fundraising drive to the sale of copyrighted radio programming bulletins. The winner, announced in the March 1925 issue, proposed a $2 federal tax on vacuum tubes, at the time the cutting-edge technology for radio reception. The prizewinner, HD Kellogg Jr. of Haverford, Pennsylvania, reasoned that vacuum tubes were the best index of high-quality gear — the better the gear, the more radio a household could consume. Kellogg also argued that only the federal government, which already regulated radio, could collect and administer such a tax. His idea was basically a less regressive version of the licensing fee the British government already levied on U.K. households to fund the BBC.

Though the contest’s judges awarded Kellogg’s proposal their prize, they were ambivalent about, if not downright hostile to, his plan. One can only imagine young Kellogg’s feelings as he read the many dismissals of his idea in later issues of Radio Broadcast. “A Government tax would be obnoxious,” wrote Paul Klugh, executive chairman of the National Association of Broadcasters. “I do not believe your prize-winning plan is feasible under conditions as they exist in this country,” wrote Secretary Hoover. 

America’s radio brain trust would go on to denounce almost any federal funds for broadcasting, fearing such a model could lead to censorship. Some of that aversion makes historical sense, given that Americans could still vividly remember the ugly and heavy-handed wartime censorship of Wilson-era U.S. postmaster Albert Sidney Burleson. As Adam Hochschild writes in his chilling history American Midnight, Burleson — until he left Washington with his boss Woodrow Wilson in 1921 — used his office to seize socialist and foreign-language publications, and revoke the postal privilege of other publications that reported on the war. So when broadcasting advocates in the 1920s talked about government “censorship,” the term was not abstract — it was a recent fact.

But rather than try to figure out a smarter way to fund public-minded, high-quality broadcasting, the men behind the Radio Broadcast contest decided the real winner should be: Nothing. “For the present, I think it is better to let things ride along as they are,” wrote columnist Zeh Bouck in May 1925.

Things did ride along, straight to direct advertising. Within a few years, huge swathes of the airwaves were the province of Lucky Strikes and Jergen’s Lotion, racial minstrelsy and unbelievable quackery…

… For many happy decades of the 20th century, advertising did make commercial broadcasters a ton of money. But as historians from Robert McChesney to Susan Douglas to Michele Hilmes have pointed out, the “American system” is uniquely unstable, and it leaves public-interest programming — or, at times, any programming at all — hard to sustain.

While researching this piece, I learned I’m not the first writer to notice an anniversary of Radio Broadcast’s contest. Back in 1995, Todd Lappin explored it in Wired. He marveled at how much the nascent Web was following the same chaotic business arc of radio. But he held out hope that things might turn out better. “Perhaps radio wasn’t the right technology. But the Web and the Net may well be,” Lappin wrote. “Our job is to make sure that glorious potential doesn’t get stuffed into yet another tired, old media box.”

In retrospect, that’s a depressing read. But there is something irresistible about the original contest, and the era when all ideas were still up for debate. We’ve had a century of letting things “ride along.” It seems like a good time to open the contest again…

An all-too-timely read: “In 1924, a magazine ran a contest: “Who is to pay for broadcasting and how?” A century later, we’re still asking the same question,” from @bartona104 in @NiemanLab.

* Digg commenter blue_beetle (Anthony Lewis)– now a meme.

###

As we contemplate culture, we might recall that it was on this date in 2007 that two local television helicopters covering a police chase in Phoenix, Arizona, collided in midair. Pilot Craig Smith and photographer Rick Krolak from KNXV-TV, and pilot Scott Bowerbank and photographer Jim Cox from KTVK were killed; there were no reported casualties on the ground.

Photograph circulated by AP of the two helicopters falling after the crash (source)

“So much of performing is a mind game”*…

Michael Caine in The Ipcress File

As John Seamon explains, in describing how they remember their lines, actors are telling us an important truth about memory…

… Actors face the demanding task of learning their lines with great precision, but they rarely do so by rote repetition. They did not, they said, sit down with a script and recite their lines until they knew them by heart. Repeating items over and over, called maintenance rehearsal, is not the most effective strategy for remembering. Instead, actors engage in elaborative rehearsal, focusing their attention on the meaning of the material and associating it with information they already know. Actors study the script, trying to understand their character and seeing how their lines relate to that character. In describing these elaborative processes, the actors assembled that evening offered sound advice for effective remembering.

Similarly, when psychologists Helga and Tony Noice surveyed actors on how they learn their lines, they found that actors search for meaning in the script, rather than memorizing lines. The actors imagine the character in each scene, adopt the character’s perspective, relate new material to the character’s background, and try to match the character’s mood. Script lines are carefully analyzed to understand the character’s motivation. This deep understanding of a script is achieved by actors asking goal-directed questions, such as “Am I angry with her when I say this?” Later, during a performance, this deep understanding provides the context for the lines to be recalled naturally, rather than recited from a memorized text. In his book “Acting in Film,” actor Michael Caine described this process well:

You must be able to stand there not thinking of that line. You take it off the other actor’s face. Otherwise, for your next line, you’re not listening and not free to respond naturally, to act spontaneously…

Deep understanding involves focusing your attention on the underlying meaning of an item or event, and each of us can use this strategy to enhance everyday retention. In picking up an apple at the grocer’s, for example, you can look at its color and size, you can say its name, and you can think of its nutritional value and use in a favorite recipe. Focusing on these visual, acoustic, and conceptual aspects of the apple corresponds to shallow, moderate, and deep levels of processing, and the depth of processing that is devoted to an item or event affects its memorability. Memory is typically enhanced when we engage in deep processing that provides meaning for an item or event, rather than shallow processing. Given a list of common nouns to read, people recall more words on a surprise memory test if they previously attended to the meaning of each word than if they focused on each word’s font or sound.

Deep, elaborative processing enhances understanding by relating something you are trying to learn to things you already know. Retention is enhanced because elaboration produces more meaningful associations than does shallow processing — links that can serve as potential cues for later remembering. For example, your ease of recalling the name of a specific dwarf in Walt Disney’s animated film, “Snow White and the Seven Dwarfs,” depends on the cue and its associated meaning:

Try to recall the name of the dwarf that begins with the letter B.

People often have a hard time coming up with the correct name with this cue because many common names begin with the letter B and all of them are wrong. Try it again with a more meaningful cue:

Recall the name of the dwarf whose name is synonymous with shyness.

If you know the Disney film, this time the answer is easy. Meaningful associations help us remember, and elaborative processing produces more semantic associations than does shallow processing. This is why the meaningful cue produces the name Bashful…

On the art of recall: “How Actors Remember Their Lines,” an excerpt from Seamon’s book, Memory and Movies: What Films Can Teach Us About Memory, from @mitpress.

* Joshua Bell

###

As we recollect, we might recall that it was on this date in 1952 that Guiding Light (AKA The Guiding Light) transferred from CBS Radio to CBS Television… and, as radio actors could read from scripts but TV performers couldn’t, an enormous new occasion for the memorization of lines was created.

And indeed, there were lots and lots of lines to remember: with 72 years of radio and television runs (18,262 episodes), Guiding Light remains the longest-running soap opera, ahead of General Hospital, and is the fifth-longest-running program in all of global broadcast history.

source

Written by (Roughly) Daily

June 30, 2024 at 1:00 am

“Hollywood will rot on the windmills of Eternity”*…

… or possibly, Daniel Bessner argues, sooner…

… Thanks to decades of deregulation and a gush of speculative cash that first hit the industry in the late Aughts, while prestige TV was climbing the rungs of the culture, massive entertainment and media corporations had been swallowing what few smaller companies remained, and financial firms had been infiltrating the business, moving to reduce risk and maximize efficiency at all costs, exhausting writers in evermore unstable conditions.

“The industry is in a deep and existential crisis,” the head of a midsize studio told me in early August. We were in the lounge of the Soho House in West Hollywood. “It is probably the deepest and most existential crisis it’s ever been in. The writers are losing out. The middle layer of craftsmen are losing out. The top end of the talent are making more money than they ever have, but the nuts-and-bolts people who make the industry go round are losing out dramatically.”

Hollywood had become a winner-takes-all economy. As of 2021, CEOs at the majority of the largest companies and conglomerates in the industry drew salaries between two hundred and three thousand times greater than those of median employees. And while writer-producer royalty such as Shonda Rhimes and Ryan Murphy had in recent years signed deals reportedly worth hundreds of millions of dollars, and a slightly larger group of A-list writers, such as Smith, had carved out comfortable or middle-class lives, many more were working in bare-bones, short-term writers’ rooms, often between stints in the service industry, without much hope for more steady work. As of early 2023, among those lucky enough to be employed, the median TV writer-producer was making 23 percent less a week, in real dollars, than their peers a decade before. Total earnings for feature-film writers had dropped nearly 20 percent between 2019 and 2021.

Writers had been squeezed by the studios many times in the past, but never this far. And when the WGA went on strike last spring, they were historically unified: more guild members than ever before turned out for the vote to authorize, and 97.9 percent voted in favor. After five months, the writers were said to have won: they gained a new residuals model for streaming, new minimum lengths of employment for TV, and more guaranteed paid work on feature-film screenplays, among other protections.

But the business of Hollywood had undergone a foundational change. The new effective bosses of the industry—colossal conglomerates, asset-management companies, and private-equity firms—had not been simply pushing workers too hard and grabbing more than their fair share of the profits. They had been stripping value from the production system like copper pipes from a house—threatening the sustainability of the studios themselves. Today’s business side does not have a necessary vested interest in “the business”—in the health of what we think of as Hollywood, a place and system in which creativity is exchanged for capital. The union wins did not begin to address this fundamental problem.

Currently, the machine is sputtering, running on fumes. According to research by Bloomberg, in 2013 the largest companies in film and television were more than $20 billion in the black; by 2022, that number had fallen by roughly half. From 2021 to 2022, revenue growth for the industry dropped by almost 50 percent. At U.S. box offices, by the end of last year, revenue was down 22 percent from 2019. Experts estimate that cable-television revenue has fallen 40 percent since 2015. Streaming has rarely been profitable at all. Until very recently, Netflix was the sole platform to make money; among the other companies with streaming services, only Warner Bros. Discovery’s platforms may have eked out a profit last year. And now the streaming gold rush—the era that made Dickinson—is over. In the spring of 2022, the Federal Reserve began raising interest rates after years of nearly free credit, and at roughly the same time, Wall Street began calling in the streamers’ bets. The stock prices of nearly all the major companies with streaming platforms took precipitous falls, and none have rebounded to their prior valuation.

The industry as a whole is now facing a broad contraction. Between August 2022 and the end of last year, employment fell by 26 percent—more than one job gone in every four. Layoffs hit Warner Bros. Discovery, Netflix, Paramount Global, Roku, and others in 2022. In 2023, firings swept through the representation giants United Talent Agency and Creative Artists Agency; Netflix, Paramount Global, and Roku again; plus Hulu, NBCUniversal, and Lionsgate. In early 2024, it was announced that Amazon was cutting hundreds of jobs from its Prime Video and Amazon MGM Studios divisions. In February, Paramount Global laid off roughly eight hundred people. It’s unclear which streamers will survive. As James Dolan, the interim executive chair of AMC Networks, told employees in late 2022 as he delivered news of massive layoffs—roughly 1,700 people (20 percent of U.S. staff) would lose their jobs—“the mechanisms for the monetization of content are in disarray.”

Profit will of course find a way; there will always be shit to watch. But without radical intervention, whether by the government or the workers, the industry will become unrecognizable. And the writing trade—the kind where one actually earns a living—will be obliterated…

Film and television writers face an existential threat; viewers, a drab future: “The Life and Death of Hollywood,” from @dbessner in @Harpers. A bracing piece, eminently worth reading in full.

* Allen Ginsberg

###

As we study streaming, we might recall that it was on this date in 1964 that AT&T connected the first Picturephone call (between Disneyland in California and the World’s Fair in New York). The device consisted of a telephone handset and a small, matching TV, which allowed telephone users to see each other in fuzzy video images as they carried on a conversation. It was commercially released shortly thereafter (prices ranged from $16 to $27 for a three-minute call between special booths AT&T set up in New York, Washington, and Chicago), but didn’t catch on… though, of course, it augured the “future” in which we now live.

source

Written by (Roughly) Daily

April 20, 2024 at 1:00 am

“Words are sacred. They deserve respect. If you get the right ones, in the right order, you can nudge the world a little.”*…

And as Gail Sherman observes, that principle operates at a pretty basic level…

There is a Royal Order of Adjectives, and you follow it without knowing what it is—a particular sequence to use when more than one adjective precedes a noun. There are exceptions, of course, because English is three languages in a trenchcoat. According to the Cambridge Dictionary, in general, the proper order is:

Opinion
Size
Physical quality
Shape
Age
Color
Origin
Material
Type
Purpose

Most people couldn’t tell you this rule, but everyone follows it. If you use the wrong order, it just sounds weird. If you have a fancy new blue metal lunchbox but call it a metal new fancy blue lunchbox, people might be worried you are having a stroke…
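
For the algorithmically inclined, the rule is simple enough to capture in a minimal Python sketch (mine, not the Cambridge Dictionary’s): hand-tag each adjective with one of the categories above, then rebuild the phrase by sorting on each category’s rank. The tags and the little helper below are illustrative assumptions, not the output of any real grammar tool.

```python
# A minimal sketch of the Royal Order of Adjectives as a sorting rule.
# The category labels assigned to each adjective are hand-chosen for
# illustration; no real tagger or NLP library is involved.

ROYAL_ORDER = [
    "opinion", "size", "physical quality", "shape", "age",
    "color", "origin", "material", "type", "purpose",
]
RANK = {category: position for position, category in enumerate(ROYAL_ORDER)}

def order_adjectives(tagged_adjectives, noun):
    """Return the noun phrase with adjectives sorted into the Royal Order."""
    ordered = sorted(tagged_adjectives, key=lambda pair: RANK[pair[1]])
    return " ".join(adjective for adjective, _ in ordered) + " " + noun

# The scrambled "metal new fancy blue lunchbox" comes back out the way
# a native speaker would say it.
print(order_adjectives(
    [("metal", "material"), ("new", "age"), ("fancy", "opinion"), ("blue", "color")],
    "lunchbox",
))  # -> fancy new blue metal lunchbox
```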

“There is a Royal Order of Adjectives, and you follow it without knowing what it is,” from @CambridgeWords via @BoingBoing.

* Tom Stoppard

###

As we parse, we might send powerfully-phrased birthday greetings to a spare but graceful user of adjectives, Seymour Wilson “Budd” Schulberg; he was born on this date in 1914. A screenwriter, television producer, novelist, and sportswriter, Schulberg is best remembered for his novels What Makes Sammy Run? (1941) and The Harder They Fall (1947), as well as his screenplays for On the Waterfront (1954, for which he received an Academy Award) and A Face in the Crowd (1957).

As a sportswriter, Schulberg was most famously chief boxing correspondent for Sports Illustrated.  He wrote some well-received books on boxing, including Sparring with Hemingway, and was inducted into the International Boxing Hall of Fame in 2002.

The son of B. P. Schulberg, head of Paramount Studios in its golden age, Budd wrote Moving Pictures: Memoirs of a Hollywood Prince, an autobiography covering his youth in Hollywood, growing up in the 1920s and 1930s among the famous.

source