(Roughly) Daily

“The past is a foreign country: they do things differently there”*…

The inimitable Tim Urban on the children who populate print ads from the first (the “pre-TV”) half of the 20th century…

Girls who are a weird level of hungry

Kids with old faces…

Infants drinking soda…

Children at risk…

… and so much more: “Creepy Kids in Creepy Vintage Ads,” from @waitbutwhy.

* L.P. Hartley, The Go-Between

###

As we contemplate change, we might recall that it was on this date in 1941, before a Brooklyn Dodgers–Philadelphia Phillies game at Ebbets Field, that NBC-owned station WNBT in New York aired the first (legal) television commercial– the “Bulova Time Check.” Bulova paid $4 in air fees plus $5 in station fees; there were about 4,000 TV sets in the New York area at the time. The average cost of a 30-second spot in the broadcast of the last Super Bowl was $7,000,000.

Written by (Roughly) Daily

July 1, 2023 at 1:00 am

“Television is now so desperately hungry for material that they’re scraping the top of the barrel”*…

The television industry, in its new streaming-led form, is in turmoil. The companies that control it are slashing their available libraries and “reorganizing” their operations, leading to layoffs throughout the industry, and writers are on strike. To some extent, these are consequences of the years of deficit-funded efforts to grab subscribers coming home to roost. But as Max Read argues, there is a deeper problem: the people making our entertainment don’t like it…

[David] Zaslav is this cycle’s big Hollywood villain, most famous for burying completed but unreleased movies for tax write-offs and removing shows from their streaming platforms to save money. When “Max,” the new streaming service that combined all of WBD’s streaming apps into a single offering, was released in May, its interface credited directors and producers together under the hilariously dismissive heading “Creators,” which was both a blatant violation of bargaining agreements around credits and an on-the-nose suggestion that the WBD people simply couldn’t be bothered to care what a “director” is or what one does.

But he is hardly the only person in Hollywood who seems to have more contempt than love for what the industry does. This excellent Vulture piece about the state of streaming by Lane Brown and Joseph Adalian has been rattling around in my head all week, specifically this quote:

One high-level agent says that studios regard the WGA’s demands — for higher minimum pay and staffing requirements, among other things — as simply incompatible with the way TV is now made: “The Writers Guild, delusionally, is harkening back to a day when there were 25 episodes of Nash Bridges a year and repeats and residuals. Back-end payments existed because Europeans were willing to watch our garbage, and Americans were willing to watch repeats of that garbage on cable at 11 at night. The real issue is that the medium changed. Instead of getting a job as a staff writer on CSI: Miami for 46 weeks a year, now it’s a 25-week job working on Wednesday, which is a better show. That’s just progress.”

This begins as a relatively lucid description of why back-end payments existed and then becomes a bizarre fantasy in which, for some strange reason, writers are now obligated to trade stable and remunerative employment for a vague sense of creative fulfillment and prestige, and this trade is called “progress.”…

I don’t mean to make one agent’s offhand quotes to Vulture emblematic of an entire industry, and I know there are many people in Hollywood who would disagree. But the contempt on display for (1) the product being produced, (2) the people who make that product, and (3) the people who consume that product is, I think, widely shared–look at Zaslav, HBO, and TCM. That contempt is nothing new in the entertainment industry, of course. But as it grows and develops it has increasingly made the people in charge unable to distinguish between good product and bad product.

The agent here is committing the same mistake as a lot of bad critics and even more bad development executives, which is to think of “prestige” as a desirable marker of quality, instead of as a kind of genre, or, more cynically, a set of narrative and aesthetic tropes (antiheroes, serialized narratives, film-like cinematography) designed to appeal to a particular marketing demographic–one that happens to be a target demographic for subscription streaming services. As the Vulture piece goes on to point out, just because something is eight episodes long and “actually about trauma” doesn’t automatically make it good, let alone popular…

The agent describes the old residuals system like this: “Back-end payments existed because Europeans were willing to watch our garbage, and Americans were willing to watch repeats of that garbage on cable at 11 at night.” The idea is that people are no longer willing to watch “garbage,” presumably because so many more options are available to them. But this is only a satisfying answer if you assume that “garbage” is automatically bad. What if the problem is that we’re not really making much good garbage anymore?…

You don’t even have to watch the garbage to appreciate its role in the creative ecosystem. The Shield creator Shawn Ryan, who’s quoted in the story above, was a Nash Bridges writer; so too was Watchmen and The Leftovers creator Damon Lindelof. I’m not the first person to make this point, but the entire first generation of “prestige TV” in the 2000s–which is to say, 90 percent of the actually good prestige TV–was written by people who’d spent a lot of time learning to write quickly to a tight structure for a big audience, a set of skills no longer as widespread among writers, to dire consequences for audiences, who have essentially traded consistent, engaging entertainment for the convenience of on-demand streaming.

Networks used to create several Honda Civic shows a year (and, yes, a lot of lemons); these days, if I can stretch this metaphor past the breaking point, streaming platforms seem to mostly create Tesla Model 3s, which is to say expensive, technologically interesting products that gesture at luxury and quality but tend to fall apart quickly and rely almost entirely on hype and conspicuous consumption (not to mention labor exploitation!) to make themselves profitable–and then only after years of burning cash in pursuit of a business model…

Is the problem really that streamers (or writers) are too focused on “prestige” at the expense of “populist” “garbage”? Netflix, the biggest streamer of all, produces mind-boggling amounts of middlebrow and trash TV; every time I open the app there’s a new reality competition between friendship bracelet makers or whatever.

There are many, many cultural and technological reasons for the various (and often overstated) malaises of the streaming era, and there’s no one weird trick for the industry to fix itself. But it’s hard not to notice that, from a labor perspective, the big difference between the era of West Wing and Ally McBeal and now is not so much that writers and directors and actors are too pretentious for lady-lawyer shows but that back then seasons lasted for 20+ episodes, paid more people, promised more consistency (to audiences and to workers above and below the line), and underwent more development. Streamers seem happy to make middlebrow TV; but they also seem unable or unwilling to consistently make good middlebrow TV–by paying enough people, building enough institutional knowledge, committing enough resources, and marketing the product.

You hear sometimes a call from writers or directors or other creatives for studios and streamers to take more risks and get more creative. But I don’t really think the problem of bad TV in the streaming era is an issue of “creativity” (versus conservatism) or “risk” (versus safety) so much as it is an issue of professionalism (versus saying “yes” to 1,000 shows at once, under-developing them, and then killing them en masse for no clear reason). Maybe the reason writers and directors and other creatives are treating TV “like art” instead of like “a job” is because none of the people who hire them are treating it like a job either!

If you do not like movies and TV you cannot make good movies and TV: “Why do entertainment executives hate entertainment?” from @readmaxread in his ever-illuminating newsletter, Read Max. Eminently worth reading in full.

* Gore Vidal

###

As we change the channel, we might recall that it was on this date in 1977 that Elvis performed his last concert at Indianapolis’ Market Square Arena.

source

Written by (Roughly) Daily

June 26, 2023 at 1:00 am

“[Hannah] Arendt wrote about the subjugation of public space – in effect the disappearance of public space, which, by depriving a person of boundaries and agency, rendered him profoundly lonely”*…

… and Alexandra Lange writes about an (imperfect) modern “work-around”: the way in which shopping malls won over a wide range of admirers, from teens to seniors, by providing something they couldn’t find in their dwindling public parks or on their crowded sidewalks…

… The mall, in its quiet early hours, provides affordances most cities and suburbs cannot: even, open walkways, consistent weather, bathrooms and benches. The mall is also “safe,” as Genevieve Bogdan told The New York Times in 1985; the Connecticut school nurse was “apprehensive about walking alone outdoors early in the morning before work.”

For the more vulnerable among us, malls’ privately owned and privately managed amenities offer an on- or off-ramp from the real world, sometimes literally. Skateboarders and wheelchair users both appreciate the fact that most malls were built to include ramps, escalators and elevators, or have been retrofitted to do so. At Grossmont Center, a mall in La Mesa, California, the parking lot features signs giving the step counts from your parking spot to Target, Macy’s and the movie theater. Few cities can say the same.

It isn’t only the ease of exercise that has made mall walking programs durable. On Twitter, city planner Amina Yasin praised malls as spaces that accommodate many racialized and even unhoused senior citizens, offering free and low-cost-of-entry access to air-conditioning, bathrooms and exercise, while throwing up her hands that “white urbanism decided malls are evil.” Gabrielle Peters, a writer and former member of the city of Vancouver’s Active Transportation and Policy Council, responded with her own thread on some ways malls offer better access for people with physical disabilities than city streets: dedicated transit stops, wide automatic doors, wide level passages, multiple types of seating, elevators prominently placed rather than hidden, ramps paired with stairs, public bathrooms and so on. 

The food court at the Gallery offered a relatively low-cost way to hang out after the transit trip or mall walk. While public libraries and senior centers offer free public seating, they have neither the proximity to shopping, nor the proximity to the action that a mall offers. Like teens hanging out in the atrium, the seniors in the food court can observe without penalty and be a part of community life that can be overwhelming in truly public spaces. After police officers removed elderly Korean Americans from a McDonald’s in Flushing, Queens — managers claimed the group overstayed their welcome, buying only coffee and french fries — sociologist Stacy Torres wrote in The New York Times, “Centers offer vital services, but McDonald’s offers an alternative that doesn’t segregate people from intergenerational contact. ‘I hate old people,’ one 89-year-old man told me.”

As malls closed in the spring of 2020 because of Covid-19, mall walkers across the country were forced back outside, battling weather and uneven sidewalks in their neighborhoods, missing the groups that easily formed on the neutral ground of the mall. One New Jersey couple took to walking the parking lot of the mall they once traversed inside, drawn to its spaciousness and their sense of routine. As malls reopened in the summer and fall with social-distancing and mask-wearing policies, some malls suspended their programs until the pandemic’s end, while others curtailed the hours…

“Lessons From the Golden Age of the Mall Walkers,” from @LangeAlexandra in @CityLab.

* Masha Gessen

###

As we ponder perambulation, we might recall that it was on this date in 1949 that Hopalong Cassidy (starring William Boyd, who’d created and developed the role in 66 films, starting in 1935) premiered on the fledgling NBC TV network and became the first Western television series.

source

“The way of paradoxes is the way of truth”*…

Circular references…

Cyclic TV Reference Paradoxes occur when a chain of fictional TV show references forms a cycle. Each show’s reality depends on another being fictional, so a cycle of these dependencies is a paradox [like the one above].

Using subtitles, a large dataset of TV references was generated. This tool displays the dataset as a graph where the nodes are TV shows and the edges are references. References can be viewed by clicking on individual nodes in this graph, and cycles can be selected to inspect a specific instance of the paradox.
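For the technically curious, the core of the trick is ordinary cycle detection in a directed graph. The sketch below (Python, using hypothetical toy show names rather than Pinheiro’s actual subtitle-derived dataset) shows one way such reference cycles can be surfaced with a depth-first search.

```python
# Minimal sketch of cycle detection over a directed graph of TV-show
# references. The shows and edges below are hypothetical toy data, not
# the subtitle-derived dataset behind the Paradox Finder.

references = {
    "Show A": ["Show B"],
    "Show B": ["Show C"],
    "Show C": ["Show A"],   # closes the loop: a reference paradox
    "Show D": ["Show A"],   # points into the cycle but isn't part of it
}

def find_cycles(graph):
    """Return each distinct simple cycle as a tuple of show names."""
    found = set()

    def dfs(node, path, on_path):
        for neighbor in graph.get(node, []):
            if neighbor in on_path:
                # Cycle found: slice the path from the first visit of
                # `neighbor`, then rotate so the same loop isn't recorded twice.
                cycle = path[path.index(neighbor):]
                i = cycle.index(min(cycle))
                found.add(tuple(cycle[i:] + cycle[:i]))
            else:
                dfs(neighbor, path + [neighbor], on_path | {neighbor})

    for show in graph:
        dfs(show, [show], {show})
    return found

for cycle in find_cycles(references):
    print(" -> ".join(cycle + cycle[:1]))  # Show A -> Show B -> Show C -> Show A
```

Run as-is, the toy example prints its single three-show loop; the real dataset is, of course, vastly larger and messier.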

Prepare for your head to spin, then head over to Cyclic TV Reference Paradox Finder. Creator Jamie Pinheiro (@jamiepinheiro) unpacks the backstory and explains his technique here.

* Oscar Wilde

###

As we get meta, we might recall that it was on this date in 1994 that people all around the U.S. (and some parts of the world) watched police in a low-speed chase of a not-so-mysterious white Ford Bronco.

Just five days earlier, it was discovered that O.J. Simpson’s ex-wife and her friend Ron Goldman were brutally murdered outside of her home. Simpson became a chief suspect and had agreed to turn himself in but apparently decided to take a u-turn. Traveling with a friend, A.C. Cowlings, Simpson was carrying his passport, a disguise and $8,750 in cash. Instead of surrendering to police, Simpson took them on a low-speed chase on the L.A. freeways all the way back to his home in Brentwood [where he was arrested].

source

source

Written by (Roughly) Daily

June 17, 2022 at 1:00 am

“Based on his liberal use of the semicolon, I just assumed this date would go well”*…

Mary Norris (“The Comma Queen”) appreciates Cecelia Watson’s appreciation of a much-maligned mark, Semicolon…

… Watson, a historian and philosopher of science and a teacher of writing and the humanities—in other words, a Renaissance woman—gives us a deceptively playful-looking book that turns out to be a scholarly treatise on a sophisticated device that has contributed eloquence and mystery to Western civilization.

The semicolon itself was a Renaissance invention. It first appeared in 1494, in a book published in Venice by Aldus Manutius. “De Aetna,” Watson explains, was “an essay, written in dialogue form,” about climbing Mt. Etna. Its author, Pietro Bembo, is best known today not for his book but for the typeface, designed by Francesco Griffo, in which the first semicolon was displayed: Bembo. The mark was a hybrid between a comma and a colon, and its purpose was to prolong a pause or create a more distinct separation between parts of a sentence. In her delightful history, Watson brings the Bembo semicolon alive, describing “its comma-half tensely coiled, tail thorn-sharp beneath the perfect orb thrown high above it.” Designers, she explains, have since given the mark a “relaxed and fuzzy” look (Poliphilus), rendered it “aggressive” (Garamond), and otherwise adapted it for the modern age: “Palatino’s is a thin flapper in a big hat slouched against the wall at a party.”

The problem with the semicolon is not how it looks but what it does and how that has changed over time. In the old days, punctuation simply indicated a pause. Comma, colon: semicolon; period. Eventually, grammarians and copy editors came along and made themselves indispensable by punctuating (“pointing”) a writer’s prose “to delineate clauses properly, such that punctuation served syntax.” That is, commas, semicolons, and colons were plugged into a sentence in order to highlight, subordinate, or otherwise conduct its elements, connecting them syntactically. One of the rules is that, unless you are composing a list, a semicolon is supposed to be followed by a complete clause, capable of standing on its own. The semicolon can take the place of a conjunction, like “and” or “but,” but it should not be used in addition to it. This is what got circled in red in my attempts at scholarly criticism in graduate school. Sentence length has something to do with it—a long, complex sentence may benefit from a clarifying semicolon—but if a sentence scans without a semicolon it’s best to leave it alone.

Watson has been keeping an eye out for effective semicolons for years. She calculates that there are four-thousand-odd semicolons in “Moby-Dick,” or “one for every 52 words.” Clumsy as nineteenth-century punctuation may seem to a modern reader, Melville’s semicolons, she writes, act like “sturdy little nails,” holding his wide-ranging narrative together….

Eminently worth reading in full: “Sympathy for the Semicolon,” on @ceceliawatson from @MaryNorrisTNY.

Sort of apposite (and completely entertaining/enlightening): “Naming the Unnamed: On the Many Uses of the Letter X.”

(Image above: source)

* Raven Leilani, Luster

###

As we punctuate punctiliously, we might recall that it was on this date in 1990 that CBS aired the final episode of Bob Newhart’s second successful sitcom series, Newhart, in which he co-starred with Mary Frann through a 184-episode run that had started in 1982. Newhart had, of course, had a huge hit with his first series, The Bob Newhart Show, in which he co-starred with Suzanne Pleshette.

Newhart’s ending, its final scene, is often cited as the best finale in sitcom history.
