(Roughly) Daily


“[Hannah] Arendt wrote about the subjugation of public space – in effect the disappearance of public space, which, by depriving a person of boundaries and agency, rendered him profoundly lonely”*…

… and Alexandra Lange writes about an (imperfect) modern “work-around”: the way in which shopping malls won over a wide range of admirers, from teens to seniors, by providing something they couldn’t find in their dwindling public parks or on their crowded sidewalks…

… The mall, in its quiet early hours, provides affordances most cities and suburbs cannot: even, open walkways, consistent weather, bathrooms and benches. The mall is also “safe,” as Genevieve Bogdan told The New York Times in 1985; the Connecticut school nurse was “apprehensive about walking alone outdoors early in the morning before work.”

For the more vulnerable among us, malls’ privately owned and privately managed amenities offer an on- or off-ramp from the real world, sometimes literally. Skateboarders and wheelchair users both appreciate the fact that most malls were built to include ramps, escalators and elevators, or have been retrofitted to do so. At Grossmont Center, a mall in La Mesa, California, the parking lot features signs giving the step counts from your parking spot to Target, Macy’s and the movie theater. Few cities can say the same.

It isn’t only the ease of exercise that has made mall walking programs durable. On Twitter, city planner Amina Yasin praised malls as spaces that accommodate many racialized and even unhoused senior citizens, offering free and low-cost-of-entry access to air-conditioning, bathrooms and exercise, while throwing up her hands that “white urbanism decided malls are evil.” Gabrielle Peters, a writer and former member of the city of Vancouver’s Active Transportation and Policy Council, responded with her own thread on some ways malls offer better access for people with physical disabilities than city streets: dedicated transit stops, wide automatic doors, wide level passages, multiple types of seating, elevators prominently placed rather than hidden, ramps paired with stairs, public bathrooms and so on. 

The food court at the Gallery offered a relatively low-cost way to hang out after the transit trip or mall walk. While public libraries and senior centers offer free public seating, they have neither the proximity to shopping, nor the proximity to the action that a mall offers. Like teens hanging out in the atrium, the seniors in the food court can observe without penalty and be a part of community life that can be overwhelming in truly public spaces. After police officers removed elderly Korean Americans from a McDonald’s in Flushing, Queens — managers claimed the group overstayed their welcome, buying only coffee and french fries — sociologist Stacy Torres wrote in The New York Times, “Centers offer vital services, but McDonald’s offers an alternative that doesn’t segregate people from intergenerational contact. ‘I hate old people,’ one 89-year-old man told me.”

As malls closed in the spring of 2020 because of Covid-19, mall walkers across the country were forced back outside, battling weather and uneven sidewalks in their neighborhoods, missing the groups that easily formed on the neutral ground of the mall. One New Jersey couple took to walking the parking lot of the mall they once traversed inside, drawn to its spaciousness and their sense of routine. As malls reopened in the summer and fall with social-distancing and mask-wearing policies, some malls suspended their programs until the pandemic’s end, while others curtailed the hours…

“Lessons From the Golden Age of the Mall Walkers,” from @LangeAlexandra in @CityLab.

* Masha Gessen

###

As we ponder perambulation, we might recall that it was on this date in 1949 that Hopalong Cassidy (starring William Boyd, who’d created and developed the role in 66 films, starting in 1935) premiered on the fledgling NBC TV network and became the first Western television series.

source

“The way of paradoxes is the way of truth”*

Circular references…

Cyclic TV Reference Paradoxes occur when a chain of fictional TV show references form a cycle. Each show’s reality depends on another being fictional, so a cycle of these dependencies is a paradox [like the one above].

Using subtitles, a large dataset of TV references was generated. This tool displays the dataset as a graph in which the nodes are TV shows and the edges are references. Clicking an individual node shows its references; selecting a cycle inspects a specific instance of the paradox.
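The core operation the tool performs, finding cycles in a directed graph whose nodes are shows and whose edges are "mentions as fiction" references, can be sketched with a standard depth-first search. This is a toy illustration of the idea, not Pinheiro's actual code, and the show names below are invented:

```python
def find_cycle(graph):
    """Return one cycle as a list of nodes (first node repeated at the
    end), or None if the directed graph is acyclic. Uses three-color DFS:
    WHITE = unvisited, GRAY = on the current path, BLACK = finished."""
    WHITE, GRAY, BLACK = 0, 1, 2
    color = {node: WHITE for node in graph}

    def visit(node, path):
        color[node] = GRAY
        path.append(node)
        for nxt in graph.get(node, []):
            if color.get(nxt, WHITE) == GRAY:
                # Back edge: nxt is already on the current path,
                # so the slice from nxt onward is a cycle (a "paradox").
                return path[path.index(nxt):] + [nxt]
            if color.get(nxt, WHITE) == WHITE:
                cycle = visit(nxt, path)
                if cycle:
                    return cycle
        path.pop()
        color[node] = BLACK
        return None

    for node in list(graph):
        if color[node] == WHITE:
            cycle = visit(node, [])
            if cycle:
                return cycle
    return None

# Each show references the next as fictional; the loop back to the
# start is the paradox the finder surfaces.
shows = {
    "Show A": ["Show B"],
    "Show B": ["Show C"],
    "Show C": ["Show A"],
}
# find_cycle(shows) -> ["Show A", "Show B", "Show C", "Show A"]
```

Real datasets have many shows with no references at all; those nodes are simply colored BLACK on the first pass and never appear in a cycle.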

Prepare for your head to spin, then head over to Cyclic TV Reference Paradox Finder. Creator Jamie Pinheiro (@jamiepinheiro) unpacks the backstory and explains his technique here.

* Oscar Wilde

###

As we get meta, we might recall that it was on this date in 1994 that people all around the U.S. (and some parts of the world) watched police in a low-speed chase of a not-so-mysterious white Ford Bronco.

Just five days earlier, O.J. Simpson’s ex-wife, Nicole Brown Simpson, and her friend Ron Goldman had been found brutally murdered outside her home. Simpson became the chief suspect and had agreed to turn himself in, but apparently decided to make a U-turn. Traveling with a friend, A.C. Cowlings, Simpson was carrying his passport, a disguise and $8,750 in cash. Instead of surrendering to police, Simpson took them on a low-speed chase on the L.A. freeways all the way back to his home in Brentwood [where he was arrested].

source

source

Written by (Roughly) Daily

June 17, 2022 at 1:00 am

“Almost everybody today believes that nothing in economic history has ever moved as fast as, or had a greater impact than, the Information Revolution. But the Industrial Revolution moved at least as fast in the same time span, and had probably an equal impact if not a greater one.”*…

Actors pretend to be in the Industrial Revolution as part of the opening ceremony for the London Olympics in 2012

Dylan Matthews talks with Jared Rubin and Mark Koyama, the authors of an ambitious new economic history…

You can crudely tell the story of our species in three stages. In the first, which lasted for the vast majority of our time on Earth, from the emergence of Homo sapiens over 300,000 years ago to about 12,000 years ago, humans lived largely nomadic lifestyles, subsisting through hunting and foraging for food. In the second, lasting from about 10,000 BC to around 1750 AD, humans adopted agriculture, allowing for a more secure supply of food and leading to the establishment of towns, cities, even empires.

The third period, in which we all live, is characterized by an unprecedented phenomenon: sustained economic growth. Quality of life went from improving very gradually if at all for the vast majority of human history to improving very, very quickly. In the United Kingdom, whose Industrial Revolution kicked off this transformation, GDP per capita grew about 40 percent between 1700 and 1800. It more than doubled between 1800 and 1900. And between 1900 and 2000, it grew more than fourfold.

What today we’d characterize as extreme poverty was until a few centuries ago the condition of almost every human on Earth. In 1820, some 94 percent of humans lived on less than $2 a day. Over the next two centuries, extreme poverty fell dramatically; in 2018, the World Bank estimated that 8.6 percent of people lived on less than $1.90 a day. And the gains were not solely economic. Before 1800, average lifespans didn’t exceed 40 years anywhere in the world. Today, the average human life expectancy is more like 73. Deaths in childhood have plunged, and adult heights have surged as malnutrition decreased.

The big question is what drove this transformation. Historians, economists, and anthropologists have proposed a long list of explanations for why human life suddenly changed starting in 18th-century England, from geographic effects to forms of government to intellectual property rules to fluctuations in average wages.

For a long time, there was no one book that could explain, compare, and evaluate these theories for non-experts. That’s changed: How the World Became Rich, by Chapman University’s Jared Rubin and George Mason University’s Mark Koyama, provides a comprehensive look at what, exactly, changed when sustained economic growth began, what factors help explain its beginning, and which theories do the best job of making sense of the new stage of life that humans have been experiencing for a couple brief centuries…

Two economic historians explain what made the Industrial Revolution, and modern life, possible: “About 200 years ago, the world started getting rich. Why?,” from @dylanmatt @jaredcrubin @MarkKoyama in @voxdotcom.

* Peter Drucker

###

As we contemplate change and its causes, we might spare a thought for Charles Francis Jenkins; he died on this date in 1934. An engineer and inventor, he is rightly remembered for his contributions to film and television: he invented a film projector and sold the rights to Thomas Edison, who marketed it as the Vitascope, the projector that Edison used in paid, public screenings in vaudeville theaters; and he opened the first television broadcasting station in the U.S. (W3XK in Washington, D.C.).

But Jenkins also pioneered in other areas. He was the first to move an automobile engine from under the seat to the front of the car; he invented the automotive self starter (replacing the crank) and an improved altimeter for aviation; and he created the cone-shaped drinking cup.

source

“I shall have more to say when I am dead”*…

Brian Brodeur reassesses an unjustly-forgotten modernist…

On December 22, 2019, the sesquicentennial of a writer Donald Justice referred to as “the first modern American poet” passed without a whimper. Edwin Arlington Robinson (1869–1935) would’ve found this critical neglect fitting; obscurity was one of his perennial subjects. Though he won three Pulitzers and was a favorite poet of Theodore Roosevelt, Robinson, whose own mother waited seven months to name him, was attracted to characters few people acknowledged, cared about, or understood.

Before Robinson, very little lived experience had crept into the lines of late Victorian American poetry, which included the likes of rightly forgotten Richard Watson Gilder (1844–1909) and Robert Underwood Johnson (1853–1937): parlor versifiers Whitman famously dismissed as “tea-pot poets.” Rather than saturating his work with overblown symbols, hackneyed aphorisms, and hollow moralism, Robinson relied on the more sophisticated techniques of understatement, irony, and sparse detail. He also confronted such 19th-century taboos as alcoholism, homelessness, and assisted suicide.

So why has Robinson’s Collected Poems remained out of print since the 1970s? Like Henry Wadsworth Longfellow (1807–1882), another virtual nonperson for most 21st-century readers, Robinson is often overlooked as being insufficiently modern, unfashionably didactic, and even culturally problematic. Though this latter description might be justly applied to Longfellow’s The Song of Hiawatha (1855), which perpetuates stereotypes of Native American life, none of these epithets accurately describes Robinson.

Understanding this collective lapse in critical judgment begins by acknowledging that Robinson continues to challenge dominant literary conventions. To begin with, his poems almost always tell a story, almost exclusively in meter and nearly always in rhyme; he also valued clarity of style and rationality of thought over the experimental fragmentation of many high modernists, and, unlike the Confessional poets who came later, hardly ever wrote about himself explicitly. Another reason for his neglect involves a commonly held misconception about literary history. Though Robinson was born nearly 20 years before Ezra Pound (1885), many consider him a peer of the much younger modernists who are often lumped together with him in anthologies of modern American poetry. Robinson broke new ground in his best books, which were published between 1897 and 1925, but his poems can sound antique when compared to The Waste Land (1922) and The Pisan Cantos (1948).

Yet it serves to remember that art has no present without its past. Acknowledging practices of earlier periods gives poets the knowledgeable freedom to experiment in their own time. Robinson’s best work offers contemporary practitioners options, ways of writing largely ignored by 21st-century American poets…

An appreciation: “‘The Flicker, Not the Flame’: E. A. Robinson’s Narrative Compression,” from @bbrodeurpoet in @LAReviewofBooks.

* Edwin Arlington Robinson

###

As we give credit where credit is due, we might spare a thought for Maya Angelou; she died on this date in 2014. A poet, memoirist, and civil rights activist, she published several books of poetry, seven autobiographies, and three books of essays, and is credited with a long list of plays, movies, and television shows across a career that spanned over 50 years.

Her autobiographical work drew on her experiences as a fry cook, sex worker, nightclub performer, Porgy and Bess cast member, Southern Christian Leadership Conference coordinator, and correspondent in Egypt and Ghana during the decolonization of Africa. She went on to work as an actress, writer, director, and producer of plays, movies, and public television programs. Then, in 1982, she was named the first Reynolds Professor of American Studies at Wake Forest University. In 1993, Angelou recited her poem “On the Pulse of Morning” (1993) at the first inauguration of Bill Clinton (making her the first poet to make an inaugural recitation since Robert Frost at the inauguration of John F. Kennedy in 1961).

Angelou was nominated for the Pulitzer and the Tony, won three Grammys, and was awarded over 50 honorary degrees. She won the Spingarn Medal in 1994, the National Medal of Arts in 2000, and the Presidential Medal of Freedom in 2011. And in 2022 she became the first Black woman to be depicted on a U.S. quarter.

Angelou at the Clinton inauguration [source]

“Based on his liberal use of the semicolon, I just assumed this date would go well”*…

Mary Norris (“The Comma Queen“) appreciates Cecelia Watson‘s appreciation of a much-maligned mark, Semicolon

… Watson, a historian and philosopher of science and a teacher of writing and the humanities—in other words, a Renaissance woman—gives us a deceptively playful-looking book that turns out to be a scholarly treatise on a sophisticated device that has contributed eloquence and mystery to Western civilization.

The semicolon itself was a Renaissance invention. It first appeared in 1494, in a book published in Venice by Aldus Manutius. “De Aetna,” Watson explains, was “an essay, written in dialogue form,” about climbing Mt. Etna. Its author, Pietro Bembo, is best known today not for his book but for the typeface, designed by Francesco Griffo, in which the first semicolon was displayed: Bembo. The mark was a hybrid between a comma and a colon, and its purpose was to prolong a pause or create a more distinct separation between parts of a sentence. In her delightful history, Watson brings the Bembo semicolon alive, describing “its comma-half tensely coiled, tail thorn-sharp beneath the perfect orb thrown high above it.” Designers, she explains, have since given the mark a “relaxed and fuzzy” look (Poliphilus), rendered it “aggressive” (Garamond), and otherwise adapted it for the modern age: “Palatino’s is a thin flapper in a big hat slouched against the wall at a party.”

The problem with the semicolon is not how it looks but what it does and how that has changed over time. In the old days, punctuation simply indicated a pause. Comma, colon: semicolon; period. Eventually, grammarians and copy editors came along and made themselves indispensable by punctuating (“pointing”) a writer’s prose “to delineate clauses properly, such that punctuation served syntax.” That is, commas, semicolons, and colons were plugged into a sentence in order to highlight, subordinate, or otherwise conduct its elements, connecting them syntactically. One of the rules is that, unless you are composing a list, a semicolon is supposed to be followed by a complete clause, capable of standing on its own. The semicolon can take the place of a conjunction, like “and” or “but,” but it should not be used in addition to it. This is what got circled in red in my attempts at scholarly criticism in graduate school. Sentence length has something to do with it—a long, complex sentence may benefit from a clarifying semicolon—but if a sentence scans without a semicolon it’s best to leave it alone.

Watson has been keeping an eye out for effective semicolons for years. She calculates that there are four-thousand-odd semicolons in “Moby-Dick,” or “one for every 52 words.” Clumsy as nineteenth-century punctuation may seem to a modern reader, Melville’s semicolons, she writes, act like “sturdy little nails,” holding his wide-ranging narrative together….
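A ratio like Watson's "one for every 52 words" is easy to check against any public-domain text. A minimal sketch of the count (the sample string here is illustrative, not an actual Melville excerpt or Watson's method):

```python
def words_per_semicolon(text):
    """Return the ratio of words to semicolons in a text.
    Raises ZeroDivisionError if the text contains no semicolons."""
    semicolons = text.count(";")
    words = len(text.split())
    return words / semicolons

# 10 words and 2 semicolons -> 5.0 words per semicolon
sample = "Call me Ishmael; some years ago; never mind how long"
ratio = words_per_semicolon(sample)
```

Run against the full Project Gutenberg text of Moby-Dick, a count along these lines is how one would verify the "four-thousand-odd" figure.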

Eminently worth reading in full: “Sympathy for the Semicolon,” on @ceceliawatson from @MaryNorrisTNY.

Sort of apposite (and completely entertaining/enlightening): “Naming the Unnamed: On the Many Uses of the Letter X.”

(Image above: source)

* Raven Leilani, Luster

###

As we punctuate punctiliously, we might recall that it was on this date in 1990 that CBS aired the final episode of Bob Newhart’s second successful sitcom series, Newhart, in which he co-starred with Mary Frann through a 184-episode run that had started in 1982. Newhart had, of course, had a huge hit with his first series, The Bob Newhart Show, in which he co-starred with Suzanne Pleshette.

Newhart‘s ending, its final scene, is often cited as the best finale in sitcom history.
