Mendacious politicians, duplicitous corporations, AI slop– it’s getting harder and harder to find authenticity, to get to the truth. Further to our occasional posts on misinformation in history, a look at Johns Hopkins University’s Bibliotheca Fictiva Collection of Literary and Historical Forgery, a tangible demonstration that humans have been creating fan fiction and fake news for millennia…
In “The History of Fake News From the Flood to the Apocalypse,” the course he teaches at Johns Hopkins University, Earle Havens [see here] presents undergrads with a formidable challenge: they have to create historical forgeries and then defend the authenticity of their deceptions.
Forgeries, hoaxes, and other types of literary fakery have preoccupied Havens, a rare books and manuscripts curator at the university’s Stern Center for the History of the Book, for many years now. As part of his curatorial brief, Havens oversees the Bibliotheca Fictiva Collection of Literary and Historical Forgery, available via JSTOR. It includes more than 2,000 items—rare books, manuscripts, and ephemera—and was the brainchild of Arthur and Janet Freeman, who amassed most of its holdings over a period of some fifty years. Johns Hopkins acquired the majority of the collection from the Freemans in 2011; it has continued to expand in the years since…
* Rudyard Kipling, “The Conundrum of the Workshops” (quoted by Orson Welles in his remarkable film F for Fake)
###
As we grab for a grain of salt, we might recall that it was on this date in 1964 that the Rolling Stones made their first appearance on The Ed Sullivan Show performing Chuck Berry’s “Around and Around” (and closing the show with “Time Is On My Side”).
The band’s appearance on the show generated over a million dollars in ticket sales for their fall concert tour, and despite outrage from conservative adults who disapproved of the Stones’ “unkempt” image, the group returned to The Ed Sullivan Show for another six appearances throughout the rest of the 1960s. – source
Last week, Northwestern Professor Joel Mokyr was awarded a half-share in The Nobel Prize in Economic Sciences (AKA The Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel) “for having identified the prerequisites for sustained growth through technological progress.” Anton Howes explains why this is noteworthy…
Among today’s winners of the Nobel prize in Economics is Joel Mokyr, the professor at Northwestern whose name is indelibly associated with the primacy of innovation to modern economic growth – the gradual, sustained, and unprecedented improvement in living standards that first Britain, and then country after country, have enjoyed over the past few hundred years. It was reading Mokyr’s The Enlightened Economy that first opened my eyes to the importance of studying the history of invention to explaining the causes of the Industrial Revolution, which I have since made my career.
What makes this Nobel win so remarkable, and so pleasantly surprising, is that Mokyr’s work is not the kind that is often published by economics journals, or even many economic history journals anymore. Over the past few decades, journal editors and peer-reviewers have increasingly insisted that papers must present large datasets that have been treated using complex statistical methods in order to make even the mildest claims about what caused what. Although Mokyr is a master of such methods – he was one of the early pioneers of economic history’s quantitative turn – the work for which he has won the prize is firmly and necessarily qualitative.
Mokyr’s is the economic history that gets written up in books – his classics are The Lever of Riches, The Gifts of Athena, The Enlightened Economy, and A Culture of Growth – and in readable papers shorn of unnecessary formulae. His is history accessible to the layman, though rigorously applying the insights of economics. The prize is a clear signal from the economics profession that it doesn’t just value the application of fancy statistical methods; its highest prize can go to works of history.
Whereas most of the public, and even many historians, think of the causes of modern economic growth – the beginnings of the Industrial Revolution – as being rooted in material factors, like conquest, colonialism, or coal, Mokyr tirelessly argued that it was rooted in ideas, in the intellectual entrepreneurship of figures like Francis Bacon and Isaac Newton, and in the uniquely precocious accumulation in eighteenth-century Britain of useful, often mechanically actionable knowledge. Britain, he argued, through its scientific and literary societies, and its penchant for publications and sharing ideas, was the site of a world-changing Industrial Enlightenment – the place where progress was thought possible, and then became real.
One of Mokyr’s big early insights, first appearing in Lever of Riches, was that many inventions could not be predicted by economic factors. Society could enjoy remarkable productivity improvements from simply increasing the size of the market, leading to division of labour and specialization – what he labelled ‘micro-inventions’ – in the vein popularised by Adam Smith. But this could not explain an invention that appeared out of the blue, like Montgolfier’s hot air balloon in the 1780s – what he called a ‘macro-invention’, not for the magnitude of its impact, but for its novelty. Macro-inventions often required further development to make them important, but the original breakthrough could not be predicted by looking at changes in prices or the availability of resources. It ultimately came down to advances in our understanding of the world. Mokyr put the Scientific Revolution – and the factors that contributed to it – on the economist’s map.
Mokyr also looked at the relationship between different kinds of knowledge. A scientist might know, through observation, that the air has a weight. A craftsman might know, through long training and experience with glass, how to make a long glass tube. Each could not get far alone. But combining them, by creating means to ensure that scientists and craftsmen talked with one another and collaborated – through connecting their propositional and prescriptive knowledge, their heads and hands – very quickly led to the invention of thermometers, barometers, and much more besides, in an ever expanding field of knowledge. What Mokyr taught economists is that it’s not knowledge per se that makes the difference, but the way it is organized. Much of his later work has shown just how deep a pool of skilled artisans Britain’s scientists could draw on.
In a way, Mokyr himself has practised what he preached. As editor of Princeton University Press’s book series on the Economic History of the Western World, Mokyr has for decades provided an all-important space for economists and historians to write the kinds of research that would never have been publishable in economics journals – including explanations of the Industrial Revolution that are the polar opposite of his own. He helped keep the connection between history and economics alive.
Mokyr’s case for the primacy of knowledge and ideas was not an easy one to make to economists. They are naturally drawn to data that can be counted, and not to narrative, often no matter how well evidenced. But it appears that Mokyr’s persistence, elevated by his infectious, irrepressible sprightliness, has paid off. His prize is a long overdue recognition of the history in economic history, and a remarkable testament to the power of ideas to persuade…
As we ponder the process of progress, we might send creative birthday greetings to one of the subjects of Mokyr’s study, Sir Christopher Wren; he was born on this date in 1632. A mathematician and astronomer (who co-founded and later served as president of the Royal Society), he is better remembered as one of the most highly acclaimed English architects in history; he was given responsibility for rebuilding 52 churches in the City of London after the Great Fire in 1666, including what is regarded as his masterpiece, St. Paul’s Cathedral, on Ludgate Hill.
Wren, whose scientific work ranged broadly– e.g., he invented a “weather clock” similar to a modern barometer, devised new engraving methods, and helped develop a blood transfusion technique– was admired by Isaac Newton, as Newton noted in the Principia.
Your correspondent was in New York last week and ducked into Grand Central Station (or more formally, Grand Central Terminal)… to find it transformed. Sarah Cascone has the backstory…
For the first time possibly ever, there is not a single ad to be seen in Grand Central Terminal. “Humans of New York,” Brandon Stanton‘s popular social media art series of photographs of people he’s interviewed on the city’s streets, has taken over each and every one of the 150 video billboards in the grand concourse, as well as the subway ads below in Grand Central Station for “Dear New York.”
“This beautiful art installation transforms the terminal into a photographic display of New Yorkers telling their stories from all walks of life—serving as a powerful reminder of our shared humanity,” MTA director of commercial ventures Mary John said in a statement. “It is the first time an artist has unified digital displays in both the terminal and subway station below, and the MTA coordinated across many corners of our organization to make this happen.”
“If it provides even the slightest amount of joy, solace, beauty, or connection to the 750,000 people who pass through Grand Central every day—we have achieved our goal,” Stanton wrote on Facebook.
The original plan was to use the proceeds from his new book, Dear New York, but Stanton ended up having to dip into his life savings to cover the total cost, which included space rental and covering the station’s lost ad revenue. The artist and journalist, who wrote the best-selling book Humans of New York, declined to provide an exact figure, but told the New York Times that “I no longer have any stocks.”
Stanton has shot portraits of 10,000 people across the five boroughs and beyond since beginning “Humans of New York” in 2010, creating a kind of photographic census of the city. (He has since expanded the project’s scope internationally, to 40 countries and counting.)…
“Humans of New York” founder Brandon Stanton in Grand Central Terminal with his new book, “Dear New York,” at his photography exhibition of the same name
As we see, we might spare a thought for a photographer with a different– but also crucially important– focus, Edwin Way Teale; he died on this date in 1980. A naturalist, photographer, and writer, he produced works that serve as primary source material documenting environmental conditions across North America from 1930–1980. He is perhaps best known for his series The American Seasons, four books documenting over 75,000 miles of automobile travel across North America following the changing seasons.
Teale’s Hampton, CT home, “Trail Wood” (chronicled in his A Naturalist Buys an Old Farm and further described in A Walk through the Year) is now managed as a nature preserve by the Connecticut Audubon Society. His papers, housed in the University of Connecticut Archives & Special Collections, take up 238 feet of shelf space and include field notes and drafts for each of his books, early childhood writings, professional writings for magazines, newspapers, and book reviews, correspondence– both personal and professional– personal and family documents, scrapbooks, and memorabilia, as well as his photographs (prints, negatives, and transparencies) and his personal library. But he bequeathed to the Concord (MA) Free Public Library his collection of Henry David Thoreau books, letters, correspondence, mementos, and any other material dealing with Thoreau and Ralph Waldo Emerson and other material relating to Concord, Massachusetts– 12 containers and 108 printed books and pamphlets.
A few months ago I had the fleeting thought to write a post about Stephen Biesty, the DK books cross-section legend. After learning he’d passed away just last year, I was disheartened to discover his personal website and galleries had gone offline, and there were no significant retrospectives of his career that I could find.
Now, after having looked through nearly every single work he produced and having read literally everything I could find online about him, I have come to find his quiet denouement rather touching. He certainly seemed to be private by design, offering only a handful of interviews in his lifetime. The longest profile I could find is weirdly condemning of his workmanlike ethos:
The artist himself is not quite as immediately engaging. Biesty is 35, with the smooth face and straight jeans of a Microsoft programmer. He lives in a Somerset cottage of grey-gold stone between a village church and a pair of wandering geese.
Biesty’s garden glows in the late-summer sun, yet he leads the way straight up to his studio and questions about his business. The room is almost bare of artist’s clutter, more an office with fax and easel and three paintbrushes laid parallel on a tissue to dry. ‘I don’t collect stuff,’ says Biesty.
He talks about his illustrating with a stern set to his chin, as if filling out a tricky detail. He doesn’t sketch – “There isn’t time to be doing reams of doodles” – but expands his work straight from thumbnail ideas to full-scale final pieces. These he completes, eyes close to the paper and hand in rhythm, layer by repetitive layer, between 7.30am and 5.30pm every weekday. “At lunchtime I go downstairs for half an hour and a sandwich.”
Biesty makes all this sound like mass production. “You’re employed to do one thing,” he says, straight as a factory manager. “Something that’s going to sell.” There are no posters of his pictures on his studio walls.
[…] Often, he answers with “we” rather than “I”.
I have to confess a great soft spot for all this. My great grandfather was studying technical illustration at Pratt before he was drafted into the war and lost at sea. His daughter became a graphic designer, as did her daughter — my mom. I appreciate that we all sat somewhere between art and science, heart and mind. Biesty’s seeming indifference towards an artistic identity gives his work more credibility for my tastes.
Biesty’s breakout moment came when the K of DK books asked Biesty to draw a steamboat in cross section. “I tried it lengthways and he said, ‘Fine. But try it the other way, like a loaf.’” And lo:
This became the centerpiece for his first book, Incredible Cross-Sections [here], and the seed for the rest of his career. Anyone around my age and above a certain threshold of autistic will have burned much library time on this amazing ‘90s run of DK books…
[Cole goes on to show and discuss other examples of Biesty’s work and to examine his influences…]
… Later in life, Biesty was able to admit some of the depth so evident in the work, as he accepted an award in 2011 for Into the Unknown [here]:
In a world where most information is stored and conveyed electronically, conventional non-fiction books for young people have taken a heavy hit. So is Into The Unknown a dinosaur, a final example of a Dying Breed? I believe not. In the years ahead, certainly fewer paper books will be produced. But those that are designed, written, and manufactured will be a bit like medieval manuscripts — special creations, works of art, unique, beautiful products to be collected and cherished. Into the Unknown, therefore, is not the end of a line but the beginning of a new, fresh and very beautiful one, and you have so kindly recognized that fact. Thank you all very much indeed…
* Robert J. Bezucha, The Art of the July Monarchy: France, 1830 to 1848
###
As we show (in addition to telling), we might spare a thought for an illustrator of a different ilk, William Steig; he died on this date in 2003. A cartoonist (most notably in The New Yorker) and an illustrator and writer of children’s books, he’s best known for Shrek!, which inspired the film series of the same name, as well as others that included Sylvester and the Magic Pebble (which won the Caldecott Medal), Abel’s Island, and Doctor De Soto. He was the U.S. nominee for the biennial, international Hans Christian Andersen Award, as both a children’s book illustrator in 1982 and a writer in 1988.
When asked his opinion about the movie based on his picture book, Shrek!, William Steig responded: “It’s vulgar, it’s disgusting — and I loved it.” (With the release of Shrek 2 in 2004, Steig became the first sole creator of an animated movie franchise that went on to generate over $1 billion from theatrical and ancillary markets after only one sequel.)
Your correspondent is headed off on the road, so (R)D will be in temporary hiatus. Regular service should resume on/around October 13. To keep you occupied until then, this tasty tidbit from Neal.fun (Neal Agarwal): “I’m not a robot.”
He was a radical, the inventor of blank verse, a master of internal monologue, and a victim of murder. This was the English playwright Christopher Marlowe, a contemporary and rival of William Shakespeare—and perhaps the Bard’s key creative influence.
At 14, young Marlowe—the son of a poor Canterbury cobbler—won a scholarship to the prestigious King’s School, becoming the first in his family to receive a formal education. He excelled, went on to the University of Cambridge, and there studied the great works of antiquity, from Virgil’s Aeneid to Ovid’s Metamorphoses. Where his classmates saw musty mandatory reading, Marlowe found something else: worlds of ecstatic violence and erotic excess, of vengeful outcasts and capricious gods, worlds that upended the Christian moral order in which he was raised.
After graduation, Marlowe faced an uncertain future—unlike his wealthy classmates, his education didn’t secure for him a place in society. So, he decided to take a risk, moving to London to try his hand at an unstable, disreputable profession: writing for the stage.
When Marlowe was born in 1564, says Stephen Greenblatt, the Cogan University Professor of the Humanities, England was still stuck in the Middle Ages, even as the Renaissance bloomed on the continent. Public entertainment revolved around bearbaiting and hangings; poetry was weighed down by moralizing and clumsy rhymes; brutal censorship stifled any art that challenged the crown’s authority.
By the time Marlowe died in 1593, at just 29 years old, England was in the midst of a cultural and intellectual flourishing. Greenblatt credits Marlowe with sparking this transformation. In a new book, Dark Renaissance: The Dangerous Times and Fatal Genius of Christopher Marlowe, Greenblatt—one of the world’s foremost Shakespeare scholars—argues that Marlowe didn’t merely precede Shakespeare, he made Shakespeare’s career possible.
“It was Marlowe who cracked something open,” Greenblatt says, “and enabled Shakespeare to walk through—how should we say?—over his dead body.”
Marlowe’s story, Greenblatt adds, is also relevant to many of academia’s current preoccupations. He was a “first-gen” student who glimpsed radical possibilities in the supposedly conservative texts of “great books courses.” He faced a “vocational crisis” familiar to many humanities students today—and pursued his passion despite the risk.
That career began with Marlowe’s debut play, Tamburlaine the Great, written in 1587 or 1588. “Virtually everything in the Elizabethan theater,” Greenblatt writes, “is pre- and post-Tamburlaine.”
Part of the play’s shock value lay in its plot. Loosely based on the rise of the fourteenth-century Central Asian conqueror Timur (also known as Tamerlane), Tamburlaine the Great tells the story of a Scythian shepherd who ascends from obscurity to become a dominating tyrant. The violence is unrelenting, and the ambition unchecked: Tamburlaine faces no moral comeuppance for his pride. This rags-to-riches arc may have mirrored Marlowe’s own desires, Greenblatt writes—and defined many of the other outsider characters Marlowe would go on to write.
But the play’s most revolutionary element was formal: the use of “this hallucinatory blank verse, which Marlowe basically invented,” Greenblatt says. Marlowe’s characters spoke in unrhymed iambic pentameter—“elegant, musical, and forward-thrusting,” Greenblatt writes—which gave English drama a new expressive register.
Before Tamburlaine, English playwrights were trapped in stiff structures such as Poulter’s measures—couplets in which 12-syllable iambic lines rhyme with 14-syllable iambic lines. Blank verse enabled Marlowe’s characters to sound like they were “actually speaking English,” Greenblatt says, dramatized by some structure, but still alive. Shakespeare would come to rely heavily on blank verse in his own work.
A few years later came Doctor Faustus, first performed in 1594. It was Marlowe’s most famous play and the first dramatization of the Faust legend, in which a scholar makes a deal with the devil, trading his soul for magical powers. This work, Greenblatt argues, marked the first time “a powerful, complex inner life” was represented on the stage.
Before Marlowe, English theater externalized psychology through allegory: morality plays populated by characters such as Pride and Shame. In Doctor Faustus, by contrast, Marlowe relies on soliloquy and dialogue about the characters’ internal states. “It was from Doctor Faustus that the author of Hamlet and Macbeth learned how it could be done,” Greenblatt writes.
Marlowe’s life ended as dramatically as one of his plays: he was stabbed to death in a tavern in Deptford. Officials claimed the death resulted from a quarrel over a dinner bill—but Greenblatt points to a more complicated story. While still a student at university, Greenblatt writes, Marlowe was likely recruited as a spy for Queen Elizabeth’s secret service, possibly to monitor Catholic dissidents or plots against the crown.
But over the years, Marlowe drew scrutiny for his radical ideas and was accused at times of atheism—a grave offense in Elizabethan England. Greenblatt believes that Marlowe was killed for his beliefs, possibly on orders carried out by an “overly zealous servant” of Queen Elizabeth herself.
To Greenblatt, Marlowe’s life serves as a reminder of how repressive Elizabethan England was: “It was basically wise to keep your head down, unless you wanted your head to be chopped off.” Marlowe didn’t, and paid the price. Shakespeare was watching, Greenblatt argues, and learned he had to be more careful. But Shakespeare’s blend of conservatism and radicalism was only possible because Marlowe had first ventured too far. Shakespeare relied, Greenblatt writes, on Marlowe’s legacy of “reckless courage and genius.”
And Greenblatt believes Shakespeare was aware of his debt. Greenblatt’s Dark Renaissance ends with a line from Shakespeare’s As You Like It, a reference to Marlowe’s mysterious death in that small tavern room in Deptford: “When a man’s verses cannot be understood…it strikes a man more dead than a great reckoning in a little room.”…
* translation of the phrase– “Quod me alit, me extinguit”– found on the portrait of Marlowe above (at Corpus Christi College, Cambridge)
###
As we ponder profundity, we might spare a thought for a more modern playwright, August Wilson; he died on this date in 2005. Often referred to as “theater’s poet of Black America,” Wilson is best known for a series of 10 plays, collectively called The Pittsburgh Cycle (or The Century Cycle), which chronicle the experiences and heritage of the African-American community in the 20th century. (Plays in the series include Fences and The Piano Lesson, each of which won the Pulitzer Prize for Drama, as well as Ma Rainey’s Black Bottom and Joe Turner’s Come and Gone.) In 2006, Wilson was inducted into the American Theater Hall of Fame.